Improvisation on a live-coded mobile musical instrument using urMus


Sang Won Lee, Cameron Hejazi, Bruno Yoshioka, and Georg Essl

[snaglee, chejazi, byoshi, gessl]@umich.edu

Computer Science and Engineering, University of Michigan

Live coding [1] has yielded a unique practice in computer music performance in which a musician uses a programming language as a musical instrument [2,3]. Typically, the outcome of live coding is generative music (and/or visuals). In contrast, we wish to expand the scope of live coding: here the outcome of live coding is not music but a musical instrument implemented on a mobile device (an iPad), and a separate instrumental player improvises with the instrument as it is being made. Using urMus [4], a development environment for mobile music interfaces, the coders can transfer source code to the device over a wireless network and run the digital musical instrument remotely, without physically interfering with the performance.

While the tradition of live coding offers electronic musicians novel ways to be musically expressive with their laptops, certain styles of music are particularly well suited to it: gradually evolving, constant-tempo, repetitive rhythmic music. In contrast, it is hard to achieve the immediate expressivity of traditional musical instruments, where one gesture produces one acoustic event [5]. Therein lies one of our motivations for this work: to decouple an instrumental player from the live coding in order to add fluid and immediate expressivity.

In this demo, an instrumental performer who does not code plays the instrument as it is being coded over the network. Two live coders implement a simple x-y controller on a tablet while the performer improvises with it. One live coder implements the interface (the background, a movable button, and the mapping of touch events to sound control parameters) while the other focuses on building the sound synthesis algorithm. Later in the video, the live coder responsible for sound synthesis joins the performance by capturing (recording) the performer's playing and looping the pattern. The instrumental performance in the video is an improvisation, but the idea and overall structure of the live-coded mobile instrument were agreed upon in advance.
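
To give a flavor of the interface half of this division of labor, the following is a minimal sketch in urMus-style Lua. It is not the code used in the demo: createRegion() is one of the prepared helpers mentioned below, but the region methods, the drag event name, the screen-size helpers, and the setSynthParams() hook (which the synthesis coder would define, as sketched after the next paragraph) are illustrative assumptions.

    -- Hypothetical sketch: a draggable button whose position is forwarded
    -- to the synthesis side. Method and event names are assumptions written
    -- in the spirit of urMus's region/handler API, not its exact signatures.
    local button = createRegion()        -- prepared helper: returns a touchable screen region
    button:SetWidth(80)
    button:SetHeight(80)

    local function onDrag(self, x, y)
      -- keep the button under the finger
      self:SetAnchor("CENTER", x, y)
      -- hand the normalized x-y position to whatever mapping the
      -- synthesis coder has defined so far (hypothetical hook)
      setSynthParams(x / ScreenWidth(), y / ScreenHeight())
    end

    button:Handle("OnDragging", onDrag)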

Initially, the interface builder starts with a set of prepared helper functions (such as createRegion() and noteNumberToFrequency()), so it takes only a few minutes to make a button generate sound. The expressive space of the instrument visibly changes over time: at first, pressing the button only produces a pitched tone; later, the instrument covers a wider frequency range, and the level of distortion is mapped to the x-y coordinates of the button.
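
A hedged sketch of how the synthesis-side mapping just described might look, again in Lua: noteNumberToFrequency() is the prepared helper named above, while the note range, the synth table, and the setSynthParams() hook are stand-ins introduced only for illustration.

    -- Illustrative mapping from the button's normalized x-y position to
    -- sound control parameters; not the demo's actual synthesis code.
    local LOW_NOTE, HIGH_NOTE = 48, 84      -- assumed pitch range, MIDI notes C3 to C6
    synth = { freq = 440, distortion = 0 }  -- hypothetical stand-in for the synthesis state

    function setSynthParams(nx, ny)         -- nx, ny normalized to [0, 1]
      -- x axis: quantize to a note number, then convert to Hz
      local note = math.floor(LOW_NOTE + nx * (HIGH_NOTE - LOW_NOTE))
      synth.freq = noteNumberToFrequency(note)   -- prepared helper from the text
      -- y axis: amount of distortion, from clean (0) to fully driven (1)
      synth.distortion = ny
    end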

An interesting question arises from the distributed nature of the ensemble. The performer's playing gestures depend on the current implementation state of the instrument the live coders are creating. Will executing new code interrupt the performer playing what he has in mind? Is it acceptable to change the sound synthesis algorithm on the fly while the performer is pressing the button? One approach we present here is for the coders to communicate with the performer through a chat interface on their laptops. Another approach, used often in the demo video, is to let the performer execute code that a coder has submitted (the red button labeled "RUN" in the video). In that way, the performer can "pull" the new code so that the change does not interfere with his performance. In general, this is a common question whenever the outcome of live programming involves interaction with end users (or non-live-coders). We suggest several ways for live coders to preserve the state of live interaction in [6].
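
As a sketch of the "pull" idea, under stated assumptions: the snippet below stages incoming code in a buffer and only executes it when the performer presses RUN. The onCodeReceived() and onRunPressed() hooks and the staging buffer are hypothetical; urMus's actual code-transfer mechanism is not shown here.

    -- Hypothetical staging of submitted code until the performer pulls it in.
    local pendingCode = nil

    function onCodeReceived(source)        -- assumed hook: a coder submits new Lua source
      pendingCode = source                 -- stage it instead of running it immediately
    end

    function onRunPressed()                -- assumed handler bound to the red "RUN" button
      if pendingCode then
        local chunk, err = loadstring(pendingCode)  -- Lua 5.1 style; load() in later versions
        if chunk then chunk() else print(err) end
        pendingCode = nil
      end
    end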

As live coding has focused heavily on audiovisuals, we hope this work expands live programming into the setting of user interaction. In addition, the flexibility of live coding can benefit the aesthetic framework of instrumental music. For more detailed related work, motivations, and design guidelines for live coding a musical instrument, see [6].

[1] Collins, N., McLean, A., Rohrhuber, J., and Ward, A. Live coding in laptop performance. Organised Sound, 2003. 8(3): p. 321-330.

[2] Blackwell, A. and Collins, N. The programming language as a musical instrument. In Proceedings of PPIG05 (Psychology of Programming Interest Group). 2005.

[3] Wang, G. and Cook, P.R. On-the-fly programming: using code as an expressive musical instrument. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME). 2004. National University of Singapore.

[4] Essl, G. UrMus – an environment for mobile instrument design and performance. In Proceedings of the International Computer Music Conference. 2010. New York.

[5] Wessel, D. and Wright, M. Problems and prospects for intimate musical control of computers. Computer Music Journal, 2002. 26(3): p. 11-22.

[6] Lee, S.W. and Essl, G. Live coding the mobile music instrument. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME). 2013. Daejeon, South Korea. (In press)
