
Seeking Out the Spaces Between: Using Improvisation in Collaborative Composition with Interactive Technology

Sarah Nicolls

Sarah Nicolls (artist, educator), Centre for Contemporary Music Practice, School of Arts, Brunel University, Kingston Lane, Uxbridge, UB8 3PH, U.K. E-mail: <info@sarahnicolls.com>. Web site: <www.sarahnicolls.com>.

See <mitpressjournals.org/lmj/-/20> and <www.sarahnicolls.com> for supplemental files related to this article.

Co-author on Case Study 1: Richard Barrett, Wilhelm-Stolze-Strasse 30, 10249 Berlin, Germany. E-mail: <richard@furtlogic.com>.

Co-authors on Case Study 4: Samer Abdallah, Kurt Jacobson, Andrew Robertson, Adam Stark and Nick Bryan-Kinns, Centre for Digital Music, Queen Mary, University of London (QMUL), Mile End Road, London E1 4NS, U.K. E-mail: c/o <adam.stark@elec.qmul.ac.uk>. Web: <http://www.eecs.qmul.ac.uk/~nickbk>.

Abstract

This article presents findings from experiments into piano and live electronics undertaken by the author since early 2007. The use of improvisation has infused every step of the process—both as a methodology to obtain meaningful results using interactive technology and as a way to generate and characterize a collaborative musical space with composers. The technology used has included pre-built MIDI interfaces such as the PianoBar, actuators such as miniature DC motors and sensor interfaces including iCube and the Wii controller. Collaborators have included researchers at the Centre for Digital Music (QMUL), Richard Barrett, Pierre Alexandre Tremblay and Atau Tanaka.

In seeking to create responsive "performance environments" at the piano, I explore live, performative control of electronics to create better connections for both performer (providing the same level of interpretive freedom as with a "pure" instrumental performance) and audience (communicating clearly to them). I have been lucky to witness first-hand many live interactive performances and to work with various empathetic composers/performers in flexible working environments. Collaborating with experienced technologists and musicians, I have witnessed time and again what, for me, is a fundamental truth in interactive instrumental performance: As a living, spontaneous form it must be nurtured and informed by the performer's physicality and imagination as much as by the creativity or knowledge of the composer and/or technologist.

Specifically in the case of sensors, their dependence on the detail of each person's body and reactions is so refined as to necessitate, I would argue, an entirely collaborative approach and therefore one that involves at least directed improvisation and, more likely, fairly extensive improvised exploration. The fundamentally personal and intimate nature of sensor readings—the amount of tension created by each performer, the shape of the ancillary gestures or the level of emotional involvement (especially relevant when using galvanic skin response or EEG)—makes creating pieces with sensors extremely difficult for a composer to do in isolation. Improvisation therefore provides a way for performer and composer to generate a common musical and gestural language.

Related to these issues is the fact that the technical and notational parameters in interactive music are not yet (and may never be) standardized, thereby creating a very real and practical need for improvisation to figure at least somewhere in the process.

Context

Many practitioners in the field of live performance with electronics make their own interfaces or instruments with which they improvise; this is readily demonstrated by communities such as New Interfaces for Musical Expression (NIME), inspired by leading figures such as Nicolas Collins and Michel Waisvisz. From what is now a hugely broad field, performances I have witnessed recently that seem most relevant, either through their use of physical drama or a particular technology, include Chikashi Miyama ("Angry Sparrow"), in his dazzling, virtuosic and humorous performance on a self-made interface [1], and Derek Holzer [2], whose optical discs are attached to spinning motors and thrust under an overhead projector for instant, rough-and-ready multimedia effect.

Relevant sensor performances include Atau Tanaka (Sensors_Sonics_Sights, or SSS) [3] and Benjamin Knapp [4]; both use the BioMuse sensor system, which was invented by Knapp and Hugh Lusted [5]. Tanaka and Knapp present a fascinating contrast, as they use the same system to quite opposite ends: Tanaka is a highly gestural, physically active and expressive performer, while Knapp performs seated and—using sensors including EEG and galvanic skin response—plays with emotional readings, generating music from a quite inward control of his internal self. At MIT, Elena Jessop developed a beautifully intuitive glove [6], which enables her, for example, to grab notes seemingly from her mouth and lengthen them by pulling away from the face smoothly.

The individuality of each of these performers only strengthens the case that improvisation is not only a way of generating music but also the key to inventing and learning a host of new instruments, interfaces or systems of interaction.

Physicality

The consideration of physicality is one of the key aspects in creating instrumental performances designed to give control of the electronics to the performer. Several texts affirm the importance of the physicality inherently learnt and absorbed as part of instrumental study. John Richards's article "Lost and Found" [7] has an array of excellent quotations, the most succinct of which is Bob Ostertag's Human Bodies, Computer Music:
"An intelligence and creativity is actually written into the artist's muscle and bones and blood and skin and hair" [8]. O'Modhrain and Essl make a similarly poetic, yet astute, point: "Implicit in the experienced musician's understanding of the relationship between action and sound, between performance gesture and musical phrase, is a more or less complex internal representation of the dynamics of an instrument" [9]. This notion that there is an imprint of the performer's instrument within the performer's body is vital when considering why improvisation might be necessary in generating the language and parameters for performing interactive instrumental music.

To improvise means at one level to follow one's instinctive urges, the internal reactions and responses that could be referred to as pre-analysis in the performer's own cognitive process. If this is then paired with a physical internal awareness of the capabilities of one's setup or instrument, then the responses will logically be faster and more innate, intuitive and highly responsive than if one or another is non-instinctive. If the performer can intuitively know the edges of physical possibility for the sensors—where the highest and lowest readings are found, for example—then the manipulation of these will be managed most deftly.

Instrumentalists have finely tuned systems of tactile or physical feedback (both external, when touching keys etc., and internal—knowing when or how to relax when playing fast, for example) and in working with interactive technology, muscle memory gets built up in a similar way. Also crucial to this discussion: When creating new composed pieces with an interactive setup, to have the performer improvise with the technology means to unlock this inner physical language, to find both what is possible and natural and also what is unnatural, or outside of the natural body language: "the spaces between pianism," in my case. This then allows for the fundamental aesthetic judgment of whether to make the interactive control in addition to, or part of, the instrumental playing.

Fig. 1. Examples of Tremblay's score. (© Pierre Alexandre Tremblay)



Fig. 2. Tremblay's schema: piano, computer and public linked by 2 audio feeds (1 omni microphone + 1 magnetic pickup), 2 data feeds (1 control pedal + 1 MIDI note stream) and 6 independent audio outputs (L, Lc, Rc, R, Lu, Ru). (© Pierre Alexandre Tremblay)

How the use of physicality may change the original gesture also needs consideration. The writings of Wanderley and Cadoz [10] on this topic are well known; their discussion is furthered by Wanderley and Miranda's extensive 2006 study of new instruments:

The instrumental gesture . . . is applied to a concrete (material) object with which there is physical interaction; specific (physical) phenomena are produced during a physical interaction whose forms and dynamics can be mastered by the subject. These phenomena may become the support for communicational messages and/or be the basis for the production of a material action [11].

The consideration of how adding a sensor to a pianist's arm may affect both the pianist's and the audience's relationships to the original semiotic function of the gesture was one of the main questions resulting from work on Case Study 3. Although not a central issue for this article, I briefly illustrate the problem I found here. Imagine the pianist lifting the arm away from the keyboard, perhaps signifying a breath between musical phrases. When using this gesture to generate data and, in turn, process sound, I found that I would focus on playing the sensors, thereby turning the previously nearly subconscious movement into a material action. As a solo performer is only one body, one mind, these cycles of complexity and confusion perhaps begin to disrupt the artistic spontaneity and intuitive physical sense, potentially undermining the original meaning of the gesture.

Case Study 1

Richard Barrett's Adrift (2007) was commissioned as part of my first Arts and Humanities Research Council (AHRC)–funded project in 2007 [12], which sought to increase the repertoire for piano and live electronics. I had already performed Barrett's Lost, and this became the foundation for Adrift [13]. Essentially, Adrift amplified Lost into a semi-improvised duet for Barrett and me (with Barrett playing his keyboard system using STEIM's LiSa program). He began the compositional process by recording my performance of Lost and chopping it into upwards of 70 sections. These were reordered and gradually shifted in pitch, the first ones very slightly, increasing as the piece progressed. The degree of other processes (filtering, short delays and feedback) also generally increased, so that the result gradually diverged in pitch and timbre from the original.

Our pre-made parts (my part: performing the Lost score, and Barrett's part: playing the new recorded version) were now a basis for Adrift, a consistent continuation (into the real time of performance) of the compositional process that gives Lost its particular structure: taking basic material and interpolating more and more inserts into it until the original material becomes almost literally lost in its own extrapolations, distortions, reflections, etc. Either performer in Adrift could interrupt her/his given part at any time and interpolate an improvised passage before continuing from the same point where he/she left off (as if using a pause button).

What was fascinating was how, having ingested Lost through hours of practice, I found myself quite naturally and subconsciously improvising in Barrett's compositional language. Helped by having rehearsed in close proximity over several sessions and having witnessed several performances by Barrett in his groups FURT and fORCH [14], I effectively internalized the physical language that accompanies his music. In the live performance [15], Barrett sat at the other end of the piano, facing me, in the position of a second pianist in a two-piano work, and the amplification was local (a stereo pair was placed at either end of the piano). Thus, both of us performed toward each other, creating an intimate mirror image.

Barrett's experience of improvisation within the compositional process has constantly evolved; he wrote the following to me after our collaboration:
The more I attempt to define what improvisation is, the more it seems to slip through the fingers; if, for example, it is defined as those aspects of musical creation which are spontaneous or unplanned, you run into difficulties. So instead, I prefer to think of "composition" as defining the act of bringing music into being, and "improvisation" as one element among various means by which that might be brought about. Thus, it isn't really a matter of bringing "improvisation" and "composition" together, which at first I thought it was: it is more a question of realizing that they aren't really two different things [16].

It is interesting to note that as the pianist, I was able to move from studying Barrett's compositional language to improvising something comparable, while as composer Barrett used improvisation to seek out his compositional language. This cyclical process informed the project and helped to create a fine balance between freedom and a stylized, consistent musical language. Although the interpreter was relatively free, the composer's voice in fact infused all aspects of the music.

Case Study 2

Pierre Alexandre Tremblay's Un clou, son marteau et le béton (2008) [17] illustrates the use of improvisation in generating material, finding a common and genuinely cumulative language between composer and performer, and creating a long piece of music (around 22 minutes), which is rigorously composed, yet with only approximately 36 bars of music written on a stave. I had heard La Rage, Tremblay's 50-minute suite for free-jazz drummer and electronics, showing Tremblay's ability to frame a multi-dimensional performer/machine interaction, combining composition and improvisation. Tremblay and I used improvisation from the outset, improvising together at first to get to know each other as musicians, with Tremblay on laptop and bass guitar. We then began the piece using bare-boned notation that I improvised upon to test musical gestures and specific real-time processing. Tremblay also asked me to improvise freely within some settings: over a fixed electronic part, or within some real-time processing that I would subvert with my own musical inputs. We recorded the results to use as triggers for the next session. Tremblay created notation (mixtures of text, conventional and guided improvisation) to show how he would interpret what I had played, which allowed me to feed back on what was communicated to me, building up the most efficient or relevant language for the piece (Fig. 1). This process enabled us to understand each other's perceptive interpretations of symbol, word and sound.

We had several sessions like this, gradually obtaining passages that we could easily re-create through notational shorthand and that could also be understood by someone else coming fresh to the score. Sections of the piece used systematic processes, thereby limiting the need for notation; one example is a section in which the computer and I built up an intensifying call and response, with the computer taking my notes and reordering and speeding them up, and I then imitating its rhythmic profile with new pitches. Other sections used a bias from the background to direct the improvisation subliminally: for instance, a relatively free section with a given fixed electronic part allows imposition of a target on an improviser without explicitly giving musical instructions.
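The reordering-and-acceleration at the heart of that call-and-response can be pictured with a short sketch. The Python below is illustrative only, not Tremblay's patch: the phrase format of (MIDI pitch, duration) pairs, the function name and the speed factor are assumptions made for the example.

```python
import random

def machine_response(phrase, speed_factor=1.5, seed=None):
    """Illustrative reply generator: reorder a captured phrase and speed it up.

    `phrase` is a list of (midi_pitch, duration_in_seconds) pairs, a hypothetical
    stand-in for the notes the patch has just recorded from the pianist.
    """
    rng = random.Random(seed)
    reordered = phrase[:]          # copy so the original phrase is untouched
    rng.shuffle(reordered)         # re-order the pianist's notes
    # compress the durations so the reply comes back faster than the original
    return [(pitch, dur / speed_factor) for pitch, dur in reordered]

# Example: a short captured phrase and the machine's faster, reshuffled answer
captured = [(60, 0.50), (63, 0.25), (67, 0.25), (70, 1.00)]
print(machine_response(captured, speed_factor=2.0, seed=1))
```

The performer then answers the rhythmic profile of that reply with new pitches, which is what drives the intensifying exchange described above.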

Tremblay then wrote the piece, using either audio control signals (certain audible pitches or the creation or absence of sound) or direct inputs (we used the PianoBar—a MIDI device placed over the keys of the piano, reading the pitch and velocity of each key when played) (Fig. 2). Because the piece used different input messages (for example, depressing keys or making vocal noises) to the patch at different points in the piece, it felt highly responsive in performance: It constantly shifted my attention, and the process thus felt akin to performing with other live musicians. Indeed, the resulting piece I found to be such a detailed web of interactivity, with many subtle changes of technological response, that it felt very much as if I were improvising with Tremblay himself: The "performance environment" engaged me in a living, breathing way. What fascinates me is how strict yet supple the piece is, and I conclude that using improvisation as a methodology enabled this commonality to thrive.
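The cue-dependent listening described above, where the patch responds to different input messages at different points in the piece, can be sketched as a simple router. Everything here is hypothetical (the cue names, event types and handler functions are invented for illustration); it is not a reconstruction of the actual patch.

```python
# A hypothetical cue list: at each point in the piece the patch listens to a
# different input (PianoBar MIDI notes, pedal data or features of the audio feed).
CUE_ROUTING = {
    "cue_1": "midi_note",      # e.g. depressing particular keys drives the patch
    "cue_2": "audio_onset",    # e.g. vocal noises, or the absence of sound
    "cue_3": "pedal",          # control-pedal data
}

def route_event(current_cue, event_type, event_data, handlers):
    """Pass an incoming event to its handler only if the current cue listens for it."""
    if CUE_ROUTING.get(current_cue) == event_type:
        return handlers[event_type](event_data)
    return None  # ignored: this cue is not listening for that kind of input

handlers = {
    "midi_note": lambda data: f"process pitch {data['pitch']} velocity {data['velocity']}",
    "audio_onset": lambda data: f"trigger process at level {data['level']}",
    "pedal": lambda data: f"set mix to {data['value']}",
}

print(route_event("cue_1", "midi_note", {"pitch": 60, "velocity": 90}, handlers))
print(route_event("cue_1", "audio_onset", {"level": 0.8}, handlers))  # None: ignored
```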

Case Study 3

Atau Tanaka's Suspensions [18] shows how improvisation was used to generate the grammatical or theatrical language for an interactive system. We used one EMG sensor (reading electrical currents created by muscle contraction) on each arm (on the forearm extensor muscles) (Fig. 3) and a double-axis accelerometer on the right wrist. We set out to create a piece that would combine my natural gestural/physical/emotional approach to the piano—including improvisation in the final performance—with Tanaka's compositional language and detailed and practical knowledge of the sensors. Although we used his physical performance language as a basis for my learning to "play" the sensors (i.e. his gestural shapes and tricks to create the right readings), we allowed room, using improvisation, for my own performative language to develop.

Fig. 3. Atau Tanaka wearing the EMG sensors in the positions we used. (Photo © Sarah Nicolls)

After generating some initial motivic and textural musical ideas through improvisation, I further improvised upon these while wearing the sensors to see what kind of data they would generate. In two or three initial sessions together, I took the sensors and began to understand what they did, by simply making gestures and watching their resultant data in a patch. This soon became too limited, so we made a frame patch with which I could practice using sonic feedback. From this point, I practiced purely with the sensors—using my arms in mid-air to find the thresholds and to find gestures that produced the right amount of muscle tension and to begin to memorize where and how I needed to be to create useful signals—internalizing the "instrument" of the sensors.
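The practice routine described above, finding where the highest and lowest readings lie and learning to reproduce them, amounts to an informal calibration. A minimal sketch of that idea follows, assuming an EMG-style stream of numeric tension readings; the values and the 0.7 threshold are placeholders, not data from the project.

```python
class SensorRange:
    """Track the observed minimum and maximum of a sensor and normalise readings.

    This mirrors the rehearsal task described above: finding where the highest
    and lowest readings lie so that mid-air gestures can be reproduced reliably.
    Values are hypothetical; a real stream would arrive from the sensor hardware.
    """

    def __init__(self):
        self.low = float("inf")
        self.high = float("-inf")

    def update(self, reading):
        self.low = min(self.low, reading)
        self.high = max(self.high, reading)

    def normalised(self, reading):
        if self.high <= self.low:
            return 0.0
        span = self.high - self.low
        return max(0.0, min(1.0, (reading - self.low) / span))

# Practice pass: feed a stream of (hypothetical) muscle-tension readings,
# then use the learned range to decide whether a gesture crosses a threshold.
arm = SensorRange()
for reading in [12, 40, 95, 230, 180, 60]:
    arm.update(reading)

THRESHOLD = 0.7  # normalised tension above which a sound would be triggered
print(arm.normalised(200) > THRESHOLD)  # True: strong enough to trigger
print(arm.normalised(50) > THRESHOLD)   # False: below the learned threshold
```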


Returning to the piano was difficult after this extended period of learning the sensors. Space here is not sufficient to go into more detail about the process, but after giving a work-in-progress showing to a theater-trained audience I worked to restore the relationship of sensor use to pianistic gesture. The tangent of performing without the piano in a theatrical setting, however, did give useful insights into meanings attached to gestures and also raised questions about whether or not it was desirable to reveal the technology [19] and how much I wanted the audience to understand the connections between gesture and sound. Again, Tanaka and Knapp serve as useful ends of the spectrum in this case, with Knapp wearing his sensors hidden beneath a suit jacket while Tanaka wears his much more openly, almost reveling in them.

For me, the single most successful moment in the proceedings took place when improvising with a sampled chord "in my arm" (i.e. an EMG sensor on my arm was mapped to a sample). I sat at the piano and, when approaching the keyboard, increased the tension in my arm. I began triggering the very, very beginnings of the sound—much like the sounds of breathing or bowing before a note speaks on a string or wind instrument—however, instead of then allowing the actual pitched sound to come out of the computer, I instead played the same chord on the piano. This moment encapsulated for me a genuinely new approach to the piano—one that could only be enabled by this technology—creating a fascinating and intimate space in which to explore the relationships of the sensors to the pianistic performance.
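One way to picture the "chord in my arm" mapping is as a tension-controlled window onto a sample: low tension releases only the breathy first instants, and only full tension would let the pitched body of the sound through. The sketch below is an approximation under that assumption and does not reproduce the mapping actually used in Suspensions.

```python
def playback_window(tension, attack_portion=0.05):
    """Map normalised muscle tension (0 to 1) to how much of a sample may sound.

    Low tension exposes only the attack of the sample (the first few per cent,
    the "breath before the note"); full tension releases the whole pitched sound.
    The numbers are illustrative, not taken from the piece.
    """
    tension = max(0.0, min(1.0, tension))
    if tension < 0.8:
        # scale within the attack portion only
        return (tension / 0.8) * attack_portion
    # beyond the threshold, open up the rest of the sample
    return attack_portion + ((tension - 0.8) / 0.2) * (1.0 - attack_portion)

for t in (0.1, 0.5, 0.79, 0.9, 1.0):
    print(f"tension {t:.2f} -> play first {playback_window(t):.0%} of the sample")
```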
This scenario might never have been realized without improvisation, as it was the combination of Tanaka and his assistant's ideas and computer expertise and my own pianistic approach (again, referring back to earlier descriptions of the internalized imprint of the piano inside my own body) that gave life to this idea. This, I think, is the nub of why improvisation is such a useful tool: It allows the performer to be responsive to the moment, to the environment and to the accidents and discoveries that we intuitively find.

Case Study 4

In collaborating with the Centre for Digital Music [20] I sought to create an interactive instrumental performance, with flexible performer-computer interaction that would produce live generative computer algorithms and give the player both significant control and room to be surprised by a computer's responses. We hoped to answer some of the challenges discussed as far back as 1973 by Cornock and Edmonds [21], creating a circular performative feedback loop to make the relationship between algorithms and the physicality of the performance seamless and meaningful. We wanted the performer to provide input to generative algorithms—responding to and modifying them in real-time—and simultaneously make an engaging spectacle for the audience.

Methodology and Collaborative Process

For this project, I invented PianoLab: a research space that would allow for real-time, genuinely collaborative, evolutionary research. Crucially, it placed a piano at the heart of the research environment and provided room to build electronics and house several computer stations. As it was the first PianoLab project, its methods of research and implementation were developed as we worked, to solve the balance between what was practical or possible artistically and technically.

The team of four technologists and I worked for an intense week of iterative prototyping. Working in the same room, with ongoing experimentation as part of the development process, we designed, implemented and tested the technologies with active feedback from me. Improvisation formed the bedrock for our research, as the key focus was always how the technology could be used to allow the performer to contribute to live algorithms, manipulate the feedback and create an engaging spectacle.

Compositional Process

In an early brainstorming session we came up with the idea of using a hat to house a Bluetooth triple-axis accelerometer (iCube Gforce 3D-3 v1.1). This would give us an easily identifiable and highly performative input mechanism. I then asked the technologists to show me their current work, to get an idea of what might be possible in the time we had (approximately 2 months to the final performance, with 3 weeks allotted collaborative working time).

The piece evolved as a section-by-section improvisation; I would practice with pre-built systems or patches and suggest ideas that could be created immediately. The use of improvisation was a vital mechanism for me to understand what the pre-existing software systems did and how they might interact with the live, acoustic piano sound. Having decided upon the hat as a major input device, I also wore it while exploring different pianistic textures and simultaneously seeking out a gestural language with my head.

Our final output was a 20-minute work for grand piano, electronic sound and mechanical devices created using a MIDI controller and pedal, nine DC motors and the top hat (Fig. 4) (the performance can be viewed on-line [22]). The piano was placed in the middle of a quadraphonic speaker system, using contact microphones to avoid feedback.

Fig. 4. Practicing for the collaboration with Centre for Digital Music. (Photo © Sarah Nicolls)

Technology Notes

The sensors in the hat triggered an algorithm, which we developed to map the tilt of the accelerometer to a 2D parameter space with x and y mapped to the pitch and temporal dispersion parameters of a granular synthesis effect (Fig. 5).

Fig. 5. Schema for Case Study 4 resulting piece. (© Adam Stark)
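A minimal sketch of the tilt-to-parameter mapping described above, assuming the hat's accelerometer reports x, y and z acceleration in g and that the granular engine expects a pitch ratio and a temporal-dispersion time. The parameter ranges are placeholders rather than the values used in the piece.

```python
import math

def tilt_to_grain_params(ax, ay, az, pitch_range=(0.5, 2.0), dispersion_range=(0.0, 0.25)):
    """Map accelerometer tilt to granular-synthesis pitch and temporal dispersion.

    ax, ay, az are accelerations in g from the hat-mounted sensor. Tilt about
    each axis is estimated from the gravity vector and scaled into two ranges.
    """
    # tilt angles in the range -pi/2..pi/2
    tilt_x = math.atan2(ax, math.sqrt(ay * ay + az * az))
    tilt_y = math.atan2(ay, math.sqrt(ax * ax + az * az))

    def scale(angle, out_lo, out_hi):
        norm = (angle + math.pi / 2) / math.pi      # 0..1
        return out_lo + norm * (out_hi - out_lo)

    pitch = scale(tilt_x, *pitch_range)             # x axis -> grain pitch ratio
    dispersion = scale(tilt_y, *dispersion_range)   # y axis -> temporal dispersion (s)
    return pitch, dispersion

# Head tipped forward slightly, level side-to-side:
print(tilt_to_grain_params(0.3, 0.0, 0.95))
```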
For the looping patch, analysis of the spectral range of the live piano audio was used as an onset and offset detector to trigger suitable start and end points for the loop. These loops were continuously stored as indices of an audio buffer with a memory of 1 minute. New onset events triggered the playing of previously recorded loops in a stochastic manner. The system was designed so that the performer would be able to fix the loop being used so that it could provide a repetitive background for further improvisation.
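The looping logic can be sketched as follows, with onset and offset times assumed to arrive from an external detector and the one-minute memory modelled as a list of loop boundaries. This is a schematic of the behaviour described above, not the group's implementation.

```python
import random

class LoopMemory:
    """Store detected loop boundaries over a rolling one-minute window and
    play them back stochastically, with the option of fixing one loop."""

    def __init__(self, memory_seconds=60.0):
        self.memory_seconds = memory_seconds
        self.loops = []          # list of (start_time, end_time) pairs
        self.fixed = None        # a loop the performer has chosen to hold

    def on_offset(self, start_time, end_time, now):
        """Register a loop whose boundaries came from the onset/offset detector."""
        self.loops.append((start_time, end_time))
        # forget anything that started more than a minute ago
        self.loops = [l for l in self.loops if now - l[0] <= self.memory_seconds]

    def on_onset(self):
        """A new onset triggers playback of a previously recorded loop."""
        if self.fixed is not None:
            return self.fixed
        return random.choice(self.loops) if self.loops else None

    def fix_current(self, loop):
        """Hold one loop as a repetitive background for further improvisation."""
        self.fixed = loop

memory = LoopMemory()
memory.on_offset(10.0, 12.5, now=13.0)
memory.on_offset(20.0, 21.0, now=21.5)
print(memory.on_onset())          # a stochastically chosen stored loop
memory.fix_current((10.0, 12.5))  # performer fixes the first loop
print(memory.on_onset())          # always (10.0, 12.5) until released
```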
The rhythmic patterns used by the motors and piano samples were generated using Markov chains organized by varying degrees of predictability, selectable by the performer using a MIDI controller.
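One plausible reading of "varying degrees of predictability" is an interpolation between a weighted transition table and a uniform distribution, with the amount of interpolation set by a MIDI controller value. The transition table and the 0 to 127 scaling below are assumptions made for the sketch.

```python
import random

# Hypothetical first-order transition table over inter-onset intervals (in beats).
TRANSITIONS = {
    0.5: {0.5: 0.7, 1.0: 0.2, 2.0: 0.1},
    1.0: {0.5: 0.3, 1.0: 0.5, 2.0: 0.2},
    2.0: {0.5: 0.4, 1.0: 0.4, 2.0: 0.2},
}

def next_interval(current, controller_value, rng=random):
    """Choose the next rhythmic value from a Markov chain whose predictability
    is set by a MIDI controller (0 = strongly weighted, 127 = uniform/unpredictable)."""
    weights = TRANSITIONS[current]
    blend = controller_value / 127.0          # 0..1, amount of randomness
    uniform = 1.0 / len(weights)
    options = list(weights)
    blended = [(1.0 - blend) * weights[o] + blend * uniform for o in options]
    return rng.choices(options, weights=blended, k=1)[0]

# Generate a short rhythmic pattern at a fairly predictable setting
pattern, value = [], 1.0
for _ in range(8):
    value = next_interval(value, controller_value=10)
    pattern.append(value)
print(pattern)
```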



Use of Performance as a Research Method

Prior to the final performance, we held two pilot performances to further develop the piece, the first of which took place at the end of the first week of collaboration. Limited by the then-current state of the technology, this performance was incredibly informative for us, providing instant feedback.

For the programmers the performance provided the opportunity to assess the usability of the technology in the context of a unified musical piece, observing how well the necessary transitions between elements of the system worked in the pressure of a real-time environment, and whether this integration of technology with musical performance succeeded aesthetically. I made the following crucial discovery: If the technology is already complex, then the actual musical substance can be more direct or simplified without lowering the overall complexity of the artistry.

This question of where complexity resides is also a powerful design point: If one desires the gross result to be balanced, then cases where more complexity occurs at the interface level should be balanced by less complexity in the detailed physical control: that is, something with a lot of buttons or faders that need to be used frequently might not be best paired with detailed muscle tension control. Similarly, if balance is sought between predictability and unpredictability, then placing features with results that cannot be predicted in certain "areas" of one's instrument can create useful creative springboards.

I also made discoveries about gestural control of algorithms while playing, in particular related to the advantages of the sensors placed in the hat: They afforded physically communicative control without impinging on or affecting the piano playing itself and thus were easily isolated from the other elements of the music at any point. Feedback from the audience was also immensely valuable and was acted on in the following performance: The sensors used in the first performance were placed in an inconspicuous hat, and we noticed that several audience members did not make the connection between the gestures and the sound manipulations, as they seemed intrinsic to the performance. As a result, the sensors were placed in a top hat for the next performance, which the audience found very straightforward.

Reflections from the Technologist(s)

The experience of working toward specific artistic goals, as opposed to scientific ones, was both a novel and a rewarding experience. With the focus on workable real-time implementations, while demanding the technology produce a subjectively interesting aesthetic, the process led to extended discussion and large output from all involved. In comparison to work developed in a laboratory, the instant feedback from the pianist allowed quick identification of creative dead ends and forced the focus upon the most interesting ideas, with little misunderstanding. Development "onsite," with the ability to immediately test ideas, led to the elimination of erratic technological behavior and a convergence toward the technologies of the final piece.

Reflections from the Performer

In developing interactive performance, the complexity of the potential relationships between gesture and sound is much better expressed in real-time demonstration than in remote conversations or, worse, written debate. For choosing the correct input signal, guiding it through a relevant process and producing a meaningful output, I found the laboratory method to be extremely liberating and time-saving.

It was interesting to note the different approaches of artist and scientist in this context—the desire for immediate visible or tangible elements with which to play, measured against a detailed, thorough and quite time-consuming process of making the technology do what we wanted. Michael Zbyszyński [23] made reference to these parallel demands when discussing a similar project with Frances-Marie Uitti at the Center for New Music and Audio Technologies and the need for technologists to work much faster and on-the-fly than would be appropriate for securing technology for an actual performance. In reality, the balance between time in the laboratory and the time apart worked best for technological and musical development.

Some points I had discovered previously were reinforced—for example, that improvising with algorithmic systems can stimulate greater artistic freedom or range. Other discoveries were new; one of the most important of these was about the actual language of music in interactive music. It is a potential equation: If the understanding of the interactivity in the audience's perception lends greater weight to the performative communication, then demonstrative and simply made gestures or musical material can carry the message most effectively.

Overall, the project served to highlight the need for and benefit of this kind of collaborative lab-based work, especially when dealing with interactive technology.

Current Developments

At the time of writing I have begun experimenting with a purely live sampling scenario, where I can grab the sounds I am currently playing by reaching into a particular point in the air above the keyboard and then manipulate these with different gestures. This research is being undertaken with Nick Gillian at the Sonic Arts Research Centre, Belfast, using a Polhemus magnetic tracking device [24]. Setting up PianoLab as a permanent space is a long-term goal, and over the next 3 years we will also be developing further prototypes of the new piano (Fig. 6) [25]. To complete the circle, this itself was the result of improvisation: during the first PianoLab, I dismantled a piano, hanging the soundboard from the ceiling; while it hung there, I began to imagine re-attaching a keyboard to it, to create a new spatial relationship between keyboard and strings.

Fig. 6. First prototype of Nicolls's piano. (Photo © Sarah Nicolls)

Acknowledgments

The HCI:UK 2008 collaboration was made possible by a (re)Actor3 Artist in Residence Commission, sponsored by the Centre for Digital Music, Queen Mary, University of London, and produced by BigDog Interactive Ltd. My initial research into interactivity was funded by the Arts and Humanities Research Council; the development of PianoLab was enabled by a Brunel Research Innovation and Enterprise Fund. I am grateful for further PianoLabs, including those at CNMAT, Newcastle University and the University of Cincinnati, and for input from Pierre Alexandre Tremblay for Case Study 2.

References and Notes

1. Chikashi Miyama (Angry Sparrow) and Ben Knapp (with Eric Lyon, Gascia Ouzounian), NIME 2009, Concert 2, Friday, 5 June 2009. See <nime2009.org/concert.pdf> (accessed 28 December 2009).

2. Derek Holzer at noise=noise, Goldsmiths, University of London, 17 February 2009, curated and organized by Ryan Jordan.

3. SSS (Atau Tanaka, Cecile Babiole, Laurent Dailleau), stitched-up, sk-interfaces closing event, FACT (Foundation for Art and Creative Technology), Liverpool, 29 March 2008. See <www.fact.co.uk>.

4. Miyama and Knapp [1].

5. See <o-art.org/history/Computer/CCRMA/CCRMAres.html> (accessed 28 December 2009).

6. E. Jessop, "The Vocal Augmentation and Manipulation Prosthesis (VAMP): A Conducting-Based Gestural Controller for Vocal Performance," demonstration, NIME, Pittsburgh, 2009.

7. J. Richards, "Lost and Found: The Mincer," Leonardo Electronic Almanac 15, Nos. 11–12 (2008). Available at <leoalmanac.org/journal/Vol_15/lea_v15_n11_12/JRichards.asp>.

8. Bob Ostertag, "Human Bodies, Computer Music," Leonardo Music Journal 12 (2002) p. 11.

9. G. Essl and S. O'Modhrain, "Enaction in the Context of Musical Performance," Interdisciplines virtual workshop (by participants in Enactive Interfaces Network) (2004) p. 1. Available at <www.interdisciplines.org/enaction/papers/17>.

10. C. Cadoz and M. Wanderley, "Gesture—Music," in M. Battier and M. Wanderley, eds., Trends in Gestural Control of Music (Paris: Editions IRCAM, 2000) pp. 71–93.

11. E. Miranda and M. Wanderley, New Digital Musical Instruments: Control and Interaction beyond the Keyboard (Wisconsin: A-R Editions, 2006) p. 10.

12. Arts and Humanities Research Council project, May–December 2007.

13. Richard Barrett, Adrift, PSI 09.10 CD. Recorded live at The Warehouse, London, 29 November 2007. Available at <www.squidco.com/miva/merchant.mv?Screen=PROD&Store_Code=S&Product_Code=12431>.

14. Performances: FURT, Red Rose pub, Finsbury Park, London, spring 2007; and fORCH, Spitalfields Festival, summer 2007. See <furtlogic.com>.

15. See <web.mac.com/sarahnicolls/research/*Barrett.html>.

16. Richard Barrett, personal e-mail, December 2009.

17. First performance, University of Huddersfield, U.K., 6 March 2009.

18. First performance, Huddersfield Contemporary Music Festival, Huddersfield, U.K., 21 November 2009. See <www.hcmf.co.uk>.

19. Whether to "reveal the magic and do the trick anyway"—Augusto Corrieri summing up our process for Soundwaves Festival 2007 <web.mac.com/sarahnicolls/research/*Augusto_Corrieri.html>.



20. Centre for Digital Music at Queen Mary, University of London. See <www.elec.qmul.ac.uk>. Click on Research/Centre for Digital Music.

21. S. Cornock and E. Edmonds, "The Creative Process Where the Artist Is Amplified or Superseded by the Computer," Leonardo 6, No. 1 (1973) pp. 11–16.

22. See <web.mac.com/sarahnicolls/research/*machines.html>.

23. M. Zbyszyński, "Augmenting the Cello" (NIME, Paris, 2006).

24. See experiment at <www.youtube.com/watch?v=EA90JC9PUKg>.

25. Sunday Lunch Club series organized by Prototype Theatre <www.proto-type.org>.

Discography

The piano music of Niccolo Castiglioni on Metier <www.amazon.com/s?ie=UTF8&field-artist=Niccolo%20Castiglioni&rh=n:5174,p_32:Niccolo%20Castiglioni&page=1>.

Alexander's Annexe, Push Door To Exit, on Warp Records <http://warp.net/records/releases/alexanders-annexe/push-door-to-exit>.

Richard Barrett, Adrift, on psi records <www.emanemdisc.com/psi09.html>.

Michael Edwards, I kill by proxy, on sumtone records <www.sumtone.com/recording.php?id=31>.

Manuscript received 1 January 2010.

Sarah Nicolls is a pianist specializing in contemporary music and live electronics, regularly performing concerti with the London Sinfonietta and featured on BBC Radio 3. Nicolls also plays in Alexander's Annexe (Warp Records). Nicolls is a Senior Lecturer in Music at Brunel University. See also <www.sarahnicolls.com>.



