
The Preservation, Emulation, Migration, and Virtualization of Live Electronics for Performing Arts: An Overview of Musical and Technical Issues
ALAIN BONARDI and JÉRÔME BARTHÉLEMY
IRCAM

Centers for electronic music creation now face serious difficulties in reperforming important works considered part of their repertoire. They have realized the fragility of works based on electronic modules, which are subject to technological change. Sustainability can be achieved in four different ways, examined in this article: preservation, emulation, migration, and virtualization. We discuss these issues, showing the first steps toward solutions at Ircam in the framework of the European project Caspar.
Categories and Subject Descriptors: H.3.1 [Information Storage and Retrieval]: Content Analysis and Indexing—Abstracting
methods; H.5.5 [Information Interfaces and Presentation]: Sound and Music Computing—Modeling, methodologies and
techniques
General Terms: Documentation, Reliability.
Additional Key Words and Phrases: Preservation, performing arts using electronics, reperformance
ACM Reference Format:
Bonardi, A. and Barthélemy, J. 2008. The preservation, emulation, migration, and virtualization of live electronics for performing
arts: An overview of musical and technical issues. ACM J. Comput. Cultur. Heritage 1, 1, Article 6 (June 2008), 16 pages. DOI =
10.1145/1367080.1367086 http://doi.acm.org/10.1145/1367080.1367086

1. INTRODUCTION
Many researchers, users, and composers, especially in electronic studios, have become aware of the
fragility of musical pieces using electronics [Teruggi 2001]. As Bernardini and Vidolin [2005] have
noted, the situation is quite paradoxical: “real-time/performed electro-acoustic music (also known as
live electro-acoustic music) is currently facing a serious sustainability problem: while its production is
indeed considered very recent from the music history point of view, several technological generations
and revolutions have gone by in the meantime”.
This sentence may be applied to all artistic works using electronics, not only to electro-acoustic
music. Correctly reperforming the most important pieces created in the studios of institutions has

This work was partially supported by the European Community under the Information Society Technologies (IST) program of
the 6th FP for RTD – project CASPAR, contract IST-033572.
The author is solely responsible for the content of this article. It does not represent the opinion of the European Community, and
the European Community is not responsible for any use that might be made of data appearing therein.
Authors’ address: A. Bonardi, J. Barthélemy, STMS, Ircam-CNRS, Institut de Recherche et de Coordination Acoustique/Musique
– 1, place Igor-Stravinsky – 75004 Paris, France; email: {alain.bonardi, jerome.barthelemy}@ircam.fr.
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided
that copies are not made or distributed for profit or direct commercial advantage and that copies show this notice on the first
page or initial screen of a display along with the full citation. Copyrights for components of this work owned by others than
ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, to redistribute
to lists, or to use any component of this work in other works requires prior specific permission and/or a fee. Permissions may be
requested from Publications Dept., ACM, Inc., 2 Penn Plaza, Suite 701, New York, NY 10121-0701 USA, fax +1 (212) 869-0481,
or permissions@acm.org.
© 2008 ACM 1556-4673/2008/06-ART6 $5.00 DOI 10.1145/1367080.1367086 http://doi.acm.org/10.1145/1367080.1367086
ACM Journal on Computing and Cultural Heritage, Vol. 1, No. 1, Article 6, Publication date: June 2008.

become crucial as they try to find a balance between the constitution of a repertoire and the promotion
of creation [Tiffon 2005; Roeder 2006].
Whereas the preservation of music has been studied and practiced for many years on a wide range of media, from manuscripts to instruments, improving the sustainability of live electronic pieces has
become a growing issue of late. Several recent publications show that many institutions are concerned. A
European project named Caspar1 (Cultural, Artistic, and Scientific Knowledge for Preservation, Access,
and Retrieval) was launched in 2006, bringing together 17 partners, including IRCAM, on the general topic of
preservation of digital data. This project intends to address three different communities by developing
three different testbeds: one for scientific knowledge, one for cultural heritage, and one for performing
arts.
This article presents an overview of musical and technical issues raised by the preservation of electronic modules for the performing arts. In the first part, we define the scope of our study and detail the materials involved and the underlying logic of performances. The second part is dedicated to the features of performance works that use electronics, from the point of view of preservation activities, and the implications regarding their fragility and sustainability. In the third part, we present the underlying paradigms of the electronic parts of this kind of creation. The fourth part gives an overview and classification of current activities: preservation, emulation, migration, and virtualization of electronic modules. The last part presents several leads for further developments in the field.

2. SCOPE OF THE STUDY


2.1 Definition
Electronic modules for performance are used in all staged arts that require an electronic device of any kind, either analog or digital, for various activities such as signal processing or symbolic calculation. Performing arts include, of course, music, but also dance, theatre, video, interactive installations, etc. They may or may not involve human performers.
The examples in Table I are classified according to two parameters:
—the importance of live processing software, from tape diffusion with no real-time capability, to instal-
lations where almost everything is digitally calculable, and
—the involved arts: starting from the kernel of electro-acoustic music and extending to all perform-
ing arts where live electronics is important, for instance, dance, opera, theatre, image generation,
installations, etc.

2.2 Performance Materials


Performances including electronics are based on a variety of materials. We will consider them through a single example: Jupiter, for flute and live electronics (created at Ircam in 1987) by composer Philippe Manoury. This piece was among the first to use real-time processing. It is a good example because it is still performed (not only at Ircam) twenty years after its creation, which means it has been migrated, ported, and adapted to various technical contexts. The materials are of three types:2 a priori texts, active modules, and technical documentation.
active modules, and technical documentation.
2.2.1 A Priori Texts. These include librettos, text fragments, scores, stage indications, etc. They may be prescriptive (e.g., sing with forte dynamics) or descriptive (e.g., move from the right to the left of the stage). They are the primary sources of the performers’ work.

1 The official Website of the project is http://www.casparpreserves.eu.


2 All schemata are extracted from the Mustica preservation database for IRCAM works (http://mustica.ircam.fr).

Table I.
Configuration | Example of Work
Tape diffusion | Gesang der Jünglinge by Karlheinz Stockhausen, opus 8, for electronic and concrete sounds (1955–1956)
Solo instrument and live electronics | Anthèmes II by Pierre Boulez, for violin and live electronics (1997)
Soloists, ensemble and live electronics | Répons by Pierre Boulez, for six soloists, chamber ensemble and live electronics (1981–1984)
Dance and music performance | L’écarlate, dance performance designed by choreographer Myriam Gourfink, music by Kasper Toeplitz (2001)
Opera with live sound transformations | K, music and text by Philippe Manoury (2001)
Theatre with live sound transformations | Le Privilège des Chemins, by Fernando Pessoa, stage direction by Eric Génovèse, sound transformations by Romain Kronenberg (2004)
Theatre and image generation | La traversée de la nuit, by Geneviève de Gaulle-Anthonioz, stage direction by Christine Zeppenfeld, real-time neural networks and multi-agent systems by Alain Bonardi (2003)
Musical and video performance | Sensors Sonic Sights (S.S.S.), music/gestures/images with Atau Tanaka, Laurent Dailleau and Cécile Babiole (performed since 2004)
Installation | Elle et la voix, virtual reality installation by Catherine Ikam and Louis-François Fléri, music by Pierre Charvet (2000)

For example, in Jupiter, Manoury has written a precise score that indicates both the instrumental part, in classical notation, and the behavior of the electronic modules, which is often described in musical terms. In this case, the composer delivers a kind of global score, trying to integrate all performing aspects within this document.
2.2.2 Active Modules. As either hardware (analog devices) or software (patches), active modules provide control and signal-processing functions. Musicians now make intensive use of graphical languages such as Max/MSP or PureData to implement the required processes. In these frameworks, patches are windows in which processes are represented by virtual boxes connected by wires; they are metaphors of the hardware devices used in physics labs. These tools enable real-time processing and rapid prototyping, making it easy to set up or dismantle a process.
For example, the boxes in the left column of Figure 1 are subpatches that combine into a kind of electronic orchestra: “reverb” is the reverberation module, “fshift” is the frequency shifter (a signal process that adds a certain frequency to, and subtracts it from, the fundamental of the input signal), and so on.
The horizontal series of figures (from 0 to 13) at the top of the patch are buttons allowing one to trigger
the beginning of the corresponding sections in Jupiter. Other boxes are faders to adjust sound levels
(input and output), start/stop buttons, etc.
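The frequency-shifting operation just described can be illustrated with a small NumPy sketch (our own illustration, not IRCAM’s implementation). A single-sideband shifter multiplies the analytic signal of the input by a complex exponential, moving every component up by a fixed number of hertz; a ring modulator would instead produce both the sum and the difference bands.

```python
import numpy as np

def analytic(x):
    # Analytic signal via FFT (a NumPy-only stand-in for scipy.signal.hilbert)
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1
    if N % 2 == 0:
        h[N // 2] = 1
        h[1:N // 2] = 2
    else:
        h[1:(N + 1) // 2] = 2
    return np.fft.ifft(X * h)

def freq_shift(x, shift_hz, sr):
    # Single-sideband shift: every spectral component moves up by shift_hz
    t = np.arange(len(x)) / sr
    return np.real(analytic(x) * np.exp(2j * np.pi * shift_hz * t))
```

Shifting a 440 Hz sine by 100 Hz yields a tone at 540 Hz, unlike simple pitch transposition, which would multiply all frequencies by a constant ratio.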
2.2.3 Technical Documentation. Technical documentation often includes schemata, instructions, drawings, test files, and the like. For example, Figure 2 illustrates the loudspeaker implementation and Figure 3 the audio-MIDI3 setup for Jupiter. This conceptual configuration will be adapted to the concert hall where Jupiter is to be reperformed; it should help audio engineers with the practical implementation of the loudspeakers.

2.3 Performance Logic


The logic of a performance is the knowledge needed to handle all materials in order to achieve the
actual performance. In the case of a work like Jupiter, the required knowledge is based on:

3 MIDI, Musical Instrument Digital Interface, http://www.midi.org.



Fig. 1. A screenshot of the main patch of Jupiter by Philippe Manoury (musical assistant Serge Lemouton).

—musical education, the ability to read and understand a musical score;


—signal-processing education, knowledge about such elementary signal transformations as ring mod-
ulation, sampling, etc.;
—sound engineering education, needed to understand the technical configuration schemata and imple-
ment them; and
—a global performance knowledge, enabling one to put together the previous competences.
Let us note how difficult it is to design a computerized model of this logic. One may record all input and output documents during the creation and performance processes. One may state ontologies of the various fields concerned (music, signal processing, sound engineering) and even a description of the reperformance activity. None of this will provide a complete formalization of the underlying logic.

3. FEATURES OF PERFORMANCE WORKS INCLUDING ELECTRONICS


3.1 Performance Works Based on Electronics are Fragile
Performance works based on electronics are very sensitive to various factors.
(1) Storage media wear out. CD-ROMs, USB keys, hard disks, DAT tapes, etc., are fragile storage means.
(2) The obsolescence of technical standards. End-user configuration patches using proprietary soft-
ware and/or hardware technologies and binary proprietary file formats are especially subject to
obsolescence.

Fig. 2. Loudspeaker implementation for Jupiter by Manoury.

(3) The lack of care in preserving the electronic parts. For many years, libraries have kept scores, but composers have been responsible for preserving their own electroacoustic or live-electronics parts, as explained by Hannah Bosma in the framework of the Donemus/Near institution [Bosma 2005]. This is one of the main reasons behind what Michael Longton found in a survey titled “Record Keeping Practices of Composers” [Longton 2004]: 47% of composers had lost files they considered valuable due to hardware or software obsolescence.
(4) The practical conditions of performance. Changing performing conditions may produce unexpected
results. For instance, performing with slightly different devices or in different places always requires
additional time for tests and often requires modifications.
(5) When applicable, the empirical making of patches. The complexity of patches sometimes makes them very unstable. For instance, in Max/MSP, it is well known that processing priority depends on the position of the objects; therefore, moving an object in a patcher may strongly modify the results produced. Figure 4 gives an overview of this complexity with a screenshot of a Max/MSP patch by Olivier Pasquet, musical assistant at IRCAM.

3.2 Curators and Designers of Performance Works Look for Sustainability


The worst situation for all these artistic works is the inability to reperform them (for various reasons)
after their creation. Therefore, concerned institutions and composers are interested in sustainability.
Paradoxically, this means preserving authenticity, while, at the same time, maintaining the possibilities
for evolution.

Fig. 3. Audio-MIDI setup for Jupiter by Manoury.


Fig. 4. A complex patch by Olivier Pasquet, musical assistant at IRCAM.

On the one hand, each time a work is played, performers face the issue of authenticity, trying to establish a kind of fidelity to the materials and to previous reference performances (perhaps belonging to opposing interpretation traditions), famous or not. This requires storing results as sound samples in order to keep a minimal memory of previous performances, but this may be insufficient to evaluate a reperformance. Indeed, authenticity criteria seem quite difficult to define, since composers have different conceptions of authenticity. Musical assistant Serge Lemouton reports how the same process of migrating Max/MSP patches (from the NeXT environment to the Macintosh), applied to several works at IRCAM, led to varied reactions: composer Philippe Manoury considered the new result too close to the original, while composer Michael Jarrell found it too far from it.
On the other hand, composers often want to modify their works, or performers intend to adapt a work to other configurations. Maintaining software patches is a very difficult task, since patches are not structured the way programs are; even slight modifications of patches a few months after completing them may become quite laborious. During an interview in September 2006, Andrew Gerzso, Pierre Boulez’s musical assistant, indicated to us that important patch revisions for technical

Fig. 5. Adding two signals, with a Max/MSP patch (on the left) or a PD patch (on the right).

reasons are opportunities for musical transformations. This was the case, for instance, with Anthèmes II
for violin and live electronics (1997) by Boulez, before its recording.

4. THE PARADIGMS OF ELECTRONICS FOR PERFORMANCE


Electronics for performance are designed according to two paradigms: signal processing and control engineering.
4.1 Signal Processing
One of the key points of electronic music is the deep influence of signal processing. In the past, composers would use analog devices (most often synthesizers) and physically connect modules together, the output of one linked to the input of another. Today the situation is much the same, except that nearly all artists use graphical languages for signal processing, which are based on the same paradigm: boxes that provide signal transformations (in a broad sense) are connected together.
Consider, for example, two patches achieving the same result, the first in Max/MSP, the second in PureData. The outputs of two oscillators, one at 100 Hz, the other at 200 Hz, are added, and the result is played. The oscillators (cycle∼ or osc∼) and the addition (+∼) or product (*∼) functions are presented as connected objects (Figure 5).
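The same two-oscillator example can be written out in a few lines of Python/NumPy, as a rough textual equivalent of the patches in Figure 5 (the comments refer to the corresponding Max/MSP box names):

```python
import numpy as np

SR = 44100                            # sample rate, Hz
DUR = 1.0                             # duration, seconds

t = np.arange(int(SR * DUR)) / SR
osc1 = np.sin(2 * np.pi * 100 * t)    # cycle~ 100
osc2 = np.sin(2 * np.pi * 200 * t)    # cycle~ 200
mix = 0.5 * (osc1 + osc2)             # +~ , scaled to avoid clipping at the dac~
```

The spectrum of `mix` contains exactly two components, at 100 Hz and 200 Hz, which is what the two patches compute sample by sample in real time.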
4.2 Control Interfaces
Control interfaces are based on two main elements. First, there are standard input devices from industry (microphones, MIDI keyboards, etc.), frequently complemented by specific processing that is generally implemented using the same processing engines as described before (e.g., Max/MSP/Jitter, PureData, Isadora). Second, there are specific control devices developed for particular purposes, such as implementing new forms of expressivity or easing control for the performer. In the field of music, the phenomenon has become so important that a yearly international conference, New Interfaces for Musical Expression (NIME4), is specifically devoted to this activity.
An example of such a specific device is the MIDI flute, an instrument developed specifically for certain musical works produced at Ircam, such as . . . Explosante-fixe. . . by Pierre Boulez (1991–1993), and the aforementioned Jupiter or La Partition du ciel et de l’enfer by Philippe Manoury (1989). The MIDI flute consists of a traditional flute to which numerous sensors are added in order to generate MIDI (Musical Instrument Digital Interface) standard events like those

4 NIME, International Conference on New Interfaces for Musical Expression, http://www.nime.org.



Fig. 6. A screenshot of a timeline in Max/MSP software.

generated by a MIDI keyboard. Musically speaking, the objective is to enable the performer to become independent of the electronic part: in a traditional music piece including electro-acoustics, the performer is the slave of the electro-acoustic part, constrained to follow the timing of events that are either prerecorded or strictly dependent on a time scale expressed in milliseconds. The aim is to allow the musical performer(s) to lead the electronic parts, either by specific gestures (e.g., MIDI pedal strokes) or through more elaborate automatic MIDI event-following algorithms. This situation is generally described as score following. Numerous similar devices have been developed, for example, the electronic violin by Max Mathews, or the SuperPolm by Suguru Goto.
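The behavior expected of a score-following module can be caricatured in a few lines: the follower advances through a list of expected pitches and fires the cue attached to each matched note. Real followers are far more robust (they tolerate wrong or missing notes and model timing); this sketch of our own only fixes the idea of the interface.

```python
class ScoreFollower:
    """Toy score follower: advance through an expected pitch sequence and
    trigger the cue attached to each matched note (a simplification; real
    followers tolerate errors and use timing models)."""

    def __init__(self, score):
        self.score = score      # list of (expected_pitch, cue) pairs
        self.pos = 0            # position in the score

    def on_note(self, pitch, tolerance=1):
        # Called for each detected note; returns a cue when the score advances
        if self.pos >= len(self.score):
            return None
        expected, cue = self.score[self.pos]
        if abs(pitch - expected) <= tolerance:
            self.pos += 1
            return cue
        return None             # unexpected note: ignore, do not advance

# Hypothetical score: MIDI pitches with section-trigger cues attached
follower = ScoreFollower([(60, 'section 1'), (64, None), (67, 'section 2')])
```

Feeding the follower the right notes in order triggers the section cues; a wrong note (here, anything more than a semitone away) leaves the position unchanged.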
The future of musical works using these devices is obviously extremely fragile, being tied to the future of the hardware itself. Emulating or migrating the device is not a trivial task. One can imagine a simple microphone coupled with a software system able to detect the pitch of the instrument and thus to send NoteOn/NoteOff events similar to those of MIDI. The first obvious problems are pitch detection, based on audio analysis, which is currently not completely reliable, and NoteOff event detection (the end of the note), which is reliable on a MIDI flute but not straightforward using audio analysis. Moreover, there is a delay in the calculation of pitch, which results in a delay in the triggering of events; this delay does not exist with electromechanical devices.
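A minimal sketch of this microphone-plus-software emulation, assuming a deliberately naive autocorrelation pitch detector, makes the triggering delay explicit: each event can only be emitted after a full analysis frame has been captured. All names here are illustrative, not an actual IRCAM tool.

```python
import numpy as np

def detect_pitch(frame, sr, fmin=80.0, fmax=1000.0):
    """Very naive autocorrelation pitch estimate; returns Hz, or None on silence."""
    frame = frame - frame.mean()
    if np.max(np.abs(frame)) < 1e-3:            # silence threshold -> no note
        return None
    corr = np.correlate(frame, frame, mode='full')[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)     # plausible period range
    lag = lo + np.argmax(corr[lo:hi])
    return sr / lag

def audio_to_events(signal, sr, frame_len=1024):
    """Turn audio into NoteOn/NoteOff-like (type, pitch, sample_time) events.
    Each event carries one full frame of latency, unlike a sensor-based flute."""
    events, sounding = [], False
    for start in range(0, len(signal) - frame_len, frame_len):
        f0 = detect_pitch(signal[start:start + frame_len], sr)
        if f0 and not sounding:
            events.append(('NoteOn', round(f0), start + frame_len))   # late!
            sounding = True
        elif f0 is None and sounding:
            events.append(('NoteOff', None, start + frame_len))
            sounding = False
    return events
```

Even in this toy version, the NoteOn for a note starting at sample 0 is only available at sample 1024, which is the structural latency the text describes.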
The optimal solution to this problem can only be imagined if a complete description of the events and of the resulting effects can be achieved, that is to say, through a complete virtualization of the underlying process.
More recently, the field of human-computer interfaces has further influenced developments in all graphical languages (Max/MSP/Jitter, PureData, etc.), which now include new objects for interface purposes (e.g., multiparameter graphical interfaces). Most of these interfaces are based on common graphical codes and on the general mouse-and-window paradigm, so their obsolescence is linked to the obsolescence of these paradigms and codes.

4.3 Difficulties
Electronic music’s close association with signal processing also has drawbacks.
Difficulties in handling time. Curiously, though patches are used for musical purposes, no coherent time paradigm is available, since patches were originally designed as transfer functions in the field of signal processing. For instance, in Max/MSP, the only way to specify events according to a predefined chronology is to use timelines and timed cue lists, interfaces that enable messages to be set and then sent at precise times (Figure 6). Time is treated here as in a classical sequencer. This means that temporalities such as chronological time, asynchronous time, measured time, or flexible time relations with events performed in real time can only be handled through ad hoc solutions, without any possibility of universal formalization.

Fig. 7. A screenshot of Jupiter page in Mustica database.

The restrictions due to patents. Real-time processing of an image or a sound relies heavily on the extraction of signal descriptors; there may be patents on some of these descriptors, making the situation more difficult to handle. Moreover, the patent owners are in a position to claim that the preservation process constitutes a form of reverse engineering, and therefore to dispute it.

5. CURRENT ACTIVITIES FOR THE MAINTENANCE OF PERFORMING WORKS


5.1 Preservation
Though the preservation of score manuscripts or old instruments is widely researched, the preservation of performing arts including electronics has only been studied for a few years. The Mustica research initiative, an international project led and coordinated by the University of Technology of Compiègne, built on a collaboration between two contemporary music institutions (the Institut de Recherche et Coordination Acoustique/Musique, IRCAM, and the Groupe de Recherches Musicales at the Institut National de l’Audiovisuel, INA), carrying out research on contemporary music preservation between 2003 and 2004 [Bachimont et al. 2003]. At the end of October 2006, the resulting database contained 54 references concerning IRCAM works. Figure 7 is a screenshot of the display for the documentation of Jupiter by Manoury. It provides access to many sources of information about the work:
—PDF documentation, generated on the fly: the main patch, the loudspeaker implementation, the
audio-MIDI setup;

—audio excerpts, generally in MP3 format;


—BRAHMS pages5 about Manoury and Jupiter;
—publications and CDs on sale.

5.2 Emulation
Emulation is certainly one of the most difficult approaches to reperformance. Bernardini and Vidolin [2005] quote the example of Stockhausen’s Oktophonie, which requires an ATARI-1040 ST computer that no longer exists. Atari emulators run on other computers, but nobody knows whether the Notator program used by Stockhausen will run on an emulator, though communities of users may be able to help. From this example, those authors insist on the power of communities versus the unreliability of companies.
The emulation of hardware by software devices is also difficult. Bullock and Coccioli [2005] provide an interesting example from composer Jonathan Harvey: his piece Madonna of Winter and Spring, for large orchestra and three synthesizers, requires a Yamaha TX816 for sound synthesis. The authors plan to emulate the synthesizer with PureData patches.

5.3 Migration Activity

Migration is the most widespread approach to achieving reperformance. Many composers have had their works transferred from one technical environment to another, and all institutions in the field of electronic arts face migration necessities. At IRCAM, important pieces using NeXT computers were moved to Macintosh machines at the end of the 1990s.
Migration can also be viewed as a new impetus for creation: composer and theoretician Jean-Claude Risset [Risset et al. 2002] created his work Resonant Sound Spaces with the help of Antonio de Sousa Dias, Daniel Arfib, Denis Lorrain, and Laurent Pottier. Previous work by de Sousa Dias [2003] was fundamental for Risset, especially his programming of harmonic structures with real-time tools.
Over the last few years, migration has integrated the idea of avoiding proprietary solutions. Bullock and Coccioli [2005] and, of course, Puckette [2004] have emphasized the necessity of standard and open formats for storing all files used in the process. For instance, they recommend that PureData and Max/MSP patches be stored in ASCII text format rather than as faster-loading binary files, that all documentation be written in plain text, and that audio contents be saved as basic WAV files.
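The recommendation to prefer text formats can be checked mechanically. The following heuristic is our own sketch, not a tool from the cited authors: it flags patch files that appear to have been saved in a binary format rather than as ASCII/UTF-8 text.

```python
def looks_like_text(data: bytes) -> bool:
    """Heuristic: text-format saves contain no NUL bytes and decode as UTF-8;
    binary saves typically fail one of these tests."""
    if b'\x00' in data:
        return False
    try:
        data.decode('utf-8')
    except UnicodeDecodeError:
        return False
    return True

def is_text_patch(path: str) -> bool:
    """Check the first 4 KiB of a patch file on disk."""
    with open(path, 'rb') as f:
        return looks_like_text(f.read(4096))
```

Run over an archive, such a check identifies which patches would need to be re-saved in text format before long-term storage; it is only a heuristic and says nothing about whether the patch will still load.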

5.4 Virtualization Activity


Virtualization means describing electronic modules using abstractions. At IRCAM, Andrew Gerzso has carried out important work aimed at finding representations of the signal-processing modules of Anthèmes 2, for violin and live electronics by Pierre Boulez, that are as independent as possible from any technical implementation. His work concerns the score as well as the technical documentation, both released by Universal Editions (see Figure 8).
He applied his approach to three kinds of electronic modules.

—Sound-processing modules. Gerzso’s approach is based on basic, standard modules that are universally known and scientifically described in a unique way: for instance, frequency shifters, comb filters, and ring modulators.

5 BRAHMS is an online database of 20th- and 21st-century composers and their works.

Fig. 8. The score of the beginning of Anthèmes 2, for violin and live electronics by Pierre Boulez. (With permission of Universal
Editions, Vienna, Austria.)


Fig. 9. The symbolic schema of the beginning of Anthèmes 2 for violin and live electronics by Pierre Boulez, completed by Andrew
Gerzso. (With permission of Universal Editions, Vienna, Austria.)

—Spatialization modules. With the help of Olivier Warusfel (head researcher of the IRCAM Room Acoustics team), Andrew Gerzso has identified rather universal descriptors whose values can be induced from current patches: for instance, source direction, level of the direct source, and level of the first reflections.
—Score-following modules. Nothing is specified in Gerzso’s documentation except what the module
should be able to do.

These effects have been added to the score as parts, exactly as if they were instrumental parts. The first part is notated “Inf. Rev.”, for the infinite reverberation effect; the second is notated “Sampler IR”, for the sampler completed by infinite reverberation; and the third is notated “Sampler”, for a sampler without reverberation. Each of these three parts is completed by spatialization effects notated on a single line directly below it. The last part is dedicated to the frequency shifter.
The score follower in this score has the role of a conductor. It is responsible for triggering events at
the right time and is notated by using numbers in a circle.
As one may notice, each part is notated using age-old musical paradigms, such as staves, notes, measures, and other symbols as close as possible to traditional music notation. Any person able to read music and familiar with musical performance can read this score and get a precise idea of the expected result without having any idea of its particular implementation (Figure 9).
At GRAME, Yann Orlarey and his research team [Orlarey et al. 2002] have explored other kinds of formalisms, using algebra to describe block-diagram languages. This led to the Faust system, which allows the mathematical description of signal-processing functions, up to the compilation of the corresponding modules in C++. We show part of a reverberator description in Figure 10: it describes a monophonic reverberation using a high-level algebraic language (the signal process is given in the top yellow rectangle). To aid understanding, a wiring schema is also given, showing eight comb filters set in parallel, followed by four serial allpass filters.
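The structure in Figure 10, parallel comb filters feeding serial allpass filters, is the classic Schroeder reverberator. A direct Python transcription of that block diagram might look as follows; the delay lengths and gains here are borrowed from common Freeverb-style settings and are not those of the actual Faust example.

```python
import numpy as np

def comb(x, delay, feedback):
    """Feedback comb filter: y[n] = x[n] + feedback * y[n - delay]."""
    y = np.zeros(len(x))
    for n in range(len(x)):
        y[n] = x[n] + (feedback * y[n - delay] if n >= delay else 0.0)
    return y

def allpass(x, delay, gain):
    """Schroeder allpass: y[n] = -gain*x[n] + x[n-delay] + gain*y[n-delay]."""
    y = np.zeros(len(x))
    for n in range(len(x)):
        xd = x[n - delay] if n >= delay else 0.0
        yd = y[n - delay] if n >= delay else 0.0
        y[n] = -gain * x[n] + xd + gain * yd
    return y

def reverb(x,
           comb_delays=(1116, 1188, 1277, 1356, 1422, 1491, 1557, 1617),
           allpass_delays=(556, 441, 341, 225)):
    # Eight comb filters in parallel (summed and averaged)...
    wet = sum(comb(x, d, 0.84) for d in comb_delays) / len(comb_delays)
    # ...followed by four allpass filters in series
    for d in allpass_delays:
        wet = allpass(wet, d, 0.5)
    return wet
```

Feeding an impulse through `reverb` produces a dense, decaying tail: the parallel combs create the modal density, and the serial allpasses smear the echoes without coloring the spectrum. Faust expresses exactly this parallel/serial wiring algebraically and compiles it to efficient C++.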
Other teams have examined the automatic generation of Max/MSP patches. Paolo Bottoni and his colleagues at the University of Rome La Sapienza [Bottoni et al. 2006] have studied the use of planning

Fig. 10. A screenshot of a reverb description using Faust.

agents to transform abstract goal states specified by the performer into potentially complex sequences
of Max/MSP control messages by using a programming language named GO/Max.

6. REQUIREMENTS FOR FURTHER DEVELOPMENTS


Simple preservation is not sufficient for the long term because of technical obsolescence. Even if correctly saved, files stored in proprietary formats will be quite difficult to open within a decade.
Emulation and migration may save a work, but they often provide only a provisional solution, as shown by Bullock and Coccioli [2005]. Both require a great deal of work, and sometimes cannot achieve the same musical result as the original.
Virtualization appears to provide the most interesting and sustainable solution for pieces mainly composed of software processes. Especially with open technologies, it makes it possible to know exactly which algorithms are used for each signal process, and it relies on standard file formats.

6.1 The Issue of Authenticity


Amongst the further developments to be explored, one of the most important is the issue of authenticity: how to guarantee a certain degree of authenticity through the necessary transformations, migrations, and virtualizations of the electronic material that is to be preserved.

The CASPAR project will strive to develop an adequate model, building on the results of previous projects like InterPARES. Concepts like integrity, identity, and provenance, though useful, are not sufficient to solve this issue.

6.2 What Should be Kept?


The first level of preservation is to keep all available digital files related to the work, including audio recordings, patches, documentation, scores, etc., as is currently possible with the Mustica system.
But this is not sufficient to be able to reperform the piece or even to allow for its evolution. This issue will be studied in the Caspar project. We have started interviewing musical assistants at Ircam in order to gather their opinions about the requirements for preservation. From these interviews and from our initial analysis of the problem, we identify the following as the most important issues to be investigated:
—to provide as much reference material as possible, for example a multitrack recording of the piece of music to be preserved that isolates each instrument and each output of the digital process;
—to set abstract descriptions of the electronic patches and of the technical implementation of the piece,
for instance, in XML format;
—to store a set of test values, for instance for various inputs, either integers or signals (e.g., white noise), that will help in comparing new versions of patches with the originals in order to check whether they are in good working condition;
—to set an ontology of signal processing that will describe all live and electro-acoustic modules found
in the repertoire;
—to use a coherent set of graphical codes to capture the performer’s gestures and behaviors;
—to provide a metric for measuring the authenticity of a reimplementation of a digital process.
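The stored-test-values point could work along the following lines. This is a minimal sketch; the signal generator, tolerance, and module interface are assumptions for illustration, not the CASPAR design:

```python
import random

def stored_test_signal(length=1024, seed=42):
    """Deterministic pseudo-white-noise test input, regenerable from its seed."""
    rng = random.Random(seed)
    return [rng.uniform(-1.0, 1.0) for _ in range(length)]

def rms_error(a, b):
    """Root-mean-square difference between two equal-length signals."""
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

def check_reimplementation(reference_output, candidate_module, tolerance=1e-6):
    """Feed the stored test signal to a reimplemented module and compare
    its output against the archived reference output."""
    candidate_output = candidate_module(stored_test_signal())
    return rms_error(reference_output, candidate_output) <= tolerance
```

Archiving only the seed and the reference output keeps the test compact while letting any future implementation regenerate the exact input.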

7. CONCLUSION
We have shown how fragile works based on live electronics are when their reperformance is considered. Many researchers, users, and composers in the field of the performing arts require sustainability, which may be achieved through four complementary activities: preservation, emulation, migration, and virtualization.
Preservation based on standard formats (text for documentation, WAV for audio files, etc.) is necessary but not sufficient, whereas emulation and migration, though sometimes useful, require a lot of work for merely provisional solutions. Virtualization, which is based on abstract descriptions, is the only approach that provides a means to regenerate electronic modules; it will require ontologies of signal processing that are not yet available. The Caspar project will work on these aspects, especially on a contemporary art testbed, to describe complex artistic works in their production and organization.

ACKNOWLEDGMENTS
Our thanks to Michael Fingerhut, Andrew Gerzso, Serge Lemouton, Denis Lorrain, and Philippe
Manoury.

REFERENCES
BACHIMONT, B., BLANCHETTE, J.-F., GERZSO, A., SWETLAND, A., LESCURIEUX, O., MORIZET-MAHOUDEAUX, P., DONIN, N., AND TEASLEY, J. 2003. Preserving interactive digital music: A report on the MUSTICA research initiative. In Proceedings of the 3rd International Conference on Web Delivery of Music (WEB'03). Leeds, England.

BERNARDINI, N. AND VIDOLIN, A. 2005. Sustainable live electro-acoustic music. In Proceedings of the International Sound and
Music Computing Conference. Salerno, Italy.
BOSMA, H. 2005. Documentation and publication of electroacoustic compositions at NEAR. In Proceedings of the Electroacoustic
Music Studies Network International Conference (EMS’05). Montreal, Canada.
BOTTONI, P., FARALLI, S., LABELLA, A., AND PIERRO, M. 2006. Mapping with planning agents in the Max/MSP environment: The
GO/Max language. In Proceedings of the International Conference on New Interfaces for Musical Expression. Paris, France.
BULLOCK, J. AND COCCIOLI, L. 2005. Modernising live electronics technology in the works of Jonathan Harvey. In Proceedings of
the International Computer Music Conference. Barcelona, Spain.
DE SOUSA DIAS, A. 2003. Transcription de fichiers MusicV vers Csound au travers de Open Music. In Proceedings of the 10th
Journées d’Informatique Musicale. AFIM, Montbéliard.
GRAEF, A., KERSTEN, S., AND ORLAREY, Y. 2006. DSP Programming with Faust, Q and SuperCollider. In Proceedings of the Linux Audio Conference. Karlsruhe, Germany.
LONGTON, M. 2004. Record keeping practices of composers: A survey. InterPares 2 Website http://www.interpares.org.
ORLAREY, Y., FOBER, D., AND LETZ, S. 2002. An algebra for block diagram languages. In Proceedings of the International Computer Music Conference (ICMC'02). Göteborg, Sweden.
PUCKETTE, M. 2004. New public-domain realizations of standard pieces for instruments and live electronics. In Proceedings of
the International Computer Music Conference. Miami, FL.
RISSET, J.-C., ARFIB, D., DE SOUSA DIAS, A., LORRAIN, D., AND POTTIER, L. 2002. De Inharmonique à Resonant Sound Spaces. In Proceedings of the Journées d'Informatique Musicale. AFIM, Gmem, Marseille.
ROEDER, J. 2006. Preserving authentic electroacoustic music: The InterPARES Project. In Proceedings of the IAML-IASA
Congress. Oslo, Norway.
TERUGGI, D. 2001. Preserving and diffusing. J. New Music Res. 30, 4.
TIFFON, V. 2005. Les musiques mixtes: entre pérennité et obsolescence. Revue Musurgia XII, 3, Paris.

Received November 2006; revised May 2007, September 2007; accepted November 2007
