Yago de Quay - Masters Work Plan Proposal (2009)
Generative Music 2.0: User-driven ad-hoc dance clubs
Yago de Quay
Abstract
Technology and theoretical models in the field of interactive media can provide a sustainable framework
for implementing interactive music in dance clubs. This research proposes investigating users’ ability to
change music to better suit their needs in a dance club environment.
1 Introduction
Generative Music 2.0 is a term defined here as a system that generates music using the Web 2.0 model of
user-generated content [1]. User-driven music – Generative Music 2.0 – will aggregate users’ social
behaviors and individual attitudes toward dancing in clubs, enabling users to create, modify, reconfigure
and/or share dance music in real time through live motion-sensing technologies. Moreover, Generative
Music 2.0 is expected to indirectly educate users about music composition and open new ground for self-
expression and collective action related to dancing.
1.1 Background
The present research proposal is built around an interactive system for dance clubs that enables user
participation while promoting creativity and networking. This project will create a system of motion-capture
(mocap) devices that track dancers’ movements and map them to various parameters in a song, so that
dancers can interact with and shape, in real time, the music being played. The goal is not to create a new
interface for music design, such as gestural control of a virtual studio; the main concern is to create a more
engaging and involving atmosphere for dancers in a nightclub setting. Instead of the usual performer (such
as a DJ) versus audience setup, this project proposes that the audience be the performers.
This project will build upon two successful innovations in the field of interactive media, namely motion
capture and adaptive music. Motion capture refers to capturing or tracking real body movement with
special equipment (e.g. a digital video camera) and translating these movements into a digital model.
Adaptive music refers to music that adapts and/or responds to a stimulus, such as action in a video game.
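The coupling of the two ideas can be sketched in code. The following is a minimal, hypothetical illustration, not the proposal's implementation: motion energy is estimated from tracked joint positions and mapped linearly to a filter cutoff. All function names, parameter ranges and the energy formula are assumptions for illustration only.

```python
# Hypothetical sketch: mapping motion-capture energy to a musical parameter.
# The energy proxy and the cutoff range are illustrative choices, not the
# proposal's method.

def motion_energy(positions):
    """Mean frame-to-frame displacement of tracked joints (a simple energy proxy)."""
    total = 0.0
    for prev, curr in zip(positions, positions[1:]):
        total += sum(abs(c - p) for p, c in zip(prev, curr))
    return total / max(len(positions) - 1, 1)

def map_to_filter_cutoff(energy, low_hz=200.0, high_hz=8000.0, max_energy=10.0):
    """Linearly map normalized motion energy to a filter cutoff frequency (Hz)."""
    norm = min(energy / max_energy, 1.0)
    return low_hz + norm * (high_hz - low_hz)

frames = [[0.0, 0.0], [0.5, 0.2], [1.0, 0.1]]  # toy trajectory of two joint coordinates
cutoff = map_to_filter_cutoff(motion_energy(frames))
```

In a real system the positions would arrive from the mocap device each frame and the cutoff would be streamed to the audio engine.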
1.2 Issues
Technology is used as leverage for performers and can have the consequence of increasing the gap between
the artist and the audience [3]. This problem echoes the debate between User Generated Content (UGC)
and Engineered Content (EC). Engineered content, which involves no interaction and is typical of
modern-day dance clubs and other artistic performances, lacks several features that can have a big impact
on the audience, the music, and ultimately the venue where the music is played. Engineered, or
non-interactive, music can be a problem for the following reasons [4]. First, the method of music creation is
limited: engineered music, contrary to interactive music, only allows the expert to have a say on the overall
creative direction of a song. Second, non-interactive systems miss out on the wealth of knowledge and input
that other users and experts can provide; in this case, the dance club is not harvesting its dancers’ musical
potential. Third, the audience currently cannot share their interpretation of a song or change its direction if
desired. The audience is a mere spectator rather than a participant in the experience.
This research proposes to solve these problems by offering an interactive music model for dance clubs. This
interactive model is not intended to substitute performers, experts or engineered content, but to add a tool
to the dance club’s belt, one that can benefit various businesses and the scientific community.
Generative Music 2.0: User-driven ad-hoc dance clubs Yago de Quay
websites, the successful strategies of “social networking as platform” have lowered barriers to interaction,
exploration and skill-building, thus providing viable models for interactive media [1, 5].
Dancers and musicians who use body language as a means of music composition are also exploring the state
of the art in interactive music. Dancers can directly and deliberately manipulate rhythm, pitch, timbre and
other mapped musical parameters, and at the same time the mapping of these parameters can be modified,
in real time, by musicians. On one end a dancer provides input to a computer, and on the other a musician
decides how that input will affect the musical output. This framework provides intensive interaction
between dancers and musicians while enriching improvisational possibilities [10, 11].
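The two-sided framework described above can be sketched as a reconfigurable mapping table: the dancer supplies control signals, while the musician may reassign, in real time, which musical parameter each signal drives. The signal and parameter names below are hypothetical placeholders, not part of the cited systems.

```python
# Illustrative sketch of the dancer/musician split: dancers feed signal values in,
# musicians edit the routing table without interrupting the music.

mapping = {"arm_height": "pitch", "hip_sway": "rhythm_density"}

def remap(signal_name, parameter):
    """Musician-side reconfiguration of the dancer-to-music mapping."""
    mapping[signal_name] = parameter

def route(signals):
    """Dancer-side input: route each signal value to its currently mapped parameter."""
    return {mapping[name]: value for name, value in signals.items() if name in mapping}

out = route({"arm_height": 0.8, "hip_sway": 0.3})
# remap("arm_height", "timbre") would redirect that signal mid-performance
```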
universal appeal. On the other hand, functional systems provide custom-made interfaces (e.g. virtual
instruments) in which skilled users can manipulate pre-defined parameters [13].
3 Objective
The goal of the present research is to gauge dancers’ satisfaction with interactive music/dance systems in
comparison with the traditional, non-interactive music/dance experience.
3.1 Hypothesis
To capture dancers’ movements, we propose to install mixed types of motion capture systems and
superimpose the motion-analysis information. Various signals will be captured from the dancers, ranging
from macroscopic (population, density, energy) to microscopic (arms, head, hips and legs).
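The macroscopic signals mentioned above can be derived from per-dancer tracking data. A minimal sketch follows, assuming a floor of known area; the specific formulas (mean speed as an energy proxy, dancers per square metre as density) are simple illustrative choices, not the proposal's definitions.

```python
# Hypothetical aggregation of per-dancer mocap data into crowd-level signals.

def macroscopic_signals(dancer_positions, dancer_speeds, floor_area_m2):
    population = len(dancer_positions)
    density = population / floor_area_m2               # dancers per square metre
    energy = sum(dancer_speeds) / max(population, 1)   # mean speed as an energy proxy
    return {"population": population, "density": density, "energy": energy}

stats = macroscopic_signals(
    dancer_positions=[(1, 2), (3, 4), (5, 1)],  # toy (x, y) floor coordinates
    dancer_speeds=[0.5, 1.0, 1.5],              # toy speeds in m/s
    floor_area_m2=30.0,
)
```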
With regard to music interaction, it should be limited to pre-defined parameters set by the DJ.
Given the popularity of interactive gaming systems, it is expected that dancers will derive greater
enjoyment from music when their actions have an immediate and direct influence on the music’s parameters.
The results will inform research in human computer interfaces (HCI) and help develop sustainable
interactive music models for dance clubs as well as other entertainment technologies.
4 Method
Data collection will include a mixture of qualitative and quantitative procedures. A motion capture and
interactive audio playback system will be used to gather data from nonprofessional dancers moving
spontaneously to music in a nightclub. Parameters of the music, such as filter sweeps, layers, sections, and
LFOs, will be manipulated by the dancers’ movements. Upon leaving the floor, dancers will be asked how
much they enjoyed the experience and the extent to which they believed their actions directly controlled
the music.
There are three questions that pertain to the constituents of this thesis. The methods by which these
questions will be approached are listed below.
4.1 Dance
What movement will be captured?
Literature
Behavior Observation Checklist
Performance Tests
4.2 Music
Which musical parameters will be interactive?
Questionnaire
Opinion Surveys
Self-Ratings
Simulations
4.3 Technology
What technology is best for club environments?
Literature
Simulations
4.4 Tests
Two random samples will be created from a population of club-goers. Demographic questionnaires will
filter subjects to a sample representative of club-goers. The sample will then be split into two conditions.
Under these conditions participants will repeat various sets. Each set is 10 minutes long, and the music
played will have one of three levels of interaction, assigned randomly: non-interactive, interactive and very
interactive.
Under condition A, participants will dance in various sets and rate the experience after every set. This will
be a blind study; they will not be told which sets are interactive.
Under condition B, different participants will dance in various sets and rate the experience after every set.
They will be told which sets are interactive.
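The assignment procedure for the two conditions can be sketched as follows. This is an illustrative outline only; the number of sets per session is an assumption, and the blinding logic simply withholds the level label in condition A.

```python
# Sketch of the experimental design above: each 10-minute set gets a randomly
# assigned interaction level; only condition-B participants see the level.

import random

LEVELS = ["non-interactive", "interactive", "very interactive"]

def build_session(n_sets, condition, rng=random):
    sets = []
    for i in range(n_sets):
        level = rng.choice(LEVELS)
        sets.append({
            "set": i + 1,
            "level": level,
            # Blind in condition A: the participant is never shown the level.
            "disclosed": level if condition == "B" else None,
        })
    return sets

session = build_session(n_sets=6, condition="A", rng=random.Random(42))
```

After each set the participant's rating would be stored alongside the (hidden or disclosed) level for later analysis.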
Questionnaires and opinion surveys will rate music appreciation as well as dance appreciation.
Sample questions might include:
Behavior observation checklists will be used to evaluate dancers’ movements and provide information on
which movements to capture.
Performance tests will evaluate the degree of sensitivity between the subjects’ interaction and the affected
music. This will fine-tune parameters in the music software.
A Likert scale will allow participants to self-rate how much control they had over the music.
Work in this field of interactive music and interactive dance has been conducted by researchers at the
University of Texas, Universidade do Porto and Universidade Nova. Outside sources such as MIT and
McGill University have presented papers on motion-capture methods. Motion-capture hardware and
software systems are also available at Universidade do Porto, Universidade Nova and the University of
Texas.
5 Results
Hypothetical results may look like the data presented in Figs. 2 and 3, which illustrate an example of the
correlation between perceived interaction and real system interaction.
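Such a correlation could be quantified with Pearson's r between the system's actual interaction level per set and the participants' self-rated control. A small sketch follows; the data values are invented placeholders, not results.

```python
# Illustrative computation of the perceived-vs-actual interaction correlation.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

actual = [0, 1, 2, 0, 2, 1]        # interaction level assigned to each set (placeholder)
perceived = [1, 3, 5, 2, 4, 3]     # Likert self-rating of control (placeholder)
r = pearson_r(actual, perceived)
```

A value of r near 1 would indicate that participants reliably perceive the interaction the system actually provides.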
6 Implications
This research will uncover connections between mind, body and music, and create innovative
technological models for the scientific community and the outside world.
From a sociologist’s point of view, this research provides a rich source of information on the dynamics of a
population. The human-computer interaction (HCI) system developed in this project will reveal how
humans express themselves through their bodies and contribute to fields such as the study of body
language. It will also shed fresh light on the role of music and its effects on the human body. Music
cognition also has much to gain from knowledge of how humans are aware of their musical surroundings,
such as the preference for linear versus non-linear music.
In this paper’s model, disc jockeys (DJs) are given more creative control over the music than before. It will
be up to the DJ to choose which parameters can be altered and to customize how body language is
translated into musical content. The DJ will act as a mediator and as the platform through which dancers
and music reciprocate.
The mapping and technological models proposed in this research can also contribute to e-learning
strategies by providing a medium for instruction.
The results of this research can open new avenues and resources for dance clubs and interactive-media
installations. Lastly, the impact of this study will also raise further questions that can drive research and
funding toward the realm of interactive content creation.
7 References
[1] Tim O'Reilly. “What Is Web 2.0”, O'Reilly Network. http://www.oreillynet.com/pub/a/oreilly/tim/news/
2005/09/30/what-is-web-20.html. August 2006.
[2] Andrew Clark. “Defining Adaptive Music”, Gamasutra, April 17, 2007.
[3] http://www.palindrome.de/
[4] Paul Chin. “The Value of User-Generated Content”, Intranet Journal. July 3, 2006.
[5] "The growing cost of free online content", Deloitte Media Predictions.
[6] http://www.sustainabledanceclub.com/
[7] MSNBC News. “Your text message on a 40-foot screen: Fans get new ways to interact at a concert”,
http://msnbc.msn.com/id/7952252/.
[8] Andrew Boyd and Robb Mills. “Implementing an Adaptive, Live Orchestral Soundtrack”, lecture at the
Game Developers Conference, San Francisco, 2006.
[9] R. Ulyate and D. Bianciardi. “The Interactive Dance Club: Avoiding Chaos in a Multi-Participant
Environment”, Computer Music Journal, Volume 26, Number 3 (Boston: Massachusetts Institute of
Technology, 2002).
[10] Carlos Guedes. “translating dance movement into musical rhythm in real time: new possibilities for
computer-mediated collaboration in interactive dance performance”, CMC, 2007.
[11] Carlos Guedes. “Mapping movement to musical rhythm: a study in interactive dance”, pages 85-100,
2005.
[12] R. Wechsler. “Artistic Considerations in the Use of Motion Tracking with Live Performers: a
Practical Guide”, a chapter in the book Performance and Technology: Practices of Virtual Embodiment
and Interactivity, edited by Susan Broadhurst and Josephine Machon, Palgrave Macmillan, 2006.