
journal of interdisciplinary music studies
fall 2010, volume 4, issue 2, art. #10040202, pp. 17-35
The McGill Digital Orchestra: An Interdisciplinary
Project on Digital Musical Instruments
Sean Ferguson1 and Marcelo M. Wanderley2
1 DCS, CIRMMT, Schulich School of Music, McGill University
2 IDMIL, CIRMMT, Schulich School of Music, McGill University
Background in digital musical instruments. A digital musical instrument (DMI) comprises a
control surface that controls the parameters of a digital synthesis algorithm in real time. In the
Digital Orchestra Project, a three-year research/creation project, the synthesis engine was
hosted on a general-purpose computer, while the gestural control surfaces were new hardware
devices created for the project. The mapping between gestural data and synthesis parameters
was carried out through the use of custom-written software: The Mapper.
Background in music performance. From a performance perspective, a successful DMI
should allow performers to feel that they have accurate control of the musical result of
their performance. This sensation results from a number of factors, including the
responsiveness of the instrument (low, consistent latency), haptic feedback, the mapping
strategies used, and the reproducibility of musical ideas, among others.
Aims. The aim of the project was to develop and use a number of new DMIs with musical
potential comparable to that of existing acoustic musical instruments. An important goal was to
foster long-term interdisciplinary collaborations between instrument designers, composers and
performers. We also wanted to address the issue of reproducibility in the performance of digital
musical instruments by developing appropriate notation methods.
Main contribution. The Digital Orchestra resulted in the development of several new DMIs,
from laboratory prototypes to fully-fledged concert instruments. Composers created new works
for small ensembles that included these instruments. A musical notation based on dynamic
visual elements displayed on a computer screen was developed. The project notably included
three years of intensive training on these instruments by performers who had already
achieved a high level of expertise on acoustic musical instruments.
Implications. The McGill Digital Orchestra presents a number of paradigms for the design,
creation and performance of digital musical instruments in the context of a long-term
interdisciplinary, collaborative environment. Based on our experience, we propose that one
effective measure for the evaluation of a digital musical instrument is its ability to reproduce a
performance of a particular piece, either by the same performer or by different performers. This
involves the ability to realize a piece based on a notated score, whether on paper or using
software-based visual feedback in a graphical environment. We suggest that this may aid in
ensuring the viability and longevity of a novel digital musical instrument. The results of this
approach to DMI design include instruments that have been used in high-profile professional
performances and that are still being actively used by several performers world-wide.
Keywords: Music Technology, Composition, Performance, Digital Musical Instruments.
•Correspondence: Marcelo M. Wanderley, Schulich School of Music, McGill University, 555 Sherbrooke
Street West, H3A1E3, Montreal, Qc, Canada, e-mail: [email protected]
• Received: 17 March 2010; Revised: 8 September 2010; Accepted: 22 September 2010
• Available online: 30 November 2010
• doi: 10.4407/jims.2010.11.002
Introduction
Digital technologies have revolutionized the way we communicate, interact and learn.
Music is another area where such technologies have had a strong impact, most
notably the use of computers for music creation and performance. In this domain, the
development and musical use of Digital Musical Instruments (DMIs)—musical
instruments comprising a gestural controller (also called a hardware interface or a
control surface) used to control the parameters of a digital synthesis algorithm in real
time, through pre-defined mapping strategies [Miranda and Wanderley, 2006]—is a
burgeoning research field.
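The controller-mapping-synthesis chain described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the function names, sensor ranges and the "frequency" parameter are illustrative assumptions, not taken from any instrument in the project.

```python
# A minimal, hypothetical sketch of the DMI signal chain:
# gestural controller -> mapping layer -> synthesis parameter.
# Names, ranges and scaling choices are illustrative assumptions.

def mapping(sensor_value, in_range=(0.0, 1023.0), out_range=(220.0, 880.0)):
    """Map a raw sensor reading linearly onto a synthesis parameter range."""
    lo, hi = in_range
    out_lo, out_hi = out_range
    normalized = (sensor_value - lo) / (hi - lo)
    return out_lo + normalized * (out_hi - out_lo)

class ToySynth:
    """Stand-in for a real-time synthesis engine: stores one parameter."""
    def __init__(self):
        self.frequency = 440.0
    def set_frequency(self, hz):
        self.frequency = hz

synth = ToySynth()
synth.set_frequency(mapping(511.5))   # mid-range sensor reading
print(round(synth.frequency, 1))      # -> 550.0
```

In a real DMI the mapping layer runs continuously at sensor rate; the point here is only that the controller and the synthesis engine are independent components joined by an explicit, replaceable mapping.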
The design of novel DMIs has been the focus of much attention in recent years [Cook
2001; 2009] [Wanderley and Depalle, 2004] [Jordà 2005] [Malloch et al. 2006]
[Overholt 2009], most notably through the NIME (New Interfaces for Musical
Expression) Conferences. NIME started as a workshop during the ACM CHI 2001
Conference in Seattle, becoming a fully-fledged conference in 2002 with NIME02,
held in Dublin. Although before NIME a very large body of work had already been
developed in this field [Piringer, 2001] (see [Wanderley and Battier, 2000] for a
comprehensive review), NIME helped to focus the field as a research community and
created an interdisciplinary forum for the showcase of novel DMIs, with several
hundred designs presented since its inception [Marshall, 2009].
But apart from the technological aspects of device design, a major issue in this field is
the musical use of DMIs. Here, unfortunately, the situation is rather different: only a
small number of works have directly addressed the musical use of DMIs, among them
[Machover, 1992][Ryan, 1992][Schloss and Jaffe, 1993][Siegel and Jacobsen 1999]
[Tanaka, 2000][Burtner 2002; 2003][Oore, 2005][de Laubier and Goudard, 2006]
[Dobrian and Koppelman, 2006][Butler, 2008][Palacio-Quintin, 2008] and [Stewart,
2010]. To study the musical potential of DMIs in both composition and performance, a
medium- to long-term project is essential, one in which composers and performers can
use “stable” or “mature” versions of DMI prototypes as part of their composition and
performance practices, ideally in close collaboration with instrument designers.
The McGill Digital Orchestra was proposed in answer to this need. The Digital
Orchestra was a three-year research/creation project that included the development
and refinement of a number of novel DMIs and the composition and concert
performance of several new musical works written for these instruments. The Project
originated from a series of interdisciplinary seminars given at McGill University by
Marcelo Wanderley, Sean Ferguson and D’Arcy Philip Gray.
The Digital Orchestra Project
The goal of the Digital Orchestra Project was to create a number of new digital
musical instruments with musical potential comparable to that of existing acoustic
musical instruments. Our approach centered on the formation of interdisciplinary
teams of instrument designers, performers and composers. These teams participated in
the design, creation and refinement of new instruments, which were then used in the
composition and concert performance of a number of original musical works. An
important goal was to leverage the expertise of elite performers on traditional
orchestral instruments to provide ongoing feedback to the instrument designers. We
also wanted to address the issue of reproducibility in the performance of digital
musical instruments by developing appropriate notation methods.
Our focus on interdisciplinary collaboration resulted from the desire to create
instruments that would remain viable and in use following the tenure of the project.
One risk in an approach in which a single person working alone fulfills all the roles of
instrument designer, composer and performer is that the resulting instrument – though
possibly of interest in various ways – is only ever used by its inventor. The current
scene is peppered with unique and fascinating digital instruments with a performer
base of one. In the Digital Orchestra, we hoped to develop a methodology for the
process of creating DMIs that would increase the likelihood of their being adopted by
performers other than the instrument’s designer.
The project was planned to take place in three distinct phases. In year one, teams of
instrument designers freely experimented with various sensing technologies and
together with composers created a large number of prototypes of hardware and
software tools. Performers of orchestral instruments carried out testing of these
prototypes, as well as practicing on prototypes previously developed for the
interdisciplinary seminars briefly mentioned above and described in more detail in the
next section. Their impressions of the novel prototypes provided valuable feedback
for the iterative development of the instruments.
In year two, three teams – each of which included performers, composers, and
instrument designers – created new hardware interfaces, synthesis engines and digital
signal processing and analysis tools based on the prototypes from year one. The goal
was to finalize the tools of the Digital Orchestra by the end of this year, including
documentation, in preparation for concert use in the following year. The composers
also began preliminary work on new compositions using the resources of the project.
In the third and final year, the composers completed new works for concert
performance at the 2008 MusiMarch Festival in Montreal. During this period, the role
of the instrument designers shifted from a strictly developmental one, to include
support for the technical requirements of each of the works composed for this concert.
While this planned structure was largely maintained, some alterations did occur. One
of the most significant was that the number of concert works composed went far
beyond the pieces performed during the 2008 MusiMarch Festival. The three
composers who participated, Sean Ferguson, D. Andrew Stewart and Heather
Hindman, composed a total of eight pieces, which received eighteen performances
over the duration of the project, including performances in Canada, France and
Brazil.
Background: The DMI Seminars
In March 2002, the two authors (a composer and a researcher in music technology,
respectively) and percussionist D’Arcy Philip Gray first discussed the need for an
interdisciplinary seminar that could be taken by students in music technology,
composition and performance. The initial impetus came from Wanderley, then
recently hired at McGill, who brought DMI research to McGill’s Faculty of Music
from his earlier work at IRCAM [Wanderley and Battier, 2000]. The idea was to
move from the design of hardware interface prototypes into fully-fledged DMIs, i.e.
instruments that could be used in performances anywhere, anytime, not only in
controlled research laboratory settings.
The seminar “Digital Musical Instruments: Technology, Performance and
Composition” was first given in January 2003. During the course of the seminar,
students formed groups consisting of at least one representative from each of the
three areas. In conjunction with lectures given by the professors, students worked on
the joint creation of new gestural controllers for digital synthesis engines. This
included the design and construction of the new devices (including both hardware and
in many cases software for digital synthesis), the development of playing techniques
for the instrument and the composition and performance of new works in an end of
term class recital.
Figure 1. A typical work session at the DMI Seminar (Winter 2006). From left to right:
instrument designer Eileen TenCate, composer Heather Hindman and performer Jonathan
Davis are shown discussing the Tralf (lower right corner of the picture), initially developed by
composer Geof Holbrook. While TenCate and Hindman discuss the mapping and sound
synthesis aspects of the piece, Davis practices with the DMI.
The seminar was repeated on three other occasions, in the Winter sessions of 2004,
2006 and 2010. These seminars were designed to allow students to collaborate
actively during an academic term (13 weeks) on the design, performance and
composition of pieces for novel DMIs. Although excellent work resulted each time
the seminar was offered (for instance, the T-Stick [Malloch and
Wanderley, 2006]), we felt that such a limited duration did not allow
for a complete, high-level instrument design/composition/performance cycle. In fact,
frequently there was not enough time to fully develop novel DMIs. Even when
development time was sufficient, many times performance time was minimal, i.e.
performers were only able to begin working with the final version of the instrument
shortly before the concert.
The main achievement of these seminars was perhaps the forming of a working
method for interdisciplinary, collaborative development of new DMIs. Indeed, one of
the main goals of the Digital Orchestra project was to apply this methodology on a
larger scale by spreading the activities over a longer, more appropriate time span.
Figure 2. Another view of a typical work session of the DMI Seminar (Winter 2004). In the
foreground (right), percussionist Kristi Ibrahim, wearing custom gloves with Force-Sensing
Resistors, practices a notated score composed by instrument designer Joseph Malloch (an
example of a student taking multiple roles in a seminar). In the background, instrument designer David
Birnbaum tests a camera-based controller based on the work of Lyons and colleagues [Lyons et
al. 2003].
Remarks on DMI Design
During the DMI Seminars as well as during the Digital Orchestra Project we had
several opportunities to explore the development of a number of hardware interfaces
and DMIs from scratch. Although it may sound straightforward, the process from the
original idea to a commercially viable or musically useful realization is indeed a
complex one.
Two classic examples of this process in the literature are the Continuum and the
Méta-Instrument [Miranda and Wanderley, 2006]. In the case of the Continuum, the
initial idea—the development of a continuous sensing surface—yielded three
generations of prototypes, each using a different technological solution: from
camera-based finger tracking, to conductive rubber, and finally to position sensing
with Hall-effect sensors [Haken et al. 1998]. Only the last solution proved commercially (and
musically) viable due to technological limitations of the first two. A similar case is the
Méta-Instrument, which is also currently in its third generation. Although it has not
undergone as many technological changes, several improvements were made over
the years to accommodate performance variations, for instance, the ability
to perform on the instrument standing up instead of seated [Laubier 1998][Laubier and
Goudard, 2006].
One important difference today, compared to the early developments of the 1980s
and 1990s, is the astonishing ease with which a gestural controller can be built,
thanks to the recent wide availability of inexpensive, user-friendly technologies such
as the Arduino platform. Indeed, a device prototype using a variety of sensors
can be completed in just a few minutes, even by inexperienced designers. But
although such prototypes can be an acceptable solution for a class project (i.e. a
proof-of-concept), they are hardly ever a reliable musical instrument that can be
played in concert venues.
Several issues need to be taken into account when developing a prototype into a
musical instrument. For instance, one fairly common surprise when moving a
laboratory prototype based on video sensing (e.g. colour tracking) out of the
laboratory and into a concert venue is the discovery of how sensitive cameras are to
ambient conditions. Sensors do not adapt to light variations as easily as the human
visual system does. A setup calibrated in a room with a window will probably not
work as expected in a concert hall, or even in the same room at different times of
day. Similarly, infrared sensors, though much less sensitive to ambient light levels,
can be strongly affected by concert hall lights, which also emit infrared light.
Unfortunately, these and other annoying situations are commonly noticed only at the
actual concert venue, causing major frustration to all involved: performers,
instrument designers, composers and, not least, the audience.
Yet another gap between devices used in controlled environments and DMIs is the
choice of sensor technologies. For instance, as discussed in [Silva et al. 2005], very
accurate techniques for sensing the intensity and direction of the air jet in a
transverse flute, such as Pitot tubes and hot wires, can be used in a research
laboratory but not in a device to be played in concert, for several reasons including
price, obtrusiveness and complexity of setup. The solution in that case was to use a
dual static pressure sensor and obtain an approximation of the intensity and direction
of the air jet from the two pressure measurements. Such a solution worked well for a
musical instrument, but would probably not be considered for acoustic
measurements, where accuracy and repeatability requirements are more stringent.
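The dual-pressure approximation can be sketched as follows. The formulas below are illustrative assumptions, not the published calibration from [Silva et al. 2005]: the intensity estimate is taken as the mean of the two pressure readings and the direction estimate as their normalized difference.

```python
# Hedged sketch of the dual-pressure-sensor idea: two static pressure
# readings taken on either side of the air jet. The formulas are
# illustrative assumptions, not the published calibration.

def air_jet_estimate(p_left, p_right, eps=1e-9):
    """Approximate air-jet intensity and direction from two pressures."""
    intensity = (p_left + p_right) / 2.0          # mean pressure
    direction = (p_left - p_right) / (abs(p_left) + abs(p_right) + eps)
    return intensity, direction

intensity, direction = air_jet_estimate(0.8, 0.4)
print(round(intensity, 2), round(direction, 2))  # -> 0.6 0.33
```

The point of the sketch is the design trade-off the text describes: two cheap, unobtrusive sensors and a simple derived estimate are sufficient for musical control, even though a metrology application would demand a direct, accurate measurement.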
Ergonomics is another crucial factor in DMI design. For example, one gestural
controller that we initially thought had good musical potential for this project was the
Gyrotyre [Sinyor and Wanderley, 2005]. This interface consists of a spinning bicycle
wheel held by the performer; the wheel’s rotation speed and inclination are sensed by
a variety of sensors. Preliminary experiments using this device with a variety of
mappings allowed for a flexible usage as either a solo or accompanying instrument.
When tried out by the performers of the project, though, it was not considered a
viable option for the final concert: the performers felt that its weight was excessive
and that it could cause injuries. This limitation was not apparent when we were using
the device for laboratory experiments, but was obvious to professional performers,
who are more aware of their physical condition than university researchers are.
Finally, robustness is important when one considers that musical instruments are
played in diverse environments for several years. For a performing musician, it is not
normally possible to have a technician available at all times to fix technical problems
during the life of an instrument, while this is not necessarily a major problem in a
research laboratory.
As can be seen from the above and for many other reasons, the gap between a
laboratory prototype and a fully-fledged musical instrument is large and indeed
several design iterations and improvements are usually needed to achieve a stable (or
mature) version of a DMI.
Examples of DMIs
In this section we will briefly describe three DMIs that were designed and used in the
project. They are the T-stick, the T-box (originally named the Tralf) and the Rulers.
Many more details on the specific DMI developments during the project can be found
in several related publications, for instance [Sinyor and Wanderley, 2005] [Malloch et
al. 2006] [Malloch, 2007] [Malloch and Wanderley, 2006] and [Pestova et al. 2009].
T-stick
The T-stick is a tube-like gestural controller equipped with capacitive
sensors measuring touch on its surface, accelerometers measuring its inclination as
well as its acceleration and deceleration, a piezo crystal measuring deformation and
shock applied to the tube, and a pressure sensor that allows the performer to
continuously control the pressure of the grip used to hold the device. Several
T-sticks (roughly 20) have been built with various sizes and features, although all share
the common characteristics described above [Malloch, 2007].
T-box
The T-box uses ultrasound transducers to measure the position of a
performer’s hands with respect to a fixed reference. Two attachments for the hands
each containing an ultrasound transducer (facing the floor) are worn by the performer,
who then moves her hands up and down with respect to a wooden box where four
ultrasound transducers are placed (facing the ceiling). The amplitude of the signal
from the emitting transducer is measured by the receiving one, providing continuous
information about the vertical position of the hands in space. Lateral movements of
the hands can create discontinuities in the signal captured by the receivers due to the
narrow directional characteristics of the transducers, therefore also providing a way to
control discrete events.
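The T-box sensing principle described above can be sketched in code. The inverse-amplitude model and the dropout threshold below are illustrative assumptions, not the project's actual signal processing: received amplitude is assumed to fall off with hand height, and an abrupt amplitude collapse (a hand moving off-axis) is treated as a discrete trigger.

```python
# Illustrative sketch of the T-box principle: estimate hand height from
# received ultrasound amplitude, and flag abrupt dropouts as discrete
# events. Model and thresholds are assumptions, not the real device.

def height_from_amplitude(amplitude, a_ref=1.0, h_ref=0.1):
    """Estimate height assuming amplitude ~ a_ref * h_ref / height."""
    amplitude = max(amplitude, 1e-6)   # guard against division by zero
    return a_ref * h_ref / amplitude

def detect_dropout(prev_amp, curr_amp, ratio=0.2):
    """Flag a discrete event when the amplitude collapses abruptly."""
    return curr_amp < prev_amp * ratio

print(round(height_from_amplitude(0.5), 3))  # -> 0.2
print(detect_dropout(0.8, 0.05))             # -> True
```

This mirrors the dual use described in the text: the same signal yields both a continuous control (vertical position) and discrete triggers (lateral movements breaking the narrow transducer beam).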
Rulers
The Rulers instrument is a hardware interface developed by David Birnbaum,
composed of several metal rulers (bars) of various lengths [Malloch et al. 2006].
When a ruler is pressed down, lifted or struck, its position is sensed by an infrared
light sensor placed under the bar. The Rulers may be played either with the fingers or
with the whole arm.

Criteria for the Evaluation of Digital Musical Instruments
Over the years, a number of criteria have emerged to evaluate digital musical
instruments from multiple viewpoints.
Reproducibility and Continuous Control
Musical instruments that allow a performer to express herself musically must permit
an artist to imagine a musical idea and to reproduce it consistently. If a gesture has an
indeterminate musical result, one cannot say that the performer is in control of the
instrument. Acoustic musical instruments may be ranked according to their ability to
afford the performer accurate continuous control. For example, according to this
definition wind chimes could be said to have less expressive potential than the cello,
since the performer has less precise continuous control over the musical result. In
digital musical instruments, the equivalent of a wind chime might be the use of a
single gesture to trigger a dense sequence of chance events. We avoid designs whose
purpose is solely to initiate random sequences. Note that this is not proposed as a
means of determining the musical interest of a particular instrumental paradigm,
which may nevertheless be high even when it generates unpredictable events. Our
goal is to provide the performer with consistent, reliable and continuous control
throughout the sonic life of a musical idea in order to ensure that this musical idea
will be reproducible over multiple performances.
Our emphasis on reproducibility is intended to contribute to the potential for the
adoption of an instrument by other performers. We feel it is important to ensure that a
given instrument can allow the same piece to be played by the same performer on
different occasions and venues, or by different performers in different locations. We
believe that one of the strongest motivations for learning an instrument is the desire to
perform a work that one has heard performed previously and to which one has had a
positive reaction. Instruments which feature reproducibility as one of their attributes
may therefore have a greater chance of adoption.
Reliability
In the presentation of a new instrument in a scientific context, such as a talk during a
conference, it is not unusual for unexpected problems to arise. This experience is
sometimes referred to as the “demo effect.” In the context of a conference, most
members of the audience will be sympathetic, and the instrument can be demonstrated
in other ways, such as a video recording. In the context of a live performance,
however, any failure of an instrument to function exactly as expected, no matter how
slight, can be catastrophic. Even minute changes can have a devastating effect on the
artistic result, whether or not the audience explicitly notices them. Listeners in a
concert setting cannot be relied upon to share the scientific experience of the
instrument’s designers, and if they do notice a problem they may not exhibit the same
degree of sympathy for the plight of the performer. If the demo effect can cause
embarrassment to a researcher, the failure of an instrument to function properly
during a concert can cause stress and humiliation for the performer, and existential
crisis for the composer.
All instruments – even traditional orchestral ones – have a risk of failure, such as
broken strings, dropped mallets and stuck keys. Nevertheless, we attempted to
minimize this risk in a number of ways, including: (1) freezing the addition of new
features to an instrument well in advance of the concert; (2) giving performers
extensive access to the instruments for explorations and rehearsal; and (3) having the
composers write small etudes that acted as proving grounds for their musical ideas,
means for performers to learn the instruments, and technological test beds.
Expressive Potential
Many definitions exist for musical expression. An informal explanation from a
performance point of view is that an instrument should afford non-conscious control
of the musical result. That is, the performer should be able to modify their gestures in
an appropriate way without explicit instructions about the details of the gestures, but
based on a vivid image that conjures expressive associations. Consider the scenario of
an instrumental lesson in which a teacher attempts to influence a student’s
performance not by providing specific instructions about which gestures to use, but by
using an evocative metaphor or image (such as imagining that one is in a flower-filled
meadow on a sunny afternoon). An instrument with expressive potential will react to
the minute adjustments that this mental image causes in the performance gestures of
the student to alter the musical result in the desired fashion. In other words, the
system is capable of extracting information about very fine details of the performance
gestures.
This expressive relationship between a performer and his or her instrument is a result
of both the design of an instrument and of a highly developed performance practice.
On any instrument, musical expression is a result of the development of advanced
playing technique over a long period of study. One result of this criterion is that the
creation of an instrument that could be easily mastered by a performer within a short
period of time was not an objective of the Digital Orchestra.
The evaluation of the expressive potential of an instrument seems to us to be best
done by the performer of that instrument, by comparing it to their prior experience
with their own musical instruments. Over the course of the three years of the project,
feedback from the performers indicated that this aspect of the instrument designs
improved constantly, largely due to the performers’ input during the design process.
For example, the scaling used in a mapping might be adjusted a great deal over the
course of the development of an instrument if the performers felt that changes to the
sound were too great or too small for the gestures that engendered them. By the time
that they were performed in concert at the end of the project, performers indeed felt
that they had expressive control of the instruments.
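The scaling adjustments described above can be sketched as a response curve whose shape is tuned from performer feedback. This is a generic illustration, not the project's actual mapping code: with an exponent below 1 small gestures produce proportionally larger changes in the sound, and with an exponent above 1 the opposite.

```python
# Sketch of an adjustable gesture-to-sound response curve. The power-curve
# form and parameter names are illustrative assumptions, not the actual
# mappings used in the Digital Orchestra.

def scale(x, exponent=1.0, out_min=0.0, out_max=1.0):
    """Map a normalized gesture value x in [0, 1] through a power curve."""
    x = min(max(x, 0.0), 1.0)                     # clamp to valid range
    return out_min + (x ** exponent) * (out_max - out_min)

# The same mid-range gesture, before and after adjusting the response:
print(scale(0.5, exponent=1.0))  # -> 0.5
print(scale(0.5, exponent=2.0))  # -> 0.25
```

Tuning a single exponent like this is one simple way a designer could react to feedback that "changes to the sound were too great or too small for the gestures that engendered them."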
Interdisciplinary perspectives
In this section we discuss the contributions of each of the disciplines to the overall
goals of the project.
Music Technology
DMI design was at the core of the Digital Orchestra Project, and music technologists
at McGill performed several tasks in this direction: the study and development of
novel pressure and position sensors based on paper [Koehly et al 2006], the use of a
variety of sensors available for commercial applications [Wanderley et al. 2006 –
SensorWiki] and the evaluation of the choice of sensors that best fit a musical
application [Marshall 2009]. The design of novel devices was also carried out by
technologists, but with the indispensable help of composers and performers, e.g. the
T-Stick, [Malloch and Wanderley 2006], the Gyrotyre [Sinyor and Wanderley, 2005]
and the Rulers [Malloch et al. 2006]. Another use of sensors and technology in the
project was the development of tools for gestural control of sound spatialization
[Marshall, Malloch and Wanderley, 2009], using both existing devices for capturing
gestures as well as novel designs.
Apart from the extensive evaluation and testing of a variety of sensors and their use in
the design of novel DMIs, instrument designers also developed tools for mapping the
sensor outputs to sound synthesis control inputs. This was an interesting and initially
unexpected outcome of the project. In practice, the development of mapping
strategies by an interdisciplinary group is not an obvious task, for two reasons:
1) not everyone has the same technical knowledge with which to approach mapping
strategies (in this case, good programming experience with Max/MSP), and 2) in a
group, it is essential that different people playing versions of the same controller can
easily create and destroy mappings on the fly, simultaneously (contrary to the case of
a single instrument designer). Our solution to this issue is called the Mapper
[Malloch, Sinclair and Wanderley, 2008].
The Digital Orchestra Mapper was designed as a decentralized network for the
management of peer-to-peer data connections using the Open Sound Control
communication protocol.
An intuitive graphical user interface was developed for dynamically creating,
modifying, and destroying mapping connections between control data streams and
synthesis parameters (cf. figure 3).
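The core idea of the Mapper can be sketched as a connection table between named controller signals and synthesis inputs, where connections can be created, modified and destroyed on the fly. The real Mapper is a decentralized network speaking Open Sound Control; this toy, in-process version is an assumption-laden sketch that models only the connection table and range scaling, with hypothetical signal names.

```python
# Toy sketch of the Mapper's connection model: dynamic, named connections
# between controller signals ("sources") and synthesis inputs
# ("destinations"), each with its own range scaling. The real Mapper is a
# decentralized OSC network; signal names below are hypothetical.

class ToyMapper:
    def __init__(self):
        # (source, destination) -> (in_range, out_range)
        self.connections = {}

    def connect(self, source, destination,
                in_range=(0.0, 1.0), out_range=(0.0, 1.0)):
        self.connections[(source, destination)] = (in_range, out_range)

    def disconnect(self, source, destination):
        self.connections.pop((source, destination), None)

    def route(self, source, value):
        """Return {destination: scaled_value} for every live connection."""
        out = {}
        for (src, dst), ((i0, i1), (o0, o1)) in self.connections.items():
            if src == source:
                t = (value - i0) / (i1 - i0)      # normalize input
                out[dst] = o0 + t * (o1 - o0)     # rescale to output
        return out

m = ToyMapper()
m.connect("/tstick/accel/x", "/granular/grain_rate",
          in_range=(-1.0, 1.0), out_range=(1.0, 50.0))
print(m.route("/tstick/accel/x", 0.0))  # -> {'/granular/grain_rate': 25.5}
```

Because connections live in a table rather than in patching code, several collaborators can add, rescale or remove mappings during a session without touching the controller or synthesis implementations, which is the collaborative workflow the text describes.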
Another software tool that proved essential in the project was the Digital Orchestra
Toolbox [Malloch et al. 2010], a large collection of Max/MSP objects that help the
design of novel DMIs by providing tools for data conditioning and processing.
Figure 3. The current graphical interface for the Mapper, developed in Max/MSP. On the left side (Sources), controller
parameters are displayed. Raw sensor outputs – in the case of the Soprano T-Stick used in this figure, the values of
accelerometer x, y and z axes, the piezo crystal, the pressure sensor and the capacitive sensors – and higher-level
parameters are available, such as information about the grip or the amount of energy used by the performer. On the right
side (Destinations), the inputs of the synthesis algorithm are displayed, in this case, granular synthesis inputs. The middle
part of the figure allows the designer to create connections between both sides, effectively establishing mappings between
controller and synthesis variables. Furthermore, as seen in the top part of the figure, designers can condition the mappings
by defining an expression describing the desired behaviour, by limiting input and output ranges, etc. A full
description of the functionalities of the Mapper is presented in [Malloch et al. 2008].
Composition
An important requirement for the longevity of an instrument is the existence of a
compelling repertoire of works that serves to motivate future performances. One of
the principal goals of including composers in the development process and including
performances in a professional context was to attempt to create viable works of art
with value beyond simply demonstrating the capabilities of the instruments.
In order to create a corpus of music, it became apparent that the development of an
appropriate system of notation was crucial. In our experience, three different
approaches are used: (1) a metaphorical representation of the resulting sound; (2) a
graphic representation of the type of gesture desired; or, (3) a symbolic system with
no obvious links to either gesture or sonic result. In practice, the type of notation that
we have used includes a combination of traditional musical notation with the addition
of new graphical elements representing any or all of the above three approaches.
In Figure 4, for example, small circles of various types in the part for the FM Gloves
[Marshall, 2010] (indicated on the middle staff as “gl.”) act as a metaphorical
representation of short sounds created by tapping gestures of the fingers against the
thumbs. Note that the same notation of sonic result is used in the cello part on the top
staff, which uses different gestures to create the same result. The small curved arrow
in the first measure of the glove part, on the other hand, graphically represents the
desired gesture (a rotation of the hand) but does not indicate the sounding result.
Figure 4. An excerpt of the score for The Long and the Short of It, for cello and two digital
musical instruments, by Heather Hindman.
In Figure 5, the length of the vertical bars on the three-line staff is a graphical
indication of the gesture of gripping the t-stick. The bars represent the
body of the instrument, and their length indicates the grip to be used. In this
case the vertical bars are at their longest, indicating that the entire instrument should
be gripped with both hands. The figure in the circle above the barline indicates, among
other things, the physical orientation of the instrument; in this case it is to be held
vertically. The square grids above the staff are a kind of tablature notation that
indicate two separate instances of a synthesis algorithm that are combined to create
the voice of the instrument. The position of each icon in the two-dimensional space of
the grid indicates the variation of two control parameters for the algorithm. The
performer is to interpolate between the instances of this notation using whatever
gestures will create the desired result. The change of the star between the first two
grids in this example indicates that only one of these parameters is to change, while
the lack of change to the circle indicates that the control parameters of the second
instance are not to change. Note that these indications are symbolic and correspond
directly neither to the gesture to be used nor to the type of sound that is
created. In performance, the performer is able to confirm that he or she is accurately
producing the desired effect by referring to a matching graphical interface. Composer
D. Andrew Stewart developed this dynamic interface to be displayed on a computer
screen and used by the performer to provide visual feedback during the rehearsals and
performance of a piece: see Figure 6.
Figure 5. Notation of the t-stick part for Catching Air and the Superman.
Figure 6. Examples of the dynamic graphical user interface for t-stick notation developed by
D. Andrew Stewart for his work Catching Air and the Superman. The three grids indicate the
state of the display at each of the three points notated in the score in Figure 5. Only one grid is
visible on the screen.
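The parameter interpolation that this tablature notation asks of the performer can be modeled, in the simplest case, as a linear ramp between two sets of control parameters. The Python sketch below uses hypothetical parameter names (`star_x`, `star_y`) and invented values; it illustrates the notational idea, not software used in the project.

```python
# Sketch of interpolating between two notated grid states, each a
# pair of (x, y) control parameters for one synthesis instance.
# Parameter names and values are hypothetical illustrations.

def interpolate(state_a, state_b, t):
    """Linearly interpolate each control parameter; t in [0, 1]."""
    return {k: (1.0 - t) * state_a[k] + t * state_b[k] for k in state_a}

# First grid: star at (0.2, 0.8); second grid: only the star's
# x parameter changes, as in the score excerpt discussed above.
grid1 = {"star_x": 0.2, "star_y": 0.8}
grid2 = {"star_x": 0.7, "star_y": 0.8}

halfway = interpolate(grid1, grid2, 0.5)
print(halfway)  # star_x moves toward 0.45; star_y is unchanged
```

In performance, of course, the ramp is realized by the player's gestures rather than computed, which is precisely why the matching on-screen display is needed as feedback.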
One unexpected development during the Digital Orchestra Project was the role
composers took in developing the mapping of gestures to the parameters of the
digital synthesis algorithms that served as the voices of the instruments. In our
original conception, mapping was placed within the domain
of the instrument designers. In practice, however, the designers created powerful tools
for mapping (e.g. The Mapper, the Digital Orchestra Toolbox), but the responsibility
for the actual implementation was shared with the composers. We thus came to see
mapping ultimately as being as much an artistic as a technological activity.
As indicated by its title, an important compositional goal of the Digital Orchestra
Project was the integration of digital musical instruments into both large and small
ensemble settings. This included the combination of different digital musical
instruments into a chamber ensemble, as in D. Andrew Stewart’s Sounds Between
Our Minds for three DMIs, or into an ensemble including both digital and traditional
acoustic musical instruments, as in Heather Hindman’s The Long and Short of It, for
an ensemble of two digital instruments and cello, or in Sean Ferguson’s Ex Asperis,
for two gestural controllers and large chamber ensemble. One of the greatest
challenges of this approach was the blending of the voices of the different instruments
in a coherent manner. Different techniques were used to accomplish this. For
example, in D. Andrew Stewart’s large ensemble work Catching Air and the
Superman – for two new digital musical instruments, keyboard MIDI controller and a
chamber ensemble of 13 instrumentalists – the composer first created the synthesized
voices of the two DMIs and then performed spectral analyses on these sounds, using
the most prominent partials to derive the microtonal pitches that formed the harmony
of the chamber ensemble. In this way, the composer was able to successfully integrate
the harmony of the traditional instruments into the timbral world of the DMIs. Other
techniques included, for instance, ensuring that the voices of the digital instruments
occupied different registers and had different timbres and attacks. In order to create a
sense of ensemble, composers often chose to isolate each instrument’s voice in a
single loudspeaker placed close to the performer, rather than, say, mixing all of the
instruments into a single stereo or surround audio environment. In this way it was
hoped that listeners would be able to use spatial cues to associate a particular voice
with a physical instrument on stage.
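The harmony-derivation technique described above (spectral analysis of a synthesized DMI voice, selection of its most prominent partials, and quantization of their frequencies to microtonal pitches) can be sketched generically with an FFT peak pick and quarter-tone rounding. This is an illustration of the general approach, not the composer's actual workflow or tools; the toy signal and its partial frequencies are invented.

```python
# Generic sketch: analyze a synthesized voice, keep its strongest
# partials, and round their frequencies to quarter-tone pitches
# (fractional MIDI note numbers) for an acoustic ensemble.
import numpy as np

def prominent_partials(signal, sr, n_partials=4):
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / sr)
    peaks = np.argsort(spectrum)[-n_partials:]      # strongest bins
    return sorted(freqs[peaks])

def to_quartertone_midi(freq):
    midi = 69.0 + 12.0 * np.log2(freq / 440.0)      # fractional MIDI
    return round(midi * 2.0) / 2.0                  # nearest quarter-tone

# A toy "DMI voice": 440 Hz plus inharmonic partials at 700 and 1130 Hz
sr = 8000
t = np.arange(sr) / sr
voice = (np.sin(2 * np.pi * 440 * t)
         + 0.6 * np.sin(2 * np.pi * 700 * t)
         + 0.4 * np.sin(2 * np.pi * 1130 * t))
partials = prominent_partials(voice, sr, n_partials=3)
print([to_quartertone_midi(f) for f in partials])   # -> [69.0, 77.0, 85.5]
```

Because the acoustic instruments then play pitches drawn from the analyzed spectrum, their harmony sits inside the timbre of the electronic voice rather than beside it.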
Performance
In order to evaluate DMIs for their musical potential, it was crucial to have input from
advanced artists who had already experienced a degree of artistic symbiosis with their
own instruments, whether it be cello, piano, percussion or any other instrument. The
performers involved in the Digital Orchestra provided constant feedback to the
instrument designers and composers throughout the entire three-year period. Since the
instruments were not based on existing acoustic instruments, many different types of
performers were involved (pianists, percussionists, cellists, etc.).
Furthermore, performers were not necessarily required to have had previous
experience with gestural controllers.
Their contributions took a number of different forms. They evaluated the physical
aspects of the instruments, such as size and weight. At one point, for example,
concern over possible injury to performers due to the heavy weight of one instrument
caused it to be rejected for future use in the Digital Orchestra, as described above.
They also evaluated the “feel” of the materials of the instruments and helped in the
fine-tuning of the tactile feedback that the instruments provided. The performers were
particularly vociferous in their demands for low and consistent latency between
gestures and their sonic results.
The composers also benefitted from the performers’ expertise. The instrumentalists
worked closely on the development of the notation for the instruments and of
idiomatic playing techniques. Another important area of collaboration was the
appropriateness of physical gestures to musical material.
Figure 7. D. Andrew Stewart demonstrating two different playing techniques on the T-stick:
tapping (left) and jabbing (right). In tapping, the arms remain stationary, and the sensed finger
impacts, together with the orientation of the T-Stick, provide the control variables. The top image
shows the hands in position just before a multi-finger tapping gesture, while the bottom image
shows the moment of contact. In jabbing, large amplitude gestures are produced by thrusts of
the instrument so that the orientation and the acceleration of the device are the main control
variables, as well as hand positioning. The top image is the end point of an upwards jab and
the bottom image is the preparation for a horizontal jab.
Since our goal was to design instruments that had the potential to be performed by
many different artists, we felt it would be valuable to observe the process of the
adoption of a new instrument by a performer who had not participated in its
development. At the end of the second year of the project, cellist Erika Donald joined
the Digital Orchestra to perform the soprano t-stick in Heather Hindman’s piece The
Long and the Short of It. Ms. Donald was asked to document her experience in
learning this new instrument. Her contributions were particularly valuable, since they
gave the point of view of a performer approaching a new DMI for the first time. She
was given approximately a year to develop expertise on the instrument before
performing in concert.
Given our stated desire to create instruments with expressive potential, as described
earlier, a fair question is: to what extent were we successful? If the goal was to
achieve the same degree of refinement as a traditional orchestral instrument such as
the cello within the confines of a three-year research project, then we did not succeed.
But if the goal was to use an interdisciplinary approach to move in the proper
direction so that the performers felt that they were able to reliably express their
musical ideas without undue interference from the instrument, then we believe that
the project has been a success. Furthermore, we feel that continuing to work in this
way will allow us to move closer to this goal by incorporating feedback from
experienced performers into the process of designing and refining DMIs. Ultimately,
as in any musical endeavor, the success or failure of the Digital Orchestra Project
should be judged by its artistic results. We feel that the enthusiastic response we
received to the performances by the Digital Orchestra, both from the general public
and from those familiar with the domain of digital musical instruments, is a good
indication that we were able to achieve our goals to a large degree.
Conclusion
The McGill Digital Orchestra presents a number of paradigms for the design, creation
and performance of digital musical instruments in the context of a long-term
interdisciplinary, collaborative environment. Issues related to mapping strategies,
notation, the relationship of physical and musical gestures, robustness, and
responsiveness arose during the course of the project. Furthermore, the Mapper
software and Digital Orchestra Toolbox continue to be used in other contexts.
Based on our experience, we propose that one effective measure for the evaluation of
a digital musical instrument is its ability to reproduce a performance of a particular
piece, either by the same performer or by different performers. This involves the
ability to realize a piece based on a notated score, whether on paper or using software-based visual feedback in a graphical environment. We suggest that this may aid in
ensuring the viability and longevity of a novel digital musical instrument.
The results of this long-term, multidisciplinary approach to digital musical instrument
design include interfaces that have been proven in high-profile professional
performance contexts and that are still actively used by several performers
worldwide.
Acknowledgements
We would like to thank all of the research assistants who participated in the Digital
Orchestra Project: David Birnbaum, Simon de Leon, Chloé Dominguez, Erika
Donald, Heather Hindman, Bryan Jacobs, Rodolphe Koehly, Joseph Malloch, Mark
Marshall, Xenia Pestova, Fernando Rocha, Bertrand Scherrer, Marlon Schumacher,
Stephen Sinclair, D. Andrew Stewart and Kent Walker. Special thanks to Joseph
Malloch for comments on earlier versions of this paper. Thanks also to the
MusiMarch Festival of the Schulich School of Music, and its Artistic Director, Denys
Bouliane.
The Digital Orchestra Project received funding from the Fonds québécois de
recherche sur la société et la culture and the Centre for Interdisciplinary Research in
Music Media and Technology. Research was carried out in the Schulich School of
Music of McGill University at the Input Devices and Music Interaction Laboratory
and the Digital Composition Studios.
References
Burtner, Matthew. “Noisegate 67 for Metasaxophone: Composition and Performance
Considerations of a New Computer Music Controller.” In Proceedings of the 2002
International Conference on New Interfaces for Musical Expression (NIME02), Dublin,
Ireland, pp. 71-76, 2002.
Burtner, Matthew. “Composing for the (dis)Embodied Ensemble: Notational Systems in
(dis)Appearances.” In Proceedings of the 2003 International Conference on New Interfaces
for Musical Expression (NIME03), Montreal, Canada, pp. 63-69, 2003.
Butler, Jennifer. “Creating Pedagogical Etudes for Interactive Instruments.” In Proceedings of
the 2008 International Conference on New Interfaces for Musical Expression (NIME08),
Genoa, Italy, pp. 77-81, 2008.
Cook, Perry R. “Principles for Designing Computer Music Controllers.” In online Proceedings
of New Interfaces for Musical Expression Workshop (NIME01), Seattle, WA, 2001.
Available from: http://www.nime.org/2001/ Accessed on Sept 06, 2010.
Cook, Perry R. “Re-Designing Principles for Computer Music Controllers: a Case Study of
SqueezeVox Maggie.” In Proceedings of the 2009 International Conference on New
Interfaces for Musical Expression (NIME09), Pittsburgh, USA, pp. 218-221, 2009.
Dobrian, Christopher and Daniel Koppelman. “The 'E' in NIME: Musical Expression with
New Computer Interfaces.” In Proceedings of the 2006 International Conference on New
Interfaces for Musical Expression (NIME06), Paris, France, pp. 277-282, 2006.
Haken, Lippold, Ed Tellman and Patrick Wolfe. “An Indiscrete Music Keyboard.” Computer
Music Journal 22(1):31-48, 1998.
Jordà, Sergi. Digital Lutherie: Crafting musical computers for new musics' performance and
improvisation. Ph.D. Thesis, Universitat Pompeu Fabra, 2005.
Koehly, Rodolphe, Denis Curtil, and Marcelo M. Wanderley. "Paper FSRs and Latex/Fabric
Traction Sensors: Methods for the Development of Home-Made Touch Sensors". In
Proceedings of the International Conference on New Interfaces for Musical Expression
(NIME), pp. 230-233, Paris, France, 2006.
Laubier, Serge de. “The Meta-Instrument.” Computer Music Journal 22(1):25-29, 1998.
Laubier, Serge de and Vincent Goudard. “Meta-Instrument 3: a look over 17 years of practice.”
In Proceedings of the 2006 International Conference on New Interfaces for Musical
Expression (NIME06), Paris, France, pp 288-291, 2006.
Lyons, Michael, Michael Haehnel and Nobujo Tetsutani. “Designing, Playing, and Performing
with a Vision-based Mouth Interface.” In Proceedings of the 2003 International Conference
on New Interfaces for Musical Expression (NIME03), Montreal, Canada, pp. 116-121, 2003.
Machover, Tod. “Hyperinstruments - a Composer's Approach to the Evolution of Intelligent
Musical Instruments.” In L. Jacobson, ed. Cyberarts: Exploring Arts and Technology. San
Francisco: Miller Freeman Inc., pp. 67-76, 1992.
Malloch, Joseph, David Birnbaum, Elliot Sinyor, and Marcelo M. Wanderley. "A New
Conceptual Framework for Digital Musical Instruments." In Proceedings of the 9th
International Conference on Digital Audio Effects (DAFx-06), Montreal, Canada, pp. 49-52,
2006.
Malloch, Joseph. A Consort of Gestural Musical Controllers: Design, Construction, and
Performance, M.A. Thesis, McGill University, August 2007.
Malloch, Joseph, and Marcelo M. Wanderley. "The T-Stick: From Musical Interface to Musical
Instrument." In Proceedings of the 2007 International Conference on New Interfaces for
Musical Expression (NIME07), New York City, USA, pp. 66-69, 2007.
Malloch, Joseph, Stephen Sinclair, and Marcelo M. Wanderley. "A Network-Based Framework
for Collaborative Development and Performance of Digital Musical Instruments." In R.
Kronland-Martinet, S. Ystad, and K. Jensen (Eds.): CMMR 2007 - Proc. of Computer Music
Modeling and Retrieval 2007 Conference, LNCS 4969. Berlin Heidelberg: Springer-Verlag,
pp. 401–425, 2008.
Malloch, Joseph, Stephen Sinclair, and Marlon Schumacher. Digital Orchestra Toolbox –
http://www.idmil.org/projects/digital_orchestra_toolbox Accessed on Sept 06, 2010.
Marshall, Mark. The FM Gloves – www.marktmarshall.com/projects/the fm gloves Accessed
on March 17, 2010.
Marshall, Mark. Physical Interface Design for Digital Musical Instruments. Ph.D. thesis,
McGill University, March 2009.
Marshall, Mark T., Joseph Malloch and Marcelo M. Wanderley. "Gestural Control of Sound
Spatialization for Live Musical Performance." In M. S. Dias, S. Gibet, M. M. Wanderley
and R. Bastos (Eds.): Gesture-Based Human-Computer Interaction and Simulation · 7th
International Gesture Workshop, GW 2007, Lisbon, Portugal, May 23-25, 2007, Revised
Selected Papers. LNCS series Vol. 5085, Springer Verlag, pp. 227-238, 2009.
Miranda, Eduardo R. and Marcelo M. Wanderley. New Digital Musical Instruments: Control
and Interaction beyond the Keyboard, A-R Editions, Spring 2006. ISBN 0-89579-585-X
Oore, Sageev. “Learning Advanced Skills of New Instruments (or practicing scales and
arpeggios on your NIME).” In Proceedings of the 2005 International Conference on New
Interfaces for Musical Expression (NIME05), Vancouver, Canada, pp. 60-64, 2005.
Overholt, Dan. “The Musical Interface Technology Design Space.” Organised Sound 14(2):
217-226, 2009.
Palacio-Quintin, Cléo. “Eight Years of Practice on the Hyper-Flute: Technological and Musical
Perspectives.” In Proceedings of the 2008 International Conference on New Interfaces for
Musical Expression (NIME08), Genoa, Italy, pp. 293-298, 2008.
Pestova, Xenia, Erika Donald, Heather Hindman, Joseph Malloch, Mark T. Marshall, Fernando
Rocha, Stephen Sinclair, D. Andrew Stewart, Marcelo M. Wanderley, and Sean Ferguson.
"The CIRMMT/McGill Digital Orchestra Project." In Proceedings of the 2009 International
Computer Music Conference (ICMC09), Montreal, Canada, pp. 295-298, 2009.
Piringer, Jörg. “Elektronische Musik und Interaktivität: Prinzipien, Konzepte,
Anwendungen.” Master's Thesis, University of Vienna, 2001.
Ryan, Joel. “Effort and Expression.” In Proceedings of the 1992 International Computer Music
Conference (ICMC92). San Francisco, International Computer Music Association, pp. 414-416, 1992.
Schloss, W. Andrew and David A. Jaffe. “Intelligent Musical Instruments: The Future of
Musical Performance or the Demise of the Performer?” Interface: Journal of New Music
Research, 22(3), 1993.
Siegel, Wayne, and Jens Jacobsen. “The Challenges of Interactive Dance: An Overview and
Case Study.” Computer Music Journal, 22(4):29-43, 1999.
Silva, Andrey, Marcelo M. Wanderley and Gary Scavone. “On the Use of Flute Air Jet as a
Musical Control Variable.” In Proceedings of the International Conference on New
Interfaces for Musical Expression (NIME05), Vancouver, Canada, pp. 105-108, 2005.
Sinyor, Elliot and Marcelo M. Wanderley. "Gyrotyre: A Hand-held Dynamic Computer-Music
Controller Based on a Spinning Wheel." In Proceedings of the 2005 International Conference
on New Interfaces for Musical Expression (NIME05), Vancouver, Canada, pp. 42-45, 2005.
Stewart, D. Andrew. Catching Air and the Superman, D.Mus. Thesis, McGill University.
January 2010.
Tanaka, Atau. “Musical Performance Practice on Sensor-based Instruments” In M. Wanderley
and M. Battier (eds), Trends in Gestural Control of Music. Paris: IRCAM, pp 389-405,
2000.
Wanderley, Marcelo M. and Marc Battier, editors. Trends in Gestural Control of Music, Paris,
Fr: IRCAM - Centre Georges Pompidou, 2000.
Wanderley, Marcelo M. and Philippe Depalle. “Gestural Control of Sound Synthesis.” Proc. of
the IEEE 92(4):632-644, 2004.
Wanderley, Marcelo M., Joseph Malloch, David Birnbaum, Elliot Sinyor, and Julien Boissinot.
"SensorWiki.org: A Collaborative Resource on Transducers for Researchers and Interface
Designers." In the Proceedings of the 2006 International Conference on New Interfaces for
Musical Expression (NIME06), Paris, France, pp. 180-183, 2006.
1. www.arduino.cc
2. As well as the orientation of the hands.
3. See, for example, the discussion of “expressivity” in [Malloch et al. 2006].
4. Both software tools are freely available from the IDMIL website at www.idmil.org/software
Biographies
Sean Ferguson is Associate Professor of Composition at the Schulich School of Music of
McGill University, in Montreal, Canada, where he directs the Digital Composition Studios.
Since 2009 he has also been the Director of the Centre for Interdisciplinary Research in Music
Media and Technology. His music has been performed by ensembles such as the Montreal
Symphony Orchestra, the Orchestre Philharmonique de Radio-France, the Ensemble
contemporain de Montréal and the Société de musique contemporaine de Montréal. In 2009 he
served as the Music Chair of the ICMC. His research interests include computer-assisted
composition, live electronics and digital musical instruments.
Marcelo Mortensen Wanderley holds a Ph.D. degree from the Université Pierre et Marie
Curie (Paris VI), France, in acoustics, signal processing, and computer science applied to
music. His main research interests include gestural control of sound synthesis, input device
design and evaluation, and human–computer interaction. In 2003 Dr. Wanderley was the Chair
of the International Conference on New Interfaces for Musical Expression (NIME03). In 2006,
he co-authored the textbook “New Digital Musical Instruments: Control and Interaction
Beyond the Keyboard”, A-R Editions. He is currently Associate Professor of Music
Technology at the Schulich School of Music, McGill University, Montreal, Canada.