3DMIN — Challenges and Interventions in Design, Development and Dissemination of New Musical Instruments

Till Bovermann, Amelie Hinrichsen, Dominik Hildebrand Marques Lopes, Alberto de Campo
Berlin University of the Arts
[t.bovermann|a.hinrichsen|d.himalo|decampo]@udk-berlin.de

Hauke Egermann, Alexander Foerstel, Sarah-Indriyati Hardjowirogo, Andreas Pysiewicz, Stefan Weinzierl
Technical University Berlin
[email protected]

ABSTRACT

This paper presents challenges in the design, development and dissemination of electronic and digital musical instruments as they were identified within a newly established interdisciplinary research project. These challenges, covering a range of theoretical, artistic and practical perspectives, fall into the categories Embracing Technology, Musicology-informed Design, Researching and Integrating Embodiment, and Aesthetics and Artistic Values. We illustrate how the research project provides interventions and measures to the community that relate to these challenges. Furthermore, we intend to investigate conditions for the success of new musical instruments with respect to their design and both their scientific and artistic values.

© Copyright 2014 Till Bovermann, Amelie Hinrichsen, Dominik Hildebrand Marques Lopes, Alberto de Campo et al. This is an open-access article distributed under the terms of the Creative Commons Attribution 3.0 Unported License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

1. MOTIVATION

The emergence of electronic sound synthesis not only created a vast repertoire of new musical sounds and techniques, it also allowed performers to think independently of the control modality and the sonic gestalt of an instrument. This led to an abandonment of stage expressiveness in the post-1945 period, which drastically reduced the role of the formerly immanent relation between player, instrument and sound generation. Over the course of the last two decades, a rediscovery of live electronic performance took place. The previously prevalent paradigm of absolute control for the composer was again complemented by a desire for live performance, in the form of both interpretation and improvisation. This trend called (and still calls) for instruments that provide expressivity for live music and is reflected in the research interests of academic communities such as those around the NIME and ICMC conferences [1].

As the authors of this paper, we originate mostly from neighbouring disciplines, and thus want to risk an outside look into the challenges within the ICMC community and propose ideas for possible design strategies and interventions. Grounded in both literature review and artistic experience, Section 2 will summarise what we consider the most interesting challenges and propositions. Subsequently, we describe paths that emerge from combining these ideas with our research interests and report on a new project for the design, development and dissemination of new musical instruments (3DMIN), integrating an interdisciplinary group of researchers and artists (Section 3). 3DMIN gathers expertise in fields as diverse as musicology, psychology, composition, experimental music, sound art, art and design research, computational art, robotics, sonification, musical acoustics and audio technology, all neighbour disciplines to NIME- and ICMC-related research. We believe that the diverse backgrounds involved will lead to an integrative view on ICMC research that will benefit the community and result in fruitful discussions and contributions.

2. CHALLENGES FOR THE DESIGN OF NEW MUSICAL INSTRUMENTS

In our view, the design, development, and dissemination of new musical instruments currently faces challenges from various areas, which we list below.
We recognise that some of them are already well known within the community; nevertheless, we found it useful to restate them here in order to frame our research interests.

2.1 (a) Embracing Technology

A conventional acoustic instrument integrates control and sound synthesis into one monolithic artefact. Altering any one of its elements will affect the qualities of the others. In modern instrument design, however, it is possible to add an independent mapping layer between any two of the components, which allows them to be separated and therefore makes their qualities less intertwined [2]. An explicitly introduced mapping between instrument components allows the player to freely select a control structure that supports her in navigating the instrument's parameter space. This will, however, affect the instrument's character and therefore its playability. According to Wessel, an instrument with a fast and sensitive mapping of physical action to adjustments of underlying sound synthesis processes makes performers experience a direct link to the sound creation, creating control intimacy [2]. From a performer's perspective, it only makes sense to add new parts to the repertoire of instrument building if the specifics of the introduced technology add to its sound and control repertoire, and if they possess an actual chance of being used by the performer. Although it is possible to implement such mapping interfaces on an (electro-)mechanical level, 1 their true potential only unfolds with the introduction of (digital) software elements and, with them, scriptability of the components, i.e., the ability for programmatic control [3]. Scriptability adds flexibility to instruments because it not only allows one to define and alter the mapping between two instrument components on the fly, but also enables one to define and even redefine the components themselves (as long as they exist in software).
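Such a scriptable mapping layer can be sketched in a few lines of Python. This is a hedged illustration only: the class and control names (MappingLayer, fader1) are invented for this example and are not part of any actual instrument software discussed here.

```python
# Minimal sketch of an explicit, scriptable mapping layer (hypothetical
# names). Control events and synthesis parameters are decoupled: the
# mapping between them is a plain function that can be redefined while
# the instrument is "running".

class MappingLayer:
    def __init__(self):
        self.routes = {}  # control name -> function(value) -> (param, value)

    def route(self, control, fn):
        """Define or redefine how one control drives the synthesis layer."""
        self.routes[control] = fn

    def process(self, control, value):
        """Translate a control event into a synthesis parameter update."""
        if control in self.routes:
            return self.routes[control](value)
        return None

synth_params = {"freq": 440.0, "amp": 0.5}

mapping = MappingLayer()
# Initial mapping: a fader (0..1) controls amplitude linearly.
mapping.route("fader1", lambda v: ("amp", v))

param, value = mapping.process("fader1", 0.8)
synth_params[param] = value  # amp is now 0.8

# Redefine the same control on the fly: the fader now sweeps pitch
# exponentially over two octaves -- the artefact stays the same,
# but its character (and hence playability) changes.
mapping.route("fader1", lambda v: ("freq", 220.0 * 2 ** (2 * v)))

param, value = mapping.process("fader1", 0.5)
synth_params[param] = value  # freq is now 440.0
```

The point of the sketch is that the mapping is data, not hardware: replacing one function rewires the instrument mid-performance without touching either the controller or the synthesis process.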
Scriptability, and more generally algorithmic elements, add even more to an instrument, since they allow switching between levels of control: while conventional direct control, as is common for acoustic instruments, lets the performer play sounds directly and directly only, meta-control levels add time and timbre automation, ranging from underlying musical material that captures small phrases or gestures up to larger sections or entire pieces. Meta-control allows steering of musical processes on multiple layers, from controlling prepared sequences to modifying control gestures (whether algorithmically generated, recorded in advance or played just in time) to adapt them to the evolving musical context of a performance.

Spatialisation of sound became a central compositional design category of electro-acoustic music and sound art in the 20th century, yet it is very often used only in the pre-performative composition process. While sound has an inherent spatial dimension, instrumentalists rarely integrate realtime control of spatial parameters into performances. The integration of controls for spatial sound parameters into instruments opens up a new performative dimension and adds to the experience of a musical instrument because it offers the performer a way to entangle spatial and musical parts in her playing.

2.2 (b) Musicology-informed Design

Considering the increasingly wide range of manifestations of musical instruments and a rapidly evolving diversity in performance practice, the question arises whether previous approaches to classifying electronic musical instruments, and their taxonomies, are adequate to classify the rich stream of new instrument designs. Existing classification models [5, 6, 7] are challenged even more by the fact that the paradigm of the "musical instrument" as a merely interpretative tool has often shifted towards that of an independent, self-contained artwork.
New taxonomies for new musical instruments, in conjunction with a thorough examination of the epistemological principles of musical instrument classifications in general and of the notion of the musical instrument in particular, will help to account for paradigm shifts in instrument development, as well as in musical practice.

1 See, e.g., accordions, in which various types of key layouts are mechanically implemented.
2 This implicitly includes just-in-time adjustments of digital signal processing parts, too [4].

While taking recent classification approaches into consideration [8, 9, 10, 11], a database of historical and contemporary electronic and digital musical instruments, developed in the course of the project, shall be used to frame a systematisation method that no longer treats electronic and digital instruments as theoretical exceptions, but analyses them in the context of traditional musical instruments and seeks to order them according to technical-functional as well as musical-aesthetic aspects. This process then feeds back into the creation process, establishing the informed (artistic) research method of experience as thinking [12].

Systematic documentation of both historical and current instrument designs makes them available as artefacts that can be built and modified by anyone interested. By means of open source software, open design hardware, and detailed audiovisual documentation of performance practice as related to design intentions, experimental new instruments may find wider interest and artistic use. Making new instrument designs as easy as possible to understand, access and build will help widen their distribution, and will facilitate future research both extending and diverging from previous instrument designs.

2.3 (c) Researching and Integrating Embodiment

Recent theories of music production and listening propose an integrated model of motion and perception (e.g., Embodied Music Cognition [13]).
Because many new musical instruments rely heavily on gestural control, this theory can be employed to study interaction processes between gestures and sound generation in musical instruments. Leman puts it as follows: “transparent mediation technology [. . . ] would then act as a natural mediator [. . . ] for interactive music-making.” (Leman [13, p. 2])

Imagine observing a craftsman in her daily routine: a certain elegance can be discovered in her trained and countlessly repeated movements. She does not have to think explicitly about the individual mechanical tasks because she is familiar with the use of her instruments. Her body has memorised everything necessary; the process of crafting is automatised. Following Godøy's and Leman's concept of the body schema, an equivalent moment of such a daily routine can be found in a musical context: “Trained musicians, for example, can play a particular melodic figure by heart. They do not have to think about how to move their fingers on the instrument [. . . ]. [. . . ] the melodic pattern is just something that appears to come out of their body.” (Godøy [14, p. 8])

It is not only the tool itself that becomes, in a Heideggerian sense, ready to hand [15]: it is the combination of the instrument with the corresponding motor patterns [14] that turns the whole process of acting into an “embodied extension” of the human body. Creating an instrument therefore concerns not only the interface itself but also the routines and patterns merging the object with the subject. Dobrian and Koppelman argue that this process of merging is fundamental for developing one's own style and expression: “When control of the instrument has been mastered to the point where it is mostly subconscious, the mind has more freedom to concentrate consciously on listening and expression.” (Dobrian and Koppelman [16, p.
279]) In other words, it is the embodied memory that makes virtuosity, and therefore musical expression, possible. Thus, physical interaction between instrument and musician is crucial in order to provide sensory connectedness to the performer. This relation should, however, not be reduced to a simple touch / no-touch range of possibilities. A double bass player, for example, literally embraces her instrument while playing: both bodies are physically involved in this confrontation of instrument and musician, and gestures and postures are provoked simply by the size and shape of instrument and player. Playing an instrument expressively requires and enables movement of the whole body. Therefore, within the process of developing an instrument, the designer has to include the performer's whole body as the corresponding element. This, however, does not mean that the performer's whole body is controlling the instrument, but rather that her body posture (implied by the instrument's affordance) together with her actions creates the emotional framework for her musical expression.

We believe that gestures of the player can be translated into control commands to form an expressive, transparent, and therefore intuitive way to play. We further hypothesise that, by making use of embodied gesture-sound knowledge as it is acquired in everyday life, the design of musical instruments can be substantially improved. This would result in instruments that are easier to play and perceive, especially when input gestures and sound output follow the regularities of everyday gesture-sound mappings. There are appropriate metaphors in gestures, movement and postures that have to be found in order to enhance the subconscious relation between musician and musical instrument. Furthermore, dedicated types of physical interaction, especially energy-preserving mechano-acoustic links, are important for the cognitive and emotional reception (i.e., readability) of a live performance.
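One everyday gesture-sound regularity is that more vigorous movement yields louder, brighter sound, as when striking an object harder. The following sketch illustrates how such a regularity might be carried over into a digital mapping; all names and numeric ranges here are illustrative assumptions, not an API or parameterisation from the project.

```python
# Hedged sketch: map the energy of a sampled movement trajectory to
# amplitude and brightness, mirroring the mechano-acoustic regularity
# that vigorous gestures produce louder, brighter sounds.

def gesture_energy(samples, dt):
    """Mean squared velocity of a sampled 1-D movement trajectory."""
    vels = [(b - a) / dt for a, b in zip(samples, samples[1:])]
    return sum(v * v for v in vels) / len(vels)

def map_to_sound(energy, max_energy=100.0):
    """Map gesture energy to an amplitude (0..1) and a brightness
    target in Hz, clipped to plausible ranges (values are assumptions)."""
    norm = min(energy / max_energy, 1.0)
    amp = norm ** 0.5                # perceptually gentler than linear
    brightness = 500.0 + 4000.0 * norm
    return amp, brightness

slow = [0.0, 0.01, 0.02, 0.03]      # gentle movement, sampled at 100 Hz
fast = [0.0, 0.2, 0.5, 0.9]         # vigorous movement

e_slow = gesture_energy(slow, dt=0.01)
e_fast = gesture_energy(fast, dt=0.01)
assert e_fast > e_slow              # faster gesture carries more energy

amp_s, br_s = map_to_sound(e_slow)
amp_f, br_f = map_to_sound(e_fast)
# the vigorous gesture comes out louder and brighter
```

The hypothesis stated above predicts that mappings of this kind, which follow everyday gesture-sound regularities, are easier to play and to read as an audience member than arbitrary ones.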
2.4 (d) Aesthetics and Artistic Values

An important aspect of instruments and their design is their aesthetic and artistic value, be it connected to their playability, their playing context or their expressivity.

“if our goal is musical expression we have to move beyond designing technical systems.[//] we have to move beyond symbolic interaction.[//] we have to transcend responsive logic;[//] engage with the system:[//] power it and touch it with our bodies,[//] with our brains.[//] invent it and discover it's [sic] life;[//] embrace it as instrument.[//] an instrument that sounds between our minds” (Waisvisz [1])

It is frequently stated that it is easier to design a musical piece than an artefact [17]. However, in the light of Waisvisz's quote, there is an inherent quality to an instrument that is independent of the piece it was designed for. In contrast to a performance system built specifically for one piece, an instrument offers ways to mediate and catalyse a broad range of artistic intentions. Artists also often express that something that might be called the sustainable joyfulness of an instrument is a major factor in their acceptance of it: it is important to design a musical instrument such that it offers paths through its (sonic and haptic) possibility space that feel natural yet surprising over long periods of playing. Another aesthetic quality of an instrument lies in the context in which it is used and how it is then perceived as being ready to hand. There is a difference between the artist's perception of an instrument when considering stage use (with an audience) and personal use (without any, or with only a small, audience).

3. THE 3DMIN PROJECT

3DMIN is an interdisciplinary research project that links various disciplines to investigate conditions for the artistic success of new musical instruments. The team collaborates with musicians and performers on designs for various prototypes of new musical instruments and interfaces.
The particular focus here lies in supporting musicians in the realisation of their artistic vision of music making by including the context, i.e., designing instruments that allow the participating artists to perform their music as they envision it. More specifically, the project addresses several particular research interests, each looking at the challenges described above from a different angle.

3.1 Design Research for New Musical Instruments

The aim of this central part of the project is to iteratively turn broad concepts and ideas into a series of working design prototypes for musical instruments. Within this endeavour, the results of the other project subdivisions are of major importance in order to follow the method of experience as thinking described in Section 2.2. Each iteration includes design, prototyping and evaluation phases, which ensures that the knowledge gained is fed back into the project. The process is accompanied by a lecture series for students of the participating universities and by collaborations with artists and performers. Additionally, all documentation and design guidelines of the resulting instrument prototypes, ranging from their physical and interaction design and their mapping to their sound synthesis software, are published as open source. 3

3 http://3DMIN.org

We aim to address the following research interests.

3.1.1 Towards Choreographic Instruments

By analysing choreographic and compositional processes in contemporary dance and music, we aim at a practical approach to transdisciplinary artistic research that involves dancers, choreographers, composers and musicians. Incorporating knowledge of physical intelligence and of correlations between sound and movement, we design new musical instruments based on the assumptions that the whole body is an integral part of music making and that only the involvement of cognitive input enables virtuosity.
3.1.2 Understanding Meta-Control

We consider the modularity of NIME instruments one of their essential features, and well worth deeper exploration. We contend that in the basic instrument model (human input, gestural data, mapping, generating process), the mapping is potentially the most flexible link. Candidates for new mapping strategies include: varying the mappings between controls and sound parameters, and even changing them in performance, e.g., by gradually entangling or disentangling parameters; recording parameter state snapshots and gestural data in performance and reusing them as modifiable performance material; and creating networks of cross-influence between gestural input from multiple human players, other gestural sources, and sound generating processes, which again can be modified in performance. This can be seen as gracefully losing direct control of the processes while gaining higher-level forms of influence on their behaviour. We expect to find non-obvious but interesting mapping approaches that can be built into more traditionally controlled instruments, as well as new concepts for playing single-person instruments or multi-player instrument ensembles with influence-based approaches. First implementations of and experiments with these notions are described in the paper Influx, also submitted to ICMC 2014.

3.1.3 Ensemble environments and shared authorship in live music creation

Common human-instrument interaction models mostly originate in pre-digital physical archetypes and their restrictions. Affordable yet powerful technology has supported the emergence of new instruments with different, possibly more subtle and complex interaction models. While these models focus on sound manipulation and expressive gestural play, and therefore allow new musical genres to emerge, contemporary musicians often run into difficulties when playing in more traditional ensembles.
We contend that the possibilities gained through technology can be leveraged for these contexts as well, and that methods to combine these two modes of expressiveness can be found. Next to the meta-control approach explained above, we here investigate and design multi-performer environments that provide a way to share the tasks of music making between several users and/or (automated) agents. Such tasks include, e.g., sound modulation and the control of rhythm, melody, harmony, and spatialisation. Furthermore, we intend to explore the implications of the forms of shared authorship emerging from such practices.

3.1.4 Instruments for Moments of Solitude

Within the 3DMIN project, the main focus is on musical instruments for a variety of public performance situations. Nonetheless, musical instruments are very often played for personal enjoyment in private situations. We consider this kind of use as equally important and worthy of research. Thus, in this part we focus on instruments for solitary use, i.e., with no intention of public performance or audience engagement, but for reflection and as a catalyst for thought. The introduction of lateral thinking as a research method helps us to gain insights not only into designing instruments for actual solitary “performance” but, by means of comparison, also into the design process of instruments oriented towards public performance.

3.2 Systematisation of Electronic Instruments

This part approaches the design of new musical instruments from a musicological perspective. It therefore mainly addresses the challenges described above under (b) Musicology-informed Design. While discussing broadly accepted taxonomies and classification models [10, 5], the creation of a systematic inventory of analog electronic and digital musical instruments, considering their presentation mode, particular interaction model, technological design and conceptual context, will form the basis for both a documentation and exhibition concept and the design and development process.
Additionally, this subproject will compile a standardised approach in the form of a set of Best Practice Guidelines for the documentation, conservation and dissemination of musical instruments.

3.3 Studying Interaction Processes between Gesture and Sound

The goal of this part is to empirically research parameters in the interaction with musical instruments that are crucial both for the performer and for the experience of the audience (it mainly addresses the challenges described under (c) Researching and Integrating Embodiment). In particular, the relationship between gestures and sensory feedback from the instrument is investigated. This research will not only yield fundamental findings concerning the interplay of music production and reception, but will also provide direct recommendations for the development of new instruments in the other subprojects. We are conducting open, semi-structured qualitative interviews with developers and performers of musical instruments on the one hand and audience members on the other. By applying grounded-theory-based qualitative content analyses, we plan to identify important evaluation categories of gesture-sound mappings from both the production and the reception perspective of new musical instruments. Furthermore, we perform experimental research under controlled laboratory conditions. The first group of experiments investigates the impact of mappings on the performing artist; this parameter probably has a substantial influence on the usability of a musical instrument and the user experience it elicits. A second group of experiments addresses the effect of gesture-sound mappings on the evaluation of the performance by the audience. The hypothesis to be tested states that the movements executed during sound production substantially contribute to the experience of music.
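To make the audience hypothesis concrete, one could compare ratings of the same performance under a congruent gesture-sound mapping and under a scrambled one. The sketch below is purely illustrative (the data are invented and this is not the project's actual experimental design); it computes a paired t-statistic by hand, as one might for such within-subjects ratings.

```python
# Illustrative sketch with invented data: audience ratings of one
# performance under a congruent vs. a scrambled gesture-sound mapping,
# compared via a hand-computed paired t-statistic.

import math

def paired_t(x, y):
    """Paired t-statistic for two equal-length rating samples."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical expressiveness ratings (1-7) from ten audience members.
congruent = [6, 5, 6, 7, 5, 6, 6, 5, 7, 6]
scrambled = [4, 4, 5, 5, 3, 4, 5, 4, 5, 4]

t = paired_t(congruent, scrambled)
# a t-value above the critical value (about 2.26 for df = 9, two-sided
# alpha = .05) would support the embodiment hypothesis for these data
```

In practice one would of course use an established statistics package and a properly counterbalanced design; the sketch only shows the shape of the comparison.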
3.4 Musical Instrument and Spatial Sound: Interface, Coding and Control

Here we investigate how control over the spatialisation of sound can be made accessible to the performer (addressing the challenges summarised under (a) Embracing Technology). Methods of sound field synthesis and efficient algorithmic solutions allow new approaches to spatial sound control in real time and pave the way for the development of an interface for a comprehensive simulation environment. This enables many forms of experimentation with creative interventions in space and with interactions between room-acoustic variables and the performer's motion in space.

4. CONCLUSIONS AND FUTURE WORK

In this paper, we discussed the challenges of instrument design as they present themselves to the newly formed team of 3DMIN researchers. This is more a discussion of concepts than a report on results. Nevertheless, we believe that the challenges identified here provide a good starting point not only for the research activities within the 3DMIN project, but also for meaningfully contributing to the ongoing discourse within the NIME community, hopefully triggering lively and fruitful exchange and discussion. In the course of the project, we expect that the ideas described here will provide a good conceptual framework for creating instruments that help the participating artists realise their ideas as fully as possible. Further, we hope to integrate the insights gained into a body of meaningful design guidelines, by generalising where appropriate, and by understanding the particularity of solutions that only work well under rare circumstances. Finally, we hope that the intended dissemination of results by means of open content policies will make our instrument designs and the knowledge embodied in them widely accessible, both to the expert communities and to all other interested parties.
Acknowledgments

The realisation of the 3DMIN project, and thus this paper and its related research, was funded by the Einstein Foundation Berlin.

5. REFERENCES

[1] M. Waisvisz, “Manager or Musician? About virtuosity in live electronic music,” in Proceedings of the 2006 Conference on New Interfaces for Musical Expression, 2006, p. 415.

[2] D. Wessel, M. Wright, and J. Schott, “Intimate musical control of computers with a variety of controllers and gesture mapping metaphors,” in Proceedings of the 2002 Conference on New Interfaces for Musical Expression, 2002.

[3] T. Bovermann and D. Griffiths, “Computation as material in livecoding,” Computer Music Journal, vol. 38, no. 1, 2014.

[4] J. Rohrhuber and A. de Campo, Just In Time Programming. Cambridge, Massachusetts: MIT Press, 2008.

[5] C. Sachs, The History of Musical Instruments. Courier Dover Publications, 1965.

[6] H. H. Dräger, Prinzip einer Systematik der Musikinstrumente. Kassel & Basel: Bärenreiter, 1948.

[7] H. Davies, “Electronic instruments,” Grove Music Online, 1998.

[8] M. Bakan, W. Bryant, G. Li, D. Martinelli, and K. Vaughn, “Demystifying and classifying electronic music instruments,” Selected Reports in Ethnomusicology, vol. 8, pp. 37–63, 1990.

[9] G. Paine, “Towards a taxonomy of realtime interfaces for electronic music performance,” in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2010, pp. 436–439.

[10] E. R. Miranda and M. M. Wanderley, New Digital Musical Instruments: Control and Interaction Beyond the Keyboard. A-R Editions, Inc., 2006, vol. 21.

[11] MIMO Consortium, Revision of the Hornbostel-Sachs Classification of Musical Instruments by the MIMO Consortium, 2011.

[12] M. J. Jacob, “Experience as thinking,” in Art as a Thinking Process: Visual Forms of Knowledge Production: Abstracts, M. Ambrožič and A. Vettese, Eds. Sternberg Press, 2013, pp. 100–113, ISBN 978-1-934105-93-1.

[13] M. Leman, Embodied Music Cognition and Mediation Technology. Cambridge, Massachusetts: The MIT Press, 2008.

[14] R. I. Godøy and M. Leman, Musical Gestures: Sound, Movement, and Meaning. Routledge, 2010.

[15] M. Heidegger, Sein und Zeit. Halle a.d.S.: Niemeyer, 1927.

[16] C. Dobrian and D. Koppelman, “The ‘E’ in NIME: musical expression with new computer interfaces,” in Proceedings of the 2006 Conference on New Interfaces for Musical Expression. IRCAM—Centre Pompidou, 2006, pp. 277–282.

[17] P. Cook, “Principles for designing computer music controllers,” in Proceedings of the 2001 Conference on New Interfaces for Musical Expression. Singapore: National University of Singapore, 2001, pp. 1–4.