
P. Campos et al. (Eds.): INTERACT 2011, Part III, LNCS 6948, pp. 306–322, 2011. © IFIP International Federation for Information Processing 2011

ToCoPlay: Graphical Multi-touch Interaction for Composing and

Playing Music

Sean Lynch, Miguel A. Nacenta, and Sheelagh Carpendale

University of Calgary, Calgary, Alberta, Canada

{sglynch,miguel.nacenta,sheelagh}@ucalgary.ca

Abstract. With the advent of electronic music and computers, the human-sound interface is liberated from the specific physical constraints of traditional instruments, which means that we can design musical interfaces that provide arbitrary mappings between human actions and sound generation. This freedom has resulted in a wealth of new tools for electronic music generation that expand the limits of expression, as exemplified by projects such as Reactable and Bricktable. In this paper we present ToCoPlay, an interface that further explores the design space of collaborative, multi-touch music creation systems. ToCoPlay is unique in several respects: it allows creators to dynamically transition between the roles of composer and performer, it takes advantage of a flexible spatial mapping between a musical piece and the graphical interface elements that represent it, and it applies current and traditional interface interaction techniques for the creation of music.

Keywords: Multi-touch, collaboration, composition, music, musical instrument.

1 Introduction

Musical instruments can be considered interfaces between a musician and the acoustic space. It is, therefore, not surprising that music has been one of the main areas of application for the advances in interface science and engineering of the last few decades. The advent of new interface paradigms such as multi-touch and tangible computing has spawned a series of remarkable prototypes and commercial systems that enable new ways of creating and experiencing music, often beyond what is possible with traditional instruments. For example, new interfaces for musical expression currently allow people without music backgrounds to explore new sounds [4, 19, 21] and groups of people to perform together closely through interactive tables [14, 15, 19, 21]. Computer and interface technology has changed how music can be performed, how sound can be generated, and also how it can be composed; however, with few exceptions [12, 19, 20], composition tools do not take advantage of new interaction possibilities, and are still restricted to mouse and keyboard, sometimes with interfaces to electronic versions of traditional instruments [6, 9].


In this paper we present ToCoPlay (Touch-Compose-Play), a musical interface designed to support playing and composing music. By enabling seamless transition between performing and composing we explore facilitating musical actions that are already part of musicians’ activities, since composing music requires listening and changing what is being created, and, conversely, there are many forms of music where composition is part of the performance process (e.g., jazz). Our system is based on a horizontal multi-touch surface, and is intended for people with or without musical backgrounds. During the design of this new kind of musical interface, we made design decisions to enable control, configurability, a straightforward mapping between the visual and acoustic space, and to support a minimal set of interface elements and operations.

The structure of the paper is as follows: in the next section we provide relevant background on interfaces for musical expression. Then we discuss our goals and strategies for the design of the ToCoPlay interface, followed by its detailed description. We present four examples of ToCoPlay in use, followed by a discussion of the results of our exploration, and finish with our conclusions.

2 Related Work

There exist many applications for musical expression on digital surfaces, and these vary widely in interaction methods and purpose. Most applications support a combination of the following musical activities: playing sounds, synthesizing sounds, sequencing sounds in real-time, and composing. In the following paragraphs we provide a brief summary of prior work, and how it differs from our own.

There are a large number of previous systems that focus on creating new sounds through the modification of streams or samples. Some of these are based mainly on tangibles, such as the seminal work by Jordà et al. on the Reactable [14, 15], Audiopad [21], Pin & Play & Perform [27], and Condio [5]. These systems rely mostly on physical blocks that represent certain sound synthesizing operations (e.g., filters, tone generators, echo, samples) that are placed on the table and manipulated in real time to produce new sounds and rhythms. Other related work has the same approach, but is based on touch. For example, Akustich [1] is a direct-touch interface in which participants manipulate sound by performing multi-touch gestures on a table, where each gesture changes the sound and the visuals in a different way. In a similar way, performers using Sound Storm [23] stand in front of a vertical display and use touch to modify sound waves that come across the screen. The Synthesis and Control on Large Multi-touch Displays [7] demo by Perceptive Pixel provides a multi-touch knob and slider interface for the Synthesis Toolkit (a sound library for C++).

These systems resemble ToCoPlay in that they allow the real-time modification of sound by human performers, which makes them virtual instruments. However, they differ from ToCoPlay in several ways. First, ToCoPlay is based on standard note scales and sounds; therefore it does not focus on the generation of new sounds and rhythms. In this respect, ToCoPlay is simpler. Second, these systems are not designed to support composition: the temporal evolution of the music is difficult to reproduce because sound is mostly dependent on the state of the interface and the specific user actions.


ToCoPlay is designed to store, modify, and reproduce created sequences, which supports performance, composition, and all combinations of the two.

A second group of related systems do not support sound synthesis; instead, they focus on supporting music performance that resembles how we play traditional instruments. For example, SurfaceMusic [8] enables playing three instruments on an interactive tabletop (a drum, a string instrument, and a wind instrument) by using a specific multi-touch gesture on each (tapping, strumming, and air-pushing). The Roots [12] application generates ambient sounds when a user touches a particular region on the multi-touch horizontal table (the Bricktable [11]) and produces visually pleasing, spiraling trees. The Jam-o-drum [4] is a series of external controllers that lie on a horizontal table display, where multiple people can play sounds by tapping their own pad; the Jam-o-drum is primarily for playing games. These systems are closer to ToCoPlay in the simplicity of the basic sound elements that are combined, but they do not generally support composition any more than traditional instruments.

A third group of interfaces enable composition. Traditional applications such as Garage Band [9] and Cubase [6] are single user, rely on traditional notation, and are meant to be used from a desktop PC. The MusicTable [24] allows people to place cards that represent pitch and instrument on a table. The cards are detected by the system and played in sequence from left to right. BlockJam [19] lets people combine attachable active physical blocks that have a button to generate a sequence of sounds. The rules of how the sound is produced depend on the type of block and its current state. The resulting sound cycles through the blocks and visual feedback is provided through a monitor. InstantCity [13] is an art music automaton that plays different ambient sounds depending on how blocks are distributed on the table. Xenakis [2] is a non-deterministic system that uses tangible blocks to generate music according to probability distributions based on the distance between blocks and their similarity. In Noteput [20] participants place musical note-shaped physical blocks onto a digital horizontal surface which are played left to right according to their position, creating a physical staff. Spaces [12] is an application for composing ambient sounds on a multi-touch table, where participants touch a region of a table to make that region a warmer or cooler colour, with matching sound. Stereotronic Multi-Synth Orchestra [25] is a Microsoft Surface [17] application in which participants place notes into pre-defined concentric rings, and notes are triggered when the rotating element of each concentric ring goes over that particular note.

These systems resemble ToCoPlay in their ability to easily set up temporal sequences of sounds, but their design does not facilitate the performative component. ToCoPlay, similarly to all of these systems, allows setting up music and then triggering its performance without much human intervention; however, ToCoPlay focuses on supporting seamless transition between configuring the music, changing it, and performing or improvising with the available components.

In summary, ToCoPlay situates itself as being both a composition tool and an instrument, but not a sound synthesizer. ToCoPlay's goal is different from the goals of systems mentioned above, since we set out to achieve seamless transitions between the playing and composing activities. It also has several features that distinguish it from previous systems, most notably a flexible and precise spatial mapping (contrary to the mostly linear ones described above) and its configurability, which allows configurations that adapt to the person and the situation.


3 Design Goal and Strategies

As illustrated by the related work section, there is considerable activity in the development of software with musical capabilities. However, most of these efforts focus on strengthening either the performance potential or the support for musical composition. Our goal is to, as part of a musical software interface, investigate the integration of these activities: composition and performance. Thus, we intend both to keep interactions as simple as possible and to explore ways of using simple interactions to create powerful methods for introducing complexity. In ToCoPlay, we offer an interface that can be played as an instrument, but that also invites composition. To achieve these goals we apply four main design strategies: enabling control, enabling configurability, creating a straightforward visual-acoustic mapping, and providing a minimal set of interface objects and operations.

3.1 Enabling Control

For ToCoPlay we opted for a discrete paradigm, which is more closely related to how a piano is played (by pressing keys) than to the continuous model used by most modern synthesizers, which are commonly controlled by turning knobs and adjusting continuous parameters. In ToCoPlay individual sounds are made by pressing keys. Complex sounds are developed through grouping, sequencing, and nesting. Grouping, sequencing and nesting are all interactively created and visually explicit.

3.2 Enabling Configurability

The physical characteristics of traditional physical instruments, such as the arrangement and shape of keys, have evolved into their current forms through many small mutations that often took centuries. Current electronic interfaces have not yet enjoyed the popularity or the time to evolve in this way. For ToCoPlay we introduce a model in which the location of keys and their grouping is fully configurable in real time. It is reconfigurable during the design of the sounds or melodies and it is reconfigurable during play. These freedoms can enhance the creation of a personalized interface that fits one’s own hands and, thus, is easier to play. Through reconfigurability it can also be adapted for different people and compositions; for example, one can group notes together so that they can be played without moving one’s hand (e.g., by placing them directly under the fingers when they are in a natural posture), or one can move a certain set of notes closer when they are to be used.

3.3 Visual-Acoustic Mapping

For most people, composing music requires at least two basic elements: a way to try out the sounds, melodies, and harmonies as they are being created, and a way to record how these are produced (including their temporal relationships), for later performance. Many of the current music generation systems that we describe in Section 2 are inadequate for composition, not only because they lack features to record how interaction must take place to reproduce a certain sound, but also because the continuous nature of the interaction makes it difficult to create a notation that accurately and effectively relates the almost infinite variety of gestures that can take place in a multi-parametric continuous space to the way they sound.



Therefore we designed a visual-acoustic mapping that provides a straightforward relationship between what is displayed on screen and what is going to be reproduced, so that actions on the interface can be interpreted as changes in the music. Note that this is not a visualization of the sound but a spatialisation of the relationships between sounds. One such possible mapping is traditional music notation; however, this notation did not evolve to facilitate real-time interaction, is not generally accessible to novices and, most importantly, it imposes a rigid spatial mapping of time (along the staff, moving horizontally through time and then down through the scale) that directly contradicts our configurability goal. In other words, using a staff metaphor for our interactive system would not allow us to make an interface that is easily playable; it would encumber collaboration (e.g., due to its implicit orientation), and would not allow us to group elements according to other groupings that are not strictly temporal. Most of these arguments apply as well to other existing sequencing and composing applications (e.g., [2]). Again, we keep this spatial/visual relationship simple. Sequencing is specified through explicit directed links, where the spatial distance is mapped to temporal spacing.

3.4 Minimal Objects and Operations

Making an interface accessible to novices requires simplicity in the number and complexity of object types in the interface and their operations. We set out to create an interface that was based on a minimalistic approach. Unlike many existing musical interfaces that have large lists of objects, filters, types of connections and parameters, our interface needs only a few objects and a few rules about the ways the objects can relate to each other.

It might seem counterintuitive to restrict the design of the system to a small set of primitives in order to enhance expressivity and encourage playfulness. However, there is a wealth of experience in other fields that shows that minimalistic systems can lead to complex and expressive results; for example, a von Neumann machine, the Game of Life [10], swarm intelligence [3], or the game of Go.

4 The ToCoPlay Interface

The interface of ToCoPlay is composed of five atomic elements: keys, key fountains, containers, links, and dummy keys. These elements can be combined to enable the properties outlined by the design section above. The following subsections provide a detailed description of each of the atomic elements, their interactions, and how their design, interactions and combinations support our design objectives.

4.1 Keys

The base atomic element in ToCoPlay is the key, a group of which are shown in Figure 1. A key is a circular interface element that plays a specific note of a specific instrument when touched. The note assigned to a key is one of the twelve pitches of the Chromatic Scale, which are represented visually by a change in value of the key's color; that is, lower notes are represented by darker colors than higher notes. The hue of the key indicates the instrument (or timbre) of the note. At this moment, two synthetic instruments, a sine and a square wave generator, are represented by pink and yellow key hues respectively.

Fig. 1. Keys in ToCoPlay, representing the C Chromatic Scale

Keys are circular to facilitate being easily touched while having a small footprint, which allows many keys to be visible simultaneously on the interface. Note that, unlike with most traditional interfaces, ToCoPlay can have many keys for the same note which, as we will see, supports the generative nature of the system. The one-to-one mapping between the colour of the keys and the sound that they emit is designed to enable a direct relationship between the visual and acoustic spaces.

Tapping a key instantly plays the corresponding note and makes it transparent, establishing a clear connection between the visual and the acoustic feedback of the system. A key can be moved and repositioned anywhere across the interface. Repositioning keys allows the musician to arrange them in ways that support playing the interface as a customizable instrument; for example, the keys necessary to play different leit-motifs of a piece can be placed so that they correspond to the tips of the fingers when the hand is in different areas of the table. The mobility of keys on the interface also enables clustering of notes in different visual layouts which, as described under 'links', further supports the compositional aspects of the interface. Keys can also be flicked around in the interface to quickly move them away from an area of interest, or to remove them from the interface if flicked out of the visible area of the interactive table.
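To make this mapping concrete, the sketch below shows one plausible way to derive a key's colour from its pitch and timbre as described above. It is an illustration only: the hue values for the pink (sine) and yellow (square) keys, the saturation, and the brightness range are assumptions, not values reported in the paper.

```csharp
// Illustrative sketch: hue encodes the instrument, value (brightness) encodes
// the pitch (darker = lower). Hue, saturation, and brightness range are assumed.
using System;
using System.Windows.Media;   // WPF colours, as in the paper's implementation

enum Timbre { Sine, Square }

static class KeyColour
{
    public static Color ForKey(int pitchIndex /* 0 = C ... 11 = B */, Timbre timbre)
    {
        double hue = timbre == Timbre.Sine ? 330.0 : 55.0;   // pink vs. yellow (assumed)
        double saturation = 0.6;                              // assumed constant
        double value = 0.35 + 0.6 * (pitchIndex / 11.0);      // lower notes are darker
        return FromHsv(hue, saturation, value);
    }

    static Color FromHsv(double h, double s, double v)
    {
        double c = v * s, x = c * (1 - Math.Abs(h / 60.0 % 2 - 1)), m = v - c;
        (double r, double g, double b) = (int)(h / 60) switch
        {
            0 => (c, x, 0.0), 1 => (x, c, 0.0), 2 => (0.0, c, x),
            3 => (0.0, x, c), 4 => (x, 0.0, c), _ => (c, 0.0, x)
        };
        return Color.FromRgb((byte)((r + m) * 255), (byte)((g + m) * 255), (byte)((b + m) * 255));
    }
}
```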

4.2 Key Fountains

New keys are created by dragging them out of key fountains. Key fountains are represented by a large circle surrounded by the smaller, circular keys (see Figure 3). The keys are arranged around the circle clockwise from C to B in half-tone intervals (12 notes). When a key is removed from a fountain, another copy appears in its place, which enables the creation of an unlimited number of keys of each type, as shown in Figure 2.

Fig. 2. A key is removed from a key fountain, and is replaced by another key


The interface contains a key fountain for each instrument (timbre). Although fountains initially appear at the center of the application, they can be moved around by dragging them from the central circle to make space for other kinds of interaction. It is also possible to rotate and scale fountains through the standard two-point RST gesture [18] to accommodate the needs of people located in different areas of the table. Consistent with the key interface elements, a fountain can also be flicked away out of the screen, which clears it from memory. Key fountains can be duplicated by double tapping on their central circle, again to accommodate the needs of multiple people around the table.

Fig. 3. The two different key fountains, each representing a different timbre. Each key fountain consists of a large circle surrounded by keys, representing the notes in the chromatic scale, from C (darkest) to B (lightest).
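A minimal layout sketch of this circular arrangement follows. The radius and the starting angle (C at the top) are assumptions for illustration; the paper only states that the keys run clockwise from C to B in half-tone intervals.

```csharp
// Sketch: place the twelve chromatic keys clockwise around a fountain's centre,
// starting with C at the top (starting angle and radius are assumed).
using System;
using System.Windows;   // Point

static class FountainLayout
{
    public static Point[] KeyCentres(Point fountainCentre, double radius)
    {
        var centres = new Point[12];
        for (int pitch = 0; pitch < 12; pitch++)              // 0 = C ... 11 = B
        {
            // In screen coordinates (y grows downward) an increasing angle
            // moves clockwise; -pi/2 puts the first key (C) at the top.
            double angle = -Math.PI / 2 + pitch * (2 * Math.PI / 12);
            centres[pitch] = new Point(
                fountainCentre.X + radius * Math.Cos(angle),
                fountainCentre.Y + radius * Math.Sin(angle));
        }
        return centres;
    }
}
```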

4.3 Containers

Containers, which are used to hold elements, are regions defined by free-form touch-traces on the table (see Figure 4). Containers are created by starting a touch-trace on an unoccupied place on the table and closing off the trace. A line indicates the outline of a container while it is being drawn, allowing for the precise definition of regions. When the finger is released and the container is completed, it appears as a translucent blue shape, and it automatically contains the objects that the trace enclosed.

Fig. 4. To trace a container, start in an unoccupied space and create a closed form
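The paper does not specify how enclosure is determined once the trace closes; one standard option, sketched below, is an even-odd (ray-crossing) point-in-polygon test over the sampled trace points. An object's centre point could then be tested against the finished trace to decide whether the new container adopts it. This is an assumed implementation detail, not the authors' code.

```csharp
// Sketch: classic even-odd point-in-polygon test over the sampled trace points.
using System.Collections.Generic;
using System.Windows;   // Point

static class Containment
{
    public static bool Encloses(IList<Point> trace, Point p)
    {
        bool inside = false;
        for (int i = 0, j = trace.Count - 1; i < trace.Count; j = i++)
        {
            // Toggle 'inside' each time a horizontal ray from p crosses an edge.
            bool crosses = (trace[i].Y > p.Y) != (trace[j].Y > p.Y) &&
                           p.X < (trace[j].X - trace[i].X) * (p.Y - trace[i].Y) /
                                 (trace[j].Y - trace[i].Y) + trace[i].X;
            if (crosses) inside = !inside;
        }
        return inside;
    }
}
```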

Containers can group heterogeneous collections of any interface objects except key fountains (see Figure 5). Containers can also contain other containers, which supports the generative nature of composition by allowing nesting. They enable individuals to work with their own pieces of a composition, or of an instrument, and then to combine them at a later point in time.

Similarly to key fountains, containers can be moved, scaled or deleted (i.e., they can be dragged, flicked or manipulated with two-finger RST). Objects such as keys and other containers that the container currently holds stay with the container as the container is moved or rotated. This allows musicians to manage the screen area efficiently by manipulating meaningful groupings of objects and supports socially-based use of the space [22]. An object (e.g., a key) can be added to a container by moving it from outside the container to inside the container, or by moving the container so that it fully contains the object.

Fig. 5. A container with keys placed inside

Containers also have two buttons. The container and all of the elements and structure it contains can be copied or duplicated by tapping on its green button. The copy of the container appears at a small offset from the original (see Figure 6). The copy also oscillates for a brief period of time to highlight that it is a new object. Next to the copy button, there is a red button that is used to control the lock state of the container. A container can be in one of two states: locked or unlocked. An unlocked container can be moved, resized, or rotated and also allows its contents to be moved or removed. When locked, the container prevents accidental movement and transformation of its contents and itself, keeping all elements in place. When unlocked, containers and their contents can be manipulated to find different internal and external configurations. If a container is unlocked, an element can be moved out of a container; once it is fully outside of a container, and is released, it no longer belongs to that parent container. The opacity of the red lock button reflects the current state of a given container: opaque red means locked, translucent means unlocked. Containers can be deleted in one of two ways: either by throwing them off screen like keys or key fountains, or by a gesture that crosses the container boundary twice.

Fig. 6. Pressing the container's green button creates a duplicate at a slight offset


Together, duplication and repositioning provide the freedom to design personalized performance arrangements. Container and key groupings can become ‘instruments’ that can be shaped according to the specific anatomical or musical needs of the performer (e.g., according to the hand shape) and placed by the musician in different areas as required by the performance (e.g., storing instruments in distant regions when not required and bringing them close when being used). Simultaneously, the grouping of keys supported by containers, combined with nesting, enables the natural organization of musical compositions in different hierarchies such as themes, choruses, bridges, sections, etc. Container manipulation also supports reducing and organizing containers in space according to their current relevance (shown in Figure 7).

Fig. 7. A container is rotated, scaled and translated to fit inside of another container

4.4 Links

The elements described thus far support playing notes, grouping keys so that they form ‘instruments’ of arbitrary shapes, sizes, and orientations, and distributing these groupings in meaningful layouts over the table. However, the nature of the interaction with these elements does not enable composition or playback, since each note still has to be directly played by the musician. In order to support composition it is necessary to provide a mechanism that allows sequencing of notes at different times and intervals. We provide this mechanism through our link interface element.

Links are one-directional connections from one key to another that denote a sequence in how the notes of the corresponding keys are played (see Figure 8). Links are created by tracing a line that connects two keys when the original key is inside a locked container. Provided that the original key is within a locked container, any two keys can be connected, including keys belonging to different containers, keys that do not belong to any container or even keys that belong to contained sub-containers. Links can be deleted by tracing a line that crosses the link's representation within a locked container or in blank space.


Fig. 8. A key, inside a locked container, is pressed and then connected to another key, forming a link

If a key is tapped and it has an outgoing link to another key, the other key will automatically be played when the original key has finished playing its note. If subsequent keys have links, the subsequent notes will be played and so on, until the chain ends, thus providing a way to sequence music events. The direction of a link is indicated by an arrow at the end of the link, indicating which key is the initial key and which is the subsequent key. The length of the link determines the timing between the tapping or signal reception of the initial key and the start of the note of the subsequent key. The initial key's duration depends also on the duration of the following link, since its note will play until the next note in this particular chain starts to play. Timings created with links are not fully continuous and instead snap to multiples of 125ms.

A key can have any number of incoming or outgoing links; every time that a link sends a signal to a key, its entire set of outgoing links will fire signals to the corresponding keys. This branching enables creating chords and counterpoint melodies – common elements of music in most cultures – as well as cyclic patterns and recursive music, as we will show in the examples of Section 6.

4.5 Dummy Keys

Dummy keys are special keys that do not play any sound by themselves, but serve as proxies to activate other keys. A dummy key's incoming link length is ignored. By placing a dummy key that connects distant keys A and B (in that sequence) close to B, the note of key A will be played for a time proportional to the distance between the dummy note and key B instead of by the distance between keys A and B (see Figure 9). This allows for flexibility in the connection and placing of groups of keys that need to be played close in time but need to remain in non-adjacent locations. Similarly, a combination of dummy keys can be used to create instruments with keys located in anatomically convenient locations, as we will show in Section 6.1. Dummy keys can be created by dragging them from the center of a key fountain, and their inner colour is grey.

Fig. 9. A dummy key connects two keys at a large distance. As a result, the key with the outgoing link will have a shorter distance to the outgoing link of the dummy key.
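The sketch below restates the sequencing model of Sections 4.4 and 4.5 in code. It is an illustration of the described behaviour rather than the authors' implementation: the pixel-to-millisecond scale factor is an assumption, while the 125 ms quantization, the fan-out over all outgoing links, and the zero-delay treatment of links into dummy keys follow the text above.

```csharp
// Illustrative sequencing sketch (assumed details, not the authors' code).
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class Key
{
    public bool IsDummy;
    public List<(Key target, double lengthPx)> OutgoingLinks = new();
    public Action PlayNote = () => { };   // e.g., trigger a preloaded wav sample

    const double MsPerPixel = 2.0;        // assumed mapping from link length to time
    const double TickMs = 125.0;          // timings snap to multiples of 125 ms

    public async Task Fire()
    {
        if (!IsDummy) PlayNote();         // dummy keys are silent proxies
        var fanOut = new List<Task>();
        foreach (var (target, lengthPx) in OutgoingLinks)
        {
            // Links into dummy keys ignore their length (no delay).
            double delay = target.IsDummy
                ? 0
                : Math.Round(lengthPx * MsPerPixel / TickMs) * TickMs;
            fanOut.Add(FireAfter(target, delay));
        }
        await Task.WhenAll(fanOut);       // branching: every outgoing link fires
    }

    static async Task FireAfter(Key target, double delayMs)
    {
        if (delayMs > 0) await Task.Delay(TimeSpan.FromMilliseconds(delayMs));
        await target.Fire();              // cyclic link structures keep looping until cut
    }
}
```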

5 Implementation

We implemented ToCoPlay to work on the Microsoft Surface™, and used Microsoft's WPF™ libraries for Surface development. As a result, every visual object in ToCoPlay receives events when it is touched. For sound, we used Microsoft's MediaPlayer framework, and triggered wav samples. The wav samples were generated to have two distinct timbres: that of a chromatic scale with sinusoidal tones, and of a chromatic scale with square tones. We used a simple tone generator application to create these, and loaded multiple instances of each file so as to have the ability to play a sample more than once at any point in time. For the visual elements of ToCoPlay, prototypes were built using Microsoft Expression Blend™, and the final versions were built using a combination of Microsoft's C# and XAML.
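A sketch of this sample-pooling idea is shown below. The pool size, the round-robin voice selection, and the file name in the usage comment are assumptions; the paper only states that multiple instances of each wav file were loaded so that a sample could play more than once at any point in time.

```csharp
// Assumed sketch: several preloaded MediaPlayer voices per wav file, used
// round-robin, so the same note can sound again before the previous one ends.
using System;
using System.Windows.Media;   // MediaPlayer (WPF)

class SamplePool
{
    readonly MediaPlayer[] players;
    int next;

    public SamplePool(Uri wavFile, int voices = 4)
    {
        players = new MediaPlayer[voices];
        for (int i = 0; i < voices; i++)
        {
            players[i] = new MediaPlayer();
            players[i].Open(wavFile);      // preload so triggering has low latency
        }
    }

    public void Trigger()
    {
        var p = players[next];
        next = (next + 1) % players.Length;
        p.Position = TimeSpan.Zero;        // rewind this voice and restart it
        p.Play();
    }
}

// Usage (hypothetical file name):
// var c4Sine = new SamplePool(new Uri("samples/sine_C4.wav", UriKind.Relative));
// c4Sine.Trigger();
```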

6 Use Scenarios

6.1 Chord and Scale Instrument

This use case scenario illustrates the configurability of the system through the use of shaped containers, key layouts, and dummy keys. It corresponds to Example 1 in the video figure. The video shows how to build two instruments. The first one is a simple instrument with the five keys of a blues pentatonic scale, which are arranged in space to correspond to the tips of the fingers of the performer. Keys are made larger to make them easier to play, and adjusted for angle of comfort for the performer.

The second is an ad-hoc chord instrument that can play three basic chords of a simplified blues progression (C7, F7, and G7). Each chord is formed by four keys, which are activated by a dummy key simultaneously. Each of the chord's dummy keys is activated (with no delay) by a dummy key located within its respective container, to facilitate playing with the left hand (see Figure 10 and video figure). The performer shown in the video (one of the authors) does not have any piano experience.

Fig. 10. A chord instrument (left) and a scale instrument (right)


6.2 Harmonious Loops

This scenario illustrates how simple musical loops can produce a more complex musical texture through straightforward copying of a container, and the real-time rearrangement of the keys. In this example we start from a single loop, then copy it, modify the copy's loop timing slightly, and play both loops simultaneously. Because both loops are similar but not identical in length, the harmonies being played change continuously, in a sequence that repeats itself on a period much longer than the duration of either loop, creating a complex musical texture out of two simple elements.

Fig. 11. A) A simple loop; B) A slightly modified copy of the loop in A); A) and B) played together will form a complex musical texture; C) Both loops can be modified in real time to further evolve the texture created
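As a back-of-the-envelope illustration (our framing, not a claim from the paper): because link timings snap to 125 ms ticks, two loops that start together realign only after the least common multiple of their lengths in ticks, which is why the combined texture repeats on a period much longer than either loop. The durations used below are made up.

```csharp
// Worked example with assumed loop lengths of 16 and 17 ticks (2.000 s and 2.125 s).
using System;

static class LoopPhase
{
    static long Gcd(long a, long b) => b == 0 ? a : Gcd(b, a % b);

    public static void Main()
    {
        const double tickMs = 125;
        long loopA = 16;                                   // original loop
        long loopB = 17;                                   // slightly stretched copy
        long period = loopA / Gcd(loopA, loopB) * loopB;   // least common multiple
        Console.WriteLine($"Texture repeats every {period} ticks = {period * tickMs / 1000.0} s");
        // -> 272 ticks = 34 s, far longer than either 2-second loop on its own.
    }
}
```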

6.3 Canon

In music, a Canon is a musical form that is based on the repetition of the same melody by several instruments or voices, starting at different points in time. Because each instrument or singer has a delay with respect to the others, different notes sound simultaneously, resulting in chords. This example is a more sophisticated evolution of the harmonic loops, with a more sophisticated output. The triangular structure on top (see Figure 12) triggers the start of a full cycle of the melody (bottom), at times that are delayed by exactly one third of the length of the full melody. The cycle ends when one of the links of the triangular loop is cut (see video figure, Example 3).

Fig. 12. A Canon composition


6.4 Making Music with Shapes

This scenario takes advantage of the flexibility of our visual-acoustic mapping and the configurability of the interface to show how visually creative configurations can be created. In this case, two participants are collaborating in creating music to illustrate Antoine de Saint-Exupéry's famous story from the Little Prince. Figure 13 and the video figure show how two participants can create shapes, modify them, populate them with music, and then combine them to generate a small sound composition. This example illustrates how ToCoPlay can also be playful, and suitable not only for serious composition, but also for musical games. The creation of shapes to embed musical themes might also prove useful to improve recall and identification of musical sections when the piece is complex, even if not used with visual-aesthetic goals.

Fig. 13. Two shapes are created with connected links inside: an elephant inside a snake; or is it a hat?

7 Discussion

7.1 Continuous versus Discrete Control

Designing music interfaces based on multi-touch interactive surfaces provides strong advantages; for example, a horizontal surface can enable collaboration, the input and output space coincide and are fully configurable, and a virtually infinite number of parameters and levels can be controlled through touch manipulations, traces and gestures. These manipulations are often of a continuous nature; for example, in the Reactable the rotation of an element is often mapped to a continuous parameter such as pitch or volume. Our system intentionally diverts from this model and instead offers a small discretized set of alternatives: a key can only play one of twelve notes, and time intervals are multiples of 125ms.

We fully recognize the importance of continuous control, especially for the manipulation of a complex, continuous control space such as sound synthesis. However, our design is more discrete because our goals are different from those of synthesizer and computer instruments.


We prefer instead to make the results of actions easier to control and predict, and therefore easier to learn for people without a background in music. In exchange for this simplification of interaction, we are reducing the number of parameters that can be manipulated through our system; for example, it is not possible yet to produce vibrato or tremolo (small variations in pitch or volume) with ToCoPlay.

7.2 Homomorphic Relation between Space and Sound

Many of the existing synthesizers and computer-based instruments described in the related work, as well as most VJing interfaces (e.g., [26]), provide visuals that depend mostly on the sound that is generated at that moment (or shortly before) and/or the input being provided. Since our main goal is to support seamless transitions between performing and composing, we provide a tool that supports the broader view of the piece required by composers. We achieve this through our straightforward mapping that relates the visuals of the interface (the position, colour and lengths of the links) to sound. In particular, the mapping between the length of links and the duration and sequence of notes was of great help and impact when creating the examples described above, and it was often understood in demonstrations before it was explained. This relationship between the visual and the sound spaces is also useful in enabling shifts from composition to performance, since looking at the interface could allow performers to intervene and modify a composed element in real time.

Although it would have been possible to create one-to-one relationships between the spatial and the acoustic spaces, we preferred to allow some freedom. Unlike traditional musical notations, which are generally isomorphic, our spatialisation of time allows several patterns to generate the same sequence of notes; for example, by changing the angle between keys without changing their distance, or by using dummy keys. This leaves room for layouts that do not exclusively follow musical requirements, but also accommodate interface considerations (e.g., collaboration, shape of the hand as in Examples 1 and 4), and aesthetic or storytelling possibilities, as shown by Example 4. We believe the visual aesthetic elements can be complementary and important for performing and composing, especially for people without musical background and for children; in fact, notations that cross media boundaries in art are not uncommon (e.g., Apollinaire's concrete poetry).

7.3 Few Building Blocks Can Go a Long Way

Our current version of ToCoPlay is certainly limited in several aspects, and does not provide many of the expressive channels that other music applications offer. Nevertheless, we found our decision to keep the interface to a few elements and rules useful and rewarding. We believe that the small set of elements will make the interface easier to learn, and we showed through our examples how initially simple combinations of a few elements can result in fairly complex sound textures (e.g., Example 2).

This kind of approach is particularly suited to a multi-touch interface, and generally difficult to apply to tangible interfaces such as the Reactable [15] and Audiopad [21], which would require many physical elements and can be quite expensive.


Multi-touch systems have the advantage that we are free to replicate elements until the screen is full. However, touch detection and touch feedback do not achieve the same tactile and haptic quality as tangible interfaces.

7.4 Limitations

ToCoPlay is an exploration of musical interfaces and, as such, has helped us uncover new challenges and presented new difficulties. Probably most obvious is the need to improve the quality and quantity of samples for instruments to improve the quality of the sound. The sinusoidal- and square-wave generators used as initial instruments were adequate for our initial explorations, but we are aware that new, richer sounds can substantially help increase the musical value of the interface. Adding an external sound generation engine (possibly on a different machine) can help improve the quality of sound and, at the same time, solve some of the problems derived from dealing with an interface that has many elements and momentarily freezes for some operations such as large container creation.

Similarly, we have contemplated adding new controllable parameters to increase the expressivity of the system. For example, changing the volume (dynamics) of individual notes or containers can be valuable for composing. We also do not currently provide a method of creating pauses and delays in a composition. Future changes in this direction must, however, be weighed against the increase of interface complexity that contradicts our strategy of maintaining a simple set of basic elements and interactions.

Some aspects of our visual-acoustic mappings also leave room for improvement. For example, we have noticed that the mapping from the HSV colour space of keys to their note's pitch is difficult to read and compare, especially if notes are distant from each other. We are already considering other kinds of mappings based on symbols and non-linear mappings of color to pitch, and mappings that would facilitate the inclusion of more than one octave.

Another issue with our spatial mapping is the readability of complex scores of music. We are interested in finding out whether spatial relationships could become problematic (e.g., with a lot of containers embedded within one another) and how the interface can be modified for large-scale compositions.

Finally, our current evaluation of the system has been constrained to just a few people from our department. Although reactions have been positive, we intend to validate our design by placing it into the real world and gathering more impressions to produce a new, better version.

8 Conclusion

This paper presents ToCoPlay, a multi-touch tabletop musical interface that allows multiple people to create music. The main goal of ToCoPlay is to provide an interface that enables the performance of music while also encouraging composition. We applied four main strategies to design a system that allows seamless transitions between the performance and composition activities of participants: enabling control, enabling configurability, creating a simple visual-acoustic mapping, and relying on a minimal set of interface elements and operations.


We also provide a set of four examples that demonstrate different aspects of the operation of the interface and its expressive capabilities. Finally, we discuss the implications of our design decisions and how they make ToCoPlay unique.

Acknowledgements. We would like to thank Uta Hinrichs for her valuable assistance creating the first version of the video. We are also grateful for comments and discussion from Lawrence Fyfe and many other members of the Interactions Lab and the Innovis group. This work was supported in part by Canada's Natural Sciences and Engineering Research Council (NSERC), the Canada Foundation for Innovation (CFI), Alberta's Informatics Circle of Research Excellence (iCORE) and by SMART Technologies Inc.

References

1. Akustich, http://modin.yuri.at/tangibles/data/akustisch.mp4 (last accessed January 2011)

2. Bischof, M., Conradi, B., Lachenmaier, P., Linde, K., Meier, M., Pötzl, André, E.: Xenakis: combining tangible interaction with probability-based musical composition. In: Proceedings of TEI 2008, pp. 121–124. ACM, New York (2008)

3. Bonabeau, E., Dorigo, M., Theraulaz, G.: Swarm Intelligence: From Natural to Artificial Systems. Oxford University Press, Oxford (1999)

4. Blaine, T., Perkis, T.: The Jam-O-Drum interactive music system: a study in interaction design. In: Proceedings of DIS 2000, pp. 165–173. ACM, New York (2000)

5. Condio, http://media.aau.dk/~gtal05/condioProject.html (last accessed January 2011)

6. Cubase, http://www.steinberg.net/en/products/cubase/cubase6_start.html (last accessed January 2011)

7. Davidson, P.L., Han, J.L.: Synthesis and control on large scale multi-touch sensing displays. In: Proceedings of NIME 2006, IRCAM, pp. 216–219 (2006)

8. Fyfe, L., Lynch, S., Hull, C., Carpendale, S.: SurfaceMusic: Mapping Virtual Touch-based Instruments to Physical Models. In: Proceedings NIME 2010, pp. 360–363 (2010)

9. Garage Band, http://www.apple.com/ilife/garageband/ (last accessed January 2011)

10. Gardner, M.: On cellular automata, self-replication, the Garden of Eden and the game life. Scientific American 224(4), 112–117 (1971)

11. Hochenbaum, J., Vallis, O.: Bricktable: A Musical Tangible Multi-Touch Interface. In: Proceedings of Berlin Open Conference 2009, Berlin, Germany (2009)

12. Hochenbaum, J., Vallis, O., Diakopolous, D., Murphy, J., Kapur, A.: Designing Expressive Musical Interfaces for Tabletop Surfaces. In: Proceedings of NIME 2010, pp. 315–318 (2010)

13. Hauert, S., Reichmuth, D.: Instant City, http://www.instantcity.ch (last accessed January 2011)

14. Jordà, S.: Sonigraphical instruments: from FMOL to the reacTable. In: Proceedings of NIME 2003, pp. 70–76 (2003)

15. Jordà, S., Geiger, G., Alonso, M., Kaltenbrunner, M.: The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces. In: Proceedings of TEI 2007, pp. 139–146. ACM, New York (2007)

16. Max/MSP, http://cycling74.com/ (last accessed January 2011)


17. Microsoft Corporation. Microsoft Surface, http://www.microsoft.com/surface/ (last accessed January 2011)

18. Krueger, M.W., Gionfriddo, T., Hinrichsen, K.: VIDEOPLACE - An Artificial Reality. In: Proceedings of CHI 1985, pp. 35–40. ACM, New York (1985)

19. Newton-Dunn, H., Nakano, H., Gibson, J.: Block jam: a tangible interface for interactive music. In: Proceedings of NIME 2003, pp. 170–177 (2003)

20. Noteput, http://www.jonasheuer.de/index.php/noteput/ (last accessed January 2009)

21. Patten, J., Recht, B., Ishii, H.: Audiopad: a tag-based interface for musical performance. In: Proceedings of NIME 2002, pp. 1–6 (2002)

22. Scott, S.D., Carpendale, M.S., Inkpen, K.M.: Territoriality in collaborative tabletop workspaces. In: Proceedings of CSCW 2004, pp. 294–303. ACM, New York (2004)

23. Sound Storm, http://subcycle.org/ (last accessed January 2011)

24. Stavness, I., Gluck, J., Vilhan, L., Fels, S.: The MUSICtable: a map-based ubiquitous system for social interaction with a digital music collection. In: Kishino, F., Kitamura, Y., Kato, H., Nagata, N. (eds.) ICEC 2005. LNCS, vol. 3711, pp. 291–302. Springer, Heidelberg (2005)

25. Stereotronic Multi-synth Orchestra, http://www.fashionbuddha.com/ (last accessed January 2011)

26. Taylor, S., Izadi, S., Kirk, D., Harper, R., Garcia-Mendoza, A.: Turning the tables: an interactive surface for vjing. In: Proceedings CHI 2009, pp. 1251–1254. ACM, New York (2009)

27. Villar, N., Lindsay, A.T., Gellersen, H.: Pin & Play & Perform: a rearrangeable interface for musical composition and performance. In: Proceedings of NIME 2005, pp. 188–191 (2005)

