
How to Write Music For The Airband

LANGDON C CRAWFORD

Submitted in partial fulfillment of the requirements for the Master of Music in Music Technology in the Department of Music and Performing Arts Professions in The Steinhardt School

New York University

Advisors: Kenneth J. Peacock, Robert J. Rowe

December 21, 2006


Table of Contents:

0.0 Cover
0.1 Table of Contents
0.2 Abstract
1.0 Motivation
2.0 History & Context
3.0 The Airguitar: Hardware Development
4.0 Airguitar Software
4.1 Control Maps
4.2 Signal Generators
4.3 Signal Processors
5.0 Instruments
5.1 Rhythm Guitar
5.2 Lead
5.3 Techno
5.4 Shakey-Shakey
5.5 Rhythm Sampler
5.6 Tape Jockey
6.0 Deliverables
7.0 Future Work
8.0 Conclusion
9.0 Reference Materials


Abstract

This paper describes the process by which a contemporary composer could write music

for the above-named electronic music performance ensemble, which employs some novel

interfaces for musical expression. To understand how to write music for this ensemble,

one should have an idea of how this ensemble came to be. I shall briefly discuss my

motivations for creating such novel interfaces and forming an ensemble, after which I

will go into the technical details of the fast prototyping process for the hardware

component (the air-guitars), and the current state of the ongoing development of the

software component of the instruments used in the ensemble. I will then describe the

current uses of these instruments in musical performance applications and how one may

write music to play on these instruments.

With a technical understanding of these instruments a composer may be able to

write whatever she pleases for the ensemble. But even without a deep technical

understanding of the tools, a composer could still deliver materials to the ensemble so

that pieces may be easily performed with minimal rehearsal. Examples of such

deliverables and all source code shall be included in the appendices of this paper for ease

of replication and modification by interested composers.


1.0 Motivation

My motivations for this project are completely selfish. I like electronic music. I like it a

lot. I have worked with Electronic Music Foundation for a number of years now and

continue to enjoy working with and around the electronic music world. There is one

major problem I have with electronic music: Tape music concerts.

Some of my favorite pieces of music in the electronic music canon are tape music. The

early work of Stockhausen, “Williams Mix” and other tape pieces by Cage, are on my list

of great pieces of tape music. But I would walk out of a concert if these pieces were

played back via loud speakers. Defense of this opinion is outside the scope of this paper

and this topic is discussed in greater detail elsewhere [Rovan 1997, Wanderley 2000, Collins 2003]. When I go to a live concert I would like to see, hear, and experience the creation, or at least the human performance, of the music.

I also like rock music, rock concerts and theatre. At electronic music concerts I often

longed for the excitement and drama that I experience at these other performances. And

when at rock concerts and theatrical performances I often miss the diversity of sounds

available in electronic music.

So I decided to try to bring together the best of both worlds, inspired by the work of several people in the electronic and computer music community who develop unique methods of performing music [Allen 2005, Chadabe 1997, Cook 2001, Sonami 2006, Waisvisz 2006, Weinberg 2004, Young 2002].

I was also inspired by the cultural phenomenon of air guitar playing. This is the playing

of an imaginary guitar using exaggerated gestures of strumming and such [Wikipedia,

2006]. These exaggerated gestures offered what seemed to me at that time an ideal

opportunity to gather gesture data via motion sensors and use that data to control

electronic music. This could potentially offer the performance energy and drama of rock

music while realizing the sonic possibilities of electronic music.


2.0 History & Context

I have been interested in gestural improvisation with electronics for some time. I have

developed a few control systems based on Joel Chadabe’s Solo, and one based loosely on

a dataglove paradigm.

The first successful system I developed was ‘Video Solo’. This system employed a

video-tracking program, using tools from SoftVNS2.0 by David Rokeby [Rokeby 2006].

This system was the first I developed that freed me from the mouse and keyboard control

paradigm. I could control the computer in limited ways by waving my hands in front of a

video camera. The video tracking software tracked the position of my hands (or any light

colored objects) along the two vertical edges of the video frame. The position was

mapped directly to control parameters: a lower position in the frame output a lower value, a higher position a higher value.

In early testing I made what I called a video Theremin. One hand controlled the

frequency of an oscillator; the other controlled the amplitude of the audio signal. This

was my basic proof of concept. The video tracking worked fairly consistently as long as

certain technical considerations were met. These considerations included controlled

lighting and fixed camera exposure and focus settings.

In the next phase of development I tried to emulate Chadabe’s Solo, based loosely on his

personal descriptions and his papers [Chadabe, 1980, 2000, 2002]. One hand controlled

timbre of a Frequency Modulation (FM) synthesizer, which was driven by an algorithmic

note generator. The other hand controlled the tempo of the note generator.

In the last phase of development I mapped the hand positions to the continuous control of

two granular synthesis parameters. One hand controlled density of the grains. The other

hand controlled the amount of random variation in the values of several parameters, which changed with each grain. Some of these parameters included grain duration, pan position, and playback rate of the granulated sample. In addition to the two

continuous controls there was a triggered event control. When the two hands were both

at the top of the frame a new sample would be randomly selected from a library of sounds

and loaded into the granular patch. This allowed for a variety of sounds from very few

control parameters.

This instrument was fun to use for a while, but had several problems and limitations that

made it less than ideal for regular performance. One major problem was timing

resolution. I had to move my hands slowly for the video software to track their position.

This was due to a few factors, mostly the slow frame rate of my video tracking software.

The video input I used had a frame rate of approximately 30 fps. The motion sensing part

of the software compares sequential frames, which means at least two frames must go by

to get any meaningful position or change in position information from the software. At

15 accurate measurements per second, I had a time resolution of 67 ms at best. This

made precise musical timing very difficult if not impossible. Another timing issue is the

latency involved in digitizing the video input and transferring it to the computer over FireWire or USB.

Another major problem with my video tracking patch in softVNS2.0 is that it required

some control of the lighting environment. This is not such an issue in a concert hall

when there are hours to set up and the theatrical lighting can be dimmed so that the light,


from a camera-mounted lamp, reflecting off my hands is the brightest object in the frame.

In other settings, for example a gallery space with fluorescent and outdoor light leaking in, my tracking software was not as consistent and was very frustrating to use in performances and presentations.

A few minor issues were related to portability. The video tracking used most of the CPU

time on a 500 MHz G4 laptop, so the audio software it was controlling had to be rather

sparse. For more complicated sound generating routines another computer was needed.

This, combined with the lights and the stands for lights and cameras, made the system difficult

to carry from show to show.

The next system I developed built upon the audio software from the first system and

replaced the video tracking and camera with a signal analysis patch and an Etherwave

Theremin. The Theremin outputs an audio signal whose frequency and amplitude

of a synthesized tone can be varied by moving one's hands or body closer to and further

from two antennae [MoogMusic 2006, wikipedia]. The signal analysis patch takes the

signal from the audio input of a computer and outputs continuous values based on the

estimated frequency and amplitude of a signal. These values are mapped to control

parameters of signal processing/ synthesis routines.

The Theremin control system solved a few of the problems present in the video based

system, introduced a few of its own problems and helped clarify what I needed to do in

future interface design projects.

The Theremin is a very responsive controller. Even when used to control the computer it

has very little if any perceived latency. Changes in hand positions have immediate effect

on the sound output from the computer. Compared to the video tracking system, the

Theremin was less bothered by changes in the performance environment.

That being said, there are some issues with using the Theremin as a controller. One is

that temperature and humidity have an effect on the capacitive response of the antennae

[Alford 1999]. From personal experience I can also say that the amount I am sweating

has a direct relationship with the response of the device. As I heat up and cool off during

a performance, I have to change my standing position relative to the Theremin. For

example, in one performance I had to move from standing less than an arm's length from the device when I was cool to a step or two out of arm's reach when I was sweating.


The Theremin, too, was a burden to transport from show to show. I could fit

everything for this system except the computer into a hiking pack, which was great. But

these were much heavier objects than the video system. For consistent amplitude

tracking I had to use a microphone preamp and a passive DI box to connect the Theremin

to the audio input of the computer. (Note: I did not research the engineering of why a

longer cable run at line level would alter my amplitude readings.) While this system was

portable enough for public transportation, it was still a burden and left me wanting a more

easily portable system.

Besides the technical and portability issues with the Theremin as a controller, I found the

lack of discrete controls in both the Theremin and video based systems somewhat

limiting. In some situations I used foot pedals to make program changes [fly by wire

v1.0 and v1.1] and other times someone would trigger program changes from the

keyboard while I was at the Theremin.

The limitation of the Theremin as a controller did not keep me from using it but did

inspire me to try something completely different.

My next interface project was based on a data glove paradigm [Sturman 1994, Sonami

2006, wikipedia]. The interface involved a gardening glove with sensors stitched onto it

and an analog to MIDI interface. This interface gave me the discrete control I desired

and the portability. There were continuous flexion sensors on the fingers and switches on

the fingertips and under the thumb. The glove and hardware were so small that they

could fit into the outer pocket of the case I used to carry my computer.

Performing with my data glove at the NYU advanced computer music concert, December 2005. Performing with the Etherwave Theremin as a controller for laptop compositions with the Philippa Kaye Company, March 2005.


But for all the gains in detailed control and portability, I lost the large gesture control of

moving my hands, arms and whole body in space. An additional issue with glove

controllers is ergonomics. I am not as comfortable with a glove on my hand as I am with

an object that I can pick up and put down. So again I moved on, to try to develop an interface that has large-form gesture control as well as discrete control of details and that doesn't require attaching anything to my body.

The Airguitar is my latest and most technically involved control system. This is also the first project I developed as part of an in-depth project for a graduate class on the very

subject of musical controllers. My aspirations were to allow a few of my favorite rock

music gestures, specifically the strumming of a chord on an electric guitar and the

overhead punch of the drummer striking the crash cymbals, to control or trigger

electronic music sounds and processes.

My technical inspirations came from the work of Bert Bongers and Perry Cook. Bongers

built the Lady’s Glove (version 4) and the Hands (version 2) [Sonami 2006, Waisvisz

2006]. Cook outlined principles of making musical controllers and discussed some

specific devices that he developed that directly influence my choice of sensors and how

they were used [Cook 2001]. While developing the Airguitar I also found several

resources on the web that deal with keyers: “A keyer is a device for signaling by hand, by way of pressing one or more switches” [wikipedia].

Having seen the Hands and the Lady’s Glove in live performances, I knew the musical

possibilities with such controllers. They had the gesture and detail control that I sought in

my own work and were wielded by experts who have been using these devices for many

years [Chadabe 1997, Waisvisz 2006, Sonami 2006]. The longevity of these interfaces

made me think that they were excellent tools to model.

Two versions of The Hands [Waisvisz 2006]: version 2 and the current version. The Lady's Glove. Photos by Joel Chadabe and www.media.mit.edu.

The devices Cook described in

his paper, Principles for designing computer music controllers, have also had staying

power. He presented these devices to the NIME class at ITP in the spring of 2006, five years after they were initially discussed in his paper [Cook 2001]. While these devices were developed several years ago, Cook still finds musical use for them. In developing

the Airguitar I sought to create a device that would work as well as these devices, such

that I would want to use the Airguitar for a long time.


3.0 The Airguitar: Hardware Development

The first prototype of the air guitar was very simple. It allowed for the triggering of sounds, of which eight could be selected. The interface comprised two components: one keyer, which represented the frets, in the left hand, and one switcher, which represented the pick, in the right hand.

The keyer was a small box with 6 push buttons mounted on it. One button was placed

under each finger, one extra under the middle finger and one more under the thumb.

There were 2 samples associated with each finger button. The position of the thumb

selected which of the 2 samples would be called when a finger pressed down. The

switcher was a small box with a toggle switch mounted on it. When the toggle changed

state, it would trigger the playback of a sample called by the left hand. The switcher

could be held in one hand and manipulated with the thumb and fingers, but I preferred to

strap it to my belt so that I could quickly flip the switch back and forth.

Above Left: Button-based keyer for original prototype. Above Right: Switch attached to belt.

While this was a start and a proof of concept that buttons and switches could potentially rock, it was much too limiting. I could play only simple rhythm guitar parts consisting of a few notes of fixed duration that did not require any continuous variation such as volume or timbre.

The technical implementation was quick and dirty. The devices connected to an analog-to-MIDI converter circuit under my clothes, and a MIDI cable kept me tethered to my

computer. It was not so comfortable to wear around the house, and took some time to put

on before giving a presentation.

Above: Analog-to-MIDI converter and early Airguitar.

There were also coordination issues for me as a performer. When playing this air guitar I

would often trigger a sample just a few milliseconds before I had put my fingers in the


right position to call the sample I wanted. This motivated me to quickly move from

triggering samples to controlling a synthesis routine, whose triggering and pitch could be controlled independently.

The most obvious synthesis routine for this project seemed to be a plucked string model.

I developed a pseudo plucked string model using Max/MSP, based on the Karplus-Strong Algorithm.

The Karplus-Strong Algorithm (KSA) uses a feedback loop where the period of the delay

is the inverse of the fundamental frequency of the string model. A high feedback

coefficient causes this string to resonate when excited. KSA also uses an FIR filter, such as a feedforward delay [z^-1], in the feedback loop to simulate the natural damping of high frequencies over time. KSA describes the excitation of the string model as filling the

delay buffer with random values (white noise). [Smith 2006, wikipedia]

Figure: Block diagram of the Karplus-Strong Algorithm [Smith 2006]
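For illustration, here is a minimal sketch of the Karplus-Strong idea in Python rather than Max/MSP. The feedback coefficient and the two-point averaging lowpass are standard textbook choices, not values taken from the Airband patches.

```python
import numpy as np

def karplus_strong(freq, duration, sr=44100, feedback=0.996):
    """Rough Karplus-Strong pluck: a delay line excited with noise,
    with a two-point average acting as the lowpass filter in the loop."""
    period = int(sr / freq)                    # delay length = one period of the fundamental
    buf = np.random.uniform(-1, 1, period)     # excitation: fill the delay buffer with white noise
    out = np.zeros(int(sr * duration))
    for n in range(len(out)):
        out[n] = buf[n % period]
        # average the current sample with the previous one (FIR lowpass),
        # scale by the feedback coefficient, and write it back into the loop
        buf[n % period] = feedback * 0.5 * (buf[n % period] + buf[(n - 1) % period])
    return out

pluck = karplus_strong(110.0, 1.5)             # an A2 pluck, about 1.5 seconds long
```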

At first I did not use this algorithm, as I didn’t know much about filter design, and was

trying to follow one of Cook's principles: “sound now, subtlety later” [Cook, 2001].

Instead I used the Max/MSP implementation of a feedback comb filter [comb~]. A

feedback comb filter is an IIR filter, which adds a delayed version of its output back into

the input to the filter. This causes constructive and destructive interference at evenly

spaced intervals in the frequency domain. [Smith 2006, wikipedia]

Figure: block diagram of a comb filter [Smith 2006]

Using a high enough feedback coefficient causes the filter to resonate with peaks at

frequencies harmonically related to the inverse of the period of the delay. I took

advantage of this in my attempts to create a sound similar to a plucked string. [Crawford,

2006]
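For comparison, a correspondingly minimal sketch of a feedback comb filter; the delay length and feedback gain here are arbitrary illustrative values.

```python
import numpy as np

def comb_feedback(x, delay_samples, g=0.98):
    """Feedback comb filter: add a delayed copy of the *output* back to the input.
    With g close to 1 it resonates at harmonics of sr / delay_samples."""
    y = np.zeros(len(x))
    for n in range(len(x)):
        delayed = y[n - delay_samples] if n >= delay_samples else 0.0
        y[n] = x[n] + g * delayed
    return y

sr = 44100
impulse = np.zeros(sr)
impulse[0] = 1.0
rung = comb_feedback(impulse, delay_samples=sr // 220, g=0.98)   # peaks near multiples of ~220 Hz
```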


The Max/MSP implementation of a comb filter [comb~] contains a feed forward delay at

the input but not in the feedback loop itself. This is where [comb~] differs from the

KSA. Rather than sounding like a plucked string when excited, [comb~] outputs a sound

more like a standing wave in a tube. While not completely satisfied with this algorithm,

it worked well enough to inspire some significant modifications to the hardware device.

This routine required continuous controls to be used in performance. Specifically, the

amount of feedback used in the delay loop controlled how long the filter would resonate

after an impulse was passed into it. Also, the timbre of this instrument could become tiresome very quickly without some subtle changes in tone. The buttons on the keyer could be used to change the delay time (and therefore the pitch) of the filter, and the toggles on the switcher could trigger an impulse. But without any continuous controls I was at a loss.

The next development of the air guitar addressed the control needs of this synthesis

routine. The device built this time had a few continuous controls in place of switches. A keyer was used for the left hand again. This one was built in and around a

piece of PVC pipe, specifically a 90º 2.5” corner fitting.

Above Left: PVC pipe with top surface filed flat for mounting FSRs. Above Right: Analog-to-MIDI converter that fits inside the PVC fitting.

This pipe fit comfortably in my hand, much more comfortably than the box of the

previous design. Low-profile force sensing resistors (FSRs) were used in place of

buttons. A small analog to MIDI converter was placed inside the pipe and powered with

a 9-volt battery.

For the right hand, I developed a plucker. The first round involved a flexion sensor

mounted perpendicular to a flat surface so that I could flick it with my thumb or finger to

trigger an event. Also mounted on this surface were a few FSRs. I planned to use these

for continuous control of feedback and filtering parameters. Mounted on the underside of

this surface was a dual axis accelerometer, which was used also to trigger an event. Also

mounted on the underside were another analog to MIDI converter and a 9-volt battery.


Above Left: Plucker top surface, FlexSensor and FSR strips. Above Right: Plucker internal workings, ADC, two-axis accelerometer.

While the plucker allowed for the continuous control I needed for the synthesis routine, I

was not satisfied with the keyer. Each button on the keyer was associated with a specific

pitch. 8 buttons could be used over the range of an octave using a diatonic scale.

Above: 8 FSR Keyer, 0.5 inch diameter on top, smaller diameter sensors on the side. Strap made of electrical tape goes around back of thumb.

I wanted a lot more pitch options. So I remapped the device considerably. Rather than

one pitch per button over 8 buttons, I used only four buttons in combination. I did this

for two reasons. Keeping the fingers in one position made the device easier to play with

less hand fatigue, and with the combinations of 4 buttons I had 16 discrete values.

Based on a few pictures and diagrams of chord keyers (see images below) I decided to again mount two buttons under the thumb. With these thumb buttons changing the state of the finger values I now had a range of 48 discrete values. This allowed for an

improved expressive range. But when mapped to notes in a 12-tone octave, the finger

positions were different for the same note over the 48 values. This made memorizing the

finger positions and accurately playing pitches so difficult that I decided to scale back the

range. Instead of each thumb position shifting the finger pattern by 16, I made it just

shift by 12. The range would then be 40 discrete values, but the finger patterns were the same, making memorization much easier and my playing more accurate.

Far Left: Septambic Keyer

[Mann 2006, wikipedia]

Near Left: Twiddler2 [Mann 2006, wikipedia]


Finger Chart for Keyer: 1 = FSR pressed, 0 = not pressed. The left half shows the original mapping, in which the thumb keys shift the finger pattern by 16; the right half shows the revised mapping, in which they shift by 12.

indx mid ring pnky t.Low t.High num Pitch | indx mid ring pnky t.Low t.High num Pitch

0 0 0 0 0 0 0 c 0 0 0 0 0 0 0 c

1 0 0 0 0 0 1 c# 1 0 0 0 0 0 1 c#

0 1 0 0 0 0 2 d 0 1 0 0 0 0 2 d

1 1 0 0 0 0 3 d# 1 1 0 0 0 0 3 d#

0 0 1 0 0 0 4 e 0 0 1 0 0 0 4 e

1 0 1 0 0 0 5 f 1 0 1 0 0 0 5 f

0 1 1 0 0 0 6 f# 0 1 1 0 0 0 6 f#

1 1 1 0 0 0 7 g 1 1 1 0 0 0 7 g

0 0 0 1 0 0 8 g# 0 0 0 1 0 0 8 g#

1 0 0 1 0 0 9 a 1 0 0 1 0 0 9 a

0 1 0 1 0 0 10 a# 0 1 0 1 0 0 10 a#

1 1 0 1 0 0 11 b 1 1 0 1 0 0 11 b

0 0 1 1 0 0 12 c 0 0 1 1 0 0 12 c

1 0 1 1 0 0 13 c# 1 0 1 1 0 0 13 c#

0 1 1 1 0 0 14 d 0 1 1 1 0 0 14 d

1 1 1 1 0 0 15 d# 1 1 1 1 0 0 15 d#

0 0 0 0 1 0 16 e 0 0 0 0 1 0 12 c

1 0 0 0 1 0 17 f 1 0 0 0 1 0 13 c#

0 1 0 0 1 0 18 f# 0 1 0 0 1 0 14 d

1 1 0 0 1 0 19 g 1 1 0 0 1 0 15 d#

0 0 1 0 1 0 20 g# 0 0 1 0 1 0 16 e

1 0 1 0 1 0 21 a 1 0 1 0 1 0 17 f

0 1 1 0 1 0 22 a# 0 1 1 0 1 0 18 f#

1 1 1 0 1 0 23 b 1 1 1 0 1 0 19 g

0 0 0 1 1 0 24 c 0 0 0 1 1 0 20 g#

1 0 0 1 1 0 25 c# 1 0 0 1 1 0 21 a

0 1 0 1 1 0 26 d 0 1 0 1 1 0 22 a#

1 1 0 1 1 0 27 d# 1 1 0 1 1 0 23 b

0 0 1 1 1 0 28 e 0 0 1 1 1 0 24 c

1 0 1 1 1 0 29 f 1 0 1 1 1 0 25 c#

0 1 1 1 1 0 30 f# 0 1 1 1 1 0 26 d

1 1 1 1 1 0 31 g 1 1 1 1 1 0 27 d#

0 0 0 0 0 1 32 g# 0 0 0 0 0 1 24 c

1 0 0 0 0 1 33 a 1 0 0 0 0 1 25 c#

0 1 0 0 0 1 34 a# 0 1 0 0 0 1 26 d

1 1 0 0 0 1 35 b 1 1 0 0 0 1 27 d#

0 0 1 0 0 1 36 c 0 0 1 0 0 1 28 e

1 0 1 0 0 1 37 c# 1 0 1 0 0 1 29 f

0 1 1 0 0 1 38 d 0 1 1 0 0 1 30 f#

1 1 1 0 0 1 39 d# 1 1 1 0 0 1 31 g

0 0 0 1 0 1 40 e 0 0 0 1 0 1 32 g#

1 0 0 1 0 1 41 f 1 0 0 1 0 1 33 a

0 1 0 1 0 1 42 f# 0 1 0 1 0 1 34 a#

1 1 0 1 0 1 43 g 1 1 0 1 0 1 35 b

0 0 1 1 0 1 44 g# 0 0 1 1 0 1 36 c

1 0 1 1 0 1 45 a 1 0 1 1 0 1 37 c#

0 1 1 1 0 1 46 a# 0 1 1 1 0 1 38 d

1 1 1 1 0 1 47 b 1 1 1 1 0 1 39 d#


On the plucker I had one FSR that acted as an octave shift, much like the thumbs on the

left hand. This was needed when the keyer only output a small range. But with the keyer

outputting a range of 3+ octaves, the octave shift became a bother. The string model I

was using did not sound as rich or sustain as well in the upper octaves as it did in the

lower octaves. So the octave switch on the right hand became a synthesis switch. When

that FSR was pressed it would trigger a sustaining sound. When released the sound

would fade away. The left hand could control the pitch of this sound, and could thus be

used to play a sort of lead line.

At this point I had a fun instrument to play. I could go from rhythm to lead guitar with

just a slight change in hand position and had enough control of the sound and timing of

things that I could play some simple parts along with a prerecorded drum track.

Less regular and syncopated parts were still very difficult to play with these devices. The

plucker specifically had its limitations. The flexion sensor on the plucker had to be

slowly “cocked” back to prepare to be quickly moved on an upbeat.

For the time being though I was willing to accept these limitations. My main concern

was that I remained tethered to a midi interface and the computer via MIDI cables. I

wanted to be wireless.

In brief I will say this did not work well. I should have listened to Perry Cook: “Wires are

not that bad (compared to wireless)”[Cook 2001]. To keep development time to a

minimum, all the gesture mapping was handled in the software on the computer, rather

than in the firmware on the microcontroller in the analog-to-MIDI circuit. This means

that when using MIDI there is a minimum system latency of 8 milliseconds when all the

channels of one controller are in use. For what I was doing 8 ms is not too bad. But this

is using a baud rate of 31250bps. The wireless transmission technology I could afford at

the time of this prototype worked somewhat consistently at a baud rate of 2400 bps. So

transmissions were 13 times slower, leaving me with a system latency of 100 ms or more.
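A back-of-the-envelope check of those numbers (an illustration, not a measurement):

```python
# Rough latency comparison between wired MIDI and the cheap wireless link.
MIDI_BAUD = 31250          # standard MIDI serial rate, bits per second
WIRELESS_BAUD = 2400       # rate at which the affordable wireless link worked consistently
WIRED_LATENCY_MS = 8       # wired system latency with all channels of one controller in use

slowdown = MIDI_BAUD / WIRELESS_BAUD                 # about 13x
estimated_wireless_ms = WIRED_LATENCY_MS * slowdown
print(f"{slowdown:.0f}x slower -> roughly {estimated_wireless_ms:.0f} ms")   # 13x slower -> roughly 104 ms
```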

Above Left: Wireless transmitter in place of the MIDI plug; the yellow coil is an antenna. Above Right: Wireless receiver connected to a PIC [Microchip, 2006] running serial-to-MIDI conversion.

For ambient electronic music this would probably be acceptable, but for a rhythm

instrument, 100ms feels like an eternity. In addition to the latency issue was the problem

that the wireless connection would drop out for a few moments every so often. This

made playing with any rhythmic precision nearly impossible. So I went back to the

hardwired solution.


With the Airguitar working fairly well when connected via MIDI cables, I decided to

make another air guitar to play air bass or air keyboard parts along with the recorded

drums and live air guitar. This controller would have two keyers, the right hand would

mirror the left. Without a specific plucker device, triggering of events would be handled

by accelerometers mounted inside the PVC housing or by pressing the key on the right

hand. This keyer pair with two-axis accelerometers in both hands became what is now

the latest version of the air guitar.

Above: The latest Airguitar controllers: 6 FSRs, 2-axis accelerometer, MIDI out. Strap made of electrical tape (for now) goes around the back of the hand.

The Airguitar connects with Max/MSP via MIDI continuous control. Each analog-to-digital converter pin on the PIC chip is assigned a MIDI continuous controller number on channel one. The patch below receives the MIDI control data, displays the change in data over time and sends the data out as a list. This patch may be used in multiple instances, as it takes an argument for the port letter associated with a MIDI interface. The port letter is then mapped to an integer value via a lookup table, and all the [ctlin] objects in this instance of the patch will only listen to control data coming into the assigned MIDI port. This allows each device to run the exact same MIDI code. This is convenient for developers and performers. Only one program need be coded, compiled and written to the PIC chip. Left-handed players may use these instruments without any reprogramming. One only needs to switch the plugs so that the left hand plugs into port B and the right hand plugs into port A of the MIDI interface.
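As an illustration of the same per-port routing idea outside Max/MSP, here is a sketch using the Python mido library. The port names are placeholders for whatever the MIDI interface actually reports, and the hand-to-port assignment simply mirrors the description above.

```python
import mido

# Hypothetical port names; substitute whatever mido.get_input_names() reports.
PORTS = {"A": "MIDI Interface Port A", "B": "MIDI Interface Port B"}
HANDS = {"A": "right", "B": "left"}    # left-handed players swap the plugs, not the code

def listen(port_letter):
    """Print continuous-controller data from one keyer, tagged by hand."""
    with mido.open_input(PORTS[port_letter]) as port:
        for msg in port:
            if msg.type == "control_change" and msg.channel == 0:   # MIDI channel one
                print(HANDS[port_letter], "cc", msg.control, "=", msg.value)

listen("A")    # one listener per device; both keyers run the exact same firmware
```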


4.0 Airguitar Software

This chapter describes the mapping, signal generation and digital audio effects used in the

Airband performance patch. The Airband patch uses 297 instances of 44 abstractions

created by the author for the Airband and other projects. Each patch developed

specifically for the Airband and some other sub-patches used extensively in the Airband

patch are pictured and explained below.


4.1 Control Maps

[tab.4bit]

This patch takes the continuous controls from four FSRs, one under each finger, and

maps them to a single numeric output based on a 4bit word. It does this by first

considering each continuous control signal to be a binary integer (bit) value (1 or 0). The

state of this bit value is set according to whether the continuous control signal is above or below a preset threshold. Once each continuous value is set to a bit, the 4 bits are each scaled by a

power of two and summed to form a single number in base ten. [See chart in hardware

chapter].

[tab.4bit++]

This patch uses the two FSRs under the thumb to transpose the output of the 4bit patch. The FSR on the inside of the curve of the pipe (thumb low) transposes the output by an octave (12nn); the FSR on the top of the curve (thumb high) transposes the output by two octaves (24nn). [See chart in hardware chapter]
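As an illustration of these two mappings outside Max/MSP, here is a small Python sketch. The threshold value is an assumption (the actual preset lives in the patch); the numbering follows the finger chart in the hardware chapter.

```python
THRESHOLD = 64   # illustrative threshold on a 0-127 continuous-control scale

def four_bit(index, middle, ring, pinky):
    """[tab.4bit]: threshold four FSR values into bits and sum them as a 4-bit word."""
    bits = [v > THRESHOLD for v in (index, middle, ring, pinky)]
    return sum(int(b) << i for i, b in enumerate(bits))     # 0..15, index finger is bit 0

def four_bit_pp(index, middle, ring, pinky, thumb_low, thumb_high):
    """[tab.4bit++]: thumb low transposes by an octave (12), thumb high by two octaves (24)."""
    num = four_bit(index, middle, ring, pinky)
    if thumb_low > THRESHOLD:
        num += 12
    if thumb_high > THRESHOLD:
        num += 24
    return num                                              # 0..39 with one thumb key at a time

print(four_bit_pp(120, 0, 0, 0, 0, 120))   # index + thumb high -> 1 + 24 = 25 (c#), as in the chart
```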


[tab.xltobang]

This patch takes the continuous control from the accelerometers and outputs a trigger

(bang). It measures direction change on one part of one axis of acceleration. It uses a

scaling factor to ignore changes in direction that involve small amounts of force, such as

vibrations and slow movements. It requires a rather quick and/or forceful motion to

produce a large enough number such that the change in direction is detected. The part of

the axis of acceleration that is not measured is reserved for rests, in order to prevent accidental triggers.
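A rough sketch of this trigger logic in Python; the threshold, the choice of which half of the axis is the rest zone, and the exact direction test are illustrative assumptions rather than values taken from the patch.

```python
class XlToBang:
    """Emit a trigger when the measured half of an accelerometer axis reverses
    direction with enough force; small wiggles and the rest half are ignored."""
    def __init__(self, threshold=20, rest_boundary=64):
        self.threshold = threshold          # minimum change that counts as a deliberate move
        self.rest_boundary = rest_boundary  # values at or below this are the reserved rest zone
        self.prev = rest_boundary
        self.prev_delta = 0

    def update(self, value):
        delta = value - self.prev
        trigger = (value > self.rest_boundary          # only the measured half of the axis
                   and abs(delta) > self.threshold     # ignore vibrations and slow movements
                   and self.prev_delta > 0 > delta)    # direction change: rising, then falling
        self.prev, self.prev_delta = value, delta
        return trigger
```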

[tab.tiltparam]

This patch takes the continuous control from the accelerometers and outputs a normalized

continuous control value such that the minimum tilt and maximum tilt are 0 and 127

respectively. The output is passed through [lang.smooth] to interpolate between

sequential measurements, as there are large jumps between the adjacent tilt values.


[tab.tilttovol]

This patch is designed to work in conjunction with [tab.xltobang]. This patch works much like [tab.tiltparam] in that it outputs a range of values from 0 to 127, based on continuous tilt measurements, but it only measures half of an axis of acceleration, such that the range that will not cause a trigger will be at minimum volume. Interpolation is used to smooth changes in quantized control data. Fast movements use minimal interpolation whereas slow movements use long interpolation times. This helps volume control be both sensitive and bold.
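A sketch of that mapping in Python; the axis split point and the speed-dependent smoothing factor are illustrative assumptions.

```python
def tilt_to_vol(raw, rest_boundary=64, axis_max=127):
    """Map the measured half of the tilt axis to 0-127; the rest half stays at zero volume."""
    if raw <= rest_boundary:
        return 0
    return round((raw - rest_boundary) / (axis_max - rest_boundary) * 127)

class Smoother:
    """Interpolate quantized control data: fast moves get little smoothing,
    slow moves get a lot, so the volume control is both sensitive and bold."""
    def __init__(self):
        self.value = 0.0

    def update(self, target):
        speed = abs(target - self.value)
        alpha = min(1.0, 0.1 + speed / 127.0)   # bigger jumps are followed more quickly
        self.value += alpha * (target - self.value)
        return self.value
```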

[tab.PitchMap]

This patch uses [tab.4bit++] and one axis of acceleration to output a pitch with vibrato (float). The integer pitch value from [tab.4bit++] is also output, along with the unused axis of acceleration.

[tab.pulserMap] (figure on next page)

This patch maps the FSRs to control a clock. Using gates in various combinations,

different regular rhythms may be generated, based on a preset tempo in BPM where the

time signature is 4/4. For example the index finger FSR set to high turns on the clock

and has it output quarter notes, the middle finger FSR set to high outputs 8th notes, both

together output 16th notes, and so on. This patch also incorporates [tab.xltobang], such

that when no keys are pressed, irregular rhythms or one-shots may be triggered. The unused axis of acceleration and the thumb-controlled values are passed out of this patch to

be used with signal generators or effects, if needed.
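A sketch of that gate-to-clock mapping, covering only the finger combinations spelled out above; the tempo and the behaviour of the remaining combinations are assumptions.

```python
def pulse_interval_ms(index_on, middle_on, bpm=120):
    """Return the clock interval implied by the finger gates, or None for 'clock off'.
    Index alone gives quarter notes, middle alone 8th notes, both together 16th notes."""
    quarter_ms = 60000.0 / bpm
    if index_on and middle_on:
        return quarter_ms / 4      # 16th notes
    if middle_on:
        return quarter_ms / 2      # 8th notes
    if index_on:
        return quarter_ms          # quarter notes
    return None                    # no keys pressed: one-shots come from [tab.xltobang] instead

print(pulse_interval_ms(True, True, bpm=120))   # 125.0 ms between 16th notes at 120 BPM
```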


Above: [tab.pulserMap] patch Below: [tab.nnSynthMap] patch


[tab.nnSynthMap]

This patch uses [tab.PitchMap] on the left hand along with [tab.4bit] and [tab.tilttovol]

on the right hand as a generalized monophonic synthesizer control scheme. [tab.4bit] is

used to control the on/off gate for the amplitude output, as well as to set a transposition

factor. The current transposition mapping is as follows:

Index: no transposition
Middle: one octave
Ring: three octaves
Pinky: five octaves
All: seven octaves

When more than one finger is pressed the highest possible transposition value is output. With the transposition and three-octave range of [tab.4bit++] this mapping routine has an output range of 10 octaves. [tab.tilttovol] controls the output amplitude, when the

on/off gate is open/on. The right hand thumbs and one axis from each hand are passed

out of this patch for use with effects or other continuous controls for the synthesizer.


4.2 Signal Generators

[tab.electricGuitar]

The next development in the Airguitar software was a synthesized guitar sound, which had several continuously controlled parameters. At the core of this patch is a rough implementation of a plucked string model. This subpatch, based on the Karplus-Strong Algorithm, replaced the [comb~] object used in earlier implementations of the Airguitar (see the KSA vs. [comb~] discussion in chapter 3). This string model is separated into two main components, an exciter and a resonator. The resonator is the core of this variation of the KSA. The exciter usually used in KSA involves filling the delay buffer with random values, essentially white noise [Smith, 2006]. I don't care much for the initial burst of white noise heard with this technique. So instead I use a filtered impulse to excite the resonator. Once this string sound is produced it is lowpassed and distorted with a waveshaping filter called [overdrive~]. This filter creates an effect similar to that of an overdriven tube amplifier [cycling74, 2006]. The pitch controls for this patch may be interpolated at the signal rate to produce glissando and smooth trill effects. Dampening is mapped in this patch to reduce the distortion, reduce the feedback coefficient in the resonator patch and lower the cutoff frequency of the lowpass filter between the impulse generator and the resonator patches.


[lang.igroov~] as used in [tab.loopSynth~]

[tab.loopSynth~]

This patch uses short percussive sounds as the sample material for a table lookup

oscillator. This patch uses the groove~ object with a loop synced window, to prevent

clicks at the beginning and end of a loop. Controlling the loop length, rather than the playback rate, sets the pitch. Signal-rate interpolation of the changes in loop length can

create glissandi effects. To play a high-pitched sound the source signal must have a lot

of energy at the very beginning of the loop. Percussion samples work well for this. A

convenient artifact of using loop length instead of play back rate is that high frequencies

are rolled off as even percussive sounds have less energy at the onset than they do just a

few milliseconds after the onset.

Also, since this patch can use the same source material, the sounds in this instrument easily match the palette of the sounds produced with the looping and sample-playing patches.
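The loop-length/pitch relationship this patch relies on is simply the period of the desired fundamental; a small illustration, assuming a 44.1 kHz sample rate:

```python
SR = 44100   # assumed sample rate

def loop_length_samples(freq_hz):
    """At normal playback rate, a loop one period long sounds at that fundamental."""
    return SR / freq_hz

print(loop_length_samples(440.0))   # about 100.2 samples (roughly 2.3 ms) for A 440
```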


[lang.jmsl.multisampler~]

This patch is a poly~ abstraction from a collection of code for using the Java Music

Specification Language (JMSL) to control Max/MSP [Didkovsky 2006]. These objects may also be used without JMSL. This patch takes a note number and trigger as input,

and outputs a sound. This patch may be polyphonic, with up to 16 voices simultaneously

sounding. Samples are indexed within one audio file. The patch then assigns one index

location per note up to 127 different index values (zero is reserved for rests/silence). This

patch also includes volume control, such that the trigger is not just a bang, but an integer

value. The volume is set when the note is triggered and cannot be continuously

controlled afterwards.

[lang.jmsl.singlesampler~]

This patch works very much like multisampler, except that it uses only one sound. This

sound is transposed across the keys, with middle C being the original playback rate. Notes below middle C transpose the playback rate down by half steps, such that one octave lower than middle C plays the sound half as fast, thus it sounds an octave lower. The notes above middle C transpose the sound up by half steps.


Above: [lang.dj.xfade2~]

[lang.dj.xfade2~]

This patch takes a three-item list as input (file name, loop state, and cross-fade time) and outputs an audio signal. The file name calls a file to be played from disk.

Loop state sets the file to play as a loop or play one time. Cross-fade time sets how long it

takes the new sound to fade in and the old sound (if any) to fade out. Only two voices

may be played simultaneously.


4.3 Signal Processors

[lang.delloop~]

This patch takes any audio input and records a section into a delay buffer, with an input gate and a feedback gate coordinated such that a loop may be sampled and repeated without build-up caused by continuous input. This loop will play until it is time for a new loop to be sampled. Each time a new loop is sampled the delay buffer is cleared so that no residual signal from the previously sampled loop will be present in the newly sampled loop.

[lang.samp&buzz~]

This patch is similar to [lang.delloop~]. It uses a buffer~ and groove~ combination rather than a delay buffer, and its loop playback is finite. The loop length either becomes longer or shorter for each repetition of the loop, depending on an optional argument. If the loop length becomes shorter, the sound output is a rhythmic acceleration that continues into a pitched glissando. If the loop gets longer (it must start very short), the sound output starts with a pitched glissando and continues into a gradual ritardando. The sound stops when it reaches preset long and short extremes of loop length.


[tab.resonator~]

This patch is a resonant lowpass filter. It takes a signal to be filtered and two continuous control parameters: cutoff frequency and resonance. The resonance is limited

to ranges such that the filter does not self oscillate. The control inputs are interpolated at

the signal rate so that changes in cutoff frequency and resonance do not cause clicks.

[lang.pingpongdelay2~]

This patch takes a mono or stereo signal and outputs a stereo delayed signal. There are

two delay buffers, one output for each channel. There is a feedback network such that the

output of one channel is fed back into the input of the other channel. The delay times of

the two buffers and the feedback level may be set as arguments or may be continuously

controlled.
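A minimal sketch of that cross-feedback structure in Python; the delay times and feedback level are illustrative, and real-time block processing is omitted.

```python
from collections import deque

class PingPongDelay:
    """Two delay lines whose outputs feed each other's inputs, so echoes
    bounce back and forth between the left and right channels."""
    def __init__(self, sr=44100, delay_l=0.3, delay_r=0.45, feedback=0.5):
        self.buf_l = deque([0.0] * int(sr * delay_l))
        self.buf_r = deque([0.0] * int(sr * delay_r))
        self.feedback = feedback

    def process(self, in_l, in_r):
        out_l, out_r = self.buf_l.popleft(), self.buf_r.popleft()
        # each channel's delayed output is fed back into the *other* channel's input
        self.buf_l.append(in_l + self.feedback * out_r)
        self.buf_r.append(in_r + self.feedback * out_l)
        return out_l, out_r
```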


5.0 Instruments

Together synthesis routines and controller maps formed a variety of instruments. I refer

to these combinations as instruments as they suggest or were inspired by certain gestures

and approaches to playing. While many instruments were tried and tested during the

development of the Airguitar and the formation of the Airband, I will focus on

instruments that are in regular use at the time of this writing. These instruments include

Rhythm Guitar, Lead, Techno, Shakey-Shakey, Rhythm Sampler, and Tape-Jockey.

5.1 Rhythm Guitar

The rhythm guitar instrument employs the [tab.pitchMap] and [tab.Pulser] mapping

routines to control [tab.electricGuitar~] and [tab.resonator~]. This instrument and previous versions of it have been in use the longest. It was the first of these instruments to take advantage of the gestural possibilities of playing an electronic music

instrument with the “air guitarist” gestures I adore. Played in the lower register the sound

is very rich and dense, which works well for the rock guitar sounds and gestures. Played

with a little dampening and combined with a regular sequence of triggers from

[tab.pulser] very basic rhythm guitar parts can be reproduced.

This instrument can also be used for sounds and styles other than guitar and rock. Played in the top register with a small amount of dampening, the sound is short and thin. I use this method along with slow-moving filter sweeps on [tab.resonator~] for playing rhythm

parts in a techno song for example.

5.2 Lead

The lead instrument employs the [tab.nnSynthMap] mapping routine to control

[tab.loopSynth~] and [tab.resonator~]. This instrument is often used with

[lang.pingpongdelay2~]. While this instrument was originally intended to play sustaining lead guitar lines over the rhythm guitar parts, it may be used to play almost anything. It has a frequency range that covers bass through treble instruments. Its range is wide enough that three people can perform with it in different registers without overlapping.

The pitch and rhythm controls are direct. They are not mediated by a control mapping

such as [tab.pulser]. The notes are either on or off depending on the state of the fingers

on the right-hand keyer. Notes may be faded in and out with [tab.tilttovol] but must be on the

entire time.

The [tab.resonator~] patch is key to performing with this instrument. In one piece for

example, we each pick a note of the overtone series of C, and we stay on that note while

varying the cutoff and resonance controls (exploring the overtones of that note). In this

piece we all play with different delay time and feedback settings for

[tab.pingpongdelay2~] . With feedback coefficients between 0.2 and 0.6 we can create

and control a layered sound as gestures add to the delay loop. With higher feedback

coefficients we can have an even more layered sound but some control is lost as sounds

remain in the delay loop for several seconds if not minutes.

This instrument may be one of the easiest to learn to play, as the mapping feels obvious

once the somewhat arbitrary 4bit mapping to pitch scheme is memorized. But with its

range and direct mapping it can also be very frustrating. One can be in the middle of


playing a bass line and slip into a register five octaves higher. There is one piece where

we take advantage of this. We refer to the piece as the R2D2 duet. In this piece we

squeak, squawk, beep, buzz and whine back and forth as if having a conversation in the

language of the R2 units from “Star Wars”.

5.3 Techno

This instrument employs six [lang.delloop~] patches and two [lang.samp&buzz] patches

as well as one [tab.resonator~] patch for each hand. The delloop and samp&buzz patches

are controlled directly by the FSRs under the fingers and thumbs. The thumb high FSR,

controls the volume of the direct signal passing through the techno patch. The thumb low

FSR selects between bank one and bank two of delloop patches and samp&buzz patches.

The finger FSRs control the volume of the recording and volume of loops. When a finger

FSR is pressed a loop is recorded and can be faded in and out. When the FSR is released

the loop stops. Each loop is set to repeat at a different subdivision of a master loop

duration that is passed to the techno patch.

The sampled loops are summed and passed to the dry output (though amplitude scaled by

0.5) and to [tab.resonator~]. The dry output and resonated output are soft panned such

that the resonated signal for each hand is panned towards that hand and the dry signal is

panned to the opposite side.

As this instrument does not generate signal it must be used in conjunction with a signal

generator. [lang.dj.xfade2~] provides the input audio in the latest version of the Airband

patches. Whenever a new tape cue is played, a tempo setting associated with the new cue

must be passed to techno.

Techno gets its name from its effect on the input signal. It can turn most any audio

source from spoken word, to drum tracks, to abstract field recordings into rhythmic

material on the fly.

Techno can create a variety of drum fills and transitions. It is especially effective for

transitions from one rhythmic tape cue to another. For example, several sub loops of a

main loop may be played at once, while [tab.resonator~] provides a filter sweep just as a

new tape cue is dropped right on the down beat of the next measure.

5.4 Shakey-Shakey

This instrument employs [tab.4bit++] and [tab.xltobang] to call and trigger samples with

[lang.jmsl.multisampler~]. The two axes of tilt from each hand are mapped to control

volume and transposition. The transposition control allows for one sample in the multi-

sample bank to be played at different playback rates, thus providing for many sound

possibilities. This instrument has both hands controlling their own sounds, such that one hand can be triggering a percussive sample bank while the other could be triggering a

collection of pitched samples, for example. This instrument is often used in conjunction

with [lang.pingpongdelay2~] to create abstract textures, as triggered samples are layered both with polyphony and with long delay loops with high feedback coefficients.

This instrument can be fun to play as it feels physically much like playing a shaker

instrument, such as maracas, while the sounds produced can be almost anything.


5.5 Rhythm Sampler

This instrument works very much like the Rhythm Guitar instrument with one major

difference. The signal generator is a polyphonic sampler instead of a string model and the

duration of each note is fixed when it is triggered, and cannot be shortened later. The

same FSR that affects note duration in the rhythm guitar affects note duration for this

instrument. This patch was originally developed with multi-sampler, but it was too

confusing to use in a meaningful way with different sounds. With one sample mapped to

pitches this patch was much easier to learn to play in a few short rehearsals.

This patch is great to use with fixed duration bass samples, as a rhythm bass instrument.

While the [tab.resonator~] patch is used here too, it has little effect on bass samples

except to change the tone of the attack transients.

5.6 Tape-Jockey

This instrument employs [tab.4bit++] and [tab.xltobang] to control [lang.dj.xfade2~] as well as the system presets. This instrument is always available as a quasimode. A quasimode allows for the benefits of a mode without the cognitive burden of remembering which mode the system is in [Wikipedia, Raskin 2000]. A sustain pedal

accesses this quasimode. When the pedal is engaged the right hand selects a tape cue or a

system preset, the left hand triggers the cue or preset. The presets include effect settings

for [lang.pingpongdelay2~], which instrument is playing and what tempo the rhythm

instruments and techno should reference. The first two octaves of the 40 note range are

reserved for tape cues, one of these cues is always a silent cue so that rests may be

triggered. These cues can change with each song. The last 15 notes are reserved for

system presets. These presets are the same for every song. Examples of system presets

and tape cues can be seen in the next chapter discussing scores and deliverable materials.

This instrument allows for realtime playback of any tape music piece. The piece could be

segmented into several shorter sections of the entire piece which one person could

perform. Or the piece could be decomposed into several separate tracks or stems each

segmented into short sections, which an ensemble perform and rearrange as desired.

The next chapter will describe how a composer could produce instructions and audio files

for the Airband to perform a tape music piece.


6.0 Deliverables

The Airband can play music described by composers in several ways. When limited to

the Lead and Rhythm Guitar instruments a composer can describe her piece

entirely with common music notation. When using the other instruments, sound files

must be supplied in addition to written parts. When using Shakey-Shakey or TapeJockey

(a requirement for Techno), index files must be supplied in addition to audio files and

written parts. These index files describe how the software will play back the multi-sample

banks in Shakey-Shakey, and describe a variety of details in TapeJockey, such as audio

playback controls, tempo settings and effects presets.

The pitch range of the Lead instrument is from the C three octaves below middle C to the

D# 7 octaves above middle C. This range corresponds to MIDI note numbers 24 to 135, and fundamental frequencies of 32 Hz to approximately 20 kHz.

The pitch range of the Rhythm Guitar is from the C three octaves below middle C to the

D# above middle C. This range corresponds to the MIDI note numbers 24 to 63 and the

fundamental frequencies of 32 Hz to 311 Hz.
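These figures follow from the standard equal-tempered MIDI-note-to-frequency conversion (note 69 = A 440 Hz), sketched here as a quick check:

```python
def midi_to_hz(note, a4=440.0):
    """Standard equal-tempered conversion: each semitone is a factor of 2**(1/12)."""
    return a4 * 2 ** ((note - 69) / 12)

print(round(midi_to_hz(24), 1))   # 32.7   -> C three octaves below middle C
print(round(midi_to_hz(63), 1))   # 311.1  -> the D# above middle C
print(round(midi_to_hz(135)))     # 19912  -> top of the Lead range, roughly 20 kHz
```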

Audio files are assumed to be 16-bit audio sampled at 44.1 kHz, saved as AIFF or WAVE. Audio delivery of samples for TapeJockey assumes all files are stereo; mono files will be played only out of the left channel. Shakey-Shakey and Rhythm Sampler assume sample files to be mono; only the left channel of stereo sample files will be used.

Below are some examples of scores successfully used in performance with the Airband.

One might notice that structures are loose in these scores. At the time of this writing the

Airband was moving from freely improvised music exclusively on the pitched and sample-based instruments (Lead, Rhythm Guitar, Shakey-Shakey and Rhythm Sampler) to playing more structured compositions which incorporated the long-sound-file and effects-based instruments (TapeJockey and Techno).


(Spreadsheet header, for reference only.) Columns: note name (for reference only); Cue Number; comma; tape, fx or instr; file name / on/off / instr number; loop on/off / delay1 tempo (if applic.); xfade time / delay2 tempo; feedback scale; semicolon.

C 1 , tape silence 0 100 120 ;

C# 2 , tape FuegRptLow.aif 0 100 120 ;

D 3 , tape FuegRptNoiseBig.aif 0 100 120 ;

D# 4 , tape FuegRptNoiseSml.aif 0 100 120 ;

E 5 , tape NOISE.AIF 0 100 120 ;

F 6 , tape NOISE3.AIF 0 100 120 ;

F# 7 , tape LITTLE.AIF 1 100 120 ;

G 8 , tape DryWords.aif 0 100 120 ;

G# 9 , tape EnMiTodo.aif 1 100 120 ;

A 10 , tape FuegoRpt1.aif 1 100 120 ;

A# 11 , tape InMe.aif 0 100 120 ;

B 12 , tape InMe2.aif 1 100 120 ;

C 13 , tape FIRE.AIF 1 100 120 ;

C# 14 , tape ASH.AIF 1 100 120 ;

D 15 , tape AiAmorAh.aif 0 100 120 ;

D# 16 , tape NothNorm.aif 0 100 120 ;

E 17 , tape ExtngNorm.AIF 0 100 120 ;

F 18 , tape FORGTN.AIF 1 100 120 ;

F# 19 , tape DeHarQue.wav 1 100 120 ;

G 20 , tape IShlStp.aif 0 100 120 ;

G# 21 , tape RptFirAsh.aif 0 100 120 ;

A 22 , tape silence 0 10 120 ;

A# 23 , tape silence 0 10 120 ;

B 24 , tape silence 0 10 120 ;

This is a spreadsheet used to create the index file for [Transcending Translation], a tape music piece adapted and arranged for the Airband by the original composer, Laura M Sinnott. The spreadsheet is a template with which a composer may easily create and edit an index file. The index file is a plain text (txt) file containing just the cue number through the semicolon on each line. The tinted area (the header and the reference-only note names) is not to be used as part of the index file. The syntax for creating an index file in any text editor can be observed in the header at the top of this spreadsheet. Note that the file name "silence" actually calls an empty audio file, so that silence can be quickly triggered as a sound cue.
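Composers who would rather generate the index file programmatically than from the spreadsheet could use a small script along these lines (a hedged Python sketch; the cue list, field labels and output file name simply follow the example rows above and are not a definitive specification):

    # Writes TapeJockey-style cue lines, from the cue number through the
    # semicolon, one per line, following the example rows shown above.
    cues = [
        # (cue number, keyword, file name, then three numeric playback fields
        #  copied verbatim from the spreadsheet: loop/crossfade/tempo data)
        (1, "tape", "silence",        0, 100, 120),
        (2, "tape", "FuegRptLow.aif", 0, 100, 120),
        (9, "tape", "EnMiTodo.aif",   1, 100, 120),
    ]

    with open("index.txt", "w") as out:   # output file name is an example
        for number, kind, name, p1, p2, p3 in cues:
            out.write("%d , %s %s %d %d %d ;\n" % (number, kind, name, p1, p2, p3))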


(Spreadsheet header, for reference only; same columns as the previous spreadsheet: note name; Cue Number; comma; tape, fx or instr; file name / on/off / instr number; loop on/off / delay1 tempo (if applic.); xfade time / delay2 tempo; feedback scale; semicolon.)

C 24 , instr 0 ;

C# 25 , instr 1 ;

D 26 , instr 2 ;

D# 27 , instr 3 ;

E 28 , instr 4 ;

F 29 , instr 5 ;

F# 30 , fx 0 ;

G 31 , fx 0 ;

G# 32 , fx 0 ;

A 33 , fx 0 ;

A# 34 , fx 0 ;

B 35 , fx 0 ;

C 36 , fx 1 30 100 0.5 ;

C# 37 , fx 1 176 352 0.5 ;

D 38 , fx 1 500 350 0.2 ;

D# 39 , fx 0 ;

This is a spreadsheet of the system cues in TapeJockey. These cues stay the same for each song. I have reserved the top range of note numbers from [tab.pitchmap] and/or [tab.4bit++] for use with instrument switches and effects presets. These controls could also be included in an index file for a song. The column after the comma is used to assign and route the rest of the data in each row. The item "tape" sends data to [lang.dj.xfade2], "instr" calls the instrument number (0 = all mute, 1 = Lead, 2 = Rhythm Guitar, 3 = Techno, 4 = Rhythm Sampler, 5 = Shakey-Shakey), and "fx" sends data to [lang.pingpongdelay2~] and an associated volume control which acts as a gate. At the time of this writing, system cues 30-35 were not in use; ["fx" 0] is a placeholder which, if called by accident, would only close the gate associated with [lang.pingpongdelay2~].
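As a plain-language restatement of this routing, the sketch below (Python, illustrative only; the actual routing happens inside the TapeJockey Max/MSP patch and the destination strings are just labels) parses one cue line and dispatches it by the keyword that follows the comma:

    # Parses cue lines of the form "25 , instr 1 ;" and routes them by keyword.
    INSTRUMENTS = {0: "all mute", 1: "Lead", 2: "Rhythm Guitar",
                   3: "Techno", 4: "Rhythm Sampler", 5: "Shakey-Shakey"}

    def dispatch(line):
        fields = line.strip().rstrip(";").split()
        keyword, args = fields[2], fields[3:]
        if keyword == "tape":
            return ("to [lang.dj.xfade2]", args)          # file name plus playback data
        if keyword == "instr":
            return ("switch instrument", INSTRUMENTS[int(args[0])])
        if keyword == "fx":
            return ("to [lang.pingpongdelay2~]", args)    # on/off, delay times, feedback
        raise ValueError("unknown cue keyword: " + keyword)

    print(dispatch("25 , instr 1 ;"))          # ('switch instrument', 'Lead')
    print(dispatch("36 , fx 1 30 100 0.5 ;"))  # ('to [lang.pingpongdelay2~]', ['1', '30', '100', '0.5'])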


7.0 Future Work

The musical and technical development of the Airband is ongoing. On the musical front we are trying more structured pieces and attempting to move away from dependency on the laptop monitor for visual feedback in performance.

The future technical developments include possible wireless communication with the computer via Bluetooth® technology. A few NIME projects have successfully used this protocol to communicate with Max/MSP [Allen 2005, Weinberg 2002, Young 2004]. Also, at the time of this writing one of the key performers in the Airband, Laura M Sinnott, had just finished initial development of, and performed with, a wireless NIME device for use with Max/MSP. Her work was presented as part of the ITP NIME class concert at Tonic in December 2006. The setup for her performance was to set a laptop down on stage, plug in the audio and turn on the audio in her patch; she then started performing. When the Airband played this same show in April 2006, we had to plug in the MIDI devices, plug in the MIDI-to-USB interfaces, launch Max (wait), load the patch (wait some more while the audio files were buffered into RAM) and start the audio. This process took roughly two to four minutes. For a five-minute performance this was far too much setup time.

Developments for the Airguitar might include switching to an analog-to-digital circuit that is easily reprogrammable via a USB connection from an Apple computer [Arduino, 2006]. This would allow for more robust firmware development, such that a circuit could be switched from Bluetooth mode to MIDI mode. This may be necessary in the event that a performance space has too much interference for the Bluetooth connection to work consistently.

The physical device still needs a lot of work. At the time of this writing, the device is held together mostly with electrical and gaffer's tape. The batteries, accelerometers, FSR leads and even the strap are all fastened with lots of tape. This worked well for a fast prototype, but as time goes on the adhesive fails. I do not have specific plans to remedy this at this time, but it is something that must be done soon.

The software offers the most opportunities for fast improvements. During preparation for the presentation of this project, several minor improvements took place. Some of them were documented and replaced older versions of the software maps and instrument descriptions. Others took place almost in passing and were not documented; these included parameter adjustments and scaling to improve the sensitivity and range of some of the sensor maps. As they did not represent a change in the structure of a patch or how it is used, I did not take the time to describe them. As the Airband develops more pieces and performances, further improvements will likely take place.

Some improvements that have already been discussed by the performers and composers in the Airband include the development of new synthesis routines to use with the existing mapping routines. For example, [tab.nnSynthMap] is general enough that it could be used to control almost any monophonic sustaining synthesizer. [tab.loopSynth] could be replaced with an oscillator-based synthesizer, or with a sustaining physical model such as a bowed string model [Serafin, 2004]. The same is true for the [tab.pitchMap] and [tab.Pulser] mapping routines. They have already been used with two signal generators, a physical modeling synthesizer and a sample player. Several more synthesis methods are


already in the works [Crawford, 2006]. With a larger selection of synthesis methods using the same control mappings, the performers could have a larger variety of sounds at their fingertips without having to learn many new playing techniques.
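The idea that one mapping routine can drive many signal generators comes down to a small, stable control interface. The sketch below (Python, purely illustrative; the class and method names are hypothetical and do not correspond to actual patch objects) shows how any monophonic sustaining synthesizer exposing the same few controls could be swapped in behind a mapping such as [tab.nnSynthMap]:

    import math

    class MonoSynth:
        """Any generator exposing these controls can sit behind the same mapping."""
        def note_on(self, midi_note): ...
        def set_amplitude(self, amp): ...
        def note_off(self): ...

    class SineSynth(MonoSynth):
        """Toy oscillator-based example of the interface."""
        def __init__(self):
            self.freq, self.amp, self.playing = 0.0, 0.0, False
        def note_on(self, midi_note):
            self.freq = 440.0 * 2 ** ((midi_note - 69) / 12.0)
            self.playing = True
        def set_amplitude(self, amp):
            self.amp = max(0.0, min(1.0, amp))
        def note_off(self):
            self.playing = False
        def sample(self, t):
            return self.amp * math.sin(2 * math.pi * self.freq * t) if self.playing else 0.0

    s = SineSynth()
    s.note_on(60)
    s.set_amplitude(0.8)
    print(round(s.sample(0.001), 3))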

8.0 Conclusion

The Airband could potentially perform any piece of electronic music, which could bring life to many pieces that might otherwise exist only as recordings. There is still work to be done on the technical development of the hardware and software tools used by the Airband, but with just the existing devices and software, the tape cue playback system and a few instruments, a variety of pieces may be played. I hope this text encourages others to develop tools for the Airband, to write music for the Airband, or to write for ensembles like the Airband.


9.0 References

Alford, A., S. Northrup, K. Kawamura, K.-W. Chan, and J. Barile. A Music Playing Robot. Proceedings of the 1999 International Conference on Field and Service Robotics.

Allen, Jamie. Boom Box. Proceedings of the 2005 Conference on New Interfaces for Musical Expression.

Arduino. 2006. http://www.arduino.cc/

Cook, Perry. Principles for Designing Computer Music Controllers. Proceedings of the 2001 Conference on New Interfaces for Musical Expression.

Chadabe, Joel. Devices I Have Known and Loved. In Trends in Gestural Control of Music, M.M. Wanderley and M. Battier, eds. IRCAM – Centre Pompidou, 2000.

Chadabe, Joel. Electric Sound: The Past and Promise of Electronic Music. 1997.

Chadabe, Joel. Solo: A Specific Example of Realtime Performance in Computer Music – Report on an International Project. Canadian Commission for UNESCO, Ottawa, 1980.

Chadabe, Joel. The Limitations of Mapping as a Structural Descriptive in Electronic Instruments. Proceedings of the 2002 Conference on New Interfaces for Musical Expression.

Collins, Nick. "Generative Music and Laptop Performance." Contemporary Music Review 22/4, pp. 67-79, 2003.

Crawford, L. String Madness. Final paper for Digital Signal Theory, NYU, Fall 2006.

Cycling '74. Manual and help files downloaded from the company webpage, 2006. http://www.cycling74.com

Didkovsky, N., and L. Crawford. Java Music Specification Language and Max/MSP. Proceedings of the 2006 International Computer Music Conference.

Mann, Steve. Personal webpage, 2006. http://wearcam.org/septambi/

Microchip. 2006. http://www.microchip.com/

Moog Music. Manual downloaded from the company webpage, 2006. www.moogmusic.com/manuals/ewave_user.pdf

Raskin, Jef. The Humane Interface: New Directions for Designing Interactive Systems. Addison Wesley, 2000.

Rokeby, David. Personal webpage, 2006. http://homepage.mac.com/davidrokeby/softVNS.html

Rovan, J., M. Wanderley, S. Dubnov, and P. Depalle. Instrumental Gestural Mapping Strategies as Expressivity Determinants in Computer Music Performance. Proceedings of the Kansei – The Technology of Emotion Workshop, 1997.

Serafin, S. The Sound of Friction: Real-Time Models, Playability and Musical Applications. PhD thesis, Stanford University, Stanford, CA, 2004.

Smith, J.O. Physical Audio Signal Processing. 2006. http://ccrma.stanford.edu/~jos/pasp/

Sonami, Laetitia. Personal webpage, 2006. http://www.sonami.net/lady_glove2.htm

Sturman, D., and D. Zeltzer. A Survey of Glove-Based Input. IEEE Computer Graphics and Applications, 1994.

Waisvisz, Michel. Personal webpage, 2006. http://www.crackle.org/TheHands.htm

Wanderley, M., and M. Battier, eds. Trends in Gestural Control of Music. (Electronic edition.) Paris: IRCAM, 2000.

Weinberg, G., R. Aimi, and K. Jennings. The Beatbug Network: A Rhythmic System for Interdependent Group Collaboration. Proceedings of the 2002 Conference on New Interfaces for Musical Expression.

Wikipedia. 2006. http://en.wikipedia.org/wiki/Comb_filter Also: /Data_glove, /Karplus-Strong_string_synthesis, /Keyer, /Theremin, /Doctor_Who, /Air_guitar, /quasimode

Young, D., and I. Fujinaga. Aobachi: A New Interface for Japanese Drumming. Proceedings of the 2004 Conference on New Interfaces for Musical Expression.
