TRANSCRIPT
SENSORS & MOBILE MUSIC
Lalya Gaye
* Body-based
Human body as starting point for design: expressive qualities of human movements
Music controllers: Interfaces
The Hands, Waisvisz, STEIM, 1984
* User movement
- Choreographed body movement
- Traditional instrumental gesture
- Novel gestures
Music controllers: Interaction
Machover & Yo-Yo Ma, Hypercello, 1991
Dark around the Edges, Winkler, 1997
The Hands, Waisvisz, STEIM, 1984
* User movement
- Full-handed gesture
- Empty-handed gesture
Music controllers: Interaction
Lady's Glove, Bongers & Sonami, 1991
Unfoldings, Interactive Inst., 2003
Stranglophone, Sharon, ITP/NYU, 2003
* Environment-based
Interactive environments
- Reactive floors
- Digital realm: networked audio
Take advantage of the features of space
Interactive environments: many people together, control of interaction parameters…
Music controllers: Interfaces
Magic Carpet, MIT Medialab, 1996
Global String, Tanaka & Toeplitz, 1998
* Wearables
Musical jeans jacket (MIT Medialab, 1992)
TGarden (FoAM & sponge, ~2001)
Expressive Footwear (MIT, 1997-2000)
ensemble (Kristina Andersen, ~2003)
Intimate interfaces; body movement and posture
Theatrical vs. daily-life dimensions
Music controllers: Interfaces
* Object-based
Starting with existing instruments:
- augmented (hyperinstruments…)
- digitalised (ex: piano synth)
- interface used as controller (ex: MIDI keyboard)
Use the metaphor of the object
Music controllers: Interfaces
Machover & Ma, Hypercello, MIT, 1991
Taku Lippit, ITP/NYU, 2002-03
* Object-based
Repurposed everyday objects and materials: water, fabric, chemicals, vegetables…
Music controllers: Interfaces
Daniel Skoglund, 8Tunnel2
Particles, Horio Kanta, 2003
MIDI Scrapyard Challenge, Brucker-Cohen & Moriwaki, 2003-04
* Object-based
Take advantage of the material properties of objects, e.g. bendable, conducts electricity, etc.
Take into consideration the human activities surrounding the objects: build upon them and/or break from them
Music controllers: Interfaces
* Mechanical
GuitarBot (Eric Singer et al., LEMUR, 2003-)
* Electroacoustic
Spherical speakers (Curtis Bahn)
* Tactile output (haptics)
Cutaneous Grooves (E. Gunther, MIT Medialab, 2001)
Music controllers: Output
Sensors in Ubicomp technology
* Computing where needed, not the other way around.
Invisible in use, woven into the fabric of everyday life, embodied interaction. Connection to place and moment of use.
* Sensors:
- in everyday environments (e.g. context-awareness)
- on people (e.g. wearables)
- on artefacts (Media cup - TecO)
* Sensor fusion: combining different data and placements to gather context
- sensor networks
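To make the sensor-fusion idea above concrete, here is a toy Python sketch (illustrative only; the sensor names and thresholds are invented, not from any real system): readings from a light sensor in the environment and an accelerometer worn on the person are combined into a coarse context label.

```python
# Toy sensor-fusion sketch (illustrative; sensor names and thresholds are
# invented): combine data from differently placed sensors to guess context.

def fuse_context(light_lux: float, accel_g: float) -> str:
    """Fuse an ambient-light reading and a body-worn accelerometer reading."""
    location = "outdoors" if light_lux > 1000 else "indoors"
    activity = "moving" if accel_g > 1.2 else "still"
    return f"{location}, {activity}"

print(fuse_context(light_lux=50, accel_g=1.5))     # indoors, moving
print(fuse_context(light_lux=20000, accel_g=1.0))  # outdoors, still
```

Real sensor fusion is of course probabilistic and noisier than two thresholds, but the principle is the same: no single sensor gives the context, the combination does.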
Sensors in mobile music & locative audio
* Combining NIME- and Ubicomp-style uses of sensors
* Urban settings + everyday: rich environment, familiar, unpredictable, dynamic, heterogeneous
* Sensors on environments, users, objects
* Interaction between:
- user and objects
- user and environment
- user and user(s)
+ combinations and networks
Possible uses, interactions, issues and implications of implementations?
* Space annotation:
sensing proximity / location
Hear&There (Rozier, MIT Medialab, 1999)
Tejp / Audio tags (PLAY & FAL, 2003-04)
Mobile music and locative audio: Locative audio in public space
* Radio pirates: sensing environmental factors
Bit Radio (Bureau of Inverse Technology)
Mobile music and locative audio: Locative audio in public space
* Mobile music sharing: sensing others
SoundPryer (Mattias Östergren, Interactive Institute, 2001)
TunA (Arianna Bassoli et al.,
Medialab Europe, 2002)
Push!Music (Håkansson et al., Viktoria Institute, 2005)
Mobile music and locative audio: Mobile music
* Mobile music making
Music making away from computer screen or performance setting: in the everyday
Sensor technology + GPS -> situated music making
Ad hoc & distributed networks throughout the city -> collaborative music making
etc
Mobile music and locative audio: Mobile music
* Mobile music making: sensing user-environment interaction
Sonic City (Gaye et al., FAL & PLAY, 2002-04)
Sound Lens (Toshio Iwai, 200?)
Mobile music and locative audio: Mobile music
* Mobile music making: device as interface between user and space
Sound Mapping (Iain Mott et al., Reverberant, 1998)
Mobile music and locative audio: Mobile music
* Mobile music making: sensing user-user + user-device interaction
CosTune (Nishimoto, ATR, 2001)
Sound Lens (Toshio Iwai, 200?)
Malleable Mobile Music (Atau Tanaka, Sony CSL, 2004)
Mobile music and locative audio: Mobile music
* Sound-art installations
Electric Walks (Christina Kubisch)
Drift (Teri Rueb)
* Walking through digital space
Seven Mile Boots (Beloff et al., 2003-04)
Mobile music and locative audio: Sound walks: mapping the audio world to physical paths
Personal Instrument (Krzysztof Wodiczko, 1969)
Mobile and locative sound: Wearable audio
Headphones vs. boombox vs. using everyday objects
Soundbug™ speakers & piezos
Flower Speakers (LET’S corporation, Japan, 2004)
Mobile and locative sound: Output
Wearables
Nomadic Radio (Nitin Sawhney, MIT Medialab, 1998)
Sonic Fabric (Alyce Santoro, 2002)
Mobile and locative sound: Output
Demo: DIY music controller
* System set-up
Tracking & other sensors
Micro-controllers
MIDI protocol
Interactive software
Sensors -> micro-controller -> MIDI -> PC w/ music software -> speakers
DIY music controller
* Components
- sensors: potentiometer + switch / light + proximity sensors
- micro-controller: BasicX-24
- protocol: MIDI
- software: Pd
Tracking & other sensors
* Contact-based tracking
Isometric
• Pressure, switches, etc.
Movement sensing
• Rotation: pots, goniometers, joysticks
• Linear movement: sliders, tension sensors, pads, tablets
• Bending
Ref: “Human Movement Tracking Technology”, Mulder, A. Technical Report, NSERC Hand Centered Studies of Human Movement project. Burnaby, B.C., Canada: Simon Fraser University.
Tracking & other sensors
* Contact-based tracking
Inside-in
• Emitter + receiver on subject, body-centred
• Workspace in principle unlimited
• ex: flex sensors, biometric sensors…
Inside-out
• Sensor on subject + external emitter
• Workspace limited if source artificial, unlimited if source natural
• ex: accelerometers, gyroscope, compass…
Tracking & other sensors
* Contactless tracking
Outside-in
• External sensor + emitter on subject
• Least obtrusive
• Workspace limited
• ex: video tracking + markers
Indirect acquisition
• Deduction from audio output
• Latency
Tracking & other sensors
* Other sensors
Objects
• More or less the same as human tracking sensors
Environment
• Light, sound, temperature, humidity, electricity, magnetism…
Digital information
• ex: activity on the internet
* Micro-controllers
• Collecting sensor data and sending them to processor (e.g. PC) as serial data (e.g. MIDI signal)
• Can also be used to trigger actuators (e.g. an LED)
Common micro-controllers
• BasicX-24
• Basic Stamp II
• PIC
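As an illustration of this data path (a Python sketch on the PC side; the byte values are made up for the example): the micro-controller streams sensor readings as a flat sequence of serial bytes, which the receiving software groups back into 3-byte MIDI-style messages.

```python
# Illustrative sketch (not from the slides): group the flat serial byte
# stream coming from a micro-controller into 3-byte MIDI-style messages.

def split_messages(stream: bytes) -> list:
    """Group a serial byte stream into (status, data1, data2) triples,
    dropping any trailing incomplete message."""
    return [tuple(stream[i:i + 3]) for i in range(0, len(stream) - 2, 3)]

raw = bytes([144, 64, 1, 144, 1, 2])  # two example messages, two sensors
print(split_messages(raw))  # [(144, 64, 1), (144, 1, 2)]
```

In practice a receiver also has to resynchronise on the status byte (the only byte with its high bit set), but fixed 3-byte framing is enough to show the idea.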
* MIDI protocol
• MIDI=Musical Instrument Digital Interface
• Standardised serial communications protocol between synthesizers and other digital music devices
• Controllers / receivers
• MIDI command = status byte + 2 data bytes
– action (note on, note off, pitch bend, control change)
– pitch
– velocity (how loud)
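For example, the 3-byte layout can be sketched in Python (an illustration, not code from the course): status byte 144 (0x90) is note-on on the first channel, 128 (0x80) is note-off, and the two data bytes carry pitch and velocity, each in the range 0-127.

```python
# Illustrative sketch of the MIDI message layout described above:
# status byte = command nibble | channel nibble, then two 7-bit data bytes.

def note_on(channel: int, pitch: int, velocity: int) -> bytes:
    """Note-on: status 0x90 | channel, then pitch and velocity."""
    assert 0 <= channel <= 15 and 0 <= pitch <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, pitch, velocity])

def note_off(channel: int, pitch: int, velocity: int) -> bytes:
    """Note-off: status 0x80 | channel, then pitch and velocity."""
    return bytes([0x80 | channel, pitch, velocity])

# First channel, middle C (60), velocity 100:
print(list(note_on(0, 60, 100)))  # [144, 60, 100]
```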
* Interactive music software
Common software
• Max/MSP
• Pd (Pure Data)…
• Using MIDI signals as control data…
* Reading sensor values with BX-24
• connect the sensors to the ADC pins
• power them from the BX's 5V DC output
(NB: the BX itself is powered with 9V)
• add the "SerialPort" module for communicating with the serial port
• write a routine that reads the voltage on the pins
• download the program to the EEPROM
Option Explicit

Dim voltIn As Byte
Dim switch As Byte

Public Sub Main()
  voltIn = 1
  switch = 1
  Do
    'potentiometer on ADC pin 16 (getADC returns 0-1023; values above 255 overflow a Byte)
    voltIn = cByte(getADC(16))
    'switch on digital pin 17
    switch = GetPin(17)
    Debug.Print "voltIn: "; cStr(voltIn)
    Debug.Print "switch: "; cStr(switch)
    Call Sleep(0.05)
  Loop
End Sub
* Sending values as MIDI signal
- convert the data into the MIDI scale (0-127)
- create buffers
- adapt the baud rate to MIDI speed (31250 baud)
- write a subroutine loop for sending MIDI
- MIDI commands 144 (note on) + 128 (note off)
- or: note on only, with "velocity" used as an ID and "pitch" used as the sensor value
- download to the EEPROM
- send out the serial data via a MIDI adapter circuit and a MIDI-USB adapter
Ref: Physical Computing, Tom Igoe. http://www.tigoe.net/pcomp/code/archives/bx-24/000249.shtml
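The first step in the list above, converting a 10-bit ADC reading (0-1023) into the 7-bit MIDI range (0-127), can be sketched in Python as:

```python
# Sketch of the "convert data into MIDI scale (0-127)" step: a 10-bit
# ADC reading is rescaled to fit in a 7-bit MIDI data byte.

def adc_to_midi(adc_value: int) -> int:
    """Map a 0-1023 ADC reading onto the MIDI data-byte range 0-127."""
    return adc_value * 127 // 1023

print(adc_to_midi(0))     # 0
print(adc_to_midi(1023))  # 127
print(adc_to_midi(512))   # 63
```

The BasicX code below does the same thing in floating point (`cSng(getADC(16)) * 127.0 / 1023.0`).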
Option Explicit

Dim InputBuffer(1 To 12) As Byte
Dim OutputBuffer(1 To 10) As Byte
Dim midiCmd As Byte
Dim vel As Byte
Dim midiTaskVar(1 To 50) As Byte
Dim voltIn As Byte
Dim switch As Byte

Public Sub Main()
  voltIn = 1
  switch = 1
  Call openQueue(InputBuffer, 12)
  Call openQueue(OutputBuffer, 10)
  Call OpenCom(1, 9600, InputBuffer, OutputBuffer)
  Register.ubrr = 14   'override the serial rate to MIDI speed (31250 baud)
  midiCmd = 144        'MIDI note-on status byte, channel 1
  CallTask "midiTask", midiTaskVar
  Do
    'potentiometer: rescale the 10-bit ADC reading (0-1023) to MIDI range (0-127)
    voltIn = cByte(cSng(getADC(16)) * 127.0 / 1023.0)
    'switch
    switch = GetPin(17)
    Call Sleep(0.05)
  Loop
End Sub
Sub midiTask()
  Do
    'potentiometer value: "velocity" byte 1 is used as the sensor ID
    vel = 1
    Call putQueue(OutputBuffer, midiCmd, 1)
    Call putQueue(OutputBuffer, voltIn, 1)
    Call putQueue(OutputBuffer, vel, 1)
    'switch value: "velocity" byte 2 is used as the sensor ID
    vel = 2
    Call putQueue(OutputBuffer, midiCmd, 1)
    Call putQueue(OutputBuffer, switch, 1)
    Call putQueue(OutputBuffer, vel, 1)
    Call Sleep(0.05)
  Loop
End Sub
* Sending values as MIDI signal
Ref: Physical Computing, Tom Igoe. http://www.tigoe.net/pcomp/midi.shtml
* Receiving MIDI data in Pd
• C:…/pd/bin
• pd -midiindev 1
• route data according to ID ("vel")
• use "pitch" as control values
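For illustration, the routing scheme above (the "velocity" byte carrying a sensor ID, the "pitch" byte carrying the value) looks like this in Python; in Pd the same dispatch is done graphically, typically with objects such as [notein] and [route].

```python
# Illustration of the routing scheme: each incoming note-on carries a
# sensor ID in the "velocity" byte and the sensor value in the "pitch"
# byte; messages are dispatched into per-sensor control slots by ID.

def route_message(pitch: int, velocity: int, controls: dict) -> None:
    """Store the sensor value (pitch byte) under its ID (velocity byte)."""
    controls[velocity] = pitch

controls = {}
route_message(pitch=80, velocity=1, controls=controls)  # potentiometer, ID 1
route_message(pitch=1, velocity=2, controls=controls)   # switch, ID 2
print(controls)  # {1: 80, 2: 1}
```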
Discussion
* Mobile music application using sensors: possible uses, interactions, issues and implications of implementations?
* Props: sensor platform, soundbug, tell me
* Focus:
- sensor positioning
- physical interaction and relation between sound, body and place
- combining data
Links: DIY links
• BX-24: http://www.basicx.com
• Pd: http://www.crca.ucsd.edu/~msp/software.html
• More micro-controllers etc.: ITP Physical Computing
http://tigoe.net/pcomp/index.shtml
• Book: Physical Computing, Dan O'Sullivan & Tom Igoe
• On iPaq: Linux + PDa (by Günter Geiger):
http://gige.xdv.org/pda/
Links: Sensors & Mobile Music
New Interfaces for Musical Expression: http://www.nime.org
Mobile Music & Locative Audio:
http://www.netzwissenschaft.de/mob.htm
http://www.viktoria.se/~lalya/tamabi05/
Ubiquitous Computing: http://www.ubicomp.org/