
A Platform for Dance Performances with Multiple Quadrocopters

Angela Schöllig, Federico Augugliaro, and Raffaello D'Andrea

Abstract— This paper presents a platform for rhythmic flight with multiple quadrocopters. We envision an expressive multimedia dance performance that is automatically composed and controlled, given a random piece of music. Results in this paper prove the feasibility of audio-motion synchronization when precisely timing the side-to-side motion of a quadrocopter to the beat of the music. An illustration of the indoor flight space and the vehicles shows the characteristics and capabilities of the experimental setup. Prospective features of the platform are outlined and key challenges are emphasized. The paper concludes with a proof-of-concept demonstration showing three vehicles synchronizing their side-to-side motion to the music beat. Moreover, a dance performance to a remix of the sound track 'Pirates of the Caribbean' gives a first impression of the novel musical experience. Future steps include an appropriate multiscale music analysis and the development of algorithms for the automated generation of choreography based on a database of motion primitives.

I. INTRODUCTION

The interface of music and robotics has become a prominent area of research, attracting the attention of not only roboticists, but also musicians, artists, and the general public. When robot technology is brought together with musical expression and rhythmic performance, it opens up the space for a novel human-robot interplay and a unique musical experience that cannot be achieved by traditional means.

During the past decade, research on musical robots has, for the most part, been driven by two distinct communities.

The first – consisting mostly of musicians, composers and music technologists – has generally sought to develop innovative forms of musical expression and sound production that overcome the limitations of human music generation and traditional musical instruments. Early robotic developments from this first community include the automation and mechanization of instruments, such as the piano, percussion and woodwinds (see [1], [2] and, for a historical overview, [3]); more recently, this group has been behind the development of perceptual music robots [4], some of which facilitate music collaboration with human musicians [5]–[7].

The second community, working out of the fields of robotics and engineering, has sought to use music to establish a new dimension of human-robot communication, interaction and collaboration. Research from this second community has been largely focused on the development of humanoid robots capable of imitating human musical behavior: robots that perform human-inspired rhythmic motions, such as dancing

The authors are with the Institute for Dynamic Systems and Control (IDSC), ETH Zurich, Sonneggstr. 3, 8092 Zürich, Switzerland, [email protected], [email protected], [email protected].

Associated video at www.tinyurl.com/DancePlatform.

[8]–[13], drumming [14], stepping and singing [15]–[17] along to music. Common to both communities is the desire to create an audio-visual performance with both aesthetic and entertainment value; this merging of musical expression, robotic technology and entertainment has also been a fascinating playground for artists [18], [19].

The goal of this work is to create a novel visual musical experience: Multiple quadrocopters fly in a rhythmic performance, expressing the character of the music as they move in a coordinated and precisely-timed fashion through the three-dimensional space. Prior to flight performance, a random piece of music is analyzed and its main features are extracted. These features are later used to guide the quadrocopters' choreography. This approach differs from most other research on musical robots, which typically aims for a real-time interaction with the environment and which studies the problem of real-time music analysis, see e.g. [5], [17], [20]–[23]. In contrast, the focus of this work is the design, control, and synchronization of the rhythmic quadrocopter motion. Major challenges include:

Motion Design – Given the body and dynamics of a quadrocopter, translating music into suitable motion patterns is a significant artistic challenge. Unlike humanoid dance robots, which may imitate any human dance movement and produce an easily-recognized expressive gesture [8]–[13], quadrocopters do not move the way humans do. Choreography of their movement is new, and requires creativity, aesthetic judgment and a deep understanding of the system's dynamics and its limits.

Fig. 1. The Flying Machine Arena, an indoor flight space for multiple quadrocopters.


Motion Control – A quadrocopter's nonlinear and unstable dynamics, cf. [24], demand sophisticated controls in order to achieve precise trajectories and exact audio-motion synchronization. Modeling errors and other nonidealities, such as motor saturation and communication delays, have a noticeable effect on the quadrocopter performance. Note that a stabilizing controller is required just to keep the vehicle in the air. Moreover, additional underlying controllers impose a periodic oscillation corresponding to the music frequency, providing the basis for an overall rhythmic behavior.

Motion Synchronization – To make the quadrocopters 'dance to music', it is essential to time the vehicle movement precisely with the music beat. In the proposed setup, however, the quadrocopters' dynamics are highly non-linear and not fully identified. Furthermore, the system's time delays are variable. An appropriate synchronization algorithm is therefore indispensable. Note that in most other approaches to musical robots, simple adjustments (like a constant time shift of the reference signal) are sufficient for synchronization, since they deal with systems that are better understood, less sensitive to disturbances, and, generally, easier to manage, see [9], [10], [15], [22], [25]. Inspiring work that explicitly considers synchronization is presented in [14], [16], [23], [26].

Motion Composition – Given an arbitrary piece of music, music analysis extracts the associated sequence of music features with the corresponding timing information. The goal is to automatically generate a dance performance with multiple quadrocopters, where the quadrocopters' motion not only reflects the character of the music but also takes into account the physical limitations of the space and the vehicle. We plan to develop an automated tool that will compose a suitable multi-vehicle choreography based on a selection of motion primitives (see Motion Design). Previous work on automatic music-driven dance synthesis focused exclusively on human dance motions; music features were analyzed and mapped manually to real human dance movements, cf. [27]–[29]. Algorithms were visualized and tested solely in virtual environments.

The core components mentioned above build upon a prior music analysis that extracts important music features, such as beat, dynamic range, pitch, melody, etc., together with their timing information. The success of this project depends heavily on the quality of the multiscale music analysis. Various algorithms and software solutions (see [30], [31]) are available, especially for tempo and beat analysis, among others [32]–[36]. Moreover, every year researchers present the latest music analysis solutions at the Music Information Retrieval Evaluation eXchange (MIREX). In collaboration with those music experts, we would like to ask the following questions: Which music features are critical to the development of a musical quadrocopter performance? And which algorithms fit the proposed application?

This project aims to combine musical expertise, emotion, and aesthetic judgment with agile aerobatic maneuvers and sophisticated control techniques. In this interdisciplinary context, collaboration with a variety of experts is essential.

Fig. 2. The quadrocopter, measuring 53 cm in diameter.

The experimental setup and the characteristics of the vehicles are described in Section II, and Section III explains the synchronization problem encountered when working within the proposed environment. The objectives and prospective features of the proposed platform are presented in Section IV, together with a proof-of-concept demonstration in Section V. The associated video, available also at http://tinyurl.com/DancePlatform, provides a first impression of the intended quadrocopter flight dance.

II. EXPERIMENTAL SETUP

It was the agility of the quadrocopters, the dimension of the flight space, and the reliability of the communication and control infrastructure that laid the foundation for a musical dance performance with multiple vehicles. In the following, we sketch the experimental setup in order to convey an impression of the project's starting conditions.

A. The Flying Vehicle

Figure 2 shows the current flying vehicle. The quadrocopter is characterized by its small size, light weight, and structural and electronic robustness, making it a versatile and easy-to-operate experimental platform. The baseline platform is an X3D 'Hummingbird' quadrocopter designed and manufactured by Ascending Technologies GmbH [37]. Measuring 53 cm in diameter, the vehicle's overall weight including the onboard battery is approximately 460 g. The operational flying time varies between 10 and 20 minutes depending on the aggressiveness of the performed flight maneuvers. The vehicle is equipped with four brushless DC motors, which together are able to produce a vertical acceleration of around 12.5 m/s².
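As a quick plausibility check of these figures (our own arithmetic, not from the original text), the quoted mass and achievable vertical acceleration imply a thrust-to-weight ratio well above two:

```python
# Back-of-the-envelope check; mass and vertical acceleration are the
# values quoted in the text above, G is standard gravity.
G = 9.81        # m/s^2
mass = 0.460    # kg, including the onboard battery
a_vert = 12.5   # m/s^2, achievable vertical acceleration

# Accelerating upward at a_vert requires thrust beyond what hovering needs:
thrust = mass * (G + a_vert)   # total thrust, N
twr = (G + a_vert) / G         # thrust-to-weight ratio

print(f"thrust ~ {thrust:.1f} N, thrust-to-weight ~ {twr:.2f}")
```

Such a ratio is consistent with the aggressive maneuvers described later in the paper.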

Though the original propulsion system, the motor controllers and the frame of the standard X3D quadrocopter were preserved, cf. [38], the central electronics and onboard sensors were replaced to obtain better control over the onboard algorithms and to have access to better-quality and higher-range rate gyro data. These changes allow for more aggressive maneuvers, faster turn rates, and generally better flight performance. With the new rate gyros, rotations of up to 2000 deg/s are possible around the body's principal axes of inertia. Detailed documentation of the changes is found in [39].


The quadrocopter accepts three body angle rate commands and a collective thrust command at 50 Hz. An onboard 800 Hz feedback controller uses rate gyros to track the given commands. The exact onboard controller design is presented in [39]. Propeller wear trim factors allow for precise balancing of the quadrocopter. Each vehicle is equipped with two radio systems: a one-way 2.4 GHz module used exclusively for controlling the vehicle (to constrain the amount of variable latency in the system), and a bidirectional 2.4 GHz transceiver with a different modulation for non-time-critical communication such as data feedback or onboard parameter reads/writes.

Each quadrocopter is also equipped with three retro-reflective ball-shaped markers, which enable unambiguous vehicle identification by the Vicon motion capture system (described in Sections II-B and II-C) during flight.

Currently, six vehicles are ready to fly.

B. The Space

A 10 × 10 × 10 m cube of indoor space, called the ETH Flying Machine Arena (FMA), is reserved for testing and validating autonomous quadrocopter flight, see Fig. 1. For safe operation, the space is equipped with protective nets delimiting the space on three sides. A glass front on the fourth side allows visitors an undisturbed view of the flying vehicles. To reduce the occurrence of catastrophic crashes, 12 cm thick foam mattresses cover the ground. An 8-camera Vicon motion capture system [40] provides position and attitude data for all appropriately marked vehicles in the arena at 200 Hz with millimeter accuracy, and a latency of about 10 ms.

C. Control and Communication

The overall organization of the system is similar to [41]. The localization data provided by the Vicon system is sent to off-the-shelf PCs, which then execute estimation and control algorithms. These in turn deliver commands to the quadrocopters with an approximate latency of 20 ms. The overall system time delay, from sending a vehicle command to detecting the corresponding effects in the vehicle's Vicon pose data, varies between 10 ms and 40 ms with a mean value of 35 ms.

Data is sent via a multicast UDP scheme, allowing for convenient online visualization of all data sent over the network, and also for recording, playback, and export of arbitrary customized data series. A convenient side-effect of this setup is that estimation and control components are binary-identical when running in the real system or in simulation. The wireless and Vicon data bridges are simply replaced by a simulator process, with all of the other components remaining completely unaffected and unaware of any difference.

During normal operation the vehicle's translational degrees of freedom are controlled by linear PID controllers designed for near-hover operation. Yaw is held at a constant angle via a proportional controller. To achieve trajectory tracking, a sequence of reference points is fed to the controller together with appropriate feedforward commands. More details about this test environment may be found in [42] and [39].

Fig. 3. The desired side-to-side motion, a schematic of the two-dimensional quadrocopter model.

III. THE SYNCHRONIZATION PROBLEM

Autonomous quadrocopters are fundamentally different in their dynamic behavior from other existing musical robots. To prove the feasibility of the project, we must therefore ask: Is it possible to precisely time a rhythmic quadrocopter movement with a music beat? In the following, a simple motion primitive is selected which highlights the quadrocopter's characteristic properties and for which a solution to the synchronization problem is sketched. The details of the synchronization algorithm are found in [43].

A. A Side-To-Side Motion

A planar side-to-side motion as depicted in Fig. 3 is considered, where at beat times the vehicle reaches the outermost points of the trajectory. Given the music frequency f_d (obtained from an a priori music analysis), a corresponding desired sinusoidal vehicle trajectory can be defined in the xz-plane,

x_d(t) = A_d \cos(\omega_d t),
z_d(t) = z_d = \text{constant},   (1)

with \omega_d = 2\pi f_d assumed to be constant. Fig. 4 illustrates the beat-motion relation. The altitude of the quadrocopter is stabilized at a given height z_d.
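To make the beat-motion relation concrete, the reference (1) can be sampled directly from a list of beat times. The following sketch is our own illustration, not the authors' code; it assumes evenly spaced beats and maps one beat interval to half an oscillation period, so that consecutive beats land on alternating outermost points of the swing:

```python
import math

def reference(beat_times, amplitude, t):
    """Sample the desired side-to-side position x_d(t) of Eq. (1).

    Assumes evenly spaced beats and maps one beat interval to half an
    oscillation period, so the vehicle sits at an outermost point
    (+A or -A) exactly on every beat."""
    periods = [b - a for a, b in zip(beat_times, beat_times[1:])]
    mean_period = sum(periods) / len(periods)
    f_d = 0.5 / mean_period          # half an oscillation per beat
    w_d = 2.0 * math.pi * f_d
    return amplitude * math.cos(w_d * (t - beat_times[0]))

beats = [0.0, 0.5, 1.0, 1.5]         # a 120 bpm beat grid
print(reference(beats, 0.5, 0.0))    # outermost point on the first beat
print(reference(beats, 0.5, 0.5))    # opposite extreme on the next beat
```

Whether a beat maps to a half or a full period is a design choice; the half-period mapping matches the alternating extremes shown in Fig. 4.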

B. The Quadrocopter Dynamics

The side-to-side motion (1) is defined in the xz-plane. In-plane and out-of-plane dynamics are thus decoupled. Additional degrees of freedom are separately stabilized and do not affect the rhythmic motion. The equations governing the dynamics of the system are then given by

\ddot{z}(t) = f(t) \cos\theta(t) - g,   (2)
\ddot{x}(t) = f(t) \sin\theta(t),   (3)
\dot{\theta}(t) = u(t),   (4)

where g is the gravitational constant and \theta(t) is the pitch angle, see Fig. 3. The inputs to the system are the normalized thrust f(t) in m/s² and the pitch rate u(t) in rad/s.

Fig. 4. Reference trajectory in lateral direction (dashed line) with corresponding quadrocopter motion (solid line). Vertical lines represent beat times.
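The planar model (2)-(4) is simple enough to simulate directly. The sketch below is our own illustration under simplifying assumptions (explicit Euler integration, no delays or propeller dynamics), not the flight code; it checks the hover equilibrium, where θ = 0 and f = g leave the state unchanged:

```python
import math

G = 9.81  # gravitational constant, m/s^2

def step(state, f, u, dt):
    """One explicit-Euler step of the planar model (2)-(4).

    state = (x, z, vx, vz, theta); f is the normalized thrust in m/s^2,
    u is the pitch rate in rad/s. An illustration only; the real vehicle
    and its controllers are more involved (cf. [24], [39])."""
    x, z, vx, vz, th = state
    ax = f * math.sin(th)        # Eq. (3)
    az = f * math.cos(th) - G    # Eq. (2)
    return (x + vx * dt, z + vz * dt,
            vx + ax * dt, vz + az * dt,
            th + u * dt)         # Eq. (4)

# Hover check: with theta = 0 and f = g, the state does not change.
s = (0.0, 1.0, 0.0, 0.0, 0.0)
for _ in range(100):
    s = step(s, G, 0.0, 0.01)
print(s)   # stays at (0.0, 1.0, 0.0, 0.0, 0.0)
```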

C. Control and Synchronization

Underlying controllers turn the unstable quadrocopter dynamics into an oscillating system. A phase comparator detects the phase error between the desired reference trajectory and the actual quadrocopter motion. Once the phase error is determined, it is compensated for by closing the loop, similar to [16] and [26].

The oscillating quadrocopter motion is achieved by using a cascaded controller: the z-direction is stabilized first and, assuming a constant height, the trajectory-tracking controller for the x-direction is then designed. With

f(t) = \frac{1}{\cos\theta(t)} \left( g - 2\delta_z \omega_z \dot{z}(t) - \omega_z^2 (z(t) - z_d) \right),   (5)

and an appropriate choice of the parameters \delta_z and \omega_z, the dynamics in the z-direction are stabilized, cf. [43]. Building upon the above control scheme (5) and assuming a resulting constant height z_d, the thrust f(t) is

f(t) = \frac{g}{\cos\theta(t)}.   (6)

With (6), to first order, the x-dynamics (3) reduce to

\ddot{x}(t) = g\,\theta(t) \;\Leftrightarrow\; \dddot{x}(t) = g\,u(t),   (7)

assuming small \theta(t). With the aim of following the desired sinusoidal side-to-side motion (1), the input

u(t) = \frac{1}{g} \left( u_1(t) + u_2(t) \right)   (8)

is composed of a feedforward component,

u_1(t) = \dddot{x}_d(t) = A_d \omega_d^3 \sin(\omega_d t),   (9)

and an additional feedback term to correct for errors,

u_2(t) = \alpha \left( \ddot{x}_d(t) - \ddot{x}(t) \right) + \beta \left( \dot{x}_d(t) - \dot{x}(t) \right) + \gamma \left( x_d(t) - x(t) \right),   (10)

with the tuning parameters \alpha, \beta, and \gamma. When applying the input u(t) as defined in (8), a phase shift is observed between the reference trajectory of the sideways motion and the actual quadrocopter trajectory, illustrated in Fig. 4. (Corresponding experimental results are shown in Fig. 5.) This phenomenon results mainly from unmodeled dynamics which were neglected in the controller design, such as communication delays and propeller dynamics. This problem is resolved in two stages. First, the phase shift \varphi_t between the quadrocopter trajectory x(t) and the desired motion (1) is determined by an integration over a full period T_d = 1/f_d,

\eta_1(t) = \frac{1}{T_d} \int_{t-T_d}^{t} x(t) \cos(\omega_d t) \, dt = \frac{A}{2} \cos\varphi_t,   (11)
\eta_2(t) = \frac{1}{T_d} \int_{t-T_d}^{t} x(t) \sin(\omega_d t) \, dt = -\frac{A}{2} \sin\varphi_t,   (12)

where

\varphi_t = -\arctan \left( \frac{\eta_2(t)}{\eta_1(t)} \right).   (13)

In stage two, the phase error \varphi_t is corrected by a feedback technique borrowed from PLL design [44], shifting the reference signal x_d(t) in (1) by a correction term e(t),

x_d^s(t) = A_d \cos\left( \omega_d t + e(t) \right), \quad \text{with} \quad e(t) = -k \int_0^t \varphi_t \, dt.   (14)

Similarly, the derivatives of x_d(t) in (8) are shifted in phase by e(t). With the feedback integrator term e(t), precise and robust phase locking is achieved. Convergence behavior is controlled by tuning the gain factor k.
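A discrete-time version of the phase detector (11)-(13) is easy to sketch. The code below is our own illustration, not the authors' implementation: it approximates the integrals by sums over one sampled period and uses atan2 instead of arctan so that all four quadrants are handled. Applied to a response lagging the reference by 0.3 rad, it recovers the phase error:

```python
import math

def phase_error(samples, dt, f_d):
    """Phase shift of a sampled sinusoid, following Eqs. (11)-(13).

    samples must cover exactly one period T_d = 1/f_d; the integrals
    are approximated by Riemann sums."""
    w_d = 2.0 * math.pi * f_d
    T_d = 1.0 / f_d
    eta1 = sum(x * math.cos(w_d * k * dt) for k, x in enumerate(samples)) * dt / T_d
    eta2 = sum(x * math.sin(w_d * k * dt) for k, x in enumerate(samples)) * dt / T_d
    return -math.atan2(eta2, eta1)  # Eq. (13), quadrant-aware

# A response lagging the reference by 0.3 rad is detected as phi ~ -0.3:
f_d, dt = 1.0, 0.001
n = int(1.0 / (f_d * dt))
lagged = [0.5 * math.cos(2.0 * math.pi * f_d * k * dt - 0.3) for k in range(n)]
print(phase_error(lagged, dt, f_d))  # ~ -0.3
```

Feeding this estimate into the integrator e(t) of (14) then shifts the reference until the detected phase error vanishes.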

Experimental results support the theoretical idea, see Figs. 5 and 6. Without phase correction a constant, non-zero phase error remains, while with the proposed phase correction method the phase error converges to zero and perfect synchronization is achieved. Note that even when the phase error between the reference trajectory and the actual quadrocopter response is hardly noticeable in Fig. 5, small phase errors remain visible and audible to humans in actual experiments.

Fig. 5. Quadrocopter response to the dashed-lined input signal for the case of no phase correction (top) and phase error compensation (bottom). The solid line represents the vehicle trajectory.

Fig. 6. Phase error of the vehicle trajectories shown in Fig. 5 (detected by the phase comparator).


IV. A DANCE PERFORMANCE

In the previous section, the fundamental basis of the work was presented: a quadrocopter motion that is synchronized to music. This is the first successful step towards choreographing a flying music show. Below, the overall concept and vision are outlined together with the corresponding steps necessary to realize this idea.

A. Vision

We envision the Flying Machine Arena as a stage for a dance performance featuring multiple quadrocopters, which move in rhythm to music and perform flips, eights, circles and other aggressive maneuvers. Our goal is to be able to quickly process any arbitrary piece of music and translate it into choreography that reflects the music's character.

To this end, the development of the musical platform can be divided into four main components: music analysis, motion design, vehicle control, and, finally, motion choreography. The different steps are illustrated in Fig. 7 and discussed next.

B. Music Analysis

Humans often tap their feet to the beat of music as a reflection of their rhythmic perception. Music comprehension is not based solely on rhythm, however. Humans are able to perceive a sophisticated ensemble of musical features – including melody, measure, pitch, dynamic range and recurring themes – which help them to associate emotions with what they are listening to, and to react with a corresponding range of emotive movements we refer to as 'dance'.

Thus far, robotic perception of music has been largely limited to beat tracking. Beat tracking algorithms were developed as early as 1994 [45], and automated beat tracking is still an active area of research. Recent work includes [30], [31], [34]–[36]. Given the current state of the art, can we expect robots to perceive and respond to music in a manner sophisticated enough to be called 'dance'? Which music features are responsible for human musical perception? Is it possible to extract these features from the music and correlate them with emotional expression?

A multiscale music analysis is fundamental to the creation of a meaningful choreography. The music is digitized and divided into different sections that are categorized according to rhythm, melody and character. The result is a vocabulary that describes the temporal development of the music, which is in turn coupled with a variety of quadrocopter movements.

The proposed application aims to automate and accelerate this process. For now, however, a semi-assisted system is a reasonable expectation: Software processes a piece of music and suggests a categorization to a human user, who oversees and corrects the results. This information is then forwarded to the choreography component, which will create a suitable composition of motions for the quadrocopters, see Section IV-E.

C. Motion Design

As already mentioned in Section I, defining suitable rhythmic motion primitives for quadrocopters is a major challenge. When developing a dancing performance, two different types of motion are distinguished:

Synchronized Motion – Movements that aim to follow the rhythm of the music must be precisely synchronized to the beat (or a multiple of it). Normally, these are swing motions consisting of sinusoids, which enable the ready measurement and correction of phase error, and thus perfect synchronization. The side-to-side motion described in Section III belongs to this type of motion primitive.

Triggered Motion – Movements that are not strictly linked to the rhythm of the music, but start at a beat time, are executed during a given time interval, and end again with a music beat. These motion primitives are used to transition between two different synchronized motions, or to reflect a particular section of the music. Aggressive trajectories, like flips, eights, and circles, belong to this group of motion primitives.

Numerous motions can be created and cataloged according to the above categories. In this way, a library of motion primitives is built, serving as an important resource for the creation of new choreographies.
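One possible layout for such a library is sketched below. This is our own illustration; all names, fields, and entries are hypothetical and not part of the described system:

```python
from dataclasses import dataclass
from enum import Enum

class Kind(Enum):
    SYNCHRONIZED = "synchronized"  # swing motions locked to the beat
    TRIGGERED = "triggered"        # start and end on a beat, free in between

@dataclass
class Primitive:
    name: str         # hypothetical identifier
    kind: Kind
    beats: int        # duration in beats (triggered) or beats per swing cycle
    tags: tuple = ()  # e.g. ("aggressive",) or ("transition",)

# A toy library; the actual primitives are described in Sections III and IV-C.
library = [
    Primitive("side-to-side", Kind.SYNCHRONIZED, beats=2),
    Primitive("flip", Kind.TRIGGERED, beats=4, tags=("aggressive",)),
    Primitive("circle", Kind.TRIGGERED, beats=8, tags=("aggressive",)),
]

def select(kind, max_beats):
    """Pick all primitives of the given kind that fit into a beat budget."""
    return [p for p in library if p.kind is kind and p.beats <= max_beats]

print([p.name for p in select(Kind.TRIGGERED, 4)])  # ['flip']
```

Such a catalog makes it straightforward for a choreography tool to query primitives by category and duration.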

D. Vehicle Control

As emphasized in Section I, sophisticated methods are required in order to control a quadrocopter's dynamics.

The synchronized and triggered motions introduced above have to be carried over into the real world, and different control challenges are associated with each. Triggered motions are complex and highly dynamic maneuvers requiring fast controllers and appropriate feedforward terms, which may need to be acquired through (machine) learning. In contrast, synchronized motions are less aggressive and easier to follow, but require special attention to maintain precise timing.

According to [46], two notes are perceived as being simultaneous if they occur within less than 30 ms. Accuracy with respect to beat times must therefore be in this range, too. To define and measure the timing error, we introduced the concept of phase in Section III. Indeed, the music beat can be translated into a sinusoidal signal which builds the reference trajectory for the synchronized motion. In this case, a timing error is reflected by a non-zero phase error between the music reference and the actual vehicle trajectory. If trajectories are representable by a sinusoidal, phase-dependent description (e.g. eights or circles), the timing error can always be obtained from the phase error. Once the phase error is recognized, an approach like the one described in Section III can be used to compensate for it, achieving a precisely synchronized motion.
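The 30 ms threshold translates directly into a tempo-dependent phase-error budget via φ = ω_d Δt. A quick calculation (our own, for illustration):

```python
import math

def phase_budget(f_d, dt_threshold=0.030):
    """Phase error corresponding to a timing error of dt_threshold seconds
    for a sinusoidal motion at frequency f_d: phi = w_d * dt."""
    return 2.0 * math.pi * f_d * dt_threshold

# At a 1 Hz swing, 30 ms corresponds to roughly 0.19 rad (about 11 degrees):
print(phase_budget(1.0))
```

Faster swings leave a proportionally smaller phase budget, which is why the phase-locking loop of Section III matters most for up-tempo music.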

E. Motion Choreography

The previous steps result in a categorized piece of music (from the music analysis, see Section IV-B) and a library of motion primitives (from the motion design, see Section IV-C). The physical limits of the vehicles and the space are already known. Is it possible to combine all this information in a meaningful way? How can multiple quadrocopters fly in a manner we can recognize as 'dance'?

Fig. 7. The proposed platform: combining music and quadrocopter motion into a multi-vehicle dance performance.

Well-established and recognizable parameters for human dance have long been in use by professional dancers, choreographers and dance teachers to build choreography with interest, dynamics and aesthetic appeal. These parameters can provide us with a framework for meaningful quadrocopter choreography, and are described as follows:

Space – Space refers to the area the dancer is performing in, and also relates to how the dancer moves through the area as characterized by the direction and path of a movement, its size, level, and shape.

Time – Time encompasses rhythm, tempo, duration, and phrasing of movements. Using time in different combinations to music can create intricate visual effects. Ideas such as quick-quick, slow or stop movements are examples.

Energy – Energy relates to the quality of movement. This concept is recognizable when comparing ballet and tap dance. Some types of choreography are soft and smooth, while others are sharp and energetic.

Structure – Structure represents the organization of movement sequences into larger concepts: the combination and variation of movements using recurring elements, contrast, and repetition. Movements can even follow a specific story line to convey specific information through a dance.

With these parameters in mind, the motion primitives described in Section IV-C can eventually be combined into sequences, which can in turn be combined to create an overall choreographic performance. Endless permutations are possible, much the way individual words can be combined into a variety of subtle and sophisticated stories. Finally, an expressive connection of music sequences and motion primitives is established, capable of visually conveying the music's emotions to the audience. As in choreographing performances for human dancers, creativity and aesthetic judgment are required to achieve artistic quality.

V. CURRENT STATUS

So far, our research has been primarily focused on controland synchronization. As explained in Section III, a method

was developed that is able to precisely synchronize a sinusoidal quadrocopter motion to a given music reference signal. Currently, a horizontal swing motion in the xz-plane and a similar motion in the yz-plane are implemented using the proposed algorithm. This allows a dancing behavior along both the x- and y-axes. Using multiple vehicles, this already leads to a wide range of combinations and patterns.

In addition to synchronized motions, aggressive trajectories are flown in the FMA with high accuracy and an impressive smoothness. Our research group is pushing the limits of what can be achieved with quadrocopters. For example, the quadrocopters learned to perform up to three flips, cf. [39].

A variety of motion primitives are now available in our library, discussed in Section IV-C. More complex motions of both types will be investigated in order to add variation to our quadrocopter dance performances. As mentioned in Section I, the motion design is not trivial and requires further research. Specifically, a transition motion capable of taking the quadrocopters from the final position of one rhythmic motion primitive to the starting position of the following motion must be implemented. The transition motion should be fast and guarantee a collision-free position change.
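As a purely hypothetical illustration of the timing constraint such a transition must satisfy, the sketch below checks whether a straight-line position change fits into a given number of beats at a bounded speed. It ignores vehicle dynamics and collision avoidance, both of which the actual transition motion must handle.

```python
import math

def transition_feasible(p_start, p_end, beats_available, beat_period, v_max):
    """Can a straight-line transition between two primitive endpoints be
    completed within the available beats at a bounded average speed?

    Illustrative only: a real planner must respect acceleration limits
    and guarantee collision-free paths between multiple vehicles.
    """
    distance = math.dist(p_start, p_end)            # Euclidean distance
    time_available = beats_available * beat_period  # seconds
    return distance <= v_max * time_available

# 2 m displacement, two beats at 120 bpm (1 s total), 3 m/s speed bound:
print(transition_feasible((0, 0, 1.5), (2, 0, 1.5), 2, 0.5, 3.0))  # True
```

A check like this would let a choreography tool reject primitive orderings whose endpoints are too far apart for the musical gap between them.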

We developed a software platform built on a collection of synchronized motion primitives to guide the choreography of multiple quadrocopters flying together. A graphical user interface (GUI) allows the user to assign desired motions to specific quadrocopters, set all the necessary parameters, and take off and fly the vehicles. This platform is especially important for testing new motions. The results shown in Section V-A were developed using this software platform. The dance motion presented in Section V-B is also incorporated into the graphical interface. The platform is connected to a simulation environment that is part of the FMA's infrastructure, see Section II. The simulation environment provides a 3D view of the FMA and allows the user to visually assess the quality of a new motion or controller. As new motion primitives are created, they can be immediately added to the software platform.

In order to create a dancing performance, as in Section V-B, the beat times have to be extracted from the music. This process was done with the aid of Beatroot [31]. Beatroot


Fig. 8. A three-vehicle swing motion synchronized to the music beat; screen shot of our simulation environment.

analyzes the music and provides beat times. An interface visualizes the beats, so that the user can listen to the music and adjust the beat times if mismatches are noticed.

For designing the dance choreography, the extracted beat times give the underlying frequency. The performance combines the synchronized motions and the triggered motions. After a piece of music has been analyzed, the setup of a new dancing performance is simple and can be done in a short time; however, this procedure is not yet automated. The assignment of different motions to different sections of the music is done manually in order to reflect the character of the musical piece. In an interdisciplinary exchange, we hope to learn more about methods that are able to perform the multiscale analysis, as discussed in Section IV-B, and identify the different parts of a musical piece.
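For illustration, the underlying frequency can be estimated from the extracted beat times via the median inter-beat interval, a common heuristic that is robust to occasional beat mismatches. This sketch is not Beatroot's internal algorithm.

```python
import statistics

def tempo_from_beats(beat_times):
    """Estimate tempo (bpm) from a list of beat timestamps in seconds.

    The median inter-beat interval tolerates a few mis-detected or
    manually adjusted beats; a simple heuristic, not Beatroot's method.
    """
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    period = statistics.median(intervals)  # seconds per beat
    return 60.0 / period                   # beats per minute

# Beats roughly every 0.5 s give 120 bpm, despite one noisy interval.
print(round(tempo_from_beats([0.0, 0.5, 1.0, 1.51, 2.0])))  # 120
```

The resulting period is exactly the quantity the synchronized swing motions need as their reference frequency.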

Most of our work has focused on the control of the quadrocopters and their motions. We are still looking for an automated method to retrieve information from music. Given this information, the key challenge will be to develop an algorithm for automatic choreography and motion planning. The results presented in this paper show the first steps towards a musical platform as presented in Section IV.

The two examples presented below show the capabilities of the current platform.

A. Example: Three-Vehicle Swing Motion

One possibility offered by the platform is to manually assign synchronized motions to specific quadrocopters. This can be done for testing purposes or simply to produce variation in the performance. The example demonstrates how the platform integrates multiple quadrocopters executing different motions: two vehicles perform a swing motion in the x-direction, while another one crosses their trajectories with a swing motion in the y-direction, see Fig. 8. Using the methods developed in [43], synchronization is achieved between the motion of the three quadrocopters and the music beat. The method is robust to the differences between the vehicles (e.g. due to the marker's position on the vehicle).

The experimental results are shown in the attached video http://tinyurl.com/DancePlatform.

B. Example: Dancing Performance with Two Vehicles

We also developed a dancing performance for two vehicles based on a remix of the ‘Pirates of the Caribbean’ sound track. First, the music is analyzed and the beat times are extracted. Then the different parts of the musical piece are identified and manually assigned to suitable motions from the library. In this example, a circle trajectory is chosen while the quadrocopters spin. Combinations with different speeds and circle radii are shown in the associated video http://tinyurl.com/DancePlatform. The dancing parts consist of the swing motion presented in Section III. The amplitude is varied to achieve different effects.

VI. CONCLUSION

In this paper, a novel musical experience is proposed: flying vehicles that express the character of the music by performing an aerial dance. A cubic indoor flight space forms the stage, and small autonomous quadrocopters are the actors of this performance. Current features of the platform include the control of the quadrocopters and an audio-motion synchronization. A software infrastructure has been built to be easily extensible with regard to the future objectives. One prospective feature is an automated multiscale music analysis that is able to identify distinct musical characteristics including beat, measure, dynamic range, melody, and recurring themes. Building upon this information, the project aims towards an automated generation of coordinated musical multi-vehicle movements.

This project lies at the interface between robotics, control, and music information retrieval, and has thus far been focused on the control aspect. The project's interdisciplinary nature requires intensive exchange and close cooperation with researchers in other fields. To this end, we are able to provide the experimental setup and control knowledge; we seek input concerning music information retrieval and the interpretation of music features.

VII. ACKNOWLEDGEMENTS

This work would not have been possible without the prior work of Markus Hehn and Sergei Lupashin. Their contributions are gratefully acknowledged.

REFERENCES

[1] E. Singer, J. Feddersen, C. Redmon, and B. Bowen, “LEMUR's musical robots,” in Proceedings of the Conference on New Interfaces for Musical Expression (NIME), 2004, pp. 181–184.

[2] R. B. Dannenberg, B. Brown, G. Zeglin, and R. Lupish, “McBlare: a robotic bagpipe player,” in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2005, pp. 80–84.

[3] A. Kapur, “A history of robotic musical instruments,” in Proceedings of the International Computer Music Conference, 2005, pp. 21–28.

[4] N. A. Baginsky, “The three sirens: a self-learning robotic rock band,” Available online at www.the-three-sirens.info (accessed June 23, 2010), 2004.

[5] G. Weinberg, S. Driscoll, and M. Parry, “Musical interactions with a perceptual robotic percussionist,” in Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication (ROMAN), 2005, pp. 456–461.


[6] G. Weinberg and S. Driscoll, “The design of a perceptual and improvisational robotic marimba player,” in Proceedings of the IEEE International Symposium on Robot and Human Interactive Communication (ROMAN), 2007, pp. 769–774.

[7] G. Hoffman and G. Weinberg, “Gesture-based human-robot jazz improvisation,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2010, pp. 582–587.

[8] J.-J. Aucouturier, “Cheek to chip: dancing robots and AI's future,” IEEE Intelligent Systems, vol. 23, no. 2, pp. 74–84, 2008.

[9] K. Shinozaki, A. Iwatani, and R. Nakatsu, “Construction and evaluation of a robot dance system,” in Proceedings of the IEEE International Symposium on Robot and Human Interactive Communication (ROMAN), 2008, pp. 366–370.

[10] D. Grunberg, R. Ellenberg, Y. Kim, and P. Oh, “Creating an autonomous dancing robot,” in Proceedings of the International Conference on Hybrid Information Technology (ICHIT), 2009, pp. 221–227.

[11] A. Nakazawa, S. Nakaoka, K. Ikeuchi, and K. Yokoi, “Imitating human dance motions through motion structure analysis,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2002, pp. 2539–2544.

[12] J. Oliveira, F. Gouyon, and L. P. Reis, “Towards an interactive framework for robot dancing applications,” in Proceedings of the International Conference on Digital Arts (ARTECH), 2008.

[13] K. Kosuge, T. Hayashi, Y. Hirata, and R. Tobiyama, “Dance partner robot – Ms DanceR,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), vol. 4, 2003, pp. 3459–3464.

[14] S. Kotosaka and S. Schaal, “Synchronized robot drumming by neural oscillator,” Journal of the Robotics Society of Japan, vol. 19, pp. 116–123, 2001.

[15] K. Murata, K. Nakadai, K. Yoshii, R. Takeda, T. Torii, H. G. Okuno, Y. Hasegawa, and H. Tsujino, “A robot uses its own microphone to synchronize its steps to musical beats while scatting and singing,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2008, pp. 2459–2464.

[16] K. Yoshii, K. Nakadai, T. Torii, Y. Hasegawa, H. Tsujino, K. Komatani, T. Ogata, and H. G. Okuno, “A biped robot that keeps steps in time with musical beats while listening to music with its own ears,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2007, pp. 1743–1750.

[17] R. Takeda, K. Yoshii, K. Komatani, T. Ogata, and H. G. Okuno, “A robot listens to music and counts its beats aloud by separating music from counting voice,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2008, pp. 1538–1543.

[18] F. Thorn, “Felix's machines,” Available online at www.felixsmachines.com (accessed June 22, 2010), 2008.

[19] C. Southworth and L. Hasan, “Ensemble robot,” Available online at www.ensemblerobot.com (accessed June 22, 2010), 2004–2006.

[20] K. Murata, K. Nakadai, K. Yoshii, R. Takeda, T. Torii, H. G. Okuno, Y. Hasegawa, and H. Tsujino, “A robot singer with music recognition based on real-time beat tracking,” in Proceedings of the 9th International Conference on Music Information Retrieval (ISMIR), 2008, pp. 199–204.

[21] K. Murata, K. Nakadai, R. Takeda, H. G. Okuno, T. Torii, Y. Hasegawa, and H. Tsujino, “A beat-tracking robot for human-robot interaction and its evaluation,” in Proceedings of the IEEE/RAS International Conference on Humanoid Robots, 2008, pp. 79–84.

[22] M. P. Michalowski, S. Sabanovic, and H. Kozima, “A dancing robot for rhythmic social interaction,” in Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2007, pp. 89–96.

[23] C. Crick, M. Munz, and B. Scassellati, “Synchronization in social tasks: robotic drumming,” in Proceedings of the IEEE International Symposium on Robot and Human Interactive Communication (ROMAN), 2006, pp. 97–102.

[24] G. M. Hoffmann, H. Huang, S. L. Waslander, and C. J. Tomlin, “Quadrotor helicopter flight dynamics and control: theory and experiment,” in Proceedings of the AIAA Guidance, Navigation, and Control Conference, 2007.

[25] T. Takeda, K. Kosuge, and Y. Hirata, “HMM-based dance step estimation for dance partner robot – Ms DanceR,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2005, pp. 3245–3250.

[26] D. Pongas, A. Billard, and S. Schaal, “Rapid synchronization and accurate phase-locking of rhythmic motor primitives,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2005, pp. 2911–2916.

[27] K.-M. Chen, S.-T. Shen, and S. D. Prior, “Using music and motion analysis to construct 3D animations and visualisations,” Digital Creativity, vol. 19, no. 2, pp. 91–104, 2008.

[28] K. Takahashi and H. Ueda, “A dance synthesis system using motion capture data,” Knowledge Acquisition: Approaches, Algorithms and Applications, pp. 208–217, 2009.

[29] F. Ofli, E. Erzin, Y. Yemez, and A. M. Tekalp, “Multi-modal analysis of dance performances for music-driven choreography synthesis,” in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2010.

[30] G. Tzanetakis, “Marsyas: music analysis, retrieval and synthesis for audio signals,” Available online at www.marsyas.info (accessed June 26, 2010), 2009.

[31] S. Dixon, “Beatroot: an interactive beat tracking and visualisation system,” Available online at www.elec.qmul.ac.uk/people/simond/beatroot/ (accessed June 26, 2010), 2006.

[32] M. S. Puckette, T. Apel, and D. D. Zicarelli, “Real-time audio analysis tools for Pd and MSP,” in Proceedings of the International Computer Music Conference, 1998, pp. 109–112.

[33] M. Goto, “An audio-based real-time beat tracking system for music with or without drum-sounds,” Journal of New Music Research, vol. 30, no. 2, pp. 159–171, 2001.

[34] M. F. McKinney, D. Moelants, M. E. P. Davies, and A. Klapuri, “Evaluation of audio beat tracking and music tempo extraction algorithms,” Journal of New Music Research, vol. 36, no. 1, pp. 1–16, 2007.

[35] D. P. W. Ellis, “Beat tracking by dynamic programming,” Journal of New Music Research, vol. 36, no. 1, pp. 51–60, 2007.

[36] Y. Shiu and C.-C. J. Kuo, “Musical beat tracking via Kalman filtering and noisy measurements selection,” in Proceedings of the IEEE International Symposium on Circuits and Systems (ISCAS), 2008, pp. 3250–3253.

[37] Ascending Technologies GmbH, “Multi-rotor air vehicles,” Available online at www.asctec.de (accessed June 27, 2010), 2010.

[38] D. Gurdan, J. Stumpf, M. Achtelik, K.-M. Doth, G. Hirzinger, and D. Rus, “Energy-efficient autonomous four-rotor flying robot controlled at 1 kHz,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2007, pp. 361–366.

[39] S. Lupashin, A. Schollig, M. Sherback, and R. D’Andrea, “A simple learning strategy for high-speed quadrocopter multi-flips,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2010, pp. 1642–1648.

[40] Vicon, “Motion capture systems,” Available online at www.vicon.com (accessed June 27, 2010), 2010.

[41] J. P. How, B. Bethke, A. Frank, D. Dale, and J. Vian, “Real-time indoor autonomous vehicle test environment,” IEEE Control Systems Magazine, vol. 28, no. 2, pp. 51–64, 2008.

[42] G. Ducard and R. D’Andrea, “Autonomous quadrotor flight using a vision system and accommodating frames misalignment,” in Proceedings of the IEEE International Symposium on Industrial Embedded Systems (SIES), 2009, pp. 261–264.

[43] A. Schollig, F. Augugliaro, S. Lupashin, and R. D’Andrea, “Synchronizing the motion of a quadrocopter to music,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2010, pp. 3355–3360.

[44] W. F. Egan, Phase-Lock Basics, 2nd ed. Wiley-Interscience, 2008.

[45] M. Goto and Y. Muraoka, “A beat tracking system for acoustic signals of music,” in Proceedings of the ACM International Conference on Multimedia, 1994, pp. 365–372.

[46] R. Rasch, “The perception of simultaneous notes such as in polyphonic music,” Acustica, vol. 40, no. 1, pp. 21–33, 1978.
