BBCut2: Integrating beat tracking and on-the-fly event analysis

This article was downloaded by: [Thammasat University Libraries]
On: 04 October 2014, At: 19:12
Publisher: Routledge
Informa Ltd, registered in England and Wales (registered number: 1072954), registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Journal of New Music Research
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/nnmr20

To cite this article: Nick Collins (2006) BBCut2: Integrating beat tracking and on-the-fly event analysis, Journal of New Music Research, 35:1, 63-70, DOI: 10.1080/09298210600696600

To link to this article: http://dx.doi.org/10.1080/09298210600696600

Taylor & Francis makes every effort to ensure the accuracy of all the information (the "Content") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions



BBCut2: Integrating Beat Tracking and On-the-fly Event Analysis

Nick Collins

University of Cambridge, UK

Abstract

BBCut2 is the latest manifestation of a software library for realtime algorithmic audio splicing. Machine listening capabilities are supported for realtime beat tracking and audio event analysis, such that splicing manipulations respect component events, and their micro-timing with respect to an inferred metrical structure. The architecture, whilst currently most effective for transient-rich percussive signals, is modular enough to be extensible to new observation models. A scheduling system is described that can cope with splicing driven from an external clock, empowering realtime beat tracking led segmentation and other processing effects.

1. Introduction

From sample-based granular synthesis (Truax, 1988) through to concatenative synthesis (Schwarz, 2004), the automatic manipulation of audio signals has become increasingly sophisticated. In recent approaches, features describing the source audio are extracted which are perceptually and musically relevant. Rather than letting the source run through processes blind to the fine detail of their target, processing is more responsive to the qualities of the source itself. Whilst the former may lead serendipitously to interesting results, the latter is less prone to trial and error. Further, the settings within which sound extracts may be played back can themselves be adapted to the circumstances of the originating material; for example, where the metrical properties of a source sound are known, and might guide the context of its reuse, or enable its own repurposing for a new context.

Such analysis of the audio to be manipulated has great dividends for new compositional applications, including live concert settings. Extracting higher level musical features from a low level audio signal, such as note events with associated timbral parameters or metrical frameworks, can only assist in reaching a more comfortable union of human and machine musician. For live performance, such music understanding algorithms must operate as causal and real-time processes. This article outlines attempts to update a real-time computer music system with a machine listening frontend capable of a greater sensitivity to the material it treats. The machine listening tools treated are specifically on-the-fly metrical analysis, exhibited via the domain of beat tracking, and running note event analysis based on signal segmentation and associated feature extraction. Harnessing such music understanding technology necessitated changes to the fundamental scheduling mechanisms on which the system is based.

The system in question is BBCut, an extension library for the SuperCollider audio programming language (McCartney, 2002), providing a suite of functions for real-time algorithmic audio manipulation. It was originally motivated by the thought of automating certain manual sequencer practices of dance music producers, namely the "breakbeat cutting" from which the library gets its name. Initial experiments extended to general facilities for splicing audio on-the-fly at rhythmic (rather than micro-granular) rates, for both fixed and circular buffers (Truax, 1988). A variety of algorithmic composition procedures were devised, drawing both from more abstract mathematical procedures like recursion and permutation chains, and inspired by contemporary musical themes such as thrash drumming or the rhythmic explorations of the Warp Records artist Squarepusher.

BBCut provides an extensible and customizable framework for experiments in audio cutting, encouraging good code reusability. Assuming that the synthesis of cuts is independent of the composition of possible cut sequences,

Correspondence: Nick Collins, Centre for Music and Science, Faculty of Music, University of Cambridge, 11 West Road, CB3 9DP, UK. E-mail: [email protected]


BBCut separates cut procedures from cut synthesisers, such that any algorithmic composer can work with any cutting target. This allows the easy substitution of new cutting targets, whilst retaining a given algorithmically composed cut sequencer. There are source readers able to cope with fixed buffers and live infinite streams of audio, with MIDI or OSC (Open Sound Control (Wright & Freed, 1997)) message passing to third party video applications, or with text permutation. The original software design is discussed in greater detail in Collins (2002).

In performance, the first version of BBCut works on an "as fast as possible" rendering scheme where the next cut is decided at the scheduled end of the previous, and appropriate synthesis commands are sent (determined by the cut synthesisers) as soon as this cut is known. BBCut's main limitation is the requirement to use an internal clock as the scheduling driver; it cannot cope with synchronizing to an external clock, a capability necessary for real-time beat tracking and associated splicing.

Thus, the original BBCut worked well enough upon its own terms, that is, when it imposed its own timebase on material. In this article I discuss a new version of BBCut, unimaginatively dubbed BBCut2, which incorporates some ability to respect the timebase and event structure of the material it treats. Rather than imposing an internal clock and assuming metronomic target audio, beat tracking and event analysis routines are built into the architecture. In the main, this article's academic contribution will be to outline a scheduling solution for a live algorithmic splicing system that can be driven by a beat tracker, whilst also applying a running event analysis. The designs of the beat tracker and event analyser themselves are separate issues which many authors (some represented within this issue) have explored, and they stand as independent modules within the BBCut2 system. BBCut2 is open source under the GNU GPL and contains a variety of UGen implementations for the signal processing modules; for example, AutoTrack is an autocorrelation beat tracker based on Davies and Plumbley (2005), and the event analysis subsystem is a development of that described in Collins (2005a).

Integrating machine listening processes into BBCut2 required a re-design of the original BBCut1's core. Introducing an external controlling clock and exactly synchronizing the timing of future events requires careful scheduling mechanisms. The time representation is also critical in terms of extracting expressive timing; an inferred beat allows quantization with respect to that beat, but one must carefully dissociate expressive timing deviations from quantized inter-onset intervals. The architecture outlined below can cope with various timing factors arising from the perceptual attack times of sound events, expressive timing, network latency for distributed synthesis, and rendering time.

There are certain technical aspects of the following that are influenced by the existing architecture of SuperCollider 3; SuperCollider separates the language, within which algorithmic composition takes place, from the synthesizer, the Server. As its name suggests, the Server is an independent application which is controlled by network messages using Open Sound Control (Wright & Freed, 1997). Rigid timing therefore demands compensation for network timing jitter, and this is achieved by sending messages with a time-stamp, around 50 ms or more ahead of their actual performance time.
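This compensation can be caricatured in a few lines (a Python sketch with hypothetical names, not SuperCollider or BBCut2 code): given a pre-scheduling margin, a message is dispatched early and stamped for its true performance time, so that any network jitter smaller than the margin never reaches the audible result.

```python
LATENCY = 0.05  # pre-scheduling margin in seconds, as in the ~50 ms above

def stamp(perform_at, now):
    """Time-stamp an event: the message must leave by send_by so that the
    Server can honour the stamp despite network jitter below LATENCY."""
    # never stamp into the past; push late events to the earliest safe time
    perform_at = max(perform_at, now + LATENCY)
    send_by = perform_at - LATENCY
    return send_by, perform_at
```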

In previous work on scheduling for computer music, Dannenberg (1989) covers the case of computer accompaniment, where a known score exists. He distinguishes virtual time as distinct from physical time; scheduling with respect to these is also commonly referred to as beat-based and time-based (in SuperCollider programming corresponding to a TempoClock and a SystemClock). In a later paper discussing synchronization of clocks over a computer network, Brandt and Dannenberg (1999) discuss the "trade-off between latency and the chance of jitter". They further write that "An accompaniment system . . . would more likely choose minimal delay and take its chances with jitter". In the solution described below, the predictive capabilities of beat tracking are exploited to avoid jitter whilst still maintaining synchronization, excepting unanticipated shifts of period and phase.

2. Scheduling synchronized to an external clock

Figure 1 gives an overview of the central message passing in the situation where an external clock (represented in BBCut2 by the ExternalClock class and subclasses) determines the timebase. The tick() method is called on each beat indicated by the clock. Such a beat might be determined by the click from a beat tracker, running as a signal processing unit on a live audio input. Real-time beat trackers rarely update their hypotheses faster than per beat, and assume constant tempo within beats, making sub-beat ticks unviable. There are a number of issues here.

1. A beat may be detected early or late with respect to the previously assumed tempo, for the beat tracker may make errors, or be tracking a shift of period or phase.

2. If scheduled events are to be time locked to an external clock, only predictive scheduling will work. For exactly synchronized timing, pre-scheduling is necessary to take into account synthesis delays.

Beat based scheduling is naturally used for algorithmic composition, but synthesis parameters such as perceptual attack time, network latency and rendering delay, or expressive timing constraints independent of the beat


require the resolution of scheduling in absolute time, i.e. seconds. The solution to this specification trades off immediacy of interactive control of the algorithmic composition against predictive synchronization to an external clock. The associated delay is much greater than ordinary; due to Anderson and Kuivila's (1990, p. 60) action buffering, the delay is extended: algorithmic event generators are controlled, and not single synthesis parameters such as a filter cutoff. The delay introduced is usually up to two beats, though this may be much longer where algorithmic composition itself generates material in larger chunks. Each new beat indicated by the clock is the cue to schedule events still due during the next beat (as indicated by the new phase and period just received) plus on into the beat after that, as required by the minimal pre-scheduling time for synthesizing events. Pseudo-code listing the scheduling steps is given below.

1. Having received a new beat signal from the controlling clock, we now know the predicted period in seconds until the next beat, and the current phase.

2. Call the provideMaterial() method of each BBCut2 object which is running on this clock.

3. Make sure the cache of events from the algorithmic composition covers at least the next beat and a half (more may be required depending on the relation between the tempo and the synthesis delay). If it does not, call the chooseBlock() method to obtain another block's worth of material (there would be equivalent methods of providing data for various forms of algorithmic composer).

4. Render any cut sequences into appropriate synthesis messages and associated absolute timing corrections for the CutGroup, representing the chain of CutSynths involved (playback units, effects units, et al.).

5. Convert the beat-based timings to absolute timings, taking into account such time-based factors as expressive timing corrections, perceptual attack time, network latency and rendering delay.

6. Take any events from the cache which must be sent within the time period of the upcoming beat (this may include events whose beat position is within the next beat after that, but whose time pre-scheduling is such as to require sending earlier). Retain in the cache, in beat-based ordering, those events not yet required.

7. Prepare a sorted list of the pertinent events, returning them to the ExternalClock object.

8. Schedule the upcoming events using sched(), which calls an absolute time based scheduler, and can be cancelled early if necessary due to an unanticipated early next beat tick.
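The steps above can be sketched as follows (in Python rather than SuperCollider, with illustrative names standing in for the BBCut2 classes; a real tick would go on to dispatch via the absolute time scheduler):

```python
LATENCY = 0.05  # seconds of pre-scheduling margin for synthesis/network delay

class Event:
    """One synthesis event: a quantized beat position plus absolute-time
    corrections (expressive deviation, perceptual attack time)."""
    def __init__(self, beat, deviation=0.0, attack=0.0):
        self.beat = beat            # beat-based position on the clock
        self.deviation = deviation  # expressive timing correction (seconds)
        self.attack = attack        # perceptual attack time (seconds)

def tick(clock_beat, period, cache, provide_block, horizon=1.5):
    """Called on each beat from the external clock: top up the cache from
    the algorithmic composer, convert beat positions to seconds, and return
    the time-sorted events that must be dispatched during this beat."""
    # Steps 2-3: request material until the cache covers the horizon
    # (assumes each block advances the material in beats)
    while not cache or cache[-1].beat < clock_beat + horizon:
        cache.extend(provide_block())       # one block's worth of material
        cache.sort(key=lambda e: e.beat)
    # Steps 4-6: convert to absolute time and split due/retained events
    due, kept = [], []
    for e in cache:
        t = (e.beat - clock_beat) * period + e.deviation - e.attack
        if t - LATENCY < period:    # must be sent within the upcoming beat
            due.append((t, e))
        else:
            kept.append(e)          # retained in beat-based ordering
    cache[:] = kept
    # Step 7: absolute-time order may differ from beat-based order
    due.sort(key=lambda pair: pair[0])
    return due
```

Note how an event whose beat position lies beyond the next beat can still become due when its attack-time correction pulls its send time forward, and how the returned short-term queue can simply be discarded if the next tick arrives unexpectedly early.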

Scheduling is therefore achieved by an inner loop which requests future events from client agents in small blocks until enough are available to fill the time to be prescheduled. Because the agents themselves often have to calculate more than is needed at a given time (perhaps because they work out their material by measures), the scheduler provides a caching queue to store any spare future events. All agents are compatible with this evaluation-on-demand system by providing the appropriate interface methods, through a class hierarchy. Beat-based scheduling covers long-term events, but locations are converted to seconds for the next beat (where the tempo is known); this short-term time-based scheduling queue can always be cancelled early on receipt of an unexpected premature beat signal from the tracker (perhaps corresponding to an accelerando or phase jump). It is critical that the beat-based event order may change when converting to final absolute time positions, due to differences in perceptual attack time, expressive timing corrections or other factors tied to individual synthesis events.

Fig. 1. Message passing between core BBCut2 objects. Instance data are shown in rectangular boxes, instance methods are in circles. Method calling is shown by black and data access by outlined arrowheads.


A generality beyond BBCut's BBCutProc-derived algorithmic composers was exhibited by also providing interface functions for SuperCollider's Patterns library (McCartney, 2002). This is an algorithmic composition toolkit of classes for generating streams of events, from static sequences, shuffled order sequences, weighted choices and a multitude of other options, further empowered by the ability to nest patterns within one another. BBCut2 can run patterns in synchrony with an external clock.

One assumption made by this design is that the external clock which drives the scheduling of events (thus, for a beat tracker, the audio to be tracked) admits an isochronous beat, preferably for a simple rather than compound time signature, and preferably 4/4. Non-isochronous time signatures will have a special interaction with the scheduling, in that they will appear to be an isochronous beat that keeps jumping forwards between the last beat of each measure and the next down-beat, or will be tracked as swapping emphasis between on-beat and off-beat (in the sense that two measures of 7/8 add up to 7/4). In fact, it is most likely that the adjustments of the beat tracker will lead to jumping of beats as the tracker reacts late to the changes; a beat tracker must be prebuilt with more heuristics specialized to determining longer scale measure patterns to cope with a non-isochronous meter. Changing meter is another order of problem again, and in general can only be coped with through advance knowledge of the score. The result of such tracking behaviour will be to drop certain vital events from the scheduling queue (often those events associated with the down-beat, which are probably the most critical), though the mechanisms described above are robust enough not to crash. Dannenberg (1989, p. 257) and Mathews (1989, pp. 271 – 272) note other possible synchronization strategies, such as gradual tempo adjustment to some human-like reaction profile. However, real-time applications in processing which are often inherently faster-than-human benefit from immediate and abrupt transition given an update of scheduling position, and this is the default taken for BBCut2. A further refinement might tag vital messages which must be sent no matter how late they become, so that the whole scheduling queue is not abandoned in an abrupt adjustment. In practice, jumps are regulated by the beat tracker, which tends to provide a regular clock as long as the source tracked is regular, and, as has often been observed in such work, is most reliable on metronomic stimuli in the first place.

3. Time representations in BBCut2

Imagine splicing a fixed buffer by shuffling eighth note beat segments around. This manoeuvre requires the determination of the eighth note metrical level within the target, and such information might be gleaned in the following ways.

1. The target buffer has a known length in beats; subdivide strictly assuming an even tempo.

2. A beat track is given, perhaps by an application of an automated beat tracking process.

The first case might occur where a metronomic sampled dance loop is the target, and is prepared to a known beat length, such as a four beat measure. The second is a more general case, where the target probably includes expressive timing of some order. Whilst the events in the first example may fall in a kind way upon the eighth note grid, those in the second are likely to overlap grid points. Events should be kept associated to particular beat locations (quantized location, the nearest position in the appropriate metrical level) but may involve some expressive timing deviation as an absolute timing correction from that beat position. This is the basis of time representations which separate tempo curves from local timing deviations (Baggi, 1991; Desain & Honing, 1992, 1993; Bilmes, 1993; Honing, 2001), rather than the single tempo curve representations of an earlier generation (i.e. the FORMULA language's time deformations (Anderson & Kuivila, 1990)). Gouyon and Dixon (2005, p. 37) note that the metrical structure provides "anchor points for timing deviations". To honour this system, BBCut2 has scope for beat positions for events as well as local timing deviations from the metrical grid (Figure 2). In a change of playback tempo, events can stay attached to their beat positions. Beat positions are ordinarily taken to a resolution of an eighth note. The standard assumption is thus 4/4 simple meter; a compound meter like 6/8, with three eighth notes per beat, could also work as long as the beat tracker could specify this to the system. In general, swing is speculated to always occur just below the fastest isochronous metrical level. So for a 4/4 time signature, sixteenth note quantization would make assumptions about the expressive timing of swing which are unwarranted (Friberg & Sundstrom, 2002; Gouyon et al., 2003).
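A minimal illustration of this representation (a hypothetical Python sketch, not BBCut2 code) attaches each detected onset to its nearest eighth-note grid position, derived from a tracked beat sequence, and stores the residual as a deviation in seconds:

```python
def quantize_onsets(onsets, beat_times, subdivisions=2):
    """Attach each onset (seconds) to its nearest grid position at the
    eighth-note level (two subdivisions per tracked beat), keeping the
    residual as an expressive timing deviation in seconds."""
    grid = []  # (beat position, absolute time) pairs
    for i in range(len(beat_times) - 1):
        step = (beat_times[i + 1] - beat_times[i]) / subdivisions
        for s in range(subdivisions):
            grid.append((i + s / subdivisions, beat_times[i] + s * step))
    grid.append((float(len(beat_times) - 1), beat_times[-1]))
    events = []
    for onset in onsets:
        beat_pos, grid_time = min(grid, key=lambda g: abs(onset - g[1]))
        events.append({'beat': beat_pos, 'deviation': onset - grid_time})
    return events

def render_time(event, new_period):
    """At a changed tempo, an event stays attached to its beat position;
    the deviation may be reapplied unchanged (or rescaled, if preferred)."""
    return event['beat'] * new_period + event['deviation']
```

Under this scheme, a tempo change moves the anchor points while the stored deviations survive intact, which is exactly the separation of tempo curve from local timing deviation described above.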

In the re-synthesis required by beat-based splicing, playback of events is carried out where those events have a quantized position falling within a particular metrical slice taken from the target. Their synthesis time can be corrected by the timing deviation if desired (since rigid quantization can remove timing errors). The deviation value may be unaffected by tempo change, or potentially corrected in line with the thinking of Desain and Honing (1994), that tempo provides a context for allowable timing deviations in terms of musical style and motor constraints. I have already outlined the scheduling mechanisms above that support such absolute time deviations and their interaction with beat-based scheduling. Beat tracking and event analysis on-the-fly allow the tagging of events for splicing as they are


detected. Positions are recorded with respect to the inferred metrical levels given by the beat tracker, registered onset times being quantized to beat positions alongside associated absolute time corrections.

4. BBCut2 capabilities

I finish with the payoff of the hard work: real-time processing capabilities supported by the BBCut2 architecture. BBCut2 is a publicly available open source system (from the author's website) and examples of these effects are included with the distribution.

4.1 Algorithmic FX locked to splices

Since BBCut2 compensates fully for any synthesis delay, it provides rock-solid timing capabilities and in particular allows effects units to be run which are perfectly synchronized to the beat and associated cut sequences. For example, comb filters might have their delay adjusted over the course of a stuttering roll, so that the delay shortens (comb pitch rises) with successive repetitions.
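Such a roll might be parameterized as in the following sketch (hypothetical Python with illustrative names, not the BBCut2 effect's actual interface): the delay is interpolated downwards across the repetitions, so the comb's resonant pitch, the reciprocal of the delay, rises.

```python
def comb_roll_delays(repeats, start=0.010, end=0.002):
    """Comb delay time (seconds) for each repetition of a stuttered cut,
    interpolated linearly so the delay shortens over the roll."""
    if repeats < 2:
        return [start]
    step = (end - start) / (repeats - 1)
    return [start + i * step for i in range(repeats)]

def comb_pitch(delay):
    """The comb filter's resonant fundamental is the reciprocal of its delay."""
    return 1.0 / delay
```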

An FX unit that algorithmically adds and removes other effects units from the chain (the CutGroup) with each new phrase is provided as a subclass of CutSynth called CutSwap1. Filters, distortion, ring modulators and reverbs can all be added to the enclosing CutGroup; each of which is itself a cut sequence sensitive effect.

4.2 Beat tracking led segmentation

Where the audio to be spliced is a live input stream, beat tracking of this stream empowers live splicing which is locked to the metrical frame of the target. The target audio derives the reference clock with respect to which predictive splicing is managed. Such splicing assumes that events tend to fall on metrical level markers, so is not robust to expressive timing, but certainly is an improvement over having no knowledge of the source's tempo base.

4.3 Event sensitive splicing

Section 3 detailed how the actual position of events within a target could be taken account of in splicing. Splicing is with respect to beat-based units in a quantized metrical framework; a given cut takes along those events whose quantized position at the appropriate metrical level (the eighth note in this work, under a 4/4 assumption) falls within the cut; cuts themselves do not have to squarely lock to eighth notes. There are options to exclude expressively timed events that would precede or follow the cut's scope itself, calculated from the current best tempo estimate. Absolute timing deviations can be

Fig. 2. The upper figure shows a drum beat waveform and detected events; the lower shows a metrical grid, with the beat level indicated in solid and a binary subdivision (eighth notes) in dotted lines. Detected events are shown attached to quantized eighth note positions in the grid; the timing deviations are the x axis differences from the grid in seconds.


restored (if desired) in rendering, as described under the scheduling capabilities. Assuming that the consequence of simultaneous sound events and spillover of any reverb or other effects is negligible, the detection of events allows replay at different tempi without repitching the sample. Expressive timing may even be modified in a consistent manner to change the swing or groove (Gouyon et al., 2003); for instance, the expressive timing information can be modified systematically whilst preserving the quantized beat locations. These various effects are implemented in BBCut2.
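One such systematic modification can be sketched as follows (hypothetical Python using the beat-plus-deviation representation of Section 3; BBCut2's own implementation differs): deviations on off-beat eighth notes are rewritten as a fixed fraction of an eighth, while the quantized beat positions themselves are untouched.

```python
def reswing(events, period, swing=0.2):
    """Rewrite expressive deviations so that off-beat eighth notes are
    delayed by a fixed fraction of an eighth, and on-beats play straight.
    Quantized beat positions are preserved; only deviations change."""
    eighth = period / 2.0
    out = []
    for e in events:
        offbeat = round(e['beat'] * 2) % 2 == 1  # odd eighth-note index
        out.append({'beat': e['beat'],
                    'deviation': swing * eighth if offbeat else 0.0})
    return out
```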

4.4 On-the-fly event analysis

A running event analysis system can be used to tag events where the target audio is a live stream rather than some pre-analysed soundfile (Brossier et al., 2004; Collins, 2004). BBCut2 provides a database class on the language side which is constantly updated as a signal processing routine on the Server finds new sound events. The running event analysis system is described in separate publications (Collins, 2005a, b); the percussive onset detector had been entered in the MIREX2005 competition. Event detection will have a delay up to the length of the event detected (for its offset must be found as well as its onset, and often a new onset is the cue for the offset of the previous event). The circular buffer for stored samples is of a known length, and events which reference absolute time locations too far into the past can be removed from the database as they go out of date. Further parameters relevant to scheduling, such as perceptual attack time, or parameters useful for algorithmic event selection from the database, such as perceptual loudness and pitch or timbral classification, are maintained in the buffer alongside event buffer locations and absolute collection time. These mechanisms

Fig. 3. Processing chain for BBCut2 where the clock arises from a beat tracking process, and the audio to be spliced is analysed on-the-fly to preserve events in the metrical frame.


are independent of any algorithmic splicing, but may of course inform such.
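The database bookkeeping described above can be caricatured with a simple sketch (hypothetical Python, standing in for BBCut2's language-side database class; field names are illustrative): events are appended in collection order, and pruned once the circular buffer has overwritten their samples.

```python
from collections import deque

class EventDatabase:
    """Stores analysed events over a circular sample buffer of known
    duration; events whose audio has been overwritten are pruned."""

    def __init__(self, buffer_duration):
        self.buffer_duration = buffer_duration  # seconds the buffer holds
        self.events = deque()                   # kept in collection order

    def add(self, onset, duration, loudness=None, pitch=None):
        # onset/duration in absolute seconds; analysis features stored alongside
        self.events.append({'onset': onset, 'duration': duration,
                            'loudness': loudness, 'pitch': pitch})

    def prune(self, now):
        """Drop events whose samples are no longer in the circular buffer."""
        while self.events and now - self.events[0]['onset'] > self.buffer_duration:
            self.events.popleft()
```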

4.5 Event sensitive splicing under beat tracking control

Figure 3 gives a signal chain for the combination of the various capabilities of BBCut2, with simultaneous beat tracking and on-the-fly event analysis. In one auditory demonstration for this tool, a human beat boxer can be tracked and spliced in real-time with respect to their own metre, the events themselves being quantized to the beat to give a more machine-like output. In a second, a recording of a straight pop track (but it could be a live pop band) is tracked and algorithmically cut-up in real-time, adding swing.

5. Conclusions

Updating the BBCut library to be more aware of the audio material it treats necessitated fundamental changes in its architecture. These changes are to the benefit of building autonomous interactive music systems which can perform with human musicians, but leverage the novel processing effects possible with digital audio manipulation. This overview of BBCut2 has concentrated on some novel aspects of scheduling required by beat tracking clock control, some representational aspects relating to event detection, and some new real-time effects permitted by the incorporation of machine listening technology into an algorithmic processing system. Of course, limitations in real-time causal beat tracking impact upon the system as described. Further metrical context information, and the ability to cope with different non-simple (and even non-isochronous) time signatures, remain open research problems. Greater built-in cultural knowledge rather than universal periodicity analysis may lead to better style-specific predictions of smooth tempo variation that assist with maintaining synchrony. There are also issues for event analysis to solve, primarily in recognizing non-percussive onsets, and factoring out potentially confounding frequency and amplitude modulation in such signals as the singing voice. These will have their own impact on the sophistication of scheduling required, for instance in requiring algorithmic agent recalculations (or multiple alternative renderings) of material for a sudden shift of metrical frame or timbral soundscape. Yet the possibility of dynamic on-the-fly effects using machine listening technology has been exhibited and remains an exciting research arena with many interactive music applications.

It is hoped that the reader will try out the software, freely available as an extension library of classes for the SuperCollider platform; BBCut2 includes help files and example code pertaining to effects mentioned in this article.

References

Anderson, D.P. & Kuivila, R. (1990). A system for computer music performance. ACM Transactions on Computer Systems, 8(1), 56 – 82.

Baggi, D.L. (1991). NeurSwing: An intelligent workbench for the investigation of swing in jazz. IEEE Computer, 24(7), 60 – 64.

Bilmes, J.A. (1993). Techniques to foster drum machine expressivity. In Proceedings of the International Computer Music Conference. San Francisco, CA: International Computer Music Association.

Brandt, E. & Dannenberg, R.B. (1999). Time in distributed real-time systems. In Proceedings of the International Computer Music Conference. San Francisco, CA: International Computer Music Association.

Brossier, P., Bello, J.P. & Plumbley, M.D. (2004). Real-time temporal segmentation of note objects in music signals. In Proceedings of the International Computer Music Conference. San Francisco, CA: International Computer Music Association.

Collins, N. (2002). The BBCut Library. In Proceedings of the International Computer Music Conference, Göteborg, Sweden, 16 – 21 September. San Francisco, CA: International Computer Music Association, pp. 313 – 316.

Collins, N. (2004). On onsets on-the-fly: Real-time event segmentation and categorisation as a compositional effect. In Sound and Music Computing (SMC04), 20 – 24 October (pp. 219 – 224). Paris: IRCAM.

Collins, N. (2005a). An automated event analysis system with compositional applications. In Proceedings of the International Computer Music Conference, Barcelona. San Francisco, CA: International Computer Music Association.

Collins, N. (2005b). A change discrimination onset detector with peak scoring peak picker and time domain correction. MIREX2005 entry, online paper at: http://www.music-ir.org/evaluation/mirex-results/articles/onset/collins.pdf

Dannenberg, R. (1989). Real-time scheduling and computer accompaniment. In Mathews & Pierce (1989), pp. 225 – 261.

Davies, M.E.P. & Plumbley, M.D. (2005). Beat tracking with a two state model. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2005), Philadelphia, USA, 19 – 23 March.

Desain, P. & Honing, H. (1992). Music, Mind and Machine: Studies in Computer Music, Music Cognition and Artificial Intelligence. Amsterdam: Thesis Publishers.

Desain, P. & Honing, H. (1993). Tempo curves considered harmful. Contemporary Music Review, 7(2), 123 – 138.

Desain, P. & Honing, H. (1994). Does expressive timing in music performance scale proportionally with tempo? Psychological Research, 56, 285 – 292.

Friberg, A. & Sundström, A. (2002). Swing ratios and ensemble timing in jazz performance: Evidence for a common rhythmic pattern. Music Perception, 19(3), 333 – 349.

Gouyon, F. & Dixon, S. (2005). A review of automatic rhythm description systems. Computer Music Journal, 29(1), 34 – 54.

Gouyon, F., Fabig, L. & Bonada, J. (2003). Rhythmic expressiveness transformations of audio recordings: Swing modifications. In Proceedings of the Digital Audio Effects Workshop (DAFx).

Honing, H. (2001). From time to time: The representation of timing and tempo. Computer Music Journal, 25(3), 50 – 61.

Mathews, M.V. (1989). The conductor program and mechanical baton. In Mathews & Pierce (1989), pp. 263 – 281.

Mathews, M.V. & Pierce, J.R. (Eds.). (1989). Current Directions in Computer Music Research. Cambridge, MA: MIT Press.

McCartney, J. (2002). Rethinking the computer music language: SuperCollider. Computer Music Journal, 26(4), 61 – 68.

Schwarz, D. (2004). Data-driven concatenative sound synthesis. PhD thesis, Université Paris 6. Available online at: http://recherche.ircam.fr/equipes/analyse-synthese/schwarz/

Truax, B. (1988). Real-time granular synthesis with a digital signal processor. Computer Music Journal, 12(2), 14 – 26.

Wright, M. & Freed, A. (1997). Open Sound Control: A new protocol for communicating with sound synthesisers. In Proceedings of the International Computer Music Conference, Thessaloniki, Hellas. San Francisco, CA: International Computer Music Association, pp. 101 – 104.
