

ROBOT-ASSISTED PLAYING WITH FINGERING SUPPORT FOR A SAXOPHONE

Yoshifumi Kurosawa, Kenji Suzuki

University of Tsukuba, Artificial Intelligence Laboratory, Tsukuba, Japan

[email protected], [email protected]

ABSTRACT

This paper proposes a robot-assisted playing system for reed instruments that allows human players to conduct their own musical performance without performing the fingering motion. The developed system assists with this necessary part of playing, not in order to substitute a robot for the player, but to support the human's playing without losing the intention and expressiveness of the musical performance. The assisted person retains the feeling of playing and can express their own feelings. In this study, we developed a modular device, attached to a traditional saxophone, that performs the fingering actions by opening and closing keys with mechanical modules. It should be noted that the system not only performs automatic fingering but also allows players to finger in the ordinary manner. In this paper, we introduce a novel style of musical performance, robot-assisted playing, and its implementation. In addition, we report on evaluation experiments and a comparison with traditional playing.

1. INTRODUCTION

Recent studies on robots using musical instruments have concentrated on two major approaches: robots that play music automatically and humanoid robots [1, 2, 3]. Robots that play music automatically are used for entertainment by automating a variety of musical instruments [1, 2]. On the other hand, humanoid-robot-based studies such as the Waseda Flutist Robot [3] explore the mechanisms of the human mind and body by substituting for the player a humanoid robot that plays the musical instrument.

A number of studies on assisted playing, such as accompaniment systems for acoustic instruments, have been reported so far. For example, Oshima et al. [4] reported an ensemble supporting system. This system uses a MIDI keyboard and accurately reproduces the sound of the pressed keystroke to match the melody being played, thereby assisting the beginner. On the other hand, the Handicap Recorder [5] targets players with disabilities through alteration of the recorder's design. The MIDI Recorder [6], which also targets players with hand disabilities, is used through a module fixed on the recorder. These studies, in general, attempt to support playing by modifying the instrument to accommodate the target group of players. In contrast, very few attempts have been made to design systems that alter neither the construction of the musical instrument nor the playing method. For example, Concert Hands [7] supports piano practice by fixing a separate module on the player's fingers and wrist. However, this is a wearable system whose use is limited by the capabilities of the robotic assistance. Furthermore, almost all of the above-mentioned studies target a specific player group and entertainment rather than collaboration. The main purpose of these studies is to create an environment in which robots live together with humans.

The system we propose is designed to support the playing motion of a player while paying attention to the player, rather than substituting a robot that plays the musical instrument (see Figure 1). Musical instruments using this system are not fundamentally altered but rather gain added functionality. Thus, the system provides support without changing the method of playing. In addition, the robotic support does not differ from player to player. The system can be used by beginners for practice and also as an assistive device for people with disabilities. The player gets to play in his or her natural style while the system provides support when it is needed. Consequently, the system can be used by able-bodied and disabled players alike while enjoying the automated support.

Figure 1. Human playing, robot-assisted playing and robotic (automatic) playing [8].

In this study, a saxophone is used as the musical instrument. The saxophone is a woodwind instrument that produces sound by oscillating a reed. Reed instruments are comparatively well suited to this approach, since the player can express an individual sound by varying the breath.

The system mainly consists of i) a sensory mouthpiece to detect single tonguing and ii) wire-operated modules to open and close the keys of the saxophone. The modules, called fingering modules, substitute for the fingering and are controlled by the breath of the player. During play, the keys are opened and closed either by the fingering modules according to a given musical score and/or by the player. Because the fingering modules are driven by the player's breath, support is provided only when the player needs it. We believe this consequently makes the player feel that he or she is playing a musical instrument.

Furthermore, the proposed system can also be used in situations such as breathing practice for beginners, checking the fingering during play, and assisting the playing of persons with arm or finger disabilities.

2. SYSTEM CONFIGURATION

Figure 1 shows an overview of the proposed mechanism. The system can be divided into three main parts: the fingering modules, the fingering control system, and the sensory mouthpiece. All devices are designed to attach to a traditional saxophone. Details of the system are as follows.

2.1. The Fingering Module

The fingering modules are operated by wires connected to the keys and pulled by solenoids (see Figure 2). Based on the force required to open and close the keys, two types of general-purpose solenoids are employed in the system, and custom-made fixtures are used to attach them to the saxophone. Through preliminary experiments, we found that 1.5 [N] and 2.0 [N] are required to open and close each key by wired operation, respectively. A complete fingering module weighs about 700 [g].

Two fingering modules are installed, one each for the left and right hand. Together they comprise seven solenoids, three for the left-hand part and four for the right-hand part, making it possible to play the full diatonic scale of an octave. Locations for the fingering modules were selected so as not to affect the standard playing style of a human player, making it possible to use the modified saxophone without any uneasiness. Moreover, each of the fingering modules can be detached separately. Therefore, depending on his or her preference, the player can choose to use a selected number of fingering modules for assistance.

Figure 2. Fingering modules.

Figure 3. Algorithm of the robot-assisted system.

2.2. The Fingering Control System

The fingering control system is based on a dedicated microprocessor unit. First, it acquires air-flow data from the sensory mouthpiece. It then controls the fingering modules according to pre-programmed play information, which represents a musical score given in advance.

Although the fingering control is based on the identification of blowing, any key change during blowing could disturb the flow of the music. Hence, upon the recognition of a blowing change, the necessary fingering change is determined, stored in a scheduler, and loaded at the end of the blow. Fingering changes required during blowing, however, are identified from the tempo of the musical score and loaded into the fingering modules immediately. Figure 6 shows an example of the sound produced when playing with the developed system. The fingering timing differs from a human's, as shown in Figure 6 (a); each arrow represents the location of a key press. A note change is recognized when the breath pressure decreases below a certain threshold, and the fingering of the next note then occurs, as shown in Figure 6 (b). As can be seen from the figure, although the timing of the fingering is different, the sound is not affected.
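The scheduling behavior described above can be sketched as follows. This is a minimal, hypothetical illustration; the class and parameter names (FingeringScheduler, the 50 mV threshold) are assumptions for clarity and do not come from the paper.

```python
# Hypothetical sketch of the fingering scheduler: score-driven changes
# mid-blow are applied at once, while other changes wait for the end
# of the blow (breath pressure dropping below a threshold).

THRESHOLD_MV = 50.0  # assumed breath-pressure threshold in millivolts


class FingeringScheduler:
    """Decides when a queued fingering change is sent to the modules."""

    def __init__(self, threshold=THRESHOLD_MV):
        self.threshold = threshold
        self.blowing = False
        self.pending = None  # fingering queued for the next breath gap

    def queue_change(self, fingering, during_blow=False):
        """Tempo-driven changes during a blow load immediately;
        all other changes are stored until the blow ends."""
        if during_blow:
            return fingering
        self.pending = fingering
        return None

    def on_pressure_sample(self, pressure_mv):
        """Process one breath-pressure sample; returns a fingering
        to apply now, or None."""
        was_blowing = self.blowing
        self.blowing = pressure_mv > self.threshold
        # End of a blow: pressure fell below threshold, so load the
        # pending fingering change for the next note.
        if was_blowing and not self.blowing and self.pending is not None:
            change, self.pending = self.pending, None
            return change
        return None
```

For example, queuing a change to "D" while blowing continues returns nothing, and the change is only released on the sample where the pressure falls below the threshold.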


Figure 4. Sensory mouthpiece.

Figure 5. A saxophone attached with the developed modules.

2.3. The Sensory Mouthpiece

The sensory mouthpiece measures the breath of the player in real time through a pressure sensor fixed over a 3 [mm] diameter hole made in the mouthpiece, in direct contact with the air flow inside the tube (see Figure 4). One of the main goals of this study is to create the system without altering the body of the musical instrument; however, the mouthpiece had to be altered to mount the breath sensor. Even so, the position of the hole and the breath sensor was selected so as not to affect the comfort of the player.
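The playing / non-playing decision driven by this pressure signal (the "Threshold Amount" in Figure 3) could be implemented roughly as below. The threshold and hysteresis values are invented for illustration; the paper does not specify them.

```python
# Illustrative playing / non-playing classifier for a stream of
# breath-pressure samples. A small hysteresis band keeps brief
# tonguing dips near the threshold from toggling the state.

def playing_states(samples_mv, threshold=50.0, hysteresis=5.0):
    """Classify each pressure sample as 'playing' or 'non-playing'."""
    states = []
    playing = False
    for p in samples_mv:
        if playing and p < threshold - hysteresis:
            playing = False
        elif not playing and p > threshold + hysteresis:
            playing = True
        states.append("playing" if playing else "non-playing")
    return states
```

With these assumed values, a momentary dip to 47 mV during a blow is absorbed by the hysteresis band and does not end the note.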

2.4. System Implementation

Figure 5 shows the customized saxophone. In order to verify the basic performance, the system was evaluated in the following two experiments, conducted by an experienced saxophonist.

1. Using the sensory mouthpiece, the player conductsthe fingering.

2. Using the sensory mouthpiece, the fingering modulesconduct the fingering.

First, from experiment 1, it was found that a player can open and close the keys without any discomfort. It was also confirmed that the mouthpiece with the breath sensor attached does not affect the blowing operation. Figure 6 (b) shows the breath data acquired by the sensory mouthpiece, and Figure 6 (c) shows the corresponding sound pressure variation. As can be seen from the two figures, the sensory mouthpiece is capable of accurately detecting the air flow. Also, from experiment 2, it was found that the sound produced with the fingering modules is similar to the natural playing output.

Figure 6. Performance evaluation; (a) comparison of fingering change timing between the robot and the human, (b) the breath data, (c) the sound pressure data.

3. SYSTEM EVALUATION

The target of this study is to develop a system that supports the expressiveness of the player. In order to evaluate the efficiency of the developed support, two musical pieces played with the system were evaluated by human listeners.

The experiment was conducted using recordings of saxophone performances. Two pieces were used: Figure 7 (a), a one-octave scale, and Figure 7 (b), the first five bars of Isn't She Lovely by Stevie Wonder, played at 120 BPM (beats per minute). Three samples of each piece were prepared for the evaluation: one played by a human player, one played by the same player using the robot-assisted system, and one rendered via MIDI (Musical Instrument Digital Interface). Constant velocity and amplitude were used to prepare the MIDI samples. 21 subjects with no special musical training were asked to evaluate the pieces on a scale of one to five along two axes: expressiveness and naturalness. As the meaning of each term was not described in the questionnaire, all subjects evaluated the music subjectively. The audio samples were played to each subject in random order. The saxophone used in this study has a key for changing octaves; since this key is not yet supported at the current stage of the study, the player was asked to press it manually during automated play.

Figure 7. The two musical pieces used for the evaluation.

Figure 8. Scores (0 to 5) for naturalness and expressiveness under the Human, Robot, and MIDI conditions: (a) experimental result of playing a scale, (b) playing a musical piece; (#) p < 0.10.

Figure 8 shows the results of the experiment. Statistical analysis was performed to evaluate how well the system supports the expressiveness of the player. As shown in Figure 8 (a), no significant difference was found between the robot and the MIDI sample in either expressiveness or naturalness, although the average score of the robot-assisted playing was higher than that of the MIDI sample. On the other hand, for the musical piece, as shown in Figure 8 (b), we obtained a difference between the robot and the MIDI sample (p = 0.051, significance level 10%). It was concluded that, unlike MIDI, this system was able to support expressiveness. The robot's score was also comparable to the MIDI sample with regard to naturalness. On the other hand, as can be seen from Figure 8, the human/robot comparison showed a significant difference in both expressiveness and naturalness. This was mostly due to contamination of the audio samples by the small mechanical noise of the solenoids emitted when the robot played, which affected the naturalness. This contamination is not clearly audible when listening in front of the robot, but becomes audible when recorded.
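The paper does not state which statistical test produced p = 0.051. One common choice for ordinal 1-to-5 ratings is the Mann-Whitney U test; the sketch below computes the U statistic for two hypothetical rating samples (all values invented for illustration, not the study's data).

```python
# Hedged sketch: Mann-Whitney U statistic for comparing two sets of
# ordinal listener ratings, with ties counted as 0.5 as usual.

def mann_whitney_u(a, b):
    """U statistic for sample a versus sample b."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u


robot = [4, 4, 3, 5, 4]  # hypothetical expressiveness ratings
midi = [3, 2, 3, 3, 2]
u = mann_whitney_u(robot, midi)
# The maximum possible U is len(robot) * len(midi); a larger U
# indicates the first sample tends to be rated higher.
```

In practice one would obtain the p-value from the U distribution (or a library routine); the pairwise-comparison definition above is shown only to make the statistic concrete.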

4. CONCLUSION

In this paper, we proposed robot-assisted playing as a novel style of musical performance. We also developed modular devices, namely the sensory mouthpiece and the fingering modules, which can be attached to a traditional saxophone. Performance was measured and showed positive results. Through the evaluation experiments, we demonstrated the advantage of the proposed method: it allows a player to express him/herself with the acoustic sound of a reed instrument. Robot-assisted playing aims to support people who cannot play instruments correctly due to factors such as lack of skill or disability. It allows people to express their feelings and their sensitivity through musical expressiveness, which is one of the fundamental factors of musical performance.

Reduction of the small mechanical noise produced by the solenoids will be addressed in the next implementation. Further development will include a MIDI interface that controls the fingering modules from an external device and allows cooperation with other digital instruments. Evaluation of performances together with other human players, as well as with more quantitative measures such as speed and accuracy, is also planned for the near future.

5. REFERENCES

[1] A. Kapur, "A History of Robotic Musical Instruments," in Proc. of the International Computer Music Conference, 2005, pp. 21-28.

[2] G. Weinberg and S. Driscoll, "Toward Robotic Musicianship," Computer Music Journal, Vol. 30(4), 2006, pp. 28-45.

[3] K. Petersen, J. Solis, and A. Takanishi, "Development of the Waseda Flutist Robot No. 4 Refined IV: Implementation of a Real-Time Interaction System with Human Partners," in Proc. of IEEE RAS/EMBS Intl. Conf. on Biomedical Robotics and Biomechatronics, 2008, pp. 421-426.

[4] C. Oshima, K. Nishimoto, and M. Suzuki, "Family ensemble: a collaborative musical edutainment system for children and parents," in Proc. of Intl. Conf. on Multimedia, 2004, pp. 556-563.

[5] [Online] Handicap Recorder, http://www.earlymusic-tokyo.com/handicap/index.html (in Japanese)

[6] S. Yogo and H. Konno, "A Development of MIDI Recorder for Physically Handicapped People with One-Chip Microcomputer," in IEICE technical report, 2006, pp. 59-64. (in Japanese)

[7] [Online] Concert Hands, http://www.concerthands.com/

[8] S. Takashima and T. Miyawaki, "Control of an automatic performance robot of saxophone: Performance control using standard MIDI files," in Proc. of the IROS Workshop on Musical Performance Robots and Its Applications, 2006, pp. 30-35.
