Designing New Music Interaction Enhancing Sound Affordance
TRANSCRIPT
Ju Young Shin
Sungkyunkwan Univ., Dept. of Human ICT Convergence
Outline
• Introduction
• Project Goal
• Related Work
• Studying Gestures
• User-Centered Design
• Conclusion & Future Work
Introduction
• Ordinary people have developed an interest in DJing (disc jockeying)
• Companies have released various DJing applications for novices
- The sensors and technology embedded in a smartphone can perform these functions
Introduction
• Difficult for ordinary users to use without professional knowledge or manuals
• Find an effective interaction method for ordinary users who want to mix sounds
Project Goal
• Focus: a suitable interaction for music mixing
- Music mixing: the creation of new music by overlapping two or more musical performances
- Basic mixing equipment: two turntables, one mixer, and several records are required
Related Work
• SoundBounce (Dahl, L., 2010)
• Source: Dahl, L., & Wang, G., “SoundBounce: Physical metaphors in designing mobile music performance”
- Controls a synthesizer through motions such as hit or throw
- Uses the accelerometer and compass sensors
[Figures: the “hit” gesture; throwing a sound]
Related Work
• MoBall (Rasamimanana, N., 2011)
• Source: Rasamimanana, N., Bevilacqua, F., Schnell, N., Guedy, F., Flety, E., Maestracci, C., ... & Petrevski, U., “Modular musical objects towards embodied control of digital music”
- A tangible interaction ball whose musical parameters change according to action/motion
- Changes the music according to user movement
[Figures: the “hit” gesture; using MoBall]
Studying Gestures
• Gestures on a smartphone can be divided into three types:
- Touch gesture
- Motion gesture
- Touch-motion gesture
Studying Gestures
• Touch Gesture: a finger contact or rub on a device’s touch panel
• Source: Villamor, C., Willis, D., & Wroblewski, L., “Touch Gesture Reference Guide”

Tap: touch the screen with a finger
Double tap: touch the screen twice with a finger
Drag: move a finger over the screen
Flick: brush the screen with a finger
Pinch: touch the screen with two fingers and bring them closer
Spread: touch the screen with two fingers and move them apart
Press: touch the screen and hold
Press and tap: touch and hold with one finger, then tap with a second finger
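These touch gestures map almost directly onto the callbacks of Android’s built-in GestureDetector. The following is a minimal, illustrative sketch, not the study’s code, assuming a recent Android SDK (the nullability of the first MotionEvent parameter varies across SDK versions); pinch and spread would additionally need a ScaleGestureDetector.

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View

// Illustrative sketch: the table's tap / double tap / drag / flick / press
// gestures expressed as Android GestureDetector callbacks.
class TouchGestureHandler(context: Context, view: View) {
    private val detector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        // Must consume ACTION_DOWN so the later events are delivered.
        override fun onDown(e: MotionEvent) = true

        override fun onSingleTapConfirmed(e: MotionEvent) = true   // tap

        override fun onDoubleTap(e: MotionEvent) = true            // double tap

        // Drag: the finger moves over the screen; dx/dy are the distance moved.
        override fun onScroll(e1: MotionEvent?, e2: MotionEvent, dx: Float, dy: Float) = true

        // Flick: a quick brush; vx/vy are the release velocity in px/s.
        override fun onFling(e1: MotionEvent?, e2: MotionEvent, vx: Float, vy: Float) = true

        override fun onLongPress(e: MotionEvent) { /* press: touch and hold */ }
    })

    init {
        view.setOnTouchListener { _, event -> detector.onTouchEvent(event) }
    }
}
```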
Studying Gestures
• Motion Gesture: a physical movement of the device, detected through the various sensors embedded in a smartphone
• Source: Ruiz, J., Li, Y., & Lank, E., “User-Defined Motion Gestures for Mobile Interaction”

Tilt: tilt the device left, right, up, or down
Flick: quickly move the device left, right, up, or down and return
Push-pull: push the device away from, or pull it toward, the body
Twist: turn the device in another direction
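As a rough illustration of how such motion gestures can be sensed, the sketch below reads the accelerometer through Android’s SensorManager; the roll-angle formula and the flick threshold are assumptions for illustration, not values from the study.

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.atan2
import kotlin.math.sqrt

// Illustrative sketch: detecting "tilt" and "flick" motion gestures
// from the accelerometer.
class MotionGestureHandler(private val sensorManager: SensorManager) : SensorEventListener {

    fun start() {
        val accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME)
    }

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = event.values
        // Tilt: left/right roll angle estimated from the gravity direction.
        val rollDeg = Math.toDegrees(atan2(x.toDouble(), z.toDouble()))
        // Flick: a brief acceleration spike well above gravity (~9.8 m/s²).
        val magnitude = sqrt((x * x + y * y + z * z).toDouble())
        if (magnitude > 18.0) { /* treat as a flick; direction from the sign of x/y */ }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}
```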
Studying Gestures
• Touch-Motion Gesture: a touch gesture and a motion gesture combined
- Can be used effectively when executing a special function
• Source: Hinckley, K., & Song, H., “Sensor synaesthesia: touch in motion, and motion in touch”
- Ex. Tilt up/down while touching a button
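A minimal sketch of that example, assuming the tilt angle comes from motion-sensing code like the handler above: the motion channel is only acted on while the button is touched. The class name and the 15° threshold are hypothetical.

```kotlin
// Illustrative sketch: a touch-motion gesture where tilt is only
// interpreted while a button is being touched.
class TouchMotionGesture {
    @Volatile
    var buttonPressed = false  // set/cleared from the button's touch listener

    // Called with the roll angle from the motion-sensing code.
    fun onTilt(rollDeg: Double) {
        if (!buttonPressed) return  // motion alone triggers nothing
        when {
            rollDeg > 15.0 -> { /* tilt up while touching: e.g. increase an effect */ }
            rollDeg < -15.0 -> { /* tilt down while touching: e.g. decrease it */ }
        }
    }
}
```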
User-Centered design
• First Experiment: Survey
- To determine the needs of ordinary users for mixing
- To discover the feature effects for sound mixing
• Second Experiment: Participatory Design
- 5 tasks about sound mixing effects
- Participants were asked to express whatever gesture came to mind, without limitation
- Participants’ actions and words were recorded to gather qualitative results
User-Centered design
• Third Experiment: Design Prototype User Test
- Implement an application incorporating the new interaction design elements
- Conduct usability testing with 3 types:
  A-Type: Touch only
  B-Type: Touch + Motion
  C-Type: Motion only
- Usability defined after “Iterative User Interface Design” (Jakob Nielsen):
  1) Ease of operation
  2) Efficient to use
  3) Pleasant to use
User-Centered design - Result
• Survey result
- Five feature effects emerged from the data (38 people in total); a code sketch of two of them follows the table

Crossfader: a function that controls the channel (music) coming out from the two sides
Loop: a function that enables a player to save a particular selection of music and replay it
Flanger: an effect produced by inserting a tiny delay in the sound
Scratch: an effect of artificially damaging beats
VCF: a cut-off effect for a particular selection of sound (experts call it a stepping sound)
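To make the crossfader and flanger concrete, here is an illustrative audio sketch, not the study’s implementation: an equal-power crossfade between two decks, and a flanger built from the “tiny delay” described in the table, swept by a slow LFO. The function names, the 3 ms delay, and the 0.25 Hz sweep rate are assumptions.

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.sin

// Crossfader: equal-power blend between two decks.
// position in [0, 1]: 0 = only deck A, 1 = only deck B.
fun crossfade(deckA: FloatArray, deckB: FloatArray, position: Float): FloatArray {
    val gainA = cos(position * PI / 2).toFloat()
    val gainB = sin(position * PI / 2).toFloat()
    return FloatArray(deckA.size) { i -> gainA * deckA[i] + gainB * deckB[i] }
}

// Flanger: mix the signal with a copy delayed by a tiny, slowly varying amount.
fun flanger(input: FloatArray, sampleRate: Int): FloatArray {
    val out = FloatArray(input.size)
    val maxDelay = (0.003 * sampleRate).toInt()                  // ~3 ms "tiny delay"
    for (i in input.indices) {
        val lfo = (sin(2 * PI * 0.25 * i / sampleRate) + 1) / 2  // 0.25 Hz sweep
        val d = (lfo * maxDelay).toInt()
        val delayed = if (i - d >= 0) input[i - d] else 0f
        out[i] = 0.5f * (input[i] + delayed)                     // dry + wet mix
    }
    return out
}
```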
User-Centered design - Result
• Participatory Design result
- UI designers & engineers (19 participants)
- Question: “Is there any gesture you can think of after listening to this effect?”
- Participants answered, “There is a gesture that comes to mind,” and expressed the five sound effects successfully
User-Centered design - Result
• Participatory result - Effect: Crossfader

Responses | Gesture type | Description
5 or more | Motion | A tilt down & up
3 or more | Motion | A flick up & down
3 or more | Touch | Tap the left & right sides in turn
User-Centered design - Result
• Participatory result - Effect: Loop

Responses | Gesture type | Description
5 or more | Touch | Tap according to the beats
3 or more | Touch | Draw a circle
3 or more | Motion | A flick up & down
User-Centered design - Result
• Participatory result - Effect: Flanger

Responses | Gesture type | Description
5 or more | Touch | Pinch & spread
Less than 3 | Touch | A drag up & down
Less than 3 | Touch | Touch and hold for a few seconds
User-Centered design - Result
• Participatory result - Effect: Scratch

Responses | Gesture type | Description
5 or more | Touch | Flick up & down
Less than 3 | Motion | Shake the device
Less than 3 | Touch | Draw a circle
User-Centered design - Result
• Participatory result - Effect: VCF

Responses | Gesture type | Description
5 or more | Touch | Place a hand over & cover the device screen
3 or more | Touch | Rotate the device 180 degrees
Less than 3 | Touch | A push-pull motion toward the user
User-Centered design - Result
• Interface Design
1. A-Type: Touch only
[Figure: A-Type interface layout with numbered regions]
1 Crossfader: swipe button
2 Scratch: swipe disk
3 Loop: touch button
4 Effect (VCF, Flanger): touch button
User-Centered design - Result
• Interface Design
2. B-Type: Touch + Motion
[Figure: B-Type interface layout with numbered regions]
1 Crossfader: swipe button
2 Scratch: swipe disk
3 Loop: touch button
4 Effect (VCF, Flanger): VCF - move up and down; Flanger - draw a figure “8”
User-Centered design - Result
• Interface Design
3. C-Type: Motion only
[Figure: C-Type interface layout]
Crossfader: tilt left & right (a sketch of this mapping follows)
Loop: flip up & down
Scratch: tilt up & down
Effect (VCF, Flanger): VCF - move up and down; Flanger - draw a figure “8”
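As an illustration of the C-Type crossfader mapping, the sketch below converts the device’s left/right roll angle into a crossfader position; the ±45° working range is an assumption, not a value from the study. The result can be fed into a crossfade routine like the one sketched in the survey results.

```kotlin
// Illustrative sketch: map the device's left/right tilt (roll, in degrees)
// to a crossfader position. -45° -> 0 (deck A only), +45° -> 1 (deck B only).
fun rollToCrossfaderPosition(rollDeg: Double): Float {
    val clamped = rollDeg.coerceIn(-45.0, 45.0)
    return ((clamped + 45.0) / 90.0).toFloat()
}
```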
User-Centered design - Result
• User test overview
- Focuses on usability: the question of how well users can use the functionality

Test object: comparison of usability across the sound mixing prototypes; usability defined by three quality attributes (ease of operation, efficient to use, pleasant to use)
Method: task analysis & interview, using a smartphone
Target: 7 smartphone users, ages 20~30 (4 users from the prior examination, 2 UI designers, 1 engineer)
Period: 2015.10.10~2015.10.13, 30 minutes per session
User-Centered design - Result
• User test overview
User-Centered design - Result
• User test: measured values
- Measured with questionnaires and rating scales
- Scale from 1 (poor) to 5 (excellent)

No. | Value | Question
1 | Ease of operation | How comfortable was the given prototype to operate?
2 | Efficient to use (intuitive) | How efficient to use was the sound mixing prototype?
3 | Pleasant to use | How satisfying was the sound mixing prototype to use?
4 | Comparison of interfaces | Which interface is better: A-, B-, or C-Type?
5 | Open feedback | Please write freely about improvements you would hope for.
User-Centered design - Result
• User test - Result & Analysis
- Ease of operation: A-Type 72% (5 users) and B-Type 42% (3 users) gave the top rating
- A (Touch) and B (Touch + Motion) are easy to operate

Ease of operation (respondents and ratio per scale value)
Scale | A-Type | B-Type | C-Type
1 | 0 (0%) | 0 (0%) | 0 (0%)
2 | 0 (0%) | 0 (0%) | 2 (28%)
3 | 1 (14%) | 1 (14%) | 4 (57%)
4 | 1 (14%) | 3 (42%) | 0 (0%)
5 | 5 (72%) | 3 (42%) | 1 (14%)
User-Centered design - Result
• User test - Result & Analysis
- Efficient to use: A-Type 57% (4 users) gave the top rating
- A (Touch) is efficient to use

Efficient to use (respondents and ratio per scale value)
Scale | A-Type | B-Type | C-Type
1 | 0 (0%) | 0 (0%) | 0 (0%)
2 | 0 (0%) | 0 (0%) | 2 (28%)
3 | 0 (0%) | 2 (28%) | 3 (42%)
4 | 3 (43%) | 3 (42%) | 1 (14%)
5 | 4 (57%) | 2 (28%) | 1 (14%)
User-Centered design - Result
• User test - Result & Analysis
- Pleasant to use: A-Type 85% (6 users) gave the top rating
- A (Touch) is pleasant to use

Pleasant to use (respondents and ratio per scale value)
Scale | A-Type | B-Type | C-Type
1 | 0 (0%) | 0 (0%) | 0 (0%)
2 | 0 (0%) | 0 (0%) | 2 (28%)
3 | 1 (14%) | 4 (57%) | 1 (14%)
4 | 0 (0%) | 1 (14%) | 4 (57%)
5 | 6 (85%) | 2 (28%) | 0 (0%)
User-Centered design - Result
• User test - Result & Analysis
- Comparison of interface usability across A-, B-, and C-Type
[Figure: bar chart (scale 0~5) comparing A-Type (Touch), B-Type (Touch + Motion), and C-Type (Motion) on ease of operation, efficient to use, pleasant to use, and best interface]
Conclusion
• Qualitative user feedback
- We were mainly interested in user feedback, which provided great insights into how the interaction felt
- The touch-based UI was assessed very positively
- Participants also indicated that the touch + motion based UI sometimes felt interesting when mixing sounds
- However, the touch + motion based UI was in general described as complicated and tiresome, as it required more physical effort than the touch gesture techniques
- Participants explained that they did not like the motion-only UI, because coordinating tilting and drawing was too complicated
Conclusion
• Easy and intuitive interaction is required
- We wanted to find out how users would enjoy and assess the three gesture types, using a sound mixing system as the example
- Described three different sound mixing modalities on a modern smartphone: (1) a touch-only UI, (2) a touch + motion UI, and (3) a motion-only UI
- The gestures were designed with a participatory design method
- The three UI types were tested in a user study with seven participants
- Touch and touch-motion gestures were assessed very positively by the participants
- These results are encouraging for further advancing sound-mixing-support techniques
Thank you
References
[1] Bakker, S., Antle, A. N., & Van Den Hoven, E. (2012). Embodied metaphors in tangible interaction design. Personal and Ubiquitous Computing, 16(4), 433-449.
[2] Antle, A. N., Droumeva, M., & Corness, G. (2008, June). Playing with the Sound Maker: do embodied metaphors help children learn? In Proceedings of the 7th International Conference on Interaction Design and Children (pp. 178-185). ACM.
[3] Song, W., Xi, Y., Ikram, W., Cho, S., Cho, K., & Um, K. (2014). Design and implementation of a web camera-based natural user interface engine. In Future Information Technology (pp. 497-502). Springer Berlin Heidelberg.
[4] Beamish, T., Maclean, K., & Fels, S. (2004, April). Manipulating music: multimodal interaction for DJs. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 327-334). ACM.
[5] Dahl, L., & Wang, G. (2010). SoundBounce: physical metaphors in designing mobile music performance. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), Sydney, Australia.
[6] Rasamimanana, N., Bevilacqua, F., Schnell, N., Guedy, F., Flety, E., Maestracci, C., ... & Petrevski, U. (2011, January). Modular musical objects towards embodied control of digital music. In Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction (pp. 9-12). ACM.
[7] Gibbs Jr, R. W. (2014). Embodied metaphor. The Bloomsbury Companion to Cognitive Linguistics, 167.
[8] Boehm, J. (2012). Natural user interface sensors for human body measurement. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci, 39, B3.
[10] Boulos, M. N. K., Blanchard, B. J., Walker, C., Montero, J., Tripathy, A., & Gutierrez-Osuna, R. (2011). Web GIS in practice X: a Microsoft Kinect natural user interface for Google Earth navigation. International Journal of Health Geographics, 10(1), 45.
[11] Noessel, C., "Cooper", http://www.cooper.com/journal/2013/05/summoning-the-next-interface-agentive-tools-sauna-technology, last accessed April 5, 2015.
[12] Villamor, C., Willis, D., & Wroblewski, L. (2010). Touch gesture reference guide.
[13] Joselli, M., & Clua, E. (2009, October). gRmobile: a framework for touch and accelerometer gesture recognition for mobile games. In Games and Digital Entertainment (SBGAMES), 2009 VIII Brazilian Symposium on (pp. 141-150). IEEE.
[14] Hinckley, K., & Song, H. (2011, May). Sensor synaesthesia: touch in motion, and motion in touch. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 801-810). ACM.
[15] Hinrichs, U., & Carpendale, S. (2011, May). Gestures in the wild: studying multi-touch gesture sequences on interactive tabletop exhibits. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 3023-3032). ACM.
[16] Shin, D. H. (2012). 3DTV as a social platform for communication and interaction. Information Technology & People, 25(1), 55-80.
[17] Izhaki, R. (2013). Mixing Audio: Concepts, Practices and Tools. Taylor & Francis.
[18] Radkowski, R., & Stritzke, C. (2012, January). Interactive hand gesture-based assembly for augmented reality applications. In ACHI 2012, The Fifth International Conference on Advances in Computer-Human Interactions (pp. 303-308).
[19] Steventon, J. (2014). DJing for Dummies. John Wiley & Sons.
[20] Ruiz, J., Li, Y., & Lank, E. (2011, May). User-defined motion gestures for mobile interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 197-206). ACM.