

    Wiinote - Musical Interface for the Wii Remote.

    Louis Fellows

    Bachelor of Science in Computer Science with Honours

    The University of Bath

    April 2009


This dissertation may be made available for consultation within the University Library and may be photocopied or lent to other libraries for the purposes of consultation.

    Signed:


    Declaration

    Submitted by: Louis Fellows

    COPYRIGHT

Attention is drawn to the fact that copyright of this dissertation rests with its author. The Intellectual Property Rights of the products produced as part of the project belong to the University of Bath (see http://www.bath.ac.uk/ordinances/#intelprop). This copy of the dissertation has been supplied on condition that anyone who consults it is understood to recognise that its copyright rests with its author and that no quotation from the dissertation and no information derived from it may be published without the prior written consent of the author.

    Declaration

This dissertation is submitted to the University of Bath in accordance with the requirements of the degree of Bachelor of Science in the Department of Computer Science. No portion of the work in this dissertation has been submitted in support of an application for any other degree or qualification of this or any other university or institution of learning. Except where specifically acknowledged, it is the work of the author.

    Signed:


    Abstract

The aim of this project is to create an interface whereby the Wiimote can be used as an instrument to create music. The idea allows for a brief exploration of the properties of the Wiimote whilst providing a system with multiple methods of creating sound using all the input methods available within the Wiimote, with an attempt to keep the Wiimote and the system as separate as possible by allowing the system to be controlled remotely.


    Contents

    1 Introduction 1

1.1 Problem Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1

1.2 Aims . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1

    1.3 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2

    1.3.1 Functional Requirements . . . . . . . . . . . . . . . . . . . . . . . . 2

    1.3.2 Non-Functional Requirements . . . . . . . . . . . . . . . . . . . . . . 3

    1.4 Project Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

    1.4.1 The System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

    1.4.2 Required Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . 4

    1.4.3 Gantt Chart . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5

    2 Literature Survey 6

    2.1 Gesture Capture Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

    2.1.1 Hidden Markov Models (HMM) . . . . . . . . . . . . . . . . . . . . . 7

    2.1.2 Conditional Random Fields (CRF) . . . . . . . . . . . . . . . . . . . 8

    2.2 Wii Remote Connection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

2.2.1 Java . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10


    3 Design 14

    3.1 Features vs Playability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

    3.2 UI Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

    3.3 Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

    3.4 Gestures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

    3.5 Instruments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

    4 Requirements 19

    4.1 Functional Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

    4.1.1 System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

    4.1.2 Instruments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

    4.1.3 Gesture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

    4.1.4 Sound . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22

    4.2 Non-Functional Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . 22

    5 Implementation and Testing 24

    5.1 Gesture Recognition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24

    5.1.1 Recogniser Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24

    5.1.2 Implementation in Java . . . . . . . . . . . . . . . . . . . . . . . . . 27

    5.1.3 New Gestures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29

    5.2 Instruments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

    5.2.1 Pitch/Roll Instrument . . . . . . . . . . . . . . . . . . . . . . . . . . 31

    5.2.2 Flute Instrument . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

    5.2.3 IR Instrument . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

    5.3 Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32


    5.6 Instrument Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

    5.6.1 Results Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

    6 Conclusions 42

    6.0.1 Further Developments . . . . . . . . . . . . . . . . . . . . . . . . . . 43

    A Design Diagrams 47

    B Raw results output 49

    B.1 User Questionnaires . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49

    C A Final View of the System Requirements 57

    D Code 59

    D.1 wiinote.engine.ListenerFlute.java . . . . . . . . . . . . . . . . . . . . . . . . 60

    D.2 wiinote.engine.ListenerLeds.java . . . . . . . . . . . . . . . . . . . . . . . . . 60

    D.3 wiinote.engine.ListenerPitchRoll.java . . . . . . . . . . . . . . . . . . . . . . 60

    D.4 wiinote.engine.MidiOut.java . . . . . . . . . . . . . . . . . . . . . . . . . . . 60

    D.5 wiinote.engine.MWProcess.java . . . . . . . . . . . . . . . . . . . . . . . . . 62

    D.6 wiinote.engine.Wiinote.java . . . . . . . . . . . . . . . . . . . . . . . . . . . 66

    D.7 wiinote.gesture.AccDirectionObject.java . . . . . . . . . . . . . . . . . . . . 66

    D.8 wiinote.gesture.AccelerationArray.java . . . . . . . . . . . . . . . . . . . . . 67

D.9 wiinote.gesture.ConvArray.java . . . . . . . . . . . . . . . . . . . . . . . . . 68

D.10 wiinote.gesture.GestureObject.java . . . . . . . . . . . . . . . . . . . . . . . 69

    D.11 wiinote.gesture.GestureRecognisedEvent.java . . . . . . . . . . . . . . . . . 69

    D.12 wiinote.gesture.GestureRecogniser.java . . . . . . . . . . . . . . . . . . . . . 70

D.13 wiinote.gesture.ListenerGestureCapture.java . . . . . . . . . . . . . . . . . 71


    List of Figures

    1.1 A Wii Remote . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2

    1.2 Gantt Chart Showing the Planned Timing of Work Throughout the Project 5

    2.1 An example Hidden Markov Model . . . . . . . . . . . . . . . . . . . . . . . 7

    3.1 The Wiinote User Interface. . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

3.2 The Message Window (left) and the Note Window (right) . . . . . . . . . 16

3.3 The two proposed IR instruments. Idea 1 (left) and the chosen method, idea 2 (right) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18

    5.1 A Visual Representation of the Array Condensing Procedure . . . . . . . . 25

    5.2 A Visual Representation of the Gesture Recognition Tree . . . . . . . . . . 26

5.3 Diagram showing the 9 directions recognised in the system (left) and a possible set of 15 directions for use in 3-D space (right) . . . . . . . . . . . 27

5.4 Diagram showing how the contents of an AccelerationArray object represent a gesture before and after calling the function removeLikeMotions() . . . . 28

    5.5 Buttons used with the Flute Instrument . . . . . . . . . . . . . . . . . . . . 32

    5.6 The Measurements Taken by the IR instrument . . . . . . . . . . . . . . . . 33

5.7 The IR Drumsticks (left) Along with a Circuit Diagram of their Design (right) 34


    A.1 UML Class Diagram of the Gesture Recognition Package . . . . . . . . . . . 48


    List of Tables

    4.1 Table detailing the requirements covered in the system section. . . . . . . . 20

    4.2 Table detailing the requirements covered in the instruments section. . . . . 21

    4.3 Table detailing the requirements covered in the gesture section. . . . . . . . 21

    4.4 Table detailing the requirements covered in the sound section. . . . . . . . . 22

4.5 Table detailing the requirements covered in the non-functional requirements section. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23

    4.6 A Summary of all the requirements of the system. . . . . . . . . . . . . . . 23

    5.1 Flute instrument Button Combinations . . . . . . . . . . . . . . . . . . . . 32

5.2 Successful gestures by number of repeat teachings for a 1-sided gesture . . . 36

5.3 Successful gestures by number of repeat teachings for a 2-sided gesture . . . 36

5.4 Successful gestures by number of repeat teachings for a 3-sided gesture . . . 36

5.5 Successful gestures by number of repeat teachings for a 4-sided gesture . . . 37

B.1 Table Containing Chris' Questionnaire Results . . . . . . . . . . . . . . . . 50

B.2 Table Containing Paul's Questionnaire Results . . . . . . . . . . . . . . . . 51

B.3 Table Containing Dan's Questionnaire Results . . . . . . . . . . . . . . . . 52

B.4 Table Containing Ellie's Questionnaire Results . . . . . . . . . . . . . . . . 53


    Acknowledgements

I'd like to acknowledge:

    Liz, my Mum, for all the support,

    Chris for all the advice,

    and Ellie for keeping me sane! (Well, as sane as I was to start with!).


    Chapter 1

    Introduction

    1.1 Problem Description

Musical instruments have existed for many thousands of years and have defined cultures since the dawn of time. New musical instruments are still being created to this day, with notable modern examples such as the electric guitar and the digital keyboard.

Most modern musical instruments are based heavily on classical examples (such as the aforementioned guitar and keyboard). However, advances in technology have allowed people to experiment with many weird and wonderful ways of creating music. One of the most prominent of these is the Theremin, one of the earliest electronic instruments, played by waving one's hands around two radio antennas, which in turn creates an often eerie, electronic sound.

Recent times have also brought a wave of new human-computer interaction methods such as touchscreens and voice recognition. One of the more novel examples of HCI is Nintendo's Wii Remote (known informally as the Wiimote), used to control the Wii games console. It contains a three-axis linear accelerometer which can capture movements, allowing users to interact with their games based on their movements.

The Wiimote connects to the Wii console using Bluetooth. This means that a Wiimote can also be connected to any system with Bluetooth (such as a PC) to provide accelerometer data.


    Figure 1.1: A Wii Remote

    1.3 Objectives

The computer will need to take the raw data from the Wiimote, make the data useful to the system and then use that data to decide what should be played. There should also be some method of gesture recognition to control the system functionality.

For this to be viable as a musical instrument, this data refining and gesture recognition must happen in real time, with a sufficiently high success rate to be reliable whilst giving a musical performance. A musical instrument that cannot be performed is not a particularly useful instrument.

The gestures should also be customizable, to allow a user to perform whatever movement seems the most comfortable to them to perform an action. The system should also output in a format that can be used by many different systems and thus give the possibility of extending the system at a later date.

    Taking all this into account, below are the high level requirements of the system:

    1.3.1 Functional Requirements

The system must be able to take the input from a Wiimote connected by Bluetooth in such a way that it is useful to the system.

The system must be able to recognise a gesture made by the user with the Wiimote.


    1.3.2 Non-Functional Requirements

The system must work in real time to make giving a musical performance using the Wiimote possible.

The gesture recognition must be sufficiently accurate at judging gestures that the system can be reliably used to give a performance.

The system should be able to play any song, given that the user has had enough training/practice (i.e. the system should act as an actual instrument, rather than a toy).

    1.4 Project Plan

    1.4.1 The System

    The completed system will consist of 4 main sections. These are:

    Raw Data Gathering

This is the initial section of the system. Its purpose is to take a snapshot of the raw data from the Wiimote and hand it to the system. There are already libraries available that are designed for this task which can be used for this project, therefore making this section the easiest to complete.

    Music Generation

This section takes the raw input from the section above and uses it to create sound. This would be done by taking the inputs and running an algorithm that generates a note and any properties of the note that are needed. These details are then passed to the next section.

    Midi Synthesis


My initial reading has determined a number of methods which will attempt to decipher the gestures from the data. This will also have to be done reliably and in real time.

    1.4.2 Required Resources

Below is a list of resources that I foresee needing to successfully complete the project.

    Software Resources

Libraries to communicate with and retrieve raw data from the Wiimote. These are available freely on the Internet.

Libraries to communicate with MIDI systems. These are also available freely on the Internet.

A compiler for the C, C++ or Java language, whichever becomes the more obvious choice after the literature survey. Currently the most obvious language would seem to be Java. Compilers for these languages are available on the library computers.

    Hardware Resources

A Wiimote. I own two of these, so they're easily available.

A Bluetooth connection. My computer has one of these, so it is already available.

A computer for software creation and write-up of the dissertation. I have one of these and there are several available in the library.

    Literature Resources

    Papers regarding gesture capture

    Papers regarding projects dealing with accelerometers


    1.4.3 Gantt Chart


    Chapter 2

    Literature Survey

There are many areas of this project that need to be investigated before any major decisions can be made. The title of the project leaves a large scope for creation of a system, and whilst simply jumping in and seeing what happens could work well for building this system, it leaves the possibility of suffering the same problems and making the same errors as found by those who have attempted similar endeavours in the past. As Konrad Adenauer said, "History is the sum total of things that could have been avoided".

This literature survey, then, is to take the ideas of the project proposal, explore the possibilities within them and discover the methods by which the project can progress in the most successful fashion possible.

I have split this document into three main parts: the first looks at various methods of capturing gestures made by a user; the second is a brief look at the various available libraries which exist to connect with a Wii Remote, in order to decide which to use to complete the project; and the third is a look at musical instruments and how they can be represented in a computer system.

    2.1 Gesture Capture Methods

The ability to recognise human movement with a computer system has been experimented with for many years (Moeslund et al., 2001; Moeslund et al., 2006).

A method receiving much attention is gesture capture using accelerometers (Pylvänäinen, 2005; Hollar, Perng and Pister, 2000; Sakaguchi et al., 1996). Accelerometers are instruments that measure the rate at which the velocity of an object is changing (Accelerometer: Encyclopedia Britannica Online, 2009). This can be used to measure an object's movement in 3D space. With the Wiimote, Nintendo has introduced accelerometers into its controllers, allowing its users to control onscreen actions by moving the controller. The development of these cheap, widely available accelerometer-based systems has allowed accelerometers to be used for far wider purposes than originally possible.

There are several methods of recognising a gesture captured by a system. During my research, I learned of some of these methods, which are described below:

    2.1.1 Hidden Markov Models (HMM)

Hidden Markov Models (HMMs) are a generative approach to gesture capture (Morency, 2007) and can be used when a system of states can be described as a Markov process with unknown parameters. Given a set of observable states, an HMM can calculate the most likely sequence of hidden state transitions from a sequence of the observed states.


An HMM consists of a number of hidden states, with a matrix containing the probabilities of transition between each state, along with a start vector containing the probabilities of starting at each state. The status of these hidden states is unknown by the system. In addition to these hidden states are a number of observable states, along with a matrix containing the probability that the system is in a certain hidden state when an observed state is seen.

    If a system can be described with an HMM then three separate problems can be solved.

Firstly, given a sequence of observed states, the most likely sequence of hidden states can be calculated (known as decoding).

Secondly, given the HMM, the probability of a sequence of observed states occurring can be calculated (known as evaluation).

Finally, we can generate an HMM given a sequence of observations (known as learning).

To use this to recognise gestures, we would use the learning algorithm to create a Hidden Markov Model for each of the gestures that could be recognised. Then, when a gesture is performed, we can run an evaluation of the sequence of observed states against each of the HMMs in the system; the HMM returning the highest probability is the most likely to be the gesture the user was performing. Knowing this, we can perform the action related to the gesture.

HMMs are popular because they are simple and flexible (Murphy, 2002). They have been used extensively in speech recognition due to their ability to model speech in a mathematically tractable way (Warakagoda, 1996). Along with speech recognition, they have also been used in gesture recognition quite successfully (Pylvänäinen, 2005).
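As an illustration of this recognition scheme, the following minimal sketch shows a bank of trained HMMs classifying a gesture by evaluation; the Hmm interface and its evaluate() method are hypothetical stand-ins for a real HMM implementation, not code from this project:

    import java.util.Map;

    // Hypothetical stand-in for a trained Hidden Markov Model.
    interface Hmm {
        // Solves the "evaluation" problem: P(observations | this model).
        double evaluate(int[] observedStates);
    }

    class HmmGestureClassifier {
        private final Map<String, Hmm> modelsByGesture;

        HmmGestureClassifier(Map<String, Hmm> modelsByGesture) {
            this.modelsByGesture = modelsByGesture;
        }

        // Evaluate the observed sequence against every gesture's HMM and
        // return the gesture whose model gives the highest probability.
        String classify(int[] observedStates) {
            String bestGesture = null;
            double bestProbability = Double.NEGATIVE_INFINITY;
            for (Map.Entry<String, Hmm> entry : modelsByGesture.entrySet()) {
                double p = entry.getValue().evaluate(observedStates);
                if (p > bestProbability) {
                    bestProbability = p;
                    bestGesture = entry.getKey();
                }
            }
            return bestGesture; // null when no models have been learnt
        }
    }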

    2.1.2 Conditional Random Fields (CRF)

Conditional Random Fields (CRFs) are a discriminative (as opposed to generative, like the HMMs above) method of gesture recognition (Wang, Quattoni, Morency, Demirdjian and Darrell, 2006). A CRF uses a discriminative sequence model with a hidden state structure (Wang et al., 2006) and attempts to model the entire sequence with the given input sequence.


The graph can be laid out in any arbitrary manner; however, it is more often set in a chain of vertices with an edge between each Yi-1 and Yi vertex. This layout allows the use of efficient algorithms for solving the three following problems:

Calculating the most probable label sequence Y given X (known as decoding).

Generating conditional distributions between vertices and functions from training data (known as training).

Calculating the probability of a given sequence Y occurring given X (known as inference).

Here it is useful to note the similarities of the solutions between a CRF and an HMM. The same method of using these solutions can be applied to a CRF to determine the gesture attempted by the user from the observed data.

According to Wallach (2004), the advantage of CRFs as opposed to HMMs is in their conditional nature, which allows the relaxation of the independence assumptions which are required in HMMs.

    The CRF method has been extended further into the two following methods:

    Dynamic CRFs

Dynamic CRFs (DCRFs) are an extension of the CRF method whose structure and parameters are repeated over a sequence (Morency, 2007). When using hidden variables with this system, the system becomes difficult to optimise (Morency, 2007).

    Latent Dynamic CRFs

Latent Dynamic CRFs (LDCRFs) are an extension of the CRF and DCRF methods which attempts to incorporate hidden fields and sub-structures to better recognise gestures (Morency, 2007). The original testing was based on video input from a camera which attempted to recognise the movements of a human subject. This method was seen to compare favourably to other methods such as HMMs in the paper by Morency et al. (Morency, 2007).


2.2 Wii Remote Connection

2.2.1 Java

Java would be my preferred language to attempt the project with; I have used it extensively in the past and this experience would mean less work than learning a new language. There are three possible libraries for use with Java: WiiRemoteJ, Wiimote-Simple and WiiuseJ.

    WiiRemoteJ

WiiRemoteJ is a library written in Java and is currently hovering between beta and full release. There is very little documentation supporting it to be found on the net and no apparent official home for the system, although many videos and images can be found displaying some of its functionality (WiiRemoteJ Technical Demo, 2009). Attempting to run the demo of WiiRemoteJ on my system caused numerous errors and did not work, making it unfit for use within this system. It has been mentioned here, however, as it is considered one of the more feature-rich libraries for the Wiimote.

    Wiimote-Simple

    http://code.google.com/p/Wiimote-simple/

Wiimote-Simple is a library designed as an alternative to WiiRemoteJ. It is open source; however, as the designer mentions, it has less functionality than WiiRemoteJ, but is offered as an alternative for people who could not get WiiRemoteJ to work.

It does provide the ability to read accelerometer data and IR data and respond to button pushes, which should be enough to use the Wiimote to interface with the system. As with WiiRemoteJ, there is little documentation available for the implementation, and it is also not as well supported as WiiRemoteJ.

    WiiuseJ

    http://code.google.com/p/wiiusej/

WiiuseJ is another library written in Java. It provides a Java API for the wiiuse system.


    2.2.2 C++

I have never used C++ before, and this would mean that choosing any of the following libraries would mean learning the language from scratch. This puts the library in this section at a disadvantage; however, I will still summarise what I learnt of the system for completeness.

    WiiYourself

    http://wiiyourself.gl.tter.org/

WiiYourself is a library for C++ which is quite well featured and has been used in the development of several projects (details of which can be found on the developer's site).

The system seems well supported and works with most commercially available Bluetooth stacks, although there doesn't seem to be a great deal of documentation to accompany the system. WiiYourself currently only works on Windows, which may limit the abilities of any system created with it.

    2.2.3 C

The final language on the list is C. My knowledge of C is basic but functional, making C a viable choice for the development of the system. I discovered two C libraries during my research: CWiid and WiimoteAPI.

    CWiid

    http://abstrakraft.org/cwiid/

CWiid is a library developed for use in C and is quite full of features (some unique to this implementation, such as an IR tracker and an interface for Python). Like WiiuseJ and Wiimote-Simple it is open source and is well supported, including a roadmap of future features to be implemented, as well as bugfixes.

It is currently at version 0.6.00 (as of April 2009).


WiimoteAPI

WiimoteAPI provides access to the Wiimote's buttons, but seemingly not the accelerometers. The lack of accelerometer support greatly limits the feasibility of using this library.

There is a little documentation available, but not a great deal, and support for the system seems to have ended about two years ago (as of April 2009).

    2.2.4 Decision

From looking at the available options, it has become clear to me that, due to my proficiency with Java relative to the other available languages, I should be using a Java-based library. This leaves WiiRemoteJ, Wiimote-Simple and WiiuseJ.

Of these, as WiiRemoteJ is not compatible with the computer I'll be coding on, and given the very basic nature of Wiimote-Simple, the best choice of library would seem to be WiiuseJ, and as such, I will be using this for the development of the system.

    2.3 Musical Interfaces

As this project is the creation of a musical instrument, some time should be dedicated to the study of current musical instruments and how they create music. As such, I shall now briefly go through the ideas found in my reading on music.

    2.3.1 Physical Instruments

With most physical instruments the player creates sound by manipulating a part of the instrument which vibrates (NH Fletcher, 1998). This could be a string on a guitar, the air in a trombone, the skin of a drum etc. The note is altered by altering the vibrating part (holding a fret on a guitar, lengthening the tube of a trombone, tightening the skin on a drum).

This holds true for all but a few instruments; these few are instruments created in the electronic age and are not reliant on the player creating the vibration themselves (Glinsky, 1992). A major example of this is the Theremin, the world's first electronic instrument and the only instrument in the world played without physically touching it (Glinsky, 1992).


    act as the oscillators, and create the timbre of the note.)

    2.3.2 Synthetic Instruments

This means that the system will need some method of creating synthetic sounds. For this project this will be handled by an external process. The system will take the information from the Wii Remote and create from it the information it needs (gestures, note frequency, volume etc.), and this will be passed to a digital synthesizer which will use it to make sound which can be played out.

As I see it, there are two ways to achieve this: either have the external synthesizer written into the system such that it has the data passed directly into it, or have the system output to an intermediate representation which can be understood by the (or possibly several different) synthesizer(s).

Such an intermediary is MIDI. I feel that outputting via MIDI is the best idea for the system as it allows a far wider number of musical synthesis systems to be used at will. However, MIDI uses discrete codes for notes, which holds the instrument I create to certain notes instead of being able to play all possible frequencies. This is a limitation on the instrument, but may well make it easier to play (and to code!).
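As an illustration of this discreteness, MIDI identifies pitch by an integer note number; the standard 12-TET mapping (with A4 at 440 Hz assigned note 69) forces any frequency onto its nearest semitone, as the following sketch (not part of the system itself) shows:

    class MidiPitch {
        // Convert a frequency in Hz to the nearest MIDI note number,
        // using the standard 12-TET mapping with A4 (440 Hz) = note 69.
        static int nearestNoteNumber(double frequencyHz) {
            return (int) Math.round(69 + 12 * (Math.log(frequencyHz / 440.0) / Math.log(2)));
        }

        public static void main(String[] args) {
            System.out.println(nearestNoteNumber(440.0));  // 69 (A4)
            System.out.println(nearestNoteNumber(261.63)); // 60 (middle C)
            System.out.println(nearestNoteNumber(450.0));  // 69: 450 Hz is forced onto A4
        }
    }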


    Chapter 3

    Design

The design of the system created many challenges, and in this section I will attempt to discuss the decisions which led to the final system.

    3.1 Features vs Playability

One of the biggest issues throughout the development was the balancing of the feature-set of the system with its usability. From the outset, I wanted the system to behave as much like a musical instrument as possible. By this I envisioned the computer system as being as invisible as possible, thus making the journey between the Wiimote and the sounds produced perceivable as a single step, without any intermediate factors.

To do this effectively, it seemed to me that the software between the Wiimote and the MIDI output would have to have enough features within it that nothing could be seen as missing; however, it needed to be lightweight enough that the user could focus on the playing of the instrument and not on fiddling with options and settings within the system. This led to one of the largest questions of the project: "What features need to be in this system, and which should be removed?".

The set of features that I had originally envisioned was cut down to a subset which could be implemented and complement each other well without overburdening the user whilst they play.

3.2 UI Design

Figure 3.1: The Wiinote User Interface.

decision to limit the number of features (see section 3.1, Features vs Playability, above).

This decision, however, led to a further issue, which was how to display information from the system to the user without further complicating the interface, and in such a way that the user could ignore the messages if they needed to.

This led to the development of two new Java classes, the MessageWindow class and the NoteWindow class (see Fig 3.2). I chose to add these windows to the system as a method of presenting useful information to the user without interrupting the use of the system (for example with dialog boxes).

Figure 3.2: The Message Window (left) and the Note Window (right)

    3.3 Control

As the system was intended to perform as a musical instrument, it seemed that the system should be controllable from away from the computer. There were two ways this could be approached. The first was to map the system commands to different buttons upon the Wiimote and allow those options to be chosen by pressing the correct button. The second was to introduce a method of gesture recognition such that each gesture would trigger different options within the system.


The second option was chosen: the Wiimote remains free to be used as an instrument, and a much larger set of gestures could be added than there are buttons on a Wiimote.

One issue would be the removal of the accelerometers from the musical half of the system. This can be avoided by using the Nunchuk extension controller to interpret gestures. As such, the entire Wiimote was free to be used as an instrument, and using their off-hand, the user could control system functions whilst still performing using the Wiimote.

    3.4 Gestures

After the decision to use gesture recognition to issue system commands remotely, a new set of choices had to be made, the most significant of these being "Which method of gesture recognition should be implemented?". After taking a look at the domain of gesture recognition (see section 2.1 above), the choices were narrowed down to using Hidden Markov Models, due to their proven abilities for gesture recognition, or using a new method which was unproven.

Hidden Markov Models were the simpler choice: there were already implementations available in Java, along with documentation supporting both how to create and use them and their effectiveness in practice. They have also been used previously with Wiimote-based gesture recognition projects (Schlömer, Poppinga, Henze and Boll, 2008).

The new method was an idea I imagined whilst thinking about gesture capture. After researching the field it became apparent that there were no implementations of the method previously attempted. This option seemed more risky, as the implementation might not be a useful method of recognition (there may well be a reason why it has not been implemented before!). Notes on the implementation of this method can be found in the next chapter.

In the end, the decision was taken to go with the unproven method. The thinking behind this decision was that re-implementing an HMM would provide less worth than attempting a new method and observing its usefulness. The chance to create something new overtook the rewards for taking the secure option.

    3.5 Instruments


The second style used the acceleration of the Wiimote to determine its volume and the roll to determine the note. This second instrument proved too inaccurate when choosing a volume and was also far more tiring in practice, reducing the time it could be played. The first method was far less tiring and seemed more intuitive to play. Therefore, the first style of instrument was chosen to be implemented into the system.

The button input had fewer choices available to it (as any way of playing it will involve pressing the buttons to play notes). The only choice here was how to play notes using the buttons. There were not enough buttons to assign one to each note in a MIDI octave (as required by requirement FI02, table 4.2). This led to the idea of taking inspiration from wind instruments, which use a combination of different finger positions to reach different notes; thus a combination of different buttons is used to represent each note.

The third input method was the infra-red camera mounted on the front of the Wiimote. This picks up IR light sources and displays them as dots in an X-Y plane; these values are then passed to the system through the WiiuseJ library. There were many methods by which this could have been used to create music. The two foremost methods (see fig 3.3) were either to set up an array of IR LEDs which the Wiimote could be pointed at, used to measure the Wiimote's position in space and then map this value to a note (much like how the position of a musician's hand near a theremin leads to a note being produced), or to have the Wiimote in a static position and move IR sources in front of it to create music, with the position of each LED controlling different aspects of the music (such as volume, pitch etc).

In the end, the second idea was chosen. Both would require some manner of hardware creation (i.e. building an array of LEDs); however, the hardware for the second idea was far less complex, requiring only a cell and an LED to work (see circuit diagram, fig 5.7), with no capacitors or resistors. This simplicity made it far easier (and cheaper) to build two LED instruments than it would have been to build an array of LEDs for the Wiimote to look at.


    Chapter 4

    Requirements

The project contains a number of broad areas of work. As such, it seems appropriate to order the functional requirements of the system under these headings. The identified areas are:

    1. System Requirements

    2. Instrumental Requirements

    3. Gesture Requirements

    4. Sound Requirements

The following section will look at each of these areas separately in order to devise a set of requirements that the system must meet.

    4.1 Functional Requirements

    4.1.1 System

The system requirements are those that relate to the system as a whole and not to any of its individual parts.


A graphical user interface would present all options to the user without the need to learn a list of commands. It also opens the instruments up to less confident computer users.

A less important extension to the development of a user interface would be to display musical information to the user. A useful example of this would be to display the currently playing note (or MIDI note number etc.) to the user as they are playing, much like a digital tuner is used with real instruments. This could then be used to help users create a tune or develop a musical ear, expanding the possibilities of the system.

Whilst talking about the user interface, it would also be a useful extension to provide a window in which system events can be displayed to the user without interrupting their musical session. This would not be a high priority requirement; however, it would be a valuable addition to the system.

    ID Description

FSy01 The system should present the user with a graphical user interface
FSy02 The system should gather Wiimote data using WiiuseJ
FSy03 The system should inform the user of the note playing
FSy04 The system should inform the user of system developments

    Table 4.1: Table detailing the requirements covered in the system section.

    4.1.2 Instruments

The instruments section is a central section of the system, and much of the rest of the development is designed around it. In order to create a musical interface for the Wiimote, it is a high priority that the system delivers a method capable of supporting multiple musical instruments; by this it is meant that the system should contain some way of switching between different methods of using the Wiimote as a tool to interface with the system.

It is also of high priority that any instruments developed for the purposes of the project are capable of playing the entire range of notes in 12-TET tuning. This would (theoretically) mean that any musical piece written in this tuning could be played using the Wiimote instruments, thus supporting the idea of them being used as real instruments.

A feature which is of low priority, but which would enhance the functionality of the system, is allowing users to add new instruments of their own (FI03).


    ID Description

FI01 The system must support multiple instruments
FI02 Each instrument must be able to play notes A-G#
FI03 Users should be able to add new instruments to the system
FI04 Should contain instruments based on all the Wiimote's input methods

    Table 4.2: Table detailing the requirements covered in the instruments section.

    4.1.3 Gesture

The gesture system is another large section of the system. As such, there are a number of requirements which govern how it fits in with and interacts with the system. The first requirement that should be noted here is that the system MUST have some method of recognising gestures. This is of very high priority, as the gestures will be used to control the system remotely. It should also be a requirement that the system be controllable using the gesture recognition functionality.

One thing to note here is how much functionality the gestures should be able to control within the system. Giving too much control could lead to users having too many gestures to memorise and the system becoming overwhelming. Having too few undermines the functionality of the feature. This collision of requirements can be solved by providing the ability to control as many features of the system as possible, and at the same time allowing the user to select which actions are performed by which gestures. This allows the user to dictate what actions are useful to them, and how to perform them.

This connection of actions to gestures will also need some interface through which the user can perform these operations, which extends the requirements to include such an interface. Also useful here would be the ability to give each gesture a human-friendly name, to make it simpler to differentiate between gestures in the system.

Also, if it is possible to assign gestures to actions, it should be possible to define new gestures within the system. This would support both the preferences of the user and the possibility for future growth of the system's functionality.

    ID Description

    FG01 The system must have some method of identifying gestures.


    4.1.4 Sound

The final area of the system covers the creation of sound based on the inputs from the instruments created. As decided upon in previous sections, the most straightforward method of output is to output MIDI messages to another system which can then take care of the generation of sound (for example, a MIDI synth or CSound). As such, the major requirement of this section is that the system is able to connect to a MIDI device and send it messages describing what to play, determined by the user input.

Further to this are two lower priority requirements. The first is the option to switch between playing a single note and playing a chord. This functionality would allow a user to create a range of different sounds and make it possible to form a lead/rhythm dynamic musically.

Leading on from this, the system could be made into a more powerful tool by the addition of basic recording functionality. This could be achieved by noting the MIDI messages sent by the system and, at the same time as sending them, recording a copy into a MIDI sequence file. This recording could then be played as a backing track, allowing the user a richer musical experience. However, it is not core to the system's functionality and as such is still a low priority.

    ID Description

FSo01 The system should support MIDI.
FSo02 The system should be able to record the instruments playing.
FSo03 The system should allow the playing of both chords and single notes.

    Table 4.4: Table detailing the requirements covered in the sound section.

    4.2 Non-Functional Requirements

As always, alongside the functional requirements of the system are the non-functional requirements. These are requirements used to judge the operation of the system.

A major requirement here is that the delay between the playing of a note on the Wiimote and the sounding of the note by the computer is perceived as instantaneous; by this it is meant that there is no visible delay between cause and effect. This should be viable under normal use.


    ID Description

NF01 The system should be able to play sounds in real time
NF02 The system must recognise gestures with at least 75% accuracy

Table 4.5: Table detailing the requirements covered in the non-functional requirements section.

    Functional Requirements

    System Requirements

    ID Description

    FSy01 The system should present the user with a graphical user interface

FSy02 The system should gather Wiimote data using WiiuseJ
FSy03 The system should inform the user of the note playing
FSy04 The system should inform the user of system developments

    Instrument Requirements

    ID Description

FI01 The system must support multiple instruments
FI02 Each instrument must be able to play notes A-G#
FI03 Users should be able to add new instruments to the system
FI04 Should contain instruments based on all the Wiimote's input methods

    Gesture Requirements

    ID Description

FG01 The system must have some method of identifying gestures.
FG02 The gesture system must be able to control system functionality.
FG03 The system must be able to learn new gestures.
FG04 Gestures should have a friendly name available to the user.
FG05 The user should be able to assign actions to gestures.
FG06 The system should provide some interface for managing gestures.

    Sound Requirements

    ID Description

    FSo01 The system should support MIDI.


    Chapter 5

    Implementation and Testing

    5.1 Gesture Recognition

As the gesture recognition system used is a new system, I will now describe how it works. Following that, I will describe how it was implemented in Java.

    5.1.1 Recogniser Method

The gesture recogniser is split into three distinct sections. The first captures the movements and organises them into a state that the next section can use. The second section takes the input captured by the first, then attempts to match what it has found to gestures it has learnt previously. The third section then acts upon what the second section has found.

In this project, the system knows a gesture is occurring as the user holds the C button on the nunchuk extension whilst they are performing the gesture. During this time, all readings coming from the accelerometers within the nunchuk are stored as samples of the whole gesture.

Each of these samples is assigned a direction based on the accelerometer measurements taken at that time. All these samples are stored in an array with their direction.

When the user releases the C button, the system traverses the array and removes any sample whose direction is the same as that of the previous sample (see section 5.1.2).


Figure 5.1: A Visual Representation of the Array Condensing Procedure

Figure 5.2: A Visual Representation of the Gesture Recognition Tree


If at any point the branch which needs to be followed is null, the tree returns a Gesture Not Recognised value. A visual representation is found in figure 5.2.

The third section takes the ID returned from the tree and uses it to determine the action of the system. This section is different in every system, as each system will have different actions to perform.

    5.1.2 Implementation in Java

The recogniser was implemented in its own Java package so as to allow its reuse in future projects. This package contains several classes, each covering separate parts of the implementation. A class diagram of the gesture recognition system can be found in appendix A.1.

The system recognises gestures in 2 dimensions, the X and Z directions. Thus, performing a gesture is akin to drawing an image on a blackboard. This was done to reduce the number of directions being used. The system recognises 9 directions (as seen in fig 5.3), which are North, North-East, East, South-East, South, South-West, West, North-West, and a point representing no movement called Hold. Figure 5.3 also shows a possible expansion to this to cover 3D space.
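The quantisation itself is not reproduced in the text; a minimal sketch of how an X/Z sample might be mapped onto these nine directions is shown below, with the deadzone threshold an assumed parameter rather than a value taken from the system:

    enum Direction {
        HOLD, N, NE, E, SE, S, SW, W, NW;

        // Quantise an X/Z acceleration sample to one of the nine directions.
        // Samples whose magnitude falls below the deadzone become HOLD.
        static Direction fromAcceleration(double ax, double az, double deadzone) {
            if (Math.hypot(ax, az) < deadzone) {
                return HOLD;
            }
            // Angle of the motion in the X-Z plane, in degrees anticlockwise from east.
            double angle = Math.toDegrees(Math.atan2(az, ax));
            // Shift by half a sector so each 45-degree sector is centred on a compass point.
            int sector = (int) Math.floor((angle + 22.5 + 360.0) / 45.0) % 8;
            Direction[] byCcwSector = { E, NE, N, NW, W, SW, S, SE };
            return byCcwSector[sector];
        }
    }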


Each AccDirectionObject, as part of its construction, takes the input of the X and Z acceleration and decides upon the direction that this represents. Each sample that is taken is stored in an AccDirectionObject, and all of the AccDirectionObjects are stored, in order, in an AccelerationArray object. This AccelerationArray object can then be thought of as the entire gesture.

When the gesture has finished being captured, the AccelerationArray object calls its removeLikeMotions() function. This removes all AccDirectionObjects from the array where the direction is the same as the previous direction (fig 5.1, above), which condenses the array down to a series of directions which define the gesture.

Figure 5.4: Diagram showing how the contents of an AccelerationArray object represent a gesture before and after calling the function removeLikeMotions()
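Based on this description, a simplified sketch of what removeLikeMotions() does (using the Direction type from the sketch above and a list-backed stand-in for the AccelerationArray class) might be:

    import java.util.ArrayList;
    import java.util.List;

    // Simplified stand-in for the dissertation's AccelerationArray class.
    class AccelerationArraySketch {
        private final List<Direction> samples = new ArrayList<>();

        void add(Direction d) {
            samples.add(d);
        }

        // Collapse runs of identical directions: each sample whose direction
        // matches the previous one is removed, so e.g. N,N,N,E,E,S condenses
        // to N,E,S - the sequence of direction changes defining the gesture.
        void removeLikeMotions() {
            for (int i = samples.size() - 1; i > 0; i--) {
                if (samples.get(i) == samples.get(i - 1)) {
                    samples.remove(i);
                }
            }
        }

        List<Direction> directions() {
            return samples;
        }
    }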

The learnt gestures are all stored in a tree; in this implementation the tree is built using PathObject objects as nodes. Within each PathObject there is an array of PathObjects which represent all the connected nodes beneath the current node. The PathObject tree is traversed recursively by passing in an array of integers and following the nodes recursively until one of three possibilities occurs: the array is exhausted at a node holding a gesture ID, which is returned; the array is exhausted at a node with no gesture ID, so a Gesture Not Recognised value is returned; or the branch which needs to be followed is null, which also returns Gesture Not Recognised.
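A sketch of this recursive traversal, using a simplified PathObject with one child branch per direction and -1 standing in for the Gesture Not Recognised ID, might look like:

    // Simplified stand-in for the dissertation's PathObject tree node.
    class PathObject {
        static final int NOT_RECOGNISED = -1;

        final PathObject[] children = new PathObject[9]; // one branch per direction
        int gestureId = NOT_RECOGNISED;                  // set on nodes that end a gesture

        // Follow the direction sequence from this node; index is the
        // position in the array currently being consumed.
        int traverse(int[] directions, int index) {
            if (index == directions.length) {
                return gestureId;           // possibilities 1 and 2
            }
            PathObject next = children[directions[index]];
            if (next == null) {
                return NOT_RECOGNISED;      // possibility 3: branch is null
            }
            return next.traverse(directions, index + 1);
        }
    }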


Once the tree has been traversed and a gesture ID returned (either the ID of the gesture which has been performed or the Gesture Not Recognised ID), a GestureRecognisedEvent is created and thrown to all GestureListeners listening to the GestureRecogniser class.

The GestureListeners are where the system decides what to do when a gesture occurs. Each system would create a different GestureListener from the GestureListener interface.

In this system, the ListenerGestureCapture object implements GestureListener; within this object several system events are defined. When a GestureRecognisedEvent occurs, the GestureListener receives the gesture ID and looks up the system event related to it in the ConvArray (short for Conversion Array), which is stored in the GestureRecogniser class. This array stores GestureObjects, which contain the gesture ID, the related system event and a name for the gesture which can be displayed in the GUI. Once the ListenerGestureCapture object has received the ID of the system event, the event is performed and the system returns to a dormant state to listen for further gestures.

    5.1.3 New Gestures

New gestures are added by first capturing that gesture (as above, using the AccelerationArray object) and then reducing the array in the same manner as before. Now, instead of traversing the tree till its end, we start at its root node and move through the tree following the directions in the array. There are 4 outcomes that could happen at each branch:

1. If the branch we need to follow has a PathObject node at its end, we move forward to that node, remove the top object in the array and repeat.

2. If the branch we need to follow has a null value at its end, we create a new PathObject object and place it at the end of the branch as a new node. Then we move forward to that node, remove the top object in the array and repeat.

3. If we reach a node with no gesture ID and the array is exhausted, then we set the gesture ID of that node to be the gesture being recorded.

4. If we reach a node with a gesture ID and the array is exhausted, there are two possible outcomes: either the gesture ID of the node and of the gesture being recorded are the same, in which case nothing needs to change, or they differ, in which case the new gesture conflicts with an existing one.
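A sketch of this learning procedure over the same simplified PathObject, with the outcome-4 conflict reduced to a boolean result, might be:

    // Companion sketch to the PathObject above: inserting a new gesture.
    class GestureLearning {
        // Insert a condensed direction sequence into the tree, creating
        // nodes as needed (outcomes 1 and 2) and labelling the final node
        // (outcome 3). Returns false on an outcome-4 conflict, i.e. the
        // sequence already ends at a node holding a different gesture's ID.
        static boolean learn(PathObject root, int[] directions, int newGestureId) {
            PathObject node = root;
            for (int direction : directions) {
                if (node.children[direction] == null) {
                    node.children[direction] = new PathObject(); // outcome 2: grow the tree
                }
                node = node.children[direction];                 // outcome 1: follow branch
            }
            if (node.gestureId == PathObject.NOT_RECOGNISED) {
                node.gestureId = newGestureId;                   // outcome 3: label the node
                return true;
            }
            return node.gestureId == newGestureId;               // outcome 4: same or conflict
        }
    }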


    5.2 Instruments

The instruments in the system are all implemented in the MWProcess class. Each instrument implements a different ActionListener, which receives the necessary data from the Wiimote and calls the functions in the MWProcess class, with a different function for each instrument.

The ActionListener classes (named ListenerPitchRoll, ListenerFlute and ListenerLeds) are attached to WiiuseJ's Wiimote object when the instrument is being played; to switch instruments, the listeners are swapped out for the listener of the new instrument. By using this method it is simple to direct the correct data to the functions requiring it, without making a single, complex listener. It also allows new instruments to be added simply in the future.
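The swap itself can be sketched as follows; the WiimoteEventSource interface and its method names are simplifications standing in for WiiuseJ's actual listener API, which is not reproduced here:

    // Hypothetical minimal event source standing in for WiiuseJ's Wiimote
    // object; the method names here are simplifications, not WiiuseJ's API.
    interface WiimoteEventSource {
        void addListener(Object instrumentListener);
        void removeListener(Object instrumentListener);
    }

    class InstrumentSwitcher {
        private final WiimoteEventSource wiimote;
        private Object currentListener;

        InstrumentSwitcher(WiimoteEventSource wiimote) {
            this.wiimote = wiimote;
        }

        // Detach the old instrument's listener and attach the new one, so
        // exactly one instrument receives Wiimote data at any time.
        void switchTo(Object newListener) {
            if (currentListener != null) {
                wiimote.removeListener(currentListener);
            }
            wiimote.addListener(newListener);
            currentListener = newListener;
        }
    }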

The MWProcess class contains the functions that turn the raw data input from the Wiimote into a MIDI message output (which is sent using the MidiOut class). Each function takes the raw data and determines a MIDI note number, which is stored in a global variable. Each instrument then calls the function play(), which sends two MIDI messages to the connected MIDI device: the first is the message to stop playing the current note, and the second is to start playing the new note determined by the function. An excerpt of the play function can be found below:

Listing 5.1: function play()

    // if the notes are the same, continue playing current note
    if (newNote != playingNote) {
        // if there is currently a note playing
        if (playingNote != -1) {
            ShortMessage off = null;
            try {
                off = Wiinote.midiout.createShortMessage(
                        ShortMessage.NOTE_OFF, 0, playingNote, 90);
                Wiinote.midiout.sendMSG(off);
                ...
            } catch (InvalidMidiDataException e1) {
                ...
            }
        }

        // if there is a new note to play
        if (newNote != -1) {
            ShortMessage on = null;
            try {
                on = Wiinote.midiout.createShortMessage(
                        ShortMessage.NOTE_ON, 0, newNote, 90);
                Wiinote.midiout.sendMSG(on);
                ...
            } catch (InvalidMidiDataException e1) {
                e1.printStackTrace();
            } catch (MidiPortNotSetException e) {
                Wiinote.gui.msgWindow.newMessage("Midi Port Not Set", 3);
            }
        }

        // the new note is now the note playing
        playingNote = newNote;
    }
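The MidiOut class used above is not reproduced in this excerpt; as a rough sketch of what such a class involves, a minimal equivalent built on the standard javax.sound.midi API (not the project's actual implementation) could be:

    import javax.sound.midi.InvalidMidiDataException;
    import javax.sound.midi.MidiSystem;
    import javax.sound.midi.MidiUnavailableException;
    import javax.sound.midi.Receiver;
    import javax.sound.midi.ShortMessage;

    // Minimal stand-in for the dissertation's MidiOut class.
    class SimpleMidiOut {
        private final Receiver receiver;

        SimpleMidiOut() throws MidiUnavailableException {
            // The default receiver is normally the system's software synthesizer.
            this.receiver = MidiSystem.getReceiver();
        }

        void noteOn(int note, int velocity) throws InvalidMidiDataException {
            ShortMessage on = new ShortMessage();
            on.setMessage(ShortMessage.NOTE_ON, 0, note, velocity);
            receiver.send(on, -1); // -1 timestamp: deliver immediately
        }

        void noteOff(int note) throws InvalidMidiDataException {
            ShortMessage off = new ShortMessage();
            off.setMessage(ShortMessage.NOTE_OFF, 0, note, 0);
            receiver.send(off, -1);
        }
    }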

    5.2.1 Pitch/Roll Instrument

The Pitch/Roll instrument takes the roll of the Wiimote as the note to play, so by leaning the Wiimote to the left or right, different notes can be chosen. The volume of the current note is determined by the pitch of the instrument, with raising the Wiimote to vertical creating silence and lowering it to horizontal creating maximum volume.
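A sketch of this mapping, under assumed ranges (roll between -90 and 90 degrees selecting a semitone within a single octave starting at middle C, and pitch between 0 degrees horizontal and 90 degrees vertical scaling the MIDI velocity; the actual ranges and octave are not stated here), might be:

    class PitchRollMapping {
        // Map roll (-90..90 degrees) to one of the 12 semitones of an
        // octave starting at middle C (MIDI note 60).
        static int noteFromRoll(double rollDegrees) {
            double clamped = Math.max(-90, Math.min(90, rollDegrees));
            int semitone = (int) ((clamped + 90) / 180.0 * 11.999);
            return 60 + semitone;
        }

        // Map pitch (0 = horizontal, 90 = vertical) to a MIDI velocity:
        // horizontal gives maximum volume, vertical gives silence.
        static int velocityFromPitch(double pitchDegrees) {
            double clamped = Math.max(0, Math.min(90, pitchDegrees));
            return (int) Math.round((1 - clamped / 90.0) * 127);
        }
    }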

    5.2.2 Flute Instrument

The flute instrument works by pressing a combination of buttons to receive a note. The buttons used are Up, A, B, 1 & 2 (as seen in figure 5.5).

The Up button is used to raise the selected note by an octave, and the 2 button raises the selected note by a semitone (thus playing a sharp of the note). Table 5.1 shows the combinations used to play each note.


    Figure 5.5: Buttons used with the Flute Instrument

Combination   Note
1             A
A             B
1+A           C
B             D
1+B           E
A+B           F
1+B+A         G

Table 5.1: Flute Instrument Button Combinations
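The mapping in Table 5.1, combined with the octave (Up) and sharp (2) modifiers, can be sketched as a simple lookup; the choice of A4 (MIDI note 69) as the home octave is an assumption, as the text does not state which octave the instrument starts in:

    import java.util.HashMap;
    import java.util.Map;

    class FluteMapping {
        // Base notes from Table 5.1, assuming a home octave starting at A4 = 69.
        // Key encoding: bit 0 = button "1", bit 1 = "A", bit 2 = "B".
        private static final Map<Integer, Integer> BASE_NOTES = new HashMap<>();
        static {
            BASE_NOTES.put(0b001, 69); // 1     -> A
            BASE_NOTES.put(0b010, 71); // A     -> B
            BASE_NOTES.put(0b011, 72); // 1+A   -> C
            BASE_NOTES.put(0b100, 74); // B     -> D
            BASE_NOTES.put(0b101, 76); // 1+B   -> E
            BASE_NOTES.put(0b110, 77); // A+B   -> F
            BASE_NOTES.put(0b111, 79); // 1+B+A -> G
        }

        // Returns the MIDI note for a button combination, or -1 for no note.
        // "Up" raises the note an octave; "2" raises it a semitone (sharp).
        static int note(boolean one, boolean a, boolean b, boolean up, boolean two) {
            int key = (one ? 1 : 0) | (a ? 2 : 0) | (b ? 4 : 0);
            Integer base = BASE_NOTES.get(key);
            if (base == null) {
                return -1;
            }
            return base + (up ? 12 : 0) + (two ? 1 : 0);
        }
    }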

    Hardware

To play the IR instrument, a pair of handheld IR devices had to be created. These were made by creating a simple circuit composed of an AA battery, a switch and an infra-red LED. These were then attached to a drumstick to make the circuit more robust and to aid in playing the instrument. An image of the drumstick controllers and a diagram of the circuit can be found in figure 5.7.

5.3 Testing


    Figure 5.6: The Measurements Taken by the IR instrument

    5.3.1 Testing the Gesture Recogniser

The testing of the gesture recogniser was carried out by 6 volunteers. Each of the volunteers was shown an image of a gesture to map into the system and asked to perform the gesture a number of times such that the system could learn the gestures. The test went as follows:

    1. Perform 10 teaching gestures to allow the system to learn the gesture.

    2. Attempt the gesture 50 times, noting the number of successes/failures.

    3. Perform 40 further teaching gestures (bringing the number to 50).


Figure 5.7: The IR Drumsticks (left), Along with a Circuit Diagram of their Design (right)

    5.3.2 Testing the Instruments

Using the same group of volunteers, each instrument was given to the user in turn, and the user was given 5 minutes to play with the instrument in order to learn how it worked. After the 5 minutes' play time, the experiment was performed as follows:

    1. Ask the user to play a specific note (C) and time their response

    2. Ask the user to play another note (E) and time their response

    3. Ask the user to play a third note (F#) and time their response

    4. Ask the user to play a three note tune (C, E, F#) and time their response

Of the 6 volunteers in this test, 3 were musicians and 3 were non-musicians; this was done to see how the instruments handled in the hands of both musically skilled and unskilled players.


    Figure 5.8: The 4 Gestures the Volunteers were asked to perform

    4. Favourite Wiinote instrument (+ Why?).

    5. Easiest Wiinote instrument to play (+ Why?).

6. Least favourite Wiinote instrument (+ Why?).


    5.5 Gesture Recognition Results

    5.5.1 Results

Tables 5.2, 5.3, 5.4 and 5.5 are the results of the gesture recognition testing (described in 5.3.1). These results display the number of successful recognitions as a count (out of 50 repetitions) and as a percentage; the final column is the mean of the six volunteers' results.

          Chris        Ellie        Paul         Liam
10 Reps   12/50 (24%)  14/50 (28%)  13/50 (26%)  10/50 (20%)
50 Reps   37/50 (74%)  40/50 (80%)  37/50 (74%)  34/50 (68%)
100 Reps  42/50 (84%)  45/50 (90%)  41/50 (82%)  40/50 (80%)

          Liz          Dan          Mean
10 Reps   12/50 (24%)  14/50 (28%)  12.5/50 (25%)
50 Reps   38/50 (76%)  39/50 (78%)  37.5/50 (75%)
100 Reps  41/50 (82%)  43/50 (86%)  42/50 (84%)

Table 5.2: Successful gestures by number of repeat teachings for a 1 sided gesture

          Chris        Ellie        Paul         Liam
10 Reps   11/50 (22%)  13/50 (26%)  11/50 (22%)  9/50 (18%)
50 Reps   37/50 (74%)  38/50 (76%)  35/50 (70%)  35/50 (70%)
100 Reps  39/50 (78%)  41/50 (82%)  38/50 (76%)  41/50 (82%)

          Liz          Dan          Mean
10 Reps   10/50 (20%)  12/50 (24%)  11/50 (22%)
50 Reps   34/50 (68%)  37/50 (74%)  36/50 (72%)
100 Reps  40/50 (80%)  41/50 (82%)  40/50 (80%)

Table 5.3: Successful gestures by number of repeat teachings for a 2 sided gesture

          Chris        Ellie        Paul         Liam
10 Reps   9/50 (18%)   10/50 (20%)  9/50 (18%)   8/50 (16%)
50 Reps   32/50 (64%)  35/50 (70%)  34/50 (68%)  34/50 (68%)
100 Reps  42/50 (84%)  35/50 (70%)  36/50 (72%)  35/50 (70%)

Table 5.4: Successful gestures by number of repeat teachings for a 3 sided gesture


          Chris        Ellie        Paul         Liam
10 Reps   8/50 (16%)   10/50 (20%)  9/50 (18%)   5/50 (10%)
50 Reps   30/50 (60%)  31/50 (62%)  32/50 (64%)  27/50 (54%)
100 Reps  32/50 (64%)  36/50 (72%)  34/50 (68%)  31/50 (62%)

          Liz          Dan          Mean
10 Reps   7/50 (14%)   9/50 (18%)   8/50 (16%)
50 Reps   28/50 (56%)  35/50 (70%)  30.5/50 (61%)
100 Reps  36/50 (72%)  35/50 (70%)  34/50 (68%)

Table 5.5: Successful gestures by number of repeat teachings for a 4 sided gesture

Figure 5.9: Graph depicting recognition accuracy against gesture complexity after 10, 50 and 100 repetitions

    5.5.2 Results Analysis

It should be noted that with each volunteer, the four stages of tests (1, 2, 3 and 4 sides) were performed within one session (usually lasting around an hour). Something noted by almost all of the volunteers at some point was that it was quite a tiring exercise physically, and that by the later tests they felt they were performing worse than usual, so the later failure numbers should be looked at as higher than the true count.

This ambiguous outcome means that without an in-depth look at the testing method we cannot tell which effect dominates. Due to the time constraints on this project it is not possible to evaluate the testing method itself. However, the results taken should be close enough to what would be found in real-world use to draw conclusions from them.

    5.6 Instrument Results

Figure 5.10 contains charts depicting the answers given in the questionnaires by the volunteers to the following four questions:

    1. Favourite Wiinote instrument.

    2. Easiest Wiinote instrument to play.

    3. Least favourite Wiinote instrument.

    4. Most difficult Wiinote instrument to play.

Figure 5.11 displays two graphs. The first depicts the average time taken to hit each of the three notes individually for each of the three instruments. The graph on the right shows the average time taken to play the three-note tune on each of the instruments.

    5.6.1 Results Analysis

I believe these results give a good idea of how difficult each instrument was to play. The only downside I can see to this method of quantifying each instrument is the uncertainty in the timing (which involved a human operator and a stopwatch). As such there may be a margin of error on each side of the measurements taken. Most of this has been taken care of by rounding the numbers to a single decimal place; the gap between each instrument's results is sufficiently large to make the ordering of the instruments obvious, so the remaining uncertainty can be ignored.

With hindsight, it might also have been a good idea to check how these timings compare with those of real instruments. One


way to solve this would be to run the experiment again and include a number of real musical instruments as a comparison of the real instruments to their synthetic siblings. This, however, is beyond the specifications of the project. Here it is enough to know that they work.


Figure 5.11: Graphs depicting the average times taken to hit 3 notes on each instrument (left), and the average time to play a three-note tune on each instrument (right)


    Chapter 6

    Conclusions

The gesture recognition system worked reasonably well. If we look at figure 5.9 we can see that as the complexity of the gesture increased, the system's accuracy in recognising it decreased. By adding a line at 75% it is possible to compare the gesture recognition ability of the system against requirement NF02 (section 4.2). We can see that the gesture recognition abilities of the system are within the requirements only up to gestures of 2 sides. We can also assume that this trend continues: as the complexity of gestures increases, the accuracy in recognising them will continue to fall.

If we look at the differences between the number of repetitions and accuracy in figure 5.9, we can see that the accuracy of the system increases after each set of repetitions. We can also see that the 2 sided gesture only reaches an average of more than 75% accuracy after it has been trained 100 times. With another round of teaching, the 3 sided gesture may also cross the 75% threshold.

It is worth noting that this may hold true for any gesture: after an ever-greater amount of training, any more complex gesture might be recognisable with significant accuracy.

Therefore, I believe it is possible to call the gesture recogniser a success, if only a limited one; it works well, but for it to work well with a more complex range of gestures a great deal of training must be performed.

The instruments also form a large part of the evaluation of the project. Each of the three was judged by the volunteers, and the Pitch/Roll instrument was found too


    awkward to be useful as a musical instrument.

The other instruments fared better, with the IR instrument being hailed as the favourite of the volunteers, and the flute being labelled easiest to play. The flute raised further questions for the author: the volunteers chose the flute as the easiest to play, yet the time taken to play each note on the flute is by far the highest. After reviewing the volunteers' answers, I believe this can be attributed to the learning curve of the flute instrument.

The positive comments about the flute instrument seem to be based on the fact that the notes were playable with a discrete action (i.e. pressing the A button) as opposed to attempting to find an arbitrary angle.

The IR instrument was the favourite of the volunteers (and of the author) and also produced the quickest timings during testing. I believe this makes the IR instrument the most successful of the three instruments created.

The instruments are limited by the way each attempts to use only one input method of the Wiimote. I believe that by making better use of the entire Wiimote and its functionality, more creative instruments could be designed which provide the user with better methods of creating and performing music; however, as a proof of concept this set of instruments works well.

    6.0.1 Further Developments

It could be beneficial to extend the gesture recogniser to support gestures in 3 dimensions. This would be done by adding extra directions (as in figure 5.3). The PathObject tree would also have to be extended to have a path for each added direction. This would allow the system to recognise gestures in 3 dimensions, though it may reduce the accuracy of the system; the only way to know for sure would be to implement it.
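To make this concrete, the quantisation step might be extended along the following lines. This is a sketch under the assumption that gestures are stored as strings of direction symbols; the six-symbol set and the method name are illustrative, not the project's actual PathObject code.

    // Illustrative sketch: quantise an accelerometer delta into one of six
    // 3D direction symbols (left/right, up/down, forward/back).
    public class Direction3D {

        public static char toDirection(double dx, double dy, double dz) {
            double ax = Math.abs(dx), ay = Math.abs(dy), az = Math.abs(dz);
            if (ax >= ay && ax >= az) {
                return dx > 0 ? 'R' : 'L'; // movement mostly along x
            } else if (ay >= az) {
                return dy > 0 ? 'U' : 'D'; // movement mostly along y
            } else {
                return dz > 0 ? 'F' : 'B'; // movement mostly along z
            }
        }
    }

A PathObject tree extended with branches for 'F' and 'B' could then store and match the resulting six-symbol paths in the same way it handles the four two-dimensional directions.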

There were also two initial requirements which were not met by the project: FI03 - users should be able to add new instruments to the system, and FSo02 - the system should be able to record the instruments playing. It was envisioned that users would be able to write their own instruments within the system; however, due to time constraints the instruments were hard coded into the system. It should still be simple to implement new instruments given the source code and a knowledge of Java, but a further improvement would be to provide functionality to create instruments within the system, without altering the source code.
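Given the listener-based design shown in Appendix D, a hand-written instrument amounts to one more WiimoteListener implementation. The sketch below reuses the event and helper methods that appear in the appendix; the "just pressed" check and the note choice (a snare hit on the MIDI percussion channel) are assumptions made for illustration.

    // Illustrative sketch of a one-note "drum" instrument.
    public class ListenerDrum implements WiimoteListener {

        @Override
        public void onButtonsEvent(WiimoteButtonsEvent arg0) {
            if (arg0.isButtonBJustPressed()) { // assumed WiiuseJ-style check
                try {
                    ShortMessage hit = Wiinote.midiout.createShortMessage(
                            ShortMessage.NOTE_ON, 9, 38, 100); // channel 10 snare
                    Wiinote.midiout.sendMSG(hit);
                } catch (InvalidMidiDataException e) {
                    e.printStackTrace();
                } catch (MidiPortNotSetException e) {
                    Wiinote.gui.msgWindow.newMessage("Midi Port Not Set", 3);
                }
            }
        }
    }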


A backing track could also be controlled using gestures to perform actions upon it (such as telling the system to repeat a bar or skip forward a bar mid-song). These extensions of the project would build on the possibilities of the system by providing further functionality useful to musicians.


    Bibliography

Accelerometer, in Encyclopaedia Britannica Online (2009). http://www.britannica.com/EBchecked/topic/2859/accelerometer. Retrieved January 2009.

Apple iPhone (2009). http://www.apple.com/iphone. Retrieved January 2009.

Glinsky, A. V. (1992), The theremin in the emergence of electronic music.

Hollar, S., Perng, J. K. and Pister, K. S. J. (2000), Wireless static hand gesture recognition with accelerometers - the acceleration sensing glove.

Kapoor, A. (2001), A real-time head nod and shake detector, in Proceedings of the Workshop on Perceptive User Interfaces.

Microsoft Surface (2008). http://www.microsoft.com/surface/. Retrieved December 2008.

Moeslund, T. B. and Granum, E. (2001), A survey of computer vision-based human motion capture, Computer Vision and Image Understanding 81, 231-268.

Moeslund, T. B., Hilton, A. and Krüger, V. (2006), A survey of advances in vision-based human motion capture and analysis, Computer Vision and Image Understanding 104(2-3), 90-126. Special issue on Modeling People: Vision-based understanding of a person's shape, appearance, movement and behaviour.

Morency, L.-P., Quattoni, A. and Darrell, T. (2007), Latent-dynamic discriminative models for continuous gesture recognition.

Sakaguchi, T., Kanamori, T., Katayose, H., Sato, K. and Inokuchi, S. (1996), Human motion capture by integrating gyroscopes and accelerometers, in IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, 1996, IEEE/SICE/RSJ, Washington, DC, USA, pp. 470-475.

Schlömer, T., Poppinga, B., Henze, N. and Boll, S. (2008), Gesture recognition with a Wii controller, in TEI '08: Proceedings of the 2nd International Conference on Tangible and Embedded Interaction, ACM, New York, NY, USA, pp. 11-14.

Wallach, H. M. (2004), Conditional Random Fields: An Introduction, University of Pennsylvania CIS Technical Report MS-CIS-04-21.

Wang, S. B., Quattoni, A., Morency, L.-P., Demirdjian, D. and Darrell, T. (2006), Hidden conditional random fields for gesture recognition, in Computer Vision and Pattern Recognition, 2006 IEEE Computer Society Conference on, Vol. 2, pp. 1521-1527.

Warakagoda, N. (1996), Narada Warakagoda's HMM tutorial. http://jedlik.phy.bme.hu/~gerjanos/HMM/node3.html. Retrieved January 2009.

WiiRemoteJ Technical Demo (beta version 0.6) (2009). http://www.dailymotion.com/Ctta0s/video/x1hrflwiiremotej. Accessed April 2009.

Wilson, A. D. (2004), TouchLight: an imaging touch screen and display for gesture-based interaction, in ICMI '04: Proceedings of the 6th International Conference on Multimodal Interfaces, ACM, New York, NY, USA, pp. 69-76.


    Appendix A

    Design Diagrams


    Appendix B

    Raw results output

B.1 User Questionnaires


Name: Chris
Instruments Played: Piano
Years Played: 3 Years

Favourite Wiinote Instrument (+ Why):
IR Instrument - It was the most engaging to play; felt more like a game than an instrument.

Easiest Wiinote Instrument to Play (+ Why):
Flute Instrument - The notes were more discrete; with the other instruments the thresholds between notes were ill-defined.

Least Favourite Wiinote Instrument (+ Why):
Pitch/Roll Instrument - It wasn't particularly nice to play.

Most Difficult Wiinote Instrument to Play (+ Why):
Pitch/Roll Instrument - The angles for each note were too difficult to get right.

Pitch/Roll Results: timed note (C): 1.3s, timed note (E): 1.3s, timed note (F#): 1.2s, tune (C, E, F#): 4.8s
Flute Results: timed note (C): 2.3s, timed note (E): 2.5s, timed note (F#): 2.0s, tune (C, E, F#): 5.9s
IR Results: timed note (C): 1.6s, timed note (E): 1.5s, timed note (F#): 1.7s, tune (C, E, F#): 5.5s

Table B.1: Table Containing Chris's Questionnaire Results


Name: Paul
Instruments Played: Guitar/Keyboard
Years Played: Guitar - 9 Years, Keys - 6 Months

Favourite Wiinote Instrument (+ Why):
Flute: It felt like a real instrument and the notes were easier to find.

Easiest Wiinote Instrument to Play (+ Why):
Flute: See above.

Least Favourite Wiinote Instrument (+ Why):
IR: It was difficult to get right; seemed more like a gimmick to have the IR sensor working.

Most Difficult Wiinote Instrument to Play (+ Why):
Pitch/Roll: Finding the notes with any accuracy was too difficult.

Pitch/Roll Results: timed note (C): 1.1s, timed note (E): 1.0s, timed note (F#): 1.1s, tune (C, E, F#): 4.5s
Flute Results: timed note (C): 2.1s, timed note (E): 2.3s, timed note (F#): 1.9s, tune (C, E, F#): 5.4s
IR Results: timed note (C): 1.3s, timed note (E): 1.5s, timed note (F#): 1.6s, tune (C, E, F#): 5.0s

Table B.2: Table Containing Paul's Questionnaire Results


Name: Daniel
Instruments Played: Drums
Years Played: 6 Years (On and Off)

Favourite Wiinote Instrument (+ Why):
The Flute - Nicest to play, less messing around!

Easiest Wiinote Instrument to Play (+ Why):
The Flute - It was easier to hit the correct notes once I'd got used to where they were.

Least Favourite Wiinote Instrument (+ Why):
The Pitch/Roll Instrument - Playing it too long made my wrist ache! And it was tough to find the notes.

Most Difficult Wiinote Instrument to Play (+ Why):
The Pitch/Roll Instrument - It was tough to find the right notes.

Pitch/Roll Results: timed note (C): 1.3s, timed note (E): 1.2s, timed note (F#): 1.4s, tune (C, E, F#): 5.7s
Flute Results: timed note (C): 2.5s, timed note (E): 2.6s, timed note (F#): 2.0s, tune (C, E, F#): 5.8s
IR Results: timed note (C): 1.6s, timed note (E): 1.6s, timed note (F#): 1.7s, tune (C, E, F#): 5.7s

Table B.3: Table Containing Dan's Questionnaire Results


Name: Ellie
Instruments Played: None
Years Played: n/a

Favourite Wiinote Instrument (+ Why):
The IR Instrument was my favourite, it was fun to play around with!

Easiest Wiinote Instrument to Play (+ Why):
The IR Instrument was the easiest; after I found where the notes were it was quite easy to play them again!

Least Favourite Wiinote Instrument (+ Why):
The Pitch/Roll Instrument wasn't nice to play, it was too awkward.

Most Difficult Wiinote Instrument to Play (+ Why):
The Pitch/Roll Instrument, as the notes were too close together, which made them difficult to get right.

Pitch/Roll Results: timed note (C): 2.0s, timed note (E): 1.8s, timed note (F#): 1.9s, tune (C, E, F#): 7.9s
Flute Results: timed note (C): 2.8s, timed note (E): 2.8s, timed note (F#): 2.2s, tune (C, E, F#): 6.0s
IR Results: timed note (C): 1.9s, timed note (E): 2.0s, timed note (F#): 2.1s, tune (C, E, F#): 6.4s

Table B.4: Table Containing Ellie's Questionnaire Results


Name: Liam
Instruments Played: None
Years Played: n/a

Favourite Wiinote Instrument (+ Why):
IR Instrument: It was quite fun using the drumsticks to play it.

Easiest Wiinote Instrument to Play (+ Why):
IR Instrument: It was easiest to figure out where the different notes were.

Least Favourite Wiinote Instrument (+ Why):
Flute Instrument: The button combinations made it too difficult to play.

Most Difficult Wiinote Instrument to Play (+ Why):
Flute Instrument: Remembering all the button combinations was too much.

Pitch/Roll Results: timed note (C): 1.5s, timed note (E): 1.3s, timed note (F#): 1.5s, tune (C, E, F#): 6.2s
Flute Results: timed note (C): 3.0s, timed note (E): 3.4s, timed note (F#): 2.6s, tune (C, E, F#): 6.8s
IR Results: timed note (C): 1.7s, timed note (E): 1.8s, timed note (F#): 1.9s, tune (C, E, F#): 6.1s

Table B.5: Table Containing Liam's Questionnaire Results


Name: Liz
Instruments Played: None
Years Played: n/a

Favourite Wiinote Instrument (+ Why):
The IR Instrument was good! It was very different!

Easiest Wiinote Instrument to Play (+ Why):
The Flute Instrument was easiest to get the notes right with!

Least Favourite Wiinote Instrument (+ Why):
The Pitch/Roll Instrument wasn't much fun after the first couple of minutes!

Most Difficult Wiinote Instrument to Play (+ Why):
The Pitch/Roll Instrument had too many notes on it to make it easy to play.

Pitch/Roll Results: timed note (C): 2.1s, timed note (E): 1.9s, timed note (F#): 2.0s, tune (C, E, F#): 8.3s
Flute Results: timed note (C): 2.7s, timed note (E): 2.9s, timed note (F#): 2.1s, tune (C, E, F#): 6.4s
IR Results: timed note (C): 2.1s, timed note (E): 1.9s, timed note (F#): 2.0s, tune (C, E, F#): 6.5s


        C     E     F#    Tune (C, E, F#)
Chris   1.5   1.3   1.2   5.9
Paul    1.3   1.2   1.1   5.5
Dan     1.3   1.2   1.4   6.1
Ellie   1.7   1.8   1.9   6.9
Liam    1.5   1.3   1.5   6.2
Liz     2.1   1.7   2.0   7.3
Total   9.4   8.5   9.1   37.9
Mean    1.6   1.4   1.5   6.3

Table B.6: Table combining the results of all Pitch/Roll tests.

        C     E     F#    Tune (C, E, F#)
Chris   2.3   2.5   2.0   5.9
Paul    2.1   2.3   1.9   5.4
Dan     2.5   2.6   2.0   5.8
Ellie   2.8   2.8   2.2   6.0
Liam    3.0   3.4   2.6   6.8
Liz     2.7   2.9   2.1   6.4
Total   15.4  16.5  12.8  36.3
Mean    2.6   2.8   2.1   6.1

Table B.7: Table combining the results of all flute tests.

        C     E     F#    Tune (C, E, F#)
Chris   1.6   1.5   1.7   5.5
Paul    1.3   1.5   1.6   5.0


    Appendix C

A Final View of the System Requirements


    Functional Requirements

    System Requirements

ID     Description                                                            Complete?
FSy01  The system should present the user with a graphical user interface    Yes
FSy02  The system should gather Wiimote data using WiiuseJ                   Yes
FSy03  The system should inform the user of the note playing                 Yes
FSy04  The system should inform the user of system developments              Yes

Instrument Requirements

ID     Description                                                            Complete?
FI01   The system must support multiple instruments                          Yes
FI02   Each instrument must be able to play notes A-G#                       Yes
FI03   Users should be able to add new instruments to the system             No
FI04   The system should contain instruments based on all the Wiimote's
       input methods                                                         Yes

Gesture Requirements

ID     Description                                                            Complete?
FG01   The system must have some method of identifying gestures              Yes
FG02   The gesture system must be able to control system functionality       Yes
FG03   The system must be able to learn new gestures                         Yes
FG04   Gestures should have a friendly name available to the user            Yes
FG05   The user should be able to assign actions to gestures                 Yes
FG06   The system should provide some interface for managing gestures        Yes

Sound Requirements

ID     Description                                                            Complete?
FSo01  The system should support MIDI                                        Yes
FSo02  The system should be able to record the instruments playing           No
FSo03  The system should allow the playing of both chords and single notes   Yes

    Non-Functional Requirements


    Appendix D

    Code


    D.1 wiinote.engine.ListenerFlute.java

    /**
     * A Listener for the button based (flute) instrument. Takes the button
     * data from the Wiimote and passes it to the MWProcess object.
     *
     * Whilst the wiimote object is attached to this, the flute instrument
     * will be active. Add this to the wiimote's action listeners to use the
     * flute instrument, and remove it to stop using it.
     *
     * 05-Apr-2009
     *
     * @author Louis Fellows
     * @version 1.0.0.0
     */
    public class ListenerFlute implements WiimoteListener {

        @Override
        public void onButtonsEvent(WiimoteButtonsEvent arg0) {
            if (arg0.isButtonAPressed()
                    || arg0.isButtonBPressed()
                    || arg0.isButtonUpPressed()
                    || arg0.isButtonOnePressed()
                    || arg0.isButtonTwoPressed()
                    || arg0.isButtonAJustReleased()
                    || arg0.isButtonBJustReleased()
                    || arg0.isButtonUpJustReleased()
                    || arg0.isButtonOneJustReleased()
                    || arg0.isButtonTwoJustReleased()) {
                Wiinote.process.ButtonstoMidi(arg0.getButtonsHeld());
            }
        }
    }
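For context, attaching and detaching the listener might look as follows. This is a hypothetical usage snippet, not part of the dissertation's code; the WiiUseApiManager and listener-registration method names follow WiiuseJ's usage but should be treated as assumptions.

    // Hypothetical usage: connect one Wiimote and toggle the flute instrument.
    public class FluteDemo {
        public static void main(String[] args) {
            Wiimote[] wiimotes = WiiUseApiManager.getWiimotes(1, true);
            ListenerFlute flute = new ListenerFlute();
            wiimotes[0].addWiiMoteEventListeners(flute);    // flute active
            // ... play ...
            wiimotes[0].removeWiiMoteEventListeners(flute); // flute inactive
        }
    }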

    D.2 wiinote.engine.ListenerLeds.java

    /**
     * A Listener for the LED based (IR) instrument. Takes the LED data from
     * the Wiimote and passes it to the MWProcess object.
     *
     * Whilst the wiimote object is attached to this, the IR instrument will
     * be active. Add this to the wiimote's action listeners to use the IR
     * instrument, and remove it to stop using it.
     *
     * 05-Apr-2009
     *
     * @author Louis Fellows
     * @version 1.0.0.0
     */
    public class ListenerLeds implements WiimoteListener {

        @Override
        public void onIrEvent(IREvent arg0) {
            IRSource[] pts = arg0.getIRPoints();

            if (pts.length == 2) {
                Wiinote.process.LedsToMidi(pts[0].getX(), pts[0].getY(),
                        pts[1].getX(), pts[1].getY());
            }
        }
    }

    D.3 wiinote.engine.ListenerPitchRoll.java

    /**
     * A Listener for the accelerometer based (Pitch/Roll) instrument. Takes
     * the accelerometer data from the Wiimote and passes it to the MWProcess
     * object.
     *
     * Whilst the wiimote object is attached to this, the Pitch/Roll
     * instrument will be active. Add this to the wiimote's action listeners
     * to use the Pitch/Roll instrument, and remove it to stop using it.
     *
     * 05-Apr-2009
     *
     * @author Louis Fellows
     * @version 1.0.0.0
     */
    public class ListenerPitchRoll implements WiimoteListener {

        @Override
        public void onMotionSensingEvent(MotionSensingEvent arg0) {
            Orientation o = arg0.getOrientation();
            Wiinote.process.motionToMidi(o.getPitch(), o.getRoll());
        }
    }

    D.4 wiinote.engine.MidiOut.java

    /**
     * Contains a number of MIDI helper functions which are used to connect
     * to a MIDI device and to build and send MIDI messages to it.
     *
     * The aim of this class is to keep all MIDI functions together in one
     * class to increase reuse and reduce the work in altering the MIDI
     * functions of the project.
     *
     * 05-Apr-2009
     *
     * @author Louis Fellows
     * @version 1.0.0.0
     */
    public class MidiOut {

        private MidiDevice md;

        public MidiOut() {
            md = null;
        }

        public void connectToPort(String portName) throws MidiUnavailableException {
            if (md != null) {
                if (md.isOpen()) {
                    md.close();
                }
            }

            MidiDevice.Info[] ms = MidiSystem.getMidiDeviceInfo();
            md = null;
            for (MidiDevice.Info inf : ms) {
                if (inf.toString().equalsIgnoreCase(portName)) {
                    try {
                        md = MidiSystem.getMidiDevice(inf);
                        md.open();
                        return;
                    } catch (MidiUnavailableException e) {
                        throw new MidiUnavailableException();
                    }
                }
            }
            throw new MidiUnavailableException();
        }

        public ShortMessage createShortMessage(int command, int d1, int d2)
                throws InvalidMidiDataException {
            ShortMessage myMsg = new ShortMessage();
            try {
                myMsg.setMessage(command, d1, d2);
            } catch (InvalidMidiDataException e) {
                throw e;
            }
            return myMsg;
        }

        public ShortMessage createShortMessage(int command, int channel, int d1,
                int d2) throws InvalidMidiDataException {
            ShortMessage myMsg = new ShortMessage();
            try {
                myMsg.setMessage(command, channel, d1, d2);
            } catch (InvalidMidiDataException e) {
                throw e;
            }
            return myMsg;
        }

        public void hold(int time) {
            try {
                Thread.sleep(time);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }

        public String[] outputPorts() {
            MidiDevice.Info[] ms = MidiSystem.getMidiDeviceInfo();
            ArrayList<String> returnStrings = new ArrayList<String>();

            for (MidiDevice.Info inf : ms) {
                returnStrings.add(inf.toString());
            }

            String[] returnVals;
            returnVals = new String[returnStrings.size()]