GESTURE BASED COMPUTING

Paul Coulton

Posted: 28-Jan-2015

DESCRIPTION

Talk for Infolab21 Brunch Bytes series on 'Gesture Based Computing' covering both touch and free form gestures.

TRANSCRIPT

Page 1: Gesture Based Computing

GESTURE BASED COMPUTING

Paul Coulton

Banksy

Page 2: Gesture Based Computing

O’Sullivan and Igoe

Page 3: Gesture Based Computing

GESTURAL INTERFACES

Allows physical movements to be detected by a digital system without the aid of a traditional pointing device.

Dan Saffer

Page 4: Gesture Based Computing

A wave, a head nod, a touch, a toe tap, or even a raised eyebrow could be a gesture.

Dan Saffer

Page 5: Gesture Based Computing

TANGIBLE INTERFACES

Provide a physical form for digital information, facilitating the direct manipulation of the associated bits.

Page 6: Gesture Based Computing

MIMETIC INTERFACES

Mimetic interfaces require players to perform actions that closely resemble the physical activity required in reality.

Jesper Juul

Page 7: Gesture Based Computing

NATURAL USER INTERFACES

“Natural refers to the user’s behaviour and feeling during the experience rather than being the product of some organic process.”

Wigdor and Wixon, who indeed suggest a natural experience “is NOT best achieved through mimicry”.

Page 8: Gesture Based Computing

TYPES OF GESTURAL INTERFACE

Most gestural interfaces can be categorised as either touchscreen or free form. Touchscreens require the user to touch the device directly, whereas free-form systems do not.

Dan Saffer

Page 9: Gesture Based Computing

DIRECT VS INDIRECT MANIPULATION

Page 10: Gesture Based Computing

DIRECT VS INDIRECT MANIPULATION

Direct manipulation features a natural representation of task objects and actions, promoting the notion of people performing a task themselves (directly) rather than through an intermediary.

Page 11: Gesture Based Computing

MICE VS FINGERS

Dan Saffer

Page 12: Gesture Based Computing

MICE VS FINGERS

A cursor is often unnecessary, since the user is not constantly pointing at something: a finger moves from point to point, whereas a mouse leaves a trail.

Hover and mouse-over events are not employed, as these cannot be detected through touchscreens.

Double tap can be done but should be used with caution: a threshold has to be set during which two touch events at the same location count as a double tap.

In general, gestural interfaces don’t employ the right click to bring up further options, as this moves away from the direct-manipulation philosophy.

Drop-down menus generally don’t work very well, for the same reason as right-click menus, combined with the limitations over hover.

Cut and paste is now implemented on touchscreen devices, although it presents difficulty for accurate placement due to the size of fingers.

As humans have a limited number of fingers, we have a limit on selecting multiple items. Normally this means some form of select mode is used.

It’s hard to undo a gesture once it is done, so it’s better to have an easy way to cancel or directly undo the action.

Dan Saffer
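The double-tap threshold described above can be sketched as follows; the time and distance values are illustrative assumptions, not values from any particular platform.

```python
import time

# Hypothetical thresholds -- real platforms tune these empirically.
DOUBLE_TAP_TIME = 0.3    # max seconds between the two taps
DOUBLE_TAP_DIST = 20     # pixels of allowed slop between tap centres

class DoubleTapDetector:
    """Counts two touch events at (roughly) the same location,
    within a time threshold, as a double tap."""

    def __init__(self):
        self.last_pos = None
        self.last_time = None

    def on_tap(self, x, y, t=None):
        t = time.monotonic() if t is None else t
        is_double = (
            self.last_pos is not None
            and t - self.last_time <= DOUBLE_TAP_TIME
            and abs(x - self.last_pos[0]) <= DOUBLE_TAP_DIST
            and abs(y - self.last_pos[1]) <= DOUBLE_TAP_DIST
        )
        # Reset after a double tap so a triple tap doesn't count twice.
        self.last_pos = None if is_double else (x, y)
        self.last_time = t
        return is_double
```

A single tap records its position and time; only a second tap close enough in both space and time is reported as a double tap.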

Page 20: Gesture Based Computing

FITTS’S LAW

Fitts’s law is a model of human movement relating to pointing. It predicts that the time required to rapidly move to a target area is a function of the distance to the target and the size of the target.

Put simply, a large object close to the user is easier to point to than a small one far away.
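The relationship can be written as a short formula; the constants a and b below are illustrative placeholders (in practice they are fitted to the device and user population).

```python
import math

def fitts_time(distance, width, a=0.0, b=0.15):
    """Shannon formulation of Fitts's law:
    MT = a + b * log2(distance / width + 1).
    a and b are device-dependent constants (illustrative values here)."""
    return a + b * math.log2(distance / width + 1)

# A large, close target is faster to acquire than a small, distant one:
close_large = fitts_time(distance=100, width=50)
far_small = fitts_time(distance=800, width=10)
```

The log term (the "index of difficulty") grows with distance and shrinks with target size, which is exactly the "large and close beats small and far" rule above.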

Page 23: Gesture Based Computing

FITTS’S LAW

This law is equally applicable to gestures. Visual targets should be designed to be close to the user, to avoid reaching across the interface, and objects to be manipulated should be large enough to accommodate a human finger.

Page 25: Gesture Based Computing

FINGER TIPS

As a general guide, acceptable targets should ideally be no smaller than the smallest average finger pad; typically a 1 cm (0.4 inch) diameter is used.

Page 26: Gesture Based Computing

PPI

What 1 cm translates to in pixels depends on the pixel density, or Pixels Per Inch (PPI). Once you have the PPI, simply multiply it by 0.4 to get the size of your touch point in pixels.
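The conversion is a one-liner:

```python
def touch_target_pixels(ppi):
    """Convert a 1 cm (~0.4 inch) touch target into pixels
    for a display of the given pixel density (PPI)."""
    return round(ppi * 0.4)

# The same physical 1 cm target needs roughly twice as many pixels
# on the iPhone 4 (326 PPI) as on the iPhone 3GS (163 PPI).
```

This is why touch-target sizes must be specified in physical units, not raw pixels: the table on the next slide shows how widely PPI varies between devices.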

Page 27: Gesture Based Computing

EXAMPLE PPI

Model              Diagonal   Pixels     PPI
iPhone 3GS         3.5"       320x480    163
iPhone 4           3.5"       640x960    326
iPad, iPad 2       9.7"       1024x768   132
Google Nexus One   3.7"       480x800    252
Motorola Droid X   4.3"       854x480    228
Nokia N8           3.5"       640x360    209
Nexus S            4.0"       480x800    235

Page 28: Gesture Based Computing

ICEBERG TIPS

These are controls whose touch targets are larger than what is visible. The implication is that you need more space between objects.

Dan Saffer
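An iceberg target is just hit-testing against a padded rectangle; the button geometry and padding below are made-up example values.

```python
# An "iceberg" control: the visible button is small, but its
# hit target extends beyond the drawn bounds. Sizes are illustrative.
ICEBERG_PAD = 12  # extra pixels of invisible target on each side

def hits(button, x, y, pad=ICEBERG_PAD):
    """button is (left, top, width, height); the touch point (x, y)
    counts as a hit anywhere inside the padded rectangle."""
    left, top, w, h = button
    return (left - pad <= x <= left + w + pad
            and top - pad <= y <= top + h + pad)

ok_button = (100, 100, 40, 20)  # drawn at 40x20 pixels, hit at 64x44
```

Because the invisible target extends 12 px in every direction, two such controls must be laid out at least that much further apart, which is the spacing implication noted above.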

Page 30: Gesture Based Computing

ADAPTATION

The keyboard on the iPhone actually uses some of the smallest targets, at 5 mm (0.2 inches). It uses adaptive targets to get over this limitation.

Dan Saffer

Page 31: Gesture Based Computing

ADAPTIVE TARGETS

These are created algorithmically, by guessing the next item the user will touch and increasing its touch target appropriately.

Dan Saffer
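A minimal sketch of the idea for a keyboard: keys predicted as likely next letters get an enlarged hit target. The prediction table here is a toy assumption, not real language data or Apple's actual algorithm.

```python
# Toy next-letter predictions: after typing the key (dict key),
# the listed letters are considered likely and get bigger targets.
LIKELY_NEXT = {"t": "ho", "th": "e", "q": "u"}

BASE_TARGET = 1.0   # relative hit-target size
BOOST = 1.5         # enlargement factor for predicted keys

def target_size(typed, key):
    """Return the hit-target scale for `key` given the text typed so far."""
    predicted = LIKELY_NEXT.get(typed, "")
    return BASE_TARGET * BOOST if key in predicted else BASE_TARGET
```

After "q", the "u" key's target grows while its drawn size stays the same, so the 5 mm keys behave as if they were larger for the taps users are most likely to make.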

Page 33: Gesture Based Computing

TOUCHSCREEN PATTERNS

Select, Tap, Drag, Flick, Pinch, Spread, Slide Left to Right, Slide Up and Down
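Pinch and spread, for example, reduce to comparing the distance between two touch points at the start and end of the gesture; the threshold is an illustrative assumption.

```python
import math

def pinch_or_spread(p1_start, p2_start, p1_end, p2_end, threshold=10):
    """Classify a two-finger gesture from start/end touch points:
    'pinch' if the fingers moved together, 'spread' if they moved
    apart, None if the change is within the threshold (in pixels)."""
    d0 = math.dist(p1_start, p2_start)
    d1 = math.dist(p1_end, p2_end)
    if d1 < d0 - threshold:
        return "pinch"
    if d1 > d0 + threshold:
        return "spread"
    return None
```

In a real handler the distance would be sampled continuously so the zoom tracks the fingers, but the classification logic is the same.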

Page 34: Gesture Based Computing

CURSE OF TWO FINGERED ZOOM!

The more complicated the gesture, the fewer people will use it!

Page 36: Gesture Based Computing

USER GENERATED

Page 37: Gesture Based Computing

FREE FORM GESTURES

Page 38: Gesture Based Computing

SPATIAL GESTURE MODELS

3D Model Based
  - Skeletal
  - Volumetric: NURBS, Primitives, Super-quadrics

Appearance Based
  - Deformable 2D Templates
  - Image Sequences

Page 39: Gesture Based Computing

EXISTING FREE FORM GESTURE PATTERNS

Pointing: Wiimote

Page 40: Gesture Based Computing

EXISTING FREE FORM GESTURE PATTERNS

Proximity: Fukuda’s Automatic Door

Wave: AirSwitch Light

Page 41: Gesture Based Computing

EXISTING FREE FORM GESTURE PATTERNS

Hands Inside: Dyson Airblade

Rotate: Nokia N93

Page 42: Gesture Based Computing

EXISTING FREE FORM GESTURE PATTERNS

Step: Dance Mat

Shake: SE W910i

Page 43: Gesture Based Computing

EXISTING FREE FORM GESTURE PATTERNS

Clap

Tilt: Wii Balance Board
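A shake gesture like the SE W910i's can be detected from raw accelerometer samples; the thresholds and the (x, y, z) sample format below are assumptions for illustration.

```python
# Free-form "shake" detection from accelerometer samples.
SHAKE_THRESHOLD = 15.0   # acceleration magnitude (m/s^2) counting as a jolt
SHAKES_REQUIRED = 3      # jolts needed before we report a shake

def detect_shake(samples):
    """Return True if enough high-acceleration jolts occur in the
    window of (x, y, z) samples; gravity alone (~9.8) never triggers."""
    jolts = 0
    for x, y, z in samples:
        magnitude = (x * x + y * y + z * z) ** 0.5
        if magnitude > SHAKE_THRESHOLD:
            jolts += 1
    return jolts >= SHAKES_REQUIRED
```

Requiring several jolts rather than one keeps a single bump of the phone from registering as a deliberate shake.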

Page 44: Gesture Based Computing

MATCHING GESTURE TO ACTIVITY

Moving a Cursor or Avatar: Slide, Head Tilt, Turn Head Left/Right, Lean Torso Left/Right, Point

Confirmation: Nod, Smile, Okay, Thumbs Up, Nose Tap

Select: Tap, Stare, Point, Hand Gun

Cancel: Shake No, Frown, Thumbs Down, Stop

Switch On/Off: Tap, Flick, Stomp, Wave, Clap, Snap

Dan Saffer

Page 45: Gesture Based Computing

COMMUNICATING GESTURES

Timo Arnall

Page 48: Gesture Based Computing

KINECT SDK

The SDK includes:

1. Raw sensor streams: access to low-level streams from the depth sensor, colour camera sensor, and four-element microphone array.

2. Skeletal tracking: the capability to track the skeleton image of one or two people moving within the field of view, for gesture-driven applications.

3. Advanced audio capabilities: sophisticated acoustic noise suppression and echo cancellation, beam formation to identify the current sound source, and integration with the Windows speech recognition API.

4. Sample code and documentation.
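Gesture recognition sits on top of the skeletal tracking stream. The sketch below does not use the real Kinect SDK API; it assumes frames arrive as dicts of joint name to (x, y) screen coordinates (y increasing upward), and recognises a wave as the hand crossing the elbow several times while raised.

```python
def detect_wave(frames, min_crossings=3):
    """Detect a wave from a sequence of skeletal frames: the right hand
    stays above the right elbow while crossing from one side of the
    elbow to the other at least min_crossings times."""
    crossings = 0
    last_side = None
    for joints in frames:
        hx, hy = joints["hand_right"]
        ex, ey = joints["elbow_right"]
        if hy <= ey:                 # hand dropped below elbow: reset
            last_side = None
            continue
        side = "left" if hx < ex else "right"
        if last_side is not None and side != last_side:
            crossings += 1
        last_side = side
    return crossings >= min_crossings
```

The same pattern (state machine over joint positions) extends to nods, leans and the other free-form gestures in the activity table earlier.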

Page 51: Gesture Based Computing

CONCLUSIONS

• Currently, gestures can be classified predominantly as touch or free form.

• The finger is not a direct replacement for the mouse.

• The more complicated the gesture, the fewer the number of people who will use it successfully.

• If gestures are not obvious to the user, they need to be clearly communicated.

• As yet our vocabulary of free-form gestures is limited and needs much more development.

• Free-form gestures should not make the user feel embarrassed.

Page 52: Gesture Based Computing

QUESTIONS

@mysticmobile