19 February 2008

Facial Expression for Human-Robot Interaction – A prototype

Matthias Wimmer, Technische Universität München
Bruce MacDonald, Dinuka Jayamuni, and Arpit Yadav, Department of Electrical and Computer Engineering, The University of Auckland, New Zealand
http://robotics.ece.auckland.ac.nz


Page 1: Facial Expression for Human-Robot Interaction – A prototype

Matthias Wimmer, Technische Universität München
Bruce MacDonald, Dinuka Jayamuni, and Arpit Yadav, Department of Electrical and Computer Engineering, The University of Auckland
http://robotics.ece.auckland.ac.nz
19 February 2008

Page 2: Outline

Motivation
Background
Facial expression recognition method
Results on a data set

Results with a robot (the paper contribution)

Conclusions

Page 3: Motivation: Goal

Our Robotics group goals:
To create mobile robotic assistants for humans
To make robots easier to customize and to program by end users
To enhance interactions between robots and humans
Applications: healthcare, e.g. aged care
Applications: agriculture (e.g. Ian's previous presentation)
(Lab visit this afternoon)
Robotface

Page 4: Motivation: robots in human spaces

Increasingly, robots live in human spaces and interact closely

InTouch remote doctor

Page 5: Motivation: close interactions

RI-MAN

http://www.bmc.riken.jp/~RI-MAN/index_us.html

Page 6: Motivation: different types of robot

Robots have many forms; how do people react?

Pyxis HelpMate SP Robotic Courier System, Delta Regional Medical Centre, Greenville, Mississippi

Page 7: Motivation: different robot behaviour

AIBO (Sony)

Paro, the therapeutic baby seal robot companion
http://www.aist.go.jp/aist_e/latest_research/2004/20041208_2/20041208_2.html

Page 8: Motivation: supporting the emotion dimension

Robots must give support with psychological dimensions:
home and hospital help
therapy
companionship

We must understand/design the psychology of the exchange
Emotions play a significant role
Robots must respond to and display emotions
Emotions support cognition
Robots must have emotional intelligence
E.g. during robot-assisted learning
E.g. security screening robots
Humans' anxiety can be reduced if a robot responds well [Rani et al, 2006]

Page 9: Motivation: functionality of emotion response

Not just to be “nice”; the emotion dimension is essential to effective robot functionality [Breazeal]

Page 10: Motivation: robots must distinguish human emotional state

However, recognition of human emotions is not straightforward:
outward expression versus internal mood states

People smile when happy AND when interacting with humans

Olympic medalists don’t smile until the presenter appears (eg 1948 football team)

Ten pin bowlers smile when they turn back to their friends

Page 11: Motivation: deciphering human emotions

Self-reports are more accurate than observer ratings
Current research attempts to decipher human emotions:
facial expressions
speech expression
heart rate, skin temperature, skin conductivity

www.cortechsolutions.com

Page 12: Motivation: Our focus is on facial expressions

Despite the limitations, we focus on facial expression interpretation from visual information.
Portable, contactless
Needs no special or additional sensors
Similar to humans' interpretation of emotions (which is by vision and speech)
No interference with normal HRI

www.euron.org

Asimo

Page 13: Background

Six universal facial expressions (Ekman et al.):
Laughing, surprised, afraid, disgusted, sad, angry

Cohn-Kanade Facial Expression database (488 sequences, 97 people)
Performed
Exaggerated

Determined by:
Shape
Muscle motion

Page 14: Background: Why are they difficult to estimate?

Different faces look different
Hair, beard, skin-color, …

Different facial poses
Only slight muscle activity

Page 15: Background

Typical FER process [Pantic & Rothkrantz, 2000]
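The three-stage process referenced above (face detection, feature extraction, expression classification) can be sketched as function stubs. This is an illustrative skeleton only; all function names and the placeholder logic inside them are assumptions, not the authors' code.

```python
# Hypothetical sketch of the FER pipeline described by Pantic & Rothkrantz:
# face detection -> feature extraction -> expression classification.
# Every body below is a placeholder standing in for a real component.

def detect_face(image):
    """Stage 1: return a bounding box (x, y, w, h) around the face."""
    # A real system would run a face detector (e.g. Viola & Jones) here.
    return (40, 30, 120, 120)  # placeholder box

def extract_features(image, box):
    """Stage 2: derive a feature vector from the face region."""
    x, y, w, h = box
    return [w / h, x + w / 2, y + h / 2]  # placeholder features

def classify_expression(features):
    """Stage 3: map features to one of the six universal expressions."""
    labels = ["laughing", "surprised", "afraid", "disgusted", "sad", "angry"]
    return labels[int(sum(features)) % len(labels)]  # placeholder rule

image = None  # stands in for a camera frame
box = detect_face(image)
expression = classify_expression(extract_features(image, box))
```

The point of the sketch is the staged interface: each stage consumes the previous stage's output, so components can be swapped independently.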

Page 16: Background: Challenges

1. Face detection and 2. feature extraction challenges:
Varying shape, colour, texture, feature location, hair
Spectacles, hats
Lighting conditions including shadows

3. Facial expression classification challenges:
Machine learning

Page 17: Background: related work

Cohen et al: 3D wireframe with 16 surface patches
Bezier volume parameters for patches
Bayesian network classifiers
HMMs model muscle activity over time

Bartlett et al: Gabor filters using AdaBoost, support vector machines
93% accuracy on Cohn-Kanade DB
Is tuned to the DB

Page 18: Background: challenges for robots

Less constrained face pose and distance from camera
Human may not be facing the robot
Human may be moving
More difficulty in controlling lighting
Robots move away!
Real-time result is needed (since the robot moves)

Page 19: Facial expression recognition (FER) method (Matt's model-based approach)

Page 20: FER method

Cootes et al statistics-based deformable model (134 points)
Translation, scaling, rotation
Vector b of 17 face configuration parameters
Rotate head: b1; open mouth: b3; change gaze direction: b10
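The pose part of such a model (translation, scaling, rotation applied to a fixed point set) can be illustrated with a toy shape. This is a minimal sketch: the 5-point square below is a stand-in for the 134-point face model, and the parameter values are arbitrary.

```python
import numpy as np

# Toy illustration of posing a Cootes-style shape model in the image frame:
# rotate the base shape, scale it uniformly, then translate it.
base_shape = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 0.5]])

def pose_shape(shape, tx, ty, scale, theta):
    """Apply rotation by theta (radians), uniform scaling, then translation."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    return scale * shape @ rot.T + np.array([tx, ty])

# Rotate 90 degrees, scale by 20, move the origin to (100, 50).
posed = pose_shape(base_shape, tx=100.0, ty=50.0, scale=20.0, theta=np.pi / 2)
```

The shape parameters b (mouth opening, gaze, etc.) would additionally deform `base_shape` before this rigid transform is applied; only the rigid part is shown here.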

Page 21: FER method: Model-based image interpretation

The model: contains a parameter vector that represents the model's configuration.

The objective function: calculates a value that indicates how accurately a parameterized model matches an image.

The fitting algorithm: searches for the model parameters that describe the image best, i.e. it minimizes the objective function.
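An objective function of this kind can be made concrete with a toy example. The sketch below is an assumption for illustration: a one-parameter "model" and three observed feature points, with the objective defined as mean squared distance between model points and observations (the slides do not specify the actual metric).

```python
import numpy as np

# Illustrative objective function: mean squared distance between the
# model's projected points and observed image feature points. Smaller is
# better; the fitting algorithm searches parameters to minimise it.
observed = np.array([[10.0, 12.0], [20.0, 11.0], [30.0, 13.0]])

def model_points(params):
    """A toy 1-parameter 'model': three points shifted vertically by params[0]."""
    return np.array([[10.0, 10.0], [20.0, 10.0], [30.0, 10.0]]) + [0.0, params[0]]

def objective(params):
    return float(np.mean(np.sum((model_points(params) - observed) ** 2, axis=1)))

# Shifting the model up by 2 fits the observations better than not shifting:
# objective([2.0]) < objective([0.0])
```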

Page 22: FER method

Two-step process for skin colour: see [Wimmer et al, 2006]
Viola & Jones technique detects a rectangle around the face
Derive affine transformation parameters of the face model
Estimate b parameters
Viola & Jones repeated
Features are learned to localize face features
Objective function compares an image to a model
Fitting algorithm searches for a good model
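Deriving affine parameters from a detected rectangle can be sketched as follows. This is a hedged illustration: the reference size constant and the centre/scale formulas are assumptions, not the values or method from the paper.

```python
# Hypothetical sketch: initialise the face model's affine (pose) parameters
# from the rectangle returned by a Viola & Jones style detector.
REFERENCE_FACE_SIZE = 100.0  # assumed nominal size at which the model is defined

def pose_from_box(x, y, w, h):
    """Translation = box centre; scale = detected size / model reference size."""
    tx, ty = x + w / 2.0, y + h / 2.0
    scale = ((w + h) / 2.0) / REFERENCE_FACE_SIZE
    return tx, ty, scale

tx, ty, scale = pose_from_box(40, 30, 120, 120)  # -> (100.0, 90.0, 1.2)
```

The fitting algorithm would then refine this rough initial pose, and the b parameters, against the objective function.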

Page 23: FER method: learned objective function

Reduce manual processing requirements by learning the objective function [Wimmer et al, 2007a & 2007b]
Fitting method: hill-climbing
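Hill-climbing itself is simple to sketch. Below, a toy quadratic stands in for the learned objective function; the greedy coordinate-wise search with step shrinking is one common variant, assumed here for illustration rather than taken from the paper.

```python
# Minimal hill-climbing sketch for model fitting. The quadratic objective
# (minimum at params = (3.0, -1.0)) is a toy stand-in for the learned
# objective function from the slides.

def objective(params):
    x, y = params
    return (x - 3.0) ** 2 + (y + 1.0) ** 2

def hill_climb(objective, start, step=1.0, min_step=1e-4):
    """Greedy coordinate search: try +/- step on each parameter, keep any
    improvement, and halve the step once no move improves the objective."""
    current = list(start)
    best = objective(current)
    while step > min_step:
        improved = False
        for i in range(len(current)):
            for delta in (step, -step):
                candidate = list(current)
                candidate[i] += delta
                value = objective(candidate)
                if value < best:
                    current, best, improved = candidate, value, True
        if not improved:
            step /= 2.0
    return current, best

params, value = hill_climb(objective, start=[0.0, 0.0])
```

Hill-climbing only finds a local minimum, which is why a good initial pose (e.g. from the Viola & Jones rectangle) matters.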

Page 24: FER method

Facial feature extraction:
Structural (configuration b) and temporal features (2 secs)

Expression classification:
Binary decision tree classifier is trained on 2/3 of the data set
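The 2/3 train split can be illustrated with synthetic data and a depth-1 "tree" (a decision stump). Everything below is an assumption for illustration: the slides say only that a binary decision tree is trained on 2/3 of the data, not how it is built or what the features look like.

```python
import random

# Illustrative 2/3 train / 1/3 test split with a synthetic 1-D feature
# (two classes separated around x = 1.0) and a decision stump whose
# threshold is the midpoint of the per-class training means.
random.seed(0)
data = [(random.random(), 0) for _ in range(60)] + \
       [(random.random() + 1.0, 1) for _ in range(60)]
random.shuffle(data)

split = (2 * len(data)) // 3          # 2/3 of the samples for training
train, test = data[:split], data[split:]

mean0 = sum(x for x, y in train if y == 0) / sum(1 for _, y in train if y == 0)
mean1 = sum(x for x, y in train if y == 1) / sum(1 for _, y in train if y == 1)
threshold = (mean0 + mean1) / 2.0     # the stump's single decision boundary

accuracy = sum((x > threshold) == (y == 1) for x, y in test) / len(test)
```

A real decision tree would recurse on many such thresholds over the structural and temporal features; the held-out 1/3 plays the same role as here, measuring generalisation.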

Page 25: Results on a dataset

Happiness and fear have similar muscle activity around the mouth, hence the confusion between them.

Page 26: Results on a robot

B21r robot
Some controlled lighting
Human about 1 m away
120 readings of three facial expressions
12 frames a second possible
Tests at 1 frame per second

Page 27: Conclusions

Robots must respond to human emotional states
Model-based FER technique (Wimmer)
70% accuracy on Cohn-Kanade data set (6 expressions)
67% accuracy on a B21r robot (3 expressions)

Future work: better FER is needed
Improved techniques
Better integration with robot software
Improve accuracy by fusing vital signs measurements