


ICRA2009: 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, May 12-17, 2009

Evaluation of a robot as embodied interface for Brain Computer Interface systems

E. Menegatti, L. Tonin (Intelligent Autonomous System Laboratory (IAS-Lab), Department of Information Engineering, University of Padua, Italy)
F. Piccione, S. Silvoni (I.R.C.C.S. San Camillo, Venice, Italy)
K. Priftis (Department of General Psychology, University of Padua, Italy)

Goals: Evaluate the advantages of a BCI system when the actions triggered by the subject's brain activity are performed by a physical device in the real world (i.e., a mobile robot instead of a GUI).

Motivations and purposes:

Can feedback from the robot and its on-board camera lead to higher engagement of the subjects?

Can it lead to better BCI performance?

Can telepresence improve patients' quality of life?

Figure 1: BCI with a robot as physical device providing feedback from the real world.

Figure 2: P300-related peaks in a sample EEG trace.

The holonomic robot: In the first experiment, the robot simply replicates in the real world the motion of the virtual cursor on the screen. We used a holonomic robot with an omnidirectional camera because:

it can move to any position in the plane without needing to rotate;

the omnidirectional video can be used both for streaming images to the user and for generating feedback to the user.

Figure 3: BENDER, the holonomic robot with omnidirectional camera.
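A holonomic base decouples translation from rotation: the same chassis velocity (vx, vy, omega) can be commanded regardless of heading, which is what lets the robot mirror the cursor's four straight-line moves directly. As a minimal sketch (not BENDER's actual controller; the wheel angles and base radius are assumed), the inverse kinematics of a generic three-wheel omnidirectional base:

```python
import math

def wheel_speeds(vx, vy, omega, base_radius=0.2,
                 wheel_angles=(90.0, 210.0, 330.0)):
    """Inverse kinematics of a three-wheel omnidirectional base (illustrative).

    vx, vy: desired chassis translation (m/s); omega: rotation (rad/s).
    Returns the linear speed along each wheel's drive direction (m/s).
    Wheel placement angles and radius are assumptions, not BENDER's geometry.
    """
    speeds = []
    for a_deg in wheel_angles:
        a = math.radians(a_deg)
        # Each omni wheel drives tangentially to the chassis circle,
        # so its speed is the projection of the chassis velocity plus
        # the rotational component.
        speeds.append(-math.sin(a) * vx + math.cos(a) * vy + base_radius * omega)
    return speeds
```

A pure forward or sideways command (omega = 0) moves the chassis without any rotation, so each of the four BCI directions maps to a single velocity command.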

BCI data acquisition: Recording electrodes were placed according to the international 10-20 system at Fz, Cz, Pz, and Oz; the electrooculogram (EOG) was recorded from a pair of electrodes below and lateral to the right eye; all electrodes were referenced to the left earlobe.

Figure 4: Tuning up the electrode configuration.

The five channels were amplified, band-pass filtered between 0.15 Hz and 30 Hz, and digitized at a 200 Hz sampling rate with 16-bit resolution. Every ERP epoch, synchronized with the stimulus, began 500 ms before stimulus onset and extended to 1000 ms after the stimulus trigger (1500 ms total). Thus, after each stimulus (trial) presentation, the system recorded a matrix of 300 samples × 5 channels, available for on-line and off-line data processing.
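At 200 Hz, the 1500 ms window spans exactly 300 samples (100 pre-stimulus plus 200 post-stimulus). A minimal sketch of this epoching step, assuming the continuous recording is held as one list of samples per channel (function and variable names are illustrative, not the system's actual code):

```python
FS = 200                   # sampling rate (Hz)
PRE_S, POST_S = 0.5, 1.0   # epoch window around the stimulus (s)

def extract_epoch(channels, onset_idx):
    """Cut a 1500 ms epoch (n_channels x 300 samples) around a stimulus.

    channels: list of per-channel sample lists (five channels here).
    onset_idx: sample index of the stimulus trigger.
    """
    start = onset_idx - int(PRE_S * FS)   # 100 samples before onset
    stop = onset_idx + int(POST_S * FS)   # 200 samples after onset
    if start < 0 or stop > len(channels[0]):
        raise ValueError("epoch window falls outside the recording")
    return [ch[start:stop] for ch in channels]
```

For a five-channel recording this yields a 5 × 300 matrix per trial, matching the 300 samples per 5 channels described above.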

Experiments:

We performed two experiments:

Experiment 1: the task was performed using only the graphical interface as user feedback. The environment is the monitor showing the graphical interface. The user commands a virtual object, moving it to reach one of the four virtual goal-icons displayed.

Experiment 2: the task was performed using the robot as actuator and the robot's camera view as user feedback. The robot is positioned in the middle of a square room. Physical goal-objects are positioned in the room according to the goal-icons used in the virtual interface during Experiment 1.

The feedback for the subject is the change in the image (grabbed by the robot) displayed on the screen: the image of a goal-object grows as the commands move the robot closer to that physical goal-object.
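The trial logic shared by both experiments can be sketched as: flash the four arrows, score each for a P300 response, and move the cursor (Experiment 1) or the robot (Experiment 2) one step toward the best-scoring direction. The P300 classifier itself is out of scope here; the scores and step size below are placeholders, not the system's actual interface:

```python
# Unit steps for the four goal directions (grid coordinates).
DIRECTIONS = {
    "up": (0, 1),
    "down": (0, -1),
    "left": (-1, 0),
    "right": (1, 0),
}

def apply_p300_command(position, p300_scores, step=1):
    """Move one step toward the direction with the strongest P300 score.

    position: current (x, y) of the cursor or robot.
    p300_scores: dict mapping direction -> classifier score for that
                 flashed arrow (higher = stronger P300 response).
    """
    target = max(p300_scores, key=p300_scores.get)
    dx, dy = DIRECTIONS[target]
    return (position[0] + dx * step, position[1] + dy * step)
```

The same selection step drives either actuator, which is what makes the two experiments directly comparable: only the feedback channel changes.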

Figure 5: representation of a trial during Experiments 1 and 2. The central position of the virtual cursor (1a) and of the robot (1b), with goal-objects and the four arrows, one of which is flashed; the movement of the cursor (2a) and of the robot (2b) after P300 recognition.

[Figure 5 panels: (1a), (1b), (2a), (2b).]

[Figure 6 plot: classification accuracy (%) vs. testing days T1-T4, y-axis 40-100, for the 5 healthy subjects with cursor feedback and the 1 subject with robot-cam feedback; curves show mean, mean of max, and mean of min for each group.]

Results using the robot:

Performance is comparable to that obtained with the previous GUI.

The performance achieved encourages further evaluation of the BCI with the robot as embodied interface.

Figure 6: classification accuracy (%) of the 5 healthy subjects who performed Experiment 1, and classification accuracy (%) of one subject who performed Experiment 2.

Figure 7: screenshot of the new museum graphical user interface, and Rovio, the holonomic home robot.

Future goals:

Telepresence for museum visits: autonomous navigation and a new graphical user interface showing the masterpieces closest to the current location.

Telepresence for home rehabilitation: Rovio, a consumer holonomic robot with a frontal camera for remote perception.
