
Page 1

Moving Through Virtual Reality Without Really Moving?
Spatial orientation, self-motion perception, & perceptually-oriented self-motion simulation in virtual reality
Bernhard Riecke
23 April 2010, Locomotion Lab, SFU

My own background (spatially): Max Planck Institute for Biological Cybernetics, Tübingen, Germany (www.kyb.mpg.de, Cyberneum.de)

Page 2

Virtual Reality group of the Max Planck Institute for Biological Cybernetics in Tübingen, Germany:
– 15 x 12 m free walking area
– Kuka robot-arm-based motion simulator
– 230° spherical/toroid projection screen
– 6DoF Stewart motion platform
– "A Morphable Model for 3D Face Synthesis"
– Omnidirectional treadmill

Page 3

2007/2008: Vanderbilt University (Nashville, TN, USA)
Since May 2008: Simon Fraser University

Page 4

Thanks!
Applied Acoustics, Gothenburg, Sweden (www.poems-project.info)
B2: Object and scene recognition through interaction
B8: Cognitive influence on reflex behavior
Prof. Heinrich Bülthoff, Franck Caniard, Dr. Douglas W. Cunningham, Dr. Markus von der Heyde, Jörg Schulte-Pelkum, Daniel Feuereissen, Prof. John Rieser, Prof. Timothy P. McNamara, Prof. Bobby Bodenheimer
NIMH grant 2-R01-MH57868, P30-EY008126, NSF Grant 0705863
NSERC DRG 611547, RTI 613337, SFU 877317
Daniel Feuereissen, Dinara Moura, Etienne Naugle, Adam Hoyle, Anton Brosas

Research Interest & Motivation
• Amazing technological and scientific progress in electronic media, HCI, & VR
• Novel opportunities for research, interface design & applications
• Novel challenges
  – Spatial disorientation, reduced task performance, insufficient transfer of training, unconvincing simulation
  → Reduced usability & user acceptance
Approach: Fundamental research with an eye toward the applied

Page 5

Old debate in experimentation
Naturalistic observation:
– Realistic stimuli/environment
– Naturalism & ecological validity of stimuli
– Natural interaction
– Closed action-perception loop
– Multimodal stimuli
Lab experiments:
– Full stimulus control
– Repeatability
– Flexibility
– Confound & error reduction
Virtual Reality? Do we perceive and behave similarly in real and computer-simulated environments? "Is virtual reality real enough?"

Overview
Part 1: Spatial orientation
Part 2: Self-motion illusions

Page 6

Part 1: Investigating & Enabling Robust and Effortless Spatial Orientation in VR
• VR: Disorientation
  – even for simple navigation tasks
  – lack of intuitive & robust spatial orientation, especially if no landmarks → what is missing?
  → computationally expensive strategies
  → long response times
  → ineffective & unnatural behavior
  → impaired usability, user acceptance, and real-world transfer
  (Darken and Sibert 1996; Darken and Peterson 2002; Grant and Magee 1998; Klatzky et al. 1998; Lawson et al. 2002; Péruch and Gaunet 1998; Riecke & Wiener 2006, 2007; Ruddle and Jones 2001; Ruddle et al. 1998)
• Real world: robust & effortless spatial orientation
  – automatized and reflex-like "spatial updating" → minimal cognitive load
  (Farrell & Robertson 1998, 2000; Klatzky et al. 1998; Loomis et al. 1996; May & Klatzky 2000; Presson & Montello 1994; Rieser 1989; Wraga et al. 2004)

Approach: Basic research with an eye toward the applied
Applied goal: Natural, effective and unencumbered behavior in computer-mediated environments
Basic research perspective: Multi-modal sensory integration & cues required for automatic spatial updating
Applied perspective: Enable robust & effortless spatial orientation & interaction in VR with minimal simulation effort
Approach: Experiments in real room & VR replica
Ecologically relevant task: Rapid pointing
Methodology: human in ecologically valid context
Theory: unifying framework → explanations & predictions

Page 7

Methods – Setup
• Task: After real/simulated self-rotations, point accurately & quickly to targets outside of the FOV
  – Ecologically valid
  – Reduced cognitive strategies

Results
• Unrestricted FOV improves spatial orientation performance
• VR performance ≈ real world for the same FOV
• Physical motion cues don't improve performance

[Figure: Absolute pointing error [°] for the conditions Real world full FOV, Real world limited FOV, HMD with physical motion, and HMD without physical motion]
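For reference, the absolute pointing error plotted above is just the unsigned angular difference between the indicated target direction and the correct target direction, wrapped into 0–180°. A minimal illustrative sketch (not the original analysis code; the function name is made up):

```python
def absolute_pointing_error(pointed_deg, correct_deg):
    """Unsigned angular difference [deg] between the pointed and the
    correct target direction, wrapped to the range 0-180."""
    diff = (pointed_deg - correct_deg + 180.0) % 360.0 - 180.0
    return abs(diff)

print(absolute_pointing_error(350, 10))   # -> 20.0
```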

Page 8

Results & Conclusions
• Small response times & high pointing accuracy
  → Spatial orientation was easy and intuitive, even in VR
• Response times independent of turning angle
  → Spatial updating occurred automatically during motion
  → Natural, automatic spatial updating
• VR performance ≈ real world (if FOV matched)
  → Real-world-like spatial orientation in VR is possible
  → Suggests real-world transferability
  → Validates the VR paradigm
• Critical for research & applications
  – Riecke et al., Psychological Research (2006)
  – Riecke et al., ACM Transactions on Applied Perception (TAP) (2005)
  – Riecke et al., ACM SIGGRAPH Symposium on Applied Perception in Graphics and Visualization (APGV) (2004)

[Figure: Response time [s], configuration error [°], and self-orientation error [°] for the conditions Real world full FOV, Real world limited FOV, HMD with physical motion, and HMD without physical motion (VR vs. real world)]

How far can we get with just optic flow?
• Optic flow is heavily used in basic research (scholar.google.com: 14,000 hits)
• We can estimate all kinds of things from optic flow:
  – "Optic flow is used to control human walking" (Warren et al., Nature 2001)
  – "Perception of self-motion from visual flow" (Lappe et al., TICS 1999)
  – "Heading detection from optic flow" (Lappe & Rauschecker, Nature 1994)
  – "Humans can use optic flow to estimate distance of travel" (Redlick et al., Vision Res., 2001)
  – "Optic flow helps humans learn to navigate through synthetic environments" (Kirschen et al., Perception 2000)
  – "Path integration from optic flow and body senses in a homing task" (Kearns et al., Perception 2002)
  – "Visually induced illusion of self-motion" (Howard, 1982)
• But: if optic flow specifies a self-motion, do we have a clue where we are?
  Does optic flow enable natural, life-like spatial orientation?
  → use the simplest non-trivial spatial orientation task: "point-to-origin"

Page 9

Main question: How good is spatial orientation in VR given optic flow and no feedback training?
• Do participants have a natural "feeling" of where they are, just like in the real world? (qualitative errors)
• Approach: Use a task that is easy if natural spatial orientation (automatic spatial updating) works, and difficult otherwise
  – Point-to-origin after a simple excursion (translation, rotation, (translation)); see the geometry sketch after this list
  – Similar to homing/triangle-completion tasks in the real world

General methods
• Passive motions; path integration using optic flow without landmarks
• Flat projection screen, FOV: 84° x 63°, 89 cm distance, 1400 x 1050 pixel JVC D-ILA LCoS projector; N=16
• Independent variables:
  – Turning angles 0°, 30°, 60°, 90°, 120°, 150°, 170°, announced before each trial on screen
  – Turning direction left/right (alternating)
  – Length of first segment s1: 16 m or 24 m (s2 = 16 m)
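To make the point-to-origin task concrete, here is a minimal geometric sketch of the correct homing direction after such a two-segment excursion (walk s1, turn, walk s2). This only illustrates the task geometry and is not the experiment code; the function name and sign conventions are assumptions:

```python
import math

def point_to_origin_angle(s1, s2, turn_deg):
    """Correct egocentric pointing direction back to the start after
    walking s1, turning by turn_deg (positive = rightward), and walking s2.
    Returns degrees, positive = to the right of the current heading."""
    a = math.radians(turn_deg)
    # final position, starting at the origin facing "north" (+y)
    x = s2 * math.sin(a)
    y = s1 + s2 * math.cos(a)
    # world azimuth (clockwise from north) of the vector back to the origin
    azimuth_home = math.degrees(math.atan2(-x, -y))
    # egocentric direction = world azimuth minus current heading
    ego = azimuth_home - turn_deg
    return (ego + 180.0) % 360.0 - 180.0     # wrap to (-180, 180]

# e.g. s1 = 16 m, 90° right turn, s2 = 16 m  ->  ~135° (behind and to the right)
print(point_to_origin_angle(16, 16, 90))
```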

Page 10

Example of individual subject data
• Top-down view (Riecke & Wiener, APGV 2007: 2-segment encoding control experiment, turn angles announced)
• Non-inverters (13/24)

Page 11

2-segment encoding control experiment, turn angles announced
• Left-right inverters (11/24); one possible classification criterion is sketched below
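One way such a split into inverters and non-inverters could be made, shown here purely as an illustrative sketch (the criterion actually used in the study is not stated on the slide), is to test whether a participant's signed pointing responses co-vary positively or negatively with the correct directions:

```python
import numpy as np

def classify_inverter(pointed_deg, correct_deg):
    """Label a participant as a left-right 'inverter' if their signed
    pointing responses vary negatively with the correct point-to-origin
    directions, i.e. they systematically mirror left and right."""
    pointed = np.asarray(pointed_deg, dtype=float)
    correct = np.asarray(correct_deg, dtype=float)
    r = np.corrcoef(pointed, correct)[0, 1]     # sign of the correlation
    return "inverter" if r < 0 else "non-inverter"

# toy usage: responses that mirror the correct angles get flagged
print(classify_inverter([-120, 130, -150], [120, -130, 150]))   # -> inverter
```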

Conclusions
• Despite expensive & immersive VR, people are still easily lost in virtual environments…
• Standard VR visualization setups seem insufficient for enabling natural, accurate, and intuitive spatial orientation/updating based on visual path integration
• Challenge: How could we improve spatial orientation in VR?
  – Landmarks & naturalistic scene (e.g., Riecke et al., 2002, 2005, 2006)
  – Active control? Maybe (not) (e.g., Wraga et al., 2004)
  – FOV (e.g., Arthur, 2000; Riecke et al., 2005)
  – Size of display (Tan et al., 2003-6)
  – Display type & geometry (e.g., Riecke et al., 2002, 2005)
  – Feedback training & thinking (but: still no intuitive spatial orientation) (e.g., Gramann et al., 2005; Riecke et al., 2002)
  – Physical locomotion (or at least rotation), self-motion illusions? (e.g., Klatzky et al., 1998; Riecke et al., 2005ff)
• Outlook: The rapid point-to-origin paradigm is a simple yet powerful tool to
  – Evaluate the perceptual & behavioral quality of a VR setup
  – Assess its capability of enabling natural & unencumbered spatial behavior and performance

Page 12

• Real-world-like spatial orientation in VR is possible
• Even without physical motions
  → challenges the prevailing opinion
  (Riecke et al., APGV (2004))
• Optic flow insufficient for automatic spatial updating
• Even when passive physical motions are added
  (Riecke et al., Psychological Res. (2006); Riecke et al., TAP (2005))
• Similar results for a more complex & irregular scene; projection screen & larger FOV more compelling than HMD
  (Riecke et al., SPIE invited paper (2005))
• Display setup critical for turn estimation: HMD < screen; FOV; screen curvature
• Landmarks → perfect homing performance
  (Riecke et al., Presence (2002))
• Continuous visual motion not required for spatial updating — static images of new orientations can be sufficient
  (Riecke et al., TAP (2005); Riecke & Wiener, VR (2007); Riecke, Presence (2008))
• No/unreliable landmarks → systematic errors in mental spatial reasoning → striking qualitative & quantitative errors, even for the simplest tasks
  (Literature: Riecke & von der Heyde (2002, 2005); Riecke & McNamara (2007))
• Theoretical spatial orientation framework → explanation of experimental findings, predictions, deeper understanding

Part 2: Self-Motion Illusions ("Vection") & Perceptually-Oriented Self-Motion Simulation in VR
• "Train illusion"
• Normally, the world doesn't move… (scene from Top Secret)

Page 13

What cues can induce vection?
• Visual vection (flowing river, waterfall, train illusion)
  – Mach (1875), Helmholtz (1896), Fischer & Kornmüller (1930), Tschermak (1931): "linear/circular vection"
• Auditory vection
  – 20-60% can perceive auditorily induced vection (Dodge, 1923; Hennebert, 1960; Lackner, 1977; Marmekarelse & Bles, 1977; Larsson et al., 2004ff; Väljamäe et al., 2004ff, …)
  – But: much weaker than visual vection
• Biomechanical (stepping around) circular vection
  – Most (>90%?) perceive it (e.g., Bles & Kapteyn, 1977; Bles, 1981)
  – Vection onset time: ca. 20 s (Becker et al., 2002; Bruggeman et al., submitted)

[Figure: Vection intensity over time, from object-motion (no vection) via mixed object & self-motion (partial vection) to pure self-motion (saturated vection); vection onset marked]

Motivation: Why study self-motion illusions ("vection")?
• Theoretically interesting: cue combination
• Compelling ("embodied")
• Related to challenges in VR applications:
  – Convincingness and naturalism
  – Presence & immersion (Prothero, 1995; Riecke et al., 2005ff; Stanney, 2002)
  – Spatial orientation (hypothesis by Riecke & von der Heyde, 2003)

Page 14

Fulfill the vision of Virtual Reality: a computer-generated world accepted as an alternate "reality", as compelling, immersive, & empowering as the real world
Fundamental research perspective. Goal: Understand the human system: self-motion perception, multi-modal & higher-level contributions
Applied perspective. Goal: How can we simulate self-motions more effectively at minimal cost? → Empower humans to effectively & intuitively interact with computers/VR
Approach: Perceptual/behavioral experiments in naturalistic, multi-modal VR
Methodology: human in ecologically valid context whenever possible
Theory: unifying framework → deeper understanding, predictions, hypothesis-testing
Research philosophy: Human-centered, perceptually- and behaviorally-oriented perspective
(inspires, motivates, enables)
VR as Enabling Technology: Multi-modal, naturalistic & immersive VR provides the unique opportunity to study human perception & behavior in reproducible, clearly defined & controllable experimental conditions in a closed action-perception loop

Visual Vection – Why use naturalistic stimuli? Higher-level influences
• Main idea: Scene scrambling should reduce global scene consistency, depth cues & spatial presence (higher-level factors) while hardly altering image statistics (bottom-up factors); a minimal implementation sketch follows below.
• Stimuli: "globally consistent" vs. "globally inconsistent" (scrambled)
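A minimal sketch of one way such tile-based scrambling could be implemented (illustrative only; tile size, random seed, and shuffling scheme are assumptions rather than the exact stimulus-generation procedure used):

```python
import numpy as np

def scramble_scene(img, tile=32, seed=0):
    """Cut an image (H x W x C array) into square tiles and shuffle them:
    global scene consistency and pictorial depth cues are destroyed while
    local image statistics stay largely the same."""
    rng = np.random.default_rng(seed)
    h = img.shape[0] - img.shape[0] % tile          # crop to a multiple of the tile size
    w = img.shape[1] - img.shape[1] % tile
    img = img[:h, :w]
    tiles = [img[r:r + tile, c:c + tile]            # cut into tiles
             for r in range(0, h, tile) for c in range(0, w, tile)]
    order = rng.permutation(len(tiles))             # random tile order
    tiles = [tiles[i] for i in order]
    cols = w // tile
    rows = [np.hstack(tiles[i:i + cols]) for i in range(0, len(tiles), cols)]
    return np.vstack(rows)                          # reassemble the scrambled image
```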

Page 15

Demo & Experimental Design
• Relaxed viewing without fixation
[Figure: Joystick time course (= vection intensity) over time, annotated with vection onset time, vection buildup time, and maximum vection intensity [%]]
Additional measures:
• Convincingness of motion [%]
• Presence questionnaire (Schubert et al., Presence 2001)
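A minimal sketch of how these vection measures could be read off a recorded joystick time course (the onset threshold and the peak-based buildup definition are assumptions for illustration, not necessarily the analysis actually used):

```python
import numpy as np

def vection_measures(t, joystick, onset_frac=0.05):
    """Derive vection measures from a joystick deflection time course
    (0 = no vection, 1 = saturated vection).  onset_frac is an assumed
    detection threshold for vection onset."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(joystick, dtype=float)
    above = np.flatnonzero(y > onset_frac)
    if above.size == 0:                                   # no vection reported
        return {"onset_time": None, "buildup_time": None, "max_intensity": 0.0}
    onset_time = t[above[0]]                              # vection onset time [s]
    peak_time = t[int(np.argmax(y))]                      # time of maximum deflection
    return {"onset_time": onset_time,
            "buildup_time": peak_time - onset_time,       # onset to maximum [s]
            "max_intensity": y.max() * 100.0}             # maximum vection intensity [%]
```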

Global Scene Consistency – Vection Measures
[Figure: Joystick time course (= vection intensity) over time, with maximum vection intensity, vection buildup time, and vection onset time marked]

Page 16

Global Scene Consistency – Presence Ratings
[Figure: Presence ratings for the consistent vs. inconsistent (scrambled) scene (presence questionnaire by Schubert et al., Presence 2001)]

Global Scene Consistency – Conclusions
• Stimuli depicting a consistent natural scene produce a faster, stronger, and more convincing sensation of illusory self-motion
• Higher-level processes (global scene consistency & pictorial depth cues) are important factors for the ego-motion sensation
• Hypothesis: stable reference frame → "spatial presence" & involvement

Page 17

Can auditory cues facilitate biomechanical vection?
Bernhard E. Riecke, Daniel Feuereissen, John J. Rieser & Timothy McNamara
Email: [email protected]
Vanderbilt - MPI for Biological Cybernetics – Simon Fraser Univ.

Setup
[Setup: circular treadmill with "birds" and "river" sound sources, 270° apart]

Page 18

Methods – Auditory stimulus generation
• Binaural recording via in-ear mics (Core Sound Binaural Microphone Set)
[Diagram: "birds" and "river" sound sources, 270° apart]

Methods – Auditorily induced circular vection
• Binaural display via headphones (Audiotechnica AT-7ANC active noise-canceling headphones)
[Diagram: landmark objects at door 45°, mat 135°, beer 225°, ape 315°; "birds" and "river" sound sources, 270° apart]

Page 19

Time-course & measures of vection
[Figure: Vection intensity over time, from object-motion (no vection) via mixed object & self-motion (partial vection) to pure self-motion (saturated vection); annotated with vection onset time, vection intensity at onset, vection intensity at end of trial, and overall vection intensity. Stimulus velocity [°/s] over a trial of roughly 110 s, peak 60°/s, with segments of about 10 s, 3 s, 90 s, 9 s, and 3 s]
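Assuming the velocity trace in this figure is a simple trapezoid (a stationary pre-period, roughly 3 s ramps, and a roughly 90 s plateau at 60°/s), a rough sketch of generating such a stimulus profile might look like the following; the exact segment mapping is an assumption, not taken from the slide:

```python
import numpy as np

def velocity_profile(peak=60.0, ramp=3.0, plateau=90.0, pre=10.0, dt=0.1):
    """Trapezoidal rotation-velocity profile [deg/s]: stationary pre-period,
    linear ramp up, constant-velocity plateau, linear ramp down."""
    segments = [np.zeros(int(pre / dt)),                    # stationary baseline
                np.linspace(0.0, peak, int(ramp / dt)),     # acceleration ramp
                np.full(int(plateau / dt), peak),           # constant-velocity plateau
                np.linspace(peak, 0.0, int(ramp / dt))]     # deceleration ramp
    return np.concatenate(segments)

velocity = velocity_profile()          # one value per 0.1 s time step
```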

Can auditory cues facilitate biomechanical vection? Auditorily + biomechanically induced circular vection
[Conditions: auditory vection (rotating sound field with "birds" and "river" sources, 270° apart), biomechanical vection (circular treadmill), and auditory & biomechanical vection combined]
• Methods:
  – 3 stimulus combinations x 2 directions (L/R) x 2 repetitions = 12 trials; N=19 currently (see the trial-list sketch below)
• Questions: Can auditory cues
  – Induce circular vection?
  – Facilitate biomechanical vection? (even if too weak to induce vection by themselves?)
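As referenced above, a small sketch of the 12-trial list implied by this design (the condition names and the randomization scheme are illustrative assumptions):

```python
import itertools, random

# 3 stimulus combinations x 2 rotation directions x 2 repetitions = 12 trials
conditions = ["just sound", "sound & platform", "just platform"]
directions = ["left", "right"]
trials = [(c, d) for c, d, _ in itertools.product(conditions, directions, range(2))]
random.Random(0).shuffle(trials)   # randomize presentation order per participant
for i, (cond, direction) in enumerate(trials, 1):
    print(f"trial {i:2d}: {cond:>16s}, rotating {direction}")
```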

Page 20

Results (values read from the bar charts):

Measure                                     Just Sound   Sound & Platform   Just Platform
Overall vection intensity [%]                      8.1               63.5            45.4
Estimated vection onset time [s]                  64.9               16.1            23.9
Realism of actually rotating in room [%]           7.9               64.7            46.5
Percentage of trials with vection [%]             43.4               93.4            90.8

Results
• The rotating sound field by itself was hardly able to evoke circular vection
• Yet, auditory cues significantly enhanced biomechanical vection
• Role of auditory cues for vection: facilitating/supporting other modalities?
• Applied perspective: audio is affordable & robust, yet hi-fi


Summary
Auditory:
• Moving headphone-based 3D audio rendering can induce vection
• "Acoustic landmarks" help
• Spatialization quality not critical for vection
  (Larsson et al., Presence (2005); Riecke et al., APGV (2008))
Visual + auditory: cross-modal facilitation
• Adding the spatialized sound of a visual landmark (fountain) enhances vection & presence
  (Riecke et al., Presence (2005), TAP (2009))
• Additional cross-modal benefits for adding matching vestibular & proprioceptive cues: self-generated motion cueing
  (Riecke et al., VRST (2006))
Vestibular: minimal motion cueing (jerks of a few cm)
  (Riecke et al., IMRF (2006); Schulte-Pelkum (2008); Wong & Frost, 1981)
Visual: natural, globally consistent scene & depth cues enhance visually-induced vection & presence
  (Riecke et al., APGV (2005), TAP (2006); Schulte-Pelkum & Riecke, HOP (2009))
  → Cognitive & higher-level influences
Audio + infrasound or vibrations
  (Riecke et al., HCI (2005), IEEE VR (2005))
Audio + biomechanical cues (stepping on circular treadmill)
  (Riecke et al., Cyberwalk (2008))
  → Multi-modal spatio-temporal consistency
Vibrations; possibility of actual motion
  (Riecke et al., TAP (2009); Lepecq et al. (1995); Wright et al. (2006))
Vection can be reliably induced and studied using a VR setup
  (Riecke et al., VR (2005), APGV (2008); Schulte-Pelkum (2008))
Further info: www.siat.sfu.ca/faculty/Bernhard-Riecke/ www.poems-project.info

Page 21

iSpace lab, [email protected], www.siat.sfu.ca/faculty/Bernhard-Riecke/
Long-term vision: Understanding & enabling effective spatial perception and behaviour in VR, and using this knowledge to design novel, more effective human-computer interfaces/interaction paradigms
• Exploit multi-modal self-motion illusions
• Leverage spatial updating
• Minimize reference frame conflicts
Contributing disciplines: Cognitive Science, Mechatronics Engineering, Computer Science, Psychology, Human Factors & HCI
VR as Enabling Technology: Multi-modal, naturalistic & immersive VR provides the unique opportunity to study human perception & behavior in reproducible, clearly defined & controllable experimental conditions in a closed action-perception loop
Fundamental research perspective: Understand human multi-modal perception/interaction with real/virtual environments: what constitutes natural, robust, intuitive, & embodied human perception & behavior?
Applied perspective: Empower humans to effectively & intuitively interact with computers/computer-generated environments
(inspires, motivates, enables)

Page 22

Systems Engineering
• Rotating rings
• Frame & supports
• Motors
• Drive train

Mechanical Engineering
• Frame design and analysis: stress analysis, displacement analysis

Page 23

Mechanical Engineering
• Drive train: inner disk & foot ring, double-flanged bearings, ball bearing, synchronous belt sheaves, drive shaft

iSpacers:
• Daniel Feuereissen (MSc)
• Dinara Moura (PhD)
Mechatronics Co-op students:
• Etienne Naugle
• Adam Hoyle
• Anton Brosas
Soon:
• Lonnie Hastings (PhD)
• Jay Vidyarthi (MSc)
• Rances Narvaez (PhD)
