VIRTUAL REALITY. Input devices. Technologies for direct interaction. Course 2012/2013


Page 1:

VIRTUAL REALITY

Input devices. Technologies for direct interaction

Course 2012/2013

Page 2:

Motivation

• Virtual Reality systems use, in addition to conventional hardware, specific hardware that is worth knowing.

• In general this hardware is not well known, but:
– It accounts for a large part of the cost of a complete VR system.
– The success or failure of the system can depend on choosing these devices correctly.

Page 3:

Motivation (ii)

• Moreover, the virtual reality hardware market has the following characteristics:
– Few distributors (and not well trained!)
– Difficulties in testing before buying.
– Difficulties in evaluating the product from the documentation.
– A market in continuous evolution (products and manufacturers).

⇒ Hence the importance of training in VR input systems.

Page 4:

Overview

1. Concepts and classification
2. Obtaining position (trackers)
3. Data gloves
4. Other input devices
5. Kinect
6. Haptic devices (input / output devices)

Page 5:

1. Concepts

• Input devices (sensors): they capture the participant's actions (for example head movements) and send this information to the computer in charge of the interactive simulation.

• They are considered virtual reality devices when:
– They use the implicit interaction paradigm, or
– They give 3D input to the system.

• Related, but not considered virtual reality devices:
– Devices for the acquisition of 3D models.

Page 6:

Classification

• Implicit interaction:
– Trackers
– Data gloves
– Voice recognition
– Kinect

• 3D input:
– 3-6 DOF mice
– Space balls, etc.

• Haptic devices (input-output)

• Capture (used to create virtual worlds)
– 3D scanners

Page 7:

Trackers

Page 8:

Trackers

• Objective
– Tracking input systems are sensors in charge of capturing the position and/or orientation of a real object.

• Possibilities
– Only position (3 DOF)
– Only orientation (2-3 DOF)
– Position and orientation (6 DOF)

Page 9:

Applications

• Head-tracking, eye-tracking
– Eye-referenced images
– Navigation based on head movements

Page 10:

Applications

• Hand-tracking, device-tracking
– Direct manipulation of objects (multiple metaphors).
– Hand-based navigation (together with gloves).
– Sign language. Communication with other users.

Page 11:

Applications

• Face-tracking, body-tracking
– Avatar-based interaction (games)
– Off-line capture of complex sequences (sports)

Page 12:

About position

The position can be given in two ways:

• Absolute position (x, y, z): the position is given in known units with respect to a fixed coordinate system.

• Relative position (dx, dy, dz): the position is given as a displacement with respect to the initial position or to the previous one.
– Requires calibration (the device must start in a known position).
– The error accumulates.
– Not exact.

Page 13:

About orientation

• Two reference systems are defined:
– The global one (fixed, referenced for example to the emitter)
– The local one (dynamic, attached to the object)

• We are interested in knowing the rotation (matrix) which transforms the global RS into the local RS.

• The rotation can be represented as follows:
– Direction cosines
– Euler angles
– Angle-axis representation
– Spherical coordinates, polar coordinates
– Quaternion

Page 14:

Direction cosines

• The orientation is represented by 9 values:
ux, uy, uz ← unit vector in the direction of X
vx, vy, vz ← unit vector in the direction of Y
wx, wy, wz ← unit vector in the direction of Z

• The rotation matrix is obtained by placing the 3 vectors as columns.

• It is the most complete and intuitive representation.
• It is very redundant (9 reals!)
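A minimal sketch of this representation (NumPy; the vectors are made-up example values): the three measured unit vectors become the columns of the rotation matrix.

```python
import numpy as np

# Direction cosines: the object's local X, Y, Z axes expressed in the global
# reference system (hypothetical example values).
u = np.array([0.0, 1.0, 0.0])   # local X in global coords
v = np.array([-1.0, 0.0, 0.0])  # local Y in global coords
w = np.array([0.0, 0.0, 1.0])   # local Z in global coords

# Rotation matrix: the three vectors placed as columns (9 values, highly redundant).
R = np.column_stack((u, v, w))

# A valid rotation matrix is orthonormal: R^T R = I and det(R) = +1.
assert np.allclose(R.T @ R, np.eye(3)) and np.isclose(np.linalg.det(R), 1.0)
```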

Page 15:

Euler angles

• The orientation is represented by 3 values:
x, y, z ← rotation angles with respect to each axis

• The rotation is obtained by composing Rotx, Roty, Rotz.

• To avoid ambiguity, the following must be fixed:
– an order in which to apply the rotations (XYZ, XZY, ...)
– an interpretation (alibi, alias).

• It is very intuitive.

Page 16:

Euler angles (ii)

Two interpretations, or ways to visualize a composition of rotations Rx, Ry, Rz:

• Alibi: all rotations are done with respect to one of the three axes of a fixed Global Reference System, in the order Rx, Ry, Rz.

• Alias: the rotations are done with respect to a Local Reference System attached to the object, in the reverse order: Rz, Ry, Rx.
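A small sketch of the equivalence just described (NumPy; the angles are arbitrary example values). The single matrix product Rz·Ry·Rx can be read either way: fixed global axes in the order X, Y, Z (alibi), or moving local axes in the reverse order Z, Y, X (alias).

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Example Euler angles (radians); arbitrary values for illustration.
ax, ay, az = np.radians([30.0, 45.0, 60.0])

# The same matrix product admits both readings:
#  - alibi: rotate about the fixed global axes in the order X, Y, Z;
#  - alias: rotate about the moving local axes in the reverse order Z, Y, X.
M = rot_z(az) @ rot_y(ay) @ rot_x(ax)

p = np.array([1.0, 0.0, 0.0])
print(M @ p)  # the rotated point, identical under either interpretation
```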

Page 17:

Euler angles (iii)

• Problems:
– The rotations are not defined in a unique form:
(z, x, y) = (90, 45, 45) = (45, 0, -45)

– The Cartesian coordinates are independent of one another; the Euler angles are not.

Page 18:

Euler angles (v)

• Sometimes mnemonics taken from aeronautics are used for the angles:

yaw (azimuth, twist)

pitch (elevation, tilt)

roll (somersault)

Page 19:

Angle-axis representation

• The orientation is represented with four values:
x, y, z ← unit vector (arbitrary rotation axis)
angle ← rotation angle with respect to the vector

• The rotation is obtained by rotating angle degrees around the axis determined by the vector (x, y, z).
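A short sketch of how a rotation in this representation can be applied to a point, using Rodrigues' rotation formula (not part of the original slide; NumPy, example values):

```python
import numpy as np

def rotate_axis_angle(p, axis, angle):
    """Rotate point p by `angle` (radians) about the unit vector `axis` (Rodrigues' formula)."""
    k = axis / np.linalg.norm(axis)          # make sure the axis is a unit vector
    c, s = np.cos(angle), np.sin(angle)
    return p * c + np.cross(k, p) * s + k * np.dot(k, p) * (1.0 - c)

# Example: 90 degrees about the Z axis maps (1, 0, 0) to (0, 1, 0).
print(rotate_axis_angle(np.array([1.0, 0.0, 0.0]),
                        np.array([0.0, 0.0, 1.0]),
                        np.radians(90.0)))
```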

Page 20:

Spherical coordinates

• The orientation is represented by 3 values:
long, lat ← longitude/latitude of a unit vector
angle ← rotation angle with respect to the vector

• It is equivalent to the angle-axis representation, but the unit vector is represented in spherical coordinates.

Page 21:

Quaternion

• The orientation is represented with four values:
x, y, z ← vector along the rotation axis (the unit axis scaled by sin(angle/2))
w ← cos(angle/2)

• The rotation is obtained by rotating 2*acos(w) around the axis determined by the direction of (x, y, z).

• It is a notation very similar to the angle-axis representation; it has advantages in animation, because it is easy to interpolate between two orientations.

• It is less intuitive.
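A minimal sketch (assuming unit quaternions stored as (x, y, z, w)) of the angle recovered as 2*acos(w) and of the interpolation (slerp) that makes quaternions attractive for animation:

```python
import numpy as np

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (x, y, z, w) for a rotation of `angle` radians about `axis`."""
    k = axis / np.linalg.norm(axis)
    return np.append(k * np.sin(angle / 2.0), np.cos(angle / 2.0))

def quat_angle(q):
    """Rotation angle encoded by a unit quaternion: 2 * acos(w)."""
    return 2.0 * np.arccos(np.clip(q[3], -1.0, 1.0))

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions, t in [0, 1]."""
    d = np.clip(np.dot(q0, q1), -1.0, 1.0)
    if d < 0.0:            # take the shorter arc
        q1, d = -q1, -d
    theta = np.arccos(d)
    if theta < 1e-6:       # nearly identical orientations: no interpolation needed
        return q0
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

q_start = quat_from_axis_angle(np.array([0.0, 0.0, 1.0]), 0.0)
q_end   = quat_from_axis_angle(np.array([0.0, 0.0, 1.0]), np.radians(90.0))
q_mid   = slerp(q_start, q_end, 0.5)
print(np.degrees(quat_angle(q_mid)))   # ~45 degrees
```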

Page 22:

Technologies

Tracker technologies:
– Magnetic
– Optical
– Acoustic
– Mechanical
– Inertial

Page 23:

Magnetic trackers

How they work:
• They are based on the attenuation of orthogonal magnetic fields.
• Each system includes:
– Emitter: generates the magnetic field.
– Receivers: detect the magnetic field.
– Control unit.

Page 24: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

Emitter:– Is fixed. Defines the global RS.– It has three orthogonal bobines

through which the electric power

Magnetic trackers

24

through which the electric power is passing in a rotative way.

– Each bobine produces a magnetic field perpendicular to the others (sequential in time).

Page 25: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

Sensor:– They are fixed to the object

you want to follow.– They have 3 bobines, but in

Magnetic trackers

25

– They have 3 bobines, but in this case the mesurement the electrical power induced by the magnetic field which envolves the sensor.

– They are usually tied to the central unit by a cable.

Page 26: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

Control unit:– Gives power to the emitter– Read data from the receptors– Solve an equacions system:

Magnetic trackers

26

– Solve an equacions system:• Variables: 6 (3 position + 3 orientation)• Equations: 9 (3 lectures/field * 3 fields) • Is undetermined (hemisphere).

– Send the data (via RS-232, for ex.)

Page 27:

Magnetic trackers

Advantages
• They are the most widely used.
• They do not require line of sight.
• Very precise.
• Multiple receivers; small.

Disadvantages:
• Limited working area (between 2 and 10 meters).
• Sensitive to ferromagnetic surfaces.
• Sensitive to other magnetic fields.
• Cables.

Page 28:

Optical trackers

How they work:
• Based on image processing.
• Different kinds:
– Stereo cameras (2 or more calibrated cameras)
– Laser
– Other image-processing techniques (silhouettes, ...)

Page 29:

Optical trackers

Stereo cameras (2 or more calibrated cameras)
– For each point p to be registered:
• Its position is detected in each image, (x, y) and (x', y') (correspondence problem).
• The 3D point (x, y, z) is obtained by intersecting 2 straight lines.
– They usually require markers:
• Passive: coloured points
• Active: LEDs
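A simplified sketch of the triangulation step (an idealized setup with two pinhole cameras and made-up projection matrices; since the two back-projected rays rarely intersect exactly, a least-squares point is taken):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one marker seen at pixel x1 in camera 1
    (3x4 projection matrix P1) and at pixel x2 in camera 2 (P2)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)          # least-squares solution of A X = 0
    X = Vt[-1]
    return X[:3] / X[3]                  # back from homogeneous coordinates

# Toy setup: two identical cameras, the second one shifted 1 unit along X (hypothetical values).
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])      # intrinsics
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])    # baseline of 1 along X

X_true = np.array([0.2, -0.1, 3.0, 1.0])                         # a marker in front of both cameras
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]                        # its two image projections
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, x1, x2))                               # ~ [0.2, -0.1, 3.0]
```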

Page 30:

Optical trackers

Advantages
• They can register a large number of points.
• Relatively cheap.

Disadvantages
• Line of sight.
• Working space limited to the FOV of the cameras.
• Processing overhead (or a dedicated processor).
• Variable latency.

Page 31:

Acoustic trackers

How they work:
• Based on measuring the propagation time (time-of-flight) of acoustic signals (ultrasound) generated by loudspeakers and registered by microphones.

• Different kinds:
– Fixed emitters, mobile receivers
– Mobile emitters, fixed receivers

Page 32:

Acoustic trackers

• Several combinations:
– A mobile emitter and three fixed receivers:
• Enough to calculate position (3 DOF).
• Intersection of three spheres.
– Three emitters and three receivers:
• Enough to calculate position and orientation (6 DOF).
• The orientation is trivial from the 3 points (e.g. origin, a point on the X axis and a point on the Y axis).
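A sketch of the 3-DOF case: recovering the emitter position from the three measured distances (intersection of three spheres). The receiver layout and distances below are hypothetical; note the two mirror solutions, analogous to the hemisphere ambiguity mentioned for magnetic trackers.

```python
import numpy as np

def trilaterate(x2, x3, y3, d1, d2, d3):
    """Position of a mobile emitter from its distances d1, d2, d3 to three fixed
    receivers at (0,0,0), (x2,0,0) and (x3,y3,0). Closed-form intersection of
    three spheres; the sign of z is ambiguous (two mirror solutions)."""
    x = (d1**2 - d2**2 + x2**2) / (2.0 * x2)
    y = (d1**2 - d3**2 + x3**2 + y3**2 - 2.0 * x3 * x) / (2.0 * y3)
    z = np.sqrt(max(d1**2 - x**2 - y**2, 0.0))
    return np.array([x, y, z]), np.array([x, y, -z])

# Hypothetical layout (meters): receivers 1 m apart, emitter at (0.3, 0.4, 0.5).
p_true = np.array([0.3, 0.4, 0.5])
r1, r2, r3 = np.zeros(3), np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
d1, d2, d3 = (np.linalg.norm(p_true - r) for r in (r1, r2, r3))
print(trilaterate(1.0, 0.0, 1.0, d1, d2, d3))   # (0.3, 0.4, +/-0.5)
```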

Page 33:

Acoustic trackers

[Figure: a mobile transmitter and three fixed receivers; distance1, distance2 and distance3 are measured from the transmitter to each receiver.]

Page 34:

Acoustic trackers

• Advantages
– Very low cost

• Disadvantages
– Line of sight
– The signals are distorted very easily.
– Sensitive to ambient noise, echo, ...
– Latency proportional to the distance (e.g. 3 meters: at least 10 ms).
– Not very precise.

Page 35:

Mechanical trackers

How they work
• An articulated structure with potentiometers attached to the joints.
• Each potentiometer measures the angle of rotation.
• The base point is fixed.
• The point at the other end is free.
• The position and orientation of the free end: concatenation of translations and rotations.

Page 36: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

Advantages• Fast (almost no latency)• Very precise • Electronic very simple.

Mecanical trackers

36

• Electronic very simple.

Inconvenients• Require to fix the object to the free part of the structure.• Limits the freedom of movements.• Cost depends on the mechanical structure.

Page 37:

Inertial trackers

How they work
• Based on small devices which measure the acceleration to which they are subjected.
• They detect acceleration (accelerometers) and rotational orientation (gyroscopes).
• By integrating the acceleration over time we obtain the velocity; by integrating the velocity we obtain the position.
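A toy sketch of this double integration (one axis, simple Euler integration over fixed time steps; real systems also need gravity compensation and drift correction, which is where the accumulated error comes from):

```python
import numpy as np

dt = 0.01                                  # sampling period (100 Hz), example value
t = np.arange(0, 2, dt)
accel = np.where(t < 1.0, 1.0, -1.0)       # accelerate for 1 s, then brake for 1 s (m/s^2)

v = x = 0.0                                # requires a known initial state (calibration)
for a in accel:
    v += a * dt                            # integrate acceleration -> velocity
    x += v * dt                            # integrate velocity     -> position
print(v, x)                                # ends roughly at rest, about 1 m from the start

# Any bias in `accel` would be integrated twice, so the position error grows with time.
```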

Page 38:

Inertial trackers

Advantages:
• Unlimited working area. They do not require cables and are not tied to a fixed RS.

Disadvantages:
• The position is relative.
• The errors accumulate.
• Not exact for measuring absolute position.
• Sensitive to the velocity of movement!

Page 39:

Data Gloves

Page 40:

Data gloves

Function
• They detect the posture of the hand's fingers, expressed as the flexion angles of each finger.
• Three angles per finger (except the thumb: 2).
• They are used in combination with trackers.

Applications
• Direct manipulation of objects.
• Intuitive interaction (navigation, menus).
• Sign-based interaction.

Page 41: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

Technologies• Optical fiber• Mechanic• Resistence

Data gloves

41

• Resistence• Optic

Page 42:

Optical fiber gloves

How they work
• Attenuation of the light which travels through the optical fiber.
• Each finger: one or two loops of specially treated optical fiber.
• The attenuation of the light depends on the curvature.
• The difference between the emitted light and the received light is measured.
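One way the measured attenuation might be mapped to a flexion angle is through a per-user calibration table; the values below are entirely hypothetical, and the next slide explains why such a calibration has to be repeated often.

```python
import numpy as np

# Hypothetical calibration table for one finger joint: light attenuation measured
# with the finger held at known flexion angles (degrees).
calib_attenuation = np.array([0.02, 0.10, 0.23, 0.38, 0.55])   # fraction of light lost
calib_angle       = np.array([0.0, 22.5, 45.0, 67.5, 90.0])

def flexion_angle(attenuation):
    """Estimate the joint's flexion angle from the measured attenuation by
    piecewise-linear interpolation of the calibration table."""
    return float(np.interp(attenuation, calib_attenuation, calib_angle))

print(flexion_angle(0.30))   # roughly halfway between 45 and 67.5 degrees
```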

Page 43:

Optical fiber gloves

Advantages
• Very precise
• Ergonomic
• Very fast

Disadvantages:
• Not exact.
• They require constant calibration.
• The curvature of the fiber does not always match the curvature of the fingers (friction).

Page 44:

Mechanical gloves

How they work
• Similar to mechanical trackers.
• The potentiometers are incorporated in a structure similar to an exoskeleton.
• Other models measure the elongation of a telescopic tube.

Page 45: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

Advantages• Exacts and precise

• Very fast

• Possibility of force-feedback.

Mechanical gloves

45

• Possibility of force-feedback.

Inconvenients• Expensive

• Non ergonomic

Page 46:

Resistive gloves

How they work:
• Resistive ink: a special coating whose electrical resistance varies with the elongation.
• The technology of the popular Mattel PowerGlove.

Advantages:
• Very cheap.

Disadvantages:
• Not precise.

Page 47:

Optical gloves

How they work:
• Based on LEDs or passive markers, captured by video cameras.

Advantages:
• No cables.
• Ergonomic.

Disadvantages:
• Line of sight.
• Lack of precision. High latency.

Page 48:

Other devices

• Microphone
– Voice recognition
– Remote interaction (teleconferencing)

• Devices for the acquisition of 3D models
• Devices for 3D input

Page 49:

Devices for 3D input

2 DOF:
– With a base surface:
• Classical mouse, classical joystick
– Without a base surface:
• Trackball, PSX joystick

3-6 DOF:
– With a base surface:
• Space ball: displacement and angular moment
– Without a base surface:
• 3D balls, stylus, 3D joysticks, ...

Page 50:

Kinect

Page 51:

Microsoft Kinect

• New controller for Microsoft's Xbox 360
• Full-body tracking, face and voice recognition
• Low cost (99 Euros as of April 2011)

Page 52:

Main components

Page 53:

Main components

Video
• Color CMOS camera
• Infrared (IR) CMOS camera
• Infrared projector: 830 nm, 60 mW laser diode.

Audio
• Four microphones

Tilt control
• Motor
• Accelerometer (3 axes)

Processors & memory
• PrimeSense PS1080-A2 chip
• 64 MB DDR2 SDRAM

Page 54:

Image sensors

• Color camera: 640x480 sensor, 640x480 @ 30 fps output
• IR camera: 1280x1024 sensor, 640x480 @ 30 fps output
• Operating range (depth sensor) = 0.4 m - 3.5 m
• FOV = 58° H, 45° V, 70° D
• Spatial resolution (@ 2 m distance) = 3 mm
• Depth resolution (@ 2 m distance) = 1 cm

Page 55:

Depth sensing

• The IR emitter projects an irregular pattern of IR dots of varying intensities.
• The IR camera reconstructs a depth image by recognizing the distortion in this pattern.
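A sketch of how a depth-image pixel can be back-projected to a 3D point, assuming a simple pinhole model with the focal length derived from the horizontal FOV quoted on the previous slide (58°, 640 pixels). This is an approximation, not the calibrated intrinsics of a real device.

```python
import numpy as np

# Approximate pinhole intrinsics from the quoted specs: 640x480 depth image, 58 degree horizontal FOV.
W, H = 640, 480
fx = (W / 2.0) / np.tan(np.radians(58.0 / 2.0))   # ~577 px
fy = fx                                            # assume square pixels
cx, cy = W / 2.0, H / 2.0

def backproject(u, v, depth_m):
    """3D point (meters, camera coordinates) for pixel (u, v) with measured depth."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

print(backproject(400, 300, 2.0))   # a point ~2 m in front of the sensor, right of and below the center
```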

Page 56:

Connecting the Kinect to a PC

Trivial for "old" Kinect models (those sold separately): standard USB 2.0 connector

Page 57:

Further info on Kinect's hardware

http://www.ifixit.com/Teardown/Microsoft-Kinect-Teardown/4066/1
http://openkinect.org/
http://www.freepatentsonline.com/7433024.pdf

Page 58:

Drivers

[Diagram: the Kinect is connected to the application through either driver stack:
– libfreenect (unofficial): low-level data (color & depth map).
– OpenNI drivers + SensorKinect + NITE: high-level data (gestures / poses).
– Windows drivers.]

Page 59:

libfreenect

• www.openkinect.org
• Unofficial, open source driver (GPL2)
• Linux, Windows, OS X
• Raw data (color map, IR map, depth map)
• Installation: trivial
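As an illustration only, assuming the Python bindings shipped with libfreenect (which expose a synchronous `freenect.sync_get_depth()` / `freenect.sync_get_video()` API; check the project for the current bindings), grabbing one raw depth and color frame might look like this:

```python
import numpy as np
import freenect            # Python bindings shipped with libfreenect (assumed installed)

# Grab one raw frame of each kind; each call returns (numpy array, timestamp).
depth, _ = freenect.sync_get_depth()   # 640x480 raw depth values
rgb, _   = freenect.sync_get_video()   # 640x480x3, 8-bit color image

print(depth.shape, depth.dtype, np.min(depth), np.max(depth))
print(rgb.shape, rgb.dtype)
```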

Page 60:

OpenNI + NITE + SensorKinect

• OpenNI
– SDK for natural interfaces; open source
– www.openni.org

• NITE
– OpenNI plugin for gesture/pose recognition; closed source
– www.openni.org

• SensorKinect
– Driver for the Kinect (open source)
– github.com/avin2/SensorKinect

Page 61:

OpenNI

• Framework for modules/sensors providing depth maps, color maps, scene maps, gesture recognition, user pose (skeleton)
• www.openni.org

Page 62:

OpenNI

OpenNI aims to abstract developers from the sensor hardware (Kinect, PrimeSensor, ...) and from the CV algorithms (scene analysis, gesture recognition, pose recognition, ...)

Page 63:

NITE

Middleware for pose/gesture recognition

Page 64:

NITE

• Joint positions: given in real-world coordinates (mm)
• Joint orientations: a 3x3 rotation matrix representing the rotation of the joint's local coordinates w.r.t. world coordinates (the first column is the direction of the joint's +X axis in world coordinates, and so on).
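A small, library-independent sketch of how such a joint orientation matrix is read (the matrix and positions below are made-up example values): each column is one local axis of the joint expressed in world coordinates.

```python
import numpy as np

# Example joint orientation: a 3x3 rotation matrix whose columns are the joint's
# local +X, +Y, +Z axes in world coordinates (made-up values: the joint is
# rotated 90 degrees about the world Z axis).
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

local_x_in_world = R[:, 0]        # first column: direction of the joint's +X axis
print(local_x_in_world)           # -> [0, 1, 0]

# Transform a point from the joint's local frame to world coordinates,
# given the joint position in millimeters (also a made-up value).
joint_pos_mm = np.array([100.0, 250.0, 1800.0])
p_local = np.array([0.0, 0.0, 50.0])          # 5 cm along the joint's local Z axis
p_world = joint_pos_mm + R @ p_local
print(p_world)
```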

Page 65:

Another related sensor

Page 66:

Haptic devices

Page 67: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

• Touch: feeling caused when the skin receives a mechanical, heating, chemical or electrical stimulus

The sense of touching

67

– Thermoreceptors: they respond to the heating changes

– Mecanoreceptors: they respond to a mechanical action (force, vibration, sliding)

– Nocioreceptors: they transmit pain/ache

Page 68: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

• Sensorial accomodation: it quantifies the variation on time of the electrical discharge number generated by a certain receiver in

The sense of touching (II)

68

number generated by a certain receiver in response to a stimulus.– slow accomodation– fast accomodation (glasses, gloves)

Page 69: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

• Spatial resolution: The reception field is defined as the area where a stimulus can excite the receiver (beteween 1-2 mm2 and 45 mm2)

The sense of touching (III)

69

45 mm )– it depends of the body part (lower leg and leg

around 45 mm2, while hand fingers between 1 and 2 mm2)

– directly related with how big is the area of the brain that interpret the stimulus (higher spatial resolution, bigger area in the brain)

Page 70:

The sense of touch (IV)

Page 71:

The sense of touch (V)

Page 72: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

• Time resolution: minimum time needed to be able to distinguish between two consecutive successes– aproximately around 5ms, much shorter than the

one of the view, 25ms (“hand is much faster than

The sense of touching (VI)

72

one of the view, 25ms (“hand is much faster than eye”)

– But in order to recognize the order the stimuli have been produced it requires 20 ms aprox.

– Welch i Warren (1986): response to a touching stimulus = 0.11 s, but to recognize braille code response = 0.87-1.56 s

Page 73: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

• Propioception: sense in charge of inform of the position the body parts have

• Cinestesia: feeling that gives information of their movement

The sense of touching (VII)

73

their movement– the resolution is mesured in grades and indicates

the difference of angle which is non perceived as a change of position in a body join (the hip has the biggest precision, while the foot fingers the smallest)

Page 74: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

• Passive touching: feeling when an object touch the skin (only stimulates the cutaneous receivers, non owned movement).

• Active touching: feeling produced when we

The sense of touching (VIII)

74

• Active touching: feeling produced when we touch in an active form an object (push buttons, slide over surfaces, …)– active better to identify objects by touching– Haptic perception: capacity to identify solids by

touching them.

Page 75: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

Requirements to simulate touching

• The most used contact element is the hand• We need the force range a user can produce

and feel (values between 16N and 102N, different for man/woman)

75

different for man/woman)• We have to distinguish between the

maximum force for a short time or a continued force

Page 76: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

• Sensitive bandwidth: frequence to what the touching stimuli is noticed, propioceptives and cinestetics

• Active bandwidth: velocity with which one can respond to the stimuli

Requirements to simulate touching (II)

76

• Active bandwidth: velocity with which one can respond to the stimuli

– the sensitive is much bigger than the active– for the human finger, the active is between 1 and

16 Hz while the sensitive is between 30 Hz and 10 kHz

– having a frequency of 1000 Hz we can simulate almost all the touching efects

Page 77:

Haptic systems

• Haptic feedback is related to the sense of touch (contact stimuli) and to the proprioceptive and kinesthetic senses

• The most important characteristic is the bidirectionality

[Diagram: the user and the virtual environment exchange force and velocity in both directions (touch), while audio and video flow only towards the user; sensor information goes into the simulation and actuator information comes back out.]

Page 78: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

• Haptic system: allow the user to touch, feel and manipulate virtual objects by restitution tactile and cinestetical.

– it consists of: haptic interface + virtual environment

Haptic systems (II)

78

– it consists of: haptic interface + virtual environment

User Virtual environmentHaptic device

Haptic interface

Virtual coupling

Haptic system

REAL VIRTUAL

Page 79: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

• Haptic devices:

Haptic systems (III)

79

Page 80:

Haptic rendering

• The objective of haptic rendering is to allow the user to touch, feel and manipulate virtual objects by using a haptic interface.

Requirement: frequency = 1000 Hz

Page 81:

Haptic rendering (II)

• When the device position is detected to be colliding with the virtual object, a force is produced to move the device back to the object surface.
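A common way to produce that restoring force is a penalty (spring) model: the deeper the device tip penetrates the surface, the harder it is pushed back along the surface normal. A minimal one-plane sketch (the stiffness value, plane geometry and 1 kHz servo loop are illustrative assumptions):

```python
import numpy as np

STIFFNESS = 800.0          # N/m, example spring constant for the virtual wall
PLANE_Z = 0.0              # the virtual object: the half-space z < 0
NORMAL = np.array([0.0, 0.0, 1.0])

def penalty_force(device_pos):
    """Force to send to the haptic device on one ~1 kHz servo-loop iteration."""
    penetration = PLANE_Z - device_pos[2]          # how far the tip is inside the object
    if penetration <= 0.0:
        return np.zeros(3)                         # no contact, no force
    return STIFFNESS * penetration * NORMAL        # push back towards the surface

print(penalty_force(np.array([0.0, 0.0, -0.002])))  # 2 mm inside -> 1.6 N outwards
```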

Page 82:

Haptic rendering (III)

• Existing interaction methods:

Page 83:

Haptic rendering (IV)

• Interaction with a single point:
– Polygonal models, flat and smooth faces
– Polygonal models with surface effects (friction, textures, ...)

Page 84:

Haptic rendering (VII)

• Interaction with an object:
– The collision detection is more complicated
– It requires position and orientation
– Resultant forces over different points
– Acceleration techniques are needed for the computation (bounding boxes, voxels, ...)
– Deformable objects?

Page 85: VIRTUAL REALITY Input devices. Technologies for …virtual/RVA/CourseSlides/03. VR Input Hardware.pdfVIRTUAL REALITY Input devices. Technologies for the Curs 2012/2013 direct interaction

Applications• Medicine:

– Surgery simulation– Tele-medicine– Training– Patient rehabilitation pacients for neurologic

problems• Mechanical design:

85

• Mechanical design:– Pieces assembling

• Entertainment:– Paint in 3D– Characters animation

• Scientífic:– Geophysical data analysi– Molecular manipulation

Page 86:

VIRTUAL REALITY

Input devices. Technologies for direct interaction

Course 2012/2013