MA FINAL: 2D MAGIC LENS IMPLEMENTATION USING A HANDHELD DEVICE IN A 3D VIRTUAL ENVIRONMENT
Student: Alba Huelves
Director: Prof. Gudrun Klinker (Ph.D.)
Supervisor: Amal Benzina and Marcus Tönnis
CONTENTS
Introduction
Related Works
System Architecture
Exploring the VE with the Handheld Device: approaches, motion control, viewpoint
Lens Frustum Computation
Rendering the Magic Lens
GUI
Averaging and Thresholds
Conclusions
Video Demo
INTRODUCTION
Human interaction techniques with 3D VEs are important for the selection and manipulation of 3D graphical information.
Purpose: explore the VE with different interaction techniques based on the Magic Lens metaphor, and obtain an alternative focus view of the scene. The intersection with the terrain surface will enable below-surface exploration in the future.
RELATED WORKS
‘Toolglass and Magic Lenses: The See-Through Interface’, Bier et al., 1993
‘3D Magic Lenses’, Viega et al., 1996
‘Magic Lenses for Augmented Virtual Environments’, Leonard D. Brown, 2006
‘The Through-The-Lens Metaphor: Taxonomy and Application’, S. Stoev et al., 2002
SYSTEM ARCHITECTURE
System elements:
• Android client running on the tablet
• Glasses target
• FRAVE
• Fraveui0
• ART system
Communication procedure:
1. The user starts the Android application and data is sent to the servers over a wireless UDP connection.
2. The user navigates and travels through the terrain by touching the screen, which sends a message to the servers to enable tracking.
3. The user selects Magic Lens mode and another message is sent to the servers so that the lens is rendered. The FRAVE server sends the Magic Lens data to Fraveui0.
4. The user explores the VE by touching the screen and moving the tablet. A message is sent to the servers to enable tracking, and the FRAVE updates Fraveui0.
5. The user takes a snapshot of the selected region. A message is sent to Fraveui0 to obtain the image.
6. Fraveui0 sends the image to the client over TCP.
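The control messages in steps 2–4 can be sketched as simple UDP datagrams. This is a minimal illustration only: the command names, the newline-terminated text format, and the loopback receiver standing in for the FRAVE server are all assumptions, since the slides do not specify the actual wire format.

```python
import socket

# Hypothetical one-line text protocol; the slides only state that
# control messages travel over a wireless UDP connection.
def encode_command(command: str) -> bytes:
    """Encode a control command as a newline-terminated UTF-8 datagram."""
    return (command + "\n").encode("utf-8")

def send_command(sock: socket.socket, addr, command: str) -> None:
    """Send one control message to the given server address."""
    sock.sendto(encode_command(command), addr)

if __name__ == "__main__":
    # Loopback demo: a throwaway receiver stands in for the FRAVE server.
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(("127.0.0.1", 0))
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_command(client, server.getsockname(), "ENABLE_TRACKING")  # step 2
    send_command(client, server.getsockname(), "MAGIC_LENS_ON")    # step 3
    data, _ = server.recvfrom(1024)
    print(data)
```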
EXPLORING THE VE WITH THE HANDHELD DEVICE
The tracked Android tablet controls the Magic Lens virtual avatar.
Motion control options: rate control, direct avatar (position control)
Viewpoint options: fixed viewpoint, tracked viewpoint
The translation and orientation of the tablet are mapped to the VE depending on the motion control option used.
8
RATE CONTROL
Maps the tablet’s translation and rotation to the Magic Lens’ translation and rotation:
• The initial pose of the tablet is captured when the user touches the screen.
• The delta translation and delta orientation relative to the initial pose are mapped to the Magic Lens.
• The delta translation is scaled by a sensitivity factor.
• When the delta translation or the delta orientation exceeds a threshold, a rate factor is increased and added to the current translation or orientation.
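One axis of this rate-control update can be sketched as follows. The sensitivity, threshold, and rate-gain values are illustrative placeholders, not the values used in the thesis.

```python
# Rate-control sketch for a single translation axis: small deltas move the
# lens directly (scaled by sensitivity); once the delta exceeds the
# threshold, an extra rate term is added each frame. All constants are
# illustrative assumptions.
SENSITIVITY = 2.0   # scale from tablet metres to VE units
THRESHOLD = 0.1     # metres of tablet motion before rate mode kicks in
RATE_GAIN = 5.0     # extra VE units per second beyond the threshold

def update_translation(accum: float, delta: float, dt: float) -> float:
    """Advance one axis of the lens translation for a single frame."""
    accum += SENSITIVITY * delta
    if abs(delta) > THRESHOLD:
        # Beyond the threshold, keep adding a rate factor per frame.
        accum += RATE_GAIN * dt * (1.0 if delta > 0 else -1.0)
    return accum
```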
TRANSLATION
• The Magic Lens centre position starts at a fixed position 300 m in front of the camera:
  centreStartPos = camPosition + 300 * directionCam
• Translation X → right direction; translation Y → up direction; translation Z → -view direction
• The centre position of the Magic Lens is then updated:
  centrePos = centreStartPos + translationX * rightCam + translationY * upCam - translationZ * directionCam
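The centre-position update above, written out with plain 3-vectors (tuples stand in for a real vec3 type; the vector names follow the slide):

```python
# Helper vector operations over 3-tuples.
def vadd(a, b):
    return tuple(x + y for x, y in zip(a, b))

def vscale(s, a):
    return tuple(s * x for x in a)

def lens_centre(cam_pos, direction_cam, right_cam, up_cam, t):
    """Magic Lens centre from the camera pose and the tablet's
    delta translation t = (tx, ty, tz)."""
    centre_start = vadd(cam_pos, vscale(300.0, direction_cam))
    offset = vadd(vadd(vscale(t[0], right_cam), vscale(t[1], up_cam)),
                  vscale(-t[2], direction_cam))  # Z moves against the view
    return vadd(centre_start, offset)
```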
ORIENTATION
The Magic Lens has an initial pitch of 90 degrees, and zero roll and heading.
• On the tablet: pitch is rotation around X, roll around Y, heading around Z (not considered).
• In the VE: pitch is rotation around the right vector, roll around the view direction vector, heading around the up vector.
[Diagram: mapping of the tablet's pitch, roll and heading to the Magic Lens rotations]
The delta rotation from the initial pose is obtained each frame and multiplied by the elapsed time since the last frame; the result is added to the accumulated rotation. If the pitch or the heading of the Magic Lens exceeds a certain threshold, the Magic Lens rotates faster.
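The per-frame accumulation for one rotation axis can be sketched like this. The threshold and speed-up factor are illustrative assumptions.

```python
# Per-frame rotation accumulation, as described above: the delta rotation
# is scaled by the elapsed time, and rotation speeds up once the
# accumulated angle passes a threshold. Constants are illustrative.
ANGLE_THRESHOLD = 15.0  # degrees
FAST_FACTOR = 2.0       # speed-up beyond the threshold

def accumulate_rotation(accum_deg: float, delta_deg: float, dt: float) -> float:
    """Add this frame's delta rotation (scaled by elapsed time dt)."""
    step = delta_deg * dt
    if abs(accum_deg) > ANGLE_THRESHOLD:
        step *= FAST_FACTOR  # rotate faster beyond the threshold
    return accum_deg + step
```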
DIRECT AVATAR – POSITION CONTROL
Approximates a direct avatar by tracking the left and right walls of the FRAVE.
The initial position of the Magic Lens depends on the proximity to each of the walls of the FRAVE.
The delta translation is scaled by a sensitivity factor, obtained by testing, to give the impression that the lens follows the tablet.
The delta rotation is obtained relative to an orientation with zero pitch, heading and roll.
The virtual camera’s pitch and heading are added to those of the Magic Lens.
FIXED VIEWPOINT
The Magic Lens viewpoint is fixed 40 m from its centre position in the negative direction of the view vector:
  viewpoint = centrePos - 40 * viewVector
This yields a symmetric lens frustum in which the Magic Lens is the near plane.
TRACKED VIEWPOINT
The Magic Lens viewpoint is set to the user’s eye, tracked by the glasses.
The relative position of the viewpoint to the centre of the tablet is mapped to the VE: the virtual viewpoint has the same relative position to the centre of the Magic Lens.
LENS FRUSTUM COMPUTATION
Relative position of the virtual viewpoint (E_virtual) to the lower left corner (L_left) of the Magic Lens:
  R_vp = E_virtual - L_left
  dNear = R_vp · Zaxis
  dLeft = -R_vp · Xaxis
  dRight = width - dLeft
  dBottom = R_vp · Yaxis
  dTop = height - dBottom
The projection matrix is set with the distances to the near, far, left, right, top and bottom planes.
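The frustum distances above can be computed directly from the dot products, with the lens' local unit axes passed in (the axis orientation convention is taken from the slide as-is):

```python
# Off-axis frustum distances from the virtual eye and the lens rectangle,
# following the formulas above term by term.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def lens_frustum(e_virtual, l_left, x_axis, y_axis, z_axis, width, height):
    """Return (d_near, d_left, d_right, d_bottom, d_top) for the lens."""
    r_vp = sub(e_virtual, l_left)   # eye relative to the lower-left corner
    d_near = dot(r_vp, z_axis)
    d_left = -dot(r_vp, x_axis)
    d_right = width - d_left
    d_bottom = dot(r_vp, y_axis)
    d_top = height - d_bottom
    return d_near, d_left, d_right, d_bottom, d_top
```

These five distances, together with a far-plane distance, are what an off-axis projection call (e.g. a glFrustum-style matrix) takes as parameters.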
RENDERING THE MAGIC LENS
From the rectangle’s centre position, the upper left, upper right, lower left and lower right corners are computed. The viewpoint is represented by a small sphere. The surface of the pyramid formed by the viewpoint, the lens corners and the terrain intersection is shaded.
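Computing the four corners from the centre can be sketched with the lens' right and up unit vectors. This helper is an illustrative assumption, not code from the thesis:

```python
# Corner positions of the lens rectangle from its centre position and its
# right/up unit vectors (illustrative helper; tuples stand in for vec3).
def lens_corners(centre, right, up, width, height):
    hw, hh = width / 2.0, height / 2.0
    def pt(sx, sy):
        # centre + sx * half-width * right + sy * half-height * up
        return tuple(c + sx * hw * r + sy * hh * u
                     for c, r, u in zip(centre, right, up))
    return {"upper_left":  pt(-1,  1), "upper_right": pt(1,  1),
            "lower_left":  pt(-1, -1), "lower_right": pt(1, -1)}
```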
GRAPHICAL USER INTERFACE
Android app running on the tablet
Navigation View Magic Lens View
The user touches the screen to navigate, or to explore the VE with the Magic Lens; this enables tracking.
Snapshot requests the server on Fraveui0 to send the image of the lens frustum.
SENDING THE SNAPSHOT
Upon the request of the Android client, the image displayed on Fraveui0 is captured, compressed to JPEG and sent over a TCP connection.
The TCP receiver in the client reads the stream of bytes, decodes them and displays the image on the tablet.
AVERAGING AND THRESHOLDS
Filters unintended hand or head movements for translation and rotation.
If the difference between the previous value and the current one does not exceed a threshold, the current value is added to a running buffer.
When the difference exceeds the threshold, the average of the previously buffered values is assigned to the current value, smoothing the motion.
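One way to sketch this filter for a single value (translation or rotation component); the threshold value is an illustrative assumption:

```python
# Smoothing sketch: small frame-to-frame differences are buffered; once a
# difference exceeds the threshold, the buffered average replaces the
# current value, damping sudden jumps. Threshold is illustrative.
class MotionSmoother:
    def __init__(self, threshold: float = 0.05):
        self.threshold = threshold
        self.samples = []
        self.prev = 0.0

    def filter(self, value: float) -> float:
        if abs(value - self.prev) <= self.threshold:
            self.samples.append(value)  # small jitter: accumulate
        elif self.samples:
            # Large jump: replace it with the average of buffered values.
            value = sum(self.samples) / len(self.samples)
            self.samples = []
        self.prev = value
        return value
```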
CONCLUSIONS
Three metaphors:
• Rate control with fixed viewpoint: avoids hand fatigue and the user having to move; a larger 3D space can be explored.
• Position control (direct avatar) with fixed viewpoint: the user needs to move to explore the VE; a more immersive experience, since the avatar behaves as in the real world.
• Position control (direct avatar) with tracked viewpoint: tracking the viewpoint allows the user to explore the terrain with head movements as well, for more immersion; not very intuitive, because the focus view is on Fraveui0 and not on the tablet.
Future work: user evaluation; displaying the focus view on the tablet in real time; below-surface exploration.
VIDEO DEMO