Natural Head Motion for 3D Social Games

Andrei Sherstyuk∗

Sunflower Research Labs

Anton Treskunov†

Samsung

ABSTRACT

Natural head motion is the key component in Virtual Reality systems where users are immersed via personal eyewear. The advent of webcam tracking technologies made head motion available on every desktop, for any 3D game, calling for revised methods of handling user head motion in new, non-immersive settings.

We present several novel techniques of using head motion in 3D social games. We show that head tracking is not only a useful but also an enabling technology, one that creates new means of interaction between players, makes gameplay more personal, and makes game controls more flexible.

1 INTRODUCTION

Until recently, processing head movements required special tracking equipment, per-user calibration, or limited operating conditions. Yet the use of natural head motion in games was deemed a certainty. In 2008, J. J. LaViola Jr. wrote:

We’re in the early stages of a revolution in how video games are played. Three-dimensional spatial interaction and VR concepts such as 3D stereo rendering and head tracking will play a crucial role in generations of future video games (from [1]).

Recently introduced webcam-based tracking technologies removed the restrictions mentioned above. Head tracking for games has become practical.

Due to anatomical constraints, human head motion is normally associated with visual tasks. Consequently, most research work on integrating head motion into games has focused on first-person shooters, where vision-related tasks are of utmost importance [5, 6, 7]. Lately, other uses of head motion have been recognized, such as pointing tasks and nonverbal information exchange in training scenarios [2, 3].

1.1 Goals and Methods

In this work, we explore how natural head motion can be used in social 3D worlds, where personalization, interaction, and expressiveness are the rules of the game. We show that many existing interfaces and methods of controlling user avatars can be easily augmented with user motion, producing new visual effects and enabling new behaviors. We also discuss several approaches to camera control for non-immersive game settings.

For this project, we created a prototype using the Blue Mars platform from Avatar Reality Inc., the creators of the Blue Mars Online social world. The Blue Mars SDK features photorealistic avatars with MOCAP animation, powered by CryENGINE2 from Crytek. For motion tracking, we used the faceAPI system from Seeing Machines, which tracks head motion at 30 Hz with 6 degrees of freedom [4]. FaceAPI was compiled directly into the Blue Mars Sandbox Editor tool, which allowed us to load and explore scenes in game mode, with various options for how head tracking should be used in the game.
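For illustration, the per-frame glue between the tracker and the game might look like the minimal sketch below. The HeadPose struct and pollHeadPose stub are our own hypothetical wrapper, not the actual faceAPI interface.

```cpp
// Hypothetical 6-DOF head pose, delivered once per tracker tick
// (about 30 Hz for faceAPI). Field names are ours, not the real API's.
struct HeadPose {
    float x = 0, y = 0, z = 0;          // translation in webcam space, meters
    float yaw = 0, pitch = 0, roll = 0; // rotation, radians
    bool  valid = false;                // false when the face is lost
};

// Stub standing in for the tracker SDK call; a real build would read
// the latest sample from the vendor engine here.
HeadPose pollHeadPose() { return HeadPose{}; }

// Per-frame update: poll the tracker and fan the pose out to whichever
// consumers (avatar neck, camera modes) are currently enabled.
void updateHeadTracking() {
    static HeadPose lastGood;           // keep the last pose across dropouts
    HeadPose pose = pollHeadPose();
    if (pose.valid)
        lastGood = pose;
    // ...route lastGood to the avatar and camera handlers here...
}
```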

∗e-mail: [email protected]
†e-mail: [email protected]

2 APPLICATIONS OF HEAD MOTION IN SOCIAL GAMES

Next, we present our findings, illustrated by practical examples.

2.1 Avatar Pose Control

This is the most straightforward example of motion data transfer. The user's head rotation is applied to the avatar's neck joint, making the avatar reproduce the user's head movements. Head rotation is added in a layered fashion, blending user motion with the current avatar pose. Although the technique is simple, it makes it possible to create expressive poses, as demonstrated in Figure 1.

Figure 1: Avatar Pose Control. Free-form user rotation is directly applied to the avatar's neck, yielding new expressive poses.
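A minimal sketch of this layered blend, assuming a simple quaternion type; applyNeckOverlay and the blend weight are illustrative names, not Blue Mars SDK calls.

```cpp
#include <cmath>

// Minimal quaternion with just the pieces the blend needs.
struct Quat {
    float w = 1, x = 0, y = 0, z = 0;

    Quat operator*(const Quat& q) const {   // rotation composition
        return { w*q.w - x*q.x - y*q.y - z*q.z,
                 w*q.x + x*q.w + y*q.z - z*q.y,
                 w*q.y - x*q.z + y*q.w + z*q.x,
                 w*q.z + x*q.y - y*q.x + z*q.w };
    }
    static Quat axisAngle(float ax, float ay, float az, float a) {
        float s = std::sin(a * 0.5f);
        return { std::cos(a * 0.5f), ax * s, ay * s, az * s };
    }
    static Quat fromEuler(float yaw, float pitch, float roll) {
        return axisAngle(0, 1, 0, yaw)      // yaw about the vertical axis,
             * axisAngle(1, 0, 0, pitch)    // then pitch,
             * axisAngle(0, 0, 1, roll);    // then roll
    }
};

// Layered pose control: the animation system supplies the neck rotation
// of the current pose; the user's tracked rotation, scaled by a blend
// weight, is composed on top of it in the neck's local frame.
Quat applyNeckOverlay(const Quat& animatedNeck,
                      float userYaw, float userPitch, float userRoll,
                      float weight)   // 0 = animation only, 1 = full overlay
{
    Quat overlay = Quat::fromEuler(userYaw * weight,
                                   userPitch * weight,
                                   userRoll * weight);
    return animatedNeck * overlay;
}
```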

2.2 Avatar Facial Expressions in Close Encounters

In Blue Mars, avatars have a built-in social behavior that makes them temporarily direct their heads and eyes towards other avatars when those avatars enter their viewing range. This feature is called "look-around", and it helps convey to other players that their presence has been noted.

We have augmented this automatic behavior with user head rotation. While the avatar's eyes remain locked on the object of interest (i.e., the other avatar's face), small head rotations produce new facial expressions, such as teasing, disbelief, or turning one's nose up at someone, as illustrated in Figure 2.

Figure 2: Changing avatar facial expressions. Two avatars are facing each other. Left pair: normal idle behavior when neither avatar shows signs of noticing its neighbor. Middle pair: "look-around" feature is in effect, making the avatars look at each other. Right pair: user head rotation is added, changing the avatar's facial expressions.
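One way this combination could be structured is sketched below; the AvatarFace hooks are hypothetical stand-ins for the engine's gaze and animation controllers, and the clamp limit is an assumed tuning value.

```cpp
struct Vec3 { float x, y, z; };

// Stubs standing in for engine calls (not the actual Blue Mars SDK).
struct AvatarFace {
    void aimEyesAt(const Vec3&) { /* gaze solver keeps eyes on target */ }
    void addHeadRotation(float, float, float) { /* head joint overlay */ }
};

// Per-frame update for the augmented "look-around": the gaze controller
// keeps the eyes locked on the other avatar's face, while the user's
// tracked head rotation is layered onto the head joint only, so small
// motions read as expressions without breaking eye contact.
void updateCloseEncounter(AvatarFace& face, const Vec3& otherFacePos,
                          float userYaw, float userPitch, float userRoll)
{
    face.aimEyesAt(otherFacePos);   // 1. eyes stay on the object of interest

    // 2. Clamp the overlay so the expression stays subtle and the gaze
    //    solver remains within its limits (~20 degrees; assumed value).
    const float limit = 0.35f;      // radians
    auto clamp = [limit](float a) {
        return a < -limit ? -limit : a > limit ? limit : a;
    };
    face.addHeadRotation(clamp(userYaw), clamp(userPitch), clamp(userRoll));
}
```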



2.3 Avatar Awareness of Player Presence

In third-person camera viewing mode, the "look-around" behavior can also be directed towards the players themselves. In this mode, the player's location is represented by the virtual camera position in world space. By orienting its head and eyes towards the camera, the avatar appears to look straight into the "eye" of the player, as illustrated in Figure 3. This feature can be refined by adjusting the virtual camera position by the physical displacement of the player's head in webcam space. As a result, the avatar will trace the player's head movements while the "look-around" feature is in effect. This behavior strengthens the player's impression that their avatar is aware of their presence.

Figure 3: Avatar awareness of player presence. Left: normal idle pose. Right: avatar's gaze locked on player. Head tracking will make the avatar continuously maintain eye contact with the player.
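A sketch of the refinement, assuming the gaze target can be set directly: the target starts at the virtual camera position and is shifted in the camera plane by the player's tracked head displacement. The gain is an assumed tuning parameter.

```cpp
struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& v) const { return {x + v.x, y + v.y, z + v.z}; }
    Vec3 operator*(float s)       const { return {x * s, y * s, z * s}; }
};

// The gaze target starts at the virtual camera position (the player's
// "eye") and is offset by the player's physical head displacement,
// mapped from webcam space into the camera's world-space frame, so the
// avatar traces the player's actual head movements.
Vec3 playerGazeTarget(const Vec3& cameraPos,
                      const Vec3& camRight,        // camera's world right axis
                      const Vec3& camUp,           // camera's world up axis
                      float headDx, float headDy,  // tracked displacement, meters
                      float gain = 1.0f)           // webcam-to-world scale (assumed)
{
    return cameraPos + camRight * (headDx * gain) + camUp * (headDy * gain);
}
```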

2.4 Personalizing Avatar Behavior

All user-produced motions can be recorded and later reused, either on explicit command, such as a key press, or embedded into autonomous behaviors, such as the look-around feature described above. In the latter case, one can record a personalized head gesture, such as a friendly nod, that will be played when a recognized "friend" player appears nearby. Conversely, recently un-friended players may be greeted by a head motion indicating displeasure, for instance, turning the head away. Such personalized autonomous behaviors support the illusion of presence, even when the player is temporarily away from the keyboard.
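A gesture store of this kind might be organized as in the following sketch; the class, its method names, and the event bindings are purely illustrative.

```cpp
#include <map>
#include <string>
#include <vector>

// One tracked sample; a recorded gesture is a short timed sequence.
struct HeadSample { float yaw, pitch, roll, t; };  // t in seconds

// Records head gestures on command and binds them to social events,
// e.g. "friend_appears" -> a recorded nod, "unfriend_appears" -> a
// recorded turn of the head away.
class GestureLibrary {
public:
    void startRecording()                { clip_.clear(); armed_ = true; }
    void feed(const HeadSample& s)       { if (armed_) clip_.push_back(s); }
    void saveAs(const std::string& name) { armed_ = false; gestures_[name] = clip_; }

    void bind(const std::string& event, const std::string& gesture) {
        bindings_[event] = gesture;
    }

    // Returns the clip bound to an event, or nullptr if none exists.
    const std::vector<HeadSample>* clipFor(const std::string& event) const {
        auto b = bindings_.find(event);
        if (b == bindings_.end()) return nullptr;
        auto g = gestures_.find(b->second);
        return g == gestures_.end() ? nullptr : &g->second;
    }

private:
    bool armed_ = false;
    std::vector<HeadSample> clip_;                         // current recording
    std::map<std::string, std::vector<HeadSample>> gestures_;
    std::map<std::string, std::string> bindings_;          // event -> gesture
};
```

On playback, the stored samples would be fed through the same neck-overlay path as live tracker data, so recorded and live motion look identical to other players.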

2.5 Head Motion and Camera Control

In immersive VR systems that utilize head-mounted displays (HMDs), head motion is almost always transferred directly to camera position and orientation, using a one-to-one mapping. Exceptions are made only for systems that aim to compensate for the limited field of view of an HMD by amplifying horizontal or vertical head rotation. In contrast, non-immersive games are more flexible about camera controls, providing a variety of viewing options, such as free-camera mode, third-person view, or aerial view. Thus, user head motion can be treated as loosely coupled with various viewing tasks. The source data (head pose) and destination object (camera) can be linked as deemed convenient for each particular task.

In casual settings, when players are comfortably seated in their chairs, head rotation appears to be more useful for camera control than head translation, mainly because it takes less physical effort. We used head rotation to trigger two view-changing commands: horizontal camera sliding and bird's-eye view.

The first method makes it possible to peek around occluding objects located at close range, similar to the peering method [5]. In our case, camera sliding is invoked when players turn their heads left or right by more than 30 degrees. The camera moves along the X-axis, proportionally to the amount of rotation. The test scene is shown in Figure 4.
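The trigger logic reduces to a dead zone plus a proportional mapping, as in the sketch below; the 30-degree threshold matches the text, while the gain is an assumed tuning value.

```cpp
#include <cmath>

// Camera-sliding trigger: yaw beyond a 30-degree dead zone slides the
// camera along its local X-axis, proportionally to the extra rotation,
// letting the player peek around close-range occluders.
float cameraSlideOffset(float headYawDeg,
                        float deadZoneDeg = 30.0f,  // no sliding below this
                        float metersPerDeg = 0.02f) // proportional gain (assumed)
{
    float mag = std::fabs(headYawDeg);
    if (mag <= deadZoneDeg)
        return 0.0f;                        // inside dead zone: camera stays put
    float excess = mag - deadZoneDeg;       // rotation beyond the threshold
    float sign = headYawDeg > 0 ? 1.0f : -1.0f;
    return sign * excess * metersPerDeg;    // lateral offset in meters
}
```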

The bird's-eye view is triggered by tilting the user's head back. In this pose, the screen appears at the bottom of the user's field of view. The camera moves up and points down, giving an aerial view of the scene. To avoid view oscillations, the amount of vertical translation is quantized to three predefined values, and a minimal time between changes is enforced.
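A sketch of the quantization with an enforced dwell time follows; the pitch thresholds, the three camera heights, and the dwell interval are assumed tuning values.

```cpp
// Bird's-eye elevation control: head pitch selects one of three
// predefined camera heights, and a minimum dwell time between level
// changes suppresses oscillation near the thresholds.
class BirdEyeView {
public:
    // Call once per frame with the tracked pitch (radians, back-tilt
    // positive) and the current game time in seconds. Returns the
    // camera's vertical offset.
    float update(float headPitch, double now) {
        int target = headPitch > 0.60f ? 2   // strong tilt: highest view
                   : headPitch > 0.35f ? 1   // moderate tilt: mid view
                   : 0;                      // near level: normal camera
        if (target != level_ && now - lastChange_ >= minDwell_) {
            level_ = target;                 // commit the new level
            lastChange_ = now;               // restart the dwell timer
        }
        static const float heights[3] = { 0.0f, 6.0f, 14.0f };  // meters up
        return heights[level_];
    }
private:
    int    level_ = 0;
    double lastChange_ = -1e9;   // allow an immediate first change
    double minDwell_ = 0.75;     // seconds between level changes (assumed)
};
```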

Figure 4: Camera sliding technique. Top: winter terrace scene, with two players. Middle: first-person view of the female player, with the male avatar occluded by a pole. Bottom: camera is shifted, prompted by user head rotation, removing occlusion.

3 DISCUSSION AND CONCLUSIONS

We presented a number of novel interfaces for controlling user avatar appearance, behavior, and in-game camera settings. These techniques demonstrate that head tracking is a powerful extension to traditional game controls, especially in the context of 3D social worlds, where head motion can be particularly effective.

Personalizing avatar behavior with natural motion adds another modality to interaction between players. Nodding, head shaking, and more subtle uses of body and eye language, such as averting one's gaze or seeking eye contact, become available with head tracking. Direct control of avatar head movements creates a special bond between the player and the avatar, making the game more enjoyable.

We believe that social 3D games, running on single-screen desktops, are potentially richer soil for applying natural head motion than fully immersive VR systems, due to the loose coupling between viewing tasks and head movements, and the variety of options for personalizing avatar behavior.

REFERENCES

[1] J. J. LaViola Jr. Bringing VR and spatial 3D interaction to the masses through video games. IEEE Comput. Graph. Appl., 28(5):10–15, 2008.

[2] S. Marks, J. A. Windsor, and B. Wünsche. Evaluation of the effectiveness of head tracking for view and avatar control in virtual environments. In Proceedings of IVCNZ’10, 25th Conference, 2010.

[3] S. Marks, J. A. Windsor, and B. Wünsche. Head tracking based avatar control for virtual environment teamwork training. In Proceedings of GRAPP 2011, 2011.

[4] Seeing Machines. faceAPI, version 3.2.6, September 2010. http://www.seeingmachines.com/product/faceapi/.

[5] T. Sko and H. J. Gardner. Head tracking in first-person games: Interaction using a web-camera. In International Conference on Human-Computer Interaction, pages 342–355, Berlin, Heidelberg, 2009.

[6] S. Wang, X. Xiong, Y. Xu, C. Wang, W. Zhang, X. Dai, and D. Zhang. Face-tracking as an augmented input in video games: enhancing presence, role-playing and control. In SIGCHI Conference on Human Factors in Computing Systems, pages 1097–1106, 2006.

[7] J. Yim, E. Qiu, and T. C. N. Graham. Experience in the design and development of a game based on head-tracking input. In Future Play Conference: Research, Play, Share, pages 236–239, 2008.
