  • FPSpaceInvaders: a VR game for

    the Oculus Rift

    Tiago Augusto Engel

    Department of Computer Science

    Swansea University

    This Project Report is submitted in partial fulfillment for the

    Science Without Borders Programme

    September 2014

  • Declaration

    This work has not previously been accepted in substance for any degree and is not currently being submitted for any degree.

    September 4, 2014

    Signed:

    Statement 1

    This dissertation is being submitted in partial fulfillment of the requirements for

    the Science Without Borders Programme.

    September 4, 2014

    Signed:

    Statement 2

    This dissertation is the result of my own independent work/investigation, except

    where otherwise stated. Other sources are specifically acknowledged by clear

    cross referencing to author, work, and pages using the bibliography/references. I

    understand that failure to do this amounts to plagiarism and will be considered

    grounds for failure of this dissertation and the degree examination as a whole.

    September 4, 2014

    Signed:

    Statement 3

    I hereby give consent for my dissertation to be available for photocopying and for

    inter-library loan, and for the title and summary to be made available to outside

    organisations.

    September 4, 2014

    Signed:

  • Abstract

    Virtual Reality (VR) is becoming more accessible every day. New technologies such as head-mounted displays (HMDs) are finally available to the public at an affordable price, increasing the power of engagement and immersion in the virtual world. These devices aim for an immersive experience, especially in games, and their applications vary widely. Such devices use stereoscopic 3D images; rendering them is nowadays a simple task, but the accuracy of the final result is key. In a synthetic environment, the dangers of poor stereoscopic renderings range from eyestrain and headaches, to users feeling nauseous in the virtual world, to rapid loss of interest. In this project we look at the issue of rendering accurate stereoscopic images, both on their own and within the virtual reality device called the Oculus Rift. Building on the study of this new technology, we developed and evaluated a Space Invaders-based game. The preliminary results show that the HMD provides a more engaging and immersive experience, enhancing the sense of spatial presence, focus and enjoyment.

  • Contents

    1 Introduction
    2 Background Information
      2.1 Virtual Reality
      2.2 Stereoscopy
        2.2.1 Rendering Stereo Images
        2.2.2 Health Concerns
      2.3 Applications
    3 The Oculus Rift
      3.1 Technical Aspects
    4 Game Development
      4.1 Development Environment
      4.2 FPSpaceInvaders
        4.2.0.1 Design Level
        4.2.0.2 The Player
        4.2.0.3 The Enemies
        4.2.0.4 Oculus Rift Integration
        4.2.0.5 3D Audio Implementation
      4.3 Empirical Evaluation
    5 Results and Discussion
    6 Summary
      6.1 Conclusion
      6.2 Future Works
    A Questionnaire
    References

  • Chapter 1

    Introduction

    Since the release of the movie Avatar in 2009, the world has begun to realize the potential of 3D technology and its applications. One of the latest ground-breaking inventions is the Oculus Rift [1]. It represents a new era in the gaming experience: the first head-mounted display (HMD) affordable for the general public, bringing virtual reality to a whole new level.

    HMDs rely on stereoscopic 3D images to reinforce the sense of depth. Stereoscopy increases the experience of immersion and spatial presence [2]. However, such images aren't natural to our eyes, because the stimuli they produce differ from those of the real world [3]: the image provided to each eye is produced on a flat surface. The most common associated symptoms are eye strain, disorientation and nausea.

    The challenge nowadays is to create realistic images in a way that minimizes the problems they may cause users. Game designers therefore have to take several factors into account when designing games for such devices, bearing in mind that an immersive experience depends on how well the application is adapted to the new environment. For instance, the relative sizes of objects must be realistic, otherwise the perception of depth may be inconsistent. Furthermore, the head-tracking device must faithfully reflect the signals given by the player: if the player moves their head at a certain speed, the game must respond at the same speed, otherwise the brain may lose its sense of movement and acceleration and the player may start to feel dizzy.

    We present FPSpaceInvaders, a VR Space Invaders-based game in which the player relies on the Oculus Rift to track head movements when aiming at targets. The view is from a first-person perspective; additionally, a 3D sound system helps enhance the sense of virtual reality, giving a more realistic experience during the game.

    The report is divided into four main chapters. Chapter 2 gives background information about stereo images and how to render them, as well as applications of HMDs such as the Oculus Rift, which is detailed in Chapter 3. In Chapter 4 we describe the game we developed and the process used to build it. Finally, Chapter 5 evaluates the results obtained and discusses possible improvements.


  • Chapter 2

    Background Information

    This chapter defines the concepts used throughout this report. Section 2.1 introduces virtual reality and the role stereoscopy plays in it, as well as within the Rift environment. Section 2.2 gives a basic explanation of stereoscopy and its drawbacks, while Section 2.3 presents cases where the Rift has been used as a VR display.

    2.1 Virtual Reality

    The term Virtual Reality has drawn considerable media attention in the last few years due to the incredible advances we are witnessing. The release of novel technologies has brought VR to a level never seen before. VR is by no means an original concept, as various similar concepts have been commercially available since the early 1970s [4]. It refers to a synthetic environment created to enhance the user's interaction and immersion with the application, relying on three-dimensional, stereoscopic, head-tracked displays, hand/body tracking, and binaural sound [5]. VR will be introduced into daily life and serve people in various ways. Its applications, going beyond mere entertainment, could make strong contributions in fields such as education, military training simulation and remote robot operation. The VR market is expected to grow rapidly in the next few years, as major enterprises are interested in investing in these new technologies.


  • The latest deal worth mentioning is Facebook's purchase of the start-up Oculus VR for 2 billion dollars. Sony is also working on an HMD called Morpheus.

    Stereoscopic displays play an important role in VR environments. They use stereoscopic images to improve the user's perception of immersion in the virtual world, enabling a better understanding of the presented data and of the proportions and positions of objects. Devices that use stereoscopic displays include head-mounted displays, which have become particularly popular since the release of the Oculus Rift last year. The next section gives a historical and technical background on the process used to generate images for these displays.

    2.2 Stereoscopy

    People have long realized that viewing with one eye (left or right) is slightly different from viewing with both at the same time. However, this phenomenon was not documented until 1838, when Charles Wheatstone first explained binocular vision and invented the stereoscope (Figure 2.1). He demonstrated that our perception of depth arises from the way our brain combines the images seen by each of our eyes.

    Later, at the end of the 19th and the beginning of the 20th century, stereoscopy began to attract interest from the cinematographic industry. William Friese-Greene registered the first patent, in which stereoscopic 3D films were shown on two separate screens that viewers watched through a stereoscope. In 1922 the first commercial 3D movie was released, together with anaglyph glasses. In the decades that followed, high costs and the effects of the Great Depression on film studios prevented large investments, and the studios faced several ups and downs. Polarized lenses were introduced in 1934, but they only became popular in 1986 when IMAX released the movie Transitions; this new technology offered several advantages over anaglyph glasses. It was in 2009, when James Cameron's Avatar was released, that the public was truly


  • Figure 2.1: Wheatstone's stereoscope

    amazed by the experience. The movie was one of the most expensive, perhaps the most expensive, ever made, because the technology was entirely new. Commercially, however, the investment paid off: the movie remains the highest-grossing film to date, proving that viewers are willing to pay when the experience is worth the price.

    Stereoscopy is a technique for creating or enhancing the illusion of depth in images. Most methods use two images rendered with a slight horizontal offset, mimicking the different perspectives through which our eyes see the world (see Figure 2.2). Our brain combines these images, giving the feeling of spatial depth. Stereoscopic vision probably evolved as a means of survival [6]. In games, it increases the experience of immersion and spatial presence, but also what is known as simulator sickness [2]. Results related to attention and cognitive involvement indicate more direct and less deliberate interactions with stereoscopic games, pointing towards a more natural experience through stereoscopy. However, these advantages come with health concerns; Section 2.2.2 discusses the possible side effects stereo images can cause users.


  • Figure 2.2: Example of Active Stereo Image

    2.2.1 Rendering Stereo Images

    Rendering stereoscopic images using a graphics toolkit is a straightforward task. There are two main ways to display the images on the screen: passive and active stereo. In passive stereo mode, two images are rendered using different polarizing filters and projected superimposed onto the same screen; viewers are required to wear polarized glasses. This is the model used in cinemas. Active stereo, on the other hand, consists of vertically splitting the screen into left and right halves and displaying on each half the image as perceived by the corresponding eye (see Figure 2.2). This mode is emphasized in this section because it is the mode HMDs use.

    The simplest way is to render the scene once for each eye, though it isn't the fastest. There are several ways to render a stereoscopic scene, normally depending on the hardware and toolkit available. Nowadays it is possible to find specialized hardware to speed up the rendering task: such hardware usually has a quad-buffer system, providing front and back buffers for each eye, and is common in high-end applications [7].

    In general, the idea is to set up two virtual cameras mimicking the behaviour of the left and right eyes respectively. The virtual scene is rendered through each camera and stored in a buffer. With the screen split into left and right halves, at each screen refresh the scene from each buffer is projected onto the respective half, one half showing what is seen by the left eye and the other what is seen by the right eye. To achieve an optimal effect in the final rendering, three important aspects must be taken into account:

    1. the distance between the cameras, called the inter-pupillary distance (IPD), must be scaled to human proportions;

    2. the camera projection axes must be parallel to each other, with the left and right views independent of one another; and

    3. perspective projection mode should be used [8].

    Figure 2.3 shows the camera model that should be adopted.

    Figure 2.3: Cameras with respective projection cones [9]
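    Aspects 1 and 2 above can be sketched in a few lines. The following is an illustrative sketch in plain Python, not the actual rendering code (which in HMD pipelines lives in the engine or SDK); the IPD value is an assumed human average.

```python
# Illustrative sketch: a parallel-axis stereo camera pair. The two eye
# positions are offset by half the inter-pupillary distance (IPD) along
# the camera's right vector, while both forward axes stay parallel.

IPD = 0.064  # assumed average human IPD in metres

def stereo_eye_positions(center, right_axis, ipd=IPD):
    """Return (left_eye, right_eye) world positions for a head at
    `center` whose unit right vector is `right_axis`."""
    half = ipd / 2.0
    left_eye = tuple(c - half * r for c, r in zip(center, right_axis))
    right_eye = tuple(c + half * r for c, r in zip(center, right_axis))
    return left_eye, right_eye

# The scene is then rendered once per eye with a perspective projection,
# each result going to one half of the display (active stereo).
left_eye, right_eye = stereo_eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
```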

    2.2.2 Health Concerns

    One of the biggest drawbacks of devices that use stereo images is the adverse effects they may have on users. Recently, game developers have expressed concern about the future of such technologies due to simulator sickness. Aaron Foster, developer of the horror game Routine [10], wrote recently in a Steam Community update that the development team had to slow down the VR integration due to motion sickness, arguing that though his team is very excited about the level of immersion it brings to the game, they are skeptical and, in his own words, "for now, we can't fully commit to a VR version of Routine."

    Indeed, the way our brain processes a real scene differs from the way it processes a stereo image of that scene. In the natural world, objects at different distances provide different amounts of stimulus to the accommodative system. On a computer screen, by contrast, the stimulus is always the same, because there is only one focal distance, even when objects are placed at different depths within the scene. In other words, in the real world, when we focus on an object, those around it appear blurred, while on a screen the entire scene is always in focus. Howarth [3] describes these topics in depth and explains how movement, IPD, discrepancy and focus influence the user experience.

    One recurring side effect reported by users of stereoscopic displays is visually induced motion sickness (VIMS). When people are exposed to motion, e.g. travelling by car, plane or cruise ship, they may develop symptoms such as pallor, sweating, nausea and, in some cases, vomiting [11, 12, 13]. As the source of this sickness is our sight, the same symptoms can be induced by moving images on displays. VIMS arises when the movement in an image gives a sense of vection (the illusion of self-motion). Bles [11] defined motion sickness as follows: "all situations which provoke motion sickness are characterized by a condition in which the sensed vertical as determined on the basis of integrated information from the eyes, the vestibular system and the non-vestibular proprioceptors is at variance with the expected vertical as predicted on the basis of previous experience."


  • 2.3 Applications

    Because the technology is so new (devices have been available for only a year), research involving the Oculus Rift is currently limited. Contributions take the form of open-source projects, work under development, or short publications. In this section we list the most significant contributions in the literature.

    Bolton [14] presented PaperDude, a VR cycling-based exergame inspired by Atari's Paperboy. The implementation used an Oculus Rift VR headset, a Trek FX bicycle attached to a Kickr power trainer, and a Kinect camera. The user rides the bike and throws papers into mailboxes. The Kinect sensor captures the player's arm movements, providing a natural input interface, while the HMD increases the player's immersion in the environment.

    Pittman [15] used the Rift combined with other input devices for robot navigation. The results show a preference for head-rotation motions over head gestures. The subjects found that, though it could cause nausea, head movement was a very effective input method.

    Ikeuchi [16] combined a drone, a Kinect and an HMD to simulate flying. A video stream is captured and shown in the HMD, while the Kinect tracks the player's movements. Players control the drone through natural gestures, promoting a realistic experience of flight.

    Halley-Prinable [17] used fear to compare the level of immersion of the Rift with that of traditional monitors. A game was developed, and the subjects played half the time with a traditional screen and the rest wearing the Oculus Rift; each player's heart rate was recorded during the experiment. Of the 56 subjects, 2 found the screen more immersive, 3 found both options equally immersive, and the remaining 51 found the Rift the most immersive. From the questionnaire and the heart-rate data, the author also concluded that fear levels increased when using the Rift.


  • Chapter 3

    The Oculus Rift

    The Oculus Rift is a virtual reality HMD developed by Oculus VR. The project received community support on Kickstarter [18] to release its first version. Currently only the Development Kit (DK) version is available, and the consumer version is expected in 2015. The Rift is seen as the mark of a new era for virtual reality: it is free of most of the restrictions of previous products such as Nintendo's Virtual Boy, CAVE environments [17] and the Nvis SX60. The Virtual Boy was a commercial failure for a number of reasons, including its price and the discomfort of using it, while CAVE environments are also expensive and too large. Young [19] compared the Rift and the Nvis on perception and action tasks; the results show that, although some people felt less simulator sickness with the Nvis, it is expensive, and the Rift consistently outperformed it on all other aspects.

    The Rift has two main components: the headset and the control box. A diagram is shown in Figure 3.1.

    3.1 Technical Aspects

  • Figure 3.1: Oculus Rift Headset and Control Box [9]

    The company offers a Software Development Kit (SDK) [9] for developers to adapt and build new games for the device. The first release of the development kit came in 2013 with the DK; the current version of the DK was released in July 2014. The SDK offers a reliable source of code, samples and documentation about the features and capabilities of the device, and official support for commercial and open-source game engines. There are normally two ways to develop for the Rift: to use a game engine, or to build a stereoscopic rendering environment anew.

    Oculus VR offers official support for Rift development in the Unity and Unreal game engines. Other non-commercial engines that support the Rift are also available. On the other hand, for developers who want to create game engines with native support for the Rift, or even simple games without an engine, the SDK offers a C interface that can easily be used to set up the device; the developer also needs a graphics toolkit such as OpenGL or DirectX. The C interface allows developers to link the code from other languages such as Python or Java.

    One important point in developing applications for the Rift is that the stereo images can't be used on their own, as there are wide-angle optics in front of the display. The lenses cause a pincushion distortion and chromatic aberration at the edges. Therefore, there must be a post-processing step called warping, in which a barrel distortion is applied to the original image. The two distortions are compared in Figure 3.2.


  • Figure 3.2: Image warping for the Rift

    The warping is normally done at shader level, and from version 0.2 of the SDK this functionality is provided by the SDK itself, meaning the developer can simply pass references to the textures and the API will perform the distortion (SDK rendering mode). Developers can also customize the distortion shaders (client-side rendering), as well as combine other shaders in the process.
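    The warping step amounts to a radial remapping of lens-centred coordinates. The sketch below is plain Python rather than the actual distortion shader, and the coefficients k1 and k2 are illustrative values, not the Rift's calibrated distortion constants.

```python
# A minimal sketch of barrel distortion: each point's radial distance
# from the lens centre is scaled by a polynomial in r^2. Sampling the
# rendered frame at the scaled coordinate compresses the output towards
# the lens centre, cancelling the lenses' pincushion distortion.

def barrel_distort(x, y, k1=0.22, k2=0.24):
    """Map a point (x, y) in lens-centred normalised coordinates to its
    radially scaled position (k1, k2 are illustrative coefficients)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Points near the centre barely move; points near the edge are scaled
# the most, which is where the lenses distort most strongly.
```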


  • Chapter 4

    Game Development

    The game development process is complex and time-consuming and demands broad knowledge of graphics, artificial intelligence and game design; in the case of HMDs, knowledge of all aspects of immersive environments is also needed. Indeed, the paradigms of game design and user-interface components must be revised in order to improve the user's immersion and engagement with the game. This chapter describes the game's development process and the challenges we faced.

    4.1 Development Environment

    Before deciding how to proceed, we carried out a background study to understand the concepts involved in stereoscopy (Section 2.2) and developed some examples using the OpenGL toolkit. From there, we looked at previous studies involving applications for the device (Section 2.3). We then turned to the device itself, studying the SDK and evaluating examples in C/C++ and in the Unity engine (Section 3.1). After building a solid basis, we decided to build a game using an engine, applying the concepts studied.

    The game was developed for the Oculus Rift DK version 1.1 with SDK version 0.3.2 preview 2, integrated with Unity Engine Pro 4.5.2f1 [20] for educational use. We chose the Unity game engine because building everything from scratch would have been time-consuming and we would not have been able to focus on the game itself. Unity support is offered in the form of a Unity Prefab 1 that is attached to a Unity project. The scripting language used is C#, which is also the main programming language for development in Unity.

    The machine used was a Dell laptop with 4 GB of RAM and an Intel Core i5-4200U processor at 1.60 GHz (64-bit), with an NVIDIA GeForce 740M video card, running Microsoft Windows 8. We also used a repository for code maintenance: a Git-based service called Bitbucket [21].

    Finally, we had to create an application, keeping in mind that time was limited and the game should be simple enough to be finished and evaluated in time. In the previous studies we found that applications set in space gave users very interesting experiences, and one idea came out: Space Invaders. The idea was evaluated and validated, and we then started the development process.

    4.2 FPSpaceInvaders

    The game we developed is a remake of the classic arcade game Space Invaders, developed by Tomohiro Nishikado and released in 1978. The original version is a 2D space in which the player controls a cannon and can move it sideways to shoot the alien enemies. The enemies come in waves, attempting to destroy the player by firing at it as they approach the bottom of the screen. If the enemies reach the bottom of the screen, the alien invasion succeeds and the game ends. Figure 4.1 shows the game design.

    Our remake is a 3D version in which the player is a spaceship and the environment is a stellar skybox. The spaceship can move up and sideways, while the attached cannon is controlled by the HMD sensor, which means that to aim at an enemy spaceship, the player can simply rotate his or her head towards the enemy and shoot with the mouse.

    1 An asset type that allows a GameObject to be stored complete with its components and properties.

  • Figure 4.1: Snapshot from the original Space Invaders

    4.2.0.1 Design Level

    The design started with a search for ways to create a realistic space environment. First, we looked for a stellar skybox and found a generator called SpaceScape [22], with which we created a skybox with stars and nebulas. The next step was to create some planets and a Sun. The environment also contains asteroids that may hit the player.

    One important aspect of the development process is object modelling. We were able to model the spaceships in Blender [23], but as this is not our area of expertise, the models turned out too heavy (in memory consumption) and the textures of low quality. That is why we decided to use the spaceship models provided in the Unity demo Space Shooter; they are free and can be obtained from the Unity Tutorials website.

    In order to start the game, the player has to select New Game or Load Game from a main menu. The menu was designed to differ from the conventional style: the player is created as a regular person on a plane, and the menu items are placed in a semicircle around them. To select an option, the player must walk, using the keyboard, through the menu item; the selected action is then performed. This interface offers a more natural interaction. Figure 4.2 shows a segment of the menu where two options can be seen.

    Figure 4.2: Main Menu View (one eye view)

    Item selection by movement is an interesting aspect and could be explored in depth, given its impact on the final user experience. Traditional 2D menus aren't interactive, and we wanted to give the sense that the player is actually in the scene from the moment the game is launched.

    The GameController is responsible for spawning the enemies and asteroids, as well as managing the game flow. A finite state machine (FSM) was created to handle three simple states: running, paused and idle. An FSM is an elegant, simple and extensible way to manage the game states and the actions that must be triggered when changing states. The GameController is also responsible for keeping score.
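    A minimal sketch of such an FSM follows, assuming the three states named above; the event names and transition hooks are illustrative, not taken from the game's C# code.

```python
# A three-state game FSM (idle, running, paused) driven by a transition
# table. Hooks fire on entering/leaving a state, which is where actions
# such as freezing enemies or resuming spawning would be triggered.

class GameStateMachine:
    TRANSITIONS = {
        ("idle", "start"):    "running",
        ("running", "pause"): "paused",
        ("paused", "resume"): "running",
        ("running", "stop"):  "idle",
        ("paused", "stop"):   "idle",
    }

    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        """Follow the transition table; events invalid in the current
        state are simply ignored."""
        nxt = self.TRANSITIONS.get((self.state, event))
        if nxt is not None:
            self.on_exit(self.state)
            self.state = nxt
            self.on_enter(nxt)
        return self.state

    def on_enter(self, state):  # e.g. resume spawning on 'running'
        pass

    def on_exit(self, state):   # e.g. freeze enemies when leaving 'running'
        pass
```

    Extending the game with new states (e.g. a game-over screen) then only requires new rows in the table.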

    4.2.0.2 The Player

    The player is attached to the spaceship, as if manning a cannon on a war tank. To accomplish the goal, the player has to shoot and destroy the enemy spaceships before they complete the invasion. The cannon's aim is attached to the HMD sensor, which means that to shoot a target, all the player has to do is turn their head towards the target and press the left mouse button. The player can also move vertically (keys W and S) and horizontally (keys A and D) on the screen. Figure 4.3 is a snapshot showing the player's aim.

    Figure 4.3: Player's aim (one eye view)

    The player's most important class is the PlayerController, which is responsible for controlling audio, the cannon, life, collision and movement. The cameras are attached to the spaceship; they are wrapped in a Unity Prefab provided with the SDK integration. A few modifications were necessary to make it work in our game, though this was a straightforward task. The PlayerController uses the camera's position and rotation to spawn bullets when the mouse is clicked, so the shot and the crosshair are always synchronized with the HMD's sensor.

    Another important component related to the player is the HUD (heads-up display), which consists of several pieces of information about the character. It is important for the user experience and differs from the traditional style, because it isn't simply a matter of positioning components at screen coordinates. We found that components placed in 3D space were easier to access and created a higher level of immersion. Here we found a lack of guidelines for such environments in the literature, and due to time constraints we weren't able to look more closely into this issue. However, there is no single formula for good results, as there are multiple ways to achieve them.

    Our game contains a very simple HUD with two components: the life bar and the level-progression bar. The first was designed in a curved shape and located on the left-hand side of the screen to follow the eye's curvature, while the second was positioned at the top of the screen. Both were designed using the GIMP editor [24]. Note that, unlike in the traditional paradigm, these components are positioned in world coordinates. Game designers must analyze beforehand the impact the components may have on the game's course, especially when pop-up menus or more detailed HUDs are required, as these may cause confusion and a bad user experience if not well designed.

    The player is destroyed when: (1) the enemies complete the invasion; or (2) the player's life runs out due to collisions with asteroids or shots from enemies. The player can absorb a certain percentage of damage before being destroyed, so the incidents mentioned in (2) may happen several times before the player is destroyed.


  • 4.2.0.3 The Enemies

    The enemies are spaceships whose goal is to destroy the player. In the original game, the enemies don't have artificial intelligence: they are simply spawned at the top of the screen and fly in a zig-zag path towards the bottom. Our remake is very similar, except that they are randomly spawned on a plane in front of the player and fly towards a plane behind the player. If any enemy reaches the latter plane, the invasion is complete and the player is destroyed.
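    A zig-zag path of this kind can be sketched as a sideways oscillation superimposed on steady forward motion. All names and constants below are illustrative, not taken from the game's code.

```python
# Sketch of an enemy position over time: x oscillates around the spawn
# point while z steadily decreases towards the plane behind the player.
import math

def enemy_position(t, spawn_x, spawn_z, speed=2.0, sway=3.0, freq=1.5):
    """Return the (x, z) position at time t for an enemy spawned at
    (spawn_x, spawn_z); sway and freq shape the zig-zag."""
    x = spawn_x + sway * math.sin(freq * t)
    z = spawn_z - speed * t
    return x, z
```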

    Unlike the player, the enemies have no tolerance for damage: when hit by an asteroid or by the player, an enemy ship is destroyed immediately. For future work, it would be interesting to have special ships that require more than one hit to be destroyed, as well as special weapons.

    4.2.0.4 Oculus Rift Integration

The integration with the Rift environment is one of the most important steps in our process. Some issues were mentioned already, such as the design of menus and user interface components (sections 4.2.0.1 and 4.2.0.2). Others arise, such as creating the crosshair so that the player can aim at targets easily. The problem in this case is that the crosshair cannot be placed in screen coordinates, which is the conventional way. We also added a feature that detects when a target is under the aim, changing the crosshair's color. The latter feature can easily be implemented in Unity by casting a ray of length D along the camera's forward vector, then detecting whether it collides with any object. This technique is known as a raycast.
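In Unity this check is a single call to the physics engine's raycast; conceptually it reduces to a ray-geometry intersection test. The sketch below, in plain Python with targets approximated by bounding spheres (an assumption made purely for illustration), shows the idea behind changing the crosshair's color:

```python
def ray_hits_sphere(origin, direction, center, radius, max_dist):
    """True if a ray of length max_dist from origin, along the unit direction
    vector, intersects a sphere approximating a target's volume."""
    # vector from the ray origin to the sphere centre
    oc = [c - o for o, c in zip(origin, center)]
    # projection of oc onto the ray direction (distance to the closest approach)
    t = sum(a * b for a, b in zip(oc, direction))
    if t < 0 or t > max_dist:
        return False  # target is behind the camera or out of range
    # squared distance from the sphere centre to the ray
    d2 = sum(a * a for a in oc) - t * t
    return d2 <= radius * radius

def crosshair_color(camera_pos, forward, targets, max_dist=1000.0):
    """Red when some target is under the aim, white otherwise (illustrative)."""
    for center, radius in targets:
        if ray_hits_sphere(camera_pos, forward, center, radius, max_dist):
            return "red"
    return "white"
```

The forward vector plays the role of the camera's orientation supplied by the head tracker, so turning the head sweeps the ray across the scene.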

Oculus VR provides two Prefabs ready to be attached to an object or scene. The OVRPlayerController consists of a regular player controller with a stereo camera attached. This means it already has the attributes of a regular player, such as walking and gravity, as well as the camera controllers already set up for the Rift. This Prefab was used in the main menu. The second Prefab is the OVRCameraController, which consists of two configured cameras ready to be attached to an object. This Prefab was used on the player spaceship,

as we have an object with different capabilities (e.g. there is no gravity in space) and whose movements differ from those of a first-person controller.

    4.2.0.5 3D Audio Implementation

An important feature added to our game is 3D, or spatialized stereo, audio. A 3D sound simulation can improve the capabilities of VR systems dramatically. Unfortunately, realistic 3D sound simulations are expensive and demand a significant amount of computational power to calculate reverberation, occlusion, and obstruction effects [25]. As evidence of the computational cost, only a few commercial games use this technology.

However, as our application is not computationally heavy, we considered this feature. First, we built an example application in order to evaluate whether it was worth adding to our game. The example showed remarkably good results: the level of localization is high wherever the audio source is located. Given this good experience, we decided to add the feature to our game. Implementing 3D sound in Unity is a straightforward task: add the sound to the object as an Audio Source, set it as 3D sound, and adjust the parameters. An audio source was then attached to the three main components of the scene (player, enemies and asteroids). It is important to note that this technology requires stereo headphones, otherwise the effect will not be perceived. Figure 4.4 shows a snapshot from the Unity environment, with the player and the 3D audio source.
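The parameters Unity exposes (minimum/maximum distance and the rolloff curve) essentially control distance attenuation and left/right panning. The following is only a conceptual sketch of that model in Python, not Unity's actual implementation, and the parameter values are assumptions:

```python
import math

def attenuation(distance, min_dist=1.0, max_dist=500.0):
    """Inverse-distance rolloff: full volume inside min_dist, then falling
    off as min_dist / distance (a common 3D-audio model)."""
    d = max(min(distance, max_dist), min_dist)
    return min_dist / d

def stereo_gains(listener_pos, listener_right, source_pos):
    """Split the attenuated volume between the left and right channels
    according to where the source lies relative to the listener."""
    to_src = [s - l for l, s in zip(listener_pos, source_pos)]
    dist = math.sqrt(sum(c * c for c in to_src)) or 1e-9
    # pan in [-1, 1]: -1 is fully left, +1 is fully right
    pan = sum(a * b for a, b in zip(to_src, listener_right)) / dist
    vol = attenuation(dist)
    # simple linear pan law
    return vol * (1 - pan) / 2, vol * (1 + pan) / 2
```

A source directly to the listener's right therefore sounds only in the right channel, while a distant source is quieter in both, which is exactly the cue that lets players localize approaching enemies.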

The experiments reveal a genuinely new experience when dealing with objects individually; however, when several of them arrive at the same time, the level of perception is lowered. At present, the enemies and asteroids come from only one direction in an open space, so the 3D sound does not play a decisive role in the game. For other genres, such as first-person shooters, this functionality should have a strong effect. The combination of the Oculus Rift with stereo audio has the power to bring an extraordinary experience to the user, and studies in this field should be encouraged.

Figure 4.4: Unity 3D sound parameters and player

    4.3 Empirical Evaluation

As part of the game evaluation, we invited 10 participants to try the game and complete a short questionnaire, shown in Appendix A. In developing our questionnaire we followed the approach used by Jennett [26]. We aimed at collecting information about the users' backgrounds and gaming experience, and their evaluation of our game in terms of level of engagement, immersive experience, and interaction with menus and controls.

The evaluation form was therefore divided into three main sections: (1) demographics, (2) game experience and immersion, and (3) interactions. The first section, demographics, collects information about the participants, such as age, sex, and previous gaming experience. The second section focuses more on the game itself: we gather information about the participants' experience during the game, evaluating factors such as immersion, focus, and sense of spatial presence. Finally, we evaluated the players' interaction with the game regarding access to controls, performance, and menus, and collected suggestions and comments.

Participants were welcomed in the laboratory and verbally informed about the procedure, then played the game for 10-15 minutes. Finally, each participant filled in the questionnaire and was thanked by the test administrator. Note that participants wore stereo headphones during the whole test, in order to take advantage of the 3D sound as well as to prevent distractions and noise.

Most of the invited participants are friends of the test administrator; the others are students of the laboratory where we developed the game. All participants were male students: nine were 18-24 years old and one was 25-34. Six reported Intermediate and one Expert gaming experience, meaning 70% of the participants have relevant experience with games; the other three are Beginners. The most used platform is Desktop (9 participants), followed by Smartphone (7 participants) and PlayStation (4 participants).

Chapter 5

    Results and Discussion

The results obtained give us feedback about what we have done and serve as a guiding factor for future work. However, in-depth studies with more participants are needed to strengthen and detail the results.

Preliminary results show that for most players the sense of spatial presence was enhanced, and they lost track of time while playing because the game was able to hold their full attention. These results match those of [2]. Subjects reported that they felt the game was an experience rather than simply a task, showing that they enjoyed the activity. Subjects also reported a feeling of actually being in the virtual world, stating that this experience was more engaging than their previous ones.

The interactions with the game were an interesting point of this project. Traditionally, aiming at targets is done with the mouse cursor. However, as shown in Section 4.2.0.2, our game uses the tracking device of the HMD to control the aim. This new paradigm caused confusion at the beginning, because players were trying to move the aim with the mouse instead of with the head. This phenomenon is understandable during first experiences in the new environment, because participants were used to aiming with the mouse, and none of them had had any experience with HMDs.

The main menu was the target of both complaints and compliments. While some participants pointed out that the walk-through selection provided an interesting interaction, others thought that it could be done by approaching the item and pressing a key to select, or by using a mouse click. Therefore, more studies are needed to find out which option provides the best experience.

Though most players found the game easy to control, some of them had difficulty finding the right keys to press, especially at the beginning. This happened because they cannot see the keys they are pressing. A study using joysticks would be interesting, to evaluate whether they could be an effective replacement for the keyboard. The stereo audio experience could be clearly noticed when enemies were nearby; however, the effect was diminished when several objects were emitting sounds at the same time.

The general complaint among the participants was the screen resolution, referring to the evident individual pixels that can be seen. Indeed, this problem was reported by other authors [14, 15, 17] and contributed to VIMS symptoms. Oculus VR claims that the DK2 fixes this problem by having a resolution of 960 x 1080 per eye (against the 1280 x 800 total of the DK1).

A minority of the participants suffered from early symptoms of VIMS after the tests. We attribute these effects to the fact that the game was designed with continuous waves of enemies, so there were no intervals. We noticed that an unnecessary effort was demanded of the players, which may have contributed to the symptoms when they stopped playing. Future work should allow moments of rest, such as a small interval between waves, during which the player can relax instead of being constantly focused. Studies are also needed to design interactions in a way that prevents players from moving their heads too fast and abruptly, which may cause the symptoms mentioned before, as well as neck discomfort.

Chapter 6

    Summary

    6.1 Conclusion

Recent advances in virtual reality have brought it into a new era. One of the most anticipated devices was the Oculus Rift, the first HMD affordable to the general population, bringing the power of the virtual world to a previously unexplored market. Applications of such devices can be found in the most diverse fields of study.

In this project we developed an application for the Rift environment. Starting by studying the stereoscopic images used in such devices, we evaluated methods of rendering them and the dangers of poor quality. Later, we studied the Rift itself and previous applications developed for it, in order to understand what changes with respect to common design principles. Finally, we developed a game based on the classic Space Invaders, in which the player controls the spaceship's weapon aim with the tracking sensor of the HMD; this input mode was combined with mouse input for shooting and keyboard input for moving the spaceship. We conducted a small experiment with subjects and drew some conclusions from it.

As previously mentioned, the overall experience depends on several factors taken together. We can say that our game fulfilled its goals and achieved good results. The Oculus Rift is indeed a very powerful device that should become widespread in the next few years, bringing immersive experiences to consumers. Efforts are being made by the game industry to adapt and build games for this new environment. This report can be used as a starting read for beginners on the topic, and it points out some items that should be watched to allow more immersive game experiences within the Rift.

6.2 Future Work

The Rift is still in its early development; however, the outlook is very promising. The field of study that has grown around this technology is still in its early stages, and therefore there is a lot to be done to improve the user experience. Generally speaking, we found a lack of guidelines on how to create user interfaces (i.e. menus, GUIs, HUDs) that take advantage of this environment. Another challenge is to create efficient input methods, in part to prevent users from suffering adverse effects of using the Rift, and also to create interactive and immersive experiences by combining devices such as the HMD's sensor, mouse, keyboard, joysticks, cameras such as the Kinect, and Leap Motion sensors.

Specifically regarding the developed game, there is room for improvements and extensions. We found the need for a mechanism that allows the user to rest between waves of enemies, since its absence may have demanded unnecessary effort from the players. Furthermore, the aspects mentioned above about input methods apply here as well. It is possible to extend the current game by adding special spaceships and weapons, as well as by creating different space environments. The results encourage us to carry on with this work.

Appendix A

    Questionnaire

Immersion Questionnaire

    Please answer the following questions by circling the relevant answer.

    Demographics

    Sex:

    Female Male

    Age group:

    12-17 18-24 25-34 35 and over

    Occupation (please specify your field of work or study):

    Do you play video-games?

    Never Seldom Often

    How would you rate your gaming experience?

    None whatsoever Beginner Intermediate Expert

    If you play video-games which of the following platforms have you used? (You can choose multiple

    options)

    Desktop/Laptop

    Smartphone

    Tablet

    Playstation

    Xbox

    Wii

    Other (please specify)

Your Experience of the Game.

    Please answer the following questions by circling the relevant number.

    To what extent did the game hold your attention?

    Not at all 1 2 3 4 5 A lot

    To what extent did you feel you were focused on the game?

    Not at all 1 2 3 4 5 A lot

    How much effort did you put into playing the game?

    Very little 1 2 3 4 5 A lot

    Assessing Immersion

    To what extent did you lose track of time?

    Not at all 1 2 3 4 5 A lot

    To what extent did you feel consciously aware of being in the real world whilst playing?

    Not at all 1 2 3 4 5 Very much so

    To what extent were you aware of yourself in your surroundings?

    Not at all 1 2 3 4 5 Very aware

    To what extent did you notice events taking place around you?

    Not at all 1 2 3 4 5 A lot

    Did you feel the urge at any point to stop playing and see what was happening around

    you?

    Not at all 1 2 3 4 5 Very much so

    To what extent did you feel that you were interacting with the game environment?

    Not at all 1 2 3 4 5 Very much so

To what extent did you feel as though you were separated from your real-world

    environment?

    Not at all 1 2 3 4 5 Very much so

    To what extent did you feel that the game was something you were experiencing, rather

    than something you were just doing?

    Not at all 1 2 3 4 5 Very much so

    To what extent was your sense of being in the game environment stronger than your

    sense of being in the real world?

    Not at all 1 2 3 4 5 Very much so

    To what extent did you feel emotionally attached to the game?

    Not at all 1 2 3 4 5 Very much so

To what extent were you interested in seeing how the game's events would progress?

    Not at all 1 2 3 4 5 A lot

    How much did you want to win the game?

    Not at all 1 2 3 4 5 Very much so

    Were you in suspense about whether or not you would win or lose the game?

    Not at all 1 2 3 4 5 Very much so

    To what extent did you find the immersive experience more engaging than your previous gaming

    experiences?

    Not at all 1 2 3 4 5 Very much so

Assessing Controls

    To what extent did you find the Menu selection easy to control?

    Not at all 1 2 3 4 5 Very much so

    To what extent did you find the game easy to control?

    Not at all 1 2 3 4 5 Very much so

To what extent did you find it easy to assess your performance during the game?

    Not at all 1 2 3 4 5 A lot

    To what extent did you enjoy the graphics and the imagery?

    Not at all 1 2 3 4 5 A lot

    How much would you say you enjoyed playing the game?

    Not at all 1 2 3 4 5 A lot

    Would you like to play the game again?

    Definitely not 1 2 3 4 5 Definitely yes

    Suggestions on how to improve Menus and/or interaction with Menus?

    Suggestions on how to improve the Graphics?

    Any other comment?

References

[1] Oculus VR, 2014. URL http://www.oculusvr.com/.

[2] Jonas Schild, Joseph LaViola, and Maic Masuch. Understanding user experience in stereoscopic 3D games. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '12, pages 89–98, New York, NY, USA, 2012. ACM. ISBN 978-1-4503-1015-4. doi: 10.1145/2207676.2207690. URL http://doi.acm.org/10.1145/2207676.2207690.

[3] Peter A. Howarth. Potential hazards of viewing 3-D stereoscopic television, cinema and computer games: a review. Ophthalmic and Physiological Optics, 31(2):111–122, 2011. ISSN 1475-1313. doi: 10.1111/j.1475-1313.2011.00822.x. URL http://dx.doi.org/10.1111/j.1475-1313.2011.00822.x.

    [4] R. Shields. The Virtual. Key ideas. Routledge, 2003. ISBN 9780415281805.

    URL http://books.google.co.uk/books?id=x6JLD8pHdhIC.

    [5] R.A. Earnshaw, M.A. Gigante, and H. Jones. Virtual Reality Systems. Aca-

    demic Press, 1993. ISBN 9780122277481.

[6] Vision 3D, 2014. URL http://www.vision3d.com/stereo.html.

[7] John M. Zelle and Charles Figura. Simple, low-cost stereographics: VR for everyone. In Proceedings of the 35th SIGCSE Technical Symposium on Computer Science Education, SIGCSE '04, pages 348–352, New York, NY, USA, 2004. ACM. ISBN 1-58113-798-2. doi: 10.1145/971300.971421. URL http://doi.acm.org/10.1145/971300.971421.

[8] Paul Bourke. Stereographics theory. URL http://paulbourke.net/exhibition/vpac/theory.html.

[9] Oculus Rift SDK, 2014. URL http://static.oculusvr.com/sdk-downloads/documents/Oculus_SDK_Overview_0.3.2_Preview2.pdf.

[10] Routine game status update, 2014. URL http://steamcommunity.com/sharedfiles/filedetails/updates/92985806.

[11] W. Bles, J.E. Bos, B. De Graaf, E. Groen, and A.H. Wertheim. Motion sickness: Only one provocative conflict? Brain Research Bulletin, 47(5):481–487, 1998. URL http://www.scopus.com/inward/record.url?eid=2-s2.0-0032534164&partnerID=40&md5=99f5dffb04410583a8e3a7fa76cca331.

[12] Jelte E. Bos, Willem Bles, and Eric L. Groen. A theory on visually induced motion sickness. Displays, 29(2):47–57, 2008. ISSN 0141-9382. doi: 10.1016/j.displa.2007.09.002. URL http://www.sciencedirect.com/science/article/pii/S0141938207000935.

[13] P.A. Howarth and M. Finch. The nauseogenicity of two methods of navigating within a virtual environment. Applied Ergonomics, 30(1):39–45, 1999.

[14] John Bolton, Mike Lambert, Denis Lirette, and Ben Unsworth. PaperDude: A virtual reality cycling exergame. In CHI '14 Extended Abstracts on Human Factors in Computing Systems, CHI EA '14, pages 475–478, New York, NY, USA, 2014. ACM. ISBN 978-1-4503-2474-8. doi: 10.1145/2559206.2574827. URL http://doi.acm.org/10.1145/2559206.2574827.

[15] Corey Pittman and Joseph J. LaViola, Jr. Exploring head tracked head mounted displays for first person robot teleoperation. In Proceedings of the 19th International Conference on Intelligent User Interfaces, IUI '14, pages 323–328, New York, NY, USA, 2014. ACM. ISBN 978-1-4503-2184-6. doi: 10.1145/2557500.2557527. URL http://doi.acm.org/10.1145/2557500.2557527.

[16] Kohki Ikeuchi, Tomoaki Otsuka, Akihito Yoshii, Mizuki Sakamoto, and Tatsuo Nakajima. KinecDrone: Enhancing somatic sensation to fly in the sky with Kinect and AR.Drone. In Proceedings of the 5th Augmented Human International Conference, AH '14, pages 53:1–53:2, New York, NY, USA, 2014. ACM. ISBN 978-1-4503-2761-9. doi: 10.1145/2582051.2582104. URL http://doi.acm.org/10.1145/2582051.2582104.

[17] Adam Halley-Prinable. The Oculus Rift and immersion through fear. Technical report, Bournemouth University, 2013. URL http://www.academia.edu/5387318/The_Oculus_Rift_and_Immersion_through_Fear.

[18] Kickstarter: bring creative projects to life, 2014. URL www.kickstarter.com.

[19] Mary K. Young, Graham B. Gaylor, Scott M. Andrus, and Bobby Bodenheimer. A comparison of two cost-differentiated virtual reality systems for perception and action tasks. In Proceedings of the ACM Symposium on Applied Perception, SAP '14, pages 83–90, New York, NY, USA, 2014. ACM. ISBN 978-1-4503-3009-1. doi: 10.1145/2628257.2628261. URL http://doi.acm.org/10.1145/2628257.2628261.

[20] Unity 3D engine, 2014. URL http://unity3d.com/.

[21] Bitbucket: unlimited private code repositories, 2014. URL https://bitbucket.org/.

[22] Alex C. Peterson. Spacescape: tool for creating space skyboxes with stars and nebulas, 2014. URL http://alexcpeterson.com/spacescape.

[23] Blender modeling toolkit, 2014. URL http://www.blender.org/.

[24] GIMP - the GNU Image Manipulation Program, 2014. URL www.gimp.org.

[25] Kai-Uwe Doerr, Holger Rademacher, Silke Huesgen, and Wolfgang Kubbat. Evaluation of a low-cost 3D sound system for immersive virtual reality training systems. IEEE Transactions on Visualization and Computer Graphics, 13(2):204–212, 2007. ISSN 1077-2626. doi: 10.1109/TVCG.2007.37.


[26] Charlene Jennett, Anna L. Cox, Paul Cairns, Samira Dhoparee, Andrew Epps, Tim Tijs, and Alison Walton. Measuring and defining the experience of immersion in games. Int. J. Hum.-Comput. Stud., 66(9):641–661, September 2008. ISSN 1071-5819. doi: 10.1016/j.ijhcs.2008.04.004. URL http://dx.doi.org/10.1016/j.ijhcs.2008.04.004.
