Stepping into Virtual Reality: Other VR Applications



13 Other VR Applications

13.1 Vehicle Simulators

It is common practice to develop vehicle simulators based on physical mock-ups. They might be equipped with hydraulic platforms or based on a real car placed in front of large rear-projection screens. The primary application of vehicle simulators is driver training and behavior analysis (see Figure 13.1); they have also been used for virtual prototyping.

Fig. 13.1: A car driving simulator

Most car manufacturers use vehicle simulators as part of product conception. Car simulators allow engineers to test the car before it is built and evaluate ergonomic aspects, interior design, and even road behavior.


For virtual prototyping, simulators based on physical reproduction of cabins require substantial time and cost to be manufactured. Therefore, they cannot be reconstructed each time to reflect every part updated in the CAD design. In particular, this difficulty appears when performing ergonomic and visibility evaluations according to changes in the car interior.

Immersive Virtual Reality provides a natural alternative. Virtual prototypes can replace physical mockups for the analysis of design aspects like: layout and packaging efficiency; visibility of instruments, controls, and mirrors; reachability and accessibility; clearances and collisions; human performance; and aesthetics and appeal. The goal is to immerse a person in the virtual car interior, to study the design and interact with the virtual car.

Completely virtual car simulators suitable for analysis activities are still unfeasible due to the limitations of the technology, in particular concerning haptic feedback. Nevertheless, some prototypes have been developed and tested.

In [325], Kallman et al. present a simulator system built for both training and ergonomics-related tasks. The system can be used in two different configurations: the first is based on a physical mockup of the vehicle, equipped with a force-feedback steering wheel, gearshift, and pedals (see Figure 13.2a); the second configuration is based on a fully virtual control metaphor, allowing one to interact with the vehicle only through the use of motion trackers and data gloves.

(a) Vehicle simulator (b) Virtual cockpit

Fig. 13.2: Virtual vehicle simulators

Virtual Reality interfaces can also be used to teleoperate real vehicles. An example of this kind of application is presented by Ott et al. in [326]. The authors developed a virtual cockpit with haptic feedback provided by a Haptic Workstation™. Four alternative teleoperation interfaces were implemented. Each interface exploited different aspects of Virtual Reality and haptic technologies: realistic 3D virtual objects, haptic force-feedback, and free arm gestures. Tests with multiple users were conducted to evaluate and identify the best interface in terms of efficiency and subjective user appreciation.

The interface that received the best evaluation was a gesture-based interface that used free arm gestures to drive the vehicle (a small robot). The authors concluded that an efficient interface for direct teleoperation must have rich visual feedback in the form of passive controls such as speedometers and direction indicators. Such visual aids were appreciated by users once they were released from the burden of manipulating the virtual steering wheel and throttle. Force feedback should be exploited not as a way to simulate tangible objects (an interface resembling reality) but to guide the user's movements (a gesture-based interface).

The free-form (gesture-based) interface was efficient because it did not require precise manipulations. It reduced the amount of concentration required to drive. The user could direct her attention to the rest of the visuals and use them to improve the driving.

Virtual interfaces resembling real cockpits are usually more intuitive in the sense that users know immediately how they work, from previous real-world experience. Nevertheless, the available hardware can make them less efficient due to problems with the grasping mechanisms and force-feedback inherent to the Haptic Workstation™. Other more advanced haptic interfaces may provide better results.

The work of Salamin et al. [327] is another example of experimentation with virtual cockpits using the Haptic Workstation™. In collaboration with the Renault Trucks Cockpit Studies Department, the authors developed a virtual cockpit with a tangible gearbox. The user was able to grasp and manipulate the gearbox lever and feel the mechanical forces due to the gearbox. Such forces were generated according to the specifications provided by Renault Trucks. Other forces were applied to constrain the user's movements and provide some comfort by partially compensating for the weight of the exoskeleton (a "gravity compensation" mechanism described in [328]). Figure 13.2b shows the virtual cockpit and the virtual representation of the user's hand.

13.2 Manufacturing

Manufacturing comprises some basic classes of engineering tasks. Such tasks can benefit from the application of VR through high-fidelity and interactive visualizations. The simulation of assembly processes requires taking into account several assembly conditions: availability of assembly room, supply of materials, and handling of parts and assembly tools. A virtual environment allows one to interactively produce an assembly sequence plan that considers the real production environment. The basic principles of this planning are the structural and geometric construction of the product and all restrictions that arise from the limited assembly space and the restricted accessibility. In [329], Ou et al. present a Web-based manufacturing simulation system based on VRML (Virtual Reality Modeling Language). A more recent work [330] uses X3D, the successor to VRML, as the basis for a distributed virtual manufacturing system for small and medium enterprises. The system allows one to perform collaborative tasks including product design, manufacturing, and resource sharing through the Web.
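The core of such an assembly sequence plan is an ordering of parts that respects precedence restrictions. The following minimal sketch (not taken from the cited systems; the part names and constraints are invented for illustration) orders assembly steps with Kahn's topological sort, so that each part is mounted only after every part it depends on:

```python
from collections import deque

def assembly_sequence(parts, precedence):
    """Return an assembly order satisfying precedence constraints.

    precedence is a list of (before, after) pairs: 'before' must be
    mounted prior to 'after'. Raises ValueError on cyclic constraints.
    """
    successors = {p: [] for p in parts}
    indegree = {p: 0 for p in parts}
    for before, after in precedence:
        successors[before].append(after)
        indegree[after] += 1

    ready = deque(p for p in parts if indegree[p] == 0)
    order = []
    while ready:
        part = ready.popleft()
        order.append(part)
        for nxt in successors[part]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(parts):
        raise ValueError("cyclic precedence constraints")
    return order

# Hypothetical car-door subassembly: frame first, then mechanisms, then trim.
parts = ["frame", "window", "lock", "panel"]
precedence = [("frame", "window"), ("frame", "lock"),
              ("window", "panel"), ("lock", "panel")]
print(assembly_sequence(parts, precedence))  # → ['frame', 'window', 'lock', 'panel']
```

In a real planning tool, the precedence pairs would be derived from the product geometry and the accessibility restrictions mentioned above, rather than listed by hand.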

In [331], Salonen et al. present a system that demonstrates the use of augmented reality and multimodal interfaces for assisting human operators during assembly tasks. The system uses a wooden 3D puzzle as a simplified assembly task in a factory. The prototype uses 3D models generated with a CAD system and translated to STL (the standard for rapid prototyping systems) or VRML. The visualization can be done using a monitor or an HMD. A Webcam acquires images of the user's hands and the pieces of the 3D puzzle. The wooden pieces carry fiducial markers that can be recognized through image processing, using the Augmented Reality Toolkit (ARToolkit, freely available at http://www.hitl.washington.edu/artoolkit/). Once the markers are recognized, the orientation of each part can be computed and 3D images can be superimposed on the real image. The assembly work is described in an XML file that specifies the marker board file, the models and their identifiers (the markers), and the work phases and the parts that belong to each phase. While a work phase is active, the application visualizes the part that belongs to that phase and shows the required action, for example, pointing with an arrow to the next part that should be taken and displaying an animation of how to mount it on the object being assembled. The user can move forward and backward through the assembly phases using different modalities: keyboard, speech, or gesture commands. The most convenient modalities are the speech and gesture user interfaces. The system can also be controlled by choosing the appropriate function from a virtual menu using hand gestures.
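The structure of such an assembly-work description might look like the following sketch. The element and attribute names here are invented for illustration; the source does not give the actual schema used by Salonen et al.:

```xml
<!-- Hypothetical assembly description; element and file names are illustrative only. -->
<assembly markerBoard="board.cfg">
  <model id="marker01" file="base.wrl"/>
  <model id="marker02" file="side_panel.wrl"/>
  <phase name="mount-side-panel">
    <part ref="marker02"/>
    <action type="animation" file="mount_side_panel.anim"/>
  </phase>
</assembly>
```

The application would walk the list of phase elements in order, highlighting each referenced part and playing its associated action as the operator advances.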

Fig. 13.3: Augmented reality for operator training

Augmented and mixed reality seem to be excellent tools for manufacturing planning, simulation, and training. In [332], Liverani et al. describe a system for real-time control of human assembly sequences of mechanical components. The development method they propose involves a CAD environment, a hardware system called Personal Active Assistant (PAA), and a set of mixed reality features. The whole scheme is targeted at reducing the gap between engineers and manual operators by means of CAD and mixed reality technologies. The system is based on a CAD assembly module and mixed reality wearable equipment. It can be used to improve several activities in the industrial field, such as operator professional training, optimal assembly sequence seeking, or on-field teleconferencing (suitable for remote collaboration or for full exploitation of concurrent engineering suggestions during design and setup stages). The main characteristic of the PAA is a real-time wireless link to a remote server or designer workstation, where the project geometric database is stored. The mixed reality wearable equipment is a see-through HMD and a head-mounted camera. The user can freely operate in the mixed environment, while the camera records the human-driven assembly sequence and checks its efficiency and correctness via object recognition: an incremental subassembly detection algorithm has been developed to achieve complex dataset monitoring. Conversely, designers or assembly planners can exploit the peculiarities of mixed reality-based assembly: straightforward interaction with the assembly operator can be obtained by sending vocal advice or displaying superimposed visual information on the real scene.

Another example of the use of augmented reality for operator training in a manufacturing context is the work of Vacchetti et al. [333], partially developed in the framework of the IST STAR European project. The system tracks the 3D camera position by means of a natural feature tracker, which, given a rough CAD model, can deal with complex 3D objects. The tracking algorithm is described in [204]. The tracking method is robust and can handle large camera displacements and aspect changes (see Figure 13.3). The target applications are industrial maintenance, repair, and training. The tracking robustness makes the AR system able to work in real environments such as industrial facilities, not only in the laboratory. Real-time object recognition has been used to augment real scenes with virtual humans and virtual devices that show how to operate various types of industrial equipment.

13.3 Entertainment

Virtual Reality constantly borrows video game technology (advanced rendering algorithms, user interaction paradigms, and interfaces) and vice versa; some of the advanced concepts and technologies that were created in research laboratories have found their way into the game industry. VR devices like HMDs, data gloves, and sensors have been used with different degrees of success in video games. In some cases, advanced computer graphics are complemented with highly realistic interfaces to produce a compelling experience; see Figure 13.4. One of the most recent success stories is the case of the Nintendo Wii. The Wii remote game controller, equipped with accelerometers and an infrared sensor that detect its spatial orientation, makes possible the creation of appealing games. VR and human-computer interaction researchers have studied this kind of gesture-based interface before and continue to test novel concepts. Sometimes researchers use commercially available devices. This is the case of Shirai et al. [334], who developed motion-analysis methods using the Wii remote.
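Motion analysis of this kind typically starts from raw accelerometer samples. The following sketch (not from [334]; the threshold value and axis conventions are assumptions) detects a simple "swing" gesture by looking for peaks in acceleration magnitude above a threshold:

```python
import math

SWING_THRESHOLD = 2.0  # assumed: magnitude peaks above 2 g count as a swing

def detect_swings(samples):
    """Count swing gestures in a stream of (ax, ay, az) samples, in g.

    A swing is one upward crossing of the magnitude threshold;
    consecutive over-threshold samples count as a single gesture.
    """
    swings = 0
    above = False
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > SWING_THRESHOLD and not above:
            swings += 1
            above = True
        elif magnitude <= SWING_THRESHOLD:
            above = False
    return swings

# Resting samples read about 1 g (gravity); two distinct peaks give two swings.
samples = [(0, 0, 1.0), (0.5, 2.5, 1.0), (0, 0, 1.0),
           (3.0, 0.2, 0.5), (0, 0, 1.0)]
print(detect_swings(samples))  # → 2
```

Real gesture recognizers are considerably more elaborate (filtering, orientation estimation, template matching), but thresholding the acceleration magnitude is a common first step.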

Fig. 13.4: Simulator or advanced video game

A very representative example of VR research applied to the entertainment industry is the case of VR Studio, the Walt Disney Company's center of excellence in real-time graphics and interactive entertainment. The studio was founded in 1992 to explore the potential of Virtual Reality technology for theme park attractions. They have also developed Toontown (http://play.toontown.com), one of the first massively multiplayer online worlds designed for children ages 7 and older.

One of the first VR-based attractions they developed was an interactive system that uses HMDs to immerse multiple users in the movie Rocketeer. Another attraction, based on the movie Aladdin, also used an HMD but was a single-user system.

VR Studio was responsible for the development of three of the main DisneyQuest attractions (a theme park that opened in Orlando in 1998):

• Aladdin's Magic Carpet Ride, a four-person race through the world of Aladdin. The players use custom-designed HMDs and climb aboard motorcycle-like vehicles to begin the game.

• Hercules in the Underworld. This attraction uses a five-screen immersive projection theater (IPT). This is similar to a CAVE, but with a hexagonal shape. The players use 3D glasses and a joystick to navigate through the virtual world.

• Pirates of the Caribbean: Battle for Buccaneer Gold. The game takes place in a four-screen immersive theater built around a ship-themed motion platform. The ship has a real steering wheel and six physical cannons. The players wear stereo glasses and all screens are in stereo.

A historical review of VR Studio describing its research experience is presented by Mine in [217]. One of the most interesting lessons reported by Mine is the fact that immersive multiuser (CAVE-like) displays are very effective not only as game interfaces but also as design tools. This kind of display allows one to share experiences and ease communication between participants while examining a prototype or playing a game.

Another important point reported by VR Studio is that the use of physical interfaces helps to immerse users in a VR experience. An example of this is the work of Abaci et al. [335], "Enigma of the Sphinx." This is a VR-based game that uses a large rear-projection screen and a multimodal interface called the "Magic Wand." The Magic Wand is a stick equipped with a magnetic tracker. It uses both posture and speech recognition to let the user interact with objects and characters in a virtual world based on ancient Egypt (see Figure 13.5). The VR game was played by several users in a special event; the Magic Wand was an intuitive interface that helped players get more involved in the virtual world.

Fig. 13.5: “Enigma of the Sphinx” using a multimodal interface: the Magic Wand

The video game industry has produced highly advanced interactive 3D (I3D) authoring tools and game engines [336]. These software tools are being used not only for developing new games but also for all kinds of "serious" applications in the areas of simulation, training, and education.


More and more industries and researchers are turning to available game engines and other I3D software issued from the video game industry. There are free (mostly open source) and commercial products, and most of the time they allow one to reduce development costs, in terms of both time and budget.

Among the most popular and advanced I3D software tools, we can consider:

• Unreal game engine (http://www.unrealtechnology.com), a game development framework that provides a vast array of core technologies (state-of-the-art 3D rendering), content-creation tools, and support infrastructure (e.g., networking functionality for distributed virtual environments).

• Virtools (http://www.virtools.com/), a set of tools designed to develop interactive real-time applications for industry and games. It supports the creation of a range of VR applications: online, desktop-based, and large-scale interactive digital mockups.

• OpenSceneGraph (http://www.openscenegraph.org), an open source high-performance 3D graphics toolkit. It is used by application developers in fields such as visual simulation, games, Virtual Reality, scientific visualization, and modeling. OpenSceneGraph is written in Standard C++ and OpenGL; it runs on all Windows platforms, OSX, GNU/Linux, IRIX, Solaris, HP-UX, AIX, and FreeBSD operating systems.

• Quest3D (http://www.quest3d.com/), a software developer's kit (SDK) that supports various VR devices (inertial and magnetic trackers, data gloves) and provides support for dynamic simulation (Newton physics).