
Command Center: Authoring Tool to Supervise Augmented Reality Session

Remi Tache ∗ Hunfuko Asanka Abeykoon † Kasun Thejitha Karunanayaka ‡

Janaka Prabhash Kumarasinghe § Gerhard Roth ¶ Owen Noel Newton Fernando ‖

Adrian David Cheok ∗∗

Keio-NUS CUTE Center, IDM Institute, National University of Singapore

ABSTRACT

We describe a real-time authoring tool for supervising a group of users in an augmented reality environment. The overall system is composed of a server, the Command Center (CC), and several clients, the Wearable Systems (WS). The server is used to visualize a virtual representation of the real environment, the 3D feature maps where augmentations are to occur, the augmented content that will be rendered, and the avatars of the users moving in the field. Each user's WS is equipped with a Head Mounted Display (HMD), sensors (GPS/DRM/compass) that provide constant pose estimates (position and orientation) to the CC, and a vision-based tracker that uses natural features and pose estimation algorithms to render virtual content accordingly. Alternatively, a sensor-based renderer displays AR guidance on the HMD, based on a path defined in and sent from the CC. The natural features system handles a large number of small 3D feature maps. Together, these components offer a complete system for training and supervising remote users in an AR environment.

Index Terms: K.4.3 [Computer-supported collaborative work]; J.7 [Computers in Other Systems]—Command and control

1 INTRODUCTION

Ivan Sutherland [8] created the first mixed reality system in 1968; however, such mobile systems only really expanded in the early 1990s through diverse projects ranging from an AR tourist guide [1] to entertainment [9]. Later, a number of systems that use AR to improve collaboration and communication were developed [5, 4, 6, 2].

Monitoring tasks from a remote location is important when situational awareness becomes a major concern, such as in fire fighting, emergency evacuation, or military operations. In these cases a commander needs to synchronize a team by providing overall situational feedback to team members who do not themselves have a global overview.

Therefore, this research aims to develop a complete system that offers a way to define augmented reality directional guidance to be rendered on the users' HMDs, to position virtual objects that are likewise rendered on the HMDs, and to visualize the users moving in the field along with a CAD model of the environment.

∗e-mail: [email protected]
†e-mail: [email protected]
‡e-mail: [email protected]
§e-mail: [email protected]
¶e-mail: [email protected]
‖e-mail: [email protected]
∗∗e-mail: [email protected]

The computer vision based tracking is used to render virtual content such as fire or enemies using Bundler maps [3] and EPnP pose estimation. This can be used in a simulation or training session to simulate the presence of enemies or other types of danger. Hence, the commander can edit both kinds of virtual content from the CC: the sensor-based rendering (AR guidance) and the tracking-based rendering (virtual objects). Additionally, the commander can follow the evolving situation through the users' avatars, displayed along with a CAD model, a virtual representation of the real environment, on the CC (Figure 1). Updates are exchanged in XML format: the CC sends map updates (the GPS location of each map and the virtual object attached to it) as well as the AR guidance coordinates, while each WS sends its inertial sensor data back to the CC.
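To make this exchange concrete, below is a minimal sketch of how such an XML update message could be built on the CC side using Python's standard library. The schema (element and attribute names such as update, map, guidance, and waypoint) is purely illustrative; the paper does not publish its actual message format.

```python
# Minimal sketch of a CC -> WS update message serialized as XML.
# NOTE: this schema is hypothetical; the paper only states that
# updates are exchanged in XML format.
import xml.etree.ElementTree as ET

def build_update(map_id, map_gps, virtual_object, guidance_path):
    """Serialize one CC update: a feature map's GPS anchor, its
    attached virtual object, and an AR guidance path."""
    root = ET.Element("update")

    feature_map = ET.SubElement(root, "map", id=map_id)
    ET.SubElement(feature_map, "gps",
                  lat=str(map_gps[0]), lon=str(map_gps[1]))
    ET.SubElement(feature_map, "object", model=virtual_object)

    guidance = ET.SubElement(root, "guidance")
    for lat, lon in guidance_path:
        ET.SubElement(guidance, "waypoint", lat=str(lat), lon=str(lon))

    return ET.tostring(root, encoding="unicode")

# Example: one map with an attached tank model and a two-waypoint path.
print(build_update("corridor-east", (1.29660, 103.77640), "tank.obj",
                   [(1.29661, 103.77642), (1.29668, 103.77650)]))
```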

Figure 1: On the left: a soldier's WS, composed of sensors and an HMD. On the right: the command center, which visualizes the users' avatars in the virtual environment and lets the commander edit the AR path and virtual objects.

2 SYSTEM DESCRIPTION

2.1 System overview

The command center is a real-time authoring tool that allows the session commander to interact with and dynamically change the augmented content rendered on the users' HMDs while visualizing the evolving situation through the users' avatars, shown along with a CAD model of the real environment. It is used to set up a training session by adding the Bundler natural feature maps and virtual content to the specific parts of the virtual world where the augmentations are to occur.

First, a CAD model of the training environment is built off-line from an accurate architectural plan. Then the natural feature maps used for the computer vision based tracking in the wearable computers are created, also in an off-line process, using Bundler. These natural feature maps are added to the CAD model in the command center at the locations where augmentations are to occur. The model displayed in the command center is therefore both a CAD model of the environment and a set of natural feature maps along with the virtual objects attached to them, as shown in Figure 2.
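As a rough illustration of the data the command center holds after this setup step, a session might be represented by structures like the following. The field names and file formats are our own assumptions for illustration; the paper does not describe its internal data model.

```python
# Hypothetical data model for a training session in the CC.
# Names and formats are illustrative assumptions only.
from dataclasses import dataclass, field

@dataclass
class FeatureMap:
    name: str               # Bundler reconstruction, built off-line
    pose_in_cad: list       # 4x4 transform placing the map in the CAD model
    virtual_object: str     # content rendered when this map is detected

@dataclass
class Session:
    cad_model: str                           # off-line model of the environment
    maps: list = field(default_factory=list)

session = Session(cad_model="training_site.obj")
session.maps.append(FeatureMap(
    name="corridor_east",
    pose_in_cad=[[1, 0, 0, 12.0], [0, 1, 0, 0.0],
                 [0, 0, 1, 3.5], [0, 0, 0, 1]],
    virtual_object="tank.obj"))
```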

As a real-time session visualizer, the command center displays an avatar for each connected WS, and this avatar is updated as the user moves in the training environment. The visualizer also allows


the commander to dynamically provide directional information to the users in the form of AR arrow guidance, as shown in Figure 2. This content is rendered using the sensor based rendering. The dynamic updating of the users' augmentations is done by having the commander directly design a path in the CC. For example, in a training scenario, the commander might want constant control over the mixed reality by enabling, disabling, or editing specific augmentations.

To perform both the avatar pose update in the CC and the AR guidance rendering in the WS, the two coordinate systems (GPS and virtual coordinates) are calibrated during setup. A conversion matrix from GPS to virtual coordinates, together with its inverse converting virtual coordinates back to GPS, is computed using a standard least-squares optimization process.
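A minimal sketch of this calibration step is given below, under the assumption that it can be modeled as a 2D affine fit between matched GPS and virtual-coordinate control points; the paper does not detail the parameterization, so the affine model and the NumPy solver are our choices.

```python
# Least-squares fit of an affine map from GPS (lat, lon) to virtual
# (x, z) coordinates, from matched control points gathered at setup.
# The affine parameterization is an assumption; the paper only says
# the conversion matrix is found by least-squares optimization.
import numpy as np

def fit_gps_to_virtual(gps_pts, virtual_pts):
    """gps_pts, virtual_pts: (N, 2) arrays of matched points, N >= 3.
    Returns 3x3 homogeneous matrices: M (GPS -> virtual) and its inverse."""
    gps = np.asarray(gps_pts, dtype=float)
    vir = np.asarray(virtual_pts, dtype=float)
    A = np.hstack([gps, np.ones((len(gps), 1))])   # rows: [lat, lon, 1]
    X, *_ = np.linalg.lstsq(A, vir, rcond=None)    # solve A @ X ~= vir
    M = np.vstack([X.T, [0.0, 0.0, 1.0]])          # pack as a 3x3 matrix
    return M, np.linalg.inv(M)

def apply(M, p):
    """Convert one 2D point with the homogeneous matrix M."""
    x, z, w = M @ np.array([p[0], p[1], 1.0])
    return (x / w, z / w)
```

The forward matrix places incoming WS positions in the CAD model for the avatars; the inverse is what turns a path drawn in virtual coordinates back into GPS positions before it is sent to the wearable systems (Section 2.2).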

2.2 Updating the User's Avatar and Performing Augmentation

Users are displayed in the virtual world using avatars whose position and orientation are updated in real time from the sensor information provided by the wearable systems. Each wearable system has both a combined GPS/DRM unit and a digital compass, which send pose information to the command center in real time. The CC transforms these GPS coordinates into the virtual coordinates of the CAD model and displays the avatar accordingly. Since both a GPS and a DRM unit are used, this location information is available in both indoor and outdoor environments. In Figure 2 the user's avatar is drawn outside the building at the location and orientation defined by the data received from the sensors.

Figure 2: On the left: a soldier's avatar with its field of view in pink. On the right: natural feature map points along with the virtual tank attached to them. In the middle: a path used as AR guidance.

A unique feature of our system is that the training commander can provide directional guidance to the users simply by drawing paths in the CAD model. The coordinates of the virtual path are transformed into GPS positions and sent to selected wearable systems, as sketched below. Each selected user's HMD can then display this path as the appropriate augmentations using the sensor based rendering, as shown in Figure 1. In this way the training commander can provide real-time directional guidance to each user, and this has proved to be more effective than simple audible messages [7]. Since the users who receive this guidance can be selected in the command center, different users can receive personalized updates. The training commander can thus transfer the situational awareness obtained from a real-time global overview of the session to each user as necessary.
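Continuing the calibration sketch above, dispatching a drawn path could look roughly like this; the selection and transport interfaces (selected_ws, send_xml) are hypothetical names of our own.

```python
# Hypothetical dispatch of a drawn path: convert its virtual-coordinate
# waypoints back to GPS with the inverse calibration matrix, then send
# them only to the wearable systems the commander selected.
def send_guidance(path_virtual, M_inv, selected_ws, send_xml):
    """path_virtual: list of (x, z) points drawn in the CAD model.
    send_xml: transport callback (the paper exchanges XML messages)."""
    path_gps = [apply(M_inv, p) for p in path_virtual]  # virtual -> GPS
    for ws in selected_ws:            # personalized guidance per user
        send_xml(ws, path_gps)
```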

Additionally, the command center is used to attach virtual content to the natural feature maps. The maps are positioned in the command center along with their virtual content, and each virtual object's position is computed using its 3D feature map as the origin. The commander can then decide to send this transform to a WS, which will render the virtual object accordingly once the 3D feature map is detected by the computer vision algorithm and the EPnP pose is estimated. The computer vision based tracking is used here instead of the sensor based one because it reaches centimeter accuracy, whereas the sensors are subject to weather and pressure conditions and so have proved less accurate. The sensor based approach is nevertheless used for AR guidance, since in that case meter-level accuracy is sufficient and it is available anywhere, whereas computer vision needs markers (natural feature maps in our case).
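As an illustration of this rendering step, the sketch below estimates the camera pose of a detected feature map with EPnP and composes it with the object-in-map transform received from the CC. OpenCV's solvePnP stands in for the paper's unspecified EPnP implementation; using it, and the variable names, are our assumptions.

```python
# Sketch: compute the render pose of a virtual object attached to a
# feature map. OpenCV's EPnP solver is assumed here; T_map_object is
# the object transform sent from the CC as described above.
import cv2
import numpy as np

def object_pose_in_camera(map_pts_3d, matched_pts_2d, K, T_map_object):
    """map_pts_3d: (N, 3) feature map points matched in the live frame.
    matched_pts_2d: (N, 2) corresponding image points.
    K: 3x3 camera intrinsics. T_map_object: 4x4 object pose in map frame.
    Returns the 4x4 object-to-camera transform used for rendering."""
    ok, rvec, tvec = cv2.solvePnP(
        map_pts_3d.astype(np.float32), matched_pts_2d.astype(np.float32),
        K, None, flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        return None                           # map not reliably detected
    R, _ = cv2.Rodrigues(rvec)                # rotation vector -> matrix
    T_cam_map = np.eye(4)                     # map frame seen from camera
    T_cam_map[:3, :3] = R
    T_cam_map[:3, 3] = tvec.ravel()
    return T_cam_map @ T_map_object           # object frame seen from camera
```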

3 CONCLUSION

We have presented a system for AR training whose command center functions as a traditional AR authoring tool, a real-time display, and a real-time feedback mechanism during a session. The wearable system of each user gives real-time updates of the user's position to the command center using sensors (GPS/DRM/compass). The commander can visualize the evolution of the users through their avatars in the training environment. Additionally, 3D feature maps are loaded and positioned in the CC to enable the rendering of virtual content relative to these maps. The commander can therefore visualize, design, and edit the virtual world on the CC in real time to provide personalized AR guidance that will be rendered on the WS using the appropriate sensors. We believe such a prototype can be the basis of a more complex system to help emergency responders in the field. Soldiers or fire fighters could improve their efficiency through AR simulation training and by having more accurate information in dangerous real-world scenarios. For example, a commander could define paths to guide a fire fighter through a building during a fire, in order to reach people at a specific location or to find the appropriate emergency exit.

4 ADDITIONAL AUTHORS

Tam Van Nguyen, Rahul Budhiraja, Chamika Deshan, Sun Ying, and Rahul Nirmesh from the National University of Singapore. This research is partially supported by the Singapore National Research Foundation under its International Research Centre @ Singapore Funding Initiative and administered by the IDM Programme Office.

REFERENCES

[1] S. Feiner, B. MacIntyre, T. Hollerer, and A. Webster. A touring machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment. In Proceedings of the First IEEE International Symposium on Wearable Computers, pages 74–81, 1997.

[2] C. Fuchs, N. Aschenbruck, P. Martini, and M. Wieneke. Indoor tracking for mission critical scenarios: A survey. Pervasive and Mobile Computing, 7(1):1–15, 2011.

[3] U. Neumann and S. You. Natural feature tracking for augmented reality. IEEE Transactions on Multimedia, 1(1), 1999.

[4] W. Piekarski and B. H. Thomas. Interactive augmented reality techniques for construction at a distance of 3D geometry. In 7th Int'l Workshop on Immersive Projection Technology / 9th Eurographics Workshop on Virtual Environments, May 2003.

[5] G. Reitmayr and D. Schmalstieg. Mobile collaborative augmented reality. In Int'l Symposium on Augmented Reality, pages 114–123, 2001.

[6] W. le Roux. The use of augmented reality in command and control situation awareness. South African Journal of Military Studies, 2010.

[7] A. Stafford, B. H. Thomas, and W. Piekarski. Efficiency of techniques for mixed-space collaborative navigation. 2004.

[8] I. Sutherland. A head-mounted three dimensional display. In Proceedings of the Fall Joint Computer Conference, pages 757–764, 1968.

[9] B. Thomas, B. Close, J. Donoghue, J. Squires, P. D. Bondi, M. Morris, and W. Piekarski. ARQuake: An outdoor/indoor augmented reality first person application. In Proceedings of the 4th International Symposium on Wearable Computers, pages 139–146, 2000.
