

Augmenting Virtual Worlds with Real-life Data from Mobile Devices

Heikki Laaki*, Karel Kaurila*, Karl Ots*, Vik Nuckchady*, Petros Belimpasakis•

*Nokia Corporate Development Unit, •Nokia Research Center

{firstname.lastname}@nokia.com, P.O. Box 1000, 33721, Tampere, Finland

ABSTRACT

Virtual worlds have typically been isolated from the real environment, treated as separate parallel worlds. In this paper we show a scenario where context data collected from mobile devices can be used for augmenting virtual worlds with real-life data. Life-logging elements are used to control an avatar in a virtual world, as a way to replay experiences. The prototype system, which was implemented to prove the concept, is presented.

KEYWORDS: Augmented Virtuality, Virtual Worlds, Life-Logging, Mobile Context Data

INDEX TERMS: H.3.3 [Information Systems]: Information Storage and Retrieval - information search and retrieval; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - virtual reality

1 INTRODUCTION

The term Mixed Reality (MR) was defined by Milgram and Kishino [7] with the help of the concept of the Virtuality Continuum (VC). In [9] the components of Mixed Reality are presented, as shown in Figure 1.

Figure 1. Components of Mixed Reality

By blurring the lines between the real and the digital, we aim to create an Augmented Virtuality system that brings elements of real life, via life-logging, into virtual environments such as virtual worlds. More specifically, we want to use mobile smart phones as context-gathering devices for logging one's life, then automatically import this data into a virtual world, where it influences the virtual environment, the user's avatar, and the interaction with other virtual world players.

2 RELATED WORK

Life logging, or recording, has been researched in various domains for capturing users' activities in both the digital and the physical world, e.g. in [3] and [8], focusing on building a live digital archive. The captured raw data can be further analyzed to provide a "track back" feature and an overall understanding of the past. Beyond that, PeopleTones [5] presented a mobile application with auditory and tactile cues for buddy proximity detection, rather than life exploration and communication. Location sharing among buddies in the Reno system [10] enabled context-aware implicit communication between people.

According to [6], the integration of real-world properties can add to the immersiveness of a gaming experience, especially when atmospheric properties such as background audio are utilized. In [2] different pervasive gaming paradigms are surveyed, with examples of classic computer games mapped onto real-world settings and of games mixing players on a city street with online players in a parallel virtual city. The same work suggests using mobile devices to capture information about their current context, including their location, in order to deliver an enhanced gaming experience. The work of [4] augments aerial visualizations of the earth with dynamic information obtained from videos, but does not utilize contextual information from personal devices.

3 OUR SYSTEM

Our work targets adding real-life elements to virtual worlds by making them a place to replay life-logged experiences. We focus on experiences captured using mobile personal devices as context information collectors.

3.1 The Scenario and Demonstrator

In our basic scenario a user is walking outdoors, listening to music via his mobile smart phone, and enjoying a relaxing walking route. Detailed information about the walked track and the music listened to at any given point is stored, initially on the device, and later uploaded to a server for analysis. At a later time it can be reproduced, in a synchronized manner, in a virtual world, as shown in our prototype screenshot in Figure 2.

Figure 2. Replaying a Real-life Track in a Virtual World
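For illustration, one logged sample combining a track point with the currently playing song might look as follows. This is a minimal sketch in Python; the field names are our assumption, not the actual schema of the prototype.

    # Hypothetical shape of one life-logging sample; all field names
    # are illustrative, not the prototype's actual schema.
    sample = {
        "timestamp": "2009-11-02T14:31:05Z",  # UTC time of the GPS fix
        "lat": 61.4498,                       # degrees, from the phone's GPS
        "lon": 23.8595,
        "speed_mps": 1.4,                     # walking speed, metres per second
        "heading_deg": 87.0,                  # orientation of the user
        "now_playing": {                      # from the phone's media player
            "artist": "Example Artist",
            "title": "Example Song",
            "duration_s": 215,
        },
    }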

The user's avatar moves in the virtual world exactly as the user walked in real life, mimicking his location, orientation and speed. The avatar moves on a map illustrating the actual location, and the computer speakers play the same music the user was listening to at the given points. The trail behind the avatar shows the last meters of his walk, while the music notes denote that audio is rendered by the computer's speakers.

We reproduce the experience both for the user to recall and for other online virtual world users to experience. For other users, the rendered audio volume increases as their avatar approaches the avatar of the main user, and mutes when their distance grows beyond a given limit.
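As a rough sketch of this attenuation rule in Python: the paper does not specify the exact falloff curve, so the linear interpolation and the two radii below are our assumptions; only the mute-beyond-a-limit behaviour is described in the text.

    def spectator_volume(distance_m, full_radius_m=5.0, mute_radius_m=50.0):
        """Audio volume (0.0-1.0) for a spectator avatar at the given
        distance from the main user's avatar. Linear falloff and both
        radii are assumptions made for illustration."""
        if distance_m <= full_radius_m:
            return 1.0
        if distance_m >= mute_radius_m:
            return 0.0  # beyond the limit the audio mutes entirely
        # linear interpolation between full volume and silence
        return 1.0 - (distance_m - full_radius_m) / (mute_radius_m - full_radius_m)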

3.2 Prototype Implementation

We prototyped the whole system utilizing off-the-shelf components. The system was based on the high-level architecture illustrated in Figure 3.

Figure 3. System Architecture

As the mobile smart phone we used a Symbian-OS based device, equipped with the context collection daemon described in [1]. That software, running in the background, constantly monitored the location of the device from the Global Positioning System (GPS) and the songs played by the standard media player of the phone. This information was periodically, i.e. every couple of minutes, uploaded to a "Raw Context Database" and from there harvested by our "Virtual World Server", via Hypertext Transfer Protocol (HTTP) web interfaces.
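A minimal sketch of that periodic upload pattern, in Python for readability (the actual daemon in [1] is native Symbian software; the endpoint URL and payload shape here are our assumptions):

    import json
    import time
    import urllib.request

    UPLOAD_URL = "http://example.com/raw-context"  # hypothetical endpoint

    def upload_samples(samples):
        # POST a batch of context samples as JSON to the Raw Context Database
        body = json.dumps(samples).encode("utf-8")
        req = urllib.request.Request(UPLOAD_URL, data=body,
                                     headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)

    def daemon_loop(collect_pending_samples, period_s=120):
        # "every couple of minutes", as in the prototype
        while True:
            samples = collect_pending_samples()
            if samples:
                upload_samples(samples)
            time.sleep(period_s)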

The Virtual World Server was based on the open source realXtend (http://www.realxtend.org/) platform. On the same server, our components gathered tiles of the user's location from an external map service (in our case Google Maps) and rendered them as the ground for the avatar to walk on. The track of the user was imported as a set of latitude-longitude-timestamp triples, which were used for reproducing the user's walk on top of the textured map. Moreover, the "Raw Context Database" provided our system with the listened songs, as a set of artist-title-duration-timestamp entries, based on which we could search for the actual song file on an external music service. In our case we used YouTube for finding the corresponding music video file and transcoded it locally (from FLV to Ogg format), so that it would be playable in the virtual world.
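To make the replay step concrete, below is a minimal sketch of pre-calculating the avatar position at an arbitrary time by linear interpolation between consecutive triples. The function is our illustration; the prototype's exact trajectory code is not published.

    from bisect import bisect_right

    def avatar_position(track, t):
        """Interpolate (lat, lon) at time t from a track of
        (lat, lon, timestamp) triples sorted by timestamp."""
        times = [ts for _, _, ts in track]
        i = bisect_right(times, t)
        if i == 0:
            return track[0][:2]   # before the first fix: clamp to start
        if i == len(track):
            return track[-1][:2]  # after the last fix: clamp to end
        lat0, lon0, t0 = track[i - 1]
        lat1, lon1, t1 = track[i]
        if t1 == t0:
            return (lat1, lon1)
        a = (t - t0) / (t1 - t0)  # fraction of the segment covered at time t
        return (lat0 + a * (lat1 - lat0), lon0 + a * (lon1 - lon0))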

3.3 Learnings

The movement of the avatar was very smooth; knowing the given points of the path, along with timestamps, we could pre-calculate the trajectory of the avatar. On the other hand, the accuracy of the mobile phone's GPS module was not sufficient for a scenario where an avatar needs to be placed on top of a map, on a specific road. The avatar crossed road boundaries from time to time and appeared a bit awkward. A solution for fixing this, apart from better GPS accuracy, would be to utilize vector maps instead of bitmap tiles, so that the avatar movement could be guided along strictly defined road paths, as sketched below.
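One way to realize that vector-map idea is to snap each GPS fix to the closest point on the nearest road segment. A minimal planar sketch, assuming the road network is available as a list of line segments (the segment data source is hypothetical; over a few hundred metres, treating latitude/longitude as planar coordinates is a tolerable simplification):

    def snap_to_roads(point, road_segments):
        """Return the closest point to `point` on any of the given
        road segments, each segment a pair of (x, y) endpoints."""
        px, py = point
        best, best_d2 = point, float("inf")
        for (ax, ay), (bx, by) in road_segments:
            dx, dy = bx - ax, by - ay
            seg_len2 = dx * dx + dy * dy
            # projection parameter, clamped to [0, 1] to stay on the segment
            u = 0.0 if seg_len2 == 0 else max(0.0, min(1.0,
                ((px - ax) * dx + (py - ay) * dy) / seg_len2))
            cx, cy = ax + u * dx, ay + u * dy
            d2 = (px - cx) ** 2 + (py - cy) ** 2
            if d2 < best_d2:
                best, best_d2 = (cx, cy), d2
        return best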

Apart from the asynchronous "generate life-logging data" then "replay data in virtual world" model we prototyped, we also tried to implement a real-time version of the system, where the avatar would move in the virtual world at the same time the actual user was walking outdoors. However, most of our components utilized a periodic data-pull model rather than a push model, which introduced many delays into the data flow chain. For example, pulling the data from the "Raw Context Database" was a bottleneck. One solution could be to use an instant-messaging style protocol (e.g. the Extensible Messaging and Presence Protocol, XMPP) for delivering the context data from the mobile device to the virtual world server in almost real time, rather than using the existing pull-based HTTP interfaces.
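To make the latency argument concrete: with a pull model, every hop adds up to one polling interval of delay, whereas a push model delivers each sample as soon as it is captured. Below is a transport-agnostic sketch of the push side in Python; an in-process queue stands in for the XMPP channel suggested above, and all names are illustrative.

    import queue
    import threading
    import time

    context_events = queue.Queue()  # stands in for an XMPP-like push channel

    def on_new_sample(sample):
        # mobile side: push each sample the moment it is captured,
        # instead of buffering it for the next periodic upload
        context_events.put(sample)

    def virtual_world_consumer(apply_to_avatar):
        # server side: block until a sample arrives, then update the avatar;
        # end-to-end delay is network latency only, not a polling period
        while True:
            apply_to_avatar(context_events.get())

    threading.Thread(target=virtual_world_consumer, args=(print,),
                     daemon=True).start()
    on_new_sample({"lat": 61.4498, "lon": 23.8595})
    time.sleep(0.1)  # give the consumer thread a moment in this toy demo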

4 CONCLUSIONS & FUTURE WORK

Our prototype implementation demonstrated our premise: that virtual worlds can be enhanced by importing contextual data gathered by everyday personal devices, such as smart phones. Combining life-logging and virtual worlds, two of the core mixed reality components, brings real life as an augmentation into the typically isolated virtual worlds. A natural evolution of this work would be to focus on real-time context import, so that we could study the interactions between a user being in-situ, i.e. in the outdoor environment, and others being present only in the virtual world, via their avatars. Moreover, other contextual information gathered by mobile devices, such as detailed user activity, could bring even more useful real-life information to the virtual world. Of course, that would raise many privacy concerns, which would need further study.

REFERENCES

[1] P. Belimpasakis, K. Roimela, and Y. You. Experience Explorer: a Life-Logging Platform Based on Mobile Context Collection. In Proceedings of the Third IEEE Conference on Next Generation Mobile Applications, Services and Technologies, pp. 77-82, September 2009.

[2] S. Benford, C. Magerkurth, and P. Ljungstrand. Bridging the Physical and Digital in Pervasive Gaming. Communications of the ACM, vol. 48, no. 3, pp. 54-57, March 2005.

[3] J. Gemmell, G. Bell, and R. Lueder. MyLifeBits: a Personal Database for Everything. Communications of the ACM, vol. 49, no. 1, pp. 88-95, January 2006.

[4] K. Kim, S. Oh, J. Lee, and I. Essa. Augmenting Aerial Earth Maps with Dynamic Information. In Proceedings of the Eighth IEEE/ACM International Symposium on Mixed and Augmented Reality, October 2009.

[5] K. A. Li, T. Y. Sohn, S. Huang, and W. G. Griswold. PeopleTones: a System for the Detection and Notification of Buddy Proximity on Mobile Phones. In Proceedings of the 6th International Conference on Mobile Systems, Applications, and Services, June 2008.

[6] C. Magerkurth, T. Engelke, and M. Memisoglu. Augmenting the Virtual Domain with Physical and Social Elements: Towards a Paradigm Shift in Computer Entertainment Technology. Computers in Entertainment, vol. 2, no. 4, October 2004.

[7] P. Milgram and F. Kishino. A Taxonomy of Mixed Reality Visual Displays. IEICE Transactions on Information Systems, vol. E77-D, no. 12, pp. 1321-1329, 1994.

[8] A. J. Sellen, A. Fogg, M. Aitken, S. Hodges, C. Rother, and K. Wood. Do Life-Logging Technologies Support Memory for the Past? An Experimental Study Using SenseCam. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 81-90, April 2007.

[9] J. M. Smart, J. Cascio, and J. Paffendorf. Metaverse Roadmap: Pathways to the 3D Web. http://www.metaverseroadmap.org, 2007.

[10] I. Smith, S. Consolvo, J. Hightower, J. Hughes, G. Iachello, A. LaMarca, J. Scott, T. Sohn, and G. Abowd. Social Disclosure of Place: From Location Technology to Communication Practice. In Proceedings of the 3rd International Conference on Pervasive Computing, pp. 134-151, May 2005.
