

An Augmented Reality View on Mirror World Content, with Image Space

David J. Murphy (Nokia Research Center, Helsinki), Markus Kähäri (Nokia Research Center, Helsinki), Ville-Veikko Mattila (Nokia Research Center, Tampere)

ABSTRACT

We present a prototype mobile augmented reality client addition to the “Image Space” mixed reality media sharing service. We have explored how the real-world-aligned “mirror world” content from that service can be interacted with in situ, and identified two distinct use scenarios: geospatial media sharing and social connection. Since both the existing web-based mirror world modality and the additional mobile augmented reality modality intersect at the common data, the combined service serves as an example of how mirror worlds can be used to bridge the real and the virtual, and to allow interaction from either side of the reality continuum.

KEYWORDS: Mixed Reality, Augmented Reality, Mirror Worlds.

INDEX TERMS: H.5.1 [Multimedia Information Systems]: Artificial, augmented and virtual realities

1 INTRODUCTION

Nokia Image Space [12] is a prototype social media sharing service that records the position and orientation of media at capture time, and uses this metadata to present the media in a virtual world such that the spatial relationships are preserved. Virtual spaces which map directly to real-world content in this manner are often referred to as “mirror worlds”, since they “reflect” the make-up of the real world they model, and they fall into the “augmented virtuality” segment of Milgram’s reality continuum [9]. The goal of the mirror world modality in Image Space is to contextualize the user’s own media, and to allow users to exploit spatiality as a narrative dimension. The service provides users with tools for easily packaging spatial, narrative-driven media collections into shareable objects called “scenes”. Scenes allow users to share specific experiences, rather than the general feeling of being present in a location.

Our work presents a functional system that uses augmented reality to provide an alternate modality on the common basis data, making the system a more holistic mixed reality experience that is applicable in the real world as well as from in front of a monitor, and making the creation of AR content as easy as taking a picture.

2 DISCUSSION

Höllerer et al. [6] explored overlaying photographs coordinated with real-world objects as part of a situated documentary, using their MARS augmented reality system [4]. MacIntyre et al. [8] explored AR as a new media type, and examined how existing formats can be remediated for new media. We see our work as conceptually similar, lowering the barrier to acceptance of AR as a medium by illustrating how it relates to existing media. Other well-known AR remediation works include Billinghurst et al.’s Magic Book [2] and Piekarski et al.’s ARQuake [11]. Guven et al. [5] have also explored overlaying images on historic sites, coining the term “situated media”. Using Engeström’s concept of a “social object” [3], our work tries to bring the Image Space media to AR while keeping the images themselves as the social object of the service, presenting the images and the experience they intend to convey in context, rather than using the images to bring context to the AR-viewed environment.

Figure 1. UI view with clickable image outlines and a POI.

3 USE CASES

3.1 Use Case 1: Geospatial Media Sharing

At its core, the Image Space service is about sharing geospatial media, allowing users to capture and share media, and to experience it remotely via a web browser. The AR modality enables in-situ experience sharing, allowing users located where the media was captured to access the spatially organized media. The spatial images are presented in the augmented reality view as frame outlines, aligned with their capture position and orientation (Figure 1). We would prefer to show the complete images, perspectively skewed into place, but this is unsupported by the platform; it is a feature we plan to add. Images are selected by “clicking” on the thumbnail, which brings the image to full screen on the mobile client, in line with the web-based behaviour. Other images visible from the current image’s viewpoint are similarly represented as clickable image frames, mimicking the interaction found in the desktop Image Space, which users can exploit to make obviously linear, click-through narrative scenes. We envision private media, accessible only to a given social network, accruing to create very rich, deep pockets of content in the places most frequented by the group, making that mirror world very personal and meaningful.

3.2 Use Case 2: Social Connection

Social connection is an extension of the presence sharing feature of the Image Space service, enhanced with features from our work on representing life-logging data as recorded trails [10]. Since the client application must know its position for media capture and consumption, it is possible for the application to always share the GPS location of the user. We extend the presence component of the service to include some presence history, visualizing not only where friends are, but where they have recently been. Collections of proximal GPS positions are intelligently compiled into paths by our context server, which associates any media taken along the way. When a user happens upon a path, they are alerted by a haptic event. They can then get an AR visualization of the path and related media, making users aware of how they share that particular space and of serendipitous “crossings of paths”. Only media taken while the path was being created is shown, making this more focused on a particular route than the geospatial media sharing case. This modality may be exploited to create rigid, temporally linear narratives where the route has primacy, such as running trails, tourist trails, etc.

e-mail: first.(initial.)[email protected]

IEEE Virtual Reality 2010, 20-24 March, Waltham, Massachusetts, USA. 978-1-4244-6238-4/10/$26.00 ©2010 IEEE
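The paper does not describe the context server’s path-compilation algorithm; one minimal sketch of the idea, assuming a simple distance-gap heuristic and the hypothetical helpers `haversine_m` and `split_into_paths`, is to segment an ordered stream of GPS fixes into separate paths wherever consecutive fixes are too far apart:

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000.0  # mean Earth radius, metres
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def split_into_paths(fixes, max_gap_m=100.0, min_len=2):
    """Group an ordered list of (lat, lon) fixes into paths, starting a
    new path whenever consecutive fixes are further apart than max_gap_m.
    The threshold values are illustrative, not from the paper."""
    paths, current = [], []
    for p in fixes:
        if current and haversine_m(current[-1], p) > max_gap_m:
            if len(current) >= min_len:
                paths.append(current)
            current = []
        current.append(p)
    if len(current) >= min_len:
        paths.append(current)
    return paths
```

Media items could then be associated with a path by timestamp or by proximity to any of its fixes.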

Aside from past paths, the current real-time location of friends is available from the server, allowing them to be represented as avatars in the AR view, together with their distance from the user. Since the system can determine friends’ positions, it can also search for stored media taken at those particular locations, so that interacting with the friend icon presents the user with images from where the friend is, helping the user to understand the friends’ contexts and what they might be experiencing at that given time. This enhances the presence features found on the Image Space web service.

4 PROTOTYPE SYSTEM

We have built upon the MARA [7] mobile augmented reality browser to create the AR viewer for the Image Space data. Since this is sensor-based AR, our client devices are smartphones with embedded 3D compasses, 3D accelerometers, and assisted GPS (A-GPS) receivers, and the system uses Symbian S60 platform APIs to acquire bearing and location estimates from the embedded sensors. The accuracy of the pose estimate is quite coarse: the position is only accurate to within ~12-15 m, as with all non-differential systems. The platform sensor server provides a slow, filtered bearing estimate accurate to ~5 degrees when fully calibrated in a favorable environment, and also reports the magnetometer calibration status, allowing us to prompt the user to recalibrate when necessary. To compensate for the lag and low granularity, we fuse the bearing estimate with an estimate of rotation and translation provided by a fast camera viewfinder motion estimator [1]. This allows us to give a more granular, speedy response to small rotations.
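The paper does not specify its fusion scheme; a complementary filter is one common way to combine a slow, absolute compass bearing with fast, relative viewfinder rotation deltas. The sketch below, with a hypothetical `fuse_bearing` helper and an illustrative `alpha` weight, shows the idea: trust the fast relative signal short-term and pull slowly toward the compass to cancel drift.

```python
def fuse_bearing(compass_deg, vf_delta_deg, fused_deg, alpha=0.98):
    """One complementary-filter update (illustrative, not the paper's method).

    compass_deg  -- slow, filtered absolute bearing from the magnetometer
    vf_delta_deg -- fast relative rotation since the last frame, from the
                    viewfinder motion estimator
    fused_deg    -- previous fused bearing
    alpha        -- weight on the fast prediction (assumed value)
    """
    # dead-reckon with the fast relative estimate
    predicted = (fused_deg + vf_delta_deg) % 360.0
    # shortest signed angular error between compass and prediction
    err = ((compass_deg - predicted + 180.0) % 360.0) - 180.0
    # nudge the prediction toward the absolute compass reading
    return (predicted + (1.0 - alpha) * err) % 360.0
```

Called once per viewfinder frame, this responds immediately to small rotations while the `(1 - alpha)` correction keeps the estimate anchored to the compass over time.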

On the client side, relevant content, such as photos, videos, and point-of-interest information, is automatically downloaded from the service based on the current location, and stored in a database of cached local items to improve performance. The camera pose is determined from the sensors, and the proximal content is projected, using a perspective projection, onto a plane matching the camera position and orientation, with intrinsic parameters similar to those of the physical lens. Projected content which is visible on this plane is then presented on top of the device’s camera view as selectable AR objects, by alpha-blended blitting of point-of-interest icons, rendering of projected rectangular image outlines, or blitting of image thumbnails, depending on the content. The system runs at approximately 15 fps on a standard Nokia 6210 mobile device, limited by the camera viewfinder frame rate, without our expending any considerable effort on performance optimization. The device is also used to capture Image Space media, making it a key component in both content creation and consumption.
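A minimal sketch of this projection step, assuming a flat-terrain pinhole model and illustrative intrinsics (the focal length and principal point below are placeholders, not the paper’s calibration), converts a geo-located point into pixel coordinates given the camera’s position and compass bearing:

```python
import math

def project_point(cam_lat, cam_lon, cam_bearing_deg, pt_lat, pt_lon,
                  focal_px=500.0, cx=160.0, cy=120.0):
    """Project a WGS-84 point into camera pixel coordinates (sketch).

    Returns (u, v) or None if the point is behind the camera.
    Bearing is measured clockwise from north.
    """
    R = 6371000.0  # mean Earth radius, metres
    # local east/north offsets via an equirectangular approximation
    east = math.radians(pt_lon - cam_lon) * R * math.cos(math.radians(cam_lat))
    north = math.radians(pt_lat - cam_lat) * R
    # rotate world offsets into the camera frame
    b = math.radians(cam_bearing_deg)
    x = east * math.cos(b) - north * math.sin(b)   # metres right of camera
    z = east * math.sin(b) + north * math.cos(b)   # metres ahead of camera
    if z <= 0:
        return None  # behind the image plane, not rendered
    u = cx + focal_px * x / z  # perspective divide
    v = cy                     # flat-terrain assumption: no elevation data
    return (u, v)
```

A real client would additionally use the accelerometer-derived pitch and roll, and image altitude where available, to place content vertically.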

Our AR content is based on Symbian S60 landmark objects, with each item having at least a WGS-84 coordinate describing its real-life location. The AR objects may also contain various types of additional information, including a textual description, an icon, a web address, a list of related images and other multimedia content, and geometric primitives to control screen rendering. The images from Image Space have been captured on mobile devices running the Image Space daemon, which inserts the position and orientation into the EXIF metadata of the image, allowing the image pose to be determined with respect to the camera or other images.
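The shape of such an AR content item can be sketched as a simple record; the field names below are illustrative and do not mirror the actual Symbian S60 landmark schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ARLandmark:
    """Sketch of an AR content item as described above (hypothetical fields)."""
    lat: float                            # WGS-84 latitude, degrees (required)
    lon: float                            # WGS-84 longitude, degrees (required)
    bearing_deg: Optional[float] = None   # capture orientation, for images
    description: str = ""                 # textual description
    icon_url: Optional[str] = None        # icon to blit in the AR view
    web_url: Optional[str] = None         # associated web address
    media_urls: List[str] = field(default_factory=list)  # related images, etc.
```

For captured images, `lat`, `lon`, and `bearing_deg` would be populated from the position and orientation the Image Space daemon writes into the EXIF metadata.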

Figure 2. System architecture.

All of the mobile client’s content is fetched from the Image Space server, which uses web-based user authentication to limit the delivered content to that to which the user has been granted access.

5 CONCLUSION

We have presented an unevaluated but functional system for consuming geospatial media content, as an example of the holistic approach that can be taken to augmented reality and augmented virtuality, and of how familiar media types can be remediated. We have identified two use scenarios our system enables that demonstrate the kinds of interaction that can be made upon the mirror-world Image Space data by a mobile AR client, allowing users to experience the basis data from both modalities.

REFERENCES

[1] Andrew Adams, Natasha Gelfand, and Kari Pulli. Viewfinder Alignment. Computer Graphics Forum, 27(2):597–606, 2008.

[2] Mark Billinghurst, Hirokazu Kato, and Ivan Poupyrev. Projects in VR. IEEE Computer Graphics and Applications, (June):6–8, 2001.

[3] Jyri Engeström. Why some social network services work and others don’t: the case for object-centered sociality, 2005.

[4] Steven Feiner, Blair MacIntyre, and Tobias Höllerer. Wearing It Out: First Steps Toward Mobile Augmented Reality Systems. In International Symposium on Mixed Reality (ISMR ’99), pages 363–377. IEEE, 1999.

[5] S. Guven and S. Feiner. Interaction Techniques for Exploring Historic Sites through Situated Media. In 3D User Interfaces (3DUI ’06), pages 111–118. IEEE, 2006.

[6] T. Höllerer, S. Feiner, and J. Pavlik. Situated documentaries: embedding multimedia presentations in the real world. In Digest of Papers, Third International Symposium on Wearable Computers, pages 79–86. IEEE, 1999.

[7] Markus Kähäri and David Murphy. MARA - Sensor Based Augmented Reality System for Mobile Imaging Device. In 5th IEEE/ACM ISMAR, Santa Barbara. IEEE, 2006.

[8] B. MacIntyre, J.D. Bolter, E. Moreno, and B. Hannigan. Augmented reality as a new media experience. In Proceedings IEEE and ACM ISAR, pages 197–206. IEEE, 2001.

[9] P. Milgram and F. Kishino. A Taxonomy of Mixed Reality Visual Displays. IEICE Trans. Information Systems, E77-D(12):1321–1329, 1994.

[10] David Murphy and Petros Belimpasakis. Where-Others-Walked. In Sharing Experiences with Social Mobile Media Workshop, MobileHCI, pages 11–17, Bonn. TUT (Pori), 2009.

[11] Wayne Piekarski and Bruce Thomas. ARQuake: the outdoor augmented reality gaming system. Communications of the ACM, 45(1):36–38, 2002.

[12] Severi Uusitalo, Peter Eskolin, and Petros Belimpasakis. A Solution for Navigating User-Generated Content. In 8th IEEE/ACM ISMAR. IEEE, 2009.
