
Touch Experience in Augmented Reality

Dong Li, Jinghui Xie, Dongdong Weng, Yuqian Li
Beijing Institute of Technology

ABSTRACT

In this paper, we present two prototype systems to analyze the role of touch in AR systems. One is an AR evaluation prototype used in aircraft cockpit design; with it, pilots can touch the virtual buttons and meters to test the ergonomic performance of the design. The other is a touchable AR girl prototype used to treat heterosexual-social anxiety; with it, patients can touch and feel a virtual girl. We designed a series of subjective and objective experiments to demonstrate the importance of touchability for enhancing the user experience in AR systems.

Keywords: Touch, augmented reality.

Index Terms: H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems—Artificial, Augmented, and Virtual Realities

1 INTRODUCTION

Touch is an important factor in how people perceive the real world, yet there has been little research related to touch in the AR field. Researchers have often simply used haptic devices for interaction, just as in VR systems [1, 2, 3].

The first touchable AR work was “U-Tsu-Shi-O-Mi” by Michihiko Shoji [4]. This project produced a virtual humanoid based on AR techniques: a user wearing an HMD can touch the robot, shake hands, and communicate with the virtual girl. It was designed to enable remote communication between friends.

We were inspired by the “U-Tsu-Shi-O-Mi” project. A touchable AR system does not use a haptic device; instead, it achieves real touch through a real object. To a certain extent, it is more like a color-mapping technique: using AR, it recolors a real object and changes its visual appearance while maintaining its physical feel. In this way, a particular form of human-computer interaction can be achieved.

2 FRAMEWORK OF TOUCHABLE AR SYSTEM

A touchable AR system includes three components, shown in Figure 1: a high-precision tracking system, a see-through HMD, and a real touchable object. The tracking system tracks the user's head and the touchable objects at the same time. The see-through HMD shows the final augmented image, in which the occlusion problem is solved by chroma keying. The real interactive objects should have a shape similar to that of the corresponding virtual objects, and they can change their shape automatically to accommodate deformation of the virtual objects. This opens new possibilities for combining augmented reality technology with robotics.
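The paper does not give implementation details for the chroma-key occlusion step. As a minimal sketch of the idea, assuming the real object is covered in green and the HSV key range is tuned to that canvas, the compositing could look like the following Python/OpenCV fragment (the function and parameter names are ours, not from the paper):

```python
import cv2
import numpy as np

def composite_chroma_key(camera_frame, rendered_frame,
                         lower_green=(35, 60, 60), upper_green=(85, 255, 255)):
    """Replace green-keyed pixels of the camera image with rendered virtual content.

    camera_frame, rendered_frame: HxWx3 BGR images of the same size.
    lower_green, upper_green: assumed HSV bounds for the green canvas.
    """
    hsv = cv2.cvtColor(camera_frame, cv2.COLOR_BGR2HSV)
    # Pixels that belong to the green-covered real object.
    mask = cv2.inRange(hsv, np.array(lower_green), np.array(upper_green))
    # Close small holes so the virtual overlay does not flicker.
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))

    # Where the mask is set, show the virtual rendering; elsewhere keep the
    # camera image, so a real hand in front of the object occludes it correctly.
    virtual_part = cv2.bitwise_and(rendered_frame, rendered_frame, mask=mask)
    real_part = cv2.bitwise_and(camera_frame, camera_frame,
                                mask=cv2.bitwise_not(mask))
    return cv2.add(virtual_part, real_part)
```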

3 OBJECTIVE EVALUATION EXPERIMENTS

Figure 1: General framework of a touchable AR system.

Meter deployment and the layout of the control panel in an aircraft cabin affect operating equipment efficiency (OEE) and can even threaten pilots' safety. Therefore, assessing the effectiveness of cockpit designs is of great significance. Such an assessment depends on the tactile experience, so real touch is extremely important to a performance evaluation system.

Figure 2: (a) The appearance of the system; (b) the non-touch environment; (c) the touchable environment.

We developed a prototype touchable AR system to evaluate aircraft cockpit designs, shown in Figure 2(a). The meters and the control panel were generated by computer and superimposed on a simplified solid model. The pilots involved in the assessment are not only able to see the final design but also to touch these structures.

The system used a Sony HMZ-T1 HMD with a 0.7-inch display, a 45-degree field of view, and a resolution of 1280×720 pixels. To achieve the AR display, a Microsoft LifeCam camera was installed on top of the HMD; it captures 720p high-definition images at up to 30 fps. The cockpit front-panel model was a simple model covered with green canvas. A marker placed at the center of the panel was used for 3D registration, and chroma-key technology was applied to solve the occlusion problem, so participants can see their own hands in front of the virtual objects.
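The paper does not describe how the marker drives registration. As a hedged sketch of one common approach, the fragment below detects a fiducial (here an ArUco marker, using the legacy OpenCV aruco API) and recovers the camera-to-panel pose with solvePnP; the marker size, camera intrinsics, and dictionary are illustrative assumptions, not values from the paper:

```python
import cv2
import numpy as np

MARKER_SIZE = 0.10  # assumed marker edge length in metres
camera_matrix = np.array([[800.0, 0.0, 640.0],   # assumed intrinsics for a 720p camera
                          [0.0, 800.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

# 3D marker corners in the marker's own frame (centred at the origin, z = 0).
object_points = np.array([[-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0],
                          [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0],
                          [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0],
                          [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0]], dtype=np.float32)

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def estimate_panel_pose(frame):
    """Return (rvec, tvec) of the panel marker relative to the camera, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)  # legacy (pre-4.7) API
    if ids is None:
        return None
    image_points = corners[0].reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```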

In this paper, this system was used to assess touchable AR performance through task-based experiments. Twenty participants were invited. During the test, each participant wore the HMD and an LED fingertip marker and performed the same tasks, in turn, in the touchable AR environment, the VR environment, and the non-touch AR environment.

Experiment 1 (speed test): Click the randomly appearing ball with the finger as fast as possible. When a ball was hit, it disappeared from its current position and reappeared at the next location. After the 8th ball was hit, the test ended and the total time was recorded.
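As an illustration of the task logic (the paper does not publish its implementation), a minimal sketch of the timing and hit test might look like this; the workspace bounds, hit radius, and the tracked-fingertip accessor are hypothetical:

```python
import random
import time

NUM_TARGETS = 8      # the test ends after the 8th ball is hit
HIT_RADIUS = 0.03    # assumed hit tolerance in metres

def random_ball_position():
    # Hypothetical workspace in front of the panel (metres).
    return (random.uniform(-0.3, 0.3),
            random.uniform(-0.2, 0.2),
            random.uniform(0.3, 0.6))

def is_hit(fingertip, ball, radius=HIT_RADIUS):
    return sum((f - b) ** 2 for f, b in zip(fingertip, ball)) <= radius ** 2

def run_speed_test(get_fingertip_position):
    """get_fingertip_position() is assumed to return the tracked (x, y, z) of the LED fingertip."""
    ball = random_ball_position()
    hits = 0
    start = time.time()
    while hits < NUM_TARGETS:
        if is_hit(get_fingertip_position(), ball):
            hits += 1
            ball = random_ball_position()  # the ball reappears at the next location
    return time.time() - start             # total completion time, as in Experiment 1
```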

Experiment 2 (accuracy test): Draw a circle with the finger, following the reference circle as closely as possible. The hand-drawn tracks were stored on the computer.


Figure 3: (a1)(a2) In the VR system; (b1)(b2) in the non-touch AR system; (c1)(c2) in the touchable AR system.

In the non-touch AR environment, participants took longer to detect and locate the target when trying to hit the virtual ball, and their fingers jittered over a larger range in the circle test.

Figure 4: (a) Total click time for each participant; (b) a typical data track.

Experiment 1: The time to complete the 8 clicks is shown in Figure 4(a). Clearly, the touchable AR system gives the best result. The non-touch AR system consumed even more time than the VR system: because no stereo camera was used, the image seen through the camera differed considerably from the real view, and participants had to spend much more time making judgments.

Experiment 2: The trajectory of a typical trial is shown in Figure 4(b). The sub-graphs show the projections of the circular motion trajectory onto the XY plane and the ZY plane, respectively. According to the analysis of the trajectories, there is no essential difference in performance between the VR system and the non-touch AR system, while the touchable AR system is the best and most satisfactory. This result may be because the participant's hand was constrained by the real flat surface.
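The trajectory analysis itself is not detailed in the paper; one plausible way to produce the plane projections and quantify deviation from the reference circle is sketched below (the array layout, circle centre, and radius are assumptions):

```python
import numpy as np

def analyze_circle_trace(points, center, radius):
    """points: (N, 3) array of tracked fingertip samples (x, y, z).
    center, radius: the reference circle, assumed to lie in the XY plane.
    Returns the XY projection, the ZY projection, and the mean radial deviation."""
    points = np.asarray(points, dtype=float)
    xy = points[:, [0, 1]]   # projection onto the XY plane
    zy = points[:, [2, 1]]   # projection onto the ZY plane

    # Radial error of each sample with respect to the reference circle in XY.
    center_xy = np.asarray(center, dtype=float)[:2]
    radial = np.linalg.norm(xy - center_xy, axis=1)
    deviation = np.abs(radial - radius)
    return xy, zy, float(deviation.mean())
```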

4 SUBJECTIVE EVALUATION EXPERIMENTS

A touchable AR girl prototype system was originally designed to treat heterosexual-social anxiety. With this system, we designed a subjective experiment to assess the immersion of the AR system after the touch function was added.

The system consisted of an HMD, a human model, and a tracking system (PhaseSpace). The green cloth covering the model was used for image matting based on chroma keying. Active tracking markers were fixed on the user's head and hands and on the model's head to obtain their relative positions in real time; the user's movement and position could trigger feedback motions of the virtual model. A comparative experiment was designed to verify the immersion of our system.
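How the tracked positions are mapped to the virtual girl's reactions is not specified in the paper; a simplified, hypothetical sketch of such trigger logic (the threshold, region offsets, and reaction labels are invented for illustration) could be:

```python
import numpy as np

TOUCH_THRESHOLD = 0.05  # assumed contact distance in metres

# Hypothetical interaction regions, expressed as offsets from the tracked
# position of the model's head.
REGIONS = {
    "face": np.array([0.00, 0.00, 0.10]),
    "eyes": np.array([0.00, 0.03, 0.12]),
}

def pick_reaction(hand_pos, head_pos):
    """Map the tracked hand position to a reaction label for the virtual girl.

    hand_pos, head_pos: (3,) world-space positions from the tracking system.
    """
    relative = np.asarray(hand_pos, dtype=float) - np.asarray(head_pos, dtype=float)
    for name, offset in REGIONS.items():
        if np.linalg.norm(relative - offset) < TOUCH_THRESHOLD:
            # e.g. laugh when the face is touched, get angry when the eyes are poked
            return "angry" if name == "eyes" else "laugh"
    return "idle"
```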

Figure 5: (a) The experimental environment; (b) participant's view in the AR environment; (c) participant's view in the VR environment.

Twenty participants were invited to the experiment and interacted with the virtual girl in both the AR and VR environments. During the test, participants saw a virtual girl and were asked to touch the girl's face, poke her eyes, and so on. The virtual girl would laugh or become angry depending on the participant's action.

After the experiment, most participants said that the touchable AR system presented a much more immersive environment. They could see their own hands through the HMD, which made the AR environment seem more realistic than VR. In addition, users in the VR environment were more active than those in the AR environment, and VR users seemed more willing to have body contact with the virtual girl. This may be because participants felt embarrassed when they could see other people in the AR system.

5 CONCLUSION

The test results in this paper showed that touch can improve the performance of an AR system. Since this is only preliminary work, there were many uncertain factors in the experiments: deviations between the positions of the virtual hand and the real hand, as well as the unstable reference circle model, all led to errors in the experimental data. In future work, a more stable system will be built to further evaluate the performance of the touchable AR system, and a deformable interactive object will be used in that system.

ACKNOWLEDGEMENTS

This work is supported by the National Natural Science Foundation of China under grant No. 60903069.

REFERENCES

[1] G. Bianchi, B. Knoerlein, G. Szekely, and M. Harders. High precision augmented reality haptics. In Proc. EuroHaptics, volume 6, pages 169–178, 2006.

[2] T. Ha and W. Woo. An empirical evaluation of virtual hand techniques for 3D object manipulation in a tangible augmented reality environment. In Symposium on 3D User Interfaces (3DUI), pages 91–98. IEEE, 2010.

[3] S. Jeon and G. Kim. Providing a wide field of view for effective interaction in desktop tangible augmented reality. In Virtual Reality Conference (VR), pages 3–10. IEEE, 2008.

[4] M. Shoji, K. Miura, and A. Konno. U-Tsu-Shi-O-Mi: the virtual humanoid you can reach. In SIGGRAPH 2006 Emerging Technologies, page 34. ACM, 2006.
