
Emoballoon: a Balloon-shaped Interface Recognizing Social Touch Interactions

Kosuke Nakajima, Yuichi Itoh, Yusuke Hayashi, Kazuaki Ikeda, Kazuyuki Fujita, Takao Onoye

Graduate School of Information Science and Technology, Osaka University

1-5, Yamadaoka, Suita, Osaka, Japan [email protected]

Abstract— People often communicate with others using social touch interactions such as hugging, rubbing, and punching. We propose a soft, social-touchable interface called “Emoballoon” that can recognize the types of social touch interactions. The proposed interface consists of a balloon and several sensors, including a barometric pressure sensor inside the balloon; it has a soft surface and can detect the force of touch input. We construct a prototype of Emoballoon with a simple configuration that exploits the physical properties of a balloon, and we evaluate the implemented prototype. The evaluation indicates that our implementation can distinguish seven types of touch interactions with 83.5% accuracy.

Keywords— soft interface; social touch interaction; gesture recognition

I. INTRODUCTION

When people communicate with others, they often make various forms of physical contact. For example, they shake hands or hug in amiable greeting, and punch or slap to express anger. Since these physical expressions can serve as natural interactions with one's surroundings, many studies attempt to detect them to expand human-computer interaction [4, 7, 10]. Some of these studies have proposed physical interfaces that use soft materials to withstand strong actions such as hugging and grasping.

In this paper, we propose a balloon-shaped interface, “Emoballoon”, shown in Figure 1. It can recognize various touch interactions with a balloon, such as rubbing and hugging. A balloon is a soft object with enough interior space to enclose electronic devices. Emoballoon is a handheld elastic sphere that contains sensors to recognize a user's interactions; it is also a bi-directional interface that can present visual feedback on its surface.

Some handheld and huggable soft interfaces have already been proposed. FuwaFuwa [10] is a sensor module for building huggable soft interfaces. Since FuwaFuwa uses cotton as its soft material, it can withstand strong actions such as hugging and punching. In contrast, our Emoballoon employs a thin elastic skin as its soft material. The elastic skin is convenient not only for receiving strong actions but also for projecting visual feedback through its thin surface [4, 8]. In this paper, we implement Emoballoon using a balloon and evaluate its recognition accuracy for various actions gathered by investigating how people interact with it.

II. RELATED WORK

Many researchers have developed recognition techniques for physical contact. Physical, handheld interfaces called Graspable User Interfaces have been proposed to recognize various types of touch interactions [2, 3, 6]. In this field, recognizing how users grasp a mobile device is a major concern, because grasp recognition can be used to automatically switch the device's function. Kim et al. proposed a grip recognition technique for handheld mobile devices using 64 capacitive sensors and a 3-axis accelerometer [5]. This technique can distinguish eight different grips with approximately 90% accuracy across participants. Similarly, HandSense [13] is a mobile device that recognizes in which hand, and how, the device is held, using capacitive sensors. Both of these techniques expand interaction with mobile devices based on static grasp recognition. The Graspables project [11] has also tried to recognize dynamic gestures on handheld mobile devices in addition to static grip postures. The project implemented the Bar of Soap as a prototype grasp recognition system using 72 capacitive sensors and a 3-axis accelerometer. This prototype can recognize five static grips with approximately 80% accuracy across 13 participants, and with approximately 95% accuracy on average within each participant.

While these grasp recognition techniques handle static touch interactions and gentle gestures to improve the manipulation of handheld devices, they are currently not suited to rougher touch interactions such as punching and tight hugging. We consider these rougher interactions important touch inputs for enhancing a user's input techniques.

Figure 1. Emoballoon.

Grasping, slapping, and tight hugging can all be seen in human-human communication. Brave et al. achieved haptic telecommunication that imitates the haptic interactions of face-to-face communication [1]. Their system shows that haptic interaction is important for expressing intentions and emotions. Rough, strong touch interactions such as hugging, rubbing, and slapping also appear in social interaction when a person wants to communicate happiness, warmth, anger, or danger. We believe that rough touch interactions are likewise useful for expressing our intentions in human-computer interaction and human-human telecommunication.

Thus far, some handheld, huggable soft interfaces have been proposed. FuwaFuwa [10] is a sensor module for building huggable soft interfaces. Its modules, arrayed inside the cotton, measure the position and depth of a user's presses on the cotton surface. Since FuwaFuwa uses cotton as its soft material, it works well even under strong touch interactions. Knight et al. proposed a cotton-based sensate robot that recognizes social touch [6]. They recognized five gestures, including tickle, poke, pet, hold, and no touch, with capacitive sensors. Earlier, they had implemented another social touch recognizer using QTC sensors, temperature sensors, and electric field sensors [9]. Their evaluation of this heavily instrumented version reports a recognition accuracy of 54.3% on average over eight gestures: tickle, poke, scratch, pet, pat, rub, squeeze, and contact. In contrast, we achieve effective recognition of various touch interactions with a simple, convenient configuration using an inflatable object and a barometric pressure sensor.

III. EMOBALLOON

A. System configuration

Figure 2 shows our implementation of “Emoballoon”. The balloon-based interface recognizes a wide range of social touch interactions and provides users with a natural input technique for expressing intent or emotion. It greatly extends a user's input techniques and is useful for various applications, including telecommunication and entertainment. From a preliminary study, we identified such fundamental interactions with a balloon as “hug”, “punch”, “kiss”, “rub”, “slap”, “grasp”, and “press”. In order to distinguish these fundamental interactions, we have to sense two kinds of measurements: the shape change of the balloon, and gentle touches on its surface.

To meet these requirements, we place a barometric pressure sensor and a microphone inside the elastic surface. Figure 3 shows the system configuration. Unlike conventional rigid interfaces, the elastic soft surface accepts strong, rough actions such as slapping or tight hugging, and provides passive force feedback that depends on how the shape changes. Since the internal barometric pressure reflects the changing shape of the balloon, the pressure sensor is effective for recognizing strong actions such as a tight “hug” or “press”. The barometric pressure depends on the force of the user's touch regardless of the touched area; this pressure change is one of the unique features of inflatable objects, and it makes the force of a touch easy to measure. We therefore employ barometric pressure as a key cue for recognizing touch interactions with the balloon-shaped interface. For detecting gentler actions such as “rub”, we also attach a microphone inside: rubbing the balloon's elastic surface produces loud, distinctive sounds, and sounds caused by touch interactions have often been effective cues for distinguishing user actions [4]. The system recognizes a user's touch interactions with the balloon using a support vector machine (SVM) applied to the sensed data. The prototype also contains a full-color LED to display simple visual feedback, so Emoballoon can itself respond to touch interactions. Though the current configuration is quite simple, we recognize various interactions based on features specific to a balloon: the change in barometric pressure and the characteristic sounds caused by a user's touch.
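The intuition that internal pressure tracks the force of a squeeze regardless of contact area can be sketched with a textbook idealization (our addition, not a derivation from the paper): treating the enclosed air as an ideal gas at near-constant temperature, Boyle's law gives

```latex
% Idealized, for a sealed balloon at near-constant temperature:
P\,V \approx \mathrm{const.}
\qquad\Longrightarrow\qquad
P' \approx P \cdot \frac{V}{V - \Delta V}
```

so any touch that reduces the enclosed volume V by ΔV raises the internal pressure from P to roughly P', wherever on the surface the force is applied. A real balloon's stretching skin makes this only approximate, but it suggests why a single pressure sensor can stand in for force sensing over the whole surface.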

In addition to recognizing the type of a touch interaction, it is also important to detect the touched area. Thus far, some interactive surfaces made of elastic material have used a basic touch detection technique based on infrared light and an infrared camera placed behind the elastic surface [4, 8]. Meanwhile, flexible capacitive sensors for detecting touched areas are becoming more common in the research field. In the future, such flexible sensors should become available for soft elastic interfaces like those mentioned above. Naturally, they could also be applied to elastic handheld interfaces, including our Emoballoon, and should enable remarkable input recognition in combination with other sensors. We believe that force sensing will become an increasingly important touch input technique in combination with touched-area detection, and that inflatable objects with a barometric pressure sensor are reasonable equipment for sensing the force of touch. Our work contributes to expanding the possibilities of input recognition based on the force of touch.

Figure 2. Implementation of Emoballoon using a balloon.

Figure 3. System configuration: a pressure sensor, a microphone, and an Arduino UNO.

B. Implementation

Emoballoon includes a barometric pressure sensor and a microphone to recognize touch interactions. We use a BMP085 (Bosch Sensortec) as the pressure sensor and an AT810F (Audio-Technica Corporation) as the microphone. The pressure sensor is connected to an Arduino UNO, which sends pressure values to a PC; the microphone is connected to the PC directly. The pressure sensor captures the barometric pressure inside the balloon at approximately 25 Hz, and the microphone captures the audio signal at 44,100 Hz. From the captured audio, the system averages 3 chunks of 1,024 audio samples (each chunk covering 23.2 ms of audio at 44,100 Hz), and a discrete Fourier transform then computes the frequency spectrum of the averaged signal. We use LIBSVM [2] as the SVM implementation.
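The capture path can be sketched as follows (a minimal reconstruction, not the authors' code; it assumes the numpy and pyserial packages, and the serial port name and the Arduino's line format — one plain-text pressure value per line — are our assumptions):

```python
# Minimal sketch of the sensing path: pressure values arrive from the
# Arduino over serial at ~25 Hz; audio is processed as 3 averaged chunks
# of 1,024 samples followed by a DFT, as described above.
import numpy as np
import serial  # pyserial

SAMPLE_RATE = 44100  # microphone sampling rate (Hz)
CHUNK = 1024         # audio samples per chunk (~23.2 ms at 44,100 Hz)
N_CHUNKS = 3         # chunks averaged before the DFT

def read_pressure(port: serial.Serial) -> float:
    """Read one barometric pressure value sent by the Arduino."""
    line = port.readline().decode("ascii", errors="ignore").strip()
    return float(line)  # assumes one plain-text value per line

def audio_spectrum(chunks: np.ndarray) -> np.ndarray:
    """Average N_CHUNKS chunks of CHUNK samples, then take DFT magnitudes."""
    averaged = chunks.mean(axis=0)        # (N_CHUNKS, CHUNK) -> (CHUNK,)
    return np.abs(np.fft.rfft(averaged))  # one-sided frequency spectrum

# Example wiring (the port name is hypothetical):
# arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1.0)
# pressure = read_pressure(arduino)
```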

For now, all connections are wired. To seal the balloon's opening, we insert a clay cylinder into it, and all wires between the sensors and the PC are buried in the clay. While the wired connections may constrain a user's actions in our prototype, switching to wireless communication in the future should make the balloon-shaped interface easier to handle.

We put a full-color LED inside, controlled by the Arduino, to display visual feedback in response to users' interactions. We use a white balloon so that the system can display various colors with the LED. Of course, a small projector could also display visual feedback, as in Floating Avatar [12]. For Emoballoon, we chose a relatively thick balloon, whose contracting force is stronger than a thinner one's: the stronger force makes the pressure change more sensitive, and thicker elastic is more resistant to bursting. The diameter of the selected white balloon when fully inflated is approximately 40 cm. For the safety of the user study described below, we inflate it to 2.0 times its initial diameter, i.e., approximately 20 cm; this is far from fully inflated and is quite safe against hard physical contact such as punching and slapping.

C. Recognition Process

When the system computes feature values from the barometric pressure, it uses the difference between the measured pressure and the initial pressure (i.e., the pressure when nothing touches the balloon) instead of raw pressure values. This preprocessing keeps a balloon's initial inflation pressure from affecting the recognition process. The system then segments the sensed data into 150 ms windows with 33.3% overlap and computes feature values from each window; the window length and overlap rate were determined in a preliminary study. From these sound and pressure measurements, we use the following feature values for recognition (a feature-extraction sketch follows the list):

• the distribution of the frequency spectrum of the audio signal;
• the average of the barometric pressures inside the balloon;
• the variance of the barometric pressures inside the balloon;
• the successive differences of the barometric pressures.
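A minimal sketch of how one such feature vector could be assembled (our reading of the list above, not the authors' code; it uses numpy only, and note that at 25 Hz a 150 ms window holds only a few pressure samples):

```python
import numpy as np

def feature_vector(pressure: np.ndarray, baseline: float,
                   spectrum: np.ndarray) -> np.ndarray:
    """One feature vector for a 150 ms window.

    pressure : pressure samples in the window (3-4 samples at 25 Hz)
    baseline : pressure measured while nothing touches the balloon
    spectrum : DFT magnitudes of the window's audio
    """
    delta = pressure - baseline              # subtract the initial pressure
    return np.concatenate([
        spectrum / (spectrum.sum() + 1e-9),  # spectrum distribution
        [delta.mean()],                      # average of pressures
        [delta.var()],                       # variance of pressures
        np.diff(delta),                      # successive pressure differences
    ])

def windows(n: int, win: int, overlap: float = 1 / 3):
    """Yield (start, end) indices of windows with 33.3% overlap."""
    step = max(1, int(win * (1 - overlap)))
    for start in range(0, n - win + 1, step):
        yield start, start + win
```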

IV. EVALUATION

We evaluated the performance of our implemented system by measuring recognition rates for the fundamental touch interactions. We asked nine participants to perform each of the six fundamental touch interactions, plus “do nothing”, continuously for 7 seconds. The nine participants (one female, eight males) had an average age of 23.6 (range 21 to 26). While they performed the instructed actions, the system collected the sensed data; each action was repeated 5 times. We computed feature value vectors from the sensed data and extracted an equal number of vectors per participant and action as a validation dataset: 50 feature vectors per participant and action, for 15,750 vectors in total (= 50 vectors × 9 participants × 7 actions × 5 repetitions). We then computed recognition rates by 10-fold cross-validation. The SVM kernel is a radial basis function (RBF). Based on a preliminary 5-fold cross-validation, we set the complexity parameter C = 2^15 and the RBF width γ = 2^-5; the preliminary validation tried 2^-5, 2^-3, ..., 2^15 for C and 2^-5, 2^-3, ..., 2^7 for γ, and these settings performed best.
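The classifier setup described above can be reproduced in outline as follows (a sketch, not the authors' code: the paper uses LIBSVM [2] directly, while scikit-learn's SVC wraps the same library; X and y stand for the 15,750 feature vectors and their action labels):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

def train_and_score(X: np.ndarray, y: np.ndarray) -> SVC:
    # Log-scale grid matching the paper's search:
    # C in {2^-5, 2^-3, ..., 2^15}, gamma in {2^-5, 2^-3, ..., 2^7}.
    grid = {
        "C": [2.0 ** k for k in range(-5, 16, 2)],
        "gamma": [2.0 ** k for k in range(-5, 8, 2)],
    }
    # Preliminary 5-fold cross-validation to pick C and gamma; the paper
    # reports C = 2^15 and gamma = 2^-5 as the best setting.
    search = GridSearchCV(SVC(kernel="rbf"), grid, cv=5)
    search.fit(X, y)
    # Final 10-fold cross-validation with the chosen parameters.
    scores = cross_val_score(search.best_estimator_, X, y, cv=10)
    print("10-fold accuracy: %.1f%%" % (100 * scores.mean()))
    return search.best_estimator_
```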

Table I shows the average recognition rates within participants. The recognition rate is approximately 74.7% across all participants; with the same SVM settings, the recognition accuracy within participants is 83.5% on average. This result indicates that the system can recognize various types of touch interactions based on the change in barometric pressure and the sound inside the balloon.

TABLE I. AVERAGE RECOGNITION RATES WITHIN PARTICIPANTS (IN THE ORIGINAL, GREEN CELL SHADING INDICATES THE RECOGNITION RATE)

Recognition rates (%)
Input     nothing  grasp  hug  punch  press  rub  slap
nothing        94      0    1      0      3    1     0
grasp           0     85    1      0     13    0     0
hug             1      1   97      0      1    0     0
punch           0      0    0     74      0    8    18
press           3     12    0      0     83    1     0
rub             2      0    1      9      1   83     4
slap            0      0    0     21      0    6    72

V. DISCUSSION

A. Possibility to Recognize Social Touch Interaction

Our implementation can distinguish the seven fundamental actions with 83.5% accuracy on average, even though it has a quite simple configuration of a microphone and a barometric pressure sensor. As Table I shows, many misrecognitions occur between “grasp” and “press”, and between “punch” and “slap”; distinguishing these pairs exactly remains difficult. However, an additional evaluation indicates that, with “press” and “slap” excluded, the system recognizes the remaining five actions with 88.1% accuracy across all participants, and with approximately 93.7% within participants on average. That is, the system recognizes the five actions “do nothing”, “hug”, “grasp”, “punch”, and “rub” with high accuracy. To improve accuracy in the future, it would be useful to consider the duration of each interaction: a “punch” or “slap” finishes in a short time, while actions such as “hug” and “rub” can continue for a relatively long time, so the difference in duration would be a good cue for distinguishing users' touch input.
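As a purely hypothetical illustration of this duration cue (not part of the paper's implementation), the elapsed time of the ongoing touch episode could simply be appended to each feature vector:

```python
import numpy as np

def with_duration(features: np.ndarray, touch_start_s: float,
                  now_s: float) -> np.ndarray:
    """Append elapsed touch time (seconds): punches and slaps end quickly,
    while hugs and rubs persist, so duration helps separate those pairs."""
    return np.append(features, now_s - touch_start_s)
```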

In our current implementation, barometric pressure is used to recognize the type of touch interaction, but it is also useful for detecting the force of a touch. A balloon-shaped interface with a barometric pressure sensor could potentially recognize both the type and the force of a touch simultaneously; the system could then distinguish a tight hug from a gentle hug, or a soft punch from a hard punch. The Huggable [9] tried to distinguish hard from gentle touch interactions using QTC sensors, temperature sensors, and accelerometers, but this did not work well. We expect that an inflatable object with a barometric pressure sensor will be a simple but effective way to sense the force of touch interactions. Recognizing both the type and the force of touch simultaneously is one of our next steps.
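One hypothetical way to realize this (our illustration; the paper does not specify a force estimator) is to let the SVM label the touch type while the peak pressure deviation serves as a unit-free force proxy:

```python
import numpy as np

def classify_with_force(clf, features: np.ndarray,
                        pressure_delta: np.ndarray):
    """Return (label, force proxy) for one window.

    The peak absolute pressure deviation stands in for touch force;
    calibrating it to newtons would need reference measurements.
    """
    label = clf.predict(features.reshape(1, -1))[0]
    force = float(np.max(np.abs(pressure_delta)))
    return label, force
```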

VI. FUTURE APPLICATION

Our evaluation shows that Emoballoon provides users with a way to input various kinds of touch interactions. Promising applications include telecommunication and entertainment, where Emoballoon lets users convey their intentions or emotions. For example, the balloon-shaped interface can serve as a physical avatar that accepts touch interactions such as rubbing and slapping and estimates the user's emotions and intentions from them. We believe a balloon is a familiar and appropriate object for an avatar [12].

VII. CONCLUSION AND FUTURE WORK

We have proposed a balloon-shaped interface called “Emoballoon” that recognizes various social touch interactions with a simple configuration. Emoballoon, with its elegant configuration of a microphone and a pressure sensor inside a balloon, recognizes 7 actions with 83.5% accuracy on average within nine participants. Since the proposed recognition technique can be applied to other inflatable materials besides balloons, future work will implement it using safer and more durable materials. Improving the recognition process and realizing applications such as haptic telecommunication also remain as future work.

ACKNOWLEDGMENT

This work was partially supported by SCOPE (Strategic Information and Communications R&D Promotion Programme) Grant Number 111702002, and by JSPS KAKENHI Grant Number 24680013.

REFERENCES

[1] Brave, S. and Dahley, A., inTouch: a medium for haptic interpersonal communication. In CHI 1997 Extended Abstracts, pp. 363-364, 1997.
[2] Chang, C.-C. and Lin, C.-J., LIBSVM: a library for support vector machines. ACM Transactions on Intelligent Systems and Technology, Vol. 2, pp. 27:1-27:27, 2011.
[3] Fitzmaurice, G. W. and Buxton, W., An empirical evaluation of graspable user interfaces: towards specialized, space-multiplexed input. In Proc. CHI 1997, pp. 43-50, 1997.
[4] Harrison, C. and Hudson, S. E., Providing dynamically changeable physical buttons on a visual display. In Proc. CHI 2009, pp. 299-308, 2009.
[5] Kim, K.-E., Chang, W., Cho, S.-J., Shim, J., Lee, H., Park, J., Lee, Y., and Kim, S., Hand grip pattern recognition for mobile user interfaces. In Proc. IAAI 2006, Volume 2, pp. 1789-1794, 2006.
[6] Knight, H., Toscano, R., Stiehl, W. D., Chang, A., Wang, Y., and Breazeal, C., Real-time social touch gesture recognition for sensate robots. In Proc. IROS 2009, pp. 3715-3720, 2009.
[7] Sato, T., Mamiya, H., Koike, H., and Fukuchi, K., PhotoelasticTouch: transparent rubbery tangible interface using an LCD and photoelasticity. In Proc. UIST 2009, pp. 43-50, 2009.
[8] Stevenson, A., Perez, C., and Vertegaal, R., An inflatable hemispherical multi-touch display. In Proc. TEI 2011, pp. 289-292, 2011.
[9] Stiehl, W. and Breazeal, C., Affective touch for robotic companions. In Affective Computing and Intelligent Interaction, pp. 747-754, 2005.
[10] Sugiura, Y., Kakehi, G., Withana, A., Lee, C., Sakamoto, D., Sugimoto, M., Inami, M., and Igarashi, T., Detecting shape deformation of soft objects using directional photoreflectivity measurement. In Proc. UIST 2011, pp. 509-516, 2011.
[11] Taylor, B. T. and Bove, Jr., V. M., Graspables: grasp-recognition as a user interface. In Proc. CHI 2009, pp. 917-926, 2009.
[12] Tobita, H., Maruyama, S., and Kuji, T., Floating avatar: blimp-based telepresence system for communication and entertainment. In ACM SIGGRAPH 2011 Emerging Technologies, pp. 4:1-4:1, 2011.
[13] Wimmer, R. and Boring, S., HandSense: discriminating different ways of grasping and holding a tangible user interface. In Proc. TEI 2009, pp. 359-362, 2009.
