
Bimanual Gesture Keyboard

Xiaojun Bi, Ciprian Chelba, Tom Ouyang, Kurt Partridge, Shumin Zhai
Google Inc., Mountain View, CA, USA
{bxj, ciprianchelba, ouyang, kep, zhai}@google.com

ABSTRACT
Gesture keyboards represent an increasingly popular way to input text on mobile devices today. However, current gesture keyboards are exclusively unimanual. To take advantage of the capability of modern multi-touch screens, we created a novel bimanual gesture text entry system, extending the gesture keyboard paradigm from one finger to multiple fingers. To address the complexity of recognizing bimanual gestures, we designed and implemented two related interaction methods, finger-release and space-required, both based on a new multi-stroke gesture recognition algorithm. A formal experiment showed that bimanual gesture behaviors were easy to learn. They improved comfort and reduced the physical demand relative to unimanual gestures on tablets. The results indicated that these new gesture keyboards were valuable complements to unimanual gesture and regular typing keyboards.

Keywords: Text Entry, Touch Screen

ACM Classification Keywords: H.5.2 [Information Interfaces and Presentation]: Input Devices and Strategies

INTRODUCTION
Among numerous novel methods of text input, gesture keyboards, also referred to as shape writing [26, 21], have become a rare exception in gaining large-scale adoption on mobile devices. First published in the user interface literature in the early 2000s [27, 13], gesture keyboards today are commercially available through many products such as ShapeWriter, SlideIT, Swype, T9 Trace, and TouchPal. A gesture keyboard offers users a variety of advantages. It supports a gradual and seamless transition from visually guided tracing to recall-based gesturing, and it relaxes the requirement of precisely specifying words verbatim by allowing users to express their intention with finger strokes of approximate shape and location [13]. It is also immune to one major problem plaguing regular touchscreen typing: the lack of tactile feedback [7, 21].
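To make the "approximate shape and location" idea concrete, the following is a minimal sketch, not the recognizer developed in this paper or used in any of the products above, of how a gesture keyboard can map a sloppy finger trace to a word: resample the drawn trace and each candidate word's ideal path through its key centers to a fixed number of points, then rank candidates by average point-to-point distance. The key coordinates, resampling constant, toy lexicon, and example trace are all illustrative assumptions.

```python
from math import dist

# Assumed Qwerty key centers in arbitrary keyboard units (illustrative only).
KEY_POS = {
    'q': (0, 0), 'w': (1, 0), 'e': (2, 0), 'r': (3, 0), 't': (4, 0),
    'y': (5, 0), 'u': (6, 0), 'i': (7, 0), 'o': (8, 0), 'p': (9, 0),
    'a': (0.5, 1), 's': (1.5, 1), 'd': (2.5, 1), 'f': (3.5, 1), 'g': (4.5, 1),
    'h': (5.5, 1), 'j': (6.5, 1), 'k': (7.5, 1), 'l': (8.5, 1),
    'z': (1, 2), 'x': (2, 2), 'c': (3, 2), 'v': (4, 2), 'b': (5, 2),
    'n': (6, 2), 'm': (7, 2),
}

def resample(points, n=32):
    """Resample a polyline to n points spaced evenly along its arc length."""
    if len(points) < 2:
        return [points[0]] * n
    seglens = [dist(a, b) for a, b in zip(points, points[1:])]
    step = (sum(seglens) or 1e-9) / (n - 1)
    out, covered, i, target = [points[0]], 0.0, 0, step
    while len(out) < n - 1 and i < len(seglens):
        if seglens[i] > 0 and covered + seglens[i] >= target:
            # Interpolate within the current segment at the target arc length.
            t = (target - covered) / seglens[i]
            (ax, ay), (bx, by) = points[i], points[i + 1]
            out.append((ax + t * (bx - ax), ay + t * (by - ay)))
            target += step
        else:
            covered += seglens[i]
            i += 1
    while len(out) < n:  # pad with the end point (handles rounding at the tail)
        out.append(points[-1])
    return out

def word_path(word):
    """Ideal gesture for a word: the polyline through its letters' key centers."""
    return [KEY_POS[c] for c in word]

def rank(trace, lexicon, n=32):
    """Rank candidate words by mean distance between resampled trace and ideal path."""
    t = resample(trace, n)
    scores = []
    for w in lexicon:
        p = resample(word_path(w), n)
        scores.append((sum(dist(a, b) for a, b in zip(t, p)) / n, w))
    return sorted(scores)

# A sloppy trace drawn roughly over l-i-f-e still lands closest to "life".
sloppy = [(8.4, 1.2), (7.2, 0.3), (5.0, 0.8), (3.6, 1.1), (2.2, 0.2)]
print(rank(sloppy, ["life", "like", "lie", "fire"]))
```

Production gesture keyboards do considerably more (for example, weighing shape and key-location evidence together with a language model), but the sketch illustrates why a trace only needs to be approximately right.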

One frequently raised question is that, given the capability of modern multi-touch screens, it seems amiss that gesture keyboards are exclusively unimanual. Traditional typewriting is a two-handed activity. Furthermore, many mobile device users hold the device with both hands and type with two thumbs (Figure 1). On many tablets the virtual keyboard can be split into two parts to support this mode of operation. Additionally, a split keyboard consumes only a small amount of screen space, which has been shown to improve the user experience on portable touch screens [15]. For example, the sizes of the split keyboards in Figures 1a and 1b are only 25% (landscape) and 47% (portrait) of the sizes of default virtual tablet keyboards. In recognition of these advantages, split keyboards are now supported on iPad, Windows 8, and Android tablets. However, despite the popularity of split keyboards, current gesture keyboards are incompatible with two-thumb use. We aim to overcome this limitation by enabling bimanual operation of gesture keyboards.

Figure 1. (a) and (b): two-hand holding postures for tablets with split keyboards. (c), (d), (e), and (f): entering the words life, like, you, and interaction on split keyboards with bimanual gestures. Note that the actual gap between the two sides of a split keyboard is greater than illustrated. Blue strokes represent finger strokes.

Other than supporting two-handed holding and typing, and freeing screen real estate, another benefit of enabling bimanual operation lies in movement efficiency. Instead of frequently moving the same finger from one side of the keyboard to the other, each thumb only moves in its vicinity on one side of the keyboard, hence eliminating the long movements common on unimanual gesture keyboards. For example, to enter the word life, a unimanual gesture approximates the trace l-i-f-e on the keyboard. On a standard Qwerty keyboard, the trace from i to f, moving from the right side to the left side of the keyboard, is the longest segment of the l-i-f-e gesture. With a bimanual gesture keyboard, the word life can be gestured with an l-i movement by the right hand and f-e by the left hand, eliminating the i-f segment (Figure 1). In another example, the bimanual gesture for the word like can be drawn as l-i-k by the right hand and a tap (a zero-length gesture) on e by the left hand, saving the long-distance travel from k to e on Qwerty (Figure 1).
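As a back-of-the-envelope check of this movement-efficiency argument, the short sketch below compares total finger travel for the unimanual and bimanual versions of these two examples. The key-center coordinates (arbitrary keyboard units) are illustrative assumptions, not measurements of any particular keyboard.

```python
from math import dist

# Assumed Qwerty key centers in arbitrary keyboard units (illustrative only).
KEY = {'l': (8.5, 1), 'i': (7, 0), 'f': (3.5, 1), 'e': (2, 0), 'k': (7.5, 1)}

def travel(letters):
    """Total path length of a gesture passing through the given keys in order."""
    pts = [KEY[c] for c in letters]
    return sum(dist(a, b) for a, b in zip(pts, pts[1:]))

print(travel("life"))                 # unimanual life: includes the long i-f crossing
print(travel("li") + travel("fe"))    # bimanual life: right thumb l-i, left thumb f-e
print(travel("like"))                 # unimanual like: includes the long k-e crossing
print(travel("lik") + travel("e"))    # bimanual like: l-i-k by one thumb, tap on e by the other
```

With these assumed coordinates, dropping the i-f and k-e crossings roughly halves the total travel for life and cuts it by about two-thirds for like, which is the intuition behind Figure 1.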


It is important and in fact necessary to design bimanual gesture keyboards to be fully compatible with existing modes of keyboard operation. First, the bimanual gesture algorithm should be compatible with unimanual gestures. Any additional flexibility or benefits that bimanual gestures may bring should come at little or no cost to unimanual gestures. Second, tapping is still the preferred mode of touchscreen keyboard operation for many users today (if not the majority). Gesture keyboards should not be viewed as competition for, or a replacement of, tapping. They should be compatible with the tapping mode of keyboard operation as well.

The contribution of this paper is two-fold. First, we designed and implemented a bimanual gesture text entry system, based on a multi-stroke gesture recognition algorithm as well as two interaction methods (finger-release and space-required). This system enabled a novel form of bimanual gesture input that was not previously possible, and extended the gesture keyboard from one finger to multiple fingers. Second, we conducted a formal experiment to evaluate the strengths and weaknesses of bimanual gesture input. The results showed that the bimanual behavior was easy to learn. It raised the comfort level and reduced the physical demand relative to the unimanual counterpart. 42% (15 out of 36) of participants chose one of the bimanual gesture keyboards as their most preferred input method. The weakness was slightly lower speed efficiency than the unimanual gesture keyboard, due to the attention and action switches between the two fingers.

In the rest of this paper we first briefly review related work. We then focus on the main bimanual gesture algorithms and the systems we have developed, outlining the pros and cons of each design. We then report and analyze a formal lab experiment evaluating bimanual gesture keyboards on tablets.

RELATED WORK
To our knowledge no bimanual gesture keyboard research has been reported in the literature to date.

In the body of literature on unimanual gesture keyboards, Zhai and Kristensson published the first prototype, experiment, and a set of basic principles of gesture keyboards [27]. The first complete and large-vocabulary gesture keyboard was published in [13]. Kristensson and Zhai [12] extended the gesture keyboard concept to gesture keyboard commands, whose strokes are defined by the command name but prefixed on a command key. Many of the key concepts of gesture keyboards were summarized in Introduction to Shape Writing [26]. The idea of treating typing as a discrete form of shape writing was reported in the work on the Elastic Stylus Keyboard (ESK) [14]. Zhai et al. analyzed user feedback on ShapeWriter on the iPhone [28]. Rick [21] developed a predictive model of gesture keyboarding and used it to estimate the impact of keyboard layouts on gesture keyboard performance.

A large body of research on two-handed interaction interfaces, beginning with Buxton and Myers [7], can be found in the HCI literature. A focus of this body of work is the task assignment between the two hands [e.g., 4]. The best known model of bimanual action is Guiard's kinematic chain theory [9], which explains the asymmetric relationship between the dominant and the non-dominant hands' roles in bimanual operation. In this theory the non-dominant hand sets the coarse-level frame of reference (e.g., holding a pad) and the dominant hand performs fine control within that frame of reference (e.g., writing with a pen on the pad). In human-computer interaction, as well as in other real-world tasks, bimanual operation can also be symmetrical (typing, steering a bicycle). The literature suggests that a number of tasks such as 2D or 3D navigation [3] can be performed effectively with a symmetric assignment of roles to the hands. Factors affecting the performance of symmetric bimanual interaction have also been studied [2]. In the field of text entry, symmetric bimanual typing is common on both physical and virtual keyboards. In this work, we extend bimanual interaction to gesture keyboards.

There is also a long history of research on gesture- and sketch-based interfaces in the HCI community. These gesture symbols can be drawn using either a single stroke [22, 16] or multiple strokes [1, 19], and have been applied to problems ranging from interfaces for blind users [11] to search on mobile devices [17, 20]. In contrast to our work, these approaches focus on unimanual input where strokes are often drawn sequentially. Furthermore, they represent an inherently different type of interaction where gestures are direct visual recreations of the symbols in the domain, rather than paths defined by virtual keys on a soft keyboard. Finally, strokes are recognized at