
AlphaFabric: A Wearable Textile Sensor for Continuous Touch Tracking and Gestural Input

Nur Al-huda Hamdan
RWTH Aachen University
52056 Aachen, Germany
[email protected]

Jan Borchers
RWTH Aachen University
52056 Aachen, Germany
[email protected]

Christian Schmidt
RWTH Aachen University
52056 Aachen, Germany
[email protected]


Abstract
AlphaFabric is a wearable textile sensor capable of continuous touch tracking and gesture detection. It is made of two layers of conductive striped fabric separated by a layer of non-conductive foam. AlphaFabric uses resistive technology to determine the X and Y position of the touch on the fabric. The sensor can be worn directly on the skin or integrated into clothes and still retain consistent measuring properties even during continuous movement. We describe the construction of our sensor and outline its benefits compared to other textile sensing techniques. We implemented a set of applications to demonstrate the applicability and versatility of our sensor.

Author Keywords
Wearable computing; smart textiles; touch; gesture input.

ACM Classification Keywords
H.5.2. [Information Interfaces and Presentation (e.g. HCI)]: Input devices and strategies (e.g., mouse, touchscreen)

Introduction
In recent years, many researchers have investigated how to appropriate textiles as large and potentially unobtrusive wearable interactive surfaces. Different touch sensing fabrics have been proposed [3, 7, 4]. These fabrics are capable of detecting taps and simple stroke gestures while retaining the physical characteristics of regular textiles. However, when placed near the skin [5] or during continuous movement [3, 7], these sensors can become quite noisy, making it harder to detect touch accurately and reliably.

Figure 1: The AlphaFabric sensor uses resistive touch technology to continuously detect the finger position on the fabric.

In this demonstration, we present AlphaFabric, a wearable multilayer fabric sensor capable of detecting touch and free-form gestures (Fig. 1). It uses resistive touch technology to avoid the common noise sources (e.g., the capacitive field of the body and continuous body motion) which affect most textile sensors. We show how AlphaFabric enables a large input vocabulary, which opens a range of new input techniques and novel use cases for wearable textile sensors.

Figure 2: Multilayer fabric sensor: two layers of striped conductive fabric separated by non-conductive foam.

Sensor Design
The sensor is composed of three layers of textile: two layers of conductive striped fabric placed perpendicular to each other forming a matrix, and a foam spacer in between (Fig. 2). The conductive layers are made by weaving conductive yarn into parallel stripes of 3 mm width with 3 mm spacing. These dimensions are based on the yarn's physical properties and electrical conductivity. They are necessary to create reliable conductive electrodes and to avoid accidental connections between parallel stripes caused by yarn flyaways. Our sensor has a resolution of 4 ppi.

We used a foam spacer of 3 mm thickness to separate the conductive layers. Foam is light, flexible, and very elastic: it compresses easily under applied pressure and springs back to its normal shape after being released. The thickness of the spacer increases the sensor's resistance to accidental pressure, especially during continuous movement. To create a touch contact on our sensor, one needs to exert a force of 1.2 N. We used a laser cutter to cut holes into the spacer at the intersection points of the conductive stripes.

AlphaFabric scales easily. We built two sensors of different sizes: 80 × 80 mm and 120 × 120 mm. Note, however, that the number of electrical traces connecting the fabric to the sensing controller increases linearly with sensor size.

In our prototype, we used copper wires to connect the multilayer fabric to the sensing controller. These can be replaced with conductive thread. Our prototype works wirelessly using a Bluetooth unit and a power pack.

Sensing Technology
We use resistive touch technology to determine the X and Y position of the touch on the fabric. A touch is created when a contact is made between the layers by pressing on the fabric. A microcontroller (TI EK-TM4C1294XL) measures the resistance at the intersections of the conductive stripes to determine the coordinates of the contacts. Typically, a single touch results in registering several adjacent contacts. We compute the center of gravity of the contacts to determine the coordinates of the touch. The sensor can track the finger's position continuously at a rate of 16.8 Hz.
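The center-of-gravity step described above can be sketched as follows. This is an illustrative reconstruction, not the authors' firmware: the function name, the normalized contact matrix, and the threshold value are assumptions for the example.

```python
def touch_position(contacts, threshold=0.5):
    """Estimate a single touch position from a matrix of contact readings.

    contacts: 2D list of normalized contact strengths, one per stripe
    intersection. Returns the (x, y) center of gravity of all active
    intersections, or None if no intersection exceeds the threshold.
    """
    total = 0.0
    sx = sy = 0.0
    for y, row in enumerate(contacts):
        for x, value in enumerate(row):
            if value >= threshold:
                # Weight each active intersection by its contact strength.
                total += value
                sx += x * value
                sy += y * value
    if total == 0:
        return None  # no touch registered
    return (sx / total, sy / total)
```

Because a single touch closes several adjacent intersections, averaging their weighted positions yields sub-stripe precision from a coarse 4 ppi matrix.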


Gesture Detection
To demonstrate AlphaFabric's capacity to detect gestures, we implemented two of the most common types of path-based gestures in the literature: mark-based gestures and free-form gestures (see Fig. 3). We implemented a simple algorithm to detect mark-based gestures based on directional strokes. For detecting free-form gestures, we used the $1 gesture recognizer [8] and encoded the templates of the gestures. The recognizer compares user input to the templates using a proportional shape matching approach and provides a Euclidean score. Our sensor is capable of detecting 25 unique gestures, including tapping, at any scale and in any orientation.
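A minimal sketch of a directional-stroke classifier of the kind described above (the exact algorithm is not given in the text, so this is a hypothetical simplification): a mark-based gesture is reduced to the dominant direction between the first and last touch samples.

```python
def classify_mark(path):
    """Classify a mark-based gesture from a list of (x, y) touch samples.

    Returns 'right', 'left', 'up', or 'down' depending on the dominant
    direction of the stroke, or None when there is no movement (a tap).
    Assumes screen coordinates, i.e., y grows downwards.
    """
    (x0, y0), (x1, y1) = path[0], path[-1]
    dx, dy = x1 - x0, y1 - y0
    if dx == 0 and dy == 0:
        return None  # no displacement: treat as a tap
    if abs(dx) >= abs(dy):
        return 'right' if dx > 0 else 'left'
    return 'down' if dy > 0 else 'up'
```

A real implementation would also filter jitter and split compound marks into segments; the free-form gestures are instead handled by the $1 recognizer's template matching.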

Figure 3: Mark gestures (top), free-form gestures (below) [2].

Sensor Evaluation
We performed informal in-lab tests to observe how the sensor behaves under three wearable conditions. Participants were asked to wear the sensor on their arms and perform the set of gestures in Figure 3. The participants were free to move their arms as they wanted. We then asked the participants to repeat the same gesturing task, but this time we controlled three variables: the STIFFNESS of the material underlying the sensor (ranging from a stiff hard surface to soft foams), the ROUGHNESS of the material overlaying the sensor (synthetic silk, cotton, jeans, and rib knit cotton), and the RADIUS to which the sensor can be bent (180° to 20°). We manipulated these variables to simulate the expected wearing conditions of a textile sensor when placed on the body or integrated into clothes.

AlphaFabric created no accidental contacts when placed on the body. The success rate of our sensor was 84%. Some gestures were registered by the hardware but not detected by the recognizer, while other gestures failed to register. The authors of this work achieved a 92% success rate, indicating learning effects. In the following, we summarize our observations.

The sensor's performance was independent of the stiffness of the underlying material and the roughness of the overlying material. Only when the sensor was bent to a radius below 26 mm did the foam spacer start to crease along its surface, creating permanent electrical connections between the conductive layers. In practice, these contacts can be detected by software and filtered out, but this would decrease the input resolution of the sensor.
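One possible software filter for the crease artifact just described (an assumption on our part, not the authors' implementation): intersections that stay closed for an implausibly long run of frames are masked out as stuck, at the cost of losing those matrix positions.

```python
def update_stuck_mask(history, frame, stuck_after=100):
    """Track long-lived contacts and report which ones look stuck.

    history: dict mapping (x, y) intersection -> consecutive closed-frame
    count, updated in place across calls. frame: set of intersections
    closed in the current frame. stuck_after: number of consecutive
    frames (a hypothetical tuning value) after which a contact is
    considered a permanent crease rather than a touch.
    Returns the set of intersections to ignore.
    """
    for key in list(history):
        if key not in frame:
            del history[key]  # contact released: a real touch, not a crease
    for key in frame:
        history[key] = history.get(key, 0) + 1
    return {key for key, count in history.items() if count >= stuck_after}
```

At the 16.8 Hz sampling rate reported above, 100 frames correspond to roughly six seconds of uninterrupted contact, well beyond a normal gesture.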

Interaction with the sensor was perceived as more pleasant when the underlying material was softer, despite the fact that more pressure was needed to create a contact.

Smooth overlying materials such as silk improved the user experience. In contrast, rough overlying materials such as rib knit cotton made it challenging to sustain continuous contact between the finger and the fabric, which resulted in premature gesture detection (false positives). A quick fix would be to increase the time threshold before a gesture is detected to, e.g., 2 sec, enough time to resume contact and continue the gesture. However, this solution potentially reduces the responsiveness of the sensor.
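The time-threshold workaround above can be sketched as a small state machine (a hypothetical illustration; the class and parameter names are ours): a gesture only ends once the finger has stayed lifted past a grace period, so brief contact losses on rough fabric do not terminate it prematurely.

```python
class GestureEndDetector:
    """End a gesture only after sustained loss of contact."""

    def __init__(self, grace_period=2.0):
        # grace_period in seconds, e.g. the 2 sec threshold from the text.
        self.grace_period = grace_period
        self.lift_time = None

    def update(self, touching, now):
        """Feed one sample; returns True once the gesture is finished.

        touching: whether the sensor currently registers contact.
        now: timestamp of the sample in seconds.
        """
        if touching:
            self.lift_time = None  # contact resumed within the grace period
            return False
        if self.lift_time is None:
            self.lift_time = now  # first sample after the finger lifted
        return (now - self.lift_time) >= self.grace_period
```

This makes the responsiveness trade-off explicit: every completed gesture is delayed by the full grace period before it is reported.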

Prototype Applications
To demonstrate the versatility of AlphaFabric, we implemented prototypes based on two input modalities: continuous finger tracking and free-form gestures (Fig. 4).

The first prototype is a doodling application. The user draws on a computer display by moving her finger continuously over the surface of the textile sensor. We implemented a marking menu interface to enable the user to access and navigate hierarchical menus using simple mark-based gestures. This interaction also allows users to interact with menus on small and mobile devices eyes-free [1].


As a possible usage example of continuous touch tracking, we envision implementing handwriting recognition to convert doodling strokes to digital text, similar to [6], creating a relatively large and quickly accessible textile notepad. The same input modality can be used to control a cursor or manipulate objects in a head-mounted display.

The second prototype uses gestures to control other applications. We implemented a set of gestures to control a keynote presentation (directional swipes to navigate the slides, and two gestures to enter and exit presentation mode). The sensor can be operated while attached to the user's body, e.g., the upper thigh (see Video Figure).

Figure 4: Prototype applications to demonstrate the input modalities of AlphaFabric.

Summary and Future Work
We built AlphaFabric, a simple textile sensor that enables continuous touch tracking and gesture recognition on fabric. The sensor can be worn on any location on the body and can be easily integrated into everyday garments.

Resistive touch technology brings accurate and reliable control to wearable textile sensors, and because it responds to pressure, it can detect a contact with a finger or through layers of fabric. However, during long interaction intervals, resistive sensors can become cumbersome to use. We are looking into new types of flexible, elastic, and abrasion-resistant textile spacers (e.g., 3D fabrics) that can further reduce the pressure needed to register touches.

We intend to increase the resolution of our sensor by increasing the number of conductive stripes per fabric inch. This requires testing new types of conductive yarns that can be woven into narrower, more closely spaced stripes.

Further studies will be needed to investigate how the sensor behaves in the wild, e.g., in crowded areas and during physical activities such as walking and running. This work provides a step towards the development of wearable touch sensors with large gesture vocabularies and new input modalities.

References
[1] Jens Bauer, Achim Ebert, Oliver Kreylos, and Bernd Hamann. 2013. Marking menus for eyes-free interaction using smart phones and tablets. In Proc. ARES '13. Springer, 481–494.

[2] Andrew Bragdon, Eugene Nelson, Yang Li, and Ken Hinckley. 2011. Experimental analysis of touch-screen gesture designs in mobile environments. In Proc. CHI '11. ACM, 403–412.

[3] Florian Heller, Stefan Ivanov, Chat Wacharamanotham, and Jan Borchers. 2014. FabriTouch: exploring flexible touch input on textiles. In Proc. ISWC '14. ACM, 59–62.

[4] Ivan Poupyrev, Nan-Wei Gong, Shiho Fukuhara, Mustafa Emre Karagozler, Carsten Schwesig, and Karen E. Robinson. 2016. Project Jacquard: Interactive Digital Textiles at Scale. In Proc. CHI '16. ACM, 4216–4227.

[5] Cliff Randell, Ian Andersen, Henk Moore, Sharon Baurley, and others. 2005. Sensor Sleeve: Sensing Affective Gestures. In Proc. ISWC '05. 117–123.

[6] T. Scott Saponas, Chris Harrison, and Hrvoje Benko. 2011. PocketTouch: through-fabric capacitive touch input. In Proc. UIST '11. ACM, 303–308.

[7] Stefan Schneegass and Alexandra Voit. 2016. GestureSleeve: using touch sensitive fabrics for gestural input on the forearm for controlling smartwatches. In Proc. ISWC '16. ACM, 108–115.

[8] Jacob O. Wobbrock, Andrew D. Wilson, and Yang Li. 2007. Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes. In Proc. UIST '07. ACM, 159–168.