Chapter 8

Body Awareness in Prosthetic Hands

Alejandro Hernández-Arieta and Dana D. Damian

Contents

8.1 Introduction
8.2 Body Awareness in Prosthetics
8.3 Artificial Sensory Skin Exploiting Morphology for Prosthetics
    8.3.1 Artificial Skin Construction
    8.3.2 Artificial Skin as a Force Transducer
    8.3.3 Artificial Skin as a Slippage Detector
8.4 Sensory Feedback in Prosthetic Applications
8.5 Conclusions
References

8.1 Introduction

Cybernetics is a fast-developing field that aims to improve the lives of innumerable people around the globe. In the not-so-distant future, artificial limbs will accurately emulate their biological counterparts. An amputee will have complete control over the movement of his or her artificial limb, allowing him or her to manipulate, reach, and even type on a computer. The sensors installed in the artificial hand will transmit relevant information from the environment to the body, allowing the user to feel the smoothness of a cup, the warmth of a steaming coffee, or even the sharp pain of a needle. This bidirectional interaction with the artificial limb will allow the person to be aware of it (body awareness); in other words, he or she will feel as if the artificial arm is part of his or her body. In summary, the advances in cybernetics will allow amputees to lead a completely normal life.

In the present day, artificial limbs are still far from the state just described. Current efforts are underway to close the gap between artificial and natural limbs. Currently, we are able to control several degrees of freedom of an artificial hand (Parker et al., 2006; Antfolk et al., 2010), regulate the force applied by it (Castellini et al., 2009; Castellini & van der Smagt, 2009), or even control the individual movement of the fingers (Acharya et al., 2008; Maier & van der Smagt, 2008; Smith et al., 2009; Harada et al., 2010). New mechanical implementations are now able to reproduce most of the movements available to natural limbs. For example, the 'Luke Arm' developed by DEKA Technologies is a prosthetic replacement for whole-arm amputees that pushes current engineering technology to its limits (Adee, 2008).

Attempts to provide feedback have been made, either by applying direct electrical stimulation to the remaining sensory channels (Rossini et al., 2010) or through redirected sensory nerves by applying vibration to the chest (Marasco et al., 2011). However, a large number of prosthetic devices still face limitations in performance, mostly due to the lack of sensory feedback (Kuiken et al., 2007). A vast number of users quit using myoelectrically controlled prostheses because of the extra cognitive demands required to use the artificial hands in their daily living activities (Biddiss & Chau, 2007). One of the reasons is that current prostheses provide very little nonvisual feedback, if any at all. Early attempts to supply sensory feedback to prosthetic devices date back to the 1950s, when von Békésy (1959) discussed the possibility of using either electrical or mechanical means to stimulate the skin for information transfer. Although the idea of providing sensory feedback to an amputee is not new, we are still looking for efficient methods to send sensory information back to the body. This information is important to promote body awareness.

This chapter continues as follows. Section 8.2 explores neuroscience studies carried out in the context of the inclusion of 'external' objects into one's own body image, and how these can be used in prosthetics. Section 8.3 offers a survey of current methods to provide robotic hands with sensory capabilities, their implementation, and their current limitations. Here, we propose the development of an artificial skin that exploits material properties to overcome the limitations on sensory feedback. Finally, Section 8.4 explores the methods available to send this sensory information to the user's body. On the basis of the neuroscience studies, we look for methods that can increase body awareness in prosthetic devices.

8.2 Body Awareness in Prosthetics

There are at least two questions that may hold the key to promoting body awareness in prosthetic applications. How do we identify our bodies? And what are the mechanisms behind the mental representations of our bodies? The brain creates mental representations of our body through the sensory stimuli generated when interacting with the environment. Although the definition of these mental representations is still widely discussed, there is general consensus that there are at least two different types of body representation: body schema and body image (de Vignemont, 2010). Body schema is defined as the sensorimotor representation of the body used to guide movement and action. Body image is defined as the representation used to form our perceptual, conceptual, or emotional judgements towards our body. For the purposes of this chapter, we will consider body image to be the representation used to form our perceptual body, or body percept (Gallagher, 2005). Body image is an important concept that we will explore later in this chapter.

These mental representations are not static; instead, they exhibit plasticity (Waller & Barnes, 2002) that allows our brain to adapt to changes that the body experiences throughout our lives. This effect is more noticeable during the early stages of human development (Rochat, 1998), when the brain continuously 'updates' its mental representation of the body and accordingly modifies the kinematic models that facilitate motor control. When a person suffers an amputation, his or her body and mental representations are simultaneously affected (Ramachandran & Blakeslee, 1998). The mental representation of the body is affected by the firing of neighbouring neurons in the affected area. These extra stimuli cause a disturbance in the body's mental representation, known as the 'phantom limb' (Berlucchi & Aglioti, 1997). This mismatched representation can be reverted if the person receives extra feedback: in amputee patients who received visual feedback, the brain was able to update the state of the body, rearranging the 'phantom limb' to match the body's actual situation (Ramachandran & Hirstein, 1998). Even though the mismatched representations can produce undesired phantom pain, the remaining mental representations of the missing limb in the brain can be positively exploited to produce motor commands in the form of action potentials. Action potentials stimulate motor neurons, which in turn produce electromyographic (EMG) signals that can be acquired at the remaining limb, or stump, of the patient. The literature presents several examples of the application of myoelectric signals for the control of actuated prosthetic hands (Parker et al., 2006; Oskoei & Hu, 2007). This permits, through the remaining representation of the missing limb in the brain, the acquisition of 'intention' from the patient (Reilly et al., 2006). Using this 'intention', it is possible to generate the appropriate motor commands for the robotic device and consequently reproduce the desired movement (Figure 8.1).
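As a rough illustration of how such an 'intention' decoder can be put together, the sketch below (Python with NumPy and scikit-learn) extracts simple time-domain features from windows of raw EMG and trains a small feed-forward network to map them to discrete hand motions. The channel count, window length, feature set, class labels, and synthetic data are illustrative assumptions, not the configuration used by Arieta et al. (2006).

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def emg_features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel time-domain features from a window of raw EMG.

    window: array of shape (n_samples, n_channels), e.g. 200 ms at 1 kHz.
    Returns a flat feature vector (mean absolute value, RMS, zero crossings).
    """
    mav = np.mean(np.abs(window), axis=0)                        # mean absolute value
    rms = np.sqrt(np.mean(window ** 2, axis=0))                  # root mean square
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)   # zero crossings
    return np.concatenate([mav, rms, zc]).astype(float)

# Hypothetical training data: EMG windows labelled with the intended motion.
rng = np.random.default_rng(0)
motions = ["rest", "hand_open", "hand_close"]                    # illustrative classes
X = np.stack([emg_features(rng.normal(scale=s, size=(200, 2)))
              for s in (0.1, 0.5, 1.0) for _ in range(50)])
y = [m for m in motions for _ in range(50)]

# Small feed-forward neural network mapping feature vectors to motions.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X, y)

# At run time, each new EMG window yields a motor command for the robot hand.
new_window = rng.normal(scale=0.5, size=(200, 2))
print("decoded intention:", net.predict([emg_features(new_window)])[0])
```

In practice the decoded class would be translated into joint set points for the prosthetic hand; the point of the sketch is only the feature-vector-to-motion mapping described above.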

However, having control over the prosthetic hand does not suffice to promote body awareness. To promote sensory awareness of the body in an amputee, it is necessary to supply multisensory information (pressure, proprioception, and vision). Several studies support the argument that tactile, proprioceptive, and visual feedback plays an important role in the incorporation of artificial limbs into the mental representation of the body (Ehrsson et al., 2008; Rosén et al., 2009).


Attempts to provide an objective evaluation of this 'ownership' phenomenon used functional magnetic resonance imaging (fMRI) to measure the changes in the motor and somatosensory cortices (Hernandez-Arieta et al., 2008; Kato et al., 2009). Although fMRI studies have a very low temporal resolution, it is possible to capture a 'picture' of the active brain locations during a certain period of time. Arieta et al. (2006) applied electrical stimulation to the skin of the contralateral arm, and controlled a robot hand using EMG signals from the stump of the patient (right hand). fMRI made it possible to observe the activation of both the motor and somatosensory cortices in the patients (Figure 8.2). The experiments showed a simultaneous activation of the somatosensory and motor cortices in the left hemisphere, even though the tactile stimulation was taking place in the ipsilateral arm.

These results show how brain plasticity can be used to our advantage for the recovery of lost function. The active participation of the patient in the control of the prosthetic device supported by multisensory feedback allows the brain to reshape its mental representations of the body.

[Figure 8.1 is a block diagram of the control pipeline: measured raw EMG signal (finger flexion, ±10 V), feature vector extraction, feature vector, feed-forward neural network, adaptation unit with frequency spectrum, and the resulting prosthetic hand control signal driving the robot hand.]

Figure 8.1 'Intention' acquisition. EMG signals can be used to detect the intention of the person using an artificial neural network to produce the desired motor commands for the robot hand. (Reproduced from Arieta, A. H. et al., Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4336–4342. With permission.)


8.3 Artificial Sensory Skin Exploiting Morphology for Prosthetics

As discussed in Section 8.2, several sensory modalities, such as touch, proprioception, and vision, are required to promote body awareness. However, the implementation of these sensory modalities in a prosthetic device is not an easy task. The development of prostheses equipped with sensory modalities needs to take into consideration restrictions such as the total weight of the device, its size, and its energy consumption. Even though advances in microelectronics have made it possible to increase the number of sensors embedded in robot hands, much of the existing range of tactile sensors is still not fit to be used in humanoid robotics. The main reasons are their large size, their single-unit construction, which sacrifices dexterity, and the fact that they are too slow, fragile, or made of rigid materials (Lee, 2000). Dahiya et al. (2010) present an extensive review of the tactile technologies for humanoids developed in the last two decades, covering nearly all modes of transduction. Despite the broad experimentation with different transduction technologies, tactile-sensing attempts have not been scaled up to complete tactile-sensing systems or the realisation of full-blown skins. Without a general system, implementation in humanoid robotics, and even more so in prosthetics, is difficult.

In this section, we focus on slip, which can be seen as the coding of motion by the receptors of the skin. In the human body, the sensation of slip is essential in the perception of roughness (Johanson & Hsiao, 1992; Srinivasan & LaMotte, 1995; Cascio & Sathian, 2001), hardness (Binkofski et al., 2001), and shape (Howe & Cutkosky, 1993; Bodegård et al., 2000). This sensory modality is of particular interest because it plays an important role as an error signal for optimal grip force control, which is a common problem in prosthetic applications.

[Figure 8.2 shows fMRI slices with the hand motor area, the somatosensory area, and the frontal lobe labelled.]

Figure 8.2 Amputee cortical activation. The image on the right is zoomed out, showing the motor and somatosensory areas related to the hand and arm. The upper part shows the activation of the motor area in charge of the right-hand movement, and the lower part shows the reaction from the somatosensory area related to the hand. It is important to notice that the subject does not have a right arm to touch any object. (Reproduced from Arieta et al., A fMRI study of the cross-modal interaction in the brain with an adaptable EMG prosthetic hand with biofeedback. In Engineering in Medicine and Biology Society, 28th Annual International Conference of the IEEE, pp. 1280–1284. doi: 10.1109/IEMBS.2006.259938, 2006. With permission.)

A wide range of interesting tactile sensors that use a variety of transduction principles have been developed for slippage detection. Cotton et al. (2007) developed a thick-film piezoelectric sensor for slippage detection: when slippage occurs, the film tilts and produces vibrations, causing changes in the value of the piezoresistors. Yamada et al. (2002) built a skin featuring rounded ridges equipped with strain sensors in between, which detect slippage through sensor deformation; the slippage information is extracted from the velocity and acceleration of the strain gauges' deformation. Tremblay and Cutkosky (1993) used nibs on top of the skin surface that vibrate when an object starts to slip; accelerometers placed inside the artificial skin capture the vibration and convey the slippage notification. Lowe et al. (2010) use a similar system to detect slip in prosthetic applications. Their system exploits vibration information in three axes to detect slip and calculate the travelled distance. Beccai et al. (2008) made a soft compliant tactile sensor for prosthetics consisting of a highly shear-sensitive 1.4 mm3 triaxial force microsensor embedded in a soft, compliant, and flexible package. They demonstrated that the sensor is robust enough to be used in prosthetic devices and sensitive enough to detect slip events with a maximum delay of 44 ms.

Optical sensing is an additional option for detecting slippage (Ohka et al., 2005). Using conical feelers on a rubber sheet surface, Ohka et al. acquired an image of the contact area and of the feelers' displacement to determine the surface normal and shear forces. Another method for detecting slippage is to treat the tactile information as a tactile image and use motion detection algorithms (Maldonado-Lopez et al., 2007): an array of identical electrical circuits is sensitive to temporal and spatial changes, and thus identifies the microvibrations produced by slip.
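As a rough sketch of this image-based idea (Python with NumPy; the array size, frame rate, and thresholds are illustrative assumptions, not the circuit of Maldonado-Lopez et al.), consecutive tactile 'frames' can be differenced and thresholded to flag the microvibrations that accompany incipient slip:

```python
import numpy as np

def detect_slip(prev_frame: np.ndarray, curr_frame: np.ndarray,
                diff_threshold: float = 0.05, active_fraction: float = 0.1) -> bool:
    """Flag slip from two consecutive tactile-array frames.

    A taxel is 'active' when its pressure changed by more than diff_threshold
    between frames; slip is declared when enough taxels change at once,
    which is the signature of slip-induced microvibrations.
    """
    changed = np.abs(curr_frame - prev_frame) > diff_threshold
    return changed.mean() > active_fraction

# Synthetic example on an 8x8 taxel array: a static grasp, then a vibrating one.
rng = np.random.default_rng(1)
static = 0.5 + 0.01 * rng.standard_normal((8, 8))
vibrating = static + 0.1 * rng.standard_normal((8, 8))
print(detect_slip(static, static.copy()))   # False: no change between frames
print(detect_slip(static, vibrating))       # True: widespread micro-changes
```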

Despite all these technological advances, the systems mentioned earlier still face the limitations described by Dahiya et al. (2010). We propose the use of a ridged skin made of compliant materials to encode slip and velocity information in frequency patterns. In Section 8.3.1, we introduce the mechanisms behind the construction of the artificial skin. In Sections 8.3.2 and 8.3.3, we present the artificial skin's behaviour as a force transducer and as a slippage detector, respectively.

8.3.1 Artificial Skin Construction

To develop an artificial skin that can be used in prosthetic applications, we need to consider several restrictions in addition to the usual limitations of artificial sensor systems for robotics. One way to reduce the complexity of the sensor system and, at the same time, improve its robustness is to exploit material properties and morphology to encode information. To accomplish this, we resort to mechanical structures that are able to encode information about the slippage of an object (Damian et al., 2010). On the basis of studies about the role of fingerprints in encoding tactile information (Johansson & Westling, 1984), we developed an artificial skin with ridges. Figure 8.3 shows the components of the artificial skin.

The ridges are made of silicone, a compliant material that can be compressed and deformed. When an object moves over the ridged surface, the ridges cause the normal force applied by the object on the skin to fluctuate. These changes in the normal force can be captured using a pressure sensor (we used a force-sensing resistor [FSR] from Interlink). The changes in the normal force registered by the pressure sensor produce patterns in the frequency spectrum that can be used to recognise the speed of the object sliding over the artificial skin.

8.3.2 Artificial Skin as a Force Transducer

For a static characterisation of force, we built a set of ridged artificial skins in which the distance between two consecutive ridges varied discretely from 2.5 to 4 mm. The ridge densities are designated by the interridge distance (Drr). To measure the normal force detected by the sensor, we increased the weight of the object applied over it in 100 g increments. A flat skin (without ridges) was also used for reference (Drr = 0.0 mm).

[Figure 8.3 comprises three panels, (a)–(c), showing the silicone ridges (interridge distance Drr, ridge side L), the self-adhesive fabric backing, and the FSR sensor.]

Figure 8.3 Artificial ridged skin. The figure shows a sample of the silicone-ridged skin and the standard force-sensing resistor (FSR) sensor, and illustrates the process of construction. The ridged shapes of the skin were obtained by solidifying the silicone in an ABS plastic ridged mask that was built using rapid prototyping. The transverse sectional shape of each silicone ridge is an equilateral triangle with side L = 2.5 mm. The thickness of the pad on which the ridges lie is 1 mm. The FSR sensor size is 4 × 4 cm, with a force sensitivity range from 100 g to 10 kg.


Four trials were performed for each skin and each weight. We acquired the voltage produced by the pressure sensor using a data acquisition (DAQ) system (NI USB-6289) with a sampling rate of 1 kHz (Figure 8.4).

The Drr affects the force measurement because the contact surface distributes the force over the number of ridges supporting the object. This implies increased force readings for large Drr values and decreased readings when the ridge density is high. These tendencies can be seen in Figure 8.4.
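For illustration, a static calibration like the one shown in Figure 8.4 can be inverted to estimate the applied load from the FSR voltage. The sketch below (Python with NumPy) interpolates over a hypothetical voltage-versus-weight table for one interridge distance; the numeric values are invented placeholders, not the measured data.

```python
import numpy as np

# Hypothetical calibration table for one skin patch (e.g. Drr = 4.0 mm):
# weights in grams and the mean FSR output voltage from the static trials.
# These numbers are placeholders, not the values reported in Figure 8.4.
calib_weight_g = np.array([100, 200, 300, 400, 500, 600, 700, 800, 900, 1000])
calib_voltage_v = np.array([0.25, 0.45, 0.62, 0.78, 0.92, 1.05, 1.16, 1.26, 1.35, 1.43])

def estimate_weight(voltage_v: float) -> float:
    """Invert the calibration curve: FSR voltage -> estimated weight (g).

    Uses piecewise-linear interpolation; voltages outside the calibrated
    range are clamped to the nearest endpoint.
    """
    return float(np.interp(voltage_v, calib_voltage_v, calib_weight_g))

print(estimate_weight(0.85))  # roughly 450 g with this placeholder table
```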

Figure 8.4 Voltage versus weight sensor response (FSR output voltage, in volts, plotted against the applied weight in units of 100 g for interridge distances Drr = 0.00, 2.50, 3.00, 3.50, 3.75, and 4.00 mm). The figure shows the voltage amplitude elicited by the skin patches when weights from 100 g up to 1 kg were placed on top. We can observe increased force values for large interridge distances and decreased force values when the ridge density is high. The results presented a maximum standard deviation of 0.13 V for the skin with Drr = 3.75 mm, followed by the skin with Drr = 3.0 mm with a standard deviation of 0.11 V. (Reproduced from Damian et al., Artificial ridged skin for slippage speed detection in prosthetic hand applications. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2010. With permission.)

8.3.3 Artificial Skin as a Slippage Detector

To test the slippage detection capabilities of the proposed skin sensor, we conducted experiments with a sliding object to quantitatively evaluate the slippage speed. We used six types of artificial skins and applied a total of three slippage velocities. We placed two FSR sensors underneath to cover the entire surface of the artificial skin. The results are invariant to the number of FSRs, so the skin remains effective even with a single FSR; however, no standard sensor matches the size of the particular skin we built. In a typical experiment, an object was made to slide horizontally across the artificial skin at a constant speed. The skins were fixed to exclude extraneous vibrations. The peak frequency was extracted from the spectrum of the data. For the ridged skins, the ridge patterns gave rise to meaningful peak frequencies in the spectrum. The flat artificial skin, in contrast, maintained a low amplitude over time, and the frequency it yielded was, for most trials, the smallest frequency in the spectrum, regardless of the slippage velocity.

Under the slippage conditions, the skin patch behaves like a signal generator whose frequency fs accounts for the slippage speed vo of the object and the distance Drr between two consecutive ridges. Given a constant velocity, this relation can be expressed as follows:

\[ f_s = \frac{1}{\Delta t} = \frac{v_o}{D_{rr}} \]

where Δt is the period between two consecutive peaks in the signal. By statistically averaging the measured velocity of the moving object across experimental trials, we were able to calculate the ideal frequency.
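As a minimal sketch of how this relation can be applied in software (Python with NumPy), the dominant frequency of the FSR signal recorded during slippage can be located with an FFT and converted into a speed estimate via v_o = f_s × Drr. The 1 kHz sampling rate and Drr = 4 mm correspond to the setup described above; the synthetic test signal is an assumption for illustration.

```python
import numpy as np

FS_HZ = 1000.0       # DAQ sampling rate used in the experiments (1 kHz)
DRR_MM = 4.0         # interridge distance of the skin patch (mm)

def slippage_speed_mm_s(fsr_signal: np.ndarray, fs_hz: float = FS_HZ,
                        drr_mm: float = DRR_MM) -> float:
    """Estimate the slippage speed from the FSR time series.

    The ridges turn sliding into a quasi-periodic force signal, so the peak
    of the spectrum approximates f_s = v_o / D_rr, hence v_o = f_s * D_rr.
    """
    x = fsr_signal - np.mean(fsr_signal)          # remove the static load offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs_hz)
    f_peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
    return f_peak * drr_mm

# Synthetic check: an object sliding at 10 mm/s over Drr = 4 mm ridges
# should produce a ~2.5 Hz ripple in the normal force.
t = np.arange(0, 2.0, 1.0 / FS_HZ)
signal = 0.8 + 0.05 * np.sin(2 * np.pi * 2.5 * t) + 0.01 * np.random.randn(t.size)
print(round(slippage_speed_mm_s(signal), 2), "mm/s")  # close to 10 mm/s
```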

The results depicted in Figure 8.5 show the peak frequencies extracted for the six skin types. They suggest that Drr is an important parameter for the quality of the frequency-encoded information. Among all skins, the one with Drr = 4.0 mm yielded clearly distinguishable peak frequencies for each velocity. The peak frequency was computed as an average over five experiments per skin per velocity. We hypothesise that the skin with the lowest ridge density was the most accurate because the forces applied over the sensor exhibit more periodicity, as shown by the mean frequency value, which was the closest to the ideal frequency given by the formula mentioned earlier.

[Figure 8.5 contains one time-series panel per skin type (Drr = 0.0, 2.5, 3.0, 3.5, 3.75, and 4.0 mm), each plotting the FSR amplitude in volts against time from 200 to 1400 ms.]

Figure 8.5 Sensor frequency response. The time series show the signature of the slippage signal with respect to the interridge distance parameter at a constant slippage velocity of 10 mm/s. (Reproduced from Damian et al., Artificial ridged skin for slippage speed detection in prosthetic hand applications. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2010. With permission.)


In this section, we explored the use of morphology and material properties to implement a slip sensor for prosthetic devices. Our results show the possibility of encoding the velocity of a slipping object in frequency patterns that can be translated into sensory information using sensory substitution systems. Following the same strategy, it should be possible to implement the other sensory modalities required to promote body awareness of the artificial hand. The next challenge for a prosthetics developer is to transmit the decoded sensory information as feedback to the amputee.

8.4 Sensory Feedback in Prosthetic Applications

Several sensory modalities are considered necessary for prosthetic devices; in this section, we focus on those required for upper limb prostheses. Functionally, there are three basic sensory states required to know the arm position and the status of the hand: the position of the hand, the position of the elbow, and the gripping force of the hand (Shannon, 1976). In addition to these sensory states, recent studies propose an extended set of necessary sensory modalities for prosthetic hands, including pressure, vibration, shear force, temperature, and proprioception (Kim et al., 2008). However, the state of the art has not yet fulfilled the basic requirements proposed by Shannon.
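As a simple way to make these requirements concrete, the sketch below (Python) groups Shannon's three basic sensory states together with the extended modalities listed by Kim et al. into one data structure; the field names, units, and defaults are our own illustrative choices, not a specification from either study.

```python
from dataclasses import dataclass, field
from typing import Tuple

@dataclass
class BasicSensoryState:
    """The three basic states identified by Shannon (1976)."""
    hand_position: Tuple[float, float, float]   # position of the hand (illustrative 3D coordinates, m)
    elbow_angle_deg: float                      # position of the elbow
    grip_force_n: float                         # gripping force of the hand

@dataclass
class ExtendedSensoryState:
    """Additional modalities proposed for prosthetic hands (Kim et al., 2008)."""
    basic: BasicSensoryState
    fingertip_pressure_kpa: float = 0.0
    vibration_hz: float = 0.0
    shear_force_n: float = 0.0
    temperature_c: float = 25.0
    joint_angles_deg: Tuple[float, ...] = field(default_factory=tuple)  # proprioception

state = ExtendedSensoryState(
    basic=BasicSensoryState(hand_position=(0.3, 0.1, 0.2),
                            elbow_angle_deg=90.0,
                            grip_force_n=5.0),
    fingertip_pressure_kpa=12.0,
)
print(state.basic.grip_force_n)
```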

Early attempts to provide multiple-signal feedback reported limited results (Szeto & Farrenkopf, 1992) due to the technical limitations of the time. After improvements that increased the reliability of prosthetic robot hands, the scientific community turned its attention again to providing sensory information to improve the control performance of artificial limbs (Kuiken et al., 2007). Recent studies (Ehrsson et al., 2008) look at the requirements for the incorporation of these artificial limbs into the brain's body representations.

The most common methods used to transmit meaningful information to the body of a person are vibration and pressure. Kaczmarek et al. (1991) wrote an extensive review of the technology of the time for sensory function substitution, exploring different methods to transmit information and comparing vibrotactile and electrotactile stimulation. Using a pair of electrodes, Szeto and Saunders (1982) looked for methods to maximise the amount of information transmitted to the body coded in frequency changes. To avoid interference in the information patterns caused by changes in the intensity of the signal, Szeto proposed the following formula to keep the intensity constant while changing the stimulation frequency:

\[ \log(PW) = 2.82 - 0.412 \times \log(PR) \]

where PW is the pulse width in microseconds and PR is the pulse rate in hertz.
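A minimal sketch of how this constant-intensity rule might be applied in code (Python; a base-10 logarithm and the example pulse-rate range are assumptions for illustration):

```python
import math

def pulse_width_us(pulse_rate_hz: float) -> float:
    """Pulse width (microseconds) that keeps perceived intensity roughly
    constant while the pulse rate changes, following Szeto and Saunders:
        log(PW) = 2.82 - 0.412 * log(PR)
    (base-10 logarithm assumed).
    """
    return 10 ** (2.82 - 0.412 * math.log10(pulse_rate_hz))

# Example: sweeping the pulse rate over an assumed usable range.
for pr in (10, 20, 50, 100):
    print(f"PR = {pr:3d} Hz -> PW = {pulse_width_us(pr):6.1f} us")
```

With this relation, raising the pulse rate (the information-carrying parameter) automatically shortens the pulse width, so the stimulation does not feel stronger as the frequency changes.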


From these studies, we learned that electrotactile stimulation provides more control over the range of sensations that can be created. Kajimoto et al. (2003) showed the possibility of producing selective sensations using electrical stimulation of the fingertips. However, it is still unclear whether this system will work in other areas of the body with a smaller density of nerve receptors. Seps et al. (2011) investigated the parameters necessary to promote proprioceptive feedback in the lower back, opening the door for other information channels.

Prosthetics can also benefit from work done in other areas, such as tactile displays for virtual reality. State-of-the-art devices exploit transduction methods such as static or vibrating pins, focused ultrasound, electrical stimulation, surface acoustic waves, and electrorheological materials (Chouvardas et al., 2007; Visell, 2009). Although promising, these methods focus on the stimulation of the high-density mechanoreceptors of the fingertips. For these technologies to be usable in prosthetic applications, they need to be adapted to areas with a lower density of mechanoreceptors.

Despite the several possible feedback mechanisms found in the literature, only a few have been tested directly with users. One of the most common implementations is the use of force feedback to control grasping force. The addition of a simple mechanism that changes the vibration frequency according to the thumb force (measured by a force sensor installed in the thumb of the prosthetic hand) was enough to reduce the force applied to hold objects by 37% without visual feedback (Pylatiuk et al., 2006). Tactile feedback has also been used to measure improvement in the performance of a grasping task (Cipriani et al., 2008). Cipriani et al. explored the collaboration between the prosthetic hand and the user, using different methods to achieve grasping. In addition to the control schemes, the study looked at the influence of tactile feedback on grasping. Although it did not find a statistically significant difference in task performance, the study reports a considerable increase in the subjective acceptance of the robot hand; in other words, the participants seemed more eager to use the robot hand for the grasping task when tactile feedback was present.
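As an illustration of this kind of feedback loop, the sketch below (Python) maps a thumb force reading to a vibration frequency for a small tactor; the force range, frequency band, and linear mapping are assumptions for illustration, not the mapping used by Pylatiuk et al.

```python
def grip_force_to_vibration_hz(force_n: float,
                               f_min_hz: float = 40.0,
                               f_max_hz: float = 250.0,
                               force_max_n: float = 20.0) -> float:
    """Map the thumb force sensor reading to a tactor vibration frequency.

    A linear mapping over an assumed 0-20 N range onto an assumed
    40-250 Hz band; values outside the range are clamped.
    """
    force_n = max(0.0, min(force_n, force_max_n))
    return f_min_hz + (f_max_hz - f_min_hz) * force_n / force_max_n

# Example control-loop step: read the thumb force, update the vibration feedback.
for f in (0.0, 5.0, 12.5, 25.0):
    print(f"grip force {f:5.1f} N -> vibration {grip_force_to_vibration_hz(f):6.1f} Hz")
```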

If we want to reduce amputees' dependence on visual feedback, in particular in reaching tasks, we need to find efficient methods to provide proprioceptive feedback. Proprioceptive feedback is related to tactile sensing, but it contains more global information about the hand's position in space. Proprioceptive feedback has proved useful in increasing performance in a targeting task under nonsighted conditions, and in some cases even in sighted conditions (Blank et al., 2010). In addition, auditory feedback has proved to be a good and reliable way to transmit proprioceptive information: González et al. (2010) showed similar performance in a reaching task using either visual or auditory feedback.
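One simple way to render such proprioceptive information audibly is to map a joint angle (or the distance to a target) onto the pitch of a tone, as sketched below in Python; the angle range, pitch range, and exponential mapping are illustrative assumptions, not the multichannel scheme of González et al.

```python
import math

def angle_to_pitch_hz(angle_deg: float,
                      angle_range=(0.0, 120.0),
                      pitch_range_hz=(220.0, 880.0)) -> float:
    """Map an elbow or wrist angle to an audio pitch (exponential, two octaves)."""
    lo, hi = angle_range
    a = min(max(angle_deg, lo), hi)          # clamp to the assumed joint range
    frac = (a - lo) / (hi - lo)
    p_lo, p_hi = pitch_range_hz
    return p_lo * math.exp(frac * math.log(p_hi / p_lo))

for ang in (0, 30, 60, 90, 120):
    print(f"{ang:3d} deg -> {angle_to_pitch_hz(ang):6.1f} Hz")
```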

In addition to the efforts just described, a promising new method has been proposed to improve the interaction between the amputee and the artificial limb. Kuiken et al. (2005) introduced a method, called targeted reinnervation, that improves the chances of acquiring meaningful data from noninvasive EMG acquisition.


This method consists of relocating nerves that were originally connected to the hand into the chest of the person, increasing the amount of information available from the body. In addition to the advantages for prosthetic control due to the increase in the number of available nerve signals, the remaining afferent nerves reinnervate the surrounding nerve endings and mechanoreceptors in the skin of the chest, providing an extra channel to transmit information to the body of the patient. Marasco et al. (2009, 2011) show how the reinnervated nerves can perceive several sensations, although these sensations have a lower recognition rate than normal ones.

The scientific community has again turned its interest to the search for effective methods to provide sensory feedback, thanks to improvements in microelectronics, robotics, and neuroscience. We described earlier several methods to provide different sensations, and presented the use of some of these methods to increase functional performance in goal-oriented tasks. Several challenges, such as efficiency and long-term implementation, still need to be met before we achieve truly biologically compatible devices, yet the future of these applications seems promising.

8.5 Conclusions

In this chapter, we looked at the requirements for including artificial limbs in the mental representation of the body, or body image. A multisensory, synchronous combination of stimuli is necessary to raise the body awareness of the prosthesis user. Neuroscience studies showed us the possibility of training our brains to integrate external objects into the mental representation of our bodies. If we provide a patient with simultaneous tactile, proprioceptive, and visual feedback, we enable his or her brain to accommodate the artificial limb in the body image. This will considerably reduce the cognitive burden experienced by the patient, improving his or her quality of life.

Even though a wide range of tactile sensors has been developed in robotics, these sensors are still not integrated into a system that could be implemented in prosthetic devices. In this chapter, we proposed exploiting material properties to encode information, thereby reducing the complexity of the implementation of a tactile sensor. We explored the coding of the velocity of a slipping object in frequency patterns through the use of a ridged skin. Our results demonstrated the possibility of encoding the slippage of an object in frequency, which can later be transmitted to the body through sensory substitution methods. A review of the state of the art in sensory feedback presented the best methods available to transmit the sensory feedback required to promote body awareness. Even though work on transmission methods remains to be done, the possibilities presented by the state of the art are very promising. In the not-so-distant future, they will improve the quality of life of disabled people.


References

Acharya, S., Tenore, F., Aggarwal, V., Etienne-Cummings, R., Schieber, M. H., and Thakor, N. V. (2008). Decoding individuated finger movements using volume-constrained neuronal ensembles in the M1 hand area. IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 16, pp. 15–23.

Adee, S. (February 2008). Dean Kamen’s ‘Luke Arm’ prosthesis readies for clinical trials. IEEE Spectrum Magazine. Available at http://spectrum.ieee.org/biomedical/bionics/dean-kamens-luke-arm-prosthesis-readies-for-clinical-trials (Last accessed September 7, 2012).

Antfolk, C., Cipriani, C., Controzzi, M., Carrozza, M. C., Lundborg, G., Rosén, B., and Sebelius, F. (2010). Using EMG for real-time prediction of joint angles to control a prosthetic hand equipped with a sensory feedback system. Journal of Medical and Biological Engineering, Vol. 30, No. 6, pp. 399–406, doi: 10.5405/jmbe.767.

Arieta, A. H., Kato, R., Yokoi, H., and Arai, T. (2006). A fMRI study of the cross-modal interaction in the brain with an adaptable EMG prosthetic hand with biofeedback. In Engineering in Medicine and Biology Society, 28th Annual International Conference of the IEEE, New York, NY, pp. 1280–1284, doi: 10.1109/IEMBS.2006.259938.

Beccai, L., Roccella, S., Ascari, L., Valdastri, P., Sieber, A., Carrozza, M. C., and Dario, P. (2008). Development and experimental analysis of a soft compliant tactile microsensor for anthropomorphic artificial hand. IEEE/ASME Transactions on Mechatronics, Vol. 13, No. 2, pp. 158–168.

Berlucchi, G. and Aglioti, S. (1997). The body in the brain: Neural bases of corporeal awareness. Trends in Neurosciences, Vol. 20, No. 12, pp. 560–564.

Biddiss, E. A. and Chau, T. T. (2007). Upper limb prosthesis use and abandonment: A survey of the last 25 years. Prosthetics and Orthotics International, Vol. 31, No. 3, pp. 236–257, doi: 10.1080/03093640600994581.

Binkofski, F., Kunesch, E., Classen, J., Seitz, R. J., and Freund, H. J. (2001). Tactile apraxia–unimodal apractic disorder of tactile object recognition associated with parietal lobe lesions. Brain, Vol. 124, pp. 132–144.

Blank, A., Okamura, A. M., and Kuchenbecker, K. J. (June 2010). Identifying the role of proprioception in upper-limb prosthesis control: Studies on targeted motion. ACM Transactions on Applied Perception, Vol. 7, No. 3, Article 15, doi: 10.1145/1773965.1773966. http://doi.acm.org/10.1145/1773965.1773966 (Last accessed September 7, 2012).

Bodegård, A., Ledberg, A., Geyer, S., Naito, E., Zilles, K., and Roland, P. E. (2000). Object shape differences reflected by somatosensory cortical activation in human. The Journal of Neuroscience, Vol. 20, No. RC51, pp. 1–5.

Cascio, C. J. and Sathian, K. (2001). Temporal cues contribute to tactile perception of roughness. The Journal of Neuroscience, Vol. 21, No. 14, pp. 5289–5296.

Castellini, C., Gruppioni, E., Davalli, A., and Sandini, G. (May-September 2009). Fine detection of grasp force and posture by amputees via surface electromyography. Journal of Physiology-Paris, Vol. 103, No. 3–5, pp. 255–262, Neurorobotics. ISSN 0928-4257, doi: 10.1016/j.jphysparis.2009.08.008.

Castellini, C. and van der Smagt, P. (2009). Surface EMG in advanced hand prosthetics. Biological Cybernetics, Vol. 100, No. 1, pp. 35–47, doi: 10.1007/s00422-008-0278-1.

Chouvardas, V., Miliou, A., and Hatalis, M. (2007). Tactile displays: Overview and recent advances. Displays, Vol. 29, No. 3, pp. 185–194.


Cipriani, C., Zaccone, F., Micera, S., and Carrozza, M. C. (2008). On the shared control of an EMG-controlled prosthetic hand: Analysis of user–prosthesis interaction. IEEE Transactions on Robotics, Vol. 24, No. 1, pp. 170–184.

Cotton, D. P. J., Cranny, A., White, N. M., and Chappell, P. H. (2007). A thick film piezoelectric slip sensor for a prosthetic hand. IEEE Sensors Journal, Vol. 7, No. 5, pp. 752–761.

Dahiya, R. S., Metta, G., Valle, M., and Sandini, G. (2010). Tactile sensing–from humans to humanoids. IEEE Transactions on Robotics, Vol. 26, No. 1, pp. 1–20.

Damian, D. D., Martinez, H., Dermitzakis, K., Hernandez-Arieta, A., and Pfeifer, R. (2010). Artificial ridged skin for slippage speed detection in prosthetic hand applications. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Taipei, Taiwan.

de Vignemont, F. (2010). Body schema and body image–pros and cons. Neuropsychologia, Vol. 48, No. 3, pp. 669–680.

Ehrsson, H. H., Rosén, B., Stockselius, A., Ragnö, C., Köhler, P., and Lundborg, G. (December 2008). Upper limb amputees can be induced to experience a rubber hand as their own. Brain, Vol. 131, pp. 3443–3452, doi: 10.1093/brain/awn297.

Gallagher, S. (2005). How the Body Shapes the Mind. Oxford University Press, New York.

González, J., Yu, W., and Hernandez-Arieta, A. (2010). Multichannel audio biofeedback for dynamical coupling between prosthetic hands and their users. Industrial Robot: An International Journal, Vol. 37, No. 2, pp. 148–156.

Harada, A., Nakakuki, T., Hikita, M., and Ishii, C. (2010). Robot finger design for myoelectric prosthetic hand and recognition of finger motions via surface EMG. In IEEE International Conference on Automation and Logistics (ICAL), Hong Kong, China, pp. 273–278. E-ISBN: 978-1-4244-8374-7, doi: 10.1109/ICAL.2010.5585294.

Hernandez-Arieta, A., Dermitzakis, C., Damian, D., Lungarella, M., and Pfeifer, R. (2008). Sensory-motor coupling in rehabilitation robotics. In Y. Takahashi, ed., Handbook of Service Robotics, pp. 21–36. I-Tech Education and Publishing, Vienna, Austria.

Howe, R. D. and Cutkosky, M. R. (1993). Dynamic tactile sensing: Perception of fine surface features with stress rate sensing. IEEE Transactions on Robotics and Automation, Vol. 9, No. 2, pp. 140–151.

Johanson, K. O. and Hsiao, S. S. (1992). Neural mechanisms of tactile form and texture perception. Annual Review of Neuroscience, Vol. 15, pp. 227–250.

Johansson, R. and Westling, G. (1984). Roles of glabrous skin receptors and sensorimotor memory in automatic control of precision grip when lifting rougher or more slippery objects. Experimental Brain Research, Vol. 56, No. 3, pp. 550–564.

Kaczmarek, K. A., Webster, J. G., Bach-y-Rita, P., and Tompkins, W. J. (1991). Electrotactile and vibrotactile displays for sensory substitution systems. IEEE Transactions on Biomedical Engineering, Vol. 38, No. 1, pp. 1–16.

Kajimoto, H., Inami, M., Kawakami, N., and Tachi, S. (2003). Smart touch–augmentation of skin sensation with electrocutaneous display. In Proceedings of the 11th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS’03), Los Angeles, CA, pp. 40–46.

Kato, R., Yokoi, H., Hernandez-Arieta, A., Yu, W., and Arai, T. (2009). Mutual adaptation among man and machine by using fMRI analysis. Robotics and Autonomous Systems, Vol. 57, No. 2, pp. 161–166. ISSN 0921-8890, doi: 10.1016/j.robot.2008.07.005.


Kim, K., Colgate, J. E., and Peshkin, M. A. (2008). On the design of a thermal display for upper extremity prosthetics. In Proceedings of the 2008 IEEE International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Reno, NV, pp. 413–419.

Kuiken, T. A., Dumanian, G. A., Lipschutz, R. D., Miller, L. A., and Stubblefield, K. A. (2005). Targeted muscle reinnervation for improved myoelectric prosthesis control. In Neural Engineering. Conference Proceedings. 2nd International IEEE EMBS Conference, pp. 396–399, 16–19, doi: 10.1109/CNE.2005.1419642.

Kuiken, T. A., Marasco, P. D., Lock, B. A., Harden, R. N., and Dewald, J. P. (2007). Redirection of cutaneous sensation from the hand to the chest skin of human amputees with targeted reinnervation. Proceedings of the National Academy of Sciences of the USA, Vol. 104, pp. 20061–20066.

Lee, M. H. (2000). Tactile sensing: New directions, new challenges. The International Journal of Robotic Research, Vol. 19, No. 7, pp. 636–643.

Lowe, R. J., Chappell, P. H., and Ahmad, S. A. (2010). Using accelerometers to analyse slip for prosthetic application. Measurement Science & Technology, Vol. 21, p. 035203, doi: 10.1088/0957-0233/21/3/035203.

Maier, S. and van der Smagt, P. (2008). Surface EMG suffices to classify the motion of each finger independently. In Proceedings of MOVIC 2008, 9th International Conference on Motion and Vibration Control, Technical University Munich, Garching, Deutschland, September 15–18.

Maldonado-Lopez, R., Vidal-Verdu, F., Linan, G., Roca, E., and Rodriguez-Vazquez, A. (2007). Early slip detection with a tactile sensor based on retina. Analog Integrated Circuits and Signal Processing, Vol. 53, pp. 97–108.

Marasco, P. D., Kim, K., Colgate, J. E., Peshkin, M. A., and Kuiken, T. A. (2011). Robotic touch shifts perception of embodiment to a prosthesis in targeted reinnervation amputees. Brain, Vol. 134, No. 3, pp. 747–758, doi: 10.1093/brain/awq361.

Marasco, P. D., Schultz, A. E., and Kuiken, T. A. (June 2009). Sensory capacity of reinnervated skin after redirection of amputated upper limb nerves to the chest. Brain, Vol. 132 (Pt 6), pp. 1441–1448.

Ohka, M., Mitsuya, Y., Higashioka, I., and Kabeshita, H. (2005). An experimental optical three-axis tactile sensor for micro-robots. Robotica, Vol. 23, No. 4, pp. 457–465.

Oskoei, M. A. and Hu, H. (October 2007). Myoelectric control systems—A survey. Biomedical Signal Processing and Control, Vol. 2, No. 4, pp. 275–294. ISSN 1746-8094, doi: 10.1016/j.bspc.2007.07.009.

Parker, P., Englehart, K., and Hudgins, B. (December 2006). Myoelectric signal processing for control of powered limb prostheses. Journal of Electromyography and Kinesiology, Vol. 16, No. 6, pp. 541–548. Special Section (pp. 541–610): 2006 ISEK Congress. ISSN 1050-6411, doi: 10.1016/j.jelekin.2006.08.006.

Pylatiuk, C., Kargov, A., and Schulz, S. (2006). Design and evaluation of a low-cost force feedback system for myoelectric prosthetic hands. Journal of Prosthetics and Orthotics, Vol. 18, pp. 57–61.

Ramachandran, V. S. and Blakeslee, S. (1998). Phantoms in the Brain: Probing the Mysteries of the Human Mind. William Morrow, New York.

Ramachandran, V. S. and Hirstein, W. (1998). The perception of phantom limbs. The D. O. Hebb lecture. Brain, Vol. 121 (Pt 9), pp. 1603–1630.

Reilly, K. T., Mercier, C., Schieber, M. H., and Sirigu, A. (2006). Persistent hand motor commands in the amputees' brain. Brain, Vol. 129, No. 8, pp. 2211–2223.


Rochat, P. (1998). Self-perception and action in infancy. Experimental Brain Research, Vol. 123, pp. 102–109.

Rosén, B., Ehrsson, H. H., Antfolk, C., Cipriani, C., Sebelius, F., and Lundborg, G. (2009). Referral of sensation to an advanced humanoid robotic hand prosthesis. Journal of Plastic Surgery and Hand Surgery, Vol. 43, No. 5, pp. 260–266.

Rossini, P., Micera, S., Benvenuto, A., Carpaneto, J., Cavallo, G., Citi, L., Cipriani, C., et al. (2010). Double nerve intraneural interface implant on a human amputee for robotic hand control. Clinical Neurophysiology: Official Journal of the International Federation of Clinical Neurophysiology. Vol. 121, No. 5, pp. 777–783, doi: 10.1016/j.clinph.2010.01.001.

Seps, M., Dermitzakis, K., and Hernandez-Arieta, A. (2011). Study on lower back electrotactile stimulation characteristics for prosthetic sensory feedback. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), San Francisco, CA, pp. 3454–3459. ISSN: 2153-0858, doi: 10.1109/IROS.2011.6095110.

Shannon, G. (1976). A comparison of alternative means of providing sensory feedback on upper limb prostheses. Medical and Biological Engineering and Computing, Vol. 14, pp. 289–294.

Smith, R. J., Huberdeau, D., Tenore, F., and Thakor, N. V. (3–6 September 2009). Real-time myoelectric decoding of individual finger movements for a virtual target task. Engineering in Medicine and Biology Society. EMBC 2009. Annual International Conference of the IEEE, pp. 2376–2379, doi: 10.1109/IEMBS.2009.5334981.

Srinivasan, M. A. and LaMotte, R. H. (1995). Tactual discrimination of softness. Journal of Neurophysiology, Vol. 73, pp. 88–101.

Szeto, A. Y. and Farrenkopf, G. R. (1992). Optimization of single electrode tactile codes. Annals of Biomedical Engineering, Vol. 20, No. 6, pp. 647–665.

Szeto, A. Y. and Saunders, F. A. (1982). Electrocutaneous stimulation for sensory communication in rehabilitation engineering. IEEE Transactions on Bio-Medical Engineering, Vol. 29, No. 4, pp. 300–308.

Tremblay, M. and Cutkosky, M. (1993). Estimating friction using incipient slip sensing during a manipulation task. In IEEE International Conference on Robotics and Automation, Tokyo, Japan, Vol. 1, pp. 429–434.

Visell, Y. (January 2009). Tactile sensory substitution: Models for enaction in HCI. Interacting with Computers, Vol. 21, No. 1–2, pp. 38–53. Special issue: Enactive Interfaces. ISSN 0953-5438, doi: 10.1016/j.intcom.2008.08.004.

Von Békésy, G. (1959). Similarities between hearing and skin senses. Psychological Review, Vol. 66, pp. 1–22.

Waller, G., and Barnes, J. (November 2002). Preconscious processing of body image cues: Impact on body percept and concept. Journal of Psychosomatic Research, Vol. 53, No. 5, pp. 1037–1041. ISSN 0022-3999, doi: 10.1016/S0022-3999(02)00492-0.

Yamada, D., Maeno, T., and Yamada, Y. (2002). Artificial finger skin having ridges and distributed tactile sensors used for grasp force control. Journal of Robotics and Mechatronics, Vol. 14, No. 2, pp. 140–146.