
Journal of Phonetics (1995) 23, 139-147

What can nonspeech tasks tell us about speech motor disabilities?

John W. Folkins, Jerald B. Moon, Erich S. Luschei, Donald A. Robin, Nancy Tye-Murray, and Kenneth L. Moll
University of Iowa, Iowa City, Iowa 52242, U.S.A.

Received 10 September 1993, and in revised form 27 September 1994

This paper considers the possible role of nonspeech tasks in the assessment of individuals with motor speech disorders. The difficulties in the definition and isolation of both speech and nonspeech tasks are discussed. A primary point is that an inability to control the movements of the speech structures may be separate from an inability or ability to use the processes that code meaning in the construction of linguistic messages. It may be possible to design nonspeech tasks that provide insight into an individual's ability or inability to control speech movements, but are separate from his or her ability to use language.

1. Introduction

This paper addresses two points: 1) there is a possible role for nonspeech tasks in the assessment and treatment of individuals with motor speech disorders, and 2) it is not known, a priori, that nonspeech tasks will provide valuable information in this regard, but it is worthwhile to ask the question and see if they will. These two points are developed as five topics are covered: concern for the definition of tasks, separation of motor processes from linguistic processes, separate assessment of speech motor subsystems, selection of tasks or variables to measure, and examples of nonspeech motor tasks.

2. Concern for the definition of tasks

How does one define a separate task? All behaviors can be thought of as embedded in larger, more global behaviors. Likewise, there are many different ways to divide speech behaviors into smaller, separate tasks. Tasks can be reduced further into subtasks. Sometimes the concept of "goal" has been used, such that goals constitute the successful completion of a task of any size. Others may prefer to avoid the term "goal" and to define tasks in terms of processes organizing emergent properties. Whatever the terminology, goals or organizing principles can be hypothesized to operate on many levels: e.g., motoric, phonological, syntactic, semantic, and pragmatic. Unfortunately, there is little agreement about what the goals or principles are on any of these levels.



It would be useful to be able to separate the goals or units organizing the operating principles of the speech motor system from the goals or principles on other levels. Many models of speech movement control assume that phonemes or some other linguistic units form the goals of speech motor control. However, it is not necessary to assume that there is any correspondence between linguistic units, including the phoneme, and control units of the speech motor system. In fact, Folkins and Bleile (1990) have suggested that borrowing linguistic units may impede our efforts to understand the organization of speech motor control.

An analogy may help to illustrate this point. When making a cake, the ingredients such as sugar, flour, and flavoring form units. When eating a cake, the units might be the piece, the bite, and the bolus. It would be unreasonable to expect to relate individual bites or other units of consumption to individual ingredients used to make the cake. Similarly, although the construction of linguistic messages precedes the speech movements used to transmit the message, the units that organize each level do not need to be related to each other.

This analogy can be taken a step further. If one were to investigate disorders related to eating cake, the units of analysis would be driven by the types of disorders expected. One might manipulate the ingredients of cakes to study eating disorders related to obesity. For chewing disorders, one would not manipulate ingredients of food directly unless they influenced characteristics of the bolus such as consistency or viscosity. One would be more likely to study jaw movements, biting forces, clenching forces, manipulation of objects in the mouth, etc.

3. Separating motoric processes from linguistic processes

Our perspective is that there are many speakers who have an inability to control the movements of the structures that produce speech. Such inabilities are separate from a speaker's abilities or inabilities to use the psycholinguistic processes that code meaning in the production of speech and language. Therefore, if we suspect motor deficits and we wish to measure them in any speaker, we need a way to assess motor deficits as separate and distinct from psycholinguistic deficits.

Nonspeech tasks can be designed to measure an individual's ability to reach any goals that an investigator would like to posit. Ultimately, it may be possible to find tasks that have goals that either mimic, or at least inform one about, the motor processes used in speech production. If one could define such goals, then one might also be able to assess the motor processes of speech and do so in a manner that is not dependent on, or intermixed with, an individual's psycholinguistic abilities. For example, it should be possible to use such procedures in aphasic subjects with psycholinguistic impairments.

4. Separate assessment of motor subsystems

Netsell and Rosenbek (1985) and many others have stressed the importance of evaluating the separate components of the speech motor system. Specifically, dysarthrias often influence each speech structure differentially, and an assessment of the severity and type of involvement for each structure is fundamental to understanding the impairment.


It has often been observed that, during speech, all of the structures work together. Many possible combinations of movements can be used for the same speech task, and thus a limitation in the movements of one structure can often be compensated for by movements of another structure. This is fortunate in that it helps the speaker to overcome many movement limitations (but, of course, not all limitations can be overcome all of the time). Furthermore, the interactions between structures are so extensive that it can be quite difficult to distinguish a structure's own motor limitations from compensatory interactions by other structures.

In contrast to speech tasks, nonspeech tasks can be designed to allow assessment of the motor control of each individual articulator. That is, subjects may be asked to move only one structure and one can measure how accurately it can be moved. In addition, tasks can be constructed in which two or more moving structures interact, thus allowing assessment of the coordination between structures.

During speech, the auditory and motor systems may interact extensively. Some speakers may have hearing impairments, including difficulties in auditory perception. Motor tasks can be designed with targets that are visual, thus allowing a measurement of motor impairments that is not contaminated by possible auditory disabilities.

Gary Weismer (personal communication, 1992) has pointed out that many motor tasks involve task-specific control strategies. Therefore, he argues that one cannot generalize from one task to another and, thus, one cannot use nonspeech motor tasks as a window into speech motor control processes. We agree that many aspects of motor control are task-specific (Young & Schmidt, 1991); however, speech motor tasks should be specific to the control processes of the speech motor system, not control processes borrowed from psycholinguistic analyses--a different level of analysis.

Although speech motor control may be task-specific, one can still design nonspeech tasks that mimic aspects of the speech tasks under study. This should match control processes and minimize the difficulty of generalizing across tasks. It is our position that some of the largest effects of task specificity may be within the different motoric tasks that fit within the broader perceptual goals of speech. As explained above, there are extensive interactions among speech structures; however, the extent to which these interactions depend on the motoric goals (or processes) specified by the tasks is not easily studied if one limits the evidence to speech tasks alone. The psycholinguistic goals get in the way and, furthermore, we cannot specify the motor goals. When using nonspeech tasks, one should eventually be able to measure the extent to which performance is influenced by hypothesized goals for components of the motor system.

5. What to measure

Tasks should be determined by the level (motoric, psycholinguistic, pragmatic, etc.) targeted for study. As explained above, motor tasks can be designed with goals that are nonlinguistic. These can be used to separate the motor and psycholinguistic abilities of an individual. Not only do nonspeech tasks allow for nonlinguistic goals to be manipulated, they also allow the investigator a great range of possible goals and great latitude in how goals are specified. Quantifying goals allows one to measure the distance between the goal and the subject's performance.


During speech, one could specify goals of good intelligibility, different speech rates, or changes in phonetic composition. However, the processes that organize the movements to produce these larger behaviors are not well understood (Folkins & Bleile, 1990), so there are no good ideas about what is most important to measure.

In nonspeech tasks, it is easier to specify the demands that one is placing on the motor system and to measure the subject's performance relative to these demands (Moon, Zebrowski, Robin, & Folkins, 1993). This is one of the most useful aspects of nonspeech tasks. Unfortunately, nonspeech tasks face a problem parallel to the one with speech tasks explained above: uncertainty concerning the goals of speech tasks leads to uncertainty about what to measure, and this uncertainty in speech leads, in turn, to uncertainty about which motor demands to study in nonspeech tasks. One does not know which nonspeech tasks might be the best for studying the integrity of the speech motor system.
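As a minimal illustration of measuring performance relative to a quantified demand (a sketch in Python with invented numbers, not the procedure of Moon et al., 1993), the distance between a specified displacement goal and a subject's measured output can be expressed as a root-mean-square error:

import numpy as np

def goal_distance(goal, performance):
    # Root-mean-square distance between a quantified goal trajectory and the
    # subject's measured performance, in the same units as the signals.
    goal = np.asarray(goal, dtype=float)
    performance = np.asarray(performance, dtype=float)
    return float(np.sqrt(np.mean((goal - performance) ** 2)))

# Hypothetical trial: a 10-mm displacement target held for 1 s, sampled at 100 Hz,
# versus a subject who undershoots by roughly 2 mm with some trial-to-trial noise.
goal = np.full(100, 10.0)
performance = goal - 2.0 + 0.5 * np.random.randn(100)
print(f"RMS distance from goal: {goal_distance(goal, performance):.1f} mm")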

The point is that nonspeech tasks are sometimes criticized because it is difficult to justify how they are clearly relevant to the control of speech movements (Weismer & Liss, 1991). This is not much different from speech tasks in which it is difficult to justify how the measurements are clearly relevant to the control of speech movements. One cannot be sure that the nonspeech tasks under study will tap speech-like processes nor that conclusions based on nonspeech tasks will generalize to the control of speech.

We do not know which nonspeech tasks will generalize to the control of speech until we test them. At this point, we are not sure that nonspeech tasks will eventually lead us to efficient ways of quantifying speech motor abilities. However, it is at least as important, if not more important, to explore nonspeech tasks than it is to look at speech tasks for this purpose.

6. Some examples

One salient characteristic of speech movements is the alternation between opening and closing gestures in which the peak velocity occurs roughly in the middle of the movements. Speech movements do not follow sinusoidal paths; however, it is reasonable to assume that if a subject cannot produce a roughly sinusoidal movement with a speech structure, there is a deficit in movement control that could affect speech movements as well. As shown by Moon et al. (1993), visual tracking of sinusoidal waveforms allows one to quantify skill as the distance between the target waveform and a subject's movements. The displacement and velocity can be assessed independently by manipulating amplitude and frequency.
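As a concrete sketch of this quantification (in Python, with an assumed sampling rate and simulated data; the Pearson correlation used here corresponds to the r values reported in Figs 1 and 2, but the sampling parameters and signals are invented for illustration):

import numpy as np

def tracking_accuracy(target, response):
    # Pearson correlation between the target waveform and the subject's
    # movement signal; r near 1.0 indicates close tracking.
    return float(np.corrcoef(target, response)[0, 1])

# Hypothetical 10-s trial sampled at 100 Hz with a 0.6 Hz sinusoidal target.
fs, duration, freq = 100, 10.0, 0.6
t = np.arange(0.0, duration, 1.0 / fs)
target = np.sin(2.0 * np.pi * freq * t)               # normalized target displacement
response = target + 0.2 * np.random.randn(t.size)     # simulated noisy lip displacement
print(f"{freq} Hz: r = {tracking_accuracy(target, response):.2f}")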

Fig. 1 shows data from a lower lip tracking task performed by a 29-year-old subject with conduction aphasia. Lower lip movement was measured with a strain gauge system and displayed on an oscilloscope placed in front of the subject, who was instructed to track a moving target with the signal controlled by his lip movement.

During speech, this subject made predominantly phonemic paraphasias (sound substitutions, e.g., "fen" for "pen"). He had poor language comprehension in both auditory and reading tasks, as well as impaired writing and naming. His speech rate was normal. His lesion was in the left inferior parietal lobe and superior temporal gyrus. It was our clinical impression that this subject did not have a significant motor deficit affecting the lips. This is supported by these examples of tracking data.


Figure 1. Visual tracking of lower lip movement performed by a subject with conduction aphasia, showing no significant deficit in the aspects of labial motor control assessed by this task. Correlations between target and lip movement: 0.3 Hz, r = 0.97; 0.6 Hz, r = 0.94; 0.9 Hz, r = 0.93.

The subject's lower lip movement correlates reasonably well with the target for all three frequencies shown. He performs like a normal speaker on this task.

Fig. 2 shows the same task performed by a speaker whom we would classify as having apraxia of speech. It is taken from Hageman, Robin, Moon, and Folkins (1994). This subject had normal language comprehension, writing, and naming; thus, there was no aphasia present. He had a moderate speech impairment with trial-and-error groping of the articulators during speech, error inconsistency, and more difficulty on fricatives and affricates than on other speech sounds (although he did occasionally err on bilabial plosives in the initial position of words).


Figure 2. Visual tracking of lower lip movement performed by a subject with the clinical profile of apraxia of speech. Performance on this task suggests a significant deficit in labial control. Correlations between target and lip movement: 0.3 Hz, r = 0.84; 0.6 Hz, r = 0.71; 0.9 Hz, r = 0.07.

In other words, he fit the clinical profile of an apraxic speaker as described by Darley, Aronson, and Brown (1975). This speaker's speech rate was three times slower than normal. His lesion was in the left frontal lobe, encompassing Broca's area.

This subject showed no weakness in the lower lip; however, as shown in this figure, the visual tracking tasks suggest that there is a motor deficit. The lower lip tracking is adequate at the slow rate, where there is a correlation coefficient of 0.84 between the target movement and lip movement. At the intermediate rate there is a slight reduction in accuracy; the correlation coefficient slips to 0.71. It is interesting to note that there does not appear to be a reduction in the range of displacements or velocities used.


The problem seems to be one of control. This becomes evident in the fast-rate condition, in which the correlation coefficient drops to 0.07. Even here, there is the suggestion that the subject was trying to perform the movements, but he simply could not control the lower lip at that combination of displacement and velocity.

Both the subject with aphasia and the subject with apraxia of speech made multiple speech sound errors. However, the patient with conduction aphasia did not show motor involvement on the tracking task. In contrast, the patient with apraxia showed a decrement in tracking that suggests a motor involvement.

The previous figures on sinusoidal tracking showed an attempt to use a nonspeech task to assess movement control within the range of displacements and velocities used in speech. Fig. 3 shows an example of an entirely different nonspeech task: blowing. This figure is adapted from Kuehn and Moon (1993). It shows 95% confidence intervals for normalized electromyographic levels across a series of speech and blowing tasks produced by a normal subject. The electrodes were placed to maximize the chance of sampling from the levator veli palatini muscle. Measures a-f show electromyographic activity during the word "say" in the carrier phrase "Say ____ again." Measures g-l represent electromyographic levels during production of [m], [mam], [mim], [sis], [sus], and [pa], respectively, from within the carrier phrase. Measures m-r represent electromyographic activity during the word "again" in each carrier phrase. Finally, measures s-z represent electromyographic activity during a blowing task designed to generate intraoral pressures ranging from 5 cm H2O to the maximum pressure the subject was able to produce. The tasks have been rank ordered from the minimum electromyographic activity to the maximum electromyographic activity. The nonspeech tasks were designed to push the levator activity to its limit.

Figure 3. 95% confidence intervals in electromyographic activity levels recorded from electrodes placed in the levator veli palatini for a series of speech and blowing tasks. The tasks have been rank ordered and show that the blowing tasks use more activity than the speech tasks do. This figure is adapted from Kuehn and Moon (1993).


Clearly, the blowing tasks result in more activity than the speech tasks do.

The point of Fig. 3 is to demonstrate another approach: using nonspeech tasks to assess the maximum range of some parameter (e.g., closure forces, air pressures, extent of movement) that an individual is capable of generating. This maximum may be beyond the range used in speech, even in impaired subjects. Subjects with a limited range may be able to perform many speech tasks under ideal conditions, but such speech tasks may be taxing their systems at levels near their limits. Alternatively, some subjects may not be pressing the system to its limits during speech, even though nonspeech tasks show that those limits are clearly greater than the levels used in speech performance. Thus, nonspeech tasks can help not only in assessing the motor system but also in identifying strategies that could be used to take advantage of existing capabilities and improve speech performance. Nonspeech tasks therefore have a potential role in prognosis as well as assessment.
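As a rough sketch of the kind of quantification summarized in Fig. 3 (in Python, with invented task names and trial values; this is not the analysis of Kuehn & Moon, 1993), one can normalize trial-by-trial electromyographic means to the largest observed activity, compute approximate 95% confidence intervals, and rank order the tasks:

import numpy as np

def normalize_emg(trial_means, reference_max):
    # Express mean rectified EMG per trial as a percentage of the largest
    # activity observed across all tasks (here, the strongest blowing trial).
    return 100.0 * np.asarray(trial_means, dtype=float) / reference_max

def ci95(values):
    # Approximate 95% confidence interval for the mean of repeated trials
    # (normal approximation; a t critical value would be stricter for small n).
    values = np.asarray(values, dtype=float)
    half_width = 1.96 * values.std(ddof=1) / np.sqrt(values.size)
    return values.mean() - half_width, values.mean() + half_width

# Hypothetical trial-by-trial mean EMG levels (arbitrary units) for a few tasks.
tasks = {
    "say (carrier phrase)": [32, 35, 30, 33],
    "[mam] (carrier phrase)": [41, 44, 40, 43],
    "blow, 5 cm H2O": [55, 58, 53, 57],
    "blow, maximum effort": [92, 97, 95, 99],
}
reference_max = max(max(trials) for trials in tasks.values())

# Rank tasks from minimum to maximum normalized activity, as in Fig. 3.
for name, trials in sorted(tasks.items(), key=lambda item: np.mean(item[1])):
    low, high = ci95(normalize_emg(trials, reference_max))
    print(f"{name:24s} 95% CI: {low:5.1f} to {high:5.1f} % of maximum")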

Our last example of the utility of nonspeech tasks comes from speakers who are deaf. In this example, instead of using nonspeech tasks to eliminate the role of linguistic factors from the assessment of motor performance, a nonspeech task was used to distinguish between linguistic and motor inabilities. Specifically, Tye-Murray & Folkins (1990) studied the stress pattern errors produced by adult speakers with no measurable hearing. Speakers who are deaf often produce atypical stress patterns. It was not known whether they do not know the correct stress pattern to produce or whether they do not have the motoric ability to produce the intended stress pattern accurately. Speakers who are deaf were asked to produce strings of syllables with a series of different stress patterns. The subjects were trained to produce the desired stress patterns, and their knowledge of the patterns was measured by having them tap out each pattern with an index finger. All subjects tapped the stress patterns in a manner that distinguished between stressed and unstressed syllables. The study found that when subjects who are deaf were trained to understand a desired stress pattern, they could speak it successfully. Thus, unlike many other aspects of speech motor control in speakers who are deaf (Tye-Murray, Folkins and Zimmermann, 1987; Tye-Murray, 1992), stress patterning errors in deaf speakers do not appear to be limited by motoric inabilities.

7. Conclusion

There are many studies that have attempted to use nonspeech tasks to assess the motor capabilities of individuals with impaired speech. There is not space to review more of them here. Weismer (personal communication, 1992) has argued that links between performance on nonspeech tasks and speech impairments have not been shown in the literature. We disagree, but that is not the issue. The most important point is that nonspeech tasks offer a number of advantages over speech tasks in the assessment of motor disorders that may influence speech abilities. We should continue to explore their potential as assessment tools. Eventually, the use of both speech tasks and nonspeech tasks may offer a more powerful approach than limiting analyses to performance during speech.


References

Darley, F. L., A. E. Aronson, & J. R. Brown (1975) Motor Speech Disorders, Philadelphia: W. B. Saunders.

Folkins, J. W. & K. M. Bleile (1990) Taxonomies in biology, phonetics, phonology, and speech motor control, Journal of Speech and Hearing Disorders, 55, 596-611.

Hageman, C., D. A. Robin, J. B. Moon, & J. W. Folkins (1994) Visuomotor tracking abilities of apraxic speakers, Clinical Aphasiology, 22, 219-229.

Kuehn, D. & J. B. Moon (1993) Levator veli palatini muscle activity in relation to intraoral air pressure changes. Annual Conference of the American Cleft Palate-Craniofacial Association, Pittsburgh, April 1993.

Moon, J. B., P. Zebrowski, D. A. Robin, & J. W. Folkins (1993) Visuomotor tracking ability of young adult speakers, Journal of Speech and Hearing Research, 36, 672-682.

Netsell, R. & J. Rosenbek (1985) Treating the dysarthrias. In Speech and Language Evaluation in Neurology: Adult Disorders (J. Darby, editor), pp. 363-392, Orlando, FL: Grune & Stratton.

Tye-Murray, N. (1992) Articulatory organization strategies and the roles of audition, Volta Review, 94, 243-259.

Tye-Murray, N. & J. W. Folkins (1990) Jaw and lip movements of deaf talkers while producing utterances with known stress patterns, Journal of the Acoustical Society of America, 87, 2675-2683.

Tye-Murray, N., G. N. Zimmermann, & J. W. Folkins (1987) Movement timing in deaf and hearing speakers: Comparison of phonetically heterogeneous syllable strings, Journal of Speech and Hearing Research, 30, 411-417.

Weismer, G. (1992) Personal communication, letter to John Folkins dated August 9, 1992.

Weismer, G. & J. M. Liss (1991) Acoustic/perceptual taxonomies of speech production deficits in motor speech disorders. In Dysarthria and Apraxia of Speech: Perspectives on Management (C. A. Moore, K. M. Yorkston, and D. R. Beukelman, editors), pp. 245-270, Baltimore, MD: Paul H. Brookes Publishing Co.

Young, D. E. & R. A. Schmidt (1991) Motor programs as units of movement control. In Making Them Move: Mechanics, Control, and Animation of Articulated Figures (N. I. Badler, B. A. Barsky, and D. Zeltzer, editors), pp. 129-155, San Mateo, CA: Morgan Kaufmann.