
Large, colorful, or noisy? Attribute- and modality-specific activations during retrieval of perceptual attribute knowledge

MARION L. KELLENBACH, MATTHEW BRETT, and KARALYN PATTERSON
MRC Cognition and Brain Sciences Unit, Cambridge, England

Cognitive, Affective, & Behavioral Neuroscience, 2001, 1 (3), 207-221

Positron emission tomography was used to investigate whether retrieval of perceptual knowledge from long-term memory activates unique cortical regions associated with the modality and/or attribute type retrieved. Knowledge about the typical color, size, and sound of common objects and animals was probed, in response to written words naming the objects. Relative to a nonsemantic control task, all the attribute judgments activated similar left temporal and frontal regions. Visual (color, size) knowledge selectively activated the right posterior inferior temporal (PIT) cortex, whereas sound judgments elicited selective activation in the left posterior superior temporal gyrus and the adjacent parietal cortex. All of the attribute judgments activated a left PIT region, but color retrieval generated more activation in this area. Size judgments activated the right medial parietal cortex. These results indicate that the retrieval of perceptual semantic information activates not only a general semantic network, but also cortical areas specialized for the modality and attribute type of the knowledge retrieved.

We thank both the staff at the Wolfson Brain Imaging Centre, particularly Nahal Mavaddat, Iona Kendall, Tim Donovan, Dylan Pritchard, and Gary Hawes, and the volunteers who underwent the scans. We are also grateful to Dr. Facundo Manes for defining the temporal lobe region of interest for us. Correspondence concerning this article should be addressed to M. L. Kellenbach, MRC Cognition and Brain Sciences Unit, 15 Chaucer Road, Cambridge CB2 2EF, England (e-mail: [email protected]).

Stored knowledge about object concepts is generally described as comprising both functional/associative information, such as what an object is used for and where it is found, and perceptual/sensory information, such as what the object looks like (e.g., color, size, shape), any sound it makes, its smell, taste, and texture, and so on. It has been further proposed that the cortical regions subserving different aspects of semantic representations might be located in or near the sensory or motor cortices through which the knowledge is acquired and experienced (e.g., Allport, 1985; Gainotti, Silveri, Daniele, & Giustoli, 1995; Pulvermüller, 1999). Although direct evidence for the contribution of separable neural systems to the representation and processing of different types of perceptual knowledge has been provided by a small number of neuropsychological studies, attempts to localize perceptual attribute knowledge by using neuroimaging techniques either have been restricted to a single type of visual attribute (color) or have examined a range of broadly defined visual attributes (e.g., [relative] size/shape judgments, feature/part identification). If the retrieval of different perceptual attributes engages nonidentical and sensorily relevant cortical regions, careful definition of the attribute(s) of interest and comparison across a variety of attributes is critical. The positron emission tomography (PET) study reported here was designed to investigate the distinctiveness of the neural substrates involved in various types of perceptual attribute knowledge, both within and between modalities.

Neuropsychological Data
Few neuropsychological studies have reported selective impairment or sparing of the knowledge of various perceptual attributes, and the insights provided by even these few studies are limited by a lack of systematicity in the range of attributes tested. Localization of the critical underlying neural structures has also been problematic, because most reported cases involve extensive and/or diffuse cortical damage. Nonetheless, the specific perceptual attribute dissociations reported in those few studies that have investigated a range of relevant attributes provide some support for fractionation of perceptual semantic attributes and their separate cortical localizations. Several studies have reported the relative sparing of knowledge pertaining to the perceptual attribute of size, in the context of impaired knowledge about other visual perceptual attributes (e.g., the color, overall shape, and parts of objects; Coltheart et al., 1998; Forde, Francis, Riddoch, Rumiati, & Humphreys, 1997; Sartori & Job, 1988; see also Sheridan & Humphreys, 1993).
This dissociation has led to suggestions that size information may be a nonperceptual semantic attribute, rather than a visual perceptual attribute (Sartori & Job, 1988; see also Coltheart et al., 1998), or may represent a "higher" level of visual representation that may be spared when more specific knowledge is impaired (Sartori & Job, 1988). Alternatively, size knowledge can be viewed as distinct from other visual object properties in that it is more spatially defined. In the absence of clear empirical evidence favoring any of these alternatives, the present study takes the last possibility as the working hypothesis to be investigated. To our knowledge, no patient has yet been reported with a selective impairment of size knowledge relative to other visual attributes. Of course, not all perceptual attributes are visual, and two neuropsychological studies have shown that auditory information may be spared in patients with impaired knowledge about visual attributes (excluding size; Coltheart et al., 1998; Sartori & Job, 1988). A selective impairment of auditory relative to visual knowledge has not yet been reported.

Although extremely limited, these neuropsychological data suggest that the visual attribute of color may be dissociable from both size (another visual attribute) and sound knowledge and provide support for the general proposal that knowledge about the perceptual attributes of objects involves a cortical network comprising functionally and neuroanatomically distinct regions.

Neuroimaging Data
Functional imaging has enabled investigation of the neural regions associated with different types of semantic knowledge in normal subjects. Only four neuroimaging studies to date, however, have directly compared the retrieval of nonperceptual and (visual) perceptual knowledge (Cappa, Perani, Schnur, Tettamanti, & Fazio, 1998; Martin, Haxby, Lalonde, Wiggs, & Ungerleider, 1995; Mummery, Patterson, Hodges, & Price, 1998; Thompson-Schill, Aguirre, D'Esposito, & Farah, 1999). Although all four studies have reported increased activation in the temporal lobe (as well as a number of frontal and parietal regions) for visual perceptual knowledge, relative to functional attributes and nonsemantic control tasks, various regions of the temporal lobe were implicated. Both the Cappa et al. and the Thompson-Schill et al. studies reported activation of (somewhat different) regions of the left or bilateral posterior inferior temporal (PIT) cortex, but Mummery et al. (1998) found that the left anteromedial temporal cortex was uniquely associated with the retrieval of visual semantic knowledge (color). These inconsistencies can perhaps be explained by the variety of tasks utilized and the diverse nature of what constituted a visual perceptual attribute across these studies.

A small number of functional imaging studies have specifically investigated the neural substrates supporting color knowledge, with rather more consistent findings. In three overlapping studies, Martin and colleagues (Chao & Martin, 1999; Martin et al., 1995; Wiggs, Weisberg, & Martin, 1999) compared object naming with object color naming of achromatic line drawings and observed increased rCBF associated with color naming in the left lateralized (bilateral in Martin et al., 1995) PIT cortex, including areas of the parahippocampal gyrus, the fusiform gyrus, and/or the inferior temporal gyrus. The retrieval of color attribute knowledge in these studies also consistently activated left inferior and superior parietal regions and numerous frontal areas (most notably, the left middle and inferior frontal gyri), although these latter activations have been observed in a large range of semantic retrieval tasks and are, therefore, not unique to color knowledge (e.g., Frith, Friston, Liddle, & Frackowiak, 1991; Martin et al., 1995; Mummery et al., 1998; Mummery, Patterson, Hodges, & Wise, 1996; Peterson, Fox, Posner, Mintun, & Raichle, 1988; Wise et al., 1991). Using a very different approach, Paulesu and colleagues (Paulesu et al., 1995) investigated the cortical regions engaged when people with word–color synesthesia experienced colors in response to hearing words and also reported a left PIT activation.

Vandenberghe and colleagues (Vandenberghe, Price, Wise, Josephs, & Frackowiak, 1996) compared knowledge of semantic association and object size in a matching task using triads of pictures or words and found no cortical areas uniquely activated during the processing of size information. This (null) result at least does not contradict the hypothesis that size information may be more similar to verbal/semantic knowledge than to visual semantic knowledge, as has been suggested on the basis of the neuropsychological data, but does not directly address the question of how the neural substrates of size knowledge compare with those of other perceptual attributes.

Given the sparse functional imaging data on the cortical representation of visual perceptual knowledge, it is perhaps not surprising that perceptual attributes in other modalities, such as auditory and olfactory knowledge, remain unexplored. The functional imaging evidence for involvement of the ventral visual processing route associated with tasks requiring the retrieval of color knowledge suggests that other types of perceptual attribute information may also engage processing regions specialized for modality and/or attribute type. Evidence tangentially supportive of this hypothesis comes from the results of two recent studies of episodic memory for stimuli consisting of paired visual and auditory components (Nyberg, Habib, McIntosh, & Tulving, 2000; Wheeler, Peterson, & Buckner, 2000). Secondary auditory cortex was activated during the recall of sounds that were cued by a verbal or visual cue with which the sound was paired at encoding, and secondary visual processing areas were activated by the recall of pictures. Nyberg et al. also reported activation of auditory association cortex in response to words that had earlier been paired with a sound, even when the retrieval task was a simple old/new discrimination, during which the recall of the paired sound was incidental.
In these studies, the tasks required retrieval of episodic information from the time of encoding, so the results can only be suggestive with regard to the issue addressed here: whether modality-specific activations apply to the retrieval of perceptual attribute knowledge from long-term semantic memory.

The Present Study
The PET study reported here was designed to investigate the modality and attribute specificity of the PIT activations previously reported to be associated with color knowledge and to determine whether retrieval of other types of perceptual knowledge similarly activates unique regions of the cortex. If no differences were found between the activations elicited by the retrieval of a range of perceptual attributes, this would be consistent with all semantic knowledge residing in a single shared network. On the other hand, if differential activations were associated with retrieving the various attributes, this would support the hypothesis proposed here: that different types of perceptual knowledge recruit additional areas of the sensorily relevant cortex in a distributed semantic-processing network. Written words referring to objects were used, and the perceptual attributes examined were object color, size, and sound. These particular attributes were chosen because neuropsychological evidence suggests that both size and sound knowledge are dissociable from color knowledge and because this range of attributes allows the investigation of the neural correlates of perceptual attribute knowledge both within (visual: color and size) and between (visual and auditory) modalities.

Several specific predictions were made. First, given the consistent results reported by Martin and colleagues (Chao & Martin, 1999; Martin et al., 1995; Wiggs et al., 1999) and the role of ventral cortical areas in the processing of nonspatial visual object properties (e.g., Corbetta, Miezin, Dobmeyer, Shulman, & Peterson, 1991), retrieval of color knowledge was expected to activate PIT regions, with a left hemisphere predominance. Second, on the basis that size judgments are likely to require a degree of spatial processing, we predicted that retrieval of size knowledge would engage dorsal parietal regions comprising part of the dorsal visual processing system (e.g., Corbetta et al., 1991). Activations in line with these predictions would indicate attribute-specific cortical activation during semantic retrieval. It was less clear whether to expect ventral, as well as dorsal, activation in response to size judgments. Third, we predicted that retrieval of knowledge about an object's sound would selectively engage cortical regions responsive to auditory processing located in or near posterior regions of auditory association cortex (posterior superior temporal gyrus, temporoparietal junction). Such activation would constitute an example of modality-specific cortical activation associated with semantic retrieval.

METHOD

Subjects
Ten males (mean age, 28.3 years; range, 22–43 years), who were right-handed, had English as their first language, and had normal or corrected-to-normal vision, were paid for their participation. All were comprehensively screened for medical and other exclusion criteria and provided informed consent prior to scanning.

Design and Materials
Three experimental conditions were designed to require selective access to color, size, and sound knowledge about familiar objects. Yes/no questions were used to probe each of these types of knowledge selectively (see Table 1). The questions were further clarified for the subjects in the following manner: Colored was defined as not black and/or white; for the sound question, either the object could make noise spontaneously (e.g., a dog barking) or the noise could be associated with use of the object (e.g., a drill); for the size condition, the objects were either obviously large (e.g., a bus) or obviously small (e.g., a coin), with no medium-sized objects included (e.g., a chair). No reference object was suggested for making the size judgments. The subjects' understanding of the tasks was consolidated by using examples and through practice runs prior to the commencement of scanning.

Table 1
Question Posed for Each Experimental Condition, With Examples of Objects Designed to Elicit Positive and Negative Responses to These Questions

Attribute Probed   Question                  "Yes" Response Object   "No" Response Object
Color              Is it colored?            banana                  skunk
Sound              Does it make a noise?     drill                   pillow
Size               Is it small?              coin                    bus

Table 2
Mean Familiarity (on a 5-Point Scale) and Word Length in Letters (With Standard Deviations) for Each Attribute Condition, Split by Expected Positive and Negative Responses

            Positive Response              Negative Response              Total
            Familiarity   Word Length      Familiarity   Word Length      Familiarity   Word Length
Condition   M     SD      M     SD         M     SD      M     SD         M     SD      M     SD
Color       2.2   1.1     8.0   2.7        1.9   1.0     7.3   2.6        2.1   1.1     7.0   2.7
Sound       2.1   1.0     7.0   2.8        2.0   1.0     6.8   2.1        2.0   1.0     7.1   2.3
Size        2.3   1.1     6.3   2.0        1.8   1.0     6.3   1.9        2.0   1.1     6.3   2.1


Seventy-two nouns were selected for each of the experimental conditions, with 36 chosen to elicit a positive response and 36 to elicit a negative response. Of these, half were from natural object categories, and half were from man-made object categories, with a similar number of tools included in each semantic condition. The nouns were matched for mean word length (number of letters) and concept familiarity (see below) across conditions and response type (Table 2). Although the use of unique stimuli in each condition leaves open the possibility that any observed effects could be interpreted as stimulus specific, rather than condition specific, this possibility is minimized by the stimuli being well matched across conditions. Furthermore, by avoiding stimulus repetitions, the present design precludes contamination across conditions that could arise if an attribute that had been the focus of selective retrieval were "primed" in a subsequent condition in which that attribute is irrelevant for the same stimulus.

The 216 (72 × 3) experimental stimuli were selected from a set of 522 concepts, for which familiarity ratings had previously been obtained from 36 paid volunteers. Familiarity was rated on a 5-point scale, and following Snodgrass and Vanderwart (1980), volunteers were asked to rate "the degree to which you come into contact with or think about each of the objects referred to by the words." The volunteers were reminded to rate their familiarity with the object, rather than with the word itself, and were given three examples with a range of responses. A correlation was performed on the ratings for those items for which Snodgrass and Vanderwart familiarity ratings were also available, yielding a robust correlation coefficient of .87.
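As a concrete illustration of that reliability check, the sketch below correlates two sets of item-level familiarity means; the rating values are placeholders chosen only for the example, not the study's data.

import numpy as np

# Placeholder item-level familiarity means on a 5-point scale. The study
# correlated its own ratings with the Snodgrass and Vanderwart (1980) norms
# for the overlapping items and obtained r = .87.
our_ratings = np.array([2.1, 3.4, 1.8, 4.0, 2.9, 1.2])
published_norms = np.array([2.3, 3.6, 1.5, 4.2, 3.1, 1.4])

r = np.corrcoef(our_ratings, published_norms)[0, 1]
print(f"Pearson r = {r:.2f}")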

A control condition was also constructed that was visually similar to the experimental conditions (consisting of uppercase letters), although having no semantic content. This condition comprised consonant letter strings (a single consonant per string), which were matched to the experimental conditions on mean number of letters. Half of the letter strings contained an "X" (e.g., VVVXV), whereas the other half did not. The position of the "X" in the letter string, when present, was varied randomly, with the exclusion of the first and last positions. For this condition, the question posed was "Does it contain an X?"
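The control stimulus lists themselves are not reproduced in the article, but a minimal sketch of how such strings could be generated (with string lengths assumed for the example) is shown below.

import random
import string

# Consonants available for the repeated letter; X is reserved for the target.
CONSONANTS = [c for c in string.ascii_uppercase if c not in "AEIOUX"]

def make_control_string(length, contains_x, rng=random):
    """Build a string of one repeated consonant, optionally replacing one
    internal (non-first, non-last) position with 'X', e.g. 'VVXVV'."""
    letters = [rng.choice(CONSONANTS)] * length
    if contains_x:
        letters[rng.randrange(1, length - 1)] = "X"
    return "".join(letters)

print(make_control_string(5, contains_x=True))   # e.g. 'VVXVV'
print(make_control_string(7, contains_x=False))  # e.g. 'KKKKKKK'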

Procedure
The subjects were instructed in and given examples of the tasks before being positioned in the PET scanner. Once in the scanner, they were also given a short practice for each condition to familiarize them with both the tasks and the scanner setup. The stimuli were presented on a monitor suspended over the scanner, which was 100 cm from the subject's eyes and positioned so that he had an unobstructed view. Behavioral responses were recorded via a two-button mouse, and responses were made with the middle (no) and index (yes) fingers of the left hand. Speed and accuracy of responses were given equal weighting in the instructions.

Each subject was scanned 12 times, with three scans for each of the four conditions and a different set of 24 stimuli presented during each condition repetition. The scan order of the conditions was counterbalanced across subjects, while within each scan the order of the stimuli was randomized for each subject.
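A rough sketch of this run structure is given below; the per-subject shuffle shown is simple randomization rather than the formal counterbalancing scheme actually used, which the article does not specify.

import random

CONDITIONS = ["color", "size", "sound", "control"]

def scan_order_for_subject(rng):
    # 12 scans = 3 repetitions of each condition; order varies across subjects.
    scans = CONDITIONS * 3
    rng.shuffle(scans)
    return scans

def stimulus_order_for_scan(items, rng):
    # Each scan presents its own set of 24 stimuli in a fresh random order.
    return rng.sample(items, k=len(items))

rng = random.Random(1)
print(scan_order_for_subject(rng))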

Prior to image acquisition, condition and response instructions were displayed for 7 sec, and 4 unique lead-in items appropriate to the condition were presented. Onset of the 24 experimental stimuli coincided with the onset of image acquisition. Stimulus words were presented for 1,800 msec each, with an interstimulus interval of 200 msec. After the presentation of the experimental stimuli (48 sec), the subjects viewed a fixation point (colon) for the remainder of the scanning period (42 sec). This technique of switching from the task of interest shortly after the count rate reaches its peak has been used to improve the signal-to-noise ratio by reducing isotope washout from activated regions (e.g., Cherry, Woods, Doshi, Banerjee, & Mazziotta, 1995; see also Gerlach, Law, Gade, & Paulson, 1999; Kanwisher, Woods, Iacoboni, & Mazziotta, 1997; Simons, Graham, Owen, Patterson, & Hodges, 2001).

PET Scanning and Data Analysis
A GE Advance scanner was used to obtain 12 scans for each subject, each comprising 35 image slices at an intrinsic resolution of approximately 4.0 × 5.0 × 4.5 mm. Each subject received twelve 20-sec intravenous bolus injections of 300 MBq ml⁻¹ H₂¹⁵O, at a flow rate of 10 ml min⁻¹, through a forearm cannula. This method results in images of rCBF integrated over a 90-sec period from the time the tracer enters cerebral circulation. Head movements were restricted through the use of foam padding.

We used SPM99 software for image analysis (Wellcome Department of Cognitive Neurology, London). The 12 scans for each subject were first realigned, using the first scan as a reference. Inspection of the realignment parameters indicated that none of the 10 subjects had head movements exceeding 8 mm. For each subject, we spatially normalized the mean of the PET images to a template matching the Montreal Neurological Institute (MNI) standard template. Each of the individual realigned PET images was then resliced, using the same transformations, to give 12 scans per subject matched to the MNI brain. These images were smoothed with a 16-mm FWHM isotropic Gaussian kernel to increase the signal-to-noise ratio. As is standard for SPM99, the effect of global signal in each scan was removed with subject-specific global signal covariates in the statistical analysis. Task-related differences in rCBF were estimated for each voxel, using SPM99, with reaction time (RT) entered as a covariate of interest; time (scan order) and head movement (three planes of rotation, three dimensions of translation) were set as confounding factors in the analysis (Brett, Bloomfield, Brooks, Stein, & Grasby, 1999). Peak activations outside specified regions of interest (ROIs; see below) were thresholded with the conservative criterion of p < .05 corrected for multiple comparisons across the entire brain volume. In addition, in line with our a priori hypotheses, significance thresholds for a number of brain regions were adjusted, using two ROIs. An experienced neurologist blind to the PET data defined a bilateral ROI encompassing the inferior and middle temporal gyri and the ventral surface of the temporal lobe, using MRIcro (Rorden & Brett, 2001). This ROI corresponds to our hypotheses regarding specific PIT activation in association with visual knowledge. We used a second ROI to look at differences in PIT activation between the various semantic conditions. We defined this ROI by taking a sphere of 10-mm radius around the voxel with the maximum activation in this area when the three semantic conditions were contrasted with the control condition.1 This voxel had coordinates -58, -45, -19 (x, y, and z in millimeters; see the top panel of Figure 1). Finally, when the condition of interest concerned the object's sound, although no clearly relevant data yet exist to define a precise ROI, we report activations in the posterior temporal auditory association areas and the temporoparietal junction at a lower threshold, since we had hypothesized a priori that this auditory processing region might be involved in the retrieval of sound knowledge.
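For illustration, a sketch of how a 10-mm spherical ROI around that peak could be built on a voxel grid is given below; the 2-mm MNI-space grid geometry is an assumption made only for the example and is not a detail taken from the analysis.

import numpy as np

VOXEL_SIZE = 2.0                                   # mm, assumed grid resolution
GRID_ORIGIN = np.array([-90.0, -126.0, -72.0])     # mm, assumed corner of the grid
GRID_SHAPE = (91, 109, 91)

def sphere_roi(center_mm, radius_mm=10.0):
    """Boolean mask of voxels whose centers lie within radius_mm of center_mm."""
    i, j, k = np.indices(GRID_SHAPE)
    coords = np.stack([i, j, k], axis=-1) * VOXEL_SIZE + GRID_ORIGIN
    dist = np.linalg.norm(coords - np.asarray(center_mm), axis=-1)
    return dist <= radius_mm

# ROI around the left posterior inferior temporal peak reported in the text.
mask = sphere_roi([-58.0, -45.0, -19.0])
print(mask.sum(), "voxels in the ROI")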

We have presented results as SPM maximum intensity projections, tables giving coordinates and statistics, and brain slices through selected regions. The SPM maximum intensity projections (see Figure 1), or "glass brain" views, have white pixels where any voxel in a line perpendicular to the plane of the page, passing through that pixel location, has a t statistic above threshold. The intensity of the image at that point is relative to the maximum t statistic on that line. Thus, for a lateral projection, a white pixel could reflect a high t statistic in the left or right temporal lobe or intervening structures. The tables give results abstracted from the SPM results output. Coordinates are in terms of the MNI brain, to which the data have been spatially normalized. To estimate the Brodmann areas of each activation focus, we have used the Talairach atlas (Talairach & Tournoux, 1988). The brain in this atlas is not the same shape or size as the MNI brain, so we have used a simple transform of the MNI coordinates (http://www.mrc-cbu.cam.ac.uk/Imaging/mnispace.html) to estimate their locations in the Talairach atlas.
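The cited web page describes a simple two-piece affine approximation from MNI to Talairach space; the sketch below implements that mapping as it is commonly circulated, so the coefficients should be treated as an assumption rather than as values quoted from the article.

import numpy as np

# Coefficients commonly attributed to the "mni2tal" approximation.
ABOVE_AC = np.array([[0.9900,  0.0000, 0.0000],
                     [0.0000,  0.9688, 0.0460],
                     [0.0000, -0.0485, 0.9189]])
BELOW_AC = np.array([[0.9900,  0.0000, 0.0000],
                     [0.0000,  0.9688, 0.0420],
                     [0.0000, -0.0485, 0.8390]])

def mni2tal(xyz):
    """Map an MNI coordinate (mm) to an approximate Talairach coordinate."""
    xyz = np.asarray(xyz, dtype=float)
    matrix = ABOVE_AC if xyz[2] >= 0 else BELOW_AC
    return matrix @ xyz

# Example: the left posterior inferior temporal peak used for the spherical ROI.
print(mni2tal([-58.0, -45.0, -19.0]))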


Last, Figures 2 and 3 use color intensity overlaid on the MNI brain template to show activation in various key regions. The color intensity represents the size of effect at each voxel, in terms of estimated ml/min/dl blood flow change. The outlined areas show where this change is significant at the given threshold and, therefore, where the t statistic (which is the size of the effect divided by an estimate of its standard error) is above the threshold given by random field theory (Worsley et al., 1996).

Figure 1. SPM maximum intensity projections ("glass brain" views) showing regions of increased blood flow for the main comparisons of interest. The figures are thresholded to t values ≥ 3.94, which is the maximum t value corresponding to the inferior temporal lobe region of interest (ROI). *Color retrieval showed increased blood flow relative to both size and sound retrieval conditions in the left posterior inferior temporal (PIT) lobe when a more specific ROI was applied (Tables 5 and 6, Figure 3A).

[Figure 1 panels, top to bottom: All Semantic (Color + Size + Sound) minus Control; Visual (Color + Size) minus Sound; Sound minus Visual (Color + Size); *Color minus Size; Size minus Color. Columns: sagittal, coronal, and transverse projections.]


Figure 2. Differential activation when visual (color and size) judgments were contrasted with sound judgments (red) and when sound judgments were contrasted with visual (color and size) judgments (blue). Color intensity is overlaid on the MNI brain template to show activation in various key regions. The color intensity represents the effect size at each voxel, in terms of estimated ml/min/dl blood flow change. The outlined areas show where this change is significant at the given threshold and, therefore, where the t statistic (which is the size of the effect divided by an estimate of its standard error) is above the threshold given by random field theory (Worsley et al., 1996). (A) Top: The outlined region indicates activation at p < .05 corrected for the inferior temporal lobe region of interest (ROI): the left posterior inferior temporal lobe was activated more strongly by visual judgments than by sound judgments. Bottom: parameter estimate plot for each condition at the local maxima corresponding to the outlined region. (B) Top: The outlined region indicates activation at p < .05 corrected for the entire brain volume: the right superior parietal cortex was activated more strongly by visual judgments than by sound judgments. Bottom: parameter estimate plot for each condition at the local maxima corresponding to the outlined region. (C) Top: The arrow indicates the region in the temporoparietal ROI that was activated more strongly by sound judgments than by visual judgments. Bottom: parameter estimate plot for each condition at the local maxima corresponding to the indicated region.


Figure 3. Mean differential activation when color judgments were contrasted with size judgments (red) and when size judgments were contrasted with color judgments (blue). Color intensity is overlaid on the MNI (Montreal Neurological Institute) brain template to show activation in various key regions. The color intensity represents the effect size at each voxel, in terms of estimated ml/min/dl blood flow change. The outlined areas show where this change is significant at the given threshold and, therefore, where the t statistic (which is the size of the effect divided by an estimate of its standard error) is above the threshold given by random field theory (Worsley et al., 1996). (A) Top: The arrow indicates the region in the 10-mm radius spherical region of interest of the left posterior inferior temporal lobe that was activated more strongly by color judgments than by size judgments. Bottom: parameter estimate plot for each condition at the local maxima corresponding to the indicated region. (B) Top: The outlined region indicates activation at p < .05 corrected for the entire brain volume: the right medial parietal cortex was activated more strongly by size judgments than by color judgments. Bottom: parameter estimate plot for each condition at the local maxima corresponding to the outlined region.


RESULTS

Behavioral Data
Reaction time. RTs differed between the four conditions [F(3,27) = 22.69, p < .001; Table 3]. Predictably, responses were significantly faster in the control condition than in any of the semantic retrieval conditions [t(1,9) = 5.82–7.13, all ps < .0001], whereas none of the semantic retrieval conditions differed significantly from any other [Bonferroni corrected α = .008; t(1,9) = 1.54–2.70, all ps > .02]. Note that RT was included as a covariate in the analysis of the PET data, making it possible to interpret the neuroimaging data after adjusting for differences in RT.

Accuracy. Note that the error rates are dependent on the agreement of the subjects' and the experimenters' judgments as to whether the objects fit the criteria in question (e.g., that a pearl is not usually colored) and that high levels of accuracy were not critical to the aims of the present study: selective retrieval of the specified type of knowledge was the crucial aspect. Error rates differed between the four conditions [F(3,27) = 12.47, p < .001; Table 3]. The error rate for the color retrieval condition was higher relative to the other three conditions, although only significantly higher than that of the size and control conditions [Bonferroni corrected α = .008; t(1,9) = 3.89 and 4.27, p = .004 and .002]. The higher error rate in the color condition probably reflects the somewhat subjective nature of the decisions for some stimuli in this condition.

Table 4
Brain Regions Activated by Each of the Semantic Retrieval Conditions as Compared With the Control Condition

Anatomical Localization (Estimated Brodmann Area)        x    y    z     T      Uncorr   Corr    ROI Corr

Color – Control
Frontal lobe
  L inferior frontal/orbital gyrus (11/47)               34   36   16    6.08   .000     .001
  L medial inferior frontal gyrus/insula (44)            32   12   16    5.41   .000     .006
  L inferior frontal gyrus (45/47)                       46   38   04    5.13   .000     .017
Temporal lobe
  L inferior temporal gyrus (37)                         60   44   18    5.38   .000     .007    .000 (IT)
  L fusiform gyrus (20)                                  28   06   46    5.05   .000     .023    .003 (IT)
  L inferior temporal gyrus (20)                         52   12   30    4.20   .000     .303    .023 (IT)
  L fusiform gyrus (20)                                  34   38   20    4.10   .000     .382    .031 (IT)

Size – Control
Frontal lobe
  L inferior frontal gyrus (44)                          32   12   16    6.42   .000     .000
  L inferior frontal gyrus (11/47)                       32   40   18    5.43   .000     .006
Temporal lobe
  L fusiform gyrus (20)                                  38   08   46    4.58   .000     .104    .007 (IT)
  L inferior temporal gyrus (20)                         40   32   20    3.94   .000     .530    .049 (IT)
  L inferior temporal gyrus (37)                         58   46   18    3.97   .000     .500    .045 (IT)

Sound – Control
Frontal lobe
  L superior frontal gyrus (9)                           12   52   34    5.37   .000     .007
  L inferior frontal gyrus (11/47)                       38   34   14    8.36   .000     .000
  L inferior frontal gyrus (45)                          46   16   16    5.39   .000     .007
  L inferior frontal gyrus (8/9)                         36   12   28    5.34   .000     .008
  L inferior frontal gyrus (47)                          56   20   06    5.08   .000     .020
Temporal lobe
  L inferior temporal gyrus (20)                         52   12   36    5.37   .000     .002    .000 (IT)
  L inferior temporal/fusiform gyrus (20)                36   34   22    5.64   .000     .003    .000 (IT)
  L inferior temporal/parahippocampal gyrus (20)         34   08   48    5.50   .000     .004    .000 (IT)
  L inferior temporal gyrus (37)                         56   44   20    4.41   .000     .173    .012 (IT)
  L middle/superior temporal gyrus (21/22)               50   48   04    4.28   .000     .245    (TP)
Cerebellum
  R cerebellum                                           16   86   32    5.25   .000     .011
                                                         44   80   34    4.79   .000     .054

Note. Anatomical localizations of maximal t values are given in stereotaxic coordinates in millimeters (R, right; L, left). Activations in bold type indicate region maxima. T, t value; Uncorr, uncorrected p values; Corr, p values corrected for entire brain volume; ROI Corr, p values corrected for relevant region of interest, as indicated in parentheses (see text); IT, inferior temporal lobe; TP, temporoparietal.

Table 3
Mean Reaction Time (RT, in Milliseconds, With Standard Deviation), Error Rate, and Miss Rate for Each Attribute Condition

            RT
Condition   M        SD       Errors (%)   Misses (%)
Color       991.93   159.00   9.4          1.5
Size        936.46   107.29   4.4          0.6
Sound       898.47   115.79   2.6          0.8
Control     764.62    99.67   1.5          0.0


Whereas the error rates for the color and sound retrieval conditions did not differ significantly, the sound condition had a higher error rate than did the control condition only [Bonferroni corrected α = .008: t(1,9) = 3.71, p = .005]. Misses were determined by use of a cutoff RT of 1,950 msec. Although the miss rate differed between conditions [F(3,27) = 3.65, p = .025], the experimental conditions did not differ significantly when compared individually with each other.
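The pairwise testing logic described above amounts to six paired comparisons among the four conditions at a Bonferroni-corrected alpha of .05/6 ≈ .008; a sketch with placeholder data (not the study's RTs) is shown below.

import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
# Placeholder per-subject RTs (n = 10) drawn around the reported condition means.
rt = {
    "color":   rng.normal(992, 159, size=10),
    "size":    rng.normal(936, 107, size=10),
    "sound":   rng.normal(898, 116, size=10),
    "control": rng.normal(765, 100, size=10),
}

conditions = list(rt)
n_comparisons = len(conditions) * (len(conditions) - 1) // 2   # 6 pairs
alpha = 0.05 / n_comparisons                                   # ~.008

for i, a in enumerate(conditions):
    for b in conditions[i + 1:]:
        t, p = ttest_rel(rt[a], rt[b])
        print(f"{a} vs {b}: t = {t:.2f}, p = {p:.4f}, "
              f"significant at corrected alpha: {p < alpha}")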

PET Data
The rCBF data were analyzed in two steps. First, in order to identify activity associated with such processes as orthographic and phonological decoding, word identification, and more general aspects of object knowledge, as well as areas uniquely activated by retrieval of each semantic attribute type, each of the experimental conditions (color, size, and sound) was compared with the same prelexical control condition (X detection task; see Table 4). Very similar cortical regions were more highly activated by each of the attribute retrieval conditions, relative to the control condition. All of the experimental conditions elicited the same left-lateralized regions of the inferior temporal and fusiform gyri (BA 37 and 20) and the inferior frontal lobe (Figure 1).2 In addition, regions of the posterior left middle/superior temporal gyrus, the left medial frontal cortex at the level of the superior frontal gyrus, and the right cerebellum were activated above threshold in the sound minus control comparison only.

Although comparisons between the experimental and the control conditions are useful for identifying areas associated with lexical decoding and general semantic retrieval, it is direct comparisons between the experimental conditions that address the central questions pertaining to modality- and attribute-specific activations in a controlled manner. The second series of rCBF analyses involved two sets of contrasts between the semantic retrieval conditions. First, modality-specific activations were investigated by contrasting the two visual conditions (color and size together) with sound retrieval. A second set of analyses investigated attribute-specific activations within the visual modality by comparing the color and the size retrieval conditions. In addition, each of the two visual conditions was also compared individually with the sound condition to verify that (1) any observed effects of modality were robust for both visual conditions and (2) any activations identified as specific to a particular visual attribute were not also elicited by sound judgments.

Modality-Specific Activations
Activations specific to visual attributes (color and size). Relative to the sound condition, retrieval of visual (color and size) knowledge activated right PIT areas (BA 37/20) and a region of the superior parietal lobule (BA 7), also in the right hemisphere (see Table 5, Figure 1, and Figures 2A and 2B). Whereas the right PIT activation was significant for both color and size, relative to sound, the superior parietal lobule activation failed to reach corrected significance when color judgments were compared with sound knowledge (t = 4.46, p = .146; Table 6).

Activations specific to sound. The retrieval of sound knowledge relative to visual (color and size) knowledge yielded activation of a posterior area of the left superior temporal gyrus at the junction of the parietal lobe (BA 22/39/40; Table 5 and Figures 1 and 2C).

Table 5
Modality- and (Visual) Attribute-Specific Activations

Anatomical Localization (Estimated Brodmann Area)        x    y    z     T      Uncorr   Corr    ROI Corr

Modality-Specific Activations

(Color + Size) – Sound
Temporal lobe
  R inferior temporal gyrus (37/20)                      64   42   20    5.29   .000     .010    .001 (IT)
                                                         60   52   14    4.44   .000     .157    .011 (IT)
Parietal lobe
  R superior parietal lobule (7)                         38   70   42    5.22   .000     .012

Sound – (Color + Size)
Temporal and parietal lobes
  L inferior parietal lobule (40)                        52   54   26    4.32   .000     .217    (TP)
  L superior temporal gyrus (22/39)                      52   62   30    4.24   .000     .272    (TP)
Frontal lobe
  L superior/medial frontal gyrus (9)                    10   46   26    5.25   .000     .011
  L inferior frontal gyrus (11/47)                       38   28   14    5.25   .000     .011

(Visual) Attribute-Specific Activations

Color – Size
Temporal lobe
  L inferior temporal gyrus (37)                         62   42   16    3.60   .000     .847    .149 (IT)
                                                         62   42   16    3.60   .000             .009 (sphere)
Size – Color
Parietal lobe
  R precuneus (7)                                        14   64   30    4.85   .000     .044

Note. Anatomical localizations of maximal t values are given in stereotaxic coordinates in millimeters (R, right; L, left). Activations in bold type indicate region maxima. T, t value; Uncorr, uncorrected p values; Corr, p values corrected for entire brain volume; ROI Corr, p values corrected for relevant region of interest, as indicated in parentheses (see text); IT, inferior temporal lobe; sphere, left posterior inferior temporal 10-mm radius spherical ROI; TP, temporoparietal.


Although this activation did not reach significance when corrected for the number of voxels across the whole brain, we consider it valid to report this result because of its close proximity to the posterior temporal auditory association areas, which is the region in which we hypothesized a priori that activity would be associated with the retrieval of sound knowledge. Further comparisons between the sound condition and the color and size judgments separately indicated that this temporoparietal activation was evident relative to both types of visual knowledge (Table 6).

(Visual) Attribute-Specific Activations
Activations specific to color. No brain regions were specifically associated with color knowledge, using a significance criterion controlling for the whole brain. When an ROI (10-mm radius sphere) was applied to the center of the left PIT area (BA 37: -58, -45, -19) activated by all the semantic retrieval conditions relative to the control condition (as explained above), color judgments elicited relatively more activation, as compared with both size and sound knowledge (-62, -42, -16, t = 3.60, p = .009, and -64, -48, -18, t = 3.07, p = .036, respectively; ROI corrected), whereas size and sound did not differ from one another at this threshold (ROI maximum t value = 2.93; Figures 1 and 3A).3

Activations specific to size. Judgments about object size resulted in enhanced activation of a medial right parietal area (precuneus: BA 7), relative to conditions requiring retrieval of either color or sound knowledge (Tables 5 and 6, Figures 1 and 3B).

DISCUSSION

Activations Not Specific to Modality or Attribute
Relative to the nonsemantic control condition, retrieval of color, size, and sound knowledge activated left temporal and inferior frontal regions largely compatible with the results of previous neuroimaging studies probing the retrieval of semantic knowledge (e.g., Mummery et al., 1998; Price, Moore, Humphreys, & Wise, 1997; Vandenberghe et al., 1996). The present study confirms that this network is engaged by tasks requiring access to highly specific semantic attributes, as well as more general judgments of semantic categorization or association. The activation of left inferior frontal regions by all three attribute retrieval tasks is consistent with a generalized (non–attribute-specific) role for this region and accords with recent proposals that the left inferior frontal cortex participates in the selection of semantic information (Fiez, 1997; Mummery et al., 1999; Thompson-Schill et al., 1997; see also Gabrieli, Brewer, & Poldrack, 1998).

Table 6
Brain Regions Activated by Remaining Direct Comparisons Between the Semantic Retrieval Conditions

Anatomical Localization (Estimated Brodmann Area)        x    y    z     T      Uncorr   Corr    ROI Corr

Color – Sound
Temporal lobe
  R inferior temporal gyrus (37/20)                      64   42   22    5.38   .000     .007    .000 (IT)
  L inferior temporal gyrus (37)                         64   48   18    3.07   .001             .036 (sphere)
Parietal lobe
  R superior parietal lobule (7)                         40   70   44    4.46   .000     .146

Size – Sound
Temporal lobe
  R inferior temporal gyrus (37)                         62   54   12    4.05   .000     .427    .036 (IT)
                                                         66   60    4    3.73   .000     .745
Parietal lobe
  R precuneus (31/7)                                     16   68   28    5.95   .000     .001
  R precuneus (7)                                        12   58   36    5.61   .000     .003
  R superior parietal lobule (7)                         38   72   40    4.76   .000     .059

Sound – Color
Temporal lobe
  L superior temporal gyrus (22)                         54   56   24    4.24   .000     .275    (TP)
                                                         64   44   22    3.52   .000     .903    (TP)

Sound – Size
Temporal and parietal lobes
  L inferior parietal lobule (40)                        46  -50   28    3.76   .000     .717    (TP)
  L superior temporal gyrus (22/39)                      56  -62   30    3.45   .000     .935    (TP)
Frontal lobe
  L inferior frontal gyrus (11/47)                       38   30   14    6.07   .000     .001
  L superior/medial frontal gyrus (9)                    10   46   28    5.81   .000     .001
  L superior frontal gyrus (9)                           16   64   20    4.79   .000     .053

Note. Anatomical localizations of maximal t values are given in stereotaxic coordinates in millimeters (R, right; L, left). Activations in bold type indicate region maxima. T, t value; Uncorr, uncorrected p values; Corr, p values corrected for entire brain volume; ROI Corr, p values corrected for relevant region of interest, as indicated in parentheses (see text for details); IT, inferior temporal lobe; sphere, left posterior inferior temporal 10-mm radius spherical ROI; TP, temporoparietal.


Focusing on or selecting a specific semantic attribute was clearly a critical mechanism required by the three experimental tasks in the present study.

Unlike in the studies reported by Martin and colleagues (Chao & Martin, 1999; Martin et al., 1995; Wiggs et al., 1999), an anterior region of the temporal lobe was activated in the current study. This anterior temporal lobe activation seems likely to reflect the particular demands of the semantic attribute judgments in the present paradigm rather than cognitive operations, such as associative semantic access, that should also have occurred in the Wiggs et al. study, which similarly compared color knowledge retrieval with a lower level control condition. The exact conditions under which the left anterior temporal lobe is recruited by semantic tasks remain unclear, but the association is strongly supported by the consistent finding of significant atrophy in this region in semantic dementia, a neurodegenerative condition that results in relatively selective deterioration of semantic memory (Hodges, Patterson, Oxbury, & Funnell, 1992; Mummery et al., 2000; Snowden, Neary, & Mann, 1996). This area has also been implicated in the processing of natural objects, relative to man-made objects (e.g., see Moore & Price, 1999; Mummery et al., 1996). The present study was not designed to shed new light on this issue.

The region of the left PIT lobe that has previously been specifically associated with color knowledge (Chao & Martin, 1999; Martin et al., 1995; Wiggs et al., 1999) was also activated by all the attribute retrieval conditions, relative to the control task, although the present activation had a somewhat more lateral focus. This left temporal region thus seems to be involved in the retrieval of perceptual semantic information more generally, rather than color information per se, which is compatible with the evidence that this area forms part of a cortically distributed network contributing to semantic processing over a range of tasks (e.g., Mummery et al., 1998; Price et al., 1997; Vandenberghe et al., 1996). Nonetheless, consistent with previous reports that this PIT region plays a critical role in color knowledge processing, we were able to show that color judgments activated this region more strongly than did either size or sound knowledge retrieval (see below for further discussion of this result).

Modality-Specific Activations
Visual attributes (color and size). Both color and size judgments, relative to the sound condition, activated a region of the PIT lobe in the right hemisphere, which was homologous to the left-lateralized PIT focus observed when each of the semantic attribute retrieval conditions was compared with the control condition (see above). This suggests that this area on the right is selectively involved in representation and/or processing of visual semantic attributes, rather than semantic attributes more generally, which engage this region in the left hemisphere.

One interpretation of the functional role of the right PIT area is in terms of visual imagery. Although the present study did not explicitly require the use of mental imagery, it has previously been suggested that performing a color knowledge task involves creating and inspecting a colored mental image of the object (De Vreese, 1991; Farah, 1984). Consistent with this proposal, a number of neuroimaging studies on the processing of explicit visual object/shape imagery have implicated a region of the PIT lobe (in some cases, right lateralized), comparable with that observed during visual attribute retrieval in the present study (D'Esposito et al., 1997; Kosslyn et al., 1993; Mellet, Petit, Mazoyer, Denis, & Tzourio, 1998; Mellet et al., 1996; Mellet, Tzourio, Denis, & Mazoyer, 1998; Mellet et al., 2000; Roland & Gulyas, 1995; see also Goldenberg et al., 1989).

It is noteworthy that the studies by Martin and colleagues have also reported bilateral PIT activations associated with the retrieval of color knowledge (Chao & Martin, 1999; Martin et al., 1995; Wiggs et al., 1999), although in two of these studies the activation on the right was only significant when color naming was compared with a lower control task, not when compared with object naming (Chao & Martin, 1999; Wiggs et al., 1999). Martin et al. also reported right-hemisphere PIT activations for color naming relative to action naming, again consistent with a hypothesis of visual attribute specificity for this right PIT activation. Further compatible with the present findings, Gerlach et al. (1999) reported a similar right PIT activation in a task comparison proposed to index stored knowledge about the visual forms of objects.

Visual knowledge retrieval in our study was also associated with increased activity in a region of the right superior parietal lobule, although this activation was reliable at our conservative statistical threshold only for the comparison between size and sound judgments, making it difficult to draw general conclusions about the role of this region in the retrieval of visual knowledge.

Sound. Retrieval of knowledge about object sounds, relative to color and size judgments, activated the left posterior superior temporal gyrus/sulcus and contiguous regions of the inferior parietal cortex. This selective activation of regions adjacent to auditory association cortex is consistent with our prediction that accessing and retrieving sound knowledge would elicit modality-specific activations in and/or near to auditory association cortex. These activations elicited by judgments of sound were somewhat more posterior (~1 cm) to the regions associated with the episodic recall of auditory information (Nyberg et al., 2000; Wheeler et al., 2000) or imagery for music (Halpern & Zatorre, 1999; Zatorre, Halpern, Perry, Meyer, & Evans, 1996), but an almost identical cortical area was activated when subjects passively listened to auditorily presented words or pseudowords, relative to fixation (Fiez, Raichle, Balota, Tallal, & Peterson, 1996; Peterson, Fox, Posner, Mintun, & Raichle, 1988, 1989).4 Since this activation is not generally elicited by nonlinguistic auditory stimuli, it has been hypothesized to index speech-specific auditory processing and, perhaps, to reflect speech-based short-term auditory storage mechanisms (Fiez et al., 1996; see also Binder et al., 1997; Paulesu, Frith, & Frackowiak, 1993; Wise et al., 2001).


The present result indicates that this area can play a role in the processing of (meaningful) nonspeech sounds and that these need not be externally generated. The limited evidence available does not provide sufficient basis for any clear conclusion regarding the functional significance of this activation, although previous suggestions that this region subserves auditory working memory are at least not incompatible with the idea that it may be supporting a form of auditory imagery engaged during object sound judgments in the present experimental context.

When compared with the control condition, the sound condition uniquely generated additional activation in the left posterior middle/superior temporal gyrus that was slightly more anterior and inferior to the regions activated in the direct contrasts between the sound and the visual retrieval conditions. It therefore appears that auditory association cortex may have been activated by judgments of sound, although it was not evident in the comparisons with the visual retrieval conditions. This pattern might be expected if weak activation of auditory association areas occurred automatically in response to sound information associated with the objects during the visual judgment conditions.

(Visual) Attribute-Specific Activations
Color. A region of the left PIT cortex activated by all the semantic retrieval conditions (relative to the control condition) was also significantly more activated by object color judgments. Although the focus was slightly less medial (~1 cm) than the activations reported by Martin and colleagues (Chao & Martin, 1999; Martin et al., 1995; Wiggs et al., 1999), it seems likely that similar aspects of color knowledge are being indexed and that the small difference in activation foci may reflect differences in the tasks used. The proximity of this activation to color perception areas is consistent with the hypothesis that retrieval of specific perceptual knowledge engages cortical regions near to the relevant sensory cortex and is in keeping with engagement of a visual mental imagery process that does not activate the primary sensory cortex (D'Esposito et al., 1997; Mellet et al., 1996); as was noted above, performing a color knowledge task has been proposed to entail creating and inspecting a colored mental image of the object (De Vreese, 1991; Farah, 1984). The only study that has explicitly examined the neural substrates of color imagery reported activation in the PIT cortex (although this activation was right lateralized; R. J. Howard et al., 1998), whereas Paulesu et al. (1995) have shown that word–color synesthetes activate the left PIT lobe while experiencing color images in response to words. Object-based imagery has also been reported to activate this left PIT region (D'Esposito et al., 1997; Kosslyn et al., 1993; Mellet, Petit, et al., 1998; Mellet et al., 1996; Mellet, Tzourio, et al., 1998; Mellet et al., 2000; Roland & Gulyas, 1995; see also Goldenberg et al., 1989; Nakamura et al., 2000). In our study, however, this region was also activated by size and sound knowledge, relative to the control condition, suggesting that this area not only is specialized for color knowledge processing, but also plays a more general role in word processing and/or semantic retrieval. One possibility is that knowledge about the visual appearance of objects is activated by all the tasks, although this does not account for the enhanced activation in the color attribute retrieval condition. Alternatively, color imagery may be automatically activated during retrieval of perceptual information about objects regardless of the task, with the requirement to make an explicit judgment regarding this aspect of knowledge producing the increased activation.

Intriguingly, although the left PIT region was associated particularly with retrieval of color knowledge, the homologous region of the right PIT gyrus responded equally strongly when either type of visual (color or size) knowledge was retrieved (see above; plots in Figures 1B and 2A), suggesting that the right and the left PIT cortices play somewhat specialized roles in perceptual (and particularly visual) knowledge.

Size. Relative to both the color and the sound conditions, size judgments were uniquely associated with increased activity in a right-hemisphere medial parietal structure (precuneus). Returning to our earlier hypothesis that the right PIT activations associated with visual (color and size) semantic judgments might reflect object/shape-based visual imagery, it is less clear whether this argument can be extended to explain the right medial parietal activations selectively elicited during the retrieval of size knowledge. Although the dorsal visual processing system plays a primary role in the spatial processing of external visual stimuli (e.g., Haxby et al., 1994) and parietal activations have been consistently obtained in visual imagery tasks with a spatial element (M. S. Cohen et al., 1996; Harris et al., 2000; Kosslyn et al., 1993; Kosslyn, DiGirolamo, Thompson, & Alpert, 1998; Mellet et al., 1996; Mellet, Tzourio, Denis, & Mazoyer, 1995; Mellet et al., 2000; see also Carpenter, Just, Keller, Eddy, & Thulborn, 1999), these activations have generally been reported in and around the intraparietal sulcus and are, therefore, lateral and superior to the activation found in the present study. This is not, however, true of all preceding studies (Mellet et al., 1995; Wheeler et al., 2000), although these particular studies utilized memory-based paradigms. Consistent with this observation, precuneus activations have most often been reported in the context of episodic retrieval tasks. Activation of this region was initially thought to reflect visual imagery processes during retrieval (e.g., Fletcher et al., 1995), although this interpretation has subsequently been challenged (e.g., Buckner, Raichle, Miezin, & Petersen, 1996; Krause et al., 1999; Maguire, Frith, & Morris, 1999) and the functional role of this
structure is the subject of current investigation and debate. Interestingly, Mummery et al. (1998) reported (right) precuneus involvement in a semantic retrieval task requiring judgments about where objects are typically found, which was argued to involve a spatial element. Taken in conjunction with the present finding, this result may suggest that the medial parietal cortex is activated in semantic retrieval tasks with a spatial component, as well as during episodic retrieval.

Although size judgments apparently failed to elicit a similar parietal activation in the study reported by Vandenberghe et al. (1996), various differences between their paradigm and ours could account for the lack of correspondence in results. For example, not only was the main experimental condition in the previous study a difficult associative matching task, but their size-matching task was also extremely demanding (both tasks elicited reaction times of around 2.6–2.9 sec), making it difficult to derive a clear definition of the processes being indexed.

The enhanced activation of this medial parietal region, in conjunction with the right PIT (and the superior parietal lobule) activation observed during the retrieval of both size and color information, indicates that size knowledge may involve several processes or mechanisms. We have hypothesized that the right PIT and parietal activations index visual imagery mechanisms engaged during the retrieval of visual attribute information requiring internal inspection prior to a decision. This interpretation suggests that size knowledge judgments engage both object-based and, perhaps, more spatially specific visual imagery processes, in keeping with the intuitive assumption that this task requires knowledge both of what the object in question looks like and of how large it is relative to some criterion.

Conclusions

These results support the hypothesis that, in addition to a general cortical network activated by all semantic tasks, the retrieval of perceptual semantic knowledge engages modality- and attribute-specific cortical processing areas. Consistent with our predictions for modality-specific activations, (1) retrieval of visual attribute knowledge (color and size) activated a right-lateralized PIT region associated with the processing of visual object properties, and (2) judgments about the sounds that objects make engaged left-lateralized posterior superior temporal and temporoparietal regions adjacent to the auditory association cortex. In agreement with previous reports and our predictions based on these, retrieval of color knowledge elicited more left PIT activation than did either size or sound knowledge. This latter region was also activated by size and sound judgments relative to the control task, however, suggesting that this cortical area is not exclusively engaged by the retrieval of color information. In addition to the general semantic network, judgments about object size uniquely recruited medial dorsal parietal structures (precuneus), the functional role of which is underspecified.

The evidence for the recruitment of distinct neural areas during the retrieval of object size, sound, and color is consistent with the limited neuropsychological data that have revealed dissociations between knowledge of these particular attributes. These findings do not necessarily indicate neuroanatomically separate subsystems supporting the long-term representation of various perceptual attributes and modalities. Rather, we propose that the data are consistent with the idea that a distributed network of cortical regions specialized for the processing of the particular sensory aspects of a multisensory stimulus is engaged both during the acquisition of perceptual knowledge about objects and during the retrieval of such knowledge (Allport, 1985; see also Ishai, Ungerleider, Martin, Schouten, & Haxby, 1999). Within this framework, the interrelation of multiple sensory representations into an integrated whole presumably involves one or more additional cortical regions (e.g., medial temporal lobe structures; N. J. Cohen et al., 1999).

Although we have speculated that the attribute- and modality-specific activations observed here may reflect the operation of sensorily specific imagery mechanisms, the precise role played by imagery in these processes remains an important and unresolved issue. It will be difficult to differentiate between this interpretation and the proposal that retrieval of semantic knowledge involves (re-)representations in/near the sensory and motor cortical regions through which the knowledge is acquired and experienced (e.g., Allport, 1985; Gainotti et al., 1995). Indeed, the difficulty in distinguishing between these two accounts perhaps indicates that the two conceptualizations are largely overlapping.

REFERENCES

Allport, D. A. (1985). Distributed memory, modular subsystems and dysphasia. In S. K. Newman & R. Epstein (Eds.), Current perspectives in dysphasia (pp. 32-60). Edinburgh: Churchill Livingstone.

Belin, P., Zatorre, R. J., Lafaille, P., Ahad, P., & Pike, B. (2000). Voice-selective areas in human auditory cortex. Nature, 403, 309-312.

Binder, J. R., Frost, J. A., Hammeke, T. A., Cox, R. W., Rao, S. M., & Prieto, T. (1997). Human brain language areas identified by functional magnetic resonance imaging. Journal of Neuroscience, 17, 353-362.

Brett, M., Bloomfield, P., Brooks, D., Stein, J., & Grasby, P. M. (1999). Scan order effects in PET activation studies are caused by motion artefact. NeuroImage, 9, S56.

Buckner, R. L., Raichle, M. E., Miezin, F. M., & Petersen, S. E. (1996). Functional anatomic studies of memory retrieval for auditory words and visual pictures. Journal of Neuroscience, 16, 6219-6235.

Cappa, S. F., Perani, D., Schnur, T., Tettamanti, M., & Fazio, F. (1998). The effects of semantic category and knowledge type on lexical-semantic access: A PET study. NeuroImage, 8, 350-359.

Carpenter, P. A., Just, M. A., Keller, T. A., Eddy, W. F., & Thulborn, K. R. (1999). Time course of fMRI-activation in language and spatial networks during sentence comprehension. NeuroImage, 10, 216-224.

Chao, L. L., & Martin, A. (1999). Cortical regions associated with perceiving, naming, and knowing about colors. Journal of Cognitive Neuroscience, 11, 25-35.

Cherry, S. R., Woods, R. P., Doshi, N. K., Banerjee, P. K., & Mazziotta, J. C. (1995). Improved signal-to-noise in PET activation studies using switched paradigms. Journal of Nuclear Medicine, 36, 307-314.

Cohen, M. S., Kosslyn, S. M., Breiter, H. C., DiGirolamo, G. J., Thompson, W. L., Anderson, S. K., Bookheimer, S. Y., Rosen, B. R., & Belliveau, J. W. (1996). Changes in cortical activity during mental rotation: A mapping study using functional MRI. Brain, 119, 89-100.

Cohen, N. J., Ryan, J., Hunt, C., Romine, L., Wszalek, T., & Nash, C. (1999). Hippocampal system and declarative (relational) memory: Summarizing the data from functional neuroimaging studies. Hippocampus, 9, 83-98.

Coltheart, M., Inglis, L., Cupples, L., Michie, P., Bates, A., & Budd, B. (1998). A semantic system specific to the storage of information about the visual attributes of animate and inanimate objects. Neurocase, 4, 353-370.

Corbetta, M., Miezin, F. M., Dobmeyer, S., Shulman, G. L., & Petersen, S. E. (1991). Selective and divided attention during visual discriminations of shape, color, and speed: Functional anatomy by positron emission tomography. Journal of Neuroscience, 11, 2383-2402.

Demonet, J. F., Wise, R., & Frackowiak, R. S. J. (1993). Language functions explored in normal subjects by positron emission tomography: A critical review. Human Brain Mapping, 1, 39-47.

D'Esposito, M., Detre, J. A., Aguirre, G. K., Stallcup, M., Alsop, D. C., Tippet, L. J., & Farah, M. J. (1997). A functional MRI study of mental image generation. Neuropsychologia, 35, 725-730.

De Vreese, L. P. (1991). Two systems for color-naming defects: Verbal disconnection vs colour imagery disorder. Neuropsychologia, 29, 1-18.

Farah, M. J. (1984). The neurological basis of mental imagery: A componential analysis. Cognition, 18, 245-272.

Fiez, J. A. (1997). Phonology, semantics, and the role of the left inferior prefrontal cortex. Human Brain Mapping, 5, 79-83.

Fiez, J. A., Raichle, M. E., Balota, D. A., Tallal, P., & Petersen, S. E. (1996). PET activation of posterior temporal regions during auditory word presentation and verb generation. Cerebral Cortex, 6, 1-10.

Fletcher, P. C., Frith, C. D., Baker, S. C., Shallice, T., Frackowiak, R. S. J., & Dolan, R. J. (1995). The mind's eye: Precuneus activation in memory-related imagery. NeuroImage, 2, 195-200.

Forde, E. M. E., Francis, D., Riddoch, M. J., Rumiati, R. I., & Humphreys, G. W. (1997). On the links between visual knowledge and naming: A single case study of a patient with a category-specific impairment for living things. Cognitive Neuropsychology, 14, 403-458.

Frith, C. D., Friston, K. J., Liddle, P. F., & Frackowiak, R. S. J. (1991). A PET study of word finding. Neuropsychologia, 29, 1137-1148.

Gabrieli, J. D. E., Brewer, J. B., & Poldrack, R. A. (1998). Images of medial temporal lobe functions in human learning and memory. Neurobiology of Learning & Memory, 70, 275-283.

Gainotti, G., Silveri, M. C., Daniele, A., & Giustoli, L. (1995). Neuroanatomical correlates of category-specific semantic disorders: A critical survey. Memory, 3, 247-264.

Gerlach, C., Law, I., Gade, A., & Paulson, O. B. (1999). Perceptual differentiation and category effects in normal object recognition: A PET study. Brain, 122, 2159-2170.

Goldenberg, G., Podreka, I., Steiner, M., Willems, K., Suess, E., & Lüder, D. (1989). Regional cerebral blood flow patterns in visual imagery. Neuropsychologia, 27, 641-664.

Halpern, A. R., & Zatorre, R. J. (1999). When that tune runs through your head: A PET investigation of auditory imagery for familiar melodies. Cerebral Cortex, 9, 697-704.

Harris, I. M., Egan, G. F., Sonkkila, C., Tochon-Danguy, H. J., Paxinos, G., & Watson, J. D. G. (2000). Selective right parietal lobe activation during mental rotation: A parametric PET study. Brain, 123, 65-73.

Haxby, J. V., Horwitz, B., Ungerleider, L. G., Maisog, J. M., Pietrini, P., & Grady, C. L. (1994). The functional organization of human extrastriate cortex: A PET-rCBF study of selective attention to faces and locations. Journal of Neuroscience, 14, 6336-6353.

Hodges, J. R., Patterson, K., Oxbury, S., & Funnell, E. (1992). Semantic dementia: Progressive fluent aphasia with temporal lobe atrophy. Brain, 115, 1783-1806.

Howard, D., Patterson, K., Wise, R., Brown, W. D., Friston, K., Weiller, C., & Frackowiak, R. (1992). The cortical localizations of the lexicons: Positron emission tomography evidence. Brain, 115, 1769-1782.

Howard, R. J., Ffytche, D. H., Barnes, J., McKeefry, D., Ha, Y., Woodruff, P. W., Bullmore, E. T., Simmons, A., Williams, S. C. R., David, A. S., & Brammer, M. (1998). The functional anatomy of imagining and perceiving colour. NeuroReport, 9, 1019-1023.

Ishai, A., Ungerleider, L. G., Martin, A., Schouten, J. L., & Haxby, J. V. (1999). Distributed representation of objects in the human ventral visual pathway. Proceedings of the National Academy of Sciences, 96, 9379-9384.

Kanwisher, N., Woods, R. P., Iacoboni, M., & Mazziotta, J. C. (1997). A locus in human extrastriate cortex for visual shape analysis. Journal of Cognitive Neuroscience, 9, 133-142.

Kosslyn, S. M., Alpert, N. M., Thompson, W. L., Maljkovic, V., Weise, S. B., Chabris, C. F., Hamilton, S. E., Rauch, S. L., & Buonanno, F. S. (1993). Visual-mental imagery activates topographically organized visual cortex: PET investigations. Journal of Cognitive Neuroscience, 5, 263-287.

Kosslyn, S. M., DiGirolamo, G. J., Thompson, W. L., & Alpert, N. M. (1998). Mental rotation of objects versus hands: Neural mechanisms revealed by positron emission tomography. Psychophysiology, 35, 151-161.

Krause, B. J., Schmidt, D., Mottaghy, F. M., Taylor, J., Halsband, U., Herzog, H., Tellmann, L., & Muller-Gartner, H.-W. (1999). Episodic retrieval activates the precuneus irrespective of the imagery content of word pair associates: A PET study. Brain, 122, 255-263.

Maguire, E. A., Frith, C. D., & Morris, R. G. M. (1999). The functional neuroanatomy of comprehension and memory: The importance of prior knowledge. Brain, 122, 1839-1850.

Martin, A., Haxby, J., Lalonde, F., Wiggs, C., & Ungerleider, L. (1995). Discrete cortical regions associated with knowledge of color and knowledge of action. Science, 270, 102-105.

Mellet, E., Petit, L., Mazoyer, B., Denis, M., & Tzourio, N. (1998). Reopening the imagery debate: Lessons from functional anatomy. NeuroImage, 8, 129-139.

Mellet, E., Tzourio, N., Crivello, F., Joliot, M., Denis, M., & Mazoyer, B. (1996). Functional anatomy of spatial mental imagery generated from verbal instruction. Journal of Neuroscience, 16, 6504-6512.

Mellet, E., Tzourio, N., Denis, M., & Mazoyer, B. (1995). A positron emission tomography study of visual and mental spatial exploration. Journal of Cognitive Neuroscience, 7, 433-445.

Mellet, E., Tzourio, N., Denis, M., & Mazoyer, B. (1998). Cortical anatomy of mental imagery of concrete nouns based on their dictionary definition. NeuroReport, 9, 803-809.

Mellet, E., Tzourio-Mazoyer, N., Bricogne, S., Mazoyer, B., Kosslyn, S. M., & Denis, M. (2000). Functional anatomy of high-resolution visual mental imagery. Journal of Cognitive Neuroscience, 12, 98-109.

Moore, C. J., & Price, C. J. (1999). A functional neuroimaging study of the variables that generate category-specific object processing differences. Brain, 122, 943-962.

Mummery, C. J., Patterson, K., Hodges, J. R., & Price, C. J. (1998). Functional neuroanatomy of the semantic system: Divisible by what? Journal of Cognitive Neuroscience, 10, 766-777.

Mummery, C. J., Patterson, K., Hodges, J. R., & Wise, R. J. S. (1996). Generating "tiger" as an animal name or a word beginning with T: Differences in brain activation. Proceedings of the Royal Society of London: Series B, 263, 989-995.

Mummery, C. J., Patterson, K., Price, C. J., Ashburner, J., Frackowiak, R. S. J., & Hodges, J. R. (2000). A voxel-based morphometry study of semantic dementia: Relationship between temporal lobe atrophy and semantic memory. Annals of Neurology, 47, 36-45.

Mummery, C. J., Patterson, K., Wise, R. J. S., Vandenberghe, R., Price, C. J., & Hodges, J. R. (1999). Disrupted temporal lobe connections in semantic dementia. Brain, 122, 61-73.

Nakamura, K., Honda, M., Okada, T., Hanakawa, T., Toma, K., Fukuyama, H., Konishi, J., & Shibasaki, H. (2000). Participation of the left posterior inferior temporal cortex in writing and mental recall of kanji orthography. Brain, 123, 954-967.

Nyberg, L., Habib, R., McIntosh, A. R., & Tulving, E. (2000). Reactivation of encoding-related brain activity during memory retrieval. Proceedings of the National Academy of Sciences, 97, 11120-11124.

Paulesu, E., Frith, C. D., & Frackowiak, R. S. (1993). The neural correlates of the verbal component of working memory. Nature, 362, 342-345.

Paulesu, E., Harrison, J., Baron-Cohen, S., Watson, J. D., Goldstein, L., Heather, J., Frackowiak, R. S., & Frith, C. D. (1995). The physiology of coloured hearing: A PET activation study of colour–word synaesthesia. Brain, 118, 661-676.

Petersen, S. E., Fox, P. T., Posner, M. I., Mintun, M., & Raichle, M. E. (1988). Positron emission tomographic studies of the cortical anatomy of single word processing. Nature, 331, 585-589.

Petersen, S. E., Fox, P. T., Posner, M. I., Mintun, M., & Raichle, M. E. (1989). Positron emission tomographic studies of the processing of single words. Journal of Cognitive Neuroscience, 1, 153-170.

Price, C. J., Moore, C. J., Humphreys, G. W., & Wise, R. S. J. (1997). Segregating semantic from phonological processes during reading. Journal of Cognitive Neuroscience, 9, 727-733.

Pulvermüller, F. (1999). Words in the brain's language. Behavioral & Brain Sciences, 22, 253-336.

Roland, P. E., & Gulyas, B. (1995). Visual memory, visual imagery, and visual recognition of large field patterns by the human brain: Functional anatomy by positron emission tomography. Cerebral Cortex, 5, 79-93.

Rorden, C., & Brett, M. (2001). Stereotaxic display of brain lesions. Behavioural Neurology, 12, 191-200.

Sartori, G., & Job, R. (1988). The oyster with four legs: A neuropsychological study on the interaction between vision and semantic information. Cognitive Neuropsychology, 5, 105-132.

Scott, S. K., Blank, C., Rosen, S., & Wise, R. J. S. (2000). Identification of a pathway for intelligible speech in the left temporal lobe. Brain, 123, 2400-2406.

Sheridan, J., & Humphreys, G. W. (1993). A verbal-semantic category-specific recognition impairment. Cognitive Neuropsychology, 10, 143-184.

Simons, J. S., Graham, K. S., Owen, A. M., Patterson, K., & Hodges, J. R. (2001). Perceptual and semantic components of memory for objects and faces: A PET study. Journal of Cognitive Neuroscience, 13, 430-443.

Snodgrass, J. G., & Vanderwart, M. (1980). A standardized set of 260 pictures: Norms for name agreement, image agreement, familiarity, and visual complexity. Journal of Experimental Psychology: Human Learning & Memory, 6, 174-215.

Snowden, J. S., Neary, D., & Mann, D. M. A. (1996). Fronto-temporal lobar degeneration: Fronto-temporal dementia, progressive aphasia, semantic dementia. New York: Churchill Livingstone.

Talairach, J., & Tournoux, P. (1988). Co-planar stereotaxic atlas of the human brain. Stuttgart: Thieme.

Thompson-Schill, S. L., Aguirre, G. K., D'Esposito, M., & Farah, M. J. (1999). A neural basis for category and modality specificity of semantic knowledge. Neuropsychologia, 37, 671-676.

Vandenberghe, R., Price, C., Wise, R., Josephs, O., & Frackowiak, R. S. J. (1996). Functional anatomy of a common semantic system for words and pictures. Nature, 383, 254-256.

Wheeler, M. E., Petersen, S. E., & Buckner, R. L. (2000). Memory's echo: Vivid remembering reactivates sensory-specific cortex. Proceedings of the National Academy of Sciences, 97, 11125-11129.

Wiggs, C. L., Weisberg, J., & Martin, A. (1999). Neural correlates of semantic and episodic memory retrieval. Neuropsychologia, 37, 103-118.

Wise, R. J., Chollet, F., Hadar, U., Friston, K., Hoffner, E., & Frackowiak, R. (1991). Distribution of cortical neural networks involved in word comprehension and word retrieval. Brain, 114, 1803-1817.

Wise, R. J., Scott, S. K., Blank, S. C., Mummery, C. J., Murphy, K., & Warburton, E. A. (2001). Separate neural subsystems within "Wernicke's area." Brain, 124, 83-95.

Worsley, K. J., Marrett, S., Neelin, P., Vandal, A. C., Friston, K. J., & Evans, A. C. (1996). A unified statistical approach for determining significant signals in images of cerebral activation. Human Brain Mapping, 4, 58-73.

Zatorre, R. J., Halpern, A. R., Perry, D. W., Meyer, E., & Evans, A. C. (1996). Hearing in the mind's ear: A PET investigation of musical imagery and perception. Journal of Cognitive Neuroscience, 8, 29-46.

NOTES

1. We centered this ROI on the PIT activations elicited in our own study. An alternative might be to center the ROI on the activations elicited in earlier studies by Martin and colleagues (Chao & Martin, 1999; Martin et al., 1995; Wiggs et al., 1999), but this was considered a less optimal option, both because we used a very different paradigm and because the structural correspondences between the SPM95 templates used by Martin and colleagues, the Talairach and Tournoux (1988) brain, and the brain templates from the MNI employed here are unclear.
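A spherical ROI of this general kind is typically built by marking every voxel whose centre lies within a fixed radius of a chosen peak coordinate. The Python sketch below (using the numpy and nibabel libraries) illustrates the idea only; the template file name, peak coordinate, and radius are hypothetical placeholders rather than the values used in this study.

    # A minimal sketch of constructing a spherical ROI around a peak coordinate
    # in template (mm) space. All specific values below are hypothetical.
    import numpy as np
    import nibabel as nib

    img = nib.load("template.nii.gz")          # hypothetical template image
    affine = img.affine                        # voxel -> mm mapping

    peak_mm = np.array([-40.0, -60.0, -15.0])  # hypothetical peak coordinate (mm)
    radius_mm = 8.0                            # hypothetical sphere radius (mm)

    # Millimetre coordinates of every voxel centre in the template grid
    ii, jj, kk = np.meshgrid(*[np.arange(n) for n in img.shape[:3]], indexing="ij")
    vox = np.stack([ii, jj, kk, np.ones_like(ii)], axis=-1)
    mm = vox @ affine.T

    # Voxels whose centres fall within the sphere form the ROI mask
    dist = np.linalg.norm(mm[..., :3] - peak_mm, axis=-1)
    roi = (dist <= radius_mm).astype(np.uint8)
    nib.save(nib.Nifti1Image(roi, affine), "sphere_roi.nii.gz")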

2. The ventral extension of the anterior temporal activation in Figure 1 suggests that the activations associated with the contrasts between the semantic and the control conditions may have been contaminated by some artifact (such as movement; Brett et al., 1999). Analysis of the movement parameters showed no statistically significant differences between any of the conditions, however. Furthermore, the critical left PIT activation appears unaffected, in that the direct contrasts between the semantic attribute conditions show no evidence of any similar contamination (Figure 1).
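The note does not specify how the movement parameters were compared. As a purely illustrative sketch, a check of this general kind could take per-scan summary motion values for each condition and test for condition differences, for example with a one-way ANOVA; the data and the choice of test below are assumptions for illustration, not the analysis actually performed.

    # Hypothetical example: testing whether summary head-motion values differ
    # across conditions. Values are invented; a non-significant result would be
    # consistent with the statement in note 2.
    import numpy as np
    from scipy.stats import f_oneway

    motion = {
        "color":   np.array([0.21, 0.18, 0.25, 0.20]),   # mean displacement (mm) per scan
        "size":    np.array([0.19, 0.22, 0.24, 0.21]),
        "sound":   np.array([0.23, 0.20, 0.26, 0.19]),
        "control": np.array([0.22, 0.21, 0.23, 0.20]),
    }

    f_stat, p_value = f_oneway(*motion.values())
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")   # a large p gives no evidence of condition differences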

3. Although the plot in Figure 3A does not correspond to the center of the spherical ROI described here, it corresponds to the focus of the observed enhancement of activation associated with color knowledge.

4. Auditory presentation of verbal stimuli also often activates slightly more anterior regions of the left superior temporal gyrus (e.g., Belin, Zatorre, Lafaille, Ahad, & Pike, 2000; Demonet, Wise, & Frackowiak, 1993; D. Howard et al., 1992; Scott, Blank, Rosen, & Wise, 2000; Wise et al., 1991; Wise et al., 2001), and although these are often described together with the temporoparietal activations focused on here, it remains to be shown whether activations of these areas are functionally distinct.

(Manuscript received March 13, 2001; revision accepted for publication May 21, 2001.)