

Artificial Intelligence 60 (1993) 303-312
Elsevier

ARTINT 1052

Triple Book Review

Daniel C. Dennett, Consciousness Explained*

Robert Ornstein, The Evolution of Consciousness: Of Darwin, Freud, and Cranial Fire: The Origins of the Way We Think**

William Seager, Metaphysics of Consciousness***

Joseph O'Rourke

Department of Computer Science, Smith College, Northampton, MA 01063, USA

Received October 1992

* (Little Brown; Canada, 1991); xiii+511 pages, ISBN 0-316-18065-3.
** (Prentice Hall; New York, 1991); xiv+305 pages, ISBN 0-13-587569-2.
*** (Routledge; London, 1991); viii+262 pages, ISBN 0-415-06357-4.

Correspondence to: J. O'Rourke, Department of Computer Science, Smith College, Northampton, MA 01063, USA. E-mail: [email protected].

0004-3702/93/$06.00 © 1993 Elsevier Science Publishers B.V. All rights reserved.

1. Introduction


"The problem of consciousness, also known as the mind-body problem, is probably the largest outstanding obstacle in our quest to scientifically understand reality" [1]. That this opinion is prevalent among philosophers, cognitive psychologists, and neurobiologists may come as a surprise to researchers in artificial intelligence (AI), who perhaps take their cue from Minsky's opinion: "as far as I'm concerned, the so-called problem of body and mind does not hold any mystery: Minds are simply what brains do" [12, p. 287]. This assertion does not dispel the mystery for many, and an understanding of consciousness is under hot pursuit from several quarters, resulting in a spate of recent books on the topic. Here I will discuss three, covering the spectrum from psychology to philosophy. Along the way I hope to dislodge any smug attitudes that the mind-body problem is nonexistent or trivial or irrelevant.

The three books under review could hardly be more different. Ornstein's The Evolution of Consciousness: Of Darwin, Freud, and Cranial Fire: The Origins of the Way We Think can be classified as popular psychology. Although Dennett is a philosopher, his Consciousness Explained draws on psychology, neurophysiology, and computer science, and is aimed at an audience much wider than professional philosophers. Seager's Metaphysics of Consciousness is written by a philosopher for philosophers and is the most academic of the three. The central concern of all three is to "explain" consciousness, although their senses of explanation are disparate.

The term "consciousness" is notoriously variegated. The Oxford English Dictionary (OEDJ lists six distinct meanings, and none of the three authors bothers to define it precisely. This lack of precision is not as serious an obstruction to understanding as it might seem, as discussion usually focuses on particular more well-defined mental states. And besides, as Seager puts it, "the most notable feature of the mind-body relation remains our ig- norance of it" [p. 4], so understanding any aspect of consciousness is an achievement.

I suspect that many AI researchers subscribe to the mind-body theory known as "functionalism", which claims (roughly) that if the functional roles of aspects of mentality are reproduced, consciousness necessarily emerges. Although most in AI are neither familiar with this term nor aware of the philosophical debates surrounding it, research in AI influences these debates. As AI systems become more complex, I believe the influence will increasingly run in both directions.

I will start with a brief sketch of the three books to provide orientation on their views, but I will make no attempt to summarize their complete contents. Next I will indicate what is problematic about consciousness, and then turn to several specific aspects of the problem on which the three authors express opinions. I will focus especially on functionalism, hoping at the least to transform any unexamined functionalist into an examined functionalist.


2. Characterizations

2.1. Ornstein

Ornstein's book is the least structured of the three, partitioned into twenty-six loosely connected chapters, festooned with fifty cartoonish drawings that I found only distracting. He emphasizes the evolutionary development of the brain and mind, and the implications of our evolutionary inheritance on the way our brains function today. He views most higher mental faculties as almost accidental, serendipitous use of surplus brain cells evolved to protect the brain against heat loss. He calls the unconscious talents of our brain "simpletons" and imagines squadrons of them rapidly shifting in and out of control. Here his theory is akin to Minsky's "Society of Mind" [12] (which he does not cite). Ornstein believes we can modify our behavior by deeper understanding of this shifting process, effecting a "conscious evolution".

2.2. Dennett

Dennett's book is more substantial and written in a delightfully fluid style, a happy medium between Ornstein's discursiveness and Seager's rigidity. His aim is clear from his audacious title: to explain consciousness. I think he falls short of this lofty goal, although the attempt is enthralling. His argument is long and complex, and I cannot do it justice here.

One focus of his energy is exposing what he believes are confusions in the philosophical literature. In particular, he debunks the "Cartesian Theater", the notion that there is some centered locus in the brain where information comes together to form consciousness. He proposes to replace this metaphor with his own "Multiple Drafts" model in which all mental activity consists of intertwining parallel skeins of processes of interpretation and elaboration. His claim is that there is no need to re-represent the data, no need to move it to a center, despite the search by neuroscientists such as Crick for neural correlates of visual awareness [3].

A second focus is less successful: his attempt to "disqualify" qualia. Qualia are the raw phenomenal subjective "feels" of mental experiences, the locus of considerable philosophical scrutiny. I will have more to say on this topic below, but roughly, his explanation of this aspect of consciousness is to deny that there is anything to explain!

2.3. Seager

Seager writes directly to philosophers of mind, and some sections are epitomes of the "angels on the head of a pin" style of donnish philosophical debate that many find so pointless. Nevertheless his highly structured organization permits the resolute to plunge into the philosophical thicket and return with a greater appreciation of its complexity. He critically examines a variety of mind-body theories, finding arguments for and against each, none conclusive either way. His search for a minimal theory resistant to attacks results in one that has only narrow applicability: to intentional states (beliefs, desires, and so on). The more intrinsic, ineffable mental phenomena (pains, visual awareness, and so on) remain for him mysterious: he feels this type of conscious experience fits no known paradigm, and ultimately may be inexplicable.

3. Varieties of consciousness

"Consciousness" is a word with many meanings. Indeed, Natsoulas wrote a series of papers on the six concepts of consciousness in the OED, and used his analysis to explicate James' stream of consciousness [14]. Some debates are confused by unwitting conflation of these meanings. The opposite ex- treme is to choose such a narrow definition that the mystery disappears. Minsky restricts his attention (in [1 1]) to the ability to sense what is happening "within and outside ourselves", and concludes that humans "do not possess much consciousness" and that "machines are already poten- tially more conscious than are people". This may be true for his restricted definition but hardly illuminates consciousness in its full variety.

Its full variety is usefully partitioned by Seager into intentional, contentful mental states (beliefs, desires), and more private, intrinsic properties (pains, color qualia). The former states require a more global, third-person perspective and seem closest to accurate capture by existing mind-brain theories. The latter "ineffable" states demand a first-person perspective and remain the sticking point of most theories. The main "problem of consciousness", then, is to explain consciousness: how do physical events in the brain engender these most private conscious experiences?

4. Evolution

Part of the answer might be found in the evolution of the brain. As Ornstein's title indicates, he focuses on evolution; but he concentrates his speculations on the growth of the brain and on the development of mental habits aimed at survival. Seager hovers one abstraction above the details of evolutionary development, which I consider a weakness of his argument. Dennett seems to strike the happiest balance, including a thorough discussion of evolutionary development and mechanisms, moving the discussion a significant step into cultural evolution.


Dennett suggests that a mechanism known as the "Baldwin effect" could have moved the evolutionarily advantageous skill of language into the genotype. This effect favors phenotypic plasticity, which permits individuals to quickly discover especially useful behavioral talents [7]. But he believes that consciousness, unlike our language facility, is not hardwired into the genotype: rather it is implemented in "software" and therefore invisible to neuroanatomy. This software is a product of "cultural evolution", an attractive notion proposed by Dawkins in The Selfish Gene [5]. Dennett suggests that the benefit of silently talking to oneself eventually led to the consciousness software. He views this software as effectively implementing a virtual sequential "Joycean machine" in the parallel architecture of the brain, and it is this machine that leads to the experience of a stream of consciousness.

The units of cultural evolution are Dawkins' "memes": self-replicating cultural traits, such as catchy tunes, religious faith, education, and computer viruses. Thus Dennett sees consciousness as a huge complex of memes, a view I find compelling, albeit somewhat vague. Neither Ornstein nor Seager mentions memes. Nor do they mention Jaynes' theory of the recent rise of consciousness [9], which supports Dennett's view of consciousness as a late development. 1

5. Mind-body theories

Knowing why consciousness emerged from evolution does not constitute an explanation of how the physical activities of the brain evoke our experience of consciousness, how the mind is related to the body. The three authors are unanimous in their rejection of dualism, which postulates that mind is a non-material substance. 2 They are equally disdainful of epiphenomenalism, a near-dualism that holds that mental events can have no effects in the physical world whatever, although physical events do cause mental events. What remains after rejecting dualism is materialism or physicalism, which claims the brain is exclusively physical and all its causal powers derive from the causal powers of its physical parts.

Seager analyzes five versions of physicalism, and finds all wanting but none refuted. I will sketch each tersely to give a sense for the variety and then follow with a more detailed analysis of functionalism.

(1) Type-identity theory is the strongest version of physicalism: it identifies psychological states with certain exactly specified brain states, claiming an eventual reduction of psychology to neurophysiology.

1 Dennett does not endorse the specifics of Jaynes' theory, however.
2 Despite the unpopularity of this view, a version is held by Popper and Eccles [15].


(2) Functionalism does not claim that psychological terms refer to definite brain states; rather, they refer to them indirectly via their function in an abstract system that is physically realized. It is the functional structure of the cognitive system that creates the mental.

(3) Token-identity theory (Davidson [4]) claims that there is an irreconcilable opposition between mental state ascriptions and physical state descriptions; rather, they are parallel descriptions of one world, in which every item has a physical description. It weakens the identity claim of type-identity theory to representative or "token" mental and physical events.

(4) Instrumentalism (Dennett [6]) denies that there is a set of physical states that even approximately mirrors psychological states. Rather the brain behaves "instrumentally" according to folk psychology, 3 largely because evolutionary pressure leads to creatures describable this way.

(5) Eliminative materialism (Churchland [2]) goes even further, claiming that folk psychology will disappear with advances in neuroscience. Whereas instrumentalism maintains the truth of psychological characterizations, eliminative materialism claims folk psychology is a false theory.

Seager labels his minimal theory "constitutive global supervenience". Supervenience is a complex technical term that plays a key role in Seager's book; that it is not even mentioned by Dennett or Ornstein is indicative of a conceptual chasm. S-properties supervene on B-properties (base properties) if any two systems that agree on the B-properties necessarily agree on the S-properties. Global supervenience permits the properties to depend on more than an individual's brain, including, e.g., culturally determined language conventions. Constitutive supervenience demands more than mere correlation: for example, causal efficacy in the supervening relation à la Searle's "special causal powers" of neurons [18]. One can see from these subtleties how carefully nuanced is Seager's position.
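The core definition lends itself to a compact schematic statement. The rendering below is my own shorthand, not notation from Seager's book; the property families \mathcal{B} (base) and \mathcal{S} (supervening) are my labels:

    \forall x, y :\; \bigl( \forall B \in \mathcal{B},\; B(x) \leftrightarrow B(y) \bigr) \;\Rightarrow\; \bigl( \forall S \in \mathcal{S},\; S(x) \leftrightarrow S(y) \bigr)

Read: any two systems indistinguishable with respect to the base properties are indistinguishable with respect to the supervening properties. "Global" supervenience widens the system over which the base properties are evaluated (brain plus environment, culture, and so on); "constitutive" supervenience adds that the base facts must do more than merely co-vary with the supervening facts (they must, in some sense, constitute them).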

Dennett strenuously resists any single label, but defends a version of functionalism throughout his book. He calls it "teleofunctionalism" to emphasize its grounding in evolutionarily advantageous functions. 4 We will see below how he skirts the problems of standard functionalism.

Ornstein does not delve into the philosophical issues enough to earn a label.

3"Folk psychology" is our shared, commonsense theory of human behavior that successfully explains, e.g., that we carry umbrellas because of our belief in their efficacy in keeping us dry.

4"Teleology" is the study of evidence for design in nature.


6. Functionalism

Functionalism claims that any system that realizes the functional structure of the brain will automatically possess all the mental properties of the mind--so intelligent machines will be conscious. Functionalism is uncontroversial for many phenomena: any system that cools its contents is a type of refrigerator. Somewhat more controversial is a functionalist theory of life: any system that adapts to its environment, metabolizes energy, and reproduces, is a form of life. 5 But is it the case that a state that results from bodily trauma, and elicits wincing, retraction of the afflicted body part, and in general plays the functional role of pain, must feel like pain, in fact is pain? Since functionalism is widely accepted in the AI community, it should be instructive to examine some of the reasons why philosophers find this position problematical.

Making precise the phrase "realizes the functional structure" is the first difficulty. Interpreted too loosely, functionalism slides into behaviorism. Suppose intelligent behavior were achieved with an extremely large (but finite) static lookup table. 6 Does the table enjoy a mental life? Most would say no. This leads to attempts to dig deeper than I/O behavior, and demand a similar functional organization.
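To make the lookup-table worry concrete for an AI audience, here is a minimal sketch of my own; the agent names and the toy stimulus/response domain are invented for illustration and come from none of the three books. The two agents below have identical input/output behavior over their (finite) domain, yet entirely different internal organization.

# A toy rendering of the lookup-table thought experiment (hypothetical example).

def compute_agent(stimulus: int) -> str:
    """Produces a response by actually carrying out a computation."""
    return "ouch" if stimulus > 7 else "fine"

# A "blockhead" agent: every possible stimulus/response pair stored in advance.
LOOKUP_TABLE = {s: ("ouch" if s > 7 else "fine") for s in range(100)}

def table_agent(stimulus: int) -> str:
    """Produces the very same responses by a single table lookup."""
    return LOOKUP_TABLE[stimulus]

# Behaviorally indistinguishable on every input in the domain...
assert all(compute_agent(s) == table_agent(s) for s in range(100))
# ...which is why "realizes the functional structure" cannot mean mere
# input/output equivalence if the table is to be denied a mental life.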

Putnam

Putnam, the original functionalist [16], explored formalizing the definition of "functionally realize" in terms of finite state automata (FSA). But in recent work [17] he finds these definitions can lead to the absurd conclusion that any sufficiently complex system (e.g., the economy of Peru) can realize any FSA. This conclusion is reached by gerrymandering the states of the system to map to the FSA so that all "functional" transitions are preserved. The danger of tightening the definition of functional realization too much is that one can exclude the possibility of mentality in robots or extraterrestrials, for at a low level, their states and transitions would surely be rather different.

Putnam now believes the theory is unworkable for several reasons [17]. Meanings are not "in the head", so beliefs and other intentional states must incorporate references to the environment. "Sociofunctionalists" respond by including the environment in the purview of the mind-body relation (this is Seager's global supervenience relation). But Putnam feels the theory runs aground on the shoals of synonymy, possible world counterfactuals, and other hazards of the philosophical sea.

5 This example is due to David Chalmers.
6 Seager points out that this idea is present (in essence) in Leibnitz's work [10] in 1702!


Shifted spectra

No doubt the most vexing aspect of consciousness for functionalism is sensory qualia: our internal, subjective experiences of color, pain, pitch, and so on. Seager invites us to consider how functionalism can handle color qualia: the way it seems to you when you look at colors. Imagine a person's color spectrum shifted by a sudden surreptitious replacement of their cones with ones tuned to ultraviolet reflectances. Surely (Seager says) the nature of the color qualia would not change, but they are not functionally isomorphic. A functionalist could respond by redrawing the boundary so that what is crucial is the functional architecture of the color vision system, not the functional behavior of the sensors. But now the functionalist can be backed into admitting that two behaviorally-equivalent beings with mildly different functional architectures do not experience the same qualia.

Blindsight "Blindsight" is often offered as a disproof of functionalism. Blindsight is

a phenomenon observed in patients who have suffered partial damage to their visual cortex resulting in a distinct blindspot in one hemisphere. These patients nevertheless can "guess" when a visual stimulus is present in their blindspot, despite having no conscious experience of visual perception there. This seems to indicate that the function of vision can be present without the attendant mental experience; both Ornstein and Seager interpret it this way (Seager more cautiously). Dennett, however, uses this phenomenon as grist for his Multiple Drafts mill. He emphasizes how the experimenter's questions act as probes into the patient's stream of consciousness, eliciting not facts but rather answers more akin to fictional narratives. He suggests that time and training could change what a blindsighter considers conscious.

Searle

Functionalism implies that a suitably programmed silicon-based computer brain would be conscious, a claim vigorously challenged by Searle's notorious "Chinese Room" argument [18]. Neither Seager nor Dennett finds force in Searle's argument; Ornstein doesn't mention it. Dennett sides with the "systems reply" [8], claiming that Searle fails to imagine the full complexity of the Chinese Room system: understanding can emerge (contra Searle) "from lots of distributed quasi-understanding in a large system" [p. 439]. Seager focuses on Searle's claim that neurons possess special causal powers. He argues that these speculative powers explain nothing, for "if one asked why such and such neuronal structure gave rise to ... conscious experience ... Searle would have no answer" [p. 181]. Seager is not objecting to the lack of a detailed account of how consciousness arises, but contends "we have no account at all" [p. 181].


7. Qualia

Is the subjective experience of pain something over and above a complex of "dispositions to react" in certain ways? Seager says yes; Dennett says no. This appears to be the crux of their difference, which I will illustrate with two final examples on color qualia.

Inverted spectra

Imagine another neurological Gedanken experiment: your cone cells are rewired to invert your color spectrum, so that grass is red and so on. There is evidence (mentioned by Ornstein and Dennett) that gradual behavioral adaptation would occur. Suppose it would; after complete adaptation of your "reactive dispositions", are your color qualia inverted? Dennett says no, because your qualia are your reactive dispositions. Seager's position is more cautious and complex. He concludes there is a nomological (lawlike) link between the physical properties of the brain and qualia, a view incompatible with Dennett's.

Tetrachromates

Pigeons are tetrachromates: 7 they have four different color receptors, and so their subjective color space is four-dimensional. Seager feels that our extensive neurological knowledge "does not afford us the least idea of what these extra 'hues' look like" [p. 151]: "the particularities of ... modes of consciousness are not exhausted by the constituting structure of the conscious being" [p. 219]. Dennett's position on Nagel's famous paper "What is it like to be a bat?" [13] implies that he would claim that pigeon qualia are accessible to us (with great effort): "the structure of a bat's mind is just as accessible as the structure of a bat's digestive system" [p. 447].

8. Conclusion

Ornstein's book is easy reading, informative, but ultimately unsatisfying: he risks few clear claims, and even those are not well supported. Seager is not for the philosophically timid, but his book is a model of careful reasoning that clarifies significantly while deepening the complexity of the issues. Although I remain unconvinced by Dennett, his theory is the most tantalizing. He is one of the few philosophers to take AI seriously, and those in AI should consider returning the favor.

7 They might even be pentachromates.


Acknowledgements

I have benefited from discussions with David Chalmers and Jeff Dalton, and from the editors' comments.

References

[1] D.J. Chalmers, Consciousness and cognition, Center for Research on Concepts and Cognition, Indiana University, Bloomington, IN (1991).
[2] P.M. Churchland, A Neurocomputational Perspective: The Nature of Mind and the Structure of Science (MIT Press, Cambridge, MA, 1989), Chapter 1.
[3] F. Crick and C. Koch, The problem of consciousness, Sci. Am. 267 (3) (1992) 153-159.
[4] D. Davidson, Essays on Actions and Events (Oxford University Press, Oxford, 1980).
[5] R. Dawkins, The Selfish Gene (Oxford University Press, Oxford, 1976).
[6] D.C. Dennett, The Intentional Stance (MIT Press, Cambridge, MA, 1987).
[7] G.E. Hinton and S.J. Nowlan, How learning can guide evolution, Tech. Rept. CMU-CS-86-128, Carnegie Mellon University, Pittsburgh, PA (1987).
[8] D.R. Hofstadter and D.C. Dennett, The Mind's I: Fantasies and Reflections on Self and Soul (Basic Books, New York, 1981).
[9] J. Jaynes, The Origin of Consciousness in the Breakdown of the Bicameral Mind (Houghton Mifflin, Boston, MA, 1976).
[10] G.W. Leibnitz, Reply to the thoughts on the system of preestablished harmony ..., in: Gottfried Wilhelm Leibnitz: Philosophical Papers and Letters (Reidel, Dordrecht, 1969).
[11] M. Minsky, Machinery of consciousness, in: 75th Anniversary Symposium on Science in Society (National Research Council of Canada, to appear).
[12] M. Minsky, Society of Mind (Simon and Schuster, New York, 1986).
[13] T. Nagel, What is it like to be a bat?, Philos. Rev. 83 (1974) 435-450.
[14] T. Natsoulas, Concepts of consciousness, J. Mind Behav. 4 (1983) 13-59.
[15] K. Popper and J. Eccles, The Self and Its Brain (Springer, New York, 1977).
[16] H. Putnam, Mind, Language and Reality: Philosophical Papers (Cambridge University Press, Cambridge, 1975).
[17] H. Putnam, Representation and Reality (MIT Press, Cambridge, MA, 1988).
[18] J.R. Searle, Minds, brains and programs, Behav. Brain Sci. 3 (1980) 417-457.