Systemic Networks, Relational Networks, and Neural Networks. Sydney Lamb, lamb@rice.edu. Part II: GuangZhou, 2010 November 3, Sun Yat Sen University

Slide 1

Systemic Networks, Relational Networks, and Neural Networks. Sydney Lamb, lamb@rice.edu. Part II: GuangZhou, 2010 November 3, Sun Yat Sen University.
Slide 2 Topics in this presentation: Aims of SFL and NCL; From systemic networks to relational networks; Relational networks as purely relational; Levels of precision in description; Narrow relational network notation; Narrow relational networks and neural networks; Enhanced understanding of systemic-functional choice; Enhanced appreciation of variability in language.
Slide 3 Topics: Aims of SFL and NCL; From systemic networks to relational networks; Relational networks as purely relational; Levels of precision in description; Narrow relational network notation; Narrow relational networks and neural networks; Enhanced understanding of systemic-functional choice; Enhanced appreciation of variability in language.
Slide 4 Aims of SFL. SFG aims (primarily) to describe the network of choices available in a language for expressing meanings. SFL differs from Firth, and also from Lamb, in that priority is given to the system (Halliday, 2009: 64). "The organizing concept of a systemic grammar is that of choice (that is, options in meaning potential)" (Halliday 1994/2003: 434).
Slide 5 Aims of Neurocognitive Linguistics (NCL). NCL aims to describe the linguistic system of a language user as a dynamic system: it operates (speaking, comprehending, learning, etc.), and it changes as it operates. Evidence that can be used: texts, findings of SFL, slips of tongue and mind, unintentional puns, etc.
Slide 6 NCL seeks to learn:
How information is represented in the linguistic system; how the system operates in speaking and understanding; how the linguistic system is connected to other knowledge; how the system is learned; how the system is implemented in the brain.
Slide 7 The linguistic system of a language user: two viewing platforms. Cognitive level: the cognitive system of the language user without considering its physical basis (the cognitive (linguistic) system); field of study: cognitive linguistics. Neurocognitive level: the physical basis (neurological structures); field of study: neurocognitive linguistics.
Slide 8 Topics: Aims of SFL and NCL; From systemic networks to relational networks; Relational networks as purely relational; Levels of precision in description; Narrow relational network notation; Narrow relational networks and neural networks; Enhanced understanding of systemic-functional choice; Enhanced appreciation of variability in language.
Slide 9 Cognitive Linguistics. First occurrence in print: "[The] branch of linguistic inquiry which aims at characterizing the speaker's internal information system that makes it possible for him to speak his language and to understand sentences received from others."
(Lamb 1971)
Slide 10 Operational Plausibility. To understand how language operates, we need to have the linguistic information represented in such a way that it can be used for speaking and understanding. (A competence model that is not competence to perform is unrealistic.)
Slide 11 Relational network notation. Thinking in cognitive linguistics was facilitated by relational network notation, developed under the influence of the notation used by Halliday for systemic networks. Earlier steps leading to relational network notation appear in papers written in 1963.
Slide 12 More on the early days. In the 1960s the linguistic system was viewed (by Hockett and Gleason and me and others) as containing items (of unspecified nature) together with their interrelationships. Cf. Hockett's "Linguistic units and their relations" (Language, 1966). Early primitive notations showed units with connecting lines to related units.
Slide 13 The next step: nodes. The next step was to introduce nodes to go along with such connecting lines. This allowed the formation of networks: systems consisting of nodes and their interconnecting lines. Halliday's notation (which I first saw in 1964) used different nodes for paradigmatic (or) and syntagmatic (and) relationships. Just what I was looking for.
Slide 14 From systemic networks to relational networks: three notational adaptations. Rotate 90 degrees, so that upward would be toward meaning (at the theoretical top) and downward would be toward phonetics (at the theoretical bottom). Replace the brace for "and" with a (more node-like appearing) triangle. Retaining the bracket for "or", allow the connecting lines to connect at a point.
Slide 15 The downward OR. (Diagram: one line from above branching to a and b.)
Slide 16 The downward AND. (Diagram: one line from above connecting to a and b.)
Slide 17 The 90-degree rotation: upward and downward. Expression (phonetic or graphic) is at the bottom; therefore, downward is toward expression, and upward is toward meaning (or other function). (Diagram: meaning at the top, expression at the bottom, the more abstract network in between.)
Slide 18 Orientation of Nodes. Downward AND and OR nodes:
branching on the expression side, with multiple branches to(ward) expression. Upward AND and OR nodes: branching on the content side, with multiple branches to(ward) content.
Slide 19 Downward and upward branching. (Diagram: a, b.)
Slide 20 The meaning of up/down: neurological interpretation. At the bottom are the interfaces to the world outside the brain: sense organs on the input side, muscles on the output side. Up is more abstract.
Slide 21 The ordered AND. We need to distinguish simultaneous from sequential; for sequential, the ordered AND. Its two (or more) lines connect to different points at the bottom of the triangle (in the case of the downward AND) to represent sequential activation leading to sequential occurrence of items. (Diagram: a, b; first a, then b.)
Slide 22 The downward ordered OR. For the OR relation we don't have sequence, since only one of the two (or more) lines is activated. But an ordering feature for this node is useful to indicate precedence, so we have precedence ordering. The line connecting to the left takes precedence: if conditions allow for its activation to be realized, it will be chosen in preference to the other line.
Slide 23 The downward ordered OR (original notation). (Diagram: a = marked choice, b = unmarked choice, a.k.a. default.) The marked choice takes precedence: it is chosen if the conditions that constitute the marking are present.
Slide 24 The downward ordered OR (revised notation). (Diagram: a = marked choice, b = unmarked choice, a.k.a. default.) The unmarked choice is the one that goes right through; the marked choice is off to the side (either side).
Slide 25 The downward ordered OR (revised notation). (Diagram: a = unmarked choice, a.k.a. default, b = marked choice.) The unmarked choice is the one that goes right through; the marked choice is off to the side (either side).
Slide 26 Sometimes the unmarked choice has zero realization. (Diagram: b = marked choice; the unmarked choice has no realization.) The unmarked choice is nothing; in other words, the marked choice is optional.
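The behavior of the downward ordered OR just described, including zero realization of the unmarked choice, can be sketched in a few lines of code. This is a minimal illustrative model, not part of Lamb's notation: the function name and the boolean "marking condition" (standing in for activation arriving from elsewhere in the network) are assumptions for the sketch.

```python
def downward_ordered_or(marking_present: bool, marked="a", unmarked="b"):
    """Realize the marked choice when its conditioning activation is
    present; otherwise fall through to the unmarked (default) choice.
    An unmarked choice of None models zero realization, making the
    marked choice effectively optional (slide 26)."""
    return marked if marking_present else unmarked

# The marked choice takes precedence when its conditions hold:
print(downward_ordered_or(True))    # -> a  (marked choice chosen)
print(downward_ordered_or(False))   # -> b  (default, unmarked choice)

# Zero realization of the unmarked choice:
print(downward_ordered_or(False, marked="b", unmarked=None))  # -> None
```

The point of the sketch is that precedence ordering is purely local: the node needs only to test whether the marking conditions are active, not to inspect anything else in the network.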
Slide 27 Operational Plausibility. To understand how language operates, we need to have the information represented in such a way that it can be directly used for speaking and understanding: competence as competence to perform. The information in a person's mind is knowing how, not knowing that: information in operational form, able to operate without manipulation from some added performance system.
Slide 28 Relational networks: cognitive systems that operate. Language users are able to use their languages. Such operation takes the form of activation of lines and nodes. The nodes can be defined on the basis of how they treat incoming activation.
Slide 29 Nodes are defined in terms of activation: the AND. (Diagram: downward ordered AND, with k above and a, b below.) Downward activation from k goes to a and later to b. Upward activation from a and later from b goes to k.
Slide 30 Nodes are defined in terms of activation. (Diagram: downward unordered OR, with p and q above, k, and a, b below.) The OR condition is not achieved locally at the node itself; it is just a node and has no intelligence. Usually there will be activation coming down from either p or q but not from both.
Slide 31 Nodes are defined in terms of activation: the OR. (Diagram: downward unordered OR, with k above and a, b below.) Upward activation from either a or b goes to k. Downward activation from k goes to a and [sic] b.
Slide 32 Nodes are defined in terms of activation. (Diagram as on slide 30.) The OR condition is not achieved locally at the node itself; it is just a node and has no intelligence. Usually there will be activation coming down from either p or q but not from both.
Slide 33 The ordered AND: upward activation. Activation moving upward from below.
Slide 34 The ordered AND: downward activation. Activation coming downward from above.
Slide 35 Downward activation. (Diagram: AND and OR nodes, upward and downward orientations.)
Slide 36 Upward activation. (Diagram: AND and OR nodes, upward and downward orientations.)
Slide 37 Upward activation through the OR. The OR operates as either-or for activation going from the plural side to the singular side.
For activation from the singular side to the plural side it acts locally as both-and, but in the context of other nodes the end result is usually either-or.
Slide 38 Upward activation through the OR. (Diagram: bill connecting upward to BILL1 and BILL2.) Usually the context allows only one interpretation, as in "I'll send you a bill for it."
Slide 39 Upward activation through the OR. (Diagram: bill, BILL1, BILL2.) But if the context allows both to get through, we have a pun: A duck goes into a pub and orders a drink and says, "Put it on my bill."
Slide 40 Zhong Guo: shadow meaning. (Diagram: zhong guo; CENTRAL, CHINA, KINGDOM.)
Slide 41 The ordered OR: how does it work? (Diagram: ordered OR with a default line and a line taken if possible.) Node-internal structure (not shown in abstract notation) is required to control this operation.
Slide 42 Topics: Aims of SFL and NCL; From systemic networks to relational networks; Relational networks as purely relational; Levels of precision in description; Narrow relational network notation; Narrow relational networks and neural networks; Enhanced understanding of systemic-functional choice; Enhanced appreciation of variability in language.
Slide 43 A purely relational network. After making these adaptations to systemic network notation, resulting in relational network notation (abstract form), it became apparent (one afternoon in the fall of 1964) that relational networks need not contain any items at all. The entire structure could be represented in the nodes and their interconnecting lines.
Slide 44 Morpheme as item and its phonemic representation. boy: b - o - y. Symbols? Objects?
Slide 45 Relationship of boy to its phonemes. As a morpheme, boy is just one unit; three phonemes, in sequence: b o y.
Slide 46 The nature of this morphemic unit. (Diagram: BOY, Noun, b o y.) The object we are considering.
Slide 47 The morpheme as purely relational. (Diagram: BOY, Noun, b o y.) We can remove the symbol with no loss of information.
Therefore, it is a connection, not an object.
Slide 48 Another way of looking at it. (Diagram: BOY, Noun, b o y.)
Slide 49 Another way of looking at it. (Diagram: BOY, Noun, b o y.)
Slide 50 A closer look at the segments. (Diagram: boy, with b (cf. Bob) and y (cf. toy), and the phonological features of o.) The phonological segments also are just locations in the network, not objects.
Slide 51 "boy" as label (not part of the structure). (Diagram: BOY, Noun, b o y, with the label boy.) Just a label to make the diagram easier to read.
Slide 52 Objection I. If there are no symbols, how does the system distinguish this morpheme from others? Answer: other morphemes necessarily have different connections. Another node with the same connections would be another (redundant) representation of the same morpheme.
Slide 53 Objection II. If there are no symbols, how does the system know which morpheme it is? Answer: if there were symbols, what would read them? Miniature eyes inside the brain?
Slide 54 Relations all the way. Perhaps all of linguistic structure is relational. It's not relationships among linguistic items; it is relations to other relations to other relations, all the way to the top at one end and to the bottom at the other. In that case the linguistic system is a network of interconnected nodes.
Slide 55 Objects in the mind? When the relationships are fully identified, the objects as such disappear, as they have no existence apart from those relationships. The postulation of objects as something different from the terms of relationships is a superfluous axiom and consequently a metaphysical hypothesis from which linguistic science will have to be freed.
Louis Hjelmslev (1943/61)
Slide 56 Compare SF networks: nodes and lines, plus symbols. SF networks have "and" and "or" nodes. They also have symbols for linguistic items (e.g., polarity, positive, negative) and symbols for relationships/operations:
Symbol   Meaning         Example
+        insertion       + x
/        conflation      X / Y
( )      expansion       X (P Q)
^        ordering        X ^ Z
:        preselection    : w
::       classification  :: z
=        lexification    = t
Slide 57 Syntax is also purely relational. Example: the Actor-Goal construction. (Diagram: CLAUSE; DO-SMTHG; Vt, Nom; material process (type 2); syntactic function, semantic function, variable expression.)
Slide 58 Syntax is also purely relational: linked constructions. (Diagram: CL; Nom; DO-SMTHG; Vt, Nom; material process (type 2); TOPIC-COMMENT.)
Slide 59 Add another type of process. (Diagram: CL; DO-TO-SMTHG, THING-DESCR, BE-SMTHG; be, Nom, Vt, Adj, Loc.)
Slide 60 More of the English clause. (Diagram: CL; DO-TO-SMTHG, BE-SMTHG; be, Vt, Vi, to, -ing; Subj, Pred; Conc, Past, Mod; Predicator; FINITE.)
Slide 61 The system of THEME. System network for THEME SELECTION, Halliday (2004: 80).
Slide 62 THEME SELECTION: a direct translation of Halliday's system network. PREDICATOR THEME (unmarked in imperative); ADJUNCT THEME; WH- THEME (unmarked in wh-interrogative and exclamative); SUBJECT THEME (unmarked in declarative and yes/no interrogative); non-wh-theme; other.
Slide 63 Theme selection in operation. This direct translation seems not to represent the way theme selection works in the cognitive system of the person forming a clause. Rather, whatever will be the theme (the specific item, not a high-level category to which it belongs) is active at the start of the clause formation. Having been activated, it comes first, as Theme, and the rest of the clause follows, as Rheme.
Slide 64 (Getting ready to add Theme.) (Diagram: CL; BE-SMTHG; Vi, to, -ing; Subj, Pred; Conc, Past, Mod; Predicator; FINITE.)
Slide 65 Add Theme-Rheme. (Diagram: CL; BE-SMTHG; Vi, to, -ing; Subj, Pred; Predicator; FINITE; THEME, RHEME; Nom; DECLARE.)
Slide 66 Yes-no questions. (Diagram: VP; to, -ing; Pred; Perf, Prog; Subj; ASK, DECLARE; Finite.)
Slide 67 Yes-no questions: Finite as Theme. (Diagram labels: Pred, Subj,
ASK, Finite, CL, THEME, RHEME, DECLARE, Nom.)
Slide 68 Circumstance in the verb phrase. (Diagram: be, Vt, Vi; VP; Obj; Vbl Phrase; Circumstance.) Examples: "They did it", "I saw them", "He was walking in the garden a couple of days ago while she was away."
Slide 69 Circumstance as Theme. (Diagram: Vi; VP; Vbl Phrase; Circumstance; THEME, RHEME.)
Slide 70 Conclusion: relationships all the way to... how far? What is at the bottom? Introductory view: it is phonetics. In the system of the speaker, we have relational network structure all the way down to the points at which muscles of the speech-producing mechanism are activated. At that interface we leave the purely relational system and send activation to a different kind of physical system. For the hearer, the bottom is the cochlea, which receives activation from the sound waves of the speech hitting the ear.
Slide 71 What is at the top? Is there a place up there somewhere that constitutes an interface between a purely relational system and some different kind of structure? Somehow at the top there must be meaning.
Slide 72 What are meanings? For example, DOG. In the mind: the concept DOG and perceptual properties of dogs. In the world outside: all those dogs out there and their properties.
Slide 73 How high is up? Downward is toward expression; upward is toward meaning/function. Does it keep going up forever? No: as it keeps going it arches over, through perception. Conceptual structure is at the top.
Slide 74 The great cognitive arch. (Diagram: the top of the arch.)
Slide 75 Topics: Aims of SFL and NCL; From systemic networks to relational networks; Relational networks as purely relational; Levels of precision in description; Narrow relational network notation; Narrow relational networks and neural networks; Enhanced understanding of systemic-functional choice; Enhanced appreciation of variability in language.
Slide 76 Systemic networks vis-à-vis relational networks: how related?
They operate at different levels of precision. Compare chemistry and physics: chemistry for molecules, physics for atoms. Both are valuable for their purposes.
Slide 77 Different levels of investigation: living beings. Systems biology; cellular biology; molecular biology; chemistry; physics.
Slide 78 Levels of precision. Advantages of description at a level of greater precision: greater precision; shows relationships to other areas. Disadvantages of description at a level of greater precision: more difficult to accomplish, therefore can't cover as much ground; more difficult for the consumer to grasp (too many trees, not enough forest).
Slide 79 Three levels of precision for language: systemic networks; abstract relational network notation; narrow relational network notation (coming up).
Slide 80 Topics: Aims of SFL and NCL; From systemic networks to relational networks; Relational networks as purely relational; Levels of precision in description; Narrow relational network notation; Narrow relational networks and neural networks; Enhanced understanding of systemic-functional choice; Enhanced appreciation of variability in language.
Slide 81 Narrow relational network notation. Developed later. Used for representing network structures in greater detail: the internal structures of the lines and nodes of the abstract notation. The original notation can be called the abstract notation or the compact notation.
Slide 82 Toward greater precision. The nodes evidently have internal structures; otherwise, how to account for their behavior? We can analyze them and figure out what internal structure would make them behave as they do.
Slide 83 The ordered AND: how does it know? Activation coming downward from above: how does the AND node know how long to wait before sending activation down the second line?
It must have internal structure to govern this function. We use the narrow notation to model the internal structure.
Slide 84 Internal structure: narrow network notation. As each line is bidirectional, it can be analyzed into a pair of one-way lines. Likewise, the simple nodes can be analyzed as pairs of one-way nodes.
Slide 85 Abstract and narrow notation. Abstract notation is also known as compact notation. The two notations are like different scales for making a map: narrow notation shows greater detail and greater precision. Narrow notation ought to be closer to the actual neural structures. www.ruf.rice.edu/~lngbrain/shipman
Slide 86 Narrow and abstract network notation. Narrow notation: closer to neurological structure; nodes represent cortical columns; links represent neural fibers (or bundles of fibers); uni-directional. Abstract notation: nodes show type of relationship (OR, AND); easier for representing linguistic relationships; bidirectional; not as close to neurological structure. (Diagram: eat, apple.)
Slide 87 More on the two network notations. The lines and nodes of the abstract notation represent abbreviations, hence the designation "abstract". Compare the representation of a divided highway on a highway map: in a more compact notation it is shown as a single line; in a narrow notation it is shown as two parallel lines of opposite direction.
Slide 88 Two different network notations. (Diagram: abstract notation, bidirectional, vs. narrow notation, with separate upward and downward lines.)
Slide 89 Downward nodes: internal structure. (Diagram: AND, OR.)
Slide 90 Upward nodes: internal structure. (Diagram: AND, OR.)
Slide 91 Downward AND, upward direction. (Diagram: the wait element W.)
Slide 92 AND vs. OR. In one direction their internal structures are the same. In the other, it is a difference in threshold: hi or lo threshold for hi or lo degree of activation required to cross.
Slide 93 Thresholds in narrow notation. (Diagram: thresholds 1-4, from OR to AND.) You no longer need a basic distinction AND vs. OR; you can have intermediate degrees, between AND and OR. The AND/OR distinction was a simplification anyway; it doesn't always work!
Slide 94 The wait element. (Diagram: W keeps the activation alive; activation continues to B after A has been activated.) Downward AND, downward direction.
Slide 95 Structure of the wait element. www.ruf.rice.edu/~lngbrain/neel
Slide 96 Node types in narrow notation: T junction; branching; blocking.
Slide 97 Two types of connection: excitatory; inhibitory (type 1, type 2).
Slide 98 Types of inhibitory connection. Type 1 connects to a node. Type 2 connects to a line; used for blocking default realization. For example, from the node for "second" there is a blocking connection to the line leading to "two".
Slide 99 Type 2 connects to a line. (Diagram: TWO, ORDINAL; second, two, -th.)
Slide 100 Additional details of structure can be shown in narrow notation: connections between upward and downward directions; varying degrees of connection strength; variation in threshold strength; contrast.
Slide 101 The two directions. (Diagram: upward and downward lines, each with a wait element.)
Slide 102 The two directions. Two questions: 1. Are they really next to each other? 2. How do they communicate with each other?
Slide 103 Separate but in touch. (Diagram: upward and downward paths.) In phonology, we know from aphasiology and neuroscience that they are in different parts of the cerebral cortex.
Slide 104 Phonological nodes in the cortex. (Diagram: frontal lobe, temporal lobe, arcuate fasciculus.)
Slide 105 Topics: Aims of SFL and NCL; From systemic networks to relational networks; Relational networks as purely relational; Levels of precision in description; Narrow relational network notation; Narrow relational networks and neural networks; Enhanced understanding of systemic-functional choice; Enhanced appreciation of variability in language.
Slide 106 Another level of precision: systemic networks; abstract relational network notation; narrow relational network notation; cortical columns and neural fibers; neurons, axons, dendrites, neurotransmitters.
Slide 107 Narrow RN notation as a set of hypotheses. Question: are relational networks related in any way to neural networks? We can find out. Narrow RN notation can be viewed as a set of hypotheses about brain structure and function. Every property of narrow RN notation can be tested for neurological plausibility.
Slide 108 Some properties of narrow RN notation. Lines have direction (they are one-way), but they tend to come in pairs of opposite direction (upward and downward); connections are either excitatory or inhibitory. Correspondingly: nerve fibers carry activation in just one direction; cortico-cortical connections are generally reciprocal; connections are either excitatory or inhibitory (from different types of neurons, with two different neurotransmitters).
Slide 109 More properties as hypotheses. Nodes have differing thresholds of activation; inhibitory connections are of two kinds; additional properties (too technical for this presentation). Correspondingly: neurons have different thresholds of activation; inhibitory connections are of two kinds (type 2: axo-axonal). All are verified. (Diagram: type 1, type 2.)
Slide 110 The node of narrow RN notation vis-à-vis neural structures. The node corresponds not to a single neuron
but to a bundle of neurons: the cortical column. A column consists of 70-100 neurons stacked on top of one another. All neurons within a column act together; when a column is activated, all of its neurons are activated.
Slide 111 The node as a cortical column. The properties of the cortical column are approximately those described by Vernon Mountcastle: "[T]he effective unit of operation ... is not the single neuron and its axon, but bundles or groups of cells and their axons with similar functional properties and anatomical connections." Vernon Mountcastle, Perceptual Neuroscience (1998), p. 192.
Slide 112 Three views of the gray matter. Different stains show different features; the Nissl stain shows cell bodies of pyramidal neurons.
Slide 113 The cerebral cortex. Grey matter: columns of neurons. White matter: inter-column connections.
Slide 114 Microelectrode penetrations in the paw area of a cat's cortex.
Slide 115 The (mini)column. Width is about (or just larger than) the diameter of a single pyramidal cell: about 30-50 microns in diameter. Extends through the six cortical layers, three to six mm in length; the entire thickness of the cortex is accounted for by the columns. Roughly cylindrical in shape. If expanded by a factor of 100, the dimensions would correspond to a tube with a diameter of 1/8 inch and a length of one foot.
Slide 116 Cortical column structure. Minicolumn: 30-50 microns diameter. Recurrent axon collaterals of pyramidal neurons activate other neurons in the same column. Inhibitory neurons can inhibit neurons of neighboring columns (function: contrast). Excitatory connections can activate neighboring columns; in this case we get a bundle of contiguous columns acting as a unit.
Slide 117 Levels of precision: systemic networks; abstract relational network notation; narrow relational network notation; cortical columns and neural fibers; neurons, axons, dendrites, neurotransmitters; intraneural structures (pre-/post-synaptic terminals, microtubules, ion channels, etc.).
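The narrow-notation claim of slides 92-93, that the AND/OR distinction reduces to a difference in threshold on a node that sums its incoming activation, can be sketched as follows. This is an illustrative toy model under assumed discrete activation values, not a simulation of cortical columns; the function name is an assumption for the sketch.

```python
def node_fires(inputs, threshold):
    """A narrow-notation node sums its incoming activation and fires
    when the total reaches its threshold. A low threshold behaves like
    OR (any one line suffices); a high threshold behaves like AND
    (all lines required); intermediate thresholds give degrees between."""
    return sum(inputs) >= threshold

one_line_active = [1, 0]
both_lines_active = [1, 1]

# OR-like node: threshold 1, either incoming line suffices
print(node_fires(one_line_active, threshold=1))    # -> True
# AND-like node: threshold 2, both incoming lines required
print(node_fires(one_line_active, threshold=2))    # -> False
print(node_fires(both_lines_active, threshold=2))  # -> True
```

With three or more incoming lines, a threshold of 2 gives an "at least two" node, one of the intermediate degrees between AND and OR that the slides note the binary distinction cannot express.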
Slide 118 Levels of precision: informal functional descriptions; semi-formal functional descriptions; systemic networks; abstract relational network notation; narrow relational network notation; cortical columns and neural fibers; neurons, axons, dendrites; intraneural structures and processes.
Slide 119 Topics: Aims of SFL and NCL; From systemic networks to relational networks; Relational networks as purely relational; Levels of precision in description; Narrow relational network notation; Narrow relational networks and neural networks; Enhanced understanding of systemic-functional choice; Enhanced appreciation of variability in language.
Slide 120 Competition vis-à-vis Halliday's systems. Halliday (not an exact quote): putting the emphasis on systems gives recognition to the importance of Saussure's principle that everything meaningful has meaning in contrast to what could have been selected instead.
Slide 121 Paradigmatic contrast: competition. (Diagram: a, b, each with threshold 2.) For example, /p/ vs. /k/.
Slide 122 Simplified model of minicolumn II: inhibition of competitors. (Diagram: cortical layers II-VI; inputs from the thalamus and other cortical locations; outputs to cells in neighboring columns. Cell types: pyramidal, spiny stellate, inhibitory.)
Slide 123 Local and distal connections. (Diagram: excitatory and inhibitory connections.)
Slide 124 Paradigmatic contrast: competition. (Diagram: a, b.)
Slide 125 Paradigmatic contrast: competition. (Diagram: a, b, each with threshold 2.)
Slide 126 Competition vis-à-vis Halliday's systems. Halliday (not an exact quote): putting the emphasis on systems gives recognition to the importance of Saussure's principle that everything meaningful has meaning in contrast to what could have been selected instead.
Slide 127 Topics: Aims of SFL and NCL; From systemic networks to relational networks; Relational networks as purely relational; Levels of precision in description; Narrow relational network notation; Narrow relational networks and neural networks; Enhanced understanding of systemic-functional choice; Enhanced appreciation of variability in language.
Slide 128 Precision vis-à-vis variability.
Description at a level of greater precision encourages observation of variability. At the level of the forest, we are aware of the trees, but we tend to overlook the differences among them. At the level of the trees we clearly see the differences among them; but describing the forest at the level of detail used in describing trees would be very cumbersome. At the level of the trees we tend to overlook the differences among the leaves; at the level of the leaves we tend to overlook the differences among their component cells.
Slide 129 Linguistic examples. At the cognitive level we clearly see that every person's linguistic system is different from that of everyone else. We also see variation within the single person's system from day to day. At the level of narrow notation we can treat variation in connection strengths, variation in threshold strength, and variation in levels of activation. We are thus able to explain prototypicality phenomena, learning, etc.
Slide 130 Variation in connection strength. Connections get stronger with use: every time the linguistic system is used, it changes. This can be indicated roughly by the thickness of connecting lines in diagrams or by little numbers written next to lines.
Slide 131 Variation in threshold strength. Thresholds are not fixed; they vary as a result of use (learning). Nor are they integral. What we really have are threshold functions, such that a weak amount of incoming activation produces no response, a larger degree of activation results in weak outgoing activation, and a still higher degree of activation yields strong outgoing activation: an S-shaped (sigmoid) function. N.B.
All of these properties are found in neural structures.
Slide 132 Threshold function. (Graph: outgoing activation as an S-shaped function of incoming activation.)
Slide 133 Topics in this presentation: Aims of SFL and NCL; From systemic networks to relational networks; Relational networks as purely relational; Levels of precision in description; Narrow relational network notation; Narrow relational networks and neural networks; Enhanced understanding of systemic-functional choice; Enhanced appreciation of variability in language.
Slide 134 Thank you for your attention!
Slide 135 References.
Halliday, M.A.K., 1994/2003. Appendix: systemic theory. In On Language and Linguistics (vol. 3 in the Collected Works of M.A.K. Halliday), ed. Jonathan Webster. London: Continuum.
Halliday, M.A.K., 2009. Methods - techniques - problems. In Continuum Companion to Systemic Functional Linguistics (eds. M.A.K. Halliday & Jonathan Webster). London: Continuum.
Hockett, Charles F., 1966. Linguistic units and their relations. Language 42.
Lamb, Sydney M., 1971. The crooked path of progress in cognitive linguistics. Georgetown Roundtable.
Lamb, Sydney M., 1999. Pathways of the Brain: The Neurocognitive Basis of Language. John Benjamins.
Lamb, Sydney M., 2004a. Language as a network of relationships. In Jonathan Webster (ed.), Language and Reality (Selected Writings of Sydney Lamb). London: Continuum.
Lamb, Sydney M., 2004b. Learning syntax: a neurocognitive approach. In Jonathan Webster (ed.), Language and Reality (Selected Writings of Sydney Lamb). London: Continuum.
Mountcastle, Vernon W., 1998. Perceptual Neuroscience: The Cerebral Cortex. Cambridge: Harvard University Press.
Slide 136 For further information: www.rice.edu/langbrain
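As a closing illustration, the S-shaped threshold function of slides 131-132 can be sketched with a standard logistic curve: weak incoming activation produces essentially no outgoing activation, a larger degree produces weak output, and a still higher degree produces strong output. The midpoint and steepness values here are illustrative assumptions, not values given in the presentation.

```python
import math

def threshold_function(incoming, midpoint=5.0, steepness=2.0):
    """Map incoming activation to outgoing activation via a sigmoid
    (logistic) curve, the S-shaped threshold function of slide 132."""
    return 1.0 / (1.0 + math.exp(-steepness * (incoming - midpoint)))

# Weak input: almost no response; near the midpoint: weak output;
# well above the midpoint: strong output.
for x in (1.0, 4.0, 5.0, 8.0):
    print(f"incoming {x:>4}: outgoing {threshold_function(x):.3f}")
```

Raising the midpoint models a threshold strengthened by learning; lowering it models a connection made easier to activate through use, the kind of variation slides 130-131 describe.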