Models of Brain and Mind: Physical, Computational and Psychological Approaches (Progress in Brain Research, Vol. 168)
R. Banerjee & B.K. Chakrabarti (Eds.)
Progress in Brain Research, Vol. 168
Copyright © 2008 Elsevier B.V. All rights reserved

Chapter 10

The emergence of mind and brain: an evolutionary, computational, and philosophical approach

Klaus Mainzer
Chair for Philosophy of Science, Institute of Interdisciplinary Informatics, University of Augsburg, D-86135 Augsburg, Germany
Abstract: Modern philosophy of mind cannot be understood without recent developments in computer science, artificial intelligence (AI), robotics, neuroscience, biology, linguistics, and psychology. Classical philosophy of formal languages, as well as symbolic AI, assumes that all kinds of knowledge must be explicitly represented in formal or programming languages. This assumption is challenged by recent insights from the biology of evolution and the developmental psychology of the human organism. Most of our knowledge is implicit and unconscious. It is not formally represented but embodied: knowledge that is learnt by doing and understood by bodily interacting with changing environments. This is true not only for low-level skills but even for high-level domains of categorization, language, and abstract thinking. The embodied mind is considered an emergent capacity of the brain as a self-organizing complex system. Indeed, self-organization has been a successful strategy of evolution for handling the increasing complexity of the world: genetic programs are not sufficient and cannot prepare the organism for every complex situation it may face in the future. Self-organization and emergence are fundamental concepts in the theory of complex dynamical systems. They are also applied in organic computing, a recent research field of computer science. Therefore, cognitive science, AI, and robotics try to model the embodied mind in an artificial evolution. This paper analyzes these approaches in the interdisciplinary framework of complex dynamical systems and discusses their philosophical impact.
Keywords: brain; mind; complex systems; nonlinear dynamics; self-organization; computational systems; artificial minds

From linear to nonlinear dynamics
DOI: 10.1016/S0079-6123(07)68010-8

The brain is a complex cellular system of 10^11 neurons and 10^14 synaptic connections. In order to understand and to model the emergence of its mental functions, we must study the nonlinear dynamics of complex systems. In general, a dynamical system is a time-dependent multi-component system of elements with local states determining a global state of the whole system. In a planetary system, for example, the state of a planet at a certain time is determined by its position and momentum. The states can also refer to moving molecules in a gas, the excitation of neurons in a neural network, the nutrition of organisms in an ecological system, supply and demand in economic markets, the behavior of social groups in human societies, routers in the complex network
of the internet, or units of complex electronic equipment in a car. The dynamics of a system, that is, the change of system states depending on time, is represented by linear or nonlinear differential equations. In the case of nonlinearity, several feedback activities take place between the elements of the system. These many-body problems correspond to nonlinear and nonintegrable equations with instabilities and sometimes chaos (Mainzer, 2007).
From a philosophical point of view, mathematical linearity means a strong concept of causality: similar causes or inputs of a dynamical system lead to similar effects or outputs, and small changes in the parameters or small perturbations added to the values of the variables produce only small changes in subsequent values of the variables. Furthermore, composite effects of linear systems can be reduced to the sum of simpler effects. Therefore, scientists have used linear equations to simplify the way in which we think about the behavior of complex systems. The principle of superposition has its roots in the concept of linearity. But in the case of nonlinearity, similar causes lead to exponentially separating and expanding effects: small changes in the parameters or small perturbations added to the values of the variables can produce enormous changes in subsequent values, because of the system's sensitivity to initial conditions. In this case, the whole is more than the sum of its elements.
The mathematical theory of nonlinear dynamics distinguishes different types of time-dependent equations, generating different types of behavior, such as fixed points, limit cycles, and chaos. In a top-down approach to model building, we start with an assumed mathematical model of a natural or technical system and deduce its behavior by solving the corresponding dynamical equations under certain initial conditions. The solutions can be represented geometrically as trajectories in the phase space of the dynamical system and classified by different types of attractors. But in practice, we often adopt the opposite method of a bottom-up approach. Physicists, chemists, biologists, physicians, or engineers start with data mining in an unknown field of research. They only get a finite series of measured data corresponding to time-dependent events of a dynamical system. From these data they must reconstruct the behavior of the system in order to guess its type of dynamical equation. Therefore, the bottom-up approach is called time series analysis. In many cases, we have no knowledge of the system from which the data were acquired. Time series analysis then aims to construct a black box which takes the measured data as input and provides as output a mathematical model describing the data (Small, 2005; Floridi, 2004). In practice, the realistic strategy of research is a combination of the top-down approach with model building and the bottom-up approach with time series analysis of the measured data.
In classical measurement theory, measurement error is analyzed by statistical methods, such as the correlation coefficient and the autocorrelation function. But these standard procedures are not able to distinguish between data from linear and nonlinear models. In nonlinear data analysis, the measured data are used in a first step to reconstruct the dynamics of the system in a phase space. Nonlinear dynamical systems generating chaos must be determined by at least three equations. For example, a three-dimensional attractor is generated in a phase space with three coordinates x(t), y(t), and z(t), which are determined by three time-dependent nonlinear differential equations. But in practice it is often difficult to distinguish several variables of a system. Nevertheless, if only one variable can be measured, an attractor with a finite number of dimensions can be reconstructed from the measured time series with great similarity to the original attractor of the system. We must only assume that we can also measure the derivative of that variable, and further higher-order derivatives up to some finite level d. Then, if the dimension of the system is less than d, we have enough information to completely describe the system with d differential or difference equations. Measuring d derivatives is equivalent to measuring the system at d different time intervals. Therefore, according to Takens' theorem, the measured time series of a variable can be embedded in a reconstructed phase space with d dimensions. The sequence of points created by embedding the measured time series is a reconstructed trajectory of the original
system, generating an attractor with great similarity to the original one of the system.
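The delay-embedding idea behind Takens' theorem can be sketched in a few lines of code. The following is a minimal illustration, not the author's procedure: a chaotic logistic-map series stands in for a measured variable, and the embedding dimension d and delay tau are assumed choices.

```python
# Sketch of delay-coordinate embedding (Takens' theorem): reconstruct
# a d-dimensional trajectory from a single measured variable.
# The logistic map stands in for measured data; d and tau are assumptions.

def delay_embed(series, d, tau):
    """Return d-dimensional delay vectors (x[t], x[t+tau], ..., x[t+(d-1)*tau])."""
    n = len(series) - (d - 1) * tau
    return [tuple(series[t + k * tau] for k in range(d)) for t in range(n)]

# Surrogate "measured" time series: one observable of a chaotic system.
x, series = 0.4, []
for _ in range(1000):
    x = 3.9 * x * (1.0 - x)          # logistic map, chaotic at r = 3.9
    series.append(x)

trajectory = delay_embed(series, d=3, tau=2)
print(len(trajectory), len(trajectory[0]))   # 996 3
```

Each delay vector is one point of the reconstructed trajectory; with d large enough, the reconstruction shares the topological properties of the original attractor.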
In practice, decisions about chaotic dynamics are rather difficult. How can we decide that a time series of measured data is not generated by noisy irregularity but by highly structured chaotic attractors? A chaotic attractor is determined by a trajectory in a bounded region of a phase space with aperiodic behavior and sensitive dependence on initial conditions. These criteria (determinism, boundedness, aperiodicity, and sensitivity) can be checked by several techniques of time series analysis. In the case of noise, the trajectories spread unbounded all over the phase space. A chaotic attractor is finite and always bounded in a certain region of the phase space. Aperiodicity means that the states of a dynamical system never return to their previous values. But values of states may return more or less to the vicinity of previous values. Thus, aperiodicity is a question of degree, which can be studied in recurrence plots of measured points. Such plots depict how the reconstructed trajectory recurs or repeats itself. The correlation integral defines the density of points in a recurrence plot where the measured time series are closer than a certain distance.
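The correlation integral described here is simple to approximate numerically. The sketch below (the distance threshold eps and the synthetic test series are assumptions) counts the fraction of pairs of reconstructed points that lie closer than eps:

```python
# Sketch: correlation sum of a reconstructed trajectory, i.e. the
# fraction of point pairs closer than a threshold eps. Threshold and
# test series are assumptions, not values from the text.
import math

def correlation_sum(points, eps):
    """Fraction of point pairs closer than eps (the correlation integral C(eps))."""
    n, close = len(points), 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) < eps:
                close += 1
    return 2.0 * close / (n * (n - 1))

# Delay vectors from a chaotic logistic-map series.
x, series = 0.4, []
for _ in range(400):
    x = 3.9 * x * (1.0 - x)
    series.append(x)
points = [(series[t], series[t + 2]) for t in range(len(series) - 2)]

c = correlation_sum(points, eps=0.1)
print(0.0 < c < 1.0)   # a bounded attractor recurs, but not everywhere
```

How C(eps) scales as eps shrinks is what distinguishes a low-dimensional attractor from unbounded noise.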
If a time series is generated by a chaotic system, the trajectory of the time series, which is reconstructed from the measurement data by embedding, has the same topological properties as the original attractor of the system, as long as the embedding dimension is large enough. Takens provided a method for finding an appropriate embedding dimension for the reconstruction of an attractor. But this method yields no procedure for finding a chaotic attractor, because its existence has already been assumed in order to determine its dimension from the measured data.
Another way to characterize chaotic dynamics is to measure the strength of their sensitive dependence on initial data. Consider two trajectories starting from nearly the same initial data. In chaotic dynamics, only a tiny difference in the initial conditions can result in the two trajectories diverging with exponential speed in the phase space after a short period of time. In this case, it is difficult to calculate long-term forecasts, because the initial data can only be determined with a finite degree of precision. Tiny deviations in digits behind the decimal point of measurement data may lead to completely different forecasts. This is the reason why attempts to forecast weather fail in an unstable and chaotic situation. In principle, the wing of a butterfly may cause a global change of development. This butterfly effect can be measured by the so-called Lyapunov exponent L. A trajectory x(t) starts with an initial state x(0). If it develops exponentially fast, then it is approximately given by |x(t)| ≈ |x(0)|·e^(Lt). The exponent is smaller than zero if the trajectory is attracted by attractors, such as stable points or orbits. It is larger than zero if it is divergent and sensitive to very small perturbations of the initial data.
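For a one-dimensional map, the Lyapunov exponent can be estimated directly as the trajectory average of ln|f'(x)|. A minimal sketch for the logistic map (map choice, seed, and sample sizes are assumptions; for r = 4 the exact exponent is ln 2):

```python
import math

# Sketch: Lyapunov exponent of the logistic map f(x) = r*x*(1-x),
# estimated as the average of ln|f'(x_n)| along a trajectory.
# For r = 4 the exact value is ln 2 ~ 0.693 (positive: chaos).

def lyapunov_logistic(r, x0=0.1234, n=100_000, transient=1000):
    x = x0
    for _ in range(transient):          # discard transient behavior
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))   # ln|f'(x)|
        x = r * x * (1.0 - x)
    return total / n

print(round(lyapunov_logistic(4.0), 2))   # ~0.69, i.e. close to ln 2
print(lyapunov_logistic(2.5) < 0)         # stable fixed point: negative exponent
```

A positive exponent quantifies the butterfly effect; a negative one signals attraction to a stable point or orbit, exactly as described above.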
An attractor is typically a finite region in the phase space. Sensitivity to initial conditions means that any nearby points on the attractor in the phase space diverge from each other. They cannot, however, diverge forever, because the attractor is finite. Thus, trajectories from nearby initial points on the attractor diverge and are folded back onto the attractor, diverge and are folded back, etc. The structure of the attractor consists of many fine layers, like an exquisite pastry. The closer one looks, the more detail in the adjacent layers of the trajectories is revealed. Thus, the attractor is fractal. An attractor that is fractal is called strange. There are also chaotic systems that are only exponentially sensitive to initial conditions but not strange. Attractors can also be strange (fractal) but not chaotic. The fractal dimension of an attractor is related to the number of independent variables needed to generate the time series of the values of the variables. If d is the smallest integer greater than the fractal dimension of the attractor, then the time series can be generated by a set of d differential equations with d independent variables. For example, a strange attractor of fractal dimension 2.03 needs three nonlinear coupled differential equations to generate its trajectory.
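The Lorenz system is the classic example of three nonlinear coupled differential equations generating a strange attractor whose fractal dimension lies slightly above 2. A minimal fixed-step integration sketch, with the standard parameter values assumed:

```python
# Sketch: the Lorenz system, three coupled nonlinear ODEs whose strange
# attractor has a fractal dimension slightly above 2, so three equations
# are needed. Integrated here with a basic fixed-step RK4 scheme.

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(f, state, h):
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + h * k for s, k in zip(state, k3)))
    return tuple(s + h / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

state, h = (1.0, 1.0, 1.0), 0.01
trajectory = []
for _ in range(5000):
    state = rk4_step(lorenz, state, h)
    trajectory.append(state)

# Aperiodic but bounded: the trajectory stays in a finite region,
# endlessly stretched and folded back onto the attractor.
print(all(abs(c) < 100 for p in trajectory for c in p))   # True
```

The stretching and folding described above is exactly what produces the layered, pastry-like fractal structure of this attractor.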
In summary, dynamical systems can be classifiedby attractors with increasing complexity from fixedpoints, periodic and quasi-periodic up to chaoticbehavior. This classification of attractors can be
characterized by different methods, such as typical patterns of time series, their power spectrum, phase portraits in a phase space, Lyapunov exponents, or fractal dimensions. A remarkable measure of complexity is the Kolmogorov-Sinai (KS) entropy, measuring the information flow in a dynamical system (Deco and Schürmann, 2000; Mainzer, 2007). A dynamical system can be considered an information-processing machine, computing a present or future state as output from an initial past state of input. Thus, the computational efforts to determine the states of a system characterize the computational complexity of a dynamical system. The transition from regular to chaotic systems corresponds to increasing computational problems, according to the computational degrees in the theory of computational complexity. In statistical mechanics, the information flow of a dynamical system describes the intrinsic evolution of statistical correlations between its past and future states. The KS-entropy is an extremely useful concept in studying the loss of predictable information in dynamical systems, according to the complexity degrees of their attractors. Actually, the KS-entropy yields a measure of the prediction uncertainty of a future state, provided the whole past is known (with finite precision).
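The KS-entropy can be illustrated numerically as the entropy rate of a symbolic time series. In the sketch below (the map, partition, and sample sizes are assumptions), the logistic map at r = 4 with a binary partition at x = 0.5 yields an entropy rate close to ln 2, in agreement with its positive Lyapunov exponent:

```python
import math
from collections import Counter

# Sketch: estimating the KS entropy of the logistic map (r = 4) as the
# entropy rate of its symbolic dynamics. The binary partition at x = 0.5
# is generating here, so the estimate approaches ln 2 ~ 0.693 nats,
# matching the positive Lyapunov exponent.

def block_entropy(symbols, n):
    """Shannon entropy (nats) of symbol blocks of length n."""
    counts = Counter(tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1))
    total = sum(counts.values())
    return -sum(c / total * math.log(c / total) for c in counts.values())

x, symbols = 0.1234, []
for _ in range(200_000):
    x = 4.0 * x * (1.0 - x)
    symbols.append(0 if x < 0.5 else 1)

# Entropy rate ~ H(n) - H(n-1) for growing block length n.
h = block_entropy(symbols, 6) - block_entropy(symbols, 5)
print(round(h, 1))   # ~0.7 nats per step, close to ln 2
```

A regular system would give an entropy rate of zero here, and pure noise would make the rate grow without bound as the symbol alphabet is refined.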
In the case of fixed points and limit cycles, that is, oscillating or quasi-oscillating behavior, there is no uncertainty or loss of information, and the prediction of a future state can be computed from the past. Consequently, the KS-entropy is zero. In chaotic systems with sensitive dependence on the initial states, there is a finite loss of information for predictions of the future, according to the decay of correlations between the past states and the future state of prediction. The finite degree of uncertainty of a predicted state increases linearly with its number of steps in the future, given the entire past. In the case of chaos, the KS-entropy has a finite value (larger than zero). But in the case of noise, the KS-entropy becomes infinite, which means a complete loss of predicting information, corresponding to the decay of all correlations (i.e., statistical independence) between the past and the noisy state of the future. The degree of uncertainty becomes infinite.

Self-organization and emergence in evolution
How can the knowledge of chaos be applied in order to control risky and unstable situations in complex systems? This question will be a challenge for modeling the brain with millions of interacting cells in nonlinear dynamics. It seems to be paradoxical that chaotic systems, which are extremely sensitive to the tiniest fluctuations, can be controlled. But nowadays the control of chaos has been realized in chemical, fluid, and biological systems. In technology, for example, the intrinsic instability of chaotic celestial orbits is routinely used to advantage by international space agencies, who divert spacecraft to travel vast distances using only modest fuel expenditures. All techniques of chaos control make use of the fact that chaotic systems can be controlled if disturbances are countered by small and intelligently applied impulses. Just as an acrobat balances about an unstable position on a tightrope by the application of small correcting movements, a chaotic system can be stabilized about any of an infinite number of unstable states by continuous application of small corrections.
Two characteristics of chaos make the application of control techniques possible. First, chaotic systems alternately visit small neighborhoods of an infinite number of periodic orbits. The presence of an infinite number of periodic orbits embedded within a chaotic trajectory implies the existence of an enormous variety of different behaviors within a single system. Thus, the control of chaos opens up the potential for great flexibility in operating performance within a single system.
A second characteristic of chaos that is important for control applications is its exponential sensitivity. It follows that the state of a chaotic system can be drastically altered by the application of small perturbations. Therefore, uncontrolled chaotic systems fluctuate wildly. But, on the other hand, controlled chaotic systems can be directed from one state to a very different one using only very small controls. Obviously, controlling strategies require that the system state lie close to the desired state. In such a case, the system dynamics can be linearized, making control calculations rapid and effective. In chaotic systems, ergodicity
ensures that the system state will eventually wander arbitrarily close to the desired state. But in higher-dimensional or slowly varying systems, the time taken for the state to move on its own from one state to another can be prohibitive. In this case, fully nonlinear control strategies have been devised that use chaotic sensitivity to steer the system state from any given initial point to a desired state. Since chaotic systems amplify control impulses exponentially, the time needed to steer such a system can be quite short. These strategies have been demonstrated both in systems in which a large effect is desired using very modest parameter expenditures (e.g., energy and fuel) and in systems in which rapid switching between states is needed (e.g., computational and communication applications).
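The stabilization of an unstable state by small, intelligently timed parameter kicks can be illustrated on the logistic map, in the style of Ott-Grebogi-Yorke control. This is a simplified sketch, not a general recipe; the control window and the kick bound are assumptions:

```python
# Sketch of OGY-style chaos control: tiny, well-timed parameter kicks
# stabilize the logistic map x -> r*x*(1-x) on its unstable fixed point
# x* = 1 - 1/r. Linearization: f_x = 2 - r, f_r = x*(1 - x*); the kick
# dr is chosen so that f_x*dx + f_r*dr = 0. Window and bound are assumptions.

R = 3.9
X_STAR = 1.0 - 1.0 / R                          # unstable fixed point, ~0.744
GAIN = -(2.0 - R) / (X_STAR * (1.0 - X_STAR))

def step(x, control_on):
    r = R
    dx = x - X_STAR
    if control_on and abs(dx) < 0.005:          # act only very near the target
        dr = max(-0.06, min(0.06, GAIN * dx))   # small, bounded kick
        r = R + dr
    return r * x * (1.0 - x)

x = 0.1234
for n in range(5000):
    x = step(x, control_on=(n >= 500))          # free chaos first, then control

print(abs(x - X_STAR) < 1e-3)   # ergodicity brings x into the window; then it stays
```

Ergodicity does the waiting: the uncontrolled orbit eventually wanders into the small control window, and from then on the linearized feedback pins it with kicks far smaller than the state change they prevent.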
Nonlinear dynamics does not only yield chaos and noise, but also order. The emergence of order and structures in evolution can be explained by the dynamics of attractors in complex systems (Mainzer, 2007). They result from collective patterns of interacting elements in the sense of many-body problems that cannot be reduced to the features of single elements in a complex system. Nonlinear interactions in multi-component (complex) systems often have synergetic effects, which can neither be traced back to single causes nor be forecast in the long run or controlled in all their details. Again, the whole is more than the sum of its parts. This popular slogan for emergence is precisely correct in the context of nonlinearity.
The mathematical formalism of complex dynamical systems is taken from statistical mechanics. If the external conditions of a system are changed by varying certain control parameters (e.g., temperature), the system may undergo a change in its macroscopic global states at some critical point. For instance, water as a complex system of molecules changes spontaneously from a liquid to a frozen state at a critical temperature of zero degrees Celsius. In physics, such transformations of collective states are called phase transitions. Obviously, they describe a change of self-organized behavior between the interacting elements of a complex system.
According to Landau, the suitable macrovariables characterizing the change of global order are denoted as order parameters. For example, the emergence of magnetization in a ferromagnet is a self-organized behavior of atomic dipoles that is modeled by a phase transition of an order parameter, the average distribution of microstates of the dipoles, when the system is annealed to the Curie point. The concept of order parameters can be generalized to phase transitions in which the system is driven away from equilibrium by increasing energy (Haken and Mikhailov, 1993). If, for example, a fluid stream is driven further and further away from thermal equilibrium by increasing fluid velocity (the control parameter), then fluid patterns of increasing complexity emerge, from vortices at fixed points through periodic oscillations to chaotic turbulence. Roughly speaking, we may say that old structures become unstable, are broken down by changing control parameters, and new patterns and attractors emerge.
More mathematically, nonlinear differential equations are employed to model the dynamics of the system. At first, we study the behavior of the elements on the microlevel in the vicinity of a critical point of instability. In a linear-stability analysis, one can distinguish stable and unstable modes; the unstable modes increase to the macroscopic scale, dominating the macrodynamics of the whole system. Thus, a few unstable modes become the order parameters of the whole system. From a methodological point of view, the introduction of order parameters for modeling self-organization and the emergence of new structures is a giant reduction of complexity. The study of, perhaps, billions of equations characterizing the behavior of the elements on the microlevel is replaced by a few equations of order parameters characterizing the macrodynamics of the whole system. Complex dynamical systems and their phase transitions deliver a successful formalism to model self-organization and emergence. Furthermore, knowledge of the characteristic order parameters and the critical values of control parameters offers a chance to influence the whole dynamics and to create desired states of technical systems by self-organization. The formalism does not depend on special (e.g., physical) laws, but must be appropriately interpreted for biological and technical applications.
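The logic of a linear-stability analysis can be shown on the simplest Landau-type order-parameter equation, dq/dt = a·q - q³, with control parameter a. A mode q* is stable when the linearization f'(q*) = a - 3q*² is negative; as a crosses zero, the old structure q* = 0 loses stability and two new stable states emerge. A sketch with illustrative parameter values:

```python
import math

# Sketch: linear-stability analysis of a Landau-type order-parameter
# equation dq/dt = a*q - q**3, with control parameter a. A fixed point
# q* is stable when the linearization f'(q*) = a - 3*q*^2 is negative.

def modes_and_stability(a):
    """Return (fixed point, stability eigenvalue) pairs for dq/dt = a*q - q^3."""
    fixed_points = [0.0]
    if a > 0:
        fixed_points += [math.sqrt(a), -math.sqrt(a)]
    return [(q, a - 3.0 * q * q) for q in fixed_points]

for a in (-1.0, 1.0):
    for q, eig in modes_and_stability(a):
        print(f"a={a:+}: q*={q:+.2f} is {'stable' if eig < 0 else 'unstable'}")
```

Below the critical point (a < 0) the only state is stable; above it (a > 0) the old state becomes the unstable mode and the two new fixed points are the emerging order parameters, the pitchfork analogue of a phase transition.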
According to the general scheme of nonlinear dynamics, biological organisms function on many levels that have emerged step-by-step during evolution. It is a question of granulation how deep we like to lay the initial layer of microdynamics. As far as we know, at least atomic dynamics influence the states of living organisms. During prebiotic evolution, interacting atoms and molecules created complex biomolecules (e.g., proteins) by catalytic and autocatalytic processes; these are the building blocks of cells. Interacting cells formed complex cellular systems like organs or organisms, which are elements of populations. Furthermore, interacting populations became elements of ecological networks as examples of complex systems. Thus, from the nonlinear dynamics at each level, there emerge new entities that are characterized by order parameters. Examples of order parameters are characteristic macroscopic features of phenotypes, which are determined by the genotype of an organism on the microlevel. The macrodynamics of these order parameters determine the microdynamics of the new entities, providing the basis of macrodynamics on the following level. In principle, the dynamics of each level can be modeled by appropriate nonlinear differential equations. In this case, the succeeding hierarchical level can be mathematically derived from the previous one by a linear-stability analysis.
How can order be regulated and controlled in living organisms? This is a key question with respect to modeling in organic computing. Important examples are bio-oscillators, which can be considered the order parameters of life. Nature abounds with rhythmic behavior that closely intertwines the physical and biological sciences. The diurnal variations in dark and light give rise to circadian physiological rhythms. But the rhythmic nature of biological processes is not only controlled by external processes. It often arises from the intrinsic dynamics of complex nonlinear networks. Since all biological systems are thermodynamically open to the environment, they are dissipative, that is, they give up energy to their surroundings in the form of heat. Thus, for the oscillator to remain periodic, energy must be supplied to the system in such a way as to balance the continual loss of energy due to dissipation. If a balance is maintained, then the phase-space orbit becomes a stable limit cycle, that is, all orbits in the neighborhood of this orbit merge with it asymptotically. Such a system is called a bio-oscillator, which, left to itself, begins to oscillate without apparent external excitation. The self-generating or self-regulating features of bio-oscillators depend on the intrinsic nonlinearity of the biological system.
How can perturbations of such systems be used to explore and control their physiological properties? This question does not only inspire new therapeutic methods in medicine, but also technical applications in organic computing. An example of a complex cellular system is the heart, which can be considered a bio-oscillator (Bassingthwaighte et al., 1994). In the simple case of an embryonic chick heart, a cardiac oscillator can be described by a system of ordinary differential equations with a single unstable steady state, displaying an asymptotically stable limit cycle oscillation that is globally attracting. After short perturbations, the pulses return quickly to the limit cycle. The dynamics can be studied in the corresponding time series of ECG curves. The dynamics of a mammalian heart is much more complex. The question arises whether observed fluctuations are the result of the oscillations being unpredictably perturbed by the cardiac environment, or a consequence of the cardiac dynamics being given by a chaotic attractor, or both. In healthy patients, the heart rate is modulated by a complex combination of respiratory, sympathetic, and parasympathetic regulators. For unhealthy patients, the ideas of chaos control can be incorporated into therapeutic situations. Control is attempted by stimulating the heart at appropriate times. Repeated intervention prevents the rhythm from returning to the chaotic mode.
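The van der Pol equation is a textbook model of such a bio-oscillator: dissipation is balanced by a nonlinear energy supply, so trajectories from very different initial conditions settle onto one stable limit cycle. A rough sketch (step size, parameter value, and initial conditions are assumptions):

```python
# Sketch: the van der Pol equation x'' - mu*(1 - x^2)*x' + x = 0, a
# textbook bio-oscillator model. Dissipation and nonlinear energy supply
# balance on a stable limit cycle, so different initial conditions
# converge to the same oscillation amplitude.

def vdp_step(x, v, mu=1.0, h=0.01):
    # One Euler step (small h keeps this crude scheme adequate for a sketch).
    return x + h * v, v + h * (mu * (1.0 - x * x) * v - x)

def final_amplitude(x, v, steps=50_000):
    peak = 0.0
    for i in range(steps):
        x, v = vdp_step(x, v)
        if i > steps // 2:            # ignore the transient approach
            peak = max(peak, abs(x))
    return peak

# A tiny perturbation and a large excursion settle on (nearly) the same cycle.
amp_small = final_amplitude(0.01, 0.0)
amp_large = final_amplitude(4.0, 0.0)
print(round(amp_small, 1), round(amp_large, 1))   # both close to 2
```

This asymptotic merging of neighboring orbits into one cycle is exactly the limit-cycle stability described above, and it is why short perturbations of a cardiac oscillator die out.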
Obviously, the total and global chaos of a system is dangerous. But local chaotic fluctuations are physiologically advantageous. Sustained periodicities are often unhealthy. To maintain health, the variables of a physiological system must be able to extend over a wide range to provide flexible adaptation. Healthy people have greater variability in their heart rates than those with heart disease.
Thus, local chaotic fluctuations may provide the plasticity to cope with the exigencies of an unpredictable and changing environment.
Chaotic systems can be controlled more finely and more quickly than linear systems. In linear systems, the response of the output depends linearly on the input. Small changes in a parameter of a linear system produce only small changes in the output. The variable controlling a chaotic physiological response may need to change by only a small amount to induce the desired large change in the physiological state. Moreover, a chaotic physiological system can switch very rapidly from one physiological state to another. Natural self-control and self-organization of complex physiological systems open a wide range of medical and engineering applications.
The traditional notion of health is one of homeostasis and is based on the idea that there exists an ideal state in which the body is operating in a maximally efficient way. On this view, illness is considered to be the deviation of the body from this state, and it is the business of the physician to assist the patient in regaining this state. The nonlinear dynamics of biological systems suggests replacing homeostasis with homeodynamics, allowing a more flexible view of how the systems work and making room for the concept of systems with complex responses, even to the point of inherent instability. The mammalian organism is composed of multiple nested loops of nonlinear interacting systems on the physiological level. How much greater are the possibilities for complex behavior at the psychic levels of the brain.

Self-organization and emergence of brain and mind
The coordination of the complex cellular and organic interactions in an organism needs a new kind of self-organizing control. That was made possible by the evolution of nervous systems, which also enabled organisms to adapt to changing living conditions and to learn from experiences with their respective environments. The hierarchy of anatomical organizations varies over different scales of magnitude, from molecular dimensions to that of the entire central nervous system (CNS). The research perspectives on these hierarchical levels may concern questions, for example, of how signals are integrated in dendrites, how neurons interact in a network, how networks interact in a system like vision, how systems interact in the CNS, or how the CNS interacts with its environment. Each stratum may be characterized by some order parameters determining its particular structure, which is caused by complex interactions of subelements with respect to the particular level of hierarchy.
On the microlevel of the brain, there are massively many-body problems which need a reductionist strategy to get a handle on their complexity. In the case of EEG pictures, a complex system of electrodes measures local states (electric potentials) of the brain. The whole state of a patient's brain on the microlevel is represented by local time series. In the case of, for example, petit mal epilepsy, they are characterized by typical cyclic peaks. The microscopic states determine the macroscopic electric field patterns during a cyclic period. Mathematically, the macroscopic patterns can be determined by spatial modes and order parameters, that is, the amplitudes of the field waves. In the corresponding phase space, they determine a chaotic attractor characterizing petit mal epilepsy.
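The decomposition of multichannel measurements into spatial modes with time-dependent amplitudes can be sketched with a singular value decomposition. The data below are synthetic (the spatial profile, frequency, and noise level are assumptions), not real EEG:

```python
import numpy as np

# Sketch: extracting a dominant spatial mode and its amplitude (an order
# parameter) from synthetic multichannel "EEG-like" data via SVD.
# Spatial profile, frequency, and noise level are assumptions.

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)
profile = np.array([1.0, 0.8, 0.5, 0.2, -0.3, -0.7])     # fixed spatial pattern
data = np.outer(np.sin(2 * np.pi * 3 * t), profile)       # one coherent mode
data += 0.05 * rng.standard_normal(data.shape)            # measurement noise

# SVD: rows of Vt are spatial modes; U*S gives their time-dependent amplitudes.
U, S, Vt = np.linalg.svd(data, full_matrices=False)
dominance = S[0] ** 2 / np.sum(S ** 2)
print(dominance > 0.9)   # True: one order parameter captures the dynamics
```

When one mode carries almost all of the variance, the six noisy channel recordings collapse to a single amplitude time series, the same reduction of complexity that order parameters achieve for field patterns.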
Neural self-organization on the cellular and subcellular level is determined by the information processing in and between neurons. Chemical transmitters can affect neural information processing through direct and indirect mechanisms of great plasticity. Long-term potentiation (LTP) of synaptic interaction is an extremely interesting topic of recent brain research. LTP seems to play an essential role in the neural self-organization of cognitive features such as memory and learning. The information is assumed to be stored in the synaptic connections of neural cell assemblies with typical macroscopic patterns.
But while an individual neuron does not see or reason or remember, brains are able to do so. Vision, reasoning, and remembrance are understood as higher-level functions. Scientists who prefer a bottom-up strategy maintain that higher-level functions of the brain can be neither
addressed nor understood until the particular properties of each neuron and synapse are explored and explained. An important insight of the complex-system approach is that emergent effects of the whole system are synergetic system effects which cannot be reduced to the single elements. They are results of nonlinear interactions. Therefore, the whole is more than the (linear) sum of its parts. Thus, from a methodological point of view, a purely bottom-up strategy of exploring the brain functions must fail. On the other hand, the advocates of a purely top-down strategy, proclaiming that cognition is completely independent of the nervous system, are caught in the old Cartesian dilemma: How does the ghost drive the machine?
Today, we can distinguish several degrees of complexity in the CNS. The scales consider molecules, membranes, synapses, neurons, nuclei, circuits, networks, layers, maps, sensory systems, and the entire nervous system. Beginning at the bottom, we may distinguish the orders of ion movement, channel configurations, action potentials, potential waves, locomotion, perception, behavior, feeling, and reasoning.
The different abilities of the brain need massively parallel information processing in a complex hierarchy of neural structures and areas. We know more or less complex models of the information processing in the visual and motoric systems. Even the dynamics of the emotional system interacts in a nonlinear feedback manner with several structures of the human brain. These complex systems produce neural maps of cell assemblies. The self-organization of somatosensory maps is well known in the visual and motoric cortex. They can be enlarged and changed by learning procedures such as the training of an ape's hand. Positron emission tomography (PET) pictures show macroscopic patterns of neurochemical metabolic cell assemblies in different regions of the brain which are correlated with cognitive abilities and conscious states such as looking, hearing, speaking, or thinking. Pattern formation of neural cell assemblies is even correlated with complex processes of psychic states. Perturbations of metabolic cellular interactions (e.g., by cocaine) can lead to nonlinear effects initiating complex changes of behavior (e.g., drug addiction). These correlations of neural cell assemblies and order parameters (attractors) of cognitive and conscious states demonstrate the connection of neurobiology and cognitive psychology in recent research, depending on the standards of measuring instruments and procedures.
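Kohonen's self-organizing map is a standard computational model of how such topographic maps can form and reorganize through learning. A tiny one-dimensional sketch (the network size, learning rate, neighborhood function, and schedule are all assumptions):

```python
import math
import random

# Sketch: a tiny 1-D Kohonen self-organizing map, a standard model of
# how cortical maps can form and reorganize by learning. Ten "neurons"
# adapt their preferred stimuli to cover a 1-D stimulus space [0, 1].
# Learning rate, neighborhood width, and schedule are assumptions.

random.seed(1)
weights = [random.random() for _ in range(10)]    # initial preferred stimuli

for step in range(5000):
    stimulus = random.random()
    winner = min(range(10), key=lambda i: abs(weights[i] - stimulus))
    radius = 3.0 * (1.0 - step / 5000) + 0.5      # shrinking neighborhood
    for i in range(10):
        h = math.exp(-((i - winner) ** 2) / (2 * radius ** 2))
        weights[i] += 0.1 * h * (stimulus - weights[i])   # move toward input

# The trained map spreads its preferred stimuli across the stimulus space.
print(max(weights) - min(weights) > 0.5)   # True: the map covers the space
```

Because neighboring units are dragged along with each winner, frequently stimulated regions of the input space claim more units, mirroring how trained body parts claim enlarged cortical territory.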
Many questions are still open. Thus, we can only observe that someone is thinking and feeling, but not what he is thinking and feeling. Further on, we observe no unique substance called consciousness, but complex macrostates of the brain with different degrees of sensoric, motoric, or other kinds of attention. Consciousness means that we are not only looking, listening, speaking, hearing, feeling, thinking, etc., but we also know and perceive ourselves during these cognitive processes. Our self is considered an order parameter of a state, emerging from a recursive process of multiple self-reflections, self-monitoring, and supervising of our conscious actions. Self-reflection is made possible by the so-called mirror neurons (e.g., in the Broca area) which let primates (especially humans) imitate and simulate interesting processes of their companions. Therefore, they can learn to take the perspectives of themselves and their companions in order to understand their intentions and to feel with them. The emergence of subjectivity is thus neuropsychologically well understood.
The brain does not only observe, map, and monitor the external world, but also internal states of the organism, especially its emotional states. Feeling means self-awareness of one's emotional states, which is mainly caused by the limbic system. In neuromedicine, the Theory of Mind (ToM) even analyzes the neural correlates of social feeling which are situated in special areas of the neocortex. For example, people suffering from Alzheimer's disease lose their feeling of empathy and social responsibility because the correlated neural areas are destroyed. Therefore, our moral reasoning and deciding have a clear basis in brain dynamics.
From a neuropsychological point of view, the old philosophical problem of qualia is also solvable. Qualia are properties which are consciously experienced by a person. In a famous thought experiment, a neurobiologist is assumed to be caught in a black-and-white room. Theoretically, she knows everything about the neural information processing of colors. But she never had a chance to experience colors. Therefore, exact knowledge says nothing about the quality of conscious experience. Qualia in that sense emerge through the bodily interaction of self-conscious organisms with their environment, which can be explained by the nonlinear dynamics of complex systems. Therefore, we can explain the dynamics of subjective feelings and experiences, but, of course, the actual feeling remains an individual experience. In medicine, the dynamics of a certain pain can often be completely explained by a physician, although the actual feeling of pain is an individual experience of the patient.
In order to model the brain and its complex abilities, it is quite adequate to distinguish the following categories. In neuronal-level models, studies are concentrated on the dynamic and adaptive properties of each nerve cell or neuron, in order to describe the neuron as a unit. In network-level models, identical neurons are interconnected to exhibit emergent system functions. In nervous-system-level models, several networks are combined to demonstrate more complex functions of sensory perception, motor functions, stability control, etc. In mental-operation-level models, the basic processes of cognition, thinking, problem solving, etc. are described.
In the complex systems approach, the microscopic level of interacting neurons is modeled by coupled differential equations describing the transmission of nerve impulses by each neuron. The Hodgkin-Huxley equation is an example of a nonlinear reaction-diffusion equation with an exact solution of a traveling wave, giving a precise prediction of the speed and shape of the nerve impulse of electric voltage. In general, nerve impulses emerge as new dynamical entities like ring waves in BZ reactions or fluid patterns in nonequilibrium dynamics. In short, they are the atoms of the complex neural dynamics. On the macroscopic level, they generate a cell assembly whose macrodynamics is dominated by order parameters. For example, a synchronously firing cell assembly represents some visual perception of a plant which is not only the sum of its perceived pixels, but is characterized by some typical macroscopic features like form, background, or foreground. On the next level, cell assemblies of several perceptions interact in a complex scenario. In this case, each cell assembly is a firing unit, generating a cell assembly of cell assemblies whose macrodynamics is characterized by some order parameters. The order parameters may represent similar properties of the perceived objects.
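The emergence of a nerve impulse from nonlinear dynamics can be illustrated numerically. The following sketch uses the FitzHugh-Nagumo model, a standard two-variable simplification of the Hodgkin-Huxley equations; the model choice, the Euler integration, and all parameter values are illustrative assumptions, not taken from this chapter. A subthreshold input current leaves the membrane variable at rest, while a suprathreshold input generates periodic impulses.

```python
# FitzHugh-Nagumo model: a two-variable simplification of the
# Hodgkin-Huxley equations, showing how a nerve impulse ("spike")
# emerges as a new dynamical entity.  Parameters are textbook defaults.

def fitzhugh_nagumo(i_ext, v0=-1.2, w0=-0.625, dt=0.01, steps=30000):
    """Euler-integrate membrane variable v and recovery variable w."""
    a, b, eps = 0.7, 0.8, 0.08
    v, w = v0, w0
    trace = []
    for _ in range(steps):
        dv = v - v**3 / 3.0 - w + i_ext      # fast membrane dynamics
        dw = eps * (v + a - b * w)           # slow recovery dynamics
        v, w = v + dt * dv, w + dt * dw
        trace.append(v)
    return trace

rest = fitzhugh_nagumo(i_ext=0.0)    # subthreshold: stays near rest
spikes = fitzhugh_nagumo(i_ext=0.5)  # suprathreshold: periodic impulses
print(max(rest), max(spikes))
```

The qualitative change between the two runs, from a stable rest state to a limit cycle of impulses, is exactly the kind of attractor transition the chapter describes.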
In this way, we get a hierarchy of emerging levels of cognition, starting with the microdynamics of firing neurons. The dynamics of each level is assumed to be characterized by differential equations with order parameters. For example, on the first level of macrodynamics, order parameters characterize a visual perception. On the following level, the observer becomes conscious of the perception. Then the cell assembly of perception is connected with the neural area that is responsible for states of consciousness. In a next step, a conscious perception can be the goal of planning activities. In this case, cell assemblies of cell assemblies are connected with neural areas in the planning cortex, and so on. They are represented by coupled nonlinear equations with firing rates of corresponding cell assemblies. Even high-level concepts like self-consciousness can be explained by self-reflections of self-reflections, connected with a personal memory which is represented in corresponding cell assemblies of the brain. Brain states emerge, persist for a small fraction of time, then disappear, and are replaced by other states. It is the flexibility and creativeness of this process that makes a brain so successful for animals in their adaptation to rapidly changing and unpredictable environments.
Self-organization and emergence of computational systems
Computational systems were historically constructed on the background of Turing's theory of computability (Dennett, 1998). In this functionalism, the hardware of a computer is related to the wetware of the human brain. The mind is understood as the software of a computer. Turing argued: if the human mind is computable, it can be represented by a Turing program (Church's thesis) which can be computed by a universal Turing machine, that is, technically by a general purpose computer. Even if people do not believe in Turing's strong AI thesis, they often claim classical computational cognitivism in the following sense: computational processes operate on symbolic representations referring to situations in the outside world. These formal representations should obey Tarski's correspondence theory of truth: imagine a real-world situation X1 (e.g., some boxes on a table) which is encoded by a symbolic representation A1 = encode(X1) (e.g., a description of the boxes on the table). If the symbolic representation A1 is decoded, then we get the real-world situation X1 as its meaning, that is, decode(A1) = X1. A real-world operation T (e.g., a manipulation of the boxes on the table by hand) should produce the same real-world result X2, whether performed in the real world or on the symbolic representation: decode(encode(T)(encode(X1))) = T(X1) = X2. Thus, there is an isomorphism between the outside situation and its formal representation. As the symbolic operations are completely determined by algorithms, the real-world processes are assumed to be completely controlled. Therefore, classical robotics operates with completely determined control mechanisms.
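The commutation condition decode(encode(T)(encode(X1))) = T(X1) = X2 can be made concrete in a few lines. The following toy sketch is purely illustrative: the "world" is a set of objects on a table, the symbolic representation is a list of ON(TABLE, ...) facts, and the same operation is performed once in the world and once on the symbols.

```python
# Toy illustration of Tarski-style correspondence: a real-world
# situation is encoded as symbolic facts, an operation is performed
# either in the world or on the symbols, and both routes must agree.
# All names and predicates here are illustrative.

def encode(world):
    """World state -> sorted symbolic facts like 'ON(TABLE,CUP)'."""
    return sorted("ON(TABLE,%s)" % obj for obj in world)

def decode(facts):
    """Symbolic facts -> world state (set of objects on the table)."""
    return {f[len("ON(TABLE,"):-1] for f in facts}

def t_world(world):
    """Real-world operation T: put a BALL onto the table by hand."""
    return world | {"BALL"}

def t_symbolic(facts):
    """The same operation performed on the symbolic representation."""
    return sorted(set(facts) | {"ON(TABLE,BALL)"})

x1 = {"CUP", "BOX"}
x2 = decode(t_symbolic(encode(x1)))   # symbolic route
assert x2 == t_world(x1)              # real-world route agrees
```

The final assertion is the isomorphism claim of classical cognitivism in miniature; the chapter's point is precisely that situated agents cannot rely on this commutation in open, changing environments.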
Symbolic representations with ontologies, categories, frames, and scripts of expert systems work along this line. But they are restricted to a specialized knowledge base without the background knowledge of a human expert. Human experts do not rely on explicit (declarative) rule-based representations only, but also on intuition and implicit (procedural) knowledge (Dreyfus, 1982; Searle, 1983). Further on, as Wittgenstein already knew, our understanding depends on situations. The situatedness of representations is a severe problem of informatics. A robot, for example, needs a complete symbolic representation of a situation which must be updated if the robot's position is changed. Imagine that it moves around a table with a ball and a cup on it. A formal representation in a computer language may be ON(TABLE,BALL), ON(TABLE,CUP), BEHIND(CUP,BALL), etc. Depending on the robot's position relative to the arrangement, the cup is sometimes behind the ball and sometimes not. So, the formal representation BEHIND(CUP,BALL) must always be updated for changing positions. How can the robot cope with incomplete knowledge? How can it distinguish between reality and its relative perspective? Situated agents like human beings need no symbolic representations and updating. They look, talk, and interact bodily, for example, by pointing to things. Even rational acting in sudden situations does not depend on symbolic representations and logical inferences, but on bodily interactions with a situation (e.g., looking, feeling, reacting).
Thus, we distinguish formal and embodied acting in games with more or less similarity to real life: chess, for example, is a formal game with complete representations, precisely defined states, board positions, and formal operations. Soccer is a nonformal game with skills depending on bodily interactions, without complete representations of situations and operations, which are never exactly identical. According to the French philosopher Merleau-Ponty (1962), intentional human skills do not need any symbolic representation, but they are trained, learnt, and embodied by the organism. An athlete like a pole-vaulter cannot repeat her successful jump like a machine generating the same product. The embodied mind is no mystery. Modern biology, neural, and cognitive science give many insights into its origin during the evolution of life.
Organic computing applies the principles of evolution and life to technical systems. The dominating principles in the complex world of evolution are self-organization and self-control. How can they be realized in technical systems? A nice test bed for all kinds of technical systems are computational automata. There is a precise relation between the self-organization of nonlinear systems with continuous dynamics and discrete cellular automata (CA). The dynamics of nonlinear systems is given by differential equations with continuous variables and a continuous parameter of time. Sometimes, difference equations with discrete time points are sufficient. If even the continuous variables are replaced by discrete (e.g., binary) variables, we get functional schemes of automata with functional arguments as inputs and functional values as outputs. There are classes of CA modeling the attractor behavior of nonlinear complex systems which is well known from self-organizing processes.
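The step from continuous dynamics to discrete automata can be shown in a minimal example. The sketch below implements a one-dimensional elementary cellular automaton (Wolfram's rule 90, chosen here only as an illustration): binary cell states, a purely local update rule, and a global pattern that emerges from a single seed cell.

```python
# A minimal one-dimensional cellular automaton (Wolfram's rule 90) as a
# discrete counterpart to continuous nonlinear dynamics: binary states,
# local interactions, global pattern emergence.

def step(cells, rule=90):
    """One synchronous update with periodic boundary conditions."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

cells = [0] * 31
cells[15] = 1                          # single seed cell
for _ in range(8):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

Rule 90 computes the XOR of each cell's two neighbors, so the seed unfolds into a Sierpinski-like triangle: a simple local rule generating a complex global pattern, which is the point of the CA analogy.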
But in many cases, there is no finite program in order to forecast the development of random patterns. In general, there are three reasons for computational limits of system dynamics. (1) A system may be undecidable in a strict logical sense. (2) Further, a system can be deterministic, but nonlinear and chaotic. In this case, the system depends sensitively on tiny changes of initial data in the sense of the butterfly effect. Long-term forecasting is restricted, and the computational costs of forecasting increase exponentially after a few steps of future predictions. (3) Finally, a system can be stochastic and nonlinear. In this case, only probabilistic predictions are possible. Thus, the pattern emergence of CA cannot be controlled in every case.
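Case (2), sensitive dependence on initial data, is easy to demonstrate with the chaotic logistic map (an illustrative choice, not a system discussed in the chapter): two trajectories that start a distance of 1e-10 apart become macroscopically different within a few dozen steps.

```python
# The butterfly effect in the chaotic logistic map x -> 4x(1-x):
# an initial difference of 1e-10 grows to order one in a few dozen
# iterations, so long-term forecasting is impossible in practice.

def logistic_orbit(x, steps, r=4.0):
    orbit = []
    for _ in range(steps):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

a = logistic_orbit(0.3, 60)
b = logistic_orbit(0.3 + 1e-10, 60)
gaps = [abs(u - v) for u, v in zip(a, b)]
print(gaps[0], gaps[-1])   # tiny at first, order one at the end
```

Since the gap roughly doubles per step, about thirty steps suffice to amplify 1e-10 into a macroscopic error, which is the exponential cost of forecasting mentioned above.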
Cellular automata are only a theoretical concept of computational dynamics. In electrical engineering, information and computer science, the concept of cellular neural networks (CNNs) has recently become an influential paradigm of complexity research and is being realized in information and chip technology (Chua and Roska, 2002; Mainzer, 2007). CNNs have been made possible by the sensor revolution of the late 1990s. Cheap sensors and micro-electro-mechanical system (MEMS) arrays have become popular as artificial eyes, noses, ears, tongues, and somatosensor devices. They generate an immense number of generic analog signals that have to be processed. Thus, a new kind of chip technology, similar to signal processing in natural organisms, is needed. Analog cellular computers are the technical response to the sensor revolution, mimicking the anatomy and physiology of sensory and processing organs. A CNN is their hard core, because it is an array of analog dynamic processors or cells.
In general, a CNN is a nonlinear analog circuit that processes signals in real time. It is a multi-component system of regularly spaced identical units, called cells, that communicate directly with each other only through their nearest neighbors. The locality of direct connections is a natural principle which is also realized by brains and CA. Total connectivity would be energetically too expensive, with the risk of information chaos. Therefore, it was neither selected by the evolution of the brain nor applied in technology. Unlike conventional CA, CNN host processors accept and generate analog signals in continuous time with real numbers as interaction values. The dynamics of a cell's state is defined by a nonlinear differential equation (the CNN state equation) with scalars for state, output, input, threshold, and coefficients, called synaptic weights, modeling the intensity of the synaptic connections of the cell with the inputs and outputs of the neighbor cells. The CNN output equation connects the states of a cell with the outputs.
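The standard CNN state and output equations can be sketched numerically. The following one-dimensional toy simulation uses the Chua-Yang form, dx_i/dt = -x_i + sum_k a_k y_(i+k) + b u_i + z with the piecewise-linear output y = 0.5(|x+1| - |x-1|); the template values are illustrative, not from the chapter. With self-feedback greater than one, each cell saturates to +1 or -1, which demonstrates the bistability that both brain research and chip design rely on.

```python
# Euler simulation of the standard (Chua-Yang) CNN state equation for a
# 1D array.  Template values a, b, z are illustrative.

def cnn_output(x):
    """CNN output equation: piecewise-linear saturation."""
    return 0.5 * (abs(x + 1.0) - abs(x - 1.0))

def cnn_run(x, u, a=(0.0, 2.0, 0.0), b=1.0, z=0.0, dt=0.05, steps=2000):
    """Integrate the CNN state equation with zero boundary cells."""
    n = len(x)
    for _ in range(steps):
        y = [cnn_output(v) for v in x]
        x = [
            x[i] + dt * (
                -x[i]
                + sum(a[k + 1] * (y[i + k] if 0 <= i + k < n else 0.0)
                      for k in (-1, 0, 1))
                + b * u[i] + z
            )
            for i in range(n)
        ]
    return x

# Bistability: with self-feedback > 1 each cell saturates to +1 or -1
# depending on the sign of its input.
u = [0.3, -0.3, 0.2, -0.1]
x = cnn_run([0.0] * 4, u)
print([round(cnn_output(v), 2) for v in x])
```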
CNN arrays are extremely useful for standards in visual computing. Examples are CNNs that detect patterns in either binary (black-and-white) or gray-scale input images. An image consists of pixels corresponding to the cells of a CNN with binary or gray-scale values. From the perspective of nonlinear dynamics, it is convenient to think of the standard CNN state equations as a set of ordinary differential equations. Contrary to the usual CA approach with only geometric pattern formation of cells, the dynamical behavior of CNNs can be studied analytically by nonlinear equations. Numerical examples deliver CNNs with limit cycles and chaotic attractors. For technical implementations of CNNs, such as silicon chips, complete stability properties must be formulated, in order to avoid oscillations, chaos, and noise phenomena. These results also have practical importance for image processing applications of CNNs. As brains and computers work with units in two distinct states, the conditions of bistability are studied in brain research as well as in chip technology.
CNNs are optimal candidates to simulate the local synaptic interactions of neurons generating collective macrophenomena. Hallucinations, for example, are the results of self-organizing phenomena within the visual cortex. This type of pattern perception seems to be similar to the pattern formation of fluids in chemistry or aerodynamics. Pattern formation in the visual brain is due to local nonlinear coupling among cells. In the living organism, there is a spatial transformation between the pattern perception of the retina and the pattern formation within the visual cortex of the brain. First simulations of this cortico-retinal transformation by CNNs generate remarkable similarities with pattern perceptions that are well known from subjective experiences of hallucinations. Perceptions of a spiraling tunnel pattern have been reported by people who were clinically dead and later revived. The light at the end of the tunnel has sometimes been interpreted as a religious experience.
CNNs with information processing in nanoseconds, and even at the speed of light, seem to be optimal candidates for applications in neurobionics. Obviously, there are surprising similarities between CNN architectures and, for example, the visual pathway of the brain. An appropriate CNN approach is called the Bionic Eye, which involves a formal framework of vision models combined and implemented on the so-called CNN universal machine. Like a universal Turing machine, a CNN universal machine can simulate any specialized CNN and is technically constructed in chip technology. Visual illusions which have been studied in cognitive psychology can also be simulated by a universal CNN chip. The same architecture of a universal machine can not only be used to mimic the retinas of animals (e.g., of a frog, tiger salamander, rabbit, or eagle); these models can also be combined and optimized for technical applications. The combination of biological and artificial chips is no longer a science-fiction dream of cyborgs, but a technical reality with inspiring ramifications for robotics and medicine.
In epileptology, clinical applications of CNN chips have already been envisaged. The idea is to develop a miniaturized chip device for the prediction and prevention of epileptic seizures. Nonlinear time series analysis techniques have been developed to characterize the typical EEG patterns of an epileptic seizure and to recognize the phase transitions leading to epileptic neural states. These techniques mainly involve estimates of established criteria such as correlation dimension, Kolmogorov-Sinai entropy, Lyapunov exponents, fractal similarity, etc. Implantable seizure prediction and prevention devices are already in use with Parkinsonian patients. In the case of epileptic processes, such a device would continuously monitor features extracted from the EEG, compute the probability of an impending seizure, and provide suitable prevention techniques. It should also possess both a high flexibility for tuning to individual patient patterns and a high efficacy to allow the estimation of these features in real time. Eventually, it should have low energy consumption and be small enough to be implemented in a miniaturized, implantable system. These requirements are optimally realized by CNNs, with their massive parallel computing power, analog information processing, and capacity for universal computing.
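One of the criteria listed above, the largest Lyapunov exponent, distinguishes chaotic from regular dynamics by the average exponential divergence of nearby trajectories. The toy estimate below computes it for the logistic map rather than for EEG data (an illustrative substitution): the exponent is the orbit average of log|f'(x)|, positive in the chaotic regime and negative in a regular one.

```python
# Toy estimate of the largest Lyapunov exponent for the logistic map
# x -> r*x*(1-x): the orbit average of log|f'(x)| with f'(x) = r*(1-2x).
# At r = 4 the exact value is ln 2 (chaos); at r = 3.2 it is negative
# (a stable period-2 state).  This is a stand-in for EEG data.

import math

def lyapunov_logistic(r, x=0.3, transient=500, steps=5000):
    for _ in range(transient):                     # discard transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(steps):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / steps

print(lyapunov_logistic(4.0))   # positive: chaotic dynamics
print(lyapunov_logistic(3.2))   # negative: regular dynamics
```

For real EEG the exponent must be reconstructed from a scalar time series via phase-space embedding, which is considerably more delicate; the sign-based distinction, however, is the same.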
In complex dynamical systems of organisms, monitoring and controlling are realized on hierarchical levels. Thus, we must study the nonlinear dynamics of these systems in experimental situations, in order to find appropriate order parameters and to prevent undesired emergent behavior as possible attractors. From the point of view of systems science, the challenge of organic computing is controlled emergence.
A key application is the nonlinear dynamics of brains. Brains are neural systems which allow quick adaptation to changing situations during the lifetime of an organism. In short, they can learn, assess, and anticipate. The human brain is a complex system of neurons self-organizing in macroscopic patterns by neurochemical interactions. Perceptions, emotions, thoughts, and consciousness correspond to these patterns. Motor knowledge, for instance, is learnt in an unknown environment and stored implicitly in the distribution of synaptic weights of the neural nets. Technically, self-organization and pattern emergence can be realized by neural networks, working like brains with appropriate topologies and learning algorithms. Neural networks are complex systems of threshold elements with firing and nonfiring states, according to learning strategies (e.g., Hebbian learning). Besides deterministic homogeneous Hopfield networks, there are so-called Boltzmann machines with a stochastic network architecture of nondeterministic processor elements and a distributed knowledge representation, which is described mathematically by an energy function. While Hopfield systems use a Hebbian learning strategy, Boltzmann machines favor a backpropagation strategy (Widrow-Hoff rule) with hidden neurons in a many-layered network (Mainzer, 2003).
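The Hopfield case can be made concrete in a few lines. In the minimal sketch below (pattern and sizes are illustrative), a pattern is stored in the synaptic weights by Hebbian learning, and a noisy version of it relaxes back into the stored pattern as an attractor of the network dynamics.

```python
# A minimal Hopfield network with Hebbian learning: patterns are stored
# in the synaptic weights, and a corrupted input relaxes into the stored
# pattern as an attractor of the network dynamics.

def hebbian_weights(patterns):
    """Hebbian rule: w_ij = average of p_i * p_j over stored patterns."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, sweeps=5):
    """Asynchronous threshold updates until the state settles."""
    n = len(state)
    state = list(state)
    for _ in range(sweeps):
        for i in range(n):
            h = sum(w[i][j] * state[j] for j in range(n))
            state[i] = 1 if h >= 0 else -1
    return state

stored = [1, 1, 1, 1, -1, -1, -1, -1]
w = hebbian_weights([stored])
noisy = [1, -1, 1, 1, -1, -1, 1, -1]   # two bits flipped
print(recall(w, noisy))
```

The recalled state equals the stored pattern: the distributed weights implement a content-addressable memory, which is what "knowledge stored implicitly in the distribution of synaptic weights" means operationally.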
In general, it is the aim of a learning algorithm to diminish the information-theoretic measure of the discrepancy between the brain's internal model of the world and the real environment via self-organization. The interest in the field of neural networks is mainly inspired by the successful technical applications of statistical mechanics and nonlinear dynamics to solid state physics, spin glass physics, chemical parallel computers, optical parallel computers, or laser systems. Other reasons are the recent development of computing resources and the level of technology which make a computational treatment of nonlinear systems more and more feasible.
A simple robot with diverse sensors (e.g., proximity, light, collision) and motor equipment can generate complex behavior by a self-organizing neural network. In the case of a collision with an obstacle, the synaptic connections between the active nodes of the proximity and collision layers are reinforced by Hebbian learning: a behavioral pattern emerges, in order to avoid collisions in the future (Pfeifer and Scheier, 2001). In the human organism, walking is a complex bodily self-organization, largely without central control by brain and consciousness: it is driven by the dynamical pattern of a steady periodic motion, the attractor of the motor system.
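The Hebbian collision-avoidance mechanism can be sketched in its most reduced form (a single synapse; all numbers, thresholds, and the learning-rate value are illustrative assumptions, not from Pfeifer and Scheier): whenever the proximity node and the collision reflex are active together, the synapse between them is strengthened, until the proximity signal alone triggers avoidance.

```python
# Reduced sketch of Hebbian collision avoidance: co-activity of a
# proximity node and the collision reflex strengthens their synapse,
# so proximity alone eventually triggers the avoidance behavior.

def hebbian_avoidance(trials, eta=0.2, threshold=0.5):
    w = 0.0                   # proximity -> avoidance synaptic weight
    history = []
    for proximity, collision in trials:
        # avoidance fires on the hard-wired reflex or the learnt synapse
        avoid = collision or (proximity * w > threshold)
        if proximity and avoid:
            w += eta * proximity      # Hebbian reinforcement (co-activity)
        history.append(avoid)
    return w, history

# One near-miss without learning, three collisions, then a near-miss
# again: at first only the reflex reacts, at the end proximity suffices.
trials = [(1.0, False)] + [(1.0, True)] * 3 + [(1.0, False)]
w, history = hebbian_avoidance(trials)
print(w, history)
```

In a real robot the weight sits in a network between sensor and motor layers, but the logic is the same: a behavioral pattern emerges from local synaptic reinforcement, not from a programmed rule.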
What can we learn from nature? In unknown environments, a better strategy is to define a low-level ontology, introduce redundancy (which is commonly prevalent in sensory systems, for example), and leave room for self-organization. Low-level ontologies of robots only specify systems like the body, sensory systems, motor systems, and the interactions among their components, which may be mechanical, electrical, electromagnetic, thermal, etc. According to the complex systems approach, the components are characterized by certain microstates generating the macrodynamics of the whole system.
Take a legged robot. Its legs have joints that can assume different angles, and various forces can be applied to them. Depending on the angles and the forces, the robot will be in different positions and behave in different ways. Further on, the legs have connections to one another and to other elements. If a six-legged robot lifts one of its legs, this changes the forces on all the other legs instantaneously, even though no explicit connection needs to be specified. The connections are implicit. They are enforced through the environment, because of the robot's weight, the stiffness of its body, and the surfaces on which it stands. Although these connections are elementary, they have not been made explicit by the designer. Connections may exist between elementary components that we do not even realize. Electronic components may interact via electromagnetic fields that the designer is not aware of. These connections may generate adaptive patterns of behavior with high fitness degrees (order parameters). But they can also lead to sudden instability and chaotic behavior. In our example, communication between the legs of a robot can be implicit. In general, much more is implicit in a low-level specification than in a high-level ontology. In restricted simulated agents, only what is made explicit exists, whereas in the complex real world, many forces exist and properties obtain, even if the designer does not explicitly represent them.
Not only low-level motor intelligence, but also high-level cognition (e.g., categorization) can emerge from complex bodily interaction with an environment by sensory-motor coordination without internal symbolic representation. We call it embodied cognition: an infant learns to categorize objects and to build up concepts by touching, grasping, manipulating, feeling, tasting, hearing, and looking at things, and not by explicit symbolic representations (e.g., language). The categories are based on fuzzy patchworks of prototypes and may be improved and changed during life. We have an innate disposition to construct and apply conceptual schemes and tools.
Moreover, the cognitive states of persons depend on emotions. We recognize emotional expressions of human faces with the pattern recognition of neural networks and react by generating appropriate facial expressions for nonverbal communication. Emotional states are generated in the limbic system of the brain, which is connected with all sensory and motoric systems of the organism. All intentional actions start with an unconscious impulse in the limbic system which can be measured before their performance. Thus, embodied intentionality is a measurable feature of the brain (Freeman, 2004). Humans use feelings to help them navigate the ontological trees of their concepts and preferences, and to make decisions in the face of increasing combinatorial complexity. Obviously, emotions help to reduce complexity.
The embodied mind is a complex dynamical system acting and reacting in dynamically changing situations. The emergence of cognitive and emotional states is made possible by brain dynamics which can be modeled by neural networks. According to the principle of computational equivalence (Mainzer, 2007), any dynamical system can be simulated by an appropriate computational system. But, contrary to Turing's AI thesis, that does not mean computability in every case. In complex dynamical systems, the rules of locally interacting elements (e.g., Hebb's rules of synaptic interaction) may be simple and programmed in a computer model. But their nonlinear dynamics can generate complex patterns and system states which cannot be forecast in the long run without an increasing loss of computability and information. Thus, artificial minds could have their own intentionality and cognitive and emotional states which cannot be forecast and computed, just as in the case of natural minds. Limitations of computability are characteristic features of complex systems.

In a complex dynamical world, decision-making and acting are only possible under conditions of bounded rationality. Bounded rationality results from limitations on our knowledge, cognitive capabilities, and time. Our perceptions are selective, our knowledge of the real world is incomplete, our mental models are simplified, our powers of deduction and inference are weak and fallible. Emotional and subconscious factors affect our behavior. Deliberation takes time, and we must often make decisions before we are ready. Thus, knowledge representation must not be restricted to explicit declarations. Tacit background knowledge, changes of emotional states, personal attitudes, and situations with increasing complexity are challenges of organic computing.
In a dramatic step, the complex systems approach has been enlarged from neural networks to global computer networks like the World Wide Web. It is not only a metaphor to call them global superbrains. The internet can be considered a complex open computer network of autonomous nodes (hosts, routers, gateways, etc.), self-organizing without central mechanisms. Routers are nodes of the network determining the local path of each information packet by using local routing tables with cost metrics for neighboring routers. These buffering and resending activities of routers can cause congestion in the internet. Congested buffers behave in surprising analogy to infected people. There are nonlinear mathematical models describing true epidemic processes like the spread of malaria as well as the dynamics of routers. Computer networks are computational ecologies.
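The analogy between congested routers and infected hosts can be made concrete with the classic SIS epidemic equation di/dt = b i(1 - i) - g i, where i is the fraction of congested ("infected") nodes, b the spreading rate, and g the recovery rate. The model choice and parameter values below are illustrative, not taken from the chapter.

```python
# SIS epidemic dynamics as a toy model of congestion spreading in a
# network of routers: above the epidemic threshold (b > g) congestion
# becomes endemic at the fraction 1 - g/b; below it, congestion dies out.

def sis_fraction(b, g, i0=0.01, dt=0.01, steps=20000):
    """Euler-integrate di/dt = b*i*(1-i) - g*i and return the final i."""
    i = i0
    for _ in range(steps):
        i += dt * (b * i * (1.0 - i) - g * i)
    return i

print(sis_fraction(b=0.8, g=0.2))   # endemic congestion near 1 - g/b = 0.75
print(sis_fraction(b=0.2, g=0.8))   # congestion dies out
```

The threshold b = g is the order-parameter transition of this toy ecology: the same equation structure describes malaria spread and buffer congestion, which is the point of the analogy.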
But the complexity of global networking does not only mean increasing numbers of PCs, workstations, servers, and supercomputers interacting via data traffic in the internet. Below the complexity of a PC, low-power, cheap, and smart devices are distributed in the intelligent environments of our everyday world. Like GPS in car traffic, things in everyday life could interact telematically by sensors. The real power of the concept does not come from any one of these single devices. In the sense of complex systems, the power emerges from the collective interaction of all of them. For instance, the optimal use of energy could be considered a macroscopic order parameter of a household, realized by the self-organizing use of different household goods, which reduce their consumption of electricity or shift it to special time periods with cheap prices. The processors, chips, and displays of these smart devices do not need a user interface like a mouse, windows, or keyboards, but just a pleasant and effective place to get things done. Wireless computing devices on small scales become more and more invisible to the user. Ubiquitous computing enables people to live, work, use, and enjoy things directly without being aware of their computing devices.
A challenge of the automobile industry is the increasing complexity of electronic systems. If we consider the electronic cable systems of automobiles from the beginning through to today, there is a surprising similarity to the neural networks of organisms, which increase in complexity during evolution. Contrary to biological evolution, the electronic systems of today are rigid, compact, and inflexible. In an evolutionary architecture (EvoArch), the nervous system of an automobile is divided into autonomous units (carlets) which can configure themselves into cooperative functions, in order to solve intelligent tasks (Hofmann et al., 2002). These are the macroscopic features realized by interacting units in a complex system. Examples are the complex functions of motor, brake, and light, wireless guide systems like GPS, smart devices for information processing, and the electronic infrastructure of entertainment. In an evolutionary electronic architecture (EvoArch), there are several self-x features with great similarity to self-organizing organic systems in biological evolution: self-healing demands self-configuration and self-diagnosis; self-diagnosis means error recognition and self-reflection, etc. In short: the principles of self-organizing brains are realized in the electronic hardware of cars.

Perspectives of modeling and computing with complex dynamical systems
What is the reason behind the successful interdisciplinary applications of nonlinear complex systems? This approach cannot be reduced to special natural laws of physics, although its mathematical principles were discovered and first successfully applied in physics. Thus, it is no kind of traditional physicalism to explain the dynamics of lasers, ecological populations, or our brain by similar structural laws. From a formal point of view, complex systems may be multi-component systems of, for example, atoms, molecules, cells, or organisms. If certain control parameters are changed, the interactions of elements in multi-component systems may lead to new collective macroscopic properties of order or chaos which cannot be reduced to the individual elements on the microlevel.
The emergence of order and chaos depends essentially on the nonlinearity of the evolutionary equations modeling the dynamics of complex systems. Further conditions come in through the specific parameters of physical, chemical, biological, psychological, or computational systems. Therefore, the formal models of nonlinear complex systems do not eliminate the requirements of specific experimental research on the different levels and scales in the different sciences. Interdisciplinary applications of nonlinear complex systems are successful if they find a clever combination of formal mathematics, computer-assisted modeling, and disciplinary research. Complexity and nonlinearity are interdisciplinary problems of current research. Thus, the formal analysis and computer-assisted modeling of complex systems must be combined with experimental and empirical research in the natural and social sciences. According to a famous quotation of Kant, we may conclude that formal models without specific disciplinary research are empty, while disciplinary research without common principles is blind.
Organic and neural computing aims not only at modeling but also at constructing self-organizing computing systems that display desired emergent behavior like organisms in natural evolution (Horn, 2001; Kephart and Chess, 2003; Muller-Schloer, 2005). Emergence refers to a property of a system that is not contained in any one of its parts. In the sense of nonlinear dynamical systems, the whole is more than the sum of its parts. In robotics, it concerns behavior resulting from the agent-environment interaction whenever the behavior is not preprogrammed. It is thus not common to use the term if the behavior is entirely prespecified, like a trajectory of a hand that has been precalculated by a planner. Agents designed using high-level ontologies have no room for emergence, for novel behaviors. A domain or high-level ontology consists of a complete representation of the basic vocabulary, the primitives, that are going to be used in designing the system. These are the only components that can be used: everything is built on top of these basic elements. The domain ontology remains constant for an extended period of time, often for the entire life of the system. A well-known example is the bounded knowledge representation of an expert system. High-level ontologies are therefore used whenever we know precisely in what environments the systems will be used, as is the case for traditional computational systems as well as for factory robot systems. In unknown environments, a better strategy is to define a low-level ontology and to introduce redundancy with a great variety of self-organization.
In the dynamical systems approach, we first need to specify what system we intend to model, and then we have to establish the differential or difference equations. Time series analysis and further criteria of data mining help to construct the appropriate phase spaces, trajectories, and attractors (Small, 2005). In organic computing, one approach would be to model an agent and its environment separately and then to model the agent-environment interaction by making their state variables mutually dependent (Pfeifer and Scheier, 2001). The dynamical laws of the agent A and the environment E can be described by simplified schemes of differential equations dx_a/dt = A(x_a, p_a) and dx_e/dt = E(x_e, p_e), where x represents the state variables, such as angles of joints, body temperature, or location in space, and p parameters such as thresholds, learning rates, nutrition, fuel supply, and other critical features of change. Agent and environment can be coupled by defining a sensory function S and a motor function M. The environment influences the agent through S; the agent influences its environment through M. S and M constitute the agent-environment coupling, that is, dx_a/dt = A(x_a, S(x_e), p_a) and dx_e/dt = E(x_e, M(x_a), p_e), where p_a and p_e are not involved in the coupling. Examples are walking or moving robots in environments with obstacles. In this case, the basic analysis problem can be stated in the following way: given an environment dynamics E, an agent dynamics A, and sensory and motor functions S and M, explain how the agent's observed behavior is generated.
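The coupling scheme just described can be sketched directly in code. The following is a toy illustration under stated assumptions: the linear forms chosen for A, E, S, and M and all parameter values are invented for demonstration, not models from the chapter; the integration uses a simple forward-Euler step.

```python
def S(x_e):                 # sensory function: what the agent perceives
    return 0.5 * x_e

def M(x_a):                 # motor function: how the agent acts on the world
    return -0.3 * x_a

def A(x_a, sensed, p_a):    # agent dynamics (assumed toy linear form)
    return -p_a * x_a + sensed

def E(x_e, action, p_e):    # environment dynamics (assumed toy linear form)
    return -p_e * x_e + action

def simulate(x_a=1.0, x_e=1.0, p_a=0.8, p_e=0.6, dt=0.01, steps=2000):
    """Forward-Euler integration of the coupled system
    dx_a/dt = A(x_a, S(x_e), p_a), dx_e/dt = E(x_e, M(x_a), p_e)."""
    for _ in range(steps):
        dx_a = A(x_a, S(x_e), p_a)   # agent influenced via S
        dx_e = E(x_e, M(x_a), p_e)   # environment influenced via M
        x_a += dt * dx_a
        x_e += dt * dx_e
    return x_a, x_e

x_a, x_e = simulate()
print(x_a, x_e)   # for these toy parameters both states relax toward equilibrium
```

For these assumed parameters the coupled system has a stable equilibrium at the origin; with other choices of A, E, S, and M the same scheme can produce oscillations or more complex attractors, which is exactly the analysis problem stated above.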
The dynamics of one of the controllers evolves when the agent's angle sensors are turned off and it cannot sense the position of its legs. In this case, the activation levels of the neurons exhibit a limit cycle that causes the agent's single leg to stand and swing rhythmically. By that, it causes the robot to walk. The system's state repeatedly changes from the stance phase, with the foot on the ground, to the swing phase, with the foot in the air, and back. This example illustrates that the dynamical systems approach can be applied in a synthetic way in order to design and construct robots and their environments. But, in general, the dynamical systems approach is used in an analytical way: it starts from a given agent-environment interaction, which is formalized in terms of differential equations. The complex variety of behavior can be analyzed by solving, approximating, or simulating the equations in order to find the attractors of the dynamics. The dynamical attractors of the interacting system can be used to steer an agent or to let it self-organize in a desired way.
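The limit-cycle behavior behind such rhythmic control can be illustrated with a standard example. The van der Pol oscillator below stands in for the neural controller, which the text does not specify in equations; the equations, parameter mu, and step sizes are assumptions for illustration.

```python
def van_der_pol_amplitude(x0, y0, mu=1.0, dt=0.001, steps=60000):
    """Forward-Euler integration of dx/dt = y, dy/dt = mu*(1 - x*x)*y - x.
    Returns the peak |x| over the final portion of the run, i.e. the
    amplitude of the rhythm the system settles into."""
    x, y = x0, y0
    peak = 0.0
    for i in range(steps):
        dx = y
        dy = mu * (1.0 - x * x) * y - x
        x += dt * dx
        y += dt * dy
        if i > steps // 2:          # measure only after transients die out
            peak = max(peak, abs(x))
    return peak

# Two very different initial states are attracted onto the same closed orbit,
# yielding the same self-sustained rhythm -- the signature of a limit cycle:
a = van_der_pol_amplitude(0.1, 0.0)
b = van_der_pol_amplitude(3.0, 0.0)
print(a, b)
```

Whatever the initial state, the trajectory converges to the same periodic oscillation, just as the controller's neural activations settle into the stance/swing alternation regardless of how the leg starts out.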
Obviously, self-organization leads to the emergence of new phenomena on sequential levels of evolution. Nature has demonstrated that self-organization is necessary in order to manage the increasing complexity on these evolutionary levels (Mainzer, 2005). But nonlinear dynamics can also generate chaotic behavior which cannot be predicted and controlled in the long run. In complex dynamical systems of organisms, monitoring and controlling are realized on hierarchical levels. There is still no final and unified theory of organic computing. We only know parts of biological, neural, cognitive, and social systems in the framework of complex dynamical systems. But even in physics, we have no unified theory of all physical forces. Nevertheless, scientists work successfully with an incomplete patchwork of theories. In order to know more, we need an interdisciplinary cooperation of technical, natural, computer, and cognitive science, and last but not least the humanities. The goal of organic and neural computing is the construction of self-organizing computing systems that serve people, in order to manage a world of increasing complexity and to support a sustainable future of human infrastructure.
In summary, we conclude with the following statements:
- Natural evolution of brain and mind: Natural evolution generates nervous systems working with neurochemical mechanisms. They enable organisms to learn, adapt, and change their environment autonomously. Thinking, feeling, and consciousness are considered mental states of neural dynamics evolving in the human organism and interacting with its environment (embodied mind).

- Nonlinear dynamics of complex systems: The natural evolution of brain and mind is an example of the nonlinear dynamics of complex systems. The emergence of order in complex systems is made possible by, for example, thermodynamic, genetic, and neural self-organization. Hierarchical levels of neural self-organization lead to the emergence of mental states and functions which can (in principle) be modeled by order parameters (attractors) of nonlinear dynamics.

- Artificial life and artificial intelligence: Under different conditions, the laws of nonlinear dynamics would also have allowed variations of life different from the organisms which actually evolved on Earth. Therefore, technology can use the laws of nonlinear dynamics to find similar or new solutions of self-organizing systems in, for example, bionics, artificial intelligence, and artificial life.

- Self-organization and the emergence of the human mind: Nonlinear dynamics of self-organizing complex systems generate emergent properties and functions which often cannot be forecast in the long run. Therefore, neuropsychic systems may lead to individual feelings, perceptions (qualia), intentions (intentionality), and self-consciousness, which cannot be derived from single neural activities but from synergetic interactions of the whole system.

- Dynamical and computational systems: According to the principle of computational equivalence, complex dynamical systems can be considered computational systems (e.g., cellular automata, neural networks). Nevertheless, complex computational systems can lead to chaos, randomness, and undecidability. Therefore, computational systems (e.g., stochastic systems) can also simulate brain dynamics with their undetermined features.

- Self-organization and human technology: Contrary to the Laplacian spirit, the laws of nonlinear dynamics exclude the total computability of nature, life, and mind. But we can understand their phase transitions in order to find conditions (control parameters) making the emergence of desired states and developments (order parameters) possible and probable (e.g., health and wellness in medicine and psychology). Empathy and responsibility are made possible by human brain dynamics.

- Superbrain and mankind: In the age of globalization, mankind is growing together through worldwide information and communication systems. They are the self-organizing nervous systems of an emerging superbrain with increasing complexity. It is a challenge to find the conditions of global governance in the virtual networks of global organizations.

- Philosophical and religious perspectives: In the philosophical and religious tradition of mankind (e.g., Buddhism), the emergence of a global consciousness is assumed to evolve in steps, with increasing clarity, from individual consciousness to the soul of the world. Nonlinear science opens avenues to follow this line from a scientific point of view.
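The statement on computational equivalence, that complex dynamical systems can be considered computational systems such as cellular automata, can be illustrated with a minimal elementary CA. This is a generic sketch: the choice of Wolfram's rule 30, the grid width, and the number of steps are assumptions, not taken from the chapter; rule 30 is a standard example of a trivially simple computational rule whose behavior is effectively random.

```python
def step(cells, rule=30):
    """One synchronous update of a 1D binary cellular automaton with
    wraparound boundaries; `rule` is a Wolfram rule number (0-255)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right
        out.append((rule >> neighborhood) & 1)   # look up the rule's bit
    return out

width = 31
cells = [0] * width
cells[width // 2] = 1          # single seed cell
for _ in range(15):
    cells = step(cells)
print(cells)                    # irregular pattern despite the simple local rule
```

A deterministic three-cell lookup table, iterated in parallel, already produces behavior complex enough that its patterns are used as a pseudo-random source, which is the sense in which such computational systems exhibit chaos and undecidability.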
References

Bassingthwaighte, J.B., Liebovitch, L.S. and West, B.J. (1994) Fractal Physiology. Oxford University Press, New York.
Chua, L.O. and Roska, T. (2002) Cellular Neural Networks and Visual Computing: Foundations and Applications. Cambridge University Press, Cambridge.
Deco, G. and Schürmann, B. (2000) Information Dynamics: Foundations and Applications. Springer, New York.
Dennett, D.C. (1998) Brainchildren: Essays on Designing Minds. MIT Press, Cambridge, MA.
Dreyfus, H.L. (1982) Husserl, Intentionality, and Cognitive Science. MIT Press, Cambridge, MA.
Floridi, L. (Ed.) (2004) The Blackwell Guide to the Philosophy of Computing and Information. Blackwell, Oxford.
Freeman, W.J. (2004) How and why brains create meaning from sensory information. Int. J. Bifurcat. Chaos, 14: 515-530.
Haken, H. and Mikhailov, A. (Eds.) (1993) Interdisciplinary Approaches to Nonlinear Complex Systems. Springer, Berlin.
Hofmann, P.H., Lukas, G., Wohlgemuth, F., Schneider, B., Muller, A., Schmidt, U., Keydel, M., Sakretz, R., Leboch, S., Muller, T., Klenk, U., Perans, A. and Dohmeyer, V. (2002) Evolutionäre E/E-Architektur. DaimlerChrysler, Esslingen.
Horn, P. (2001) Autonomic Computing: IBM's Perspective on the State of Information Technology. IBM, New York.
Kephart, J.O. and Chess, D.M. (2003) The vision of autonomic computing. IEEE Computer, 36(1): 41-50.
Mainzer, K. (2003) KI - Künstliche Intelligenz. Grundlagen intelligenter Systeme. Wissenschaftliche Buchgesellschaft, Darmstadt.
Mainzer, K. (2005) Symmetry and Complexity: The Spirit and Beauty of Nonlinear Science. World Scientific, Singapore.
Mainzer, K. (2007) Thinking in Complexity: The Computational Dynamics of Matter, Mind, and Mankind (5th enlarged ed.). Springer, New York.
Merleau-Ponty, M. (1962) Phenomenology of Perception. Routledge & Kegan Paul, London.
Müller-Schloer, C. (Ed.) (2005) Schwerpunktthema: Organic Computing - Systemforschung zwischen Technik und Naturwissenschaften. it - Information Technology, 47(4): 179-181.
Pfeifer, R. and Scheier, C. (2001) Understanding Intelligence. MIT Press, Cambridge, MA.
Searle, J.R. (1983) Intentionality: An Essay in the Philosophy of Mind. Cambridge University Press, Cambridge.
Small, M. (2005) Applied Nonlinear Time Series Analysis: Applications in Physics, Physiology and Finance. World Scientific, Singapore.