

NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM:
EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

A. BAKUNOWICZ

MSc

2013


AMIINA BAKUNOWICZ

A thesis submitted in partial fulfilment of the requirements of the School of Architecture, Computing and Engineering, University of East London for the degree of Master of Science

September 2013


Table of Contents:

Chapter 1. Abstract

Chapter 2. Introduction

2.1 The New Aesthetics and the Ecology of Design

2.2 Coding vs Traditional Design Methods

Chapter 3. The Concept of the Architectural Problem

3.1 Architectural Scenario

3.2 Proposed Solution to the Architectural Problem

3.3 Other Ways of Solving Similar Problems?

3.3.1. Electrical Graph Approach

3.3.2. Physically based modelling

Chapter 4. Classic Genetic Algorithm

4.1 Traditional Genetic Algorithm

4.1.1. Brief Description

4.1.2. Evolutionary Embryogenies.

4.2 Classic GA for the Current Architectural Scenario

4.2.1 Description of the Code

4.2.1.1. Original CA Model and Body-Plan

4.2.1.2. The First Generation

4.2.1.3. The Fitness Function

4.2.1.4. The Process of Selection

4.2.1.5. Crossover and Mutation

4.2.2 Outcome of the Algorithm

4.3 Examples of GA Applications in Architecture

Chapter 5. Self-Organising Maps and Genetic Algorithm

5.1 Artificial Neural Networks and Self-Organising Maps

5.2 Introducing SOM to the Classic GA

5.2.1 Why Combine GA with SOM?

5.2.2 Managing the Expanded Choice

5.2.3 The Evaluation of the Aesthetics


5.3. The Evolution of the Proposed Code

5.3.1. Stage 1. Classic GA for the Current Architectural Scenario

5.3.2. Stage 2. Adding Golden Ratio to the Fitness Criteria

5.3.3. Stage 3. Introducing Self-Organising Maps to the Genetic Algorithm

5.3.4. Stage 4. Turning Chromosomes into Neurons and Back

5.3.5. Stage 5. Co-evolving Threshold of the Fitness Factor Components

5.3.6. Stage 6. Optional Selection Method

5.4. The Final Algorithm

5.4.1. The Structure of the Algorithm

5.4.1.1. Original CA Model and Body-Plan

5.4.1.2. The First Generation

5.4.1.3. Introducing Self-Organising Maps

5.4.1.4. The Fitness Function

5.4.1.5. Selection Options

5.4.1.5.1. The Final Decision-Making Rules of the Proposed Algorithm

5.4.1.5.2. Selection Option 1: Optimised Random Selection by Clusters

5.4.1.5.3. Selection Option 2: Artificial Selection and Evaluation of Aesthetics

5.4.1.5.4. Selection Option 3: Goldberg Roulette

5.4.1.6. Crossover

5.4.1.7. Mutation

5.4.1.8. The following Generations

5.4.1.9. Visualisation of the Search Space by Clustering and Evolution of Colour

5.4.2. Main Results and Observations

5.5. Advantages and Disadvantages of GA-SOM

5.6. Further Research

5.6.1 Parameterisation of the Body-Plan

5.6.2 Swarms and SOM Pattern Recognition

5.6.3 Discovering the Potential of Clusters

5.6.4. Co-Evolution Instead of Fitness Criteria

5.6.5. Exploring Other Ways in Which CA Can Generate Body-Plans

5.6.6. Other Visualisation Techniques of the Search Space

5.6.7. Adapting Genetic Operators to Evolve the Evolvability


Chapter 6. Conclusion

Chapter 7. References

Chapter 8. Further Reading

Chapter 9. Appendices

Appendix 1. Traditional Genetic Algorithm: a Brief Description

Appendix 2. Comparing the Types of the Evolutionary Embryogenies

Appendix 3. Example of a SOM Algorithm Application in 3D Modelling and Its Applications

Appendix 4. Piasecki and Hanna's Wundt Curve Experiment

Appendix 5. First Ever SOM for Multi-Objective Evolutionary Algorithm

Appendix 6. Brudaru's Cellular Genetic Algorithm with Communicating Grids and SOM

Appendix 7. Kita's Real-Coded Genetic Algorithm (RCGA) and SOM

Appendix 8. Optional Body-Plans for the Proposed Algorithm

Appendix 9. Proposed Algorithm (Python)


“... Game did not display perfection to the outward eye. Rather, it guided the player…”

Hermann Hesse, The Glass Bead Game (1943)


Chapter 1. Abstract

Since the dawn of Computer Aided Design, architects spent a long time producing digitally modelled projects that were radical but impossible to build. Later, as fabrication techniques developed, many of those designs materialised. More recently, architects have begun using computational methods to re-evaluate designs. In addition, the constant urge for architectural significance has made designs very economic and performance-orientated. As a result, the exploration of alternative computational routes to architectural morphogenesis has been left behind.

I would like to propose a possible solution, through scripting, to the form-finding challenge in a given architectural context. This paper describes a design process built mostly around code that combines a Genetic Algorithm with Self-Organising Maps and evolves a predetermined Cellular Automata model. The computer forms a team with the architect to develop a design, and the solution takes shape gradually as the algorithm runs. Together, the designer and the machine go through the architectural design procedure, starting from the phase at which the appraisal is carried out, the concepts are finalised and the basic mass study is done, and continuing until both are satisfied with the algorithm's outcome. The eventual model can then be given back to the architect for final adjustments and detailing.

The proposed algorithm has two tasks. Firstly, based on the information available after the completed initial design stages, to develop a schematic body-plan of the architectural model and parameterise it. Secondly, applying the principles of natural evolution and biological neural networks, to use a Genetic Algorithm (GA) and neural Self-Organising Maps (SOM) to evolve optimal solutions.

This paper concentrates mostly on the second stage of the algorithm, where self-organising mapping is applied to each generation of the alternated body-plan in order to widen, classify, structure and exploit the search space of the GA. I believe that the algorithm can create and amplify a synthetic intuition, giving the designer yet another powerful tool and a new skill to apply in the process of forming matter around the architectural concept. By structuring and optimising the possibilities space, I would also like to test whether the SOM can help to speed up the search for the optimal solution and avoid premature convergence.

At the same time, the proposed algorithm attempts to deal with the challenge of "...moving from the creation of inventive articulated patterns, and the small-scale installations to the full scale architectural projects where scripting can unleash the entire universe of opportunities for architectural space" (Matias Del Campo).

Certain relevant theoretical problems will also be examined, such as the paradox of expanded choice, coding compared to traditional methods of design, the philosophical meaning of "the sympathy of things", the challenges of evaluating aesthetics, and others.


Chapter 2. Introduction

2.1 The New Aesthetics and the Ecology of Design

In 1917 D'Arcy Wentworth Thompson introduced the idea that form is never a given but an emergent product, influenced by dynamic forces that are shaped by flows of energy and stages of growth. Later, with the birth of the body of mathematical and scientific hypotheses known as "Complexity Theory", Kauffman (1995) gave another dimension to Thompson's ideas. He argued that form can replicate organic life and become an example of "organised complexity": the meaning behind its geometry is not necessarily the accidental "form follows function"; rather, it can be viewed as a complex system with "self-organising" properties, able to evolve and remain sustainable in the absence of any external influence (Kauffman 1995).

Applying this concept to the design process, in order to give the system the maximum possible autonomy it is essential to draw a distinction between behaviour that emerges as a result of self-organising processes and behaviour that was deliberately prefigured in the design of the organism or organisms. Certain behaviours can also emerge from an initial set of simple rules written for locally interacting units, behaviours that were neither implied nor foreseen. In such emergent, independent behaviours lies a possible solution to the problem of programming a certain level of "creativity" or "individuality".

All of the above leads back to perceptual morphogenesis. Perception itself has been studied by Gestalt psychology and philosophical systems since the beginning of the 19th century (Wertheimer). The perception of complex systems is generated not so much by their individual elements (for example, human neurons) as by their dynamic interrelations representing collective behaviour, a phenomenon easily found in Artificial Neural Networks, where data generalisation, pattern classification and forecasting abilities are the key features (Ramos 2012). Another approach to morphogenesis was analysed by Lars Spuybroek, who took on the bold task of reinterpreting the notion of beauty: "Everything feels to some degree, and the feelings we have towards the object are of the same nature as the ones that have made it or are still making it...the feelings are of the same nature in the sense of resonating or sympathizing". Spuybroek claims that "a feeling, a resonance, a sympathy" is a characteristic or parameter of the design by which we perceive it, one that exists in another dimension and cannot be measured or valued. The forces that drove the process of creation stay present and active even after the object is finished.

The notion of design in a complex system can be viewed as "self-assemblage" or "self-design". The thought is contradictory in its essence, as no self exists prior to the process... Against this logic, Spuybroek believes in design without a designer. He repeated Paley, just leaving God out: "Things are beautiful because they are made", and then corrected himself through Ruskin: "Things are beautiful because they are made beautifully". As beauty is "intrinsic" and cannot emerge ex nihilo ("out of nothing") or be applied, the design acquires its aesthetic properties from the aspiration to make something beautifully, maintained throughout the entire design process.


The associative or interior aspect of perception is based on immediate feeling or seeing that extends the experience. It occurs when the exterior is internalised, when objects share an exterior quality, like the flatness of the table top and of the glass's bottom. Internalised means "felt": when lifted off the table, the glass longs for it (Manuel DeLanda). As a result, feelings make things act: take or change shape in accordance with others.

Carl Jung had similar views. His famous saying, "The meeting of two personalities is like the contact of two chemical substances. If there is any reaction, both are transformed", can also be viewed in an architectural context.

Henri Bergson refers to the “interior aspect of perception” as intuition: "An absolute can only be given in an intuition, while all the rest has to do with analysis. We call intuition here the sympathy by which one is transported into the interior of an object in order to coincide with what there is unique and consequently inexpressible in it. Analysis, on the contrary, is the operation which reduces the object to elements already known." This thought puts a big question mark over the suggestion that computers CAN actually be creative as at this point in time they are merely analytical tools.

Hopefully one day we will learn to recognise, comprehend and share "feelings" as a main source of information. Even now we perceive everything as a sensation of varying degrees of sympathy and make our decisions based on our resonance with this "felt" assessment. Then the process of creating will be ONE with the result, especially if the complexity of the potential architectural project can be understood not analytically but intuitively (Bergson) or "emotionally" (Spuybroek). The context can then be parameterised to shape the new interlinked functional spaces and, to close the circle, the result updates and adapts the existing context, ready to be changed again. In this way an ever-adapting, sustainable architectural complex system emerges.

2.2 Coding versus Traditional Design Methods

The evolution of computer-aided design (CAD) reflects the search for ways in which technology can either fulfil certain pre-allocated roles or take on the most appropriate role or combination of roles. Nowadays computers mostly serve as design tools, as means of communication, as design assistants, as design environments, or even as habitable physical and virtual environments (Kalay 2004).

The process of design, practised over hundreds of years and formalised in the 1960s (Minsky 1968), consists of four interconnected phases: problem analysis, solution synthesis, evaluation and communication. Whether carried out with traditional methods or by scripting, the process remains the same and consists of these four stages. The advantage computers bring to the first three stages is that they are powerful enough to turn a design into a complex system and evolve it automatically, passing the system's qualities on to the design:

- Flexibility (being able to cope with incorrect, ambiguous or distorted information or even to deal with unforeseen or new situations without showing abrupt performance breakdown)

- Versatility (Dorigo and Colorni)

- Robustness, which ensures the functionality of the system even when some parts are locally damaged (Damásio)


Such computational systems have a "competition-cooperation duality". Cooperation is reflected in the global behaviour of agents that interact by communicating information, or hints (usually concerning regions to avoid or regions likely to contain solutions), to each other while solving a problem (Langton). An example of cooperative problem solving is the use of a Genetic Algorithm to find states of high fitness in some abstract space. Similar features can be found in Artificial Neural Networks, where the output of one neuron affects the behaviour or state (Cellular Automata theory) of the neuron receiving it, and so on (Ramos 2012).

There are also various design methods that join together the stages of the design process. At present these methods can be categorised into four types: trial-and-error searches, constraint-satisfaction methods, rule-based design and precedent-based methods (Kalay 2004). Scripting gives greater flexibility and broader exploration of all design methods and, even more valuably, develops new methods or combines existing ones. It also permits processing information on a large scale, as code has no intrinsic size limit. "The computer is the tool for the manipulation of information, whether that manipulation is a consequence of our actions or a consequence of the actions of the information structures themselves" (Ramos 2012).

It is all very well to say that computers provide great advantages during the design process. However, the methods discussed earlier in this chapter were developed by human designers, for human designers, and were formed from human design experience. Only very small adaptations have been made to these methods to take into account the new design partner: the computer. Consequently, the question arises: could new, synthesised design methods be evolved that do not depend on human influence, whether positive or negative?

And then, of course, there is the ultimate puzzling and unsettling question: can procedural methods be "creative" and yield novel design solutions?

If we apply the criterion of unexpectedness to the results, then the logical (if still arguable) answer is yes. They can, very occasionally, generate unexpected results, if only because of their unique ability to produce and analyse vast numbers of solutions, a skill that humans do not possess. Some may agree that we can program "creativity" and "individuality" into the code by ensuring that:

- The bottom-up methodological design is followed by allowing only autonomous implicitly programmed mechanisms to be embedded in any artificial organism;

- The role of intrinsic co-evolution between parts of the artificial system, how it can be implemented, and its intrinsic scientific properties are well analysed and understood for each particular project (Ramos 2012);

- The rules of self-organization are implemented;

- A member of a complex system goes through various virtual design encounters: the more one member's past experience differs from the others', the more "individual" its behaviour becomes.

However, novelty and unexpectedness are not the only criteria for creativity. What about the designer's personal, uncontainable desire to express his or her experience and intuition? What about the mental intention to create and innovate? What about the search for inspiration, and the actual state of being inspired? I do not think these aspects can ever be programmed into any algorithm. At the same time, however excellent computers are at mimicking the results of human creativity (as in analogical methods) or at evolving shape from pre-evaluated knowledge (as in expert systems and grammars), no computer can qualify as truly creative.

Actually, it does not really matter whether or not computers can be creative. There is a highly valuable skill that they contribute to the design methods: they are able to generate and


recognise configurations that were not obvious or even present to begin with. This property is called emergence. Certain projects demonstrate computational methods as an irreplaceable collaborator in the design process. One of them is a response to the first tender of the Khalifa-bin-Zayed Al Nehayan Foundation competition. The Computational Design and Research group divided the brief into three areas of application: the roof (circle packing and a connectivity graph), the elevations (barycentric coordinate systems for mesh population) and the residential units (polyomino tiling, a 3D Tetris with evolutionary optimisation) (Aedas Computational Design and Research group 2013).

The final cladding, roof and residential patterns were deliberately left unforeseen; they resulted from the designed algorithmic and heuristic search methods:

Figure 1. Roof circle packing based on underlying schedule

Figure 2. Geometrical studies for apartment units

Figure 3. Designs for all facades based on underlying uses


Figure 4. Final design: exploded axonometric projection

Taking into account everything said above, I propose a code that combines a Genetic Algorithm with Self-Organising Maps, aiming to offer another way of solving certain spatial and morphogenetic architectural problems. The proposed design process follows the computer-aided "bottom-up" principle, to ensure that the evolving system emerges while maintaining its autonomous qualities. The following chapters describe the proposed design process in full detail.


Chapter 3. The Concept of the Architectural Problem

3.1 Architectural Scenario

The task of the proposed algorithm is to give shape to a building according to a predetermined functional and spatial arrangement of its spaces. This must be achieved in such a way that the required layout remains unchanged while the form emerges from the behaviours of the agents of the building's complex system. To simplify the task, the evolution of a single-family house was chosen as the base for the project. The house has three functional areas: living, resting and working.

3.2 Proposed Solution to the Architectural Problem

To find potential solutions to the problem described above, a vast space of all possible solutions, called the search space, needs to be explored. To make the design process feasible, the search space has to be narrowed down by creating a group of random solutions, measuring their relevance or fitness against set criteria, and crossing over the fittest ones to generate new, hopefully fitter solutions. A Genetic Algorithm (GA) is responsible for this part of the code.

However, the GA has its drawbacks: it minimises the search space too much, resulting in the problem known as "local optima". This means the algorithm tends to find an optimal solution within a small part of the search space and finds it very difficult, in most cases almost impossible, to move on to another part of the space in search of further solutions. To handle this challenge, along with other smaller ones (described later in this paper), an Artificial Neural Network algorithm called Self-Organising Maps is introduced. It expands the search space by producing possible "in-between" solutions for each generation and classifies them in topological maps, thus offering a greater diversity of potentially fit models, arranged in clusters, to choose from.

3.3 Other Ways of Solving Similar Problems

3.3.1. Electrical Graph Approach

Philip Steadman was the first to propose applying the principles of electrical networks to guide the computational synthesis of architectural form (March, Steadman 1974). He proposed exploiting the similarity between a graph representation of architectural floor plans and the physics of electricity, as expressed by Kirchhoff's laws of electrical flow.

The idea behind the conventional graph representation is the use of a network of nodes and edges, which represent architectural elements and relationships between them:

Steadman expressed horizontal walls as nodes of the graph. Each edge in the graph represents the relationship between the two horizontal walls of the same room. Each edge is labelled with the length of the wall and each node has a label that shows the distance between the node and the bottom-most wall in the floor plan.


Steadman adapted the graph by applying Kirchhoff's first law of electrical flow, which states that electricity flows from a higher-voltage node to a lower-voltage node, giving the graph directionality. Applying the second law, which states that the total current entering a node is equivalent to the current leaving it, permitted Steadman to turn the graph into a floor plan generator. The method also allows the generating process to determine the heights of the rooms:

Figure 5. Applying the principles of the electrical networks to guide the computational synthesis of architectural form.

A different electrical-flow analogy was used by Schwartz to develop his Automated Building Design (ABD) floor plan generator (Schwartz, Berry, Shaviv 1994).

3.3.2. Physically Based Modelling

Physically based modelling is a technique that applies the principles of dynamic motion and geometrical deformation to rigid and non-rigid objects for the purpose of simulating realistic behaviours and visual effects (Witkin, Baraff 1997). The method was adopted by Arvin and House for the purpose of generating floor plans that correspond to a wide range of constraints (Arvin, House 2002).

They apply the physical notion of mechanical springs to connect spaces. These springs draw or repel the spaces according to their spatial arrangement and the length of each spring that connects them. The idea of the system is quite simple at its core: as the masses move further away from each other, the spring tries to move them closer, and as they approach each other the spring tries to separate them. The motion of the masses is damped by forces proportional to their relative velocity along the trajectory of their movement. A minimal sketch of a single damped-spring step follows.
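
The following Python sketch illustrates that spring-damper idea; it is written for this text rather than taken from Arvin and House, and the constants (stiffness, rest length, damping, time step) as well as the one-dimensional simplification are illustrative assumptions.

    # Damped spring between two unit masses, one-dimensional sketch.
    def spring_step(p1, v1, p2, v2, k=1.0, rest_length=5.0, damping=0.2, dt=0.01):
        dx = p2 - p1                           # current separation of the two centres
        stretch = dx - rest_length             # >0: too far apart, <0: too close
        rel_v = v2 - v1                        # relative velocity (for damping)
        force = k * stretch + damping * rel_v  # pulls together when stretched
        v1 += force * dt                       # equal and opposite accelerations
        v2 -= force * dt
        p1 += v1 * dt
        p2 += v2 * dt
        return p1, v1, p2, v2

Stepped repeatedly, the two centres settle at the rest length, mirroring how the spaces find their spatial arrangement.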


Figure 6. The principle of mechanical springs connecting spaces

The essential characteristic of this particular method is that topological design objectives, such as desired proximity between spaces, are represented as forces that operate on point nodes at the centres of the spaces. Geometrical objectives, such as the sizes, proportions, alignment or offset of the spaces, are represented by forces on line nodes. A global “gravity” objective stimulates spaces to grow together without gaps. The system first resolves the topological problems until it is in equilibrium:

Figure 7. Resolving the topological problems.


Figure 8. The system works out the geometric objectives

Figure 9. The topological actualisation stage, the spaces are simulated as circles to allow them to “slide” over each other.

The methods described above provide a wide range of solutions. At the same time, however, they are too deterministic, as the boundaries and sizes are well defined from the beginning; both algorithms do no more than adjust possible layouts. The principle proposed in this paper attempts to leave more transparent boundaries between the spaces of a model, allowing for better continuity of space and possible future adjustment by the end user.


Chapter 4. Classic Genetic Algorithm

4.1 Traditional Genetic Algorithm

4.1.1. Brief Description

A Genetic Algorithm (GA) has the essence of evolution in its domain: a Darwinian process of generating a vast number of random "solutions" and judging their fitness in terms of form, function and context. The solutions deemed "unfit" are abandoned; the survivors are copied and modified, generating a pool of newly created individuals ready for the next leap of the same evolutionary process of mating and mutation, until one or more solutions exhibit a greater degree of "fit". The fit is normally not absolute and can be controlled by modifying the number of individuals and the number of generations that adapt them.

Synthetic genetic algorithms closely resemble the natural model of evolution. Artificial chromosomes are represented as string structures made of "genes".

A detailed description of the traditional Genetic Algorithm and its embryogenies is given in Appendix 1.

4.1.2. Evolutionary Embryogenies.

Of the four main types of evolutionary algorithms, only the GA evolves computational models motivated by the genotype-phenotype mappings of biological systems. In other words, it performs artificial development, also known as artificial embryogeny. There are three main types: implicit, explicit and external.

a. Implicit embryogeny.

The growth process is implicitly specified by a set of rules or instructions, similar to a 'recipe', that governs the growth of a shape.

Examples of implicit criteria:

- amount of novelty
- survival ratio
- similarity with the neighbours (for data-mining)
- mass customisation, or the degree of possible customisation of the product
- ratio of similarity of the fitness criteria's components to each other, i.e. whether the fitness value of each component is within a range and/or close to the others

Through emergence during evolution, these implicit embryogenies incorporate all the concepts of conditional iteration, subroutines and parallel processing that must be manually introduced into explicit GP embryogenies (Bentley, Kumar 1999).


b. Explicit embryogeny

It specifies each step of the growth process in the form of explicit instructions. In computer science, an explicit embryogeny can be viewed as a tree containing a single growth instruction at each node. Typically, the genotype and the embryogeny are combined and both are allowed to evolve simultaneously. The advantage of this type is that there is no need to design embryogenies. However, it is difficult to evolve their representations, and specialised genetic operators are often required to ensure that disruption is minimised (Koza et al. 1999).

Examples of explicit criteria:

- intersection of the objects
- proximity of the objects
- sensitivity to light

c. External embryogeny

An external embryogeny is "hand-designed": it is defined globally and externally to genotypes, and the designer retains greater control over the resulting evolved solutions, which can be manipulated by carefully changing the design of the embryogeny. It also produces the fewest harmful effects for evolution and requires no specialised genetic operators. The biggest drawback of this type is that the embryogenies do not evolve during the evolution of the genotypes. The challenge is "to ensure that this complex mapping process will always perform the desired function" (Koza et al. 1999). Examples of external criteria:

- total floor area or volume of the building
- height of the building
- area of the southern facade (to make the building more sustainable)
- ratio of floor area to volume (likewise for sustainability)
- proximity of the proportions of the elements to the golden ratio
- etc.

Examples of research and experiments in the area of artificial embryogenies are given in Appendix 2.


4.2 Classic GA for the Current Architectural Scenario

4.2.1. Description of the Code

The architectural problem described in section 3.1, Architectural Scenario, is to be solved by evolving the predetermined Cellular Automata (CA) model using the conventional GA. Decoded values of the chromosomes are used to parameterise the original CA model, creating the first generation of its variations. Parents are then chosen for crossover using the Goldberg Roulette selection method. Offspring are mutated and replace their parents, creating the new generation. The process repeats for a set number of loops.
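
In outline, the loop looks like the following Python sketch. It is schematic rather than the thesis code itself: decode, parameterise_ca, fitness, roulette_select, crossover and mutate are placeholders for the routines described in sections 4.2.1.1 to 4.2.1.5.

    import random

    # Schematic GA loop; the helper functions named below are placeholders
    # for the routines detailed in the following subsections.
    def evolve(population, generations=50, mutation_rate=0.5):
        for _ in range(generations):
            models = [parameterise_ca(decode(c)) for c in population]
            scores = [fitness(m) for m in models]
            next_generation = []
            while len(next_generation) < len(population):
                mum = roulette_select(population, scores)
                dad = roulette_select(population, scores)
                for child in crossover(mum, dad):       # one-point crossover
                    if random.random() < mutation_rate:
                        child = mutate(child)
                    next_generation.append(child)
            population = next_generation[:len(population)]  # offspring replace parents
        return population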

4.2.1.1. Original CA Model and Body-Plan

At this point the Cellular Automata (CA) model is created randomly. It represents a building with three functional areas:

BLUE – working
RED – living
GREEN – resting

Figure 10. Cellular Automata model of one floor.

Figure 11. Cellular Automata model of many floors.


Parameterisation of the body-plan:

Each CA unit is parameterised according to its state: "working", "living" or "resting". At this point the parameters are purely geometrical, and the GA genes are programmed to define the sizes and positions of the units.

Figure 12. CA unit's width and length depending on its state being "working", "living" or "resting" (3 + 3 genes)

Figure 13. CA unit's position in relation to the unit's original location on the grid and at the same time keeping the original spatial arrangement provided by the CA model
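
As an illustration of how such genes might be read, the sketch below slices a binary chromosome into fixed-length genes and scales them into widths, lengths and grid offsets. The gene length, value ranges and gene order are assumptions made for the sketch, not the thesis's actual encoding.

    GENE_LENGTH = 8  # assumed: each gene is an 8-bit unsigned integer

    def decode_gene(bits, low, high):
        """Map a list of 0/1 values onto the continuous range [low, high]."""
        value = int("".join(str(b) for b in bits), 2)
        return low + (high - low) * value / (2 ** len(bits) - 1)

    def decode(chromosome):
        """Width and length per state (3 + 3 genes), plus x/y offsets from
        the unit's original grid location (2 genes)."""
        genes = [chromosome[i:i + GENE_LENGTH]
                 for i in range(0, len(chromosome), GENE_LENGTH)]
        widths  = [decode_gene(g, 2.0, 8.0) for g in genes[0:3]]   # working/living/resting
        lengths = [decode_gene(g, 2.0, 8.0) for g in genes[3:6]]
        offsets = [decode_gene(g, -1.0, 1.0) for g in genes[6:8]]
        return widths, lengths, offsets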

4.2.1.2. The First Generation

The first generation is created out of the individuals that represent the geometrical variations of the original CA model:

Figure 14. The first generation of GA individuals


4.2.1.3. The Fitness Function

The fitness criteria are still very simple, aimed at maintaining the spatial arrangement of the units of the original CA model (a sketch of the corresponding fitness function follows the list):

- intersecting volumes are kept to a minimum (Fig. 15);
- the distances between a non-edge unit and its Moore neighbours are kept to a minimum (Fig. 16 a, b);
- a non-edge unit and its Moore neighbours should intersect;
- the proportion of the width to the length of each unit should be close to the Golden Ratio (Fig. 17);
- the "Living" area must be bigger than "Working", and "Working" bigger than "Resting".
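
A minimal sketch of a fitness function combining these criteria is given below. The weights, the helper measures (intersection_volume_ratio and the like) and the model attributes are hypothetical names invented for the sketch; only the criteria themselves come from the list above.

    GOLDEN_RATIO = 1.618

    def fitness(model):
        """Sum of partial scores, one per criterion; each hypothetical helper
        is assumed to return a value normalised to [0, 1]."""
        score = 0.0
        score += 1.0 - intersection_volume_ratio(model)   # minimal intersections
        score += 1.0 - neighbour_distance_ratio(model)    # Moore neighbours stay close
        score += neighbour_contact_ratio(model)           # neighbours should touch
        ratios = [u.width / u.length for u in model.units]
        score += sum(1.0 - abs(r - GOLDEN_RATIO) / GOLDEN_RATIO
                     for r in ratios) / len(ratios)       # Golden Ratio proximity
        if model.living_area > model.working_area > model.resting_area:
            score += 1.0                                  # area ordering satisfied
        return score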

Figure 15. Intersecting volumes

Figure 16(a).


Figure 16(b).

Figure 17. The Golden Ratio programmed into the Fitness Function


4.2.1.4. The Process of Selection

The selection method applied is Goldberg's Roulette Wheel, or pie chart (Mitchell 1998). The principle is that the fittest individuals have a greater chance of being chosen as parents than individuals with smaller fitness values; a minimal sketch of the method follows.
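
In this standard sketch, each individual occupies a slice of the wheel proportional to its fitness and a random spin picks the parent; the function name matches the GA loop sketch above.

    import random

    def roulette_select(population, fitness_values):
        """Pick one parent; selection chance is proportional to fitness."""
        spin = random.uniform(0, sum(fitness_values))
        cumulative = 0.0
        for individual, fit in zip(population, fitness_values):
            cumulative += fit
            if spin <= cumulative:
                return individual
        return population[-1]  # guard against floating-point rounding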

Figure 18. The principle behind the Goldberg Wheel selection method.

4.2.1.5. Crossover and Mutation

The mutation rate was set to 0.5, meaning that an individual has a 50/50 chance of being mutated. The rate is relatively high, both to make sure that the algorithm works and evolves fitter individuals over the generations, and to give better diversity to the phenotypes.

Crossover rate: 1.0, meaning that every chosen pair of parents is crossed over.

Crossover type: one point, where a single crossover point on both parents' organism strings is selected. All data beyond that point in either organism string is swapped between the two parent organisms. The resulting organisms are the children.

Both general processes, mutation and one-point crossover, are explained in detail in Appendix 1, Traditional Genetic Algorithm: a Brief Description.
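
As a compact reminder, here is a sketch of both operators on binary chromosomes (lists of 0s and 1s, as in Appendix 1); the per-bit flip probability is an assumption of the sketch, distinct from the per-individual mutation rate of 0.5 quoted above.

    import random

    def crossover(mum, dad):
        """One-point crossover: swap all genes beyond a single random cut."""
        point = random.randrange(1, len(mum))
        return mum[:point] + dad[point:], dad[:point] + mum[point:]

    def mutate(chromosome, flip_chance=0.05):
        """Flip each bit with a small probability."""
        return [1 - bit if random.random() < flip_chance else bit
                for bit in chromosome]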

4.2.2 Outcome of the Algorithm

Ten individuals were evolved over 50 generations, showing the emergence of models that maintain a certain spatial arrangement of the different types of units, keeping them next to each other with a slight intersection of the masses.


Figure 19. 1st generation (colourful individual is the fittest with the Fitness Factor being 8.4).

Figure 20. 50th generation (colourful individual is the fittest with the Fitness Factor being 9.4).

Figure 21. Example of the fittest model of one level (Fitness Factor of 10.4).

Figure 22. Example of the fittest model of three floors (Fitness Factor of 7.4).


The models will be able to evolve more sophisticated shapes as more creative forms are programmed into the CA units instead of simple boxes, as the search space becomes more structured and as the selection methods improve.

Figure 23. Part of the search space

4.2.3 The Drawbacks of the Algorithm

The most obvious and crucial drawback is the algorithm's lack of diversity in the produced solutions, a result of the problem known as "local optima" and of the extremely narrow search space. The others are:

- the intensification process is not accurate;
- genetic drift;
- the production of a vast amount of data during an optimisation process without actually using much of it;
- a high convergence rate.

4.3 Examples of GA Applications in Architecture

Genetic Algorithms have been applied to a variety of architectural problems, mostly spatial, structural or performance optimisation, and also to floor plan generation. Only recently have GAs started to take on morphogenetic tasks, and even then most of the results tend to stem from some form of performance analysis and development. For example, SOM architects (Skidmore, Owings & Merrill, not to be confused with Self-Organising Maps) programmed a Genetic Algorithm that revealed a surprising, structurally efficient tower shape, a teardrop:

Figure 24. A teardrop tower by SOM architects


Another example, where a GA is used for shape optimisation based on analysis of the lines of principal stress, is the Transbay Transit Center by SOM architects:

Figure 25. Transbay Transit Center by SOM architects.

Not many are working on giving the GA's field of application another dimension: shape evolution that emerges from the behaviours of the other agents of a building's complex system. Among those who are, Aedas|R&D and Davis Brody Bond designed The National September 11 Memorial Museum in New York. They built a suite of software that allows architects as well as exhibition designers to simulate the cognitive effect of geometry on visitors as they view and move through the building. By calculating the expanse of views, maps of intuitive orientation are produced that help designers understand transitions between spaces where no obvious delineation exists (Aedas Computational Design and Research group 2013).

Figure 26. The models of movement and volumetric visual perception


Figure 27. Visibility polyhedra along a visitor's path

Figure 28. Volume of space visible from a single point in the exhibition

Figure 29. Strength and direction of visual field


Chapter 5. Self-Organising Maps and Genetic Algorithm

5.1 Artificial Neural Networks and Self-Organising Maps

Artificial Neural Networks (ANNs) were first introduced in the 1950s, but only in the mid-1980s did their algorithms become sophisticated enough for general applications. A self-organising map (SOM) (Kohonen 1982) produces a low-dimensional structured representation of the high-dimensional input space of the training samples, called a map. The data is represented by prototypes, called weight vectors. A SOM is trained using unsupervised learning, which means that no human intervention is needed during the learning and that little needs to be known about the characteristics of the input data.
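
A minimal training sketch on a square grid of neurons is shown below. The learning strengths echo the values used later in this thesis (winner 0.98, other neurons 0.95), while the grid size, neighbourhood radius and decay are assumptions of the sketch.

    import math
    import random

    def train_som(inputs, grid=10, dim=8, epochs=100,
                  winner_rate=0.98, neighbour_rate=0.95, radius=2.0):
        """Each neuron holds a weight vector; the best-matching unit (the
        "winner") and its grid neighbours are pulled towards every input."""
        neurons = {(i, j): [random.random() for _ in range(dim)]
                   for i in range(grid) for j in range(grid)}
        for _ in range(epochs):
            for sample in inputs:
                # winner: neuron whose weight vector is closest to the sample
                winner = min(neurons, key=lambda n: sum(
                    (w - s) ** 2 for w, s in zip(neurons[n], sample)))
                for pos, weights in neurons.items():
                    grid_dist = math.dist(pos, winner)
                    if grid_dist > radius:
                        continue                      # outside the neighbourhood
                    rate = (winner_rate if pos == winner
                            else neighbour_rate * math.exp(-grid_dist))
                    neurons[pos] = [w + rate * (s - w)
                                    for w, s in zip(weights, sample)]
        return neurons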

An example of a SOM algorithm applied in 3D modelling, together with its applications and other types of SOM algorithms, is described in Appendix 3.

5.2. Introducing Self-Organising Maps to the Classic Genetic Algorithm

5.2.1 Why Combine a Genetic Algorithm with Self-Organising Maps?

Having decided to engage computational methods of design and to implement the process of emergence within the design process, two main challenges arise: how to generate novel, unique and unexpected forms; and how to distinguish "interesting" and meaningful shapes once they have emerged.

One of the best ways known to date to tackle the first challenge is the technique known as genetic programming. A possible solution to the second could be to involve artificial neural networks.

Alone among the four types of Evolutionary Algorithms, the GA treats genotypes separately from phenotypes, i.e. it maps "from genotypes (evolved parameter values), to phenotypes, (to problems)" (Bentley, Kumar 1999). This quality on its own brings advantages such as "better enumeration of search space, permitting two very differently organised spaces to coexist" (Bentley 1999). The SOM, on the other hand, exhibits a quality similar to the associative memory of the brain. The brain naturally associates one thing with another; it can access information based on content rather than on sequential addresses, as in a normal digital computer. This associative, or content-addressable, memory accounts for fast information retrieval and permits partial or approximate matching (Ramos 2012).

Naturally, a Genetic Algorithm reduces the search space: as only highly compact genotypes are permitted to characterise phenotypes, genotypes have fewer parameters than their corresponding phenotypes, which reduces the dimensionality of the search space (Bentley 1999). I believe that the SOM could help to reverse this constant shrinking of the possibilities space, allowing higher parameterisation of the genotypes. At the same time, however, the expansion of the search space creates the challenge of managing the massively expanded choice.


On the other hand, the GA belongs to the class of stochastic methods that find solutions to problems by examining random parts of the search space. The SOM permits the thorough exploration of well-defined parts of the search space. If essential, more control can be gained by using artificial selection of the potential parents, where the designer hand-picks the individuals for crossover and mutation.

Moreover, a traditional GA generates a large number of solutions at a time rather than a single one, thus exhibiting a sophisticated quality of operational parallelism. This means the GA has a greater chance of finding a meaningful solution, but it also has a tendency to get "stuck" in a local optimum. The SOM attacks this challenge by allowing an in-depth search of the expanded neighbourhood of solutions similar to those that caught the designer's attention. Also, if there are a few potential candidate solutions, the whole spectrum of resulting models that lies between them can be assessed as well.

Design cannot be carried out without the visualisation of the potential models. In traditional methods most of the visualisation happens in the imagination of the designer and only the selected elements are brought to life by sketching or drawing. SOM in a way becomes that virtual imagination that structures, classifies and records the broader search space of the traditional GA and enhances the creative intuition of the designer.

Therefore, by combining the two approaches and organising the search space with the Self-Organising Maps algorithm, the possibilities space can be classified, enabling better optimisation of the search process and providing a more advanced visualisation of the solutions.

5.2.2. Managing the Expanded Choice

As mentioned earlier, the expanded search space is both an advantage and a drawback. Its benefits were covered above; the reasons why its size has to be controlled are described below.

Expanded choice can lead to "...less happiness, less satisfaction and can even lead to paralysis" (Piasecki, Hanna 2000). A designer can experience the same confusion when faced with a massive solution space offered by a Genetic Algorithm, even more so when it is expanded at each generation with the SOM.

Unlike machines, human beings have a very powerful filter of information when it comes to making a decision: the subconscious, formed by one's experience, knowledge and inherited genes. The verdict, however, is in most cases made by the conscious mind. When having to choose between two fairly equal options, most of us feel confused and distressed. I believe this is due to another human characteristic: personal attachment. Luckily, computers possess no personal attachment, so a way around the problem can be worked out.

Now, how do I imitate both the subconscious and the conscious mind in a program, to help the computer make the right choice even with the helping hand of the designer? What will the fitness factor consist of? Will there be a fitness factor at all? Or will it evolve over the duration of the algorithm?

Keller and Staelin suggested that the results of the decision-making process depend on the correlation between the quality and the quantity of information: if the amount of information increases whilst its quality is maintained, the efficiency of good decision-making will decrease, and vice versa. This concept is represented in the Wundt curve representation of the Paradox of Choice:


Figure 30. Wundt Curve: Graphic representation of the Paradox of Choice.

A Self-Organising Map can represent a multidimensional Wundt curve, where each product attribute is mapped onto a separate dimension. The shape of the curve will vary in each dimension and from user to user.

When it comes to architectural design, an architect faces certain challenges in successfully defining the meaningful attributes of the design:

- The importance of the parameters varies, whilst they all have to be read in conjunction with each other; this can be represented in weighted, complex fitness criteria.

- The end user simply may not know what he/she needs. The idea of co-evolution emerges.

- By nature the architectural design due to its complexity is an "ill-defined problem" or a "wicked problem", in which potential solutions evolve together with the formulation of the problem itself (Rittel 1973)

Below is the proposed breakdown of the important factors to consider whilst coding the process of selecting the right individuals for the crossover based on selected thoughts of Piasecki and Hanna (2000):

1. Limit the solution space by choosing the individuals based on an evaluation of their most essential and meaningful parameters, the parameters that the designer wishes to work with or customise.

2. Let multiple users experiment with the population and the algorithm to adjust the defining selection criteria for a particular problem.

3. Create an algorithm that learns from users and creates a model of their preferences based on a definition of meaningful solutions; it can be used later to narrow down the solution space. Alternatively, use a selection method based on the recognition of the remaining meaningful choices through artificial or "manual" selection. Users are not forced to explicitly define the values of all the parameters available to customise; instead they express preferences among a couple of different options.

4. Implement a recommender system.

Appendix 4 shows Piasecki and Hanna's experiment, which demonstrates that when a GA is used to rationalise the "meaningfulness" of choice, the Wundt-curve relation between user satisfaction and the scope of choice no longer applies.


5.2.3. The Evaluation of the Aesthetics

The challenge of managing the expanded choice was tackled by suggesting that the user assesses the essential criteria that determine both the performance and the aesthetic value of the emerged models. But how can the evaluation of aesthetic qualities be performed? How can they be controlled in a bottom-up computer-aided design process?

The formal study of aesthetics dates back at least to the ancient Greeks, who tried to establish connections and analogies between architecture and other forms of art, such as dance and music. This is why the principles of rhythm, proportion and symmetry were widely applied in Greek and Roman architecture. To this day, architects, critics, artists, psychologists and philosophers have persistently tried to develop objective methods for evaluating the aesthetic qualities of buildings or even of simple objects. Still, their efforts have resulted in neither agreed-upon aesthetic "standards" nor methods for comparing objects or buildings to such standards. Experimental methods do exist, described as "habitual" or "computational" (Kalay 2004); they include an "objective" or mathematical approach, a "subjective" or perceptual approach, shape grammars, computational approximation of "acquired taste" and so on.

Establishing aesthetic criteria is extremely challenging, if not impossible, as the relation between the design's formation and the architect can be seen as a process of co-evolution. One way to define an aesthetic evaluation is as a translation from the evolving overall perceived image of an architectural model into an automatic set of mathematical selection rules. In the absence of such a mathematical function, the final result will remain an exploration of some random sub-spaces of possible and hypothetical novel solutions. On the other hand, the objective and analytical evaluation of any final design could be misleading if such aesthetic fitness functions were implemented, mapping genotypes onto the "usefulness" of the hypothetical novel forms. This is probably mostly because such aesthetic "mapping" uses a compression method, in which the diversity of the multidimensional conceptual world of the design is reduced to and represented by only a few of its aspects. Jorge Luis Borges's words on these matters are wise: the only true map of the world is the world itself.

There is, however, one way to avoid such mathematical mappings: using the human observer, or "artist", as an "aesthetical mapping machine" connected in real time to the artificial evolutionary process (Ramos 2012). In other words, the human being selects the images that are aesthetically pleasing, or otherwise interesting, and these are used to breed the next generation. The idea was first introduced in 1991 by Karl Sims, whose computer-graphics program Primordial Dance uses genetic algorithms to generate new images, or patterns, from pre-existing ones (Sims 1991):

Figure 31. Images evolved by the Primordial Dance genetic algorithms.


5.3. The Evolution of the Proposed Code

5.3.1. Stage 1. Classic GA for the Current Architectural Scenario

The algorithm was described in section 4.2, Classic GA for the Current Architectural Scenario.

5.3.2. Stage 2. Adding Golden Ratio to the Fitness Criteria

An arguable aesthetic factor was added to the model's existing fitness criteria. The idea was to encourage the proportions of the width and length of the boxes to be close to the Golden Ratio, 1.618, whilst maintaining the spatial arrangement of the units of the original CA model.

Observations and Conclusion:

The more factors are added to the fitness factor, the more problematic local optima become. The algorithm has a tendency to latch onto one or two fitness criteria and evolve them, their very high scores compensating for the low fitness of the other criteria, as in the example shown below: the units' proportions became very close to the ratio of 1.618, achieving a very high fitness factor, yet the intersection and mutual proximity requirements are far from being met.

An example of an evolved model whose very high total fitness factor comes mostly from meeting the Golden Ratio proximity criterion, leaving the other fitness factor constituents un-evolved:

Figure 32. A model with a high fitness value assessed on meeting the Golden Ratio criteria


The emergent properties of the resulting models were not consistent, as the fitness function was confusing the process of evolution. To attempt to solve this problem, the following code adjustments could be implemented:

- to introduce the process of expanding and structuring the search space with Self-Organising Maps; alternatively,

- to ensure that a model is allowed to become a parent only if each component of its fitness criteria is above a certain acceptable level;

- to create an elimination criterion, which can also get stricter with each generation, thus still encouraging evolution while at the same time promoting diversity and avoiding the problem of local optima.

Nevertheless the algorithm did work and produced some good results. For example, below is the comparison of the best individual of the 1st generation and the best individual of the 250th generation.

Figure 33. The best individual of the 1st generation

Figure 34. The best individual of the 250th generation.


5.3.3. Stage 3. Introducing Self-Organising Maps to the Genetic Algorithm

Each Self-Organising Map trains its neurons to become close matches to the individuals of each generation, which serve as the "inputs" of the neural map. The neurons highlighted in RED are the best matches to individuals of the generation, the "winners". At this stage the fitness of the neurons within the map was not checked, so they could not yet become parents and fuel the evolution.

Figure 35. SOM expanding each generation of GA.


5.3.4. Stage 4. Turning Chromosomes into Neurons and Back

In order for neurons to participate in the selection process, chromosomes and neurons have to be compatible in their parameterisation, as they belong to two different classes in the algorithm: one that produces and works with GA individuals and their chromosomes, and another that works with SOM neurons.

At the stage when the SOM inputs are generated, every individual from the current generation has to serve as an input neuron. To do that, chromosomes and their values have to be decoded into the vectors of the corresponding neurons. As the chromosome coding and the neuron's vector parameterisation are exactly the same, fitness assessment can be performed with the same function. Later, however, after the SOM is trained and parents have to be selected, neurons have to pass their vectors' values back to form chromosomes for crossover. To achieve this, the resulting vectors of the trained neurons have to be encoded back into the binary string of a chromosome.

Turning a GA individual into a SOM neuron

In a standard GA, a decode function takes a chromosome list containing zeroes and ones and turns it into a list of the values corresponding to each gene (see Appendix 1, Traditional Genetic Algorithm: a Brief Description). These values are assigned to the corresponding neurons' vectors.

Turning neuron back to GA individual

After training, the SOM neurons hold generated lists of values, and a "reverse decoding" has to be performed to obtain chromosome lists ready for the crossover and mutation of the individuals. In other words, the task is to create a chromosome list like this:

From a values list like that:

Also, all neurons' values are decimal numbers (as they were trained by the artificial neural network), so each value has to be rounded to the nearest whole number, no bigger than the maximum decode value for the defined gene length.
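A matching sketch of the reverse step, rounding each trained value to the nearest whole number and capping it at the maximum decode value (names are illustrative):

```python
def encode(values, gene_length=4):
    """'Reverse decoding': turn a trained neuron's value list back into
    a binary chromosome list ready for crossover and mutation."""
    max_value = 2 ** gene_length - 1
    chromosome = []
    for v in values:
        # Neuron values are decimal after training: round to the nearest
        # whole number and cap at the maximum decode value for the gene
        v = max(0, min(int(round(v)), max_value))
        chromosome.extend(int(bit) for bit in format(v, "0{}b".format(gene_length)))
    return chromosome

# e.g. encode([10.7, 2.2]) -> [1, 0, 1, 1, 0, 0, 1, 0]
```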

Now neurons have all the information necessary for crossover and they now can become parents.

Page 36: Amiina Bakunowicz_MSc Thesis_NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM: EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

36  

Colour-coding of the models:

PURPLE - individuals of the Genetic Algorithm generations
BLACK - the Self-Organising Map trained on each generation
RED - the SOM "winners", the models that are the closest match to individuals of their generation
GREEN - the fittest models among each GA generation and the neurons of the SOM.

The fitness criteria are unchanged.

The selection process is carried out among the individuals of the particular generation and its corresponding Self-Organising Map, using Goldberg's Roulette Wheel method.

Mutation rate: 0.2.

Crossover: one-point type with a rate of 1.0. Siblings replace random individuals of the current GA generation, thus becoming the population of the next generation.

Self-Organising Maps and their values: winner learning strength 0.98; others' learning strength 0.95; convergence rate 0.3.
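A minimal sketch of these genetic operators with the settings above; whether the 0.2 mutation rate applies per gene or per individual is not stated, so a per-gene flip is assumed here:

```python
import random

def one_point_crossover(parent_a, parent_b):
    """One-point crossover at rate 1.0: the parents always recombine."""
    cut = random.randint(1, len(parent_a) - 1)
    return parent_a[:cut] + parent_b[cut:], parent_b[:cut] + parent_a[cut:]

def mutate(chromosome, rate=0.2):
    """Flip each binary gene with the given probability."""
    return [1 - g if random.random() < rate else g for g in chromosome]
```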

Figure 36. The example of the resulting evolution of the SOM-expanded generations.


Figure 37. Example of the first generation and its Self-Organising Map

Observations:

- All the fittest models are within the SOM;

- As the evolution progresses there are fewer winners, because the training individuals from the later generations become more and more similar to each other. Therefore one "winner" can be a match to several individuals of the GA;

- By the 7th generation the algorithm got stuck in a local optimum. The population of each generation should be refreshed each time by replacing the old individuals with neurons of high fitness. Alternatively, an age criterion could be incorporated into the fitness value. On the other hand, this would not yield good results, as complicating the fitness criteria leads to losing some values while concentrating on others. It can be fixed, however, by introducing the filter that determines the individuals with equal proportions of the fitness criteria components, as mentioned earlier;

- There is a linear correlation between the fitness values of the neurons and their location within the map if the fitness criteria are mostly dependent on the geometry of a model.


Conclusion:

- Manage the expanded choice by introducing the elimination criteria;
- Introduce the threshold for the Fitness Factor components;
- Use artificial selection;
- Refresh the population of each generation by replacing the old individuals with neurons of high fitness.

5.3.5. Stage 5. Co-evolving Threshold of the Fitness Factor Components

In order to gain any fitness, an individual (or a neuron) must have all components of its fitness factor above a defined acceptable level. If one of the Fitness Factor constituents is below the defined threshold, the total fitness is set to zero. This is introduced for two reasons. Firstly, so that individuals evolve their fitness evenly across all components; the individuals whose fitness is zero do not participate in the selection process at all. Secondly, to manage the problem of expanded choice and to structure the search space by narrowing it down, eliminating the individuals that are not fit across all the components of the fitness factor. Computational time is also saved, as the neurons with zero fitness do not go through the laborious process of encoding their values into binary chromosome code.

Figure 38. Extract from the code: assessing the fitness value
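A minimal sketch of this thresholded assessment (illustrative names, not the extract shown in Figure 38):

```python
def total_fitness(components, thresholds):
    """Sum the fitness constituents only if every one of them is above
    its minimum acceptable level; otherwise the individual's total
    fitness is zero and it is skipped by selection (and by the costly
    encoding of its values back into a binary chromosome)."""
    for value, minimum in zip(components, thresholds):
        if value < minimum:
            return 0.0
    return sum(components)
```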

However, if the population is too small, the chance of individuals having a fitness value of zero is higher and no crossover can happen. If all individuals have a fitness of zero, the algorithm communicates this and suggests setting up more individuals in the next run.

Much fitter individuals started to appear straight away in the next generation; however, the speed of evolution slows down. To keep it up, a co-evolving threshold can be introduced, i.e. the defined level gets higher and higher depending on the growth of the Fitness Factor components of each individual. In this case, to keep it simple for the sake of the experiment, the threshold of each fitness component increases by 5% with each generation.
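This co-evolving threshold might be sketched as follows, assuming a uniform 5% growth per generation as stated above:

```python
def raise_thresholds(thresholds, growth=0.05):
    """Co-evolving threshold: raise each component's minimum acceptable
    level by 5% with every generation."""
    return [t * (1.0 + growth) for t in thresholds]
```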

Co-evolving the fitness component thresholds and composing the next generation from the siblings of the crossover process resulted in a surprisingly slow fitness increase. However, the algorithm gave back something much more valuable: from the second generation onward, almost 90% of the neurons were above the fitness component thresholds, structured in a clustering manner. At the same time, the variety was very poor. The algorithm was behaving very similarly to the classic GA, tending to get stuck in local optima.


Observations and Conclusions:

- At this stage, by filtering the population with the elimination criteria and applying artificial selection, the designer has a great choice of good-quality models, but of poor diversity. To increase it, the Fitness Factor has to contain non-geometrical criteria;

- As the algorithm structures the search space and only concentrates on the individuals with higher fitness, Goldberg's Roulette or even artificial selection was not a very logical choice. A totally random pick of parents from the group of already selected individuals could be perfectly adequate, as the selection is already programmed into the fitness threshold. The problem with Goldberg's Roulette is that it tends to select the fittest individuals, thus creating a perfect environment for future local optima. By randomly selecting parents from the cloud of fit candidates, the results can hopefully be more diverse, especially when restructured by Self-Organising Maps;

- The limited variety of the models' geometry is down to the fact that the fitness factor is composed entirely of external criteria;

- By changing the learning factor from 0.3 to 0.5 (the lower the factor, the closer a neuron's parameters are to the individual it is trying to match), greater diversity was observed among the neurons, resulting in a very aggressive fitness increase;

- Keeping the learning factor at 0.5, the mutation process was eliminated, giving a much more structured space and slightly faster fitness evolution;

- The population needs to be evolved over far fewer generations than with the classic GA;

- Even if all the individuals in the generation have a fitness of zero, a map can produce a couple of neurons with some fitness. It is like driving a car using only the clutch;

- The fitness grows so fast from generation to generation, even with the co-evolving fitness threshold, that it is necessary to introduce a fitness assessment function to determine the reasonable threshold level for each generation;

- The algorithm gains its effectiveness with larger populations and neural maps, which takes a lot of computational power and time.

5.3.6. Stage 6. Optional Selection Method

As the selection of individuals is already programmed into the fitness, the assumption is that there is no point in using Goldberg Roulette as a selection method. Random coupling can be performed, and the offspring can compose the next generation. The experiments showed that if there are more couples than the number of individuals in each generation and the fitness threshold is not set very high, then the evolution dies out. At best the fitness remains exactly the same throughout the generations, without even giving much form diversity. That is due to the wide variety of fitness values at a given level, so the fitness fluctuates from generation to generation without steadily going up.


As a solution, the offspring of all couples can compose the next generation and thereby increase the size of the neural map, so its efficiency is not lost. Coincidentally, the phenomenon of the ever-changing population size of each generation is very similar to what occurs in nature: the size of a population depends on many factors, including luck itself, and is never constant. The drawback of this method can be an excessively large population and correspondingly large maps. To control the situation, the fitness threshold must be evaluated properly and adjusted to represent only the appropriate range for selection purposes. Only the parents with the right characteristics should participate in the creation of the next generation.

Figure 39. SOM participates in crossover function. The dark and light orange coloured individuals represent the parents of the next generation.

As a result, the potential parents were filtered, leaving only those with a fitness greater than half that of the best individual of the current map. This was done to minimise the size of the subsequent generations and their neural self-organising maps. The algorithm thus evolves a fitter generation from the expanded search space extremely fast. However, the biggest drawback is that it has to run many times before it can create a first generation where the components of the fitness criteria are in the right balance. If that doesn't happen, the code simply terminates with a message informing the user that the fitness of all individuals is zero. In a way this is not a bad thing, as it stops the user from wasting computational time evolving the wrong generations. Even though the algorithm works well, something could be done to produce a more appropriate first generation, as at the moment it is created randomly. Alternatively, the fitness component thresholds could be made more intelligent and set depending on the fitness values of each component of the current generation; this second option is more realistic to program. The same goal was achieved by simply lowering the Fitness Component Threshold and increasing the minimum fitness level for the parents' selection. Also, a maximum size for the neural maps was set to keep the computing time within realistic limits, as sometimes the map can be so large that the algorithm concentrates too long on one generation without any benefit to the evolution process.
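A minimal sketch of that filtering rule (the data layout is an illustrative assumption):

```python
def filter_parents(candidates):
    """Keep only candidates whose fitness is greater than half that of
    the best individual on the current map, bounding the size of the
    next generation and of its self-organising map.

    candidates: dict mapping an individual's id to its fitness value
    """
    best = max(candidates.values())
    return [i for i, f in candidates.items() if f > 0.5 * best]
```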


The resulting maps showed well-defined clustering (due to the fitness factor being geometry dependent), displaying the concentration of the fittest models on the maps:

Figure 40. Clustering of potential parents.


5.4. The Final Algorithm

5.4.1. The Structure of the Algorithm

The algorithm is described in the stages of its performance.

5.4.1.1. Original CA Model and Body-Plan

The initial part of this stage remains unchanged and is described in section 4.2.1.1. Original CA Model and Body-Plan. An example of a possible architectural application and parameterisation of the body-plan is described below.

Once the parameterised CA volumes are built, the morphogenetic process of the final architectural model begins. First, the centroids of the edge boxes are located. Then the centroid of the area between these points is determined.

Figure 41. Locating the centre point between the centroids of all edge CA units


After that, an ellipse is drawn from the centroid of each CA unit, as shown in Figure 42, and then extruded (Fig. 43).

Figure 42. Creating ellipses.

Figure 43. Extruding ellipses.


The top and bottom ellipses of each unit are divided into an equal number of segments. The division points are shown in Fig. 44.

Figure 44. New shape of the CA units and their division points.

Figure 45. Double loft surfaces going through the division points.


Figure 46. Surfaces of the new model with the bridges between the units

Figure 47. Top view of the model


All similar types of units are connected with each other by a bridge-platform, providing the connectivity between the functional zones.

Figure 48. Colour-coded connection bridges between the similar units on each floor.


Figure 49. Perspectives of the model


Figure 50. The bridges viewed from the interior of the towers

The alternative models that can be programmed into the proposed algorithm are shown in Appendix 8. The task of this stage is to find ways to parameterise the defined functional zones so they can be moulded together into one connected space, and to let the user choose the method. Alternatively, the spatially allocated masses can be moulded together by combining all the objects with different sets of parameters into one (with the sum of the characteristics of all the separate parts). Parameters do not necessarily need to represent geometry; qualities can be both extensive and intensive.

A different initial model can be chosen for development. For example, a spatial layout resulting from automated space planning by the Computational Design group at Aedas R&D could be used. The concept behind this planning is that the accommodation schedule is loaded and its components try to arrange themselves automatically according to an adjacency matrix (Aedas Computational Design and Research group 2013):

Figure 51. Abu Dhabi Education Council: accommodation schedule


Figure 52. Abu Dhabi Education Council: massing derived from layout application

Figure 53. Abu Dhabi Education Council: Design principles


5.4.1.2. The First Generation

The first generation is composed of the "fit" models hand-picked by the designer from a number of random "fit" models generated by the algorithm. Each of them is given a unique colour that will evolve together with the form of the models. The first generation is created out of individuals that represent geometrical variations of the original CA model. Each CA unit is parameterised according to its own state, being either "working", "living" or "resting", and located within a chromosome-defined distance from the CA origin of each unit:

Figure 54. The first generation hand-picked by the designer.

5.4.1.3. Introducing Self-Organising Maps

The detailed process is described in section 5.3.3. Stage 3. Introducing Self-Organising Maps to the Genetic Algorithm.

Clustering of the neural map is performed based on the proximity of each neuron to the winners. Each neuron chooses the closest winner and assigns itself to that winner's cluster. At the same time, each cluster's list of individuals contains the corresponding original individual from the current population. For visualisation purposes each individual has its own colour, and its best match on the neural map has the same colour. Every neuron on the map changes its colour to match the colour of the closest input, or teaching neuron (marked with a circle around it). This way the self-organising map displays colour-coded clusters (a sketch of this cluster assignment follows the figure):

Figure 55. Colour-coded clusters of the map classifying it according to the similarity of neurons to each input (or individual of the current generation).
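A minimal sketch of this winner-based cluster assignment (names and data layout are illustrative):

```python
import numpy as np

def cluster_by_winner(neurons, winners):
    """Assign every neuron of the trained map to the cluster of the
    winner it is closest to.

    neurons: (n, d) array of trained neuron vectors
    winners: indices of the winning neurons (best matches to inputs)
    Returns a dict: winner index -> indices of the cluster's members.
    """
    clusters = {w: [] for w in winners}
    for i, vector in enumerate(neurons):
        nearest = min(winners, key=lambda w: np.linalg.norm(vector - neurons[w]))
        clusters[nearest].append(i)
    return clusters
```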


5.4.1.4. The Fitness Function

One of the biggest challenges in evolving an optimal design solution with multi-objective criteria is balancing the fitness criteria components, or the overall performance of the solution. The goal is to overcome the problem where a design proposal excels with regard to some performance objectives and yet proves insufficient on others. To achieve that, one of the most crucial principles of the current algorithm is the condition that the sum of all components is assigned to the total fitness of an individual only if the value of each fitness constituent is above its minimum corresponding threshold level.

The initial fitness constituents were described in chapter 4.2 Classic GA for the Current Architectural Scenario, section 4.2.1.3. The Fitness Function. Two additional fitness components are added:

- Minimising the total intersection area of the elliptical slabs of the new architectural model (Fig. 56);

- Minimising the length of the centroid curve of the model (Fig. 57 a, b, c).

Figure 56. Intersection areas of the elliptical slabs.


Figure 57(a). Middle point of all the edge units’ centroids.

Figure 57(b). Centroid curves of the models

Figure 57(c). Centroid curves of various length


Composition of Fitness Criteria

At this stage the proposed algorithm assumes that all fitness criteria components are equally important. However, the fitness constituents can be programmed to have their own weightings within the total fitness value, thus mathematically assigning their importance in the overall fitness assessment, if after the evaluation process it is clear that all design objectives cannot be equally satisfied. The threshold condition ensures that the GA will not evolve an individual with a high total fitness value represented by only one or two fitness constituents, and at the same time it naturally narrows down the SOM-expanded search space. Furthermore, depending on the selection method, a fit neuron is selected from each cluster for crossover and mutation in order to keep the diversity rate high.

Figure 58. Composition of Fitness Criteria

Fitness Function Explained Mathematically:

max(TFabc) = max(Fa) + max(Fb) + max(Fc), where max(TFabc) is the maximum value of the total fitness and max(Fa), max(Fb), max(Fc) are the maximum possible values of the constituents Fa, Fb, Fc of the actual total fitness TFabc. max(Fa), max(Fb), max(Fc) are calculated at the beginning of the algorithm.

max(TF) = 100 is the preassigned new maximum total fitness value, so that the actual value is mapped to lie between 0 and 100.

β = max(TF)/max(TFabc) = 100/max(TFabc), where β is the mapping index.

TFabc = Fa + Fb + Fc is the actual total fitness, where Fa, Fb, Fc are the constituents of the actual total fitness.

Fthr = 20% is the fitness threshold, the minimum share of its maximum value that each fitness constituent must reach for an individual to qualify as a candidate to become a parent; if any of the fitness components is less than this threshold, the total fitness of that individual automatically goes to zero.

The actual minimum value of the fitness component a is min(Fta) = Fthr * max(Fa).

If Fa > min(Fta), then the value of the fitness constituent a is mapped into the range between 0 and 100: Fa acquires the value Fa * β. The new total fitness value is TF = Fa + Fb + Fc, and its maximum value is 100.
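The same mapping expressed as a short, runnable sketch (three constituents and all names are illustrative):

```python
def mapped_total_fitness(f, max_f, fthr=0.20):
    """Map the actual total fitness TFabc into the range 0..100 and zero
    it if any constituent is below its threshold share of its maximum.

    f:     [Fa, Fb, Fc]                 actual constituent values
    max_f: [max(Fa), max(Fb), max(Fc)]  their maximum possible values
    """
    beta = 100.0 / sum(max_f)              # the mapping index
    for value, maximum in zip(f, max_f):
        if value < fthr * maximum:          # below min(F) = Fthr * max(F)
            return 0.0
    return beta * sum(f)                    # TF, at most 100

# e.g. mapped_total_fitness([8, 5, 6], [10, 10, 10]) -> 63.33...
```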


Fitness Topography

As mentioned earlier, due to the purely geometrical fitness criteria, the fitness values of the maps and of the GA individuals are arranged in clusters throughout the generations. The surfaces below represent the Fitness Topography of the entire population evolved over the generations. This visualisation is useful for analysing the performance of each neuron and map, and for data-mining. The cooler the colour, the higher the fitness value.

Figure 59. Example 1: The Fitness Topography Surface of the population evolved over six SOM-expanded generations (a)


Figure 60. Example 1: The Fitness Topography Surface of the population evolved over six SOM-expanded generations (b)

Figure 61. Example 1: The Fitness Topography Surface of the best individuals of each SOM-expanded generation


Figure 62. Example 2: The Fitness Topography Surface of the population evolved over six SOM-expanded generations (a)


Figure 63. Example 2: The Fitness Topography Surface of the population evolved over six SOM-expanded generations (b)

Figure 64. Example 2: The Fitness Topography Surface of the population evolved over six SOM-expanded generations (c)


Figure 65. Example 2: The Fitness Topography Surface of the best individuals of each SOM-expanded generation


5.4.1.5. Selection Options

5.4.1.5.1. The Final Decision-Making Rules

- Provide the user with the minimum of parameters, only those that they wish to work with or customise;

- Define multiple constraints. Define a separate, very simple multiple fitness criterion that consists of purely functional values that increase the feasibility of the building (min/max floor area, min/max height, structural capacity, etc.). Eliminate the least fit individuals of each benchmark from the current SOM-expanded generation instead of the classic "choose the fittest". This stops the algorithm from taking into account the attractive features of non-suitable individuals during the next few steps;

- Offer a simple visualisation of each architectural solution, enabling the user to assess it quickly and efficiently;

- Use artificial selection of the individuals for the crossover. The individuals can be assessed in pairs to increase the efficiency of choice and promote the intuitive choice of "good enough" individuals instead of looking for the optimal ones (Schwartz, B. 2004). This step can be repeated a few times in order to narrow the candidates down to a desirable number of potential parents;

- Choose when to choose parents for crossover. Sometimes only mutation needs to be applied. The choice can be left for a Recommender System to find the patterns (not part of this thesis);

- Regardless of the chosen selection method, the first selection pass is performed in the background during the fitness assessment stage. The total fitness is assigned to an individual only if the value of each fitness constituent is above its minimum corresponding threshold level. Otherwise it is set to zero, thus carrying out the first filtration of the potential parents;

- The parents are chosen one from each cluster. However, if there are no fit individuals in a cluster, then a brand new individual is created that has a fitness above the average of that particular cluster. This fulfils three purposes. Firstly, it keeps the evolution from dying out by keeping up the number of individuals, as the selection criteria are relatively high. Secondly, it serves as another, more effective variation of the mutation function and keeps diversity levels up. Thirdly, it non-intrusively broadens the relevant search space by bringing in an occasional individual of relatively high fitness.

Even though the preferred and most logical method for handling the expanded choice is artificial selection, the other two selection methods are introduced for comparison, and they proved to perform equally well, if not sometimes better, when facing various design objectives.

5.4.1.5.2. Selection Option 1: Optimised Random Selection by Clusters

As the neural map is trained, classifying the neurons according to their vector parameterisation and dividing the map into clusters, the selection process takes place. The initial selection of individuals is already programmed into the fitness function: only those individuals whose fitness constituents are above a certain evolvable threshold are given any fitness at all.


After the neural map is trained, of all the fit individuals in each cluster, only those whose total fitness is above the average of all the fit neurons within the cluster qualify to go through this stage. The cut-off level can be set even higher if the purpose is to speed up the evolution process.

The following rules are applied to the selection of parents among the neurons that are fit enough, i.e. have a fitness value above a certain threshold over the average cluster fitness:

- If there are two or more fit enough neurons in the cluster, then two random ones are selected for the crossover;
- If there is only one, then it becomes a parent automatically;
- If there are no fit enough neurons, then a brand new fit enough individual is created to become a parent (Fig. 66 on the next page).

This selection method is used when the designer is more confident with the programmed fitness evaluation rules and wishes to give more control to the machine, which is programmed to look for a potential parent in a certain area of each cluster. A sketch of these rules is given below.
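A minimal sketch of these selection rules (the data layout and the helper for creating a new fit individual are illustrative assumptions):

```python
import random

def optimised_random_selection(clusters, fitness, new_fit_individual):
    """Pick parents cluster by cluster, following the three rules above.

    clusters: dict winner index -> member neuron indices
    fitness:  dict neuron index -> total fitness value
    new_fit_individual: callable that creates a brand new individual
                        whose fitness is above the given cluster average
    """
    parents = []
    for members in clusters.values():
        fit = [m for m in members if fitness[m] > 0]
        average = sum(fitness[m] for m in fit) / len(fit) if fit else 0.0
        eligible = [m for m in fit if fitness[m] > average]
        if len(eligible) >= 2:
            # Rule 1: two random fit enough neurons become parents
            parents.extend(random.sample(eligible, 2))
        elif len(eligible) == 1:
            # Rule 2: a lone fit enough neuron becomes a parent
            parents.append(eligible[0])
        else:
            # Rule 3: a brand new fit enough individual is created
            parents.append(new_fit_individual(average))
    return parents
```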

5.4.1.5.3. Selection Option 2: Artificial Selection by Clusters and the Evaluation of Aesthetics

The proposed selection method gives power to the designer's experience and intuition in order to evaluate the aesthetics of the evolving model when the artificial selection method is chosen. At the same time, the designer influences the aesthetics by programming low-level implicit rules into the code. It also helps to avoid the local optima problem.

The individuals are assessed in pairs, going from cluster to cluster, in order to increase the efficiency of choice and promote the intuitive choice of "good enough" individuals instead of looking for the optimal ones (Schwartz, B. 2004). This step must be repeated a few times in order to narrow the candidates down to a desirable number of potential parents. The structure of the algorithm allows the designer to choose a different number of parents for crossover at each generation. Also, at this stage the crossover rate, not the designer, controls which parents mate. An option for the user to choose which individuals to carry over to the next generation and which to pair could be suggested as a possible variation of the code.

The following rules are applied to the selection of parents among the neurons that are fit enough, i.e. have a fitness value above a certain threshold over the average cluster fitness:

- If there are two or more fit enough neurons in the cluster, then the designer chooses two among them for the crossover;
- If there is only one, then it becomes a parent automatically;
- If there are no fit enough neurons, then a brand new fit enough individual is created to become a parent (Fig. 66 on the next page).

The artificial selection method can be used when the priorities are the aesthetic qualities of the model, or when the architect feels it essential to personally carry out the assessment of the offered solutions and needs to put his/her hands directly into the design process.


Figure 66. SOM, its clusters and the selection principle


5.4.1.5.4. Selection Option 3: Goldberg Roulette

The mechanics of this selection method were described in the chapter 4.2 Classic GA for the Current Architectural Scenario, section 4.2.1.4. The Process of Selection.

It was introduced in the proposed algorithm as a comparison to the more deterministic methods described above. Even though it is the simplest and the closest to a natural selection process, it has its drawbacks. It may lead to selection biased towards high-fitness individuals, and it can also miss the best individuals of a population: there is no guarantee that good individuals will find their way into the next generation.

5.4.1.6. Crossover

Depending on the chosen selection method, a list of parents is generated. The couples are then formed randomly from the individuals on that list, and the same type of one-point crossover is applied. The unique quality of this algorithm is that the size of each generation, and therefore of the corresponding neural maps, can change depending on the number of parents chosen during the selection stage. This helps to maintain diversity at the early stages of the design process. However, once a small solution space is defined as relevant and optimal, the number of individuals can go down, allowing a thorough investigation of that particular area of the search space.

5.4.1.7. Mutation

The mutation rate can be changed depending on the level of genetic diversity that needs to be maintained. However, if a new individual was created for any of the clusters because there were no fit ones, then mutation is not applied to it. Mutation as a genetic operator was described in more detail in section 4.1.1. Brief Description.

5.4.1.8. The Following Generations

All chosen parents are mated and the siblings compose the next generation. The same process is performed again for a certain number of generations, or until the individual(s) with adequate characteristics has/have evolved.

5.4.1.9. Visualisation of the Search Space by Clustering and the Evolution of Colour

The illustration below shows the neural maps in clusters. The colour of each cluster represents the colour of the corresponding training individual of the current generation; each neuron acquires the colour of the winner closest to it (the neuron closest to the training individual of the current generation). When the parents are chosen, their offspring inherit the averages of the parents' R and G colour indexes, spliced in a randomly chosen proportion. B is constant for all individuals and their offspring to keep the colours readable. Therefore the colour of the clusters evolves through the generations in parallel with the shape of the models; a sketch of this colour inheritance is given below:
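A minimal sketch of one plausible reading of this colour inheritance (the constant blue value and all names are assumptions):

```python
import random

def offspring_colour(parent_a, parent_b, blue=150):
    """Splice the parents' averaged R and G indexes in a randomly
    chosen proportion; B stays constant so colours remain readable.

    parent_a, parent_b: (r, g, b) tuples with 0..255 channels
    """
    t = random.random()  # the randomly chosen splicing proportion
    r = t * parent_a[0] + (1 - t) * parent_b[0]
    g = (1 - t) * parent_a[1] + t * parent_b[1]
    return int(r), int(g), blue
```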


Figure 67. The example of the population evolved over three generations.


Figure 68. Example of a population with the developed body-plan, evolved over two generations.

Figure 69. The perspective of the same population.


So far, the neurons' vector parameters have been 100% geometrical, which gives the map very well-defined visual clustering. As more implicit types of parameters are programmed into the vector (or chromosome values) and into the Fitness Factor, the appearance of the search space will become more diverse.

5.4.2. Main Results and Observations

When the algorithm finishes running, it produces a summary of the data of each generation (Fig. 70).

Figure 70. The example of the algorithm’s report

The key observations from the process of the algorithm's development:

- The evolution of the model is recorded for further reflection, as the fittest solution is not always the most appropriate in both the functional and the aesthetic sense;

- By changing the neural learning factor from 0.3 to 0.5 (the lower the factor, the closer a neuron's parameters are to the individual it is trying to match), greater diversity was observed among the neurons, resulting in a rapid fitness increase and a much more structured space;

- Even if all the individuals in the generation have a fitness of zero, a map can produce a couple of neurons with some fitness. It is like driving a car using only the clutch;

- By changing the convergence rate and the learning strengths of the "winners" and "non-winners", the designer can control which part of the GA's search space will be explored next;

- In order to avoid the local optima problem, the best solutions are artificial selection and the introduction of other, non-geometrical fitness criteria components. The relationship between the units of each model must be understood as a notion of "sympathy" and be reflected in the way they acquire their form;

- The majority of the fittest models are located on the neural map;

- As the evolution progresses there are fewer winners, because the training individuals from the later generations become more and more similar to each other. Therefore one "winner" can be a match to several individuals of the GA;

- There is a linear correlation between the fitness values of the neurons and their location within the map if the fitness criteria are mostly dependent on the geometry of a model.


5.5. Advantages and Disadvantages of GA-SOM

5.5.1. Advantages

As more and more complicated design tasks are set in front of computers, it is essential to take care of the "representation of the solutions"; otherwise the increase in complexity of the genetic algorithms can cause certain problems, such as "disruption of inheritance and premature convergence to local minima often preventing the generation of fit solutions altogether" (Bentley & Kumar 1999).

Conventionally, architects use a sketching technique known as "thinking with diagrams", a process where the designer "lets his/her hand do the thinking" (Akin 1978). Sketching is performed with the context of a specific problem in mind, when the solution is not fixed to begin with; rather, it emerges as the process unfolds (Habraken 1985). It has been argued that this emergence of form, and the meaning behind it, from a representation is the essence of creative design. SOM provides the designer with such visual feedback on the possible solutions and with a reinforcing system to recognise and understand the possibilities and implications of the evolving design.

The explored search space of the Genetic Algorithm is mapped only according to the fitness criteria. The biggest advantage of the SOM-GA algorithm is that its search space is also mapped based on the results of the evolution of the neurons' vectors, or the individuals' chromosomes. The design process is led towards the part of the possibility space where the combination of the model's programmed parameters and fitness is found to be appropriate.

The other ways in which SOM benefits the current algorithm:

- It re-structures and expands the possibility space;

- It performs better at generating unexpected solutions, thus potentially being able to compete with human creativity;

- It introduces the clustering property to the search space of the GA, a characteristic that is a direct consequence of the data visualisation and exploration capabilities of the topographic map, namely the ability to evaluate the solutions in non-procedural ways;

- The population needs to be evolved over far fewer generations than with the classic GA;

- SOM classifies the search space in order to make the essential choice of which regions of the space should be explored next;

- Out of all known algorithms, only SOM handles the challenge of working with models of large dimensionality. GA individuals are parameterised following a similar principle to the neurons in SOM, making GA and SOM compatible;

- It minimises many drawbacks of the traditional GAs, such as inaccuracy of the intensification process, genetic drift, production of vast amounts of data during an optimisation process without actually using much of it, a high convergence rate, limited diversity, and the problem of local optima, and it helps to broaden and amplify the synthetic intuition of the designer.


The main characteristics of SOM that are automatically passed to a GA search space that uses neural maps to expand each generation:

- It is an unsupervised learning algorithm that reduces the dimensionality of the data onto a certain number of units (Kohonen, 1997);

- The map is able to maintain the topology of a data space by using the neighbourhood function, thus creating clusters of units that correspond to similar data patterns;

- The uniqueness of SOM is in its ability to handle problems of large dimensionality;

- It processes information as patterns, rather than executing explicit instructions one at a time, exhibiting the properties of parallelism and pattern recognition;

- It has a generalisation capability, meaning that the network can recognise or characterise inputs it has never encountered before: a new input is assimilated with the map unit it is mapped to.

5.5.2. Disadvantages

The main disadvantages of the proposed algorithm, excluding those typical of every GA and SOM:

- The algorithm gains its full effectiveness with larger populations and neural maps, which takes a lot of computational power and time;

- There are too many combined factors between GA and SOM (fitness factor, mutation rate, selection rate, convergence rate, winner strength, etc.), which makes it very challenging to find the right equilibrium between them for each design problem;

- The problem of the expanded choice and of the evaluation of the aesthetics of the models.

5.5.3. Examples of Other SOM-GA Algorithms and their Applications

- Appendix 5: the first Self-Organising Map for a Multi-Objective Evolutionary Algorithm (SOM-MOEA) by Buche et al. in 2002

- Appendix 6: Brudaru's Cellular Genetic Algorithm With Communicating Grids and SOM

- Appendix 7: Kita's real-coded genetic algorithm (RCGA) and SOM

Page 68: Amiina Bakunowicz_MSc Thesis_NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM: EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

68  

5.6. Further Research and Algorithm Development

The proposed algorithm and the associated research have so far concentrated only on the SOM/GA part of the potential final code. The sections below show other areas of exploration that could turn the current algorithm into a very productive architectural design tool.

5.6.1. Parameterisation of the Body-Plan

The main advantage of using a machine in the design process is being able to explore various parameterisation strategies of the body-plan for shape-finding purposes.

While D’Arcy Thompson proposed the idea that natural form may be better understood through biological process, Christopher Alexander has taken the concept of morphogenesis further, formulating its principles and applying them to all kinds of natural phenomena. He proposed three natural processes of form-making: the first is “smooth unfolding” in nature, the second is “structure-preserving transformation”, and the third is the “formation of centers and fields of centers” (Alexander 2002). Alexander’s work could inspire a few different ways of evolving the form that could be programmed into the proposed algorithm.

Various other form-finding methods can be tested, such as grammars, expert systems, case-based methods, mechanical or electrical metaphors, recipe-like instructions, attempts to draw inspiration from divine sources, “rational” methods, “formalistic” methods, and of course simple trial-and-error methods.

For example, Aedas R&D developed a concept design for the Taiwan Tower in Gateway Park, Taichung City, Taiwan (Aedas Computational Design and Research group 2013). The tower’s shape derives from the form that emerges when a falling object hits the surface of water. The scheme’s plan takes the geometry of the new Gateway Park and forms a series of ripples that weave the tower into its surroundings, forming terraces, bridges and the main structure of the tower itself:

Figure 71. Taiwan Tower: Close-up of the parametric 3D model for the general form, structural ribbons and walkways


Figure 72. Taiwan Tower: Site plan with the scheme’s concentric guide arc

Figure 73. Taiwan Tower: Elevation of the parametric 3D model for the general form, structural ribbons and walkways


5.6.2. Swarms and SOM Pattern Recognition

Various examples of swarms in nature demonstrate how certain perceptive capabilities can emerge and evolve from the interaction of many simple local rules. These can be studied and adapted to become the algorithm’s pattern recognition system/s.

5.6.3. Discovering the Potential of Clusters

The Neural Maps' clusters and their properties can be explored further in order to optimise GA selection methods.

For example, a set of several neural maps can be created for each generation. Each map can classify the neurons according to different external, explicit or implicit criteria that were not necessarily programmed into the fitness function. Thus the designer gains a better understanding of the search space, structured according to different parameters.

The proposed algorithm can also be used in a slightly different context, where the clustering of the search space can be used more efficiently. A large pool of individuals is generated, and the ones chosen for neural training are those that possess a high value of only one part of the fitness criteria, or an extreme end of one of its parameters. For example, if the fitness criteria consist of four components, or all individuals have four parameters, then four individuals with sharply distinguished single characteristics are chosen as inputs for the self-organising map. After the map is trained, it presents clusters of neurons categorised by the presence of a single well-defined parameter or fitness component. The search space is better structured this way, and the process of selecting the parents becomes more systemised and visualised.

A data-mining process can be used that recognises the different clusters of the SOM and figures out suitable candidates to become parents. This way the SOM-expanded search space of the GA can be explored, managed and used in the evolutionary algorithm even better.

Another principle can be applied to the Optimised Random option: from the remaining individuals within each cluster, choose the neuron that is closest to the winner (or within a certain radius from it) in order to maximise the diversity of the parents and, consequently, of the offspring. The closer individuals are to the border between two clusters, the more similar their phenotypes are.

Another application of the clustering method was explored by Aedas R&D as part of Masdar MIST 340. They developed a sketching tool to test variations of one of the concepts for Aedas Dubai’s proposal. Each apartment type in the area schedule is loaded onto the site and clustered around a number of cores. Apartments without direct access to a core are raised onto the next floor. The clustering process is repeated until all apartments have access (Aedas Computational Design and Research group 2013).


Figure 74. Sketch of an apartment within a cluster

Figure 75. Perspective aerial view of the clusters


Figure 76. Model photograph

5.6.4. Co-Evolution Instead of Fitness Criteria

The benefit of using co-evolution instead of a fixed fitness criterion is that it prevents large portions of the population from becoming stuck in local optima. The first people to work with co-evolution were Daniel Hillis and Karl Sims.

Daniel Hillis co-evolved sorting networks (Hillis 1991). He also developed an algorithm that co-evolves artificial parasites. His code works with two independent gene pools whose evolution rates are naturally comparable, one representing "prey" and the other "parasites". Each population evolves according to the GA's selection/mutation/recombination sequence. Both gene pools evolve on the same grid, and their interaction is through their fitness functions. The prey are scored according to the failure of the parasites at the same grid location, while the parasites are scored based on how well they find flaws in the prey's fitness. The fitness functions of the prey and the parasites are complementary, in the sense that a success of one represents a failure of the other, and vice versa.

As another example, Karl Sims co-evolved virtual creatures (Sims 1994).

Figure 77. Creatures evolved for walking.


Figure 78. Creatures evolved for jumping.

Sims presents a system for evolving virtual creatures in which each creature is tested for its ability to perform a given task, such as the ability to swim in a simulated water environment. Those that are most successful survive, and their virtual genes, containing coded instructions for their growth, are copied, combined and mutated to make offspring for a new population. The new creatures are again tested, and some may be improvements on their parents. As this cycle of variation and selection continues, creatures with more and more successful behaviours can emerge.

5.6.5. Exploring other ways in which CA can generate Body-Plans

One of the many possible methods to test is to run the CA each time the next generation is created, so that the genes simply define the position of the CA seed.

5.6.6. Other visualisation techniques of the search space

The following visual references can be created in order to aid the analysis of the performance of the algorithm:

- Fitness landscape of both generations and their corresponding neural maps

- Graphs that show the evolution of the fitness and models’ parameters over the generations

- Graphs showing the areas of the search space that are explored by the algorithm;
- etc.

5.6.7. Adapting Genetic Operators to Evolve the Evolvability

- An elimination criterion can be used instead of, or together with, the defined fitness criteria. It can represent the ratio of similarity of the FC's components (the same as one of the examples of an implicit FC). The EC gets stricter with each generation, thus still encouraging evolution while at the same time avoiding the problem of local optima;

- Fitness constituents can be programmed to have their own weightings within the total fitness value, thus mathematically assigning their importance in the overall fitness assessment;

- An option for the user to choose which individuals to carry over to the next generation and which to mate;

- Survival ratio determines the percentage of the population that will survive each generation. If the initially generated population has fewer individuals with positive fitness than the number that should survive, another round of seed genotypes is generated to replace those with zero fitness (Sims 1999);


- Modifier genes can be added, whose allelic values control the genetic operators (e.g. loci controlling recombination or mutation rates and distributions) (Back, Hoffmeister and Schwefel 1991; Bergman and Feldman 1992; Shaffer and Morishima 1987);

- Meta-level GA can be running on parameters of the operators, in which each run of the primary GA produces a performance measure used as the fitness of the operator parameter (Grefenstette 1986);

- “Messy GAs”, a form of representation which allows more complex transformations from parent to offspring (Goldberg, Deb, and Korb 1990);

- Focusing the probabilities of operator events to those that have had a history of generating fitter individuals (Davis 1989);

- Mutation distribution adjustment to maintain certain properties of the fitness distribution, e.g. the “one-fifth” rule of Evolution Strategies (Back, Hoffmeister and Schwefel 1991);

- Dynamic parameter encoding (O’Neil and Shaefer 1989; Pitney, Smith, and Greenwood 1990; Schraudolph and Belew 1992; Shaefer 1987; Szarkowicz 1991).


Chapter 6. Conclusion

The proposed algorithm explores and tries to solve the problem of multi-criteria evaluation in an architectural parameterisation and form-finding design context, with the help of code that combines the principles of a Genetic Algorithm and neural Self-Organising Maps. As the result of any multi-criteria design solution is a subjective matter, the algorithm provides the overall values of multiple options that can later be evaluated by the designer (and/or client), helping them to choose among the presented options.

There is currently a number of well-performing multi-criteria evaluation software packages available within the architectural industry. However, they mostly deal with the functionality of a building, or with the geometry that arises from that functionality. The advantage of the proposed algorithm is that it has the potential to explore, evaluate and evolve both functionality and form in its aesthetic context. To summarise, the positive and negative characteristics of the proposed algorithm are:

Advantages of GA-SOM:
- SOM provides the designer with visual feedback on the possible solutions;
- The explored search space is mapped not only according to the fitness criteria, but also according to the results of the evolution of the neurons’ vectors;
- SOM re-structures and expands the possibility space;
- Better performance at generating unexpected solutions;
- By classifying the data, it introduces the clustering property to the search space of the GA;
- The map maintains the topology of a data space by using the neighbourhood function;
- Ability to evaluate the solutions in non-procedural ways;
- Fewer generations need to be evolved to achieve satisfying results;
- Works with models of large dimensionality;
- Minimises many drawbacks of the traditional GAs;
- Exhibits the properties of parallelism and pattern recognition;
- Can recognise or characterise inputs it has never encountered before.

Disadvantages of GA-SOM:
- The algorithm gains its full effectiveness with larger populations and neural maps, which takes a lot of computational power and time;
- There are too many combined factors between GA and SOM (fitness factor, mutation rate, selection rate, convergence rate, winner strength, etc.), which makes it very challenging to find the right equilibrium between them for each design problem;
- The problem of the expanded choice and of the evaluation of the aesthetics of the models.

The proposed GA-SOM proved to be an efficient tool for the search for architectural form that emerges from the evolution of the spatial arrangement of functional spaces, or from any other performance optimisation process. The unique characteristic of this algorithm is that choosing the fittest ceases to be essential: a “fit enough” model that “looks” acceptable to the eye of the designer becomes the solution. Therefore this computational design technique is especially useful when the priorities are the aesthetic qualities of the model, or when the architect feels it essential to personally carry out the assessment of the offered solutions and needs to put his/her hands directly into the design process.


Chapter 7. References

Aedas Computational Design and Research group (2013). Available at: http://aedasresearch.com [Last accessed 01.09.2013].

Alexander, C. (2002) The Nature of Order - The Process of Creating Life. Berkeley: Center For Environmental Studies, vol. 2.

Akin, O. (1978) How Do Architects Design? In Artificial Intelligence and Pattern Recognition in Computer-Aided Design, ed. J. C. Latombe, New York: IFIP, North Holland.

Arvin, S. A.; House, D. H. (2002) Modelling Architectural Design Objectives in Physically Based Space Planning. Automation in Construction 11, no. 2: 213-225.

Back, T.; Hoffmeister, F. and Schwefel, H.-P. (1991) A survey of evolution strategies. In R. K. Belew and L. B. Booker, editors, Proceedings of the Fourth International Conference on Genetic Algorithms, pages 2–9, San Mateo, CA. Morgan Kaufmann.

Bentley, P. J. (Ed.) (1999) Evolutionary Design by Computers. Academic Press Ltd., London.

Bentley, P. J.; Kumar, S. (1999) Three Ways to Grow Designs: A Comparison of Embryogenies for an Evolutionary Design Problem. In Genetic and Evolutionary Computation Conference (GECCO) Orlando, Florida, USA.

Bentley, P. J. (1999) Evolving Fuzzy Detectives: An Investigation into the Evolution of Fuzzy Rules. Chapter in Suzuki, Roy, Ovasks, Furuhashi, and Dote (Eds), Soft Computing in Industrial Applications, Springer Verlag London Ltd.

Bentley, P. J.; Kumar, S. (2002) Computational Embryology: Past, Present and Future. Invited chapter in Ghosh and Tsutsui (Eds) Theory and Application of Evolutionary Computation: Recent Trends. Springer Verlag (UK).

Brudaru, O. (2011) Cellular Genetic Algorithm With Communicating Grids For A Delivery Problem. Symbolic and Numeric Algorithms for Scientific Computing (SYNASC), 2011 13th International Symposium on.

Coates, P. (1997) Using Genetic Programming and L-Systems to explore 3D design worlds. CAAD Futures 97, R. Junge (ed), Kluwer Academic Publishers, Munich.

Davis, L. (1989) Adapting operator probabilities in genetic algorithms. In J. D. Schaffer, editor, Proceedings of the Third International Conference on Genetic Algorithms, pages 61–69, San Mateo, CA. Morgan Kaufmann.

Dawkins, R. (1987). The Evolution of Evolvability. Proceedings of Artificial Life VI. Langton (Ed.) USA.

Deb, K.; Pratap, A.; Agarwal, S.; Meyarivan, T. (2002) A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, vol. 6, pp. 182–197.

DeGaris, H. (1994) An Artificial Brain. New Generation Computing vol. 12:2, Springer Verlag.

DeGaris, H. (1999) Artificial Embryology and Cellular Differentiation. Ch. 12 in Bentley, P. J. (Ed.) Evolutionary Design by Computers. Morgan Kaufman Pub.

DeLanda, M. (2011) Philosophy and Simulation: The Emergence of Synthetic Reason. Continuum Publishing Corporation.

Derix, C. (2010) Mediating Spatial Phenomena through Computational Heuristics. In ACADIA 10: LIFE in:formation, On Responsive Information and Variations in Architecture, Proceedings of the 30th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA).

Drobics, M.; Bodenhofer, U.; Winiwarter, W. (2001) Data Mining using synergies between self-organising maps and inductive learning of fuzzy rules. In Joint 9th IFSA World Congress and 20th NAFIPS International Conference.

Felfernig, A.; Friedrich, G.; Schmidt-Thieme, L. (2007) Introduction: Recommender Systems. IEEE Intelligent Systems 22(3): 18-21.

Fogel, L. (1962) Autonomous automata. Industrial Research, vol. 4, pp. 14-19.

Fogel, D. (1991) System identification through simulated evolution. A machine learning approach. USA: Ginn Press.

Gero, J.; Rosenman, M. (1999). Evolving Designs by Generating Useful Complex Gene Structures. In Bentley, P. J. (ed.) (1999). Evolutionary Design by Computers. Academic Press Ltd., London.

Gersho, A.; Gray, R.M. (1991). Vector quantization and signal compression. Boston, Dordrecht, London: Kluwer.

Goldberg, D.; Deb, K.; Korb, B. (1990) Messy genetic algorithms revisited: studies in mixed size and scale. Complex Systems 4(4): 415–444.

Grefenstette, J. J. (1986) Optimization of control parameters for genetic algorithms. IEEE Transactions on Systems, Man and Cybernetics SMC-16(1): 122–128.

Habraken, N. J. (1985) The Appearance of the Form. Cambridge, Mass.: Atwater Press.

Hakimi-Asiabar, M. (2009) Multi-objective genetic local search algorithm using Kohonen's neural map. Computers and Industrial Engineering 56(4), May 2009, pp. 1566-1576. Pergamon Press, Inc., Tarrytown, NY, USA.

Hillis, D. (1991) Co-evolving parasites improve simulated evolution as an optimization procedure. Emergent computation, pp. 228-234. MIT Press Cambridge, MA, USA.

Holland, J. (1975) Adaptation in natural and artificial systems. Ann Arbor: The University of Michigan Press.

Kalay, Y. (2004) Architecture's New Media: Principles, Theories, and Methods of Computer-Aided Design. The MIT Press.

Kauffman, S. (1995). At Home in the Universe: The Search For The Laws of Self-Organisation and Complexity. New York: Oxford University Press.

Karpenko, A. (2011) Meta-optimisation based on self-organising map and genetic algorithm. Optical Memory and Neural Networks 20(4), December 2011, pp. 279-283. Springer-Verlag New York, Inc., Secaucus, NJ, USA.

Kita, E. (2010) Investigation of self-organising map for genetic algorithm. Advances in Engineering Software 41 (2010), pp. 148-153.

Kohonen, T. (1997) Self-Organising Maps. Information Sciences (2nd ed.). Springer, Berlin.

Koza, J. (1992) Genetic programming. On the programming of computers by means of natural selection. The MIT Press.

Koza, J. R.; Bennett, F. H.; Andre, D.; Keane, M. A. (1999) Genetic Programming III. San Francisco, CA: Morgan Kaufmann.

Maes, P. (1994) Agents that Reduce Work and Information Overload. Communications of the ACM 37(7): 30-40

March, L.; Steadman, P. (1974) The Geometry of Environment. Cambridge: MIT Press, pp. 263-284.

Minsky, M. (1968) Semantic Information Processing. Cambridge: MIT Press.

Mitchell, M.; Hraber, P. T.; Crutchfield, J. P. (1993) Revisiting the edge of chaos: Evolving cellular automata to perform computations. Complex Systems, 7:89-130.

Mitchell, M. (1998) Introduction to Genetic Algorithms (Complex Adaptive Systems). MIT Press, new edition, p. 166.

Michalewicz, Z. (1996) Genetic Algorithms + Data Structures = Evolution Programs. Springer Verlag.

Morooka, K.; Nagahashi, H. (2012) Self-organizing Deformable Model: a Method for Projecting a 3D Object Mesh Model onto a Target Surface. In Computer Graphics, Nobuhiko Mukai (Ed.).

O'Neil, E.; Shaefer, C. (1989) The ARGOT strategy III: the BBN Butterfly multiprocessor. In J. Martin and S. Lundstrom, editors, Proceedings of Supercomputing, Vol. II: Science and Applications, pages 214-227. IEEE Computer Society Press.

Piasecki, M.; Hanna, S. (2010) A Redefinition of the Paradox of Choice. In J.S. Gero (ed.): Design Computing and Cognition '10, pp. 347-366. Springer Science + Business Media B.V.

Piller, F.; Schubert, P.; Koch, M.; Moslein, K. (2005) Overcoming Mass Confusion: Collaborative Customer Co-Design in Online Communities. Journal of Computer-Mediated Communication 10(4): article 8.

Pine, J. B. (1993) Mass Customization - The New Frontier in Business Competition. Harvard Business School Press, Cambridge, Massachusetts.

Rauber, A. (1999) LabelSOM: On the labeling of self-organising maps. In Proceedings of the International Joint Conference on Neural Networks, Washington, DC.

Rittel, H. W. J.; Webber, M. M. (1973) Dilemmas in a General Theory of Planning. Policy Sciences 4: 155-169.

Salvador, F.; de Holan, P. M.; Piller, F. (2009) Cracking the Code of Mass Customization. MIT Sloan Management Review 50(3): 71-78.

Schaffer, J. D. (1984) Some experiments in machine learning using vector evaluated genetic algorithms. Ph.D. thesis, Nashville, TN: Vanderbilt University.

Schmidt-Thieme, L. (2005) Compound Classification Models for Recommender Systems. Proceedings of the Fifth IEEE International Conference on Data Mining, pp.378-385.

Schraudolph, N.; Belew, R. (1992) Dynamic parameter encoding for genetic algorithms. Machine Learning 9(1): 9–21.

Schwartz, B. (2004) The Paradox of Choice. HarperCollins Publishers, New York.

Schwartz, A.; Berry, D.M.; Shaviv, E. (1994) Representing and Solving the Automated Building Design Problem. Computer-Aided Design 26(9): 689-698.

Schwefel, H.-P. (1975) Evolutionsstrategie und numerische Optimierung. Ph.D. thesis.

Schwefel, H.-P. (1981) Numerical optimization of computer models. Wiley, Chichester.

Schwefel, H.-P. (1995) Evolution and Optimum Seeking. Sixth-Generation Computer Technology Series. John Wiley and Sons.

Shaefer, C. G. (1987) The ARGOT strategy: adaptive representation genetic optimizer technique. In J. J. Grefenstette, editor, Genetic Algorithms and their Applications: Proceedings of the Second International Conference on Genetic Algorithms, pages 50–58, Hillsdale, NJ. Lawrence Erlbaum Associates.

Sims, K. (1994) Evolving Virtual Creatures. Computer Graphics, Siggraph 1994 Proceedings, July 1994, pp.15-22.

Sims, K. (1991) Primordial Dance. Available at: http://www.karlsims.com/primordial-dance.html [Last accessed 01.09.2013].

Sipper, M. (1997) A Phylogenetic, Ontogenetic, and Epigenetic View of Bio-Inspired Hardware Systems. IEEE Transactions on Evolutionary Computation 1(1), February 1997.

Spuybroek, L. (2012) The Sympathy of Things. NAi Publishers.

Srinivas, N.; Deb, K. (1994). Multiobjective function optimization using nondominated sorting genetic algorithms. Evolutionary Computation journal. 2(3), pp. 221-248.

Szarkowicz, D. (1991) A multi-stage adaptive-coding genetic algorithm for design applications. In D.Pace, editor, Proceedings of the 1991 Summer Computer Simulation Conference, pages 138–144, San Diego, CA.

Taura, T.; Nagasaka, I. (1999) Adaptive-Growth-Type 3D Representation for Configuration Design. In Bentley, P. J. (guest ed.) Special Issue on Evolutionary Design, AIEDAM journal v13:3.

Thompson, D. (1961) On Growth and Form. Edited by John Tyler Bonner. Cambridge: Cambridge University Press. (First published in 1917.)

Van Hulle, M. (2009) Self-Organizing Maps. Handbook of Natural Computing. PSPC, Springer.

Witkin, A.; Baraff, D. (1997) Physically Based Modelling: Principles and Practice. SIGGRAPH ‘97 Course Notes 19.

Yamakawa, T.; Horio, K.; Hiratsuka, T. (2002) Advanced self-organising maps using binary weight vector and its digital hardware design. In Proceedings of the 9th international conference on neural information processing, vol. 3, pp. 1330-1335.

Yu, T.; Bentley, P. (1998) Methods to Evolve Legal Phenotypes. In Proceedings of the Fifth Int. Conf. on Parallel Problem Solving From Nature. Amsterdam, Sept 27-30, 1998, pp. 280-282.

Zitzler, E.; Thiele, L. (1999) Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach. IEEE Transactions on Evolutionary Computation 3(4), pp. 257-271.

Chapter 8. Further Reading

Genetic Algorithm and Self-Organising Maps:

Hinton, G.E.; Nowlan, S.J. (1987) How Learning can Guide Evolution. Complex Systems, vol.1, pages 495-502.

Kubota, N.; Fukuda, T.; Fumihito, A.; Shimojima, K. (1994) Genetic algorithm with age structure and its application to self-organising manufacturing system. In IEEE Symposium on Emerging Technologies and Factory Automation (ETFA '94), Tokyo, Nov. 1994, pp. 472-477. Piscataway, NJ: IEEE.

Kubota, R.; Horio, K.; Yamakawa, T. (2006) Genetic algorithm with modified reproduction strategy based on self-organising map and usable schema. International Congress Series, Vol. 1291, pp. 169-172. Elsevier Science B.V., Amsterdam.

Maya, S.; Reynoso, R.; Torres, C.; Arias-Estrada, M. (2000) Compact Spiking Neural Network Implementation in FPGA. In R.W. Hartenstein, H. Grünbacher (Eds.), FPL 2000, pages 270-276, Lecture Notes in Computer Science, 1896, Springer-Verlag, Berlin Heidelberg.

Sasaki, T.; Tokoro, M. (2000) Comparison between Lamarckian and Darwinian Evolution on a Model Using Neural Networks and Genetic Algorithms. Knowledge and Information Systems 2(2), pp. 201-222.

Tanaka, M.; Watanabe, H.; Furukawa, Y.; Tanino, T. (1995) GA-based decision support system for multi-criteria optimisation. In IEEE International Conference on Systems, Man and Cybernetics: Intelligent Systems for the 21st Century, Vol. 2, pp. 1556-1561.

Yamashiro, D.; Yoshikawa, T.; Furuhashi, T. (2006) Efficiency of search performance through visualising search process. In IEEE International Conference on Systems, Man and Cybernetics (SMC '06), pp. 1114-1119.

Yao, X. (1999) Evolving Artificial Neural Networks. Proceedings of the IEEE 87(9), pp. 1423-1447.

Genetic Algorithm and Co-Evolution:

Sakanashi, H.; Kakazu, Y. (1994) Co-evolving genetic algorithm with filtered evaluation function. In IEEE Symposium on Emerging Technologies and Factory Automation (ETFA 1994).

Ying-Hua, C. (2010) Adopting co-evolution and constraint-satisfaction concept on genetic algorithms to solve supply chain network design problems. Expert Systems with Applications, Vol. 37, pp. 6919-6930. Elsevier Science B.V., Amsterdam.

Dorst, K.; Cross, N. (2001) Creativity in the design process: co-evolution of problem-solution. Design Studies 22(5), September 2001, pp. 425-437. Elsevier.

The Paradox of Choice:

Simon, H. (1955) A Behavioural Model of Rational Choice. The Quarterly Journal of Economics 69(1), Feb. 1955, pp. 99-118.

Jannach, D.; Zanker, M.; Felfernig, A.; Friedrich, G. (2010) Recommender Systems: An Introduction. Cambridge University Press.

Felfernig, A.; Friedrich, G.; Schmidt-Thieme, L. (2007) Guest Editors' Introduction: Recommender Systems. IEEE Intelligent Systems 22(3).

Self-Organising Maps, Swarms and Patterns:

Hasan, S.; Shamsuddin, S. M. (2011) Multistrategy self-organizing map learning for classification problems. Computational Intelligence and Neuroscience, Volume 2011, Article No. 1. Hindawi Publishing Corp., New York, NY, United States.

Hebb, D.O. (1949) The Organization of Behavior. Wiley, New York.

Hopfield, J.J. (1982) Neural Networks and Physical Systems with Emergent Collective Computational Abilities. Proceedings of the National Academy of Sciences 79(8), pp. 2554-2558, USA.

Ramos, V.; Almeida, F. (2000) Artificial Ant Colonies in Digital Image Habitats - A Mass Behaviour Effect Study on Pattern Recognition. In Dorigo, M., Middendorf, M., Stuzle, T. (Eds.): From Ant Colonies to Artificial Ants - 2nd Int. Workshop on Ant Algorithms.

Ramos, V. (2002) Stigmergic Self-Organised Data-Mining. In Proceedings of AEB '02 - 1st Spanish Conference on Evolutionary and Bio-Inspired Algorithms, Mérida University.

Ramos, V.; Merelo, J. (2002) Self-Organized Stigmergic Document Maps: Environment as a Mechanism for Context Learning. In Proceedings of the 1st Spanish Conference on Evolutionary and Bio-Inspired Algorithms.

Greenfield, G. (2012) Stigmergy Prints from Patterns of Circles. Proceedings of Bridges 2012: Mathematics, Music, Art, Architecture, Culture, pp. 291-298. Robert Bosch, Douglas McKenna and Reza Sarhangi (Eds.), Tessellations Publishing, Phoenix, Arizona, USA.

Dayhoff, J. (1991) Pattern recognition with a pulsed neural network. In Proceedings of the Conference on Analysis of Neural Network Applications, pp. 146-159. ACM Press, New York, NY, USA.

Chapter 9. Appendixes

Appendix 1. Traditional Genetic Algorithm: a Brief Description

The Genetic Algorithm (GA) has the essence of evolution in its domain: a Darwinian process of generating a vast number of random "solutions", judged by the fitness between form, function and context. The solutions that are deemed "unfit" are discarded; the survivors are copied and modified, generating a pool of newly created individuals ready for the next leap of the same evolutionary process of mating and mutation, until one or more solutions exhibit a greater degree of "fit". The fit is normally not absolute and can be controlled by modifying the number of individuals and the number of generations over which they adapt.

Synthetic genetic algorithms closely resemble the natural model of evolution. Artificial chromosomes are represented as string structures made of "genes":

Figure 79. A Chromosome in Nature:

Figure 80. An artificial chromosome:

The genome is encoded in DNA and contains the entire hereditary information of an organism. It is represented by the GENOtype and the PHENOtype: the genotype is the particular set of genes contained in a genome, while the phenotype displays the observable characteristics or traits.

The first step is to create a number of random chromosomes, or individuals, that share a constant number of genes and gene length. They compose the first generation. Each chromosome in the example below has four genes with a gene length of five:

Figure 81. A population of chromosomes or individuals

Each gene is then decoded from a binary string to a whole number or "value" that represents an evolving parameter used to generate the phenotype, a model in 2D or 3D space:

The decoding process for a gene length of 5 gives a value range between 0 and 62:

Figure 82. Mathematics behind the decoding process

Each “value” is assigned to control a specified parameter of a model that is called a “body-plan”. As chromosomes change over the generations, the body-plan evolves.
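The decoding step can be sketched in Python as follows (a minimal sketch matching the weighting used in the thesis code in Appendix 9, where every 1-bit adds 2^(geneLength - i); the sample bit-strings are arbitrary):

geneLength = 5

def decode_gene(bits):
    # bits: a list of 0/1 values of length geneLength
    value = 0
    for i, bit in enumerate(bits):
        if bit == 1:
            value += 2 ** (geneLength - i)   # bit weights: 32, 16, 8, 4, 2
    return value

print(decode_gene([1, 1, 1, 1, 1]))   # 62, the top of the value range
print(decode_gene([0, 1, 0, 0, 1]))   # 16 + 2 = 18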

After that, pairs of parent chromosomes are selected for crossover according to their fitness. Below is an example of one of the methods, called One-Point Crossover, where a single crossover point is selected on both parents' organism strings. All data beyond that point in either organism string is swapped between the two parent organisms. The resulting organisms are the children:

Figure 83. One-Point Crossover method

The offspring replace their parents and, together with the unchanged individuals, they form a new population that will go through the same process again and again for a specified number of generations, or until an individual with the desired characteristics has evolved.
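A minimal sketch of one-point crossover on two bit-string parents (the example chromosomes are arbitrary):

import random

def one_point_crossover(parentA, parentB):
    # pick a single crossover point and swap all data beyond it
    cut = random.randrange(1, len(parentA))
    childA = parentA[:cut] + parentB[cut:]
    childB = parentB[:cut] + parentA[cut:]
    return childA, childB

a = [0, 0, 0, 0, 0, 0, 0, 0]
b = [1, 1, 1, 1, 1, 1, 1, 1]
c1, c2 = one_point_crossover(a, b)   # e.g. [0,0,0,1,1,1,1,1] and [1,1,1,0,0,0,0,0]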

The mutation operator introduces additional changes into the resulting genotype by randomly changing one or more of the genes' components:

Figure 84. The process of mutation

Through mutation the solution may change entirely from the previous one; hence the GA can reach better solutions by using mutation. Mutation occurs during evolution according to a user-definable mutation probability. This probability should be set low: if it is set too high, the search will turn into a primitive random search.
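A minimal sketch of bit-flip mutation with a user-definable probability (the per-bit rate of 0.02 is an assumed example; the thesis algorithm in Appendix 9 instead flips a single random bit per selected individual):

import random

MUTATION_PROBABILITY = 0.02   # assumed example value; should be kept low

def mutate(chromosome):
    # flip each bit independently with a small probability
    return [1 - gene if random.random() < MUTATION_PROBABILITY else gene
            for gene in chromosome]

child = mutate([0, 1, 0, 0, 1, 1, 0, 1])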

There are different types of crossover and mutation which are not examined as part of the current research. The two other optional selection methods are provided in the proposed algorithm and are described later in this paper.

Terminating conditions of GA often include:

- Fixed number of generations reached
- Budgeting: allocated computation time/money used up
- An individual is found that satisfies minimum criteria
- The highest-ranking individual's fitness is reaching or has reached a plateau, such that successive iterations no longer produce better results
- Manual inspection
- Combinations of the above

Having proved to be very efficient, traditional GAs nevertheless have a number of drawbacks (Hakimi-Asiabar 2009):

- the intensification process is not accurate;
- genetic drift;
- production of a vast amount of data during an optimisation process without actually using much of it;
- high convergence rate;
- limited diversity;
- the problem of "local optima".

Genetic Algorithm: a Bit of History

There are four established computational models of Evolutionary Algorithms (EAs):

- Genetic Algorithms (GA) (Holland 1975; Michalewicz 1996)
- Evolution Strategies (ES) (Schwefel 1975, 1981, 1995)
- Genetic Programming (GP) (Fogel 1962; Koza 1992)
- Evolutionary Programming (EP) (Fogel 1991)

Initially they all concentrated on single-objective applications (SOGAs), where the fitness criterion is based on a single objective, or on multiple objectives combined into one, identifying "good" regions of the solution space. However, they are not very efficient at separating the locally optimal solutions within that "good" space. Over the last two decades, EAs with multi-objective optimisation (also called Pareto optimisation or vector optimisation) have emerged (MOGAs), where more than one objective needs to be optimised simultaneously, i.e. by minimising one objective while maximising another, thus creating non-dominated or Pareto-optimal solutions. MOGAs normally generate the entire set of solutions in one run of the algorithm, compared to only one solution for SOGAs (Hakimi-Asiabar 2009). Early MOGAs, listed below, showed poor performance because of their convergence and limited diversity (a code sketch of the underlying non-dominance test follows the lists):

- Vector-Evaluated Genetic Algorithm (VEGA) (Schaffer 1984)
- Nondominated Sorted Genetic Algorithm (NSGA) (Srinivas & Deb 1994)

More recent MOGAs demonstrate better results but still retain deficiencies similar to those of their predecessors. Below are examples of the most efficient MOGAs:

- Strength Pareto Evolutionary Algorithm (SPEA) (Zitzler & Thiele 1999)
- NSGA-II (Deb, Pratap, Agarwal & Meyarivan 2002)
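As a minimal illustration of the non-dominance test behind Pareto optimisation (a sketch assuming all objectives are to be minimised; the sample objective values are hypothetical):

def dominates(a, b):
    # a dominates b if a is no worse in every objective and strictly better in at least one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

solutions = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0)]
pareto_front = [s for s in solutions
                if not any(dominates(t, s) for t in solutions if t is not s)]
# (3.0, 4.0) is dominated by (2.0, 3.0); the front keeps (1.0, 5.0) and (2.0, 3.0)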

Appendix 2. Comparing Types of the Evolutionary Embryogenies

Out of the four main types of evolutionary algorithms, only the GA evolves computational models motivated by the genotype-phenotype mappings found in biological systems. In other words, it performs artificial development, also known as artificial embryogeny.

a. Implicit embryogeny.

The growth process is implicitly specified by a set of rules or instructions, similar to a 'recipe', that governs the growth of a shape.

Examples of research: Hugo de Garis, who for some years has been successfully evolving CA-based implicit embryogenies to grow artificial neural nets on an immense scale (de Garis, 1994); Taura and Nagasaka (1999), who describe the use of an implicit embryogeny to define patterns of cells on the surface of a sphere, which are then used via a second, external embryogeny to define the morphology of shapes. Alternatively, Mitchell evolves CA-based systems designed to perform computations (Mitchell et al, 1993), which can be regarded as types of implicit embryogeny.

b. Explicit embryogeny

It specifies each step of the growth process in the form of explicit instructions. In computer science, an explicit embryogeny can be viewed as a tree containing a single growth instruction at each node. Typically, the genotype and the embryogeny are combined and both are allowed to evolve simultaneously.

Examples of researches: Koza and his team have made an extensive study of the evolution of analogue electronic circuits using Cellular Encoding as their explicit embryogeny (Koza et al, 1999). Coates (1997) evolves architectural forms using a Lindenmayer system as his embryogeny. Gero and Rosenman (1999) use a number of different types of grammar-based explicit embryogenies for their work on evolving architecture.

c. External embryogeny

It is “hand-designed” and is defined globally and externally to genotypes. This is the most straightforward type and there are multiple applications of it.

An example is the experiment that compares the three types of embryogenies by evolving tessellating tiles (Bentley, Kumar 2002):

Figure 85. No Embryogeny

Figure 86. External Embryogeny

Figure 87. Explicit Embryogeny

Figure 88. Implicit Embryogeny

Figure 89. Results: the tessellating tiles evolved after running the GA.

Examples of research carried out in the area of artificial embryogenies:

- Hugo de Garis, who for some years has been successfully evolving CA-based implicit embryogenies to grow artificial neural nets on an immense scale (de Garis, 1994);

- Taura and Nagasaka (1999) describe the use of an implicit embryogeny to define patterns of cells on the surface of a sphere, which are then used via a second, external embryogeny to define the morphology of shapes;

- Mitchell evolves CA-based systems designed to perform computations (Mitchell et al, 1993), which can be regarded as types of implicit embryogeny.

- Coates (1997) uses Lindenmayer systems to evolve architectural form with an explicit embryogeny

- Koza et al (1999) use an explicit embryogeny in the form of cellular encoding for the evolution of analogue circuits

- Sims (1999) uses an explicit embryogeny with the idea of directed graphs to specify the nervous systems (neural networks), and morphologies of virtual creatures

- Gero and Rosenman (1999) use a number of different types of grammar-based explicit embryogenies for their work on evolving architecture.

- Evolutionary Art systems often use embryogenies defined by fixed, non-evolvable structures which specify how phenotypes should be constructed using the genes in the genotypes (Bentley, 1999)

- Richard Dawkins' Blind Watchmaker program (Dawkins, 1987) employs a simple external embryogeny to create biomorphs, using the 'eye of the beholder' to provide a measure of fitness, and mutation to vary the evolving shapes.

Appendix 3. Example of a SOM Algorithm Application in 3D Modelling, and Other SOM Applications

Artificial Neural Networks (ANNs) were first introduced in the 1950s, but only in the mid-1980s did their algorithms become sophisticated enough for general applications. A self-organising map (SOM) (Kohonen 1982a, 1982b), or self-organising feature map (if it is programmed to detect features inherent to the problem), produces a low-dimensional (typically two-dimensional) structured representation of the high-dimensional input space of the training samples, called a map. The data are represented by prototypes, called weight vectors. The SOM is trained using unsupervised learning, which means that no human intervention is needed during the learning and that little needs to be known about the characteristics of the input data.

Example of a SOM Algorithm Application in 3D Modelling

The input data used to train the map are six cylinders. They are parameterised to have different heights and different radii of the ellipses that are extruded to make the cylinders:

Figure 90. SOM inputs

The next step is to set up a grid of randomly parameterised cylinders. This grid is the future SOM.

The SOM algorithm consists of two stages: the competitive stage and the cooperative stage. During the competitive stage the algorithm searches for the best matches to the inputs among the neurons of the map. The "winners" (or best input matches) are highlighted in blue. There are six input models behind the map and six input "lookalikes" on the grid:

Figure 91. SOM inputs and their matches on the random map, highlighted in blue.

During the cooperative stage the weights or parameters of the winner are adapted, as well as those of its immediate lattice neighbours, using a neighbourhood function based on the minimum-Euclidean-distance version of the SOM algorithm (a dot-product version also exists; see Kohonen, 1995). This ensures that during the formation of topographically ordered maps the neuron weights are not modified independently of each other, but as topologically related models:

Figure 92. The map at the start of the learning process.
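The two stages can be sketched numerically as follows (a minimal stand-alone sketch on a small 2D lattice; the vector length, lattice size, learning rate and neighbourhood radius are assumed values, not those of the thesis algorithm):

import math, random

FNUM, GRID = 4, 5   # length of the weight vectors and lattice size (assumed)
weights = {(u, v): [random.random() for _ in range(FNUM)]
           for u in range(GRID) for v in range(GRID)}

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train_step(sample, learn=0.2, neigh=1.5):
    # competitive stage: the winner (best matching unit) minimises the Euclidean
    # distance between its weight vector and the input
    winner = min(weights, key=lambda k: euclid(weights[k], sample))
    # cooperative stage: the winner and its lattice neighbours move towards the
    # input, with a strength that decays with lattice distance from the winner
    for (u, v), w in weights.items():
        d = math.sqrt((u - winner[0]) ** 2 + (v - winner[1]) ** 2)
        if d <= neigh:
            rate = learn if d == 0 else learn / (1 + d)
            for f in range(FNUM):
                w[f] += rate * (sample[f] - w[f])

for _ in range(200):
    train_step([random.random() for _ in range(FNUM)])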

Figure 93. The map half-way through the training process: the similarities between the inputs and the winners are more noticeable.

Figure 94. The final trained map.

The final winners are never identical to the inputs, but very similar. This similarity is governed by the convergence rate that is normally programmed into the algorithm. The converged neuron weights yield a model of the training set in three ways:

- Vector quantization: The training samples are modeled in such a manner that the average discrepancy between the data points and the neuron weights is minimized. In other words, the neuron weight vectors should “optimally” quantize the input space (Gersho and Gray, 1991);

- Regression: no prior knowledge is assumed about the nature or shape of the function to be regressed; the converged topographic map is intended to capture only the principal dimensions or manifolds of the input space;

- Clustering: the partitioning of the data set into subsets of "similar" data, without using prior knowledge about these subsets (Van Hulle 2009).

Self-Organising Maps: Applications

The SOM algorithm has led to literally thousands of applications in areas ranging from automatic speech recognition, condition monitoring of plants and processes, cloud classification, and microarray data analysis, to document and image organisation and retrieval, and SOMs are widely used for exploratory data analysis and visualisation. Kohonen maps can also be useful for forecasting tasks, the study of temporal evolutions, and the explanation of complex prediction models.

Figure 95. In 3D modelling the SOM algorithm is used to project a 3D object mesh model onto a target surface (Morooka et al. 2012)

Figure 96. Shape representation and classification to detect differences in the shape of anatomical structures (Jung-ha An et al. 2005)

Other types of SOM algorithms:

There exists a wealth of other algorithms, for example:

- the Dynamic Cell Structures (DCS) (Bruske and Sommer, 1995), which is similar to the GNG;
- the Growing Self-Organising Map (GSOM, also called Hypercubical SOM) (Bauer and Villmann, 1997), which has some similarities to the GG but adapts the lattice dimensionality;
- Incremental Grid Growing (IGG), which introduces new neurons at the lattice border and adds/removes connections based on the similarities of the connected neurons' weight vectors (Blackmore and Miikkulainen, 1993);
- another algorithm also called the Growing Self-Organizing Map (GSOM) (Alahakoon et al., 2000), which likewise adds new neurons at the lattice border, similar to IGG, but does not delete neurons, and which contains a spread factor to let the user control the spread of the lattice.

Appendix 4. Piasecki and Hanna's Wundt Curve Experiment

The experiment was carried out in order to show that when a GA is used to rationalise the "meaningfulness" of choice, the Wundt-curve relation between users' satisfaction and the scope of choice no longer applies, whilst it retains its shape in the case of parametric configurators. "In the GA configurators, attribute levels are not set explicitly, but rather through manual selection of meaningful solutions and breeding of subsequent generations of products." Parametric configurators, on the contrary, required the users to explicitly define each attribute level, and thus they called for a definition rather than a recognition of meaningful solutions. The users were able to set the attribute levels through a slider-bar menu and were presented with a single visualisation of the current instance of the product. Below are the Wundt curves for both GA and parametric configurators, which illustrate the conclusion that "because artificial selection in GAs enables the users to navigate through a solution space by recognizing rather than defining meaningful options, it is likely to be a possible solution to the redefined paradox of choice":

Figure 97. The Wundt curves for both GA and Parametric configurators

Appendix 5. First Ever SOM for Multi-Objective Evolutionary Algorithm

Self-Organising Maps and the Genetic Algorithm were first joined in one algorithm by Buche et al. in 2002. It was called Self-Organising Maps for Multi-Objective Evolutionary Algorithm (SOM-MOEA). The main distinction of this algorithm is that the parents are chosen irrespective of their fitness value, i.e. the crossover operator chooses the parents randomly. As a consequence, the average fitness of the population increases very slowly.

SOM-MOEA algorithm (a code sketch follows the steps below):

1. Calculate fitness in a population
2. Train the SOM with the information of the individuals
3. Perform SOM neighbourhood evolution as follows:
   a. select a random individual from the population
   b. search for the best match unit for this particular individual on the SOM map
   c. select the closest unit to the best match unit identified in the previous step
   d. select a random individual from the individuals related to the best match unit
   e. crossover the selections from steps c and d
4. Perform SOM mutation, so that normal random numbers are added to the design variables of the individuals (Kita, 2010)
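A minimal sketch of step 3 (the population, unit assignment and crossover below are hypothetical stand-ins; in the real algorithm the unit assignment comes from the trained SOM):

import random

population = [[random.randint(0, 1) for _ in range(8)] for _ in range(20)]
# stand-in "map": individuals are assigned to units by their bit count
units = {u: [ind for ind in population if sum(ind) == u] for u in range(9)}

def best_match_unit(ind):
    return sum(ind)   # placeholder BMU lookup

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def som_neighbourhood_evolution():
    a = random.choice(population)                   # step a: random individual, fitness ignored
    bmu = best_match_unit(a)                        # step b: its best match unit
    closest = max(0, bmu - 1)                       # step c: the unit closest to the BMU
    mate = random.choice(units[bmu] or [a])         # step d: random individual related to the BMU
    partner = random.choice(units[closest] or [a])  # an individual related to the closest unit
    return crossover(partner, mate)                 # step e: crossover the selections from c and d

child = som_neighbourhood_evolution()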

Appendix 6. Brudaru’s Cellular Genetic Algorithm With Communicating Grids and SOM

Quick introduction to Cellular GA (CGA)

It is a kind of evolutionary algorithm in which each individual crosses over with its closest neighbour. The algorithm normally involves a structured 2D grid of individuals (1D and 3D grids are also possible). Clusters of similar individuals evolve naturally. The set of potential mates of an individual is called its neighbourhood:

Figure 98. Examples of cellular EAs

Advantages of traditional CGA:

- avoids premature convergence: as the algorithm uses a 2D grid, "the occurring of multiple copies of the best chromosome is improbable";
- spatial structure based on similarity.

Brudaru's proposed CGA:

- the algorithm has many communicating grids acting in parallel and synchronised by the stages of the evolution;
- a clustering method based on Kohonen's SOM organises the population of individuals on each grid;
- a communication protocol interchanges the best individuals belonging to similar clusters placed on different grids, thereby preserving the similarity between some clusters in different grids.

Grid space organisation using SOM: all chromosomes are placed on a grid of NxN locations, which is partitioned into a KxK matrix of squares. In the example below N = 18 and K = 3:

Figure 99. Grid partitioning into 3x3 squares (a) and the associated Kohonen network (b)

Each location on the grid has two possible states: a free node, where a chromosome can be placed, or an occupied node, where a chromosome is already placed. This network is trained using the neighbourhood function. Each cluster has a centroid; only individuals within a certain distance from the centre are retained in the cluster, and the remaining chromosomes are redistributed to the closest clusters that are not yet filled. The members of each non-empty cluster are placed on the square-shaped subset diagrammed below, in increasing order of their fitness, starting from the centre of the squared spiral and moving outwards.

The positional clustering captures the similarity of the individuals. It transforms the intrinsic structural similarity into a position-based similarity, avoiding the time-consuming creation and maintenance of a clustering based on explicit structural similarity.

Appendix 7. Kita's Real-Coded Genetic Algorithm (RCGA) and SOM

In a real-coded GA, as opposed to binary-coded algorithms, crossover and mutation are performed on the digits of the "normal", decimal representation of the individuals. In this particular case it is not essential that a real-coded GA is used instead of the traditional one: both perform in a very similar manner.

The conceptual diagram of RCGA-SOM for real-valued single-objective function problems (a code sketch follows the list below):

- Apply RCGA to an initial population once to generate a new population
- Train the SOM by taking the values of the objective function and the design variables of the individuals as the input data
- Search for the best match unit on the map for each individual
- Define a sub-population by the individuals included in the circle centring on the best match unit
- Apply fitness-oriented RCGA to each resulting sub-population
- Add the fittest individual from each sub-population to the new generation
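A minimal sketch of this loop (the objective, RCGA pass and BMU lookup below are hypothetical placeholders paraphrasing Kita's description):

import random

def fitness(x):
    # placeholder objective (to be maximised)
    return -sum((v - 0.5) ** 2 for v in x)

def rcga_step(pop):
    # placeholder real-coded GA pass: blend crossover of random pairs plus Gaussian mutation
    out = []
    for _ in range(len(pop)):
        a, b = random.sample(pop, 2)
        out.append([(x + y) / 2 + random.gauss(0, 0.05) for x, y in zip(a, b)])
    return out

def bmu(x):
    # placeholder best-match-unit lookup: similar variable sums share a unit
    return round(sum(x), 1)

def rcga_som_generation(pop, radius=0.2):
    pop = rcga_step(pop)                  # apply RCGA once to generate a new population
    nextgen = []
    for ind in pop:
        # sub-population: individuals whose BMUs lie within a circle around this BMU
        sub = [x for x in pop if abs(bmu(x) - bmu(ind)) <= radius]
        if len(sub) > 1:
            sub = rcga_step(sub)          # fitness-oriented local search in the sub-population
        nextgen.append(max(sub, key=fitness))
    return nextgen

pop = [[random.random() for _ in range(4)] for _ in range(12)]
pop = rcga_som_generation(pop)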

Advantages:

- The convergence speed is faster than that of SOM-MOEA, as the local search is performed in each sub-population according to the fitness criteria;
- Better local search performance, as the solution search is carried out in each sub-population instead of the solution space of the whole population.

Appendix 8. Optional Body-Plans for the Proposed Algorithm

Figure 100. Evolved Body-Plan, option 1

Figure 101. Evolved Body-Plan, option 2

Figure 102. Evolved Body-Plans, option 3

Figure 103. Evolved Body-Plan, option 4

Appendix 9. Proposed algorithm (Python)

import rhinoscriptsyntax as rs import random as r import math as m from itertools import permutations ################################################### ###### Amina Bakunowicz ###### ###### MSc Thesis ###### ###### UEL 2013 ###### ###### NEURAL SELF-ORGANISING MAPS ###### ###### AND GENETIC ALGORITHM: ###### ###### EVOLVING 3D CELLULAR AUTOMATA ###### ###### ARCHITECTURAL MODEL ###### ###### (boxes only, no arch model) ###### ################################################### # Global Variables for GA initPOPCOUNT = rs.GetInteger("please enter the number of individuals?", 3, 3, 50) GENERATIONS = rs.GetInteger("evolve over how many generations?", 1, 1, 500) selectionChoice = rs.GetInteger("Selection type: if Goldberg Roulette, enter 1; if optimised random, enter 2; if optimised artificial, enter 3", 1, 1, 3) geneLength = 4 MUTATION_RATE = 0.2 CROSSOVER_RATE = 1.0 FitnCoevRate = 5 # % by which a fitness componenet threshold should grow by with each gneration minFitLevel = 7 # min % over the average fitness of the best individual that a potential parent should possess in order to become a parent FLOORHEIGHT = 4 # Globals for CA CAunitsX = 3 # Number of CA units along X CAunitsY = 3 # Number of CA units along Y CAunitsZ = 1 # Number of CA units along Z widthCA = 17 #width of the CA unit statesCAall = ["living", "working", "resting"] GSratio = 1.618 POPmax = initPOPCOUNT * 1.6 # max size of a generation # Global Variables for SOM FNUM = CAunitsX*CAunitsY*CAunitsZ * 2 + 6 # Number of the parameters in the neural vector and it equals to the number of genes uSPACE = CAunitsX * widthCA + 50 # Neural grid U-direction Spacing vSPACE = CAunitsY * widthCA + 50 # Neural grid V-direction Spacing neuron = [] # Neurons (map) input = [] WINLEARN_RATE = 0.2 class Individual: def __init__(self, __id, _colour): # length of the chromosome

Page 104: Amiina Bakunowicz_MSc Thesis_NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM: EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

104

self.alleles = (0,1) self.geneNum = FNUM # number of genes and it equals to the number of the parameters in the neural vector self.geneLength = geneLength # length of each gene self.values = [] # This list will store all the genes (only used in body plan) self.chromLength = self.geneNum*self.geneLength self.chromosome = self.makeChromosome() self.id = __id self.fitness = 0 self.guid = None self.originIndiv = [] self.colour = _colour self.clusterFitness = 0 self.clusterIndList = [] self.grid = None def makeChromosome(self): # local list is returned chrom = [] for a in range(self.chromLength): chrom.append(r.choice(self.alleles)) return chrom def mutate(self): # Change a piece of DNA errorPoint = r.randrange(0, self.chromLength) if self.chromosome[errorPoint] == 0: self.chromosome[errorPoint] = 1 else: self.chromosome[errorPoint] = 0 def decode(self): # Set up a counter to access chromosone index. counter = 0 localList = [] # For each gene calculate the thisValue nd append to the list for g in range(self.geneNum): thisValue = 0 for i in range(self.geneLength): if(self.chromosome[counter] == 1): thisValue += m.pow(2, self.geneLength-i) counter += 1 localList.append(thisValue) self.values = localList def drawBodyplan(self, id, gen, finished, statesCA, BetwGens): area = 0 guid = [] grid = [] originsCA = [] originsCAmodel = [] CAmodelGuid =[] gridX = widthCA*CAunitsX + 80 # The spacing between individuals gridY = BetwGens # The spacing between generations # Set the origin point ready to draw an individual self.originIndiv = [gridX*id, gridY, 0]

Page 105: Amiina Bakunowicz_MSc Thesis_NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM: EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

105

n = 0 m = 0 for z in range (CAunitsZ): for y in range (CAunitsY): for x in range (CAunitsX): # collects coordinates of each CA unit for the display original CA model if id ==0 and gen == 0: coordCAmodel = [x*widthCA, y*widthCA + BetwGens,z*FLOORHEIGHT] originsCAmodel.append(coordCAmodel) # move the origin for each CA unit by genes defined 2D vector n += 2 coordCA = [self.originIndiv[0]+(x*widthCA) + (self.values[n + 4]/2), self.originIndiv[1]+(y*widthCA) + (self.values[n + 5]/2),z*FLOORHEIGHT] originsCA.append(coordCA) planeOrigin = [self.originIndiv[0]+(x*widthCA), self.originIndiv[1]+(y*widthCA), z*FLOORHEIGHT] # draw a rectangle of CA's grid for each unit on the ground if z == 0: myPlane = rs.PlaneFromFrame(planeOrigin, [widthCA,0,0], [0,widthCA,z*FLOORHEIGHT]) rect = rs.AddRectangle(myPlane, widthCA, widthCA) rs.ObjectColor(rect, [80,80,80]) grid.append(rect) # creates a display line between the origing of the unit#s grid rectangle # and an origin of the 3D box of the unit m += 1 #if statesCA[m-1] != "void" and self.values[n + 4]!=0 and self.values[n + 5]!=0: #line = rs.AddLine(planeOrigin,coordCA) #rs.ObjectColor(line, [0,255,255]) self.grid = grid[:] # goes through every unit of every individual and draws a appropriate # geometry depending on the state of the unit for i in range (len(statesCA)): # displays original CA model if id ==0 and gen == 0: coordCAmodel = originsCAmodel[i] pt1 = [coordCAmodel[0], coordCAmodel[1], coordCAmodel[2]] pt2 = [pt1[0] + widthCA, pt1[1], pt1[2]] pt3 = [pt1[0] + widthCA, pt1[1] + widthCA, pt1[2]] pt4 = [pt1[0], pt1[1] + widthCA, pt1[2]] pt5 = [coordCAmodel[0], coordCAmodel[1], coordCAmodel[2] + FLOORHEIGHT] pt6 = [pt1[0] + widthCA, pt1[1], pt1[2] + FLOORHEIGHT] pt7 = [pt1[0] + widthCA, pt1[1] + widthCA, pt1[2] + FLOORHEIGHT] pt8 = [pt1[0], pt1[1] + widthCA, pt1[2] + FLOORHEIGHT] pts = [pt1, pt2, pt3, pt4, pt5, pt6, pt7, pt8] CAmodelBox = rs.AddBox(pts)

Page 106: Amiina Bakunowicz_MSc Thesis_NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM: EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

106

rs.MoveObject(CAmodelBox, [-1*(widthCA*CAunitsX + 300), 0,0]) if statesCA[i] == "living": rs.ObjectColor(CAmodelBox, [204,51,51]) if statesCA[i] == "working": rs.ObjectColor(CAmodelBox, [51,51,204]) if statesCA[i] == "resting": rs.ObjectColor(CAmodelBox, [51,204,51]) CAmodelGuid.append(CAmodelBox) coordCA = originsCA[i] if statesCA[i] == "living": livingBox = self.drawLiving(coordCA) guid.append(livingBox) if statesCA[i] == "working": workingBox = self.drawWorking(coordCA) guid.append(workingBox) if statesCA[i] == "resting": restingBox = self.drawResting(coordCA) guid.append(restingBox) self.guid = guid[:] rs.ObjectColor(self.guid, self.colour) # colours population purple #rs.ObjectColor(self.guid, (102,51,255)) # Assess the fitness: proximity of the boxes' centroids # within Moore neighborhood and min intersection volume # list originsCA[] has all origins of this individual's boxes. self.assessFitness(gen) def assessFitness(self, gen): # fitness according to the proximity of the proportion of the box to # Golden Section Ratio defined as a global variable GSratio GSfactor = 0 # checks the ration of the width vs length of Living CA unit # and adds to fitness factor GSfactor depending on this ratio pt1 = [0, 0, 0] pt2 = [pt1[0] + self.values[0] + 3, pt1[1], pt1[2]] pt3 = [pt1[0] + self.values[0] + 3, pt1[1] + self.values[1] + 3, pt1[2]] pt4 = [pt1[0], pt1[1] + self.values[1] + 3, pt1[2]] dst1 = rs.Distance(pt1,pt2) dst2 = rs.Distance(pt2,pt3) if dst1>=dst2: ratioLiving = dst1/dst2 else: ratioLiving = dst2/dst1 GSfactor += 12/abs(ratioLiving-GSratio) # check the floor area of the Living CA unit pts = [pt1, pt2, pt3, pt4] srf = rs.AddSrfPt(pts) areaLiving = rs.Area(srf) rs.DeleteObject(srf) # checks the ration of the width vs length of Working CA unit # and adds to fitness factor GSfactor depending on this ratio

Page 107: Amiina Bakunowicz_MSc Thesis_NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM: EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

107

pt1 = [0, 0, 0] pt2 = [pt1[0] + self.values[2] + 3, pt1[1], pt1[2]] pt3 = [pt1[0] + self.values[2] + 3, pt1[1] + self.values[3] + 3, pt1[2]] pt4 = [pt1[0], pt1[1] + self.values[3] + 3, pt1[2]] dst1 = rs.Distance(pt1,pt2) dst2 = rs.Distance(pt2,pt3) if dst1>=dst2: ratioWorking = dst1/dst2 else: ratioWorking = dst2/dst1 GSfactor += 12/abs(ratioWorking-GSratio) # check the floor area of the WOrking CA unit pts = [pt1, pt2, pt3, pt4] srf = rs.AddSrfPt(pts) areaWorking = rs.Area(srf) rs.DeleteObject(srf) # checks the ration of the width vs length of Resting CA unit # and adds to fitness factor GSfactor depending on this ratio pt1 = [0, 0, 0] pt2 = [pt1[0] + self.values[4] + 3, pt1[1], pt1[2]] pt3 = [pt1[0] + self.values[4] + 3, pt1[1] + self.values[5] + 3, pt1[2]] pt4 = [pt1[0], pt1[1] + self.values[5] + 3, pt1[2]] dst1 = rs.Distance(pt1,pt2) dst2 = rs.Distance(pt2,pt3) if dst1>=dst2: ratioResting = dst1/dst2 else: ratioResting = dst2/dst1 GSfactor += 12/abs(ratioResting-GSratio) # check the floor area of the Resting CA unit pts = [pt1, pt2, pt3, pt4] srf = rs.AddSrfPt(pts) areaResting = rs.Area(srf) rs.DeleteObject(srf) distances = 0 areas = 0 XXfactor = 0 XXareasFactor = 0 totalAreaFactor = 0 distFactor = 0 objects = [] # fitness criteria keeps the neighboring boxes as close as possible # maintaining the initial spatial arrangement for z in range (CAunitsZ): for y in range (CAunitsY): for x in range (CAunitsX): # check the area of the CA unit and add it to the total area index = (CAunitsX*CAunitsY)*z + y * CAunitsX + x + 1 boxThis = self.guid[index-1] totalAreaFactor += rs.SurfaceArea(boxThis)[0] if x != 0 and y != 0 and x != CAunitsX-1 and y != CAunitsY - 1: # determines its own box, its neighboring boxes and their centroids index = (CAunitsX*CAunitsY)*z + y * CAunitsX + x + 1 boxThis = self.guid[index-1]

Page 108: Amiina Bakunowicz_MSc Thesis_NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM: EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

108

if boxThis is not None: centroidThis = rs.SurfaceVolumeCentroid(boxThis)[0] index = (CAunitsX*CAunitsY)*z + (y-1) * CAunitsX + x + 1 boxS = self.guid[index-1] if boxThis is not None: centroidS = rs.SurfaceVolumeCentroid(boxS)[0] index = (CAunitsX*CAunitsY)*z + (y-1) * CAunitsX + x + 2 boxSE = self.guid[index-1] if boxThis is not None: centroidSE = rs.SurfaceVolumeCentroid(boxSE)[0] index = (CAunitsX*CAunitsY)*z + y * CAunitsX + x + 2 boxE = self.guid[index-1] if boxThis is not None: centroidE = rs.SurfaceVolumeCentroid(boxE)[0] index = (CAunitsX*CAunitsY)*z + (y+1) * CAunitsX + x + 2 boxNE = self.guid[index-1] if boxThis is not None: centroidNE = rs.SurfaceVolumeCentroid(boxNE)[0] index = (CAunitsX*CAunitsY)*z + (y+1) * CAunitsX + x + 1 boxN = self.guid[index-1] if boxThis is not None: centroidN = rs.SurfaceVolumeCentroid(boxN)[0] index = (CAunitsX*CAunitsY)*z + (y+1) * CAunitsX + x boxNW = self.guid[index-1] if boxThis is not None: centroidNW = rs.SurfaceVolumeCentroid(boxNW)[0] index = (CAunitsX*CAunitsY)*z + y * CAunitsX + x boxW = self.guid[index-1] if boxThis is not None: centroidW = rs.SurfaceVolumeCentroid(boxW)[0] index = (CAunitsX*CAunitsY)*z + (y-1) * CAunitsX + x boxSW = self.guid[index-1] if boxThis is not None: centroidSW = rs.SurfaceVolumeCentroid(boxSW)[0] # works out the sum of the distances between the neighbouring units if centroidS is not None and centroidThis is not None: dist1 = rs.Distance(centroidThis,centroidS) if centroidSE is not None and centroidThis is not None:dist2 = rs.Distance(centroidThis,centroidSE) if centroidE is not None and centroidThis is not None:dist3 = rs.Distance(centroidThis,centroidE) if centroidNE is not None and centroidThis is not None:dist4 = rs.Distance(centroidThis,centroidNE) if centroidN is not None and centroidThis is not None:dist5 = rs.Distance(centroidThis,centroidN)

Page 109: Amiina Bakunowicz_MSc Thesis_NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM: EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

109

if centroidNW is not None and centroidThis is not None:dist6 = rs.Distance(centroidThis,centroidNW) if centroidW is not None and centroidThis is not None:dist7 = rs.Distance(centroidThis,centroidW) if centroidSW is not None and centroidThis is not None:dist8 = rs.Distance(centroidThis,centroidSW) distances = dist1+dist2+dist3+dist4+dist5+dist6+dist7+dist8 distFactor += distances # works out the sum of the intersection areas of the neighbouring units THISxS = rs.BooleanIntersection(boxThis,boxS, False) if (THISxS != [] and THISxS is not None): XXareasFactor += rs.SurfaceArea(THISxS)[0] objects.append(THISxS) #XXfactor += 100 THISxSE = rs.BooleanIntersection(boxThis,boxSE, False) if (THISxSE != [] and THISxSE is not None): XXareasFactor += rs.SurfaceArea(THISxSE)[0] objects.append(THISxSE) #XXfactor += 100 THISxE = rs.BooleanIntersection(boxThis,boxE, False) if (THISxE != [] and THISxE is not None): XXareasFactor += rs.SurfaceArea(THISxE)[0] objects.append(THISxE) #XXfactor += 100 THISxNE = rs.BooleanIntersection(boxThis,boxNE, False) if (THISxNE != [] and THISxNE is not None): XXareasFactor += rs.SurfaceArea(THISxNE)[0] objects.append(THISxNE) #XXfactor += 100 THISxN = rs.BooleanIntersection(boxThis,boxN, False) if (THISxN != [] and THISxN is not None): XXareasFactor += rs.SurfaceArea(THISxN)[0] objects.append(THISxN) #XXfactor += 100 THISxNW = rs.BooleanIntersection(boxThis,boxNW, False) if (THISxNW != [] and THISxNW is not None): XXareasFactor += rs.SurfaceArea(THISxNW)[0] objects.append(THISxNW) #XXfactor += 100 THISxW = rs.BooleanIntersection(boxThis,boxW, False) if (THISxW != [] and THISxW is not None): XXareasFactor += rs.SurfaceArea(THISxW)[0] objects.append(THISxW) #XXfactor += 100

Page 110: Amiina Bakunowicz_MSc Thesis_NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM: EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

110

THISxSW = rs.BooleanIntersection(boxThis,boxSW, False) if (THISxSW != [] and THISxSW is not None): XXareasFactor += rs.SurfaceArea(THISxSW)[0] objects.append(THISxSW) #XXfactor += 100 if objects != []: rs.DeleteObjects(objects) if distFactor != 0: distFactor = 17000*CAunitsZ/distFactor if XXareasFactor != 0: XXareasFactor = 150000*CAunitsZ/XXareasFactor totalAreaFactor = totalAreaFactor*0.017/CAunitsZ WRareaFactor = areaWorking/areaResting*110 RLareaFactor = areaResting/areaLiving*110 if GSfactor > 200: GSfactor = 200 # In order to gain some fitness an individual (or a neuron) must have # the components of the fitness factor defined accepted level. # If one of the Fitness Factor constituents is below the # defined threshold, its total fitness is written down to zero. if selectionChoice == 2 or selectionChoice == 3: fitThreshold = gen*(FitnCoevRate) + 50 else: fitThreshold = 0 if distFactor>fitThreshold and XXareasFactor>fitThreshold and totalAreaFactor>fitThreshold and RLareaFactor>fitThreshold and WRareaFactor>fitThreshold and areaLiving >250 and GSfactor>fitThreshold: fitness = distFactor+ XXareasFactor + WRareaFactor + RLareaFactor + totalAreaFactor + GSfactor self.fitness = round(fitness,2) else: self.fitness = 0 def dispalyText(self, id): loc1 = [self.originIndiv[0], self.originIndiv[1]-10, self.originIndiv[2]] myText1 = rs.AddText(str(id), loc1, 5) rs.ObjectColor(myText1, [100,100,100]) loc2 = [self.originIndiv[0], self.originIndiv[1]-3, self.originIndiv[2]] myText2 = rs.AddText(str(self.fitness), loc2, 2) rs.ObjectColor(myText2, [255,0,0]) return myText1, myText2 def drawLiving(self, coordCA): pt1 = [coordCA[0], coordCA[1], coordCA[2]] pt2 = [pt1[0] + self.values[0] + 3, pt1[1], pt1[2]] pt3 = [pt1[0] + self.values[0] + 3, pt1[1] + self.values[1] + 3, pt1[2]] pt4 = [pt1[0], pt1[1] + self.values[1] + 3, pt1[2]] pt5 = [coordCA[0], coordCA[1], coordCA[2] + FLOORHEIGHT] pt6 = [pt1[0] + self.values[0] + 3, pt1[1], pt1[2] + FLOORHEIGHT] pt7 = [pt1[0] + self.values[0] + 3, pt1[1] + self.values[1] + 3, pt1[2] + FLOORHEIGHT] pt8 = [pt1[0], pt1[1] + self.values[1] + 3, pt1[2] + FLOORHEIGHT]

Page 111: Amiina Bakunowicz_MSc Thesis_NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM: EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

111

pts = [pt1, pt2, pt3, pt4, pt5, pt6, pt7, pt8] livingBox = rs.AddBox(pts) return livingBox def drawWorking(self, coordCA): pt1 = [coordCA[0], coordCA[1], coordCA[2]] pt2 = [pt1[0] + self.values[2] + 3, pt1[1], pt1[2]] pt3 = [pt1[0] + self.values[2] + 3, pt1[1] + self.values[3] + 3, pt1[2]] pt4 = [pt1[0], pt1[1] + self.values[3] + 3, pt1[2]] pt5 = [coordCA[0], coordCA[1], coordCA[2] + FLOORHEIGHT] pt6 = [pt1[0] + self.values[2] + 3, pt1[1], pt1[2] + FLOORHEIGHT] pt7 = [pt1[0] + self.values[2] + 3, pt1[1] + self.values[3] + 3, pt1[2] + FLOORHEIGHT] pt8 = [pt1[0], pt1[1] + self.values[3] + 3, pt1[2] + FLOORHEIGHT] pts = [pt1, pt2, pt3, pt4, pt5, pt6, pt7, pt8] workingBox = rs.AddBox(pts) return workingBox def drawResting(self, coordCA): pt1 = [coordCA[0], coordCA[1], coordCA[2]] pt2 = [pt1[0] + self.values[4] + 3, pt1[1], pt1[2]] pt3 = [pt1[0] + self.values[4] + 3, pt1[1] + self.values[5] + 3, pt1[2]] pt4 = [pt1[0], pt1[1] + self.values[5] + 3, pt1[2]] pt5 = [coordCA[0], coordCA[1], coordCA[2] + FLOORHEIGHT] pt6 = [pt1[0] + self.values[4] + 3, pt1[1], pt1[2] + FLOORHEIGHT] pt7 = [pt1[0] + self.values[4] + 3, pt1[1] + self.values[5] + 3, pt1[2] + FLOORHEIGHT] pt8 = [pt1[0], pt1[1] + self.values[5] + 3, pt1[2] + FLOORHEIGHT] pts = [pt1, pt2, pt3, pt4, pt5, pt6, pt7, pt8] restingBox = rs.AddBox(pts) return restingBox class Neuron: def __init__(self, _pos, _vec, _limbo, _id, _winnersList, _dist2winner, _closestWInner, _colour): self.pos = _pos self.vec = _vec self.guid = None self.limbo = _limbo self.id = _id self.isWinner = False self.fitness = 0 self.winnersList = _winnersList self.dist2winner = _dist2winner self.closestWInner = _closestWInner self.colour = _colour self.grid = None def organise(self, neuron, input, i, winner, WINLEARN, LEARN, NEIGH): # Check if you're the winner or not... # If so adjust yourself according to WINLEARN if (self.id[0] == winner[0] and self.id[1] == winner[1]):

Page 112: Amiina Bakunowicz_MSc Thesis_NEURAL SELF-ORGANISING MAPS AND GENETIC ALGORITHM: EVOLVING 3D CELLULAR AUTOMATA ARCHITECTURAL MODEL

112

self.isWinner = True # give a wiiner the colour of its input self.colour = input[i].colour for f in range(FNUM): dd = input[i].vec[f] - self.vec[f] self.limbo[f] = self.limbo[f] + dd * WINLEARN # If you were not the winner, check your distance in map 2d space # And adjust according to the distance and the LEARN value else: dist = m.sqrt(m.pow((self.id[0] - winner[0]),2) + m.pow((self.id[1] - winner[1]),2)) #print dist, NEIGH # If we're close enough apply 'positive' feedback if (dist <= NEIGH) : for f in range(FNUM): dd = input[i].vec[f] - self.vec[f] self.limbo[f] = self.limbo[f] + dd*(LEARN/dist) self.winnersList.append(winner) self.dist2winner.append(dist) # Otherwise apply 'inhibitory' feedback else: for f in range(FNUM): dd = input[i].vec[f] - self.vec[f] self.limbo[f] = self.limbo[f] - dd*(LEARN/dist) *.2 def update(self, gen, statesCA, population, neuron): if (self.guid!=None): rs.DeleteObjects(self.guid) if (self.grid!=None): rs.DeleteObjects(self.grid) for f in range(FNUM): self.vec[f] = self.limbo[f] self.drawNeuronBodyplan(gen, statesCA) # find which winner is closest and give that neuron a colour that # represents the winner's cluster smallestDistance = 10000000 bestID = 0 if self.dist2winner != [] and self.winnersList != [] and self.isWinner == False: for i in range(len(self.dist2winner)): if(self.dist2winner[i] < smallestDistance): closestID = i smallestDistance = self.dist2winner[i] closestWinner = self.winnersList[closestID] # find the u and v indexes of the closest winner and # give its colour to the neuron cwU = closestWinner[0] cwV = closestWinner[1]
            col = neuron[cwU][cwV].colour
            self.colour = col
            rs.ObjectColor(self.guid, col)
        if (self.isWinner):
            rs.ObjectColor(self.guid, self.colour)
            rad = CAunitsX*widthCA
            pos = [self.pos[0]+rad/2, self.pos[1]+rad/2, self.pos[2]]
            circle = rs.AddCircle(pos, rad*0.75)
            rs.ObjectColor(circle, self.colour)
            rs.ObjectLinetype(circle, "Dashed")
        else:
            circle = None
        #if(self.isWinner): rs.ObjectColor(self.guid, [255,0,0])
        # Reset the winner status to false again
        self.isWinner = False
        return circle

    def drawNeuronBodyplan(self, gen, statesCA):
        area = 0
        guid = []
        grid = []
        originsCA = []
        originsCAmodel = []
        CAmodelGuid = []
        n = 0
        m = 0
        for z in range(CAunitsZ):
            for y in range(CAunitsY):
                for x in range(CAunitsX):
                    # move the origin of each CA unit by a gene-defined 2D vector
                    n += 2
                    coordCA = [self.pos[0]+(x*widthCA) + (self.vec[n + 4]/2),
                               self.pos[1]+(y*widthCA) + (self.vec[n + 5]/2),
                               z*FLOORHEIGHT]
                    originsCA.append(coordCA)
                    planeOrigin = [self.pos[0]+(x*widthCA), self.pos[1]+(y*widthCA), z*FLOORHEIGHT]
                    # draw a rectangle of the CA grid for each unit on the ground
                    if z == 0:
                        myPlane = rs.PlaneFromFrame(planeOrigin, [widthCA,0,0], [0,widthCA,z*FLOORHEIGHT])
                        rect = rs.AddRectangle(myPlane, widthCA, widthCA)
                        rs.ObjectColor(rect, [80,80,80])
                        grid.append(rect)
                    # draws a display line between the origin of the unit's grid
                    # rectangle and the origin of the 3D box of the unit
                    #m += 1
                    #if statesCA[m-1] != "void" and self.vec[n + 4]!=0 and self.vec[n + 5]!=0:
                        #line = rs.AddLine(planeOrigin,coordCA)
                        #rs.ObjectColor(line, [0,255,255])
        self.grid = grid[:]
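        # Genotype-to-geometry mapping: vec[0]..vec[5] hold the footprint genes
        # of the living, working and resting room types, while each CA unit
        # reads a pair vec[n+4], vec[n+5] as its own 2D offset from the grid.
        # The geometry of each unit is then drawn according to its CA state,
        # as the loop below shows.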
        # goes through every unit of every individual and draws the appropriate
        # geometry depending on the state of the unit
        for i in range(len(statesCA)):
            coordCA = originsCA[i]
            if statesCA[i] == "living":
                livingBox = self.drawNeuronLiving(coordCA)
                guid.append(livingBox)
            if statesCA[i] == "working":
                workingBox = self.drawNeuronWorking(coordCA)
                guid.append(workingBox)
            if statesCA[i] == "resting":
                restingBox = self.drawNeuronResting(coordCA)
                guid.append(restingBox)
        self.guid = guid[:]
        # Assess the fitness: proximity of the boxes' centroids
        # within the Moore neighbourhood and minimal intersection volume.
        # The list originsCA[] holds all origins of this individual's boxes.
        #self.assessNeuronFitness(gen)

    def assessNeuronFitness(self, gen):
        # fitness according to the proximity of the proportions of the box
        # to the Golden Section ratio defined as a global variable GSratio
        GSfactor = 0
        # checks the ratio of the width vs the length of the Living CA unit
        # and adds to the fitness factor GSfactor depending on this ratio
        pt1 = [0, 0, 0]
        pt2 = [pt1[0] + self.vec[0] + 3, pt1[1], pt1[2]]
        pt3 = [pt1[0] + self.vec[0] + 3, pt1[1] + self.vec[1] + 3, pt1[2]]
        pt4 = [pt1[0], pt1[1] + self.vec[1] + 3, pt1[2]]
        dst1 = rs.Distance(pt1, pt2)
        dst2 = rs.Distance(pt2, pt3)
        if dst1 >= dst2:
            ratioLiving = dst1/dst2
        else:
            ratioLiving = dst2/dst1
        GSfactor += 10/abs(ratioLiving - GSratio)
        # check the floor area of the Living CA unit
        pts = [pt1, pt2, pt3, pt4]
        srf = rs.AddSrfPt(pts)
        areaLiving = rs.Area(srf)
        rs.DeleteObject(srf)
        # checks the ratio of the width vs the length of the Working CA unit
        # and adds to the fitness factor GSfactor depending on this ratio
        pt1 = [0, 0, 0]
        pt2 = [pt1[0] + self.vec[2] + 3, pt1[1], pt1[2]]
        pt3 = [pt1[0] + self.vec[2] + 3, pt1[1] + self.vec[3] + 3, pt1[2]]
        pt4 = [pt1[0], pt1[1] + self.vec[3] + 3, pt1[2]]
        dst1 = rs.Distance(pt1, pt2)
        dst2 = rs.Distance(pt2, pt3)
        if dst1 >= dst2:
            ratioWorking = dst1/dst2
        else:
            ratioWorking = dst2/dst1
        GSfactor += 10/abs(ratioWorking - GSratio)
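        # How GSfactor works: each room type contributes 10/|ratio - GSratio|,
        # so the closer a footprint is to the Golden Section, the larger the
        # reward. With GSratio at the Golden Section value of about 1.618, a
        # ratio of 1.718 scores 10/0.1 = 100, while a square unit (ratio 1.0)
        # scores only about 16. The factor is capped at 200 further below.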
        # check the floor area of the Working CA unit
        pts = [pt1, pt2, pt3, pt4]
        srf = rs.AddSrfPt(pts)
        areaWorking = rs.Area(srf)
        rs.DeleteObject(srf)
        # checks the ratio of the width vs the length of the Resting CA unit
        # and adds to the fitness factor GSfactor depending on this ratio
        pt1 = [0, 0, 0]
        pt2 = [pt1[0] + self.vec[4] + 3, pt1[1], pt1[2]]
        pt3 = [pt1[0] + self.vec[4] + 3, pt1[1] + self.vec[5] + 3, pt1[2]]
        pt4 = [pt1[0], pt1[1] + self.vec[5] + 3, pt1[2]]
        dst1 = rs.Distance(pt1, pt2)
        dst2 = rs.Distance(pt2, pt3)
        if dst1 >= dst2:
            ratioResting = dst1/dst2
        else:
            ratioResting = dst2/dst1
        GSfactor += 10/abs(ratioResting - GSratio)
        # check the floor area of the Resting CA unit
        pts = [pt1, pt2, pt3, pt4]
        srf = rs.AddSrfPt(pts)
        areaResting = rs.Area(srf)
        rs.DeleteObject(srf)
        distances = 0
        areas = 0
        XXfactor = 0
        XXareasFactor = 0
        totalAreaFactor = 0
        distFactor = 0
        objects = []
        # this fitness criterion keeps the neighbouring boxes as close as
        # possible while maintaining the initial spatial arrangement
        for z in range(CAunitsZ):
            for y in range(CAunitsY):
                for x in range(CAunitsX):
                    # check the area of the CA unit and add it to the total area
                    index = (CAunitsX*CAunitsY)*z + y*CAunitsX + x + 1
                    boxThis = self.guid[index-1]
                    totalAreaFactor += rs.SurfaceArea(boxThis)[0]
                    if x != 0 and y != 0 and x != CAunitsX-1 and y != CAunitsY-1:
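                        # Index arithmetic: the boxes are stored in a flat list,
                        # so unit (x, y, z) lives at index (X*Y)*z + y*X + x,
                        # with X = CAunitsX and Y = CAunitsY. Its eight Moore
                        # neighbours on the same floor are reached by stepping
                        # +/-1 in x (+/-1 in the index) and +/-1 in y (+/-X in
                        # the index). Border units are skipped by the test
                        # above, so every neighbour lookup stays inside the list.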
                        # determines its own box, its neighbouring boxes and their centroids
                        index = (CAunitsX*CAunitsY)*z + y*CAunitsX + x + 1
                        boxThis = self.guid[index-1]
                        if boxThis is not None:
                            centroidThis = rs.SurfaceVolumeCentroid(boxThis)[0]
                        index = (CAunitsX*CAunitsY)*z + (y-1)*CAunitsX + x + 1
                        boxS = self.guid[index-1]
                        if boxS is not None:
                            centroidS = rs.SurfaceVolumeCentroid(boxS)[0]
                        index = (CAunitsX*CAunitsY)*z + (y-1)*CAunitsX + x + 2
                        boxSE = self.guid[index-1]
                        if boxSE is not None:
                            centroidSE = rs.SurfaceVolumeCentroid(boxSE)[0]
                        index = (CAunitsX*CAunitsY)*z + y*CAunitsX + x + 2
                        boxE = self.guid[index-1]
                        if boxE is not None:
                            centroidE = rs.SurfaceVolumeCentroid(boxE)[0]
                        index = (CAunitsX*CAunitsY)*z + (y+1)*CAunitsX + x + 2
                        boxNE = self.guid[index-1]
                        if boxNE is not None:
                            centroidNE = rs.SurfaceVolumeCentroid(boxNE)[0]
                        index = (CAunitsX*CAunitsY)*z + (y+1)*CAunitsX + x + 1
                        boxN = self.guid[index-1]
                        if boxN is not None:
                            centroidN = rs.SurfaceVolumeCentroid(boxN)[0]
                        index = (CAunitsX*CAunitsY)*z + (y+1)*CAunitsX + x
                        boxNW = self.guid[index-1]
                        if boxNW is not None:
                            centroidNW = rs.SurfaceVolumeCentroid(boxNW)[0]
                        index = (CAunitsX*CAunitsY)*z + y*CAunitsX + x
                        boxW = self.guid[index-1]
                        if boxW is not None:
                            centroidW = rs.SurfaceVolumeCentroid(boxW)[0]
                        index = (CAunitsX*CAunitsY)*z + (y-1)*CAunitsX + x
                        boxSW = self.guid[index-1]
                        if boxSW is not None:
                            centroidSW = rs.SurfaceVolumeCentroid(boxSW)[0]
                        # works out the sum of the distances between the neighbouring units
                        if centroidS is not None and centroidThis is not None: dist1 = rs.Distance(centroidThis, centroidS)
                        if centroidSE is not None and centroidThis is not None: dist2 = rs.Distance(centroidThis, centroidSE)
                        if centroidE is not None and centroidThis is not None: dist3 = rs.Distance(centroidThis, centroidE)
                        if centroidNE is not None and centroidThis is not None: dist4 = rs.Distance(centroidThis, centroidNE)
                        if centroidN is not None and centroidThis is not None: dist5 = rs.Distance(centroidThis, centroidN)
                        if centroidNW is not None and centroidThis is not None: dist6 = rs.Distance(centroidThis, centroidNW)
                        if centroidW is not None and centroidThis is not None: dist7 = rs.Distance(centroidThis, centroidW)
                        if centroidSW is not None and centroidThis is not None: dist8 = rs.Distance(centroidThis, centroidSW)
                        distances = dist1+dist2+dist3+dist4+dist5+dist6+dist7+dist8
                        distFactor += distances
                        # works out the sum of the intersection areas of the neighbouring units
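                        # Overlap penalty: every pairwise Boolean intersection
                        # between this box and a Moore neighbour adds its
                        # surface area to XXareasFactor. The factor is inverted
                        # later (150000*CAunitsZ/XXareasFactor), so the smaller
                        # the total overlap, the larger its contribution to
                        # the fitness.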
                        THISxS = rs.BooleanIntersection(boxThis, boxS, False)
                        if (THISxS != [] and THISxS is not None):
                            XXareasFactor += rs.SurfaceArea(THISxS)[0]
                            objects.append(THISxS)
                            #XXfactor += 100
                        THISxSE = rs.BooleanIntersection(boxThis, boxSE, False)
                        if (THISxSE != [] and THISxSE is not None):
                            XXareasFactor += rs.SurfaceArea(THISxSE)[0]
                            objects.append(THISxSE)
                            #XXfactor += 100
                        THISxE = rs.BooleanIntersection(boxThis, boxE, False)
                        if (THISxE != [] and THISxE is not None):
                            XXareasFactor += rs.SurfaceArea(THISxE)[0]
                            objects.append(THISxE)
                            #XXfactor += 100
                        THISxNE = rs.BooleanIntersection(boxThis, boxNE, False)
                        if (THISxNE != [] and THISxNE is not None):
                            XXareasFactor += rs.SurfaceArea(THISxNE)[0]
                            objects.append(THISxNE)
                            #XXfactor += 100
                        THISxN = rs.BooleanIntersection(boxThis, boxN, False)
                        if (THISxN != [] and THISxN is not None):
                            XXareasFactor += rs.SurfaceArea(THISxN)[0]
                            objects.append(THISxN)
                            #XXfactor += 100
                        THISxNW = rs.BooleanIntersection(boxThis, boxNW, False)
                        if (THISxNW != [] and THISxNW is not None):
                            XXareasFactor += rs.SurfaceArea(THISxNW)[0]
                            objects.append(THISxNW)
                            #XXfactor += 100
                        THISxW = rs.BooleanIntersection(boxThis, boxW, False)
                        if (THISxW != [] and THISxW is not None):
                            XXareasFactor += rs.SurfaceArea(THISxW)[0]
                            objects.append(THISxW)
                            #XXfactor += 100
                        THISxSW = rs.BooleanIntersection(boxThis, boxSW, False)
                        if (THISxSW != [] and THISxSW is not None):
                            XXareasFactor += rs.SurfaceArea(THISxSW)[0]
                            objects.append(THISxSW)
                            #XXfactor += 100
        if objects != []:
            rs.DeleteObjects(objects)
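        # Normalisation: the raw sums are turned into comparable scores.
        # distFactor and XXareasFactor are inverted (tight packing and small
        # overlaps score high), totalAreaFactor is scaled down, and the
        # working/resting and resting/living area ratios are scaled so that
        # each component lands in roughly the same order of magnitude before
        # the threshold test below.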
        if distFactor != 0: distFactor = 17000*CAunitsZ/distFactor
        if XXareasFactor != 0: XXareasFactor = 150000*CAunitsZ/XXareasFactor
        totalAreaFactor = totalAreaFactor*0.017/CAunitsZ
        WRareaFactor = areaWorking/areaResting*110
        RLareaFactor = areaResting/areaLiving*110
        if GSfactor > 200: GSfactor = 200
        #print distFactor, XXareasFactor, totalAreaFactor, GSfactor, WRareaFactor, RLareaFactor
        # In order to gain any fitness, an individual (or a neuron) must have
        # every component of the fitness factor at the defined accepted level.
        # If one of the fitness factor constituents is below the defined
        # threshold, the total fitness is written down as zero.
        if selectionChoice == 2 or selectionChoice == 3:
            # the threshold co-evolves: it rises by FitnCoevRate per generation
            fitThreshold = gen*(FitnCoevRate) + 50
        else:
            fitThreshold = 0
        if distFactor > fitThreshold and XXareasFactor > fitThreshold and totalAreaFactor > fitThreshold and RLareaFactor > fitThreshold and WRareaFactor > fitThreshold and areaLiving > 250 and GSfactor > fitThreshold:
            fitness = distFactor + XXareasFactor + WRareaFactor + RLareaFactor + totalAreaFactor + GSfactor
            self.fitness = round(fitness, 2)
        else:
            self.fitness = 0

    def dispalyNeuronText(self, id):
        loc1 = [self.pos[0], self.pos[1]-10, self.pos[2]]
        myText = rs.AddText(str(id), loc1, 5)
        rs.ObjectColor(myText, [100,100,100])
        loc2 = [self.pos[0], self.pos[1]-3, self.pos[2]]
        myText = rs.AddText(str(self.fitness), loc2, 2)
        rs.ObjectColor(myText, [255,0,0])
    def drawNeuronLiving(self, coordCA):
        pt1 = [coordCA[0], coordCA[1], coordCA[2]]
        pt2 = [pt1[0] + self.vec[0] + 3, pt1[1], pt1[2]]
        pt3 = [pt1[0] + self.vec[0] + 3, pt1[1] + self.vec[1] + 3, pt1[2]]
        pt4 = [pt1[0], pt1[1] + self.vec[1] + 3, pt1[2]]
        pt5 = [coordCA[0], coordCA[1], coordCA[2] + FLOORHEIGHT]
        pt6 = [pt1[0] + self.vec[0] + 3, pt1[1], pt1[2] + FLOORHEIGHT]
        pt7 = [pt1[0] + self.vec[0] + 3, pt1[1] + self.vec[1] + 3, pt1[2] + FLOORHEIGHT]
        pt8 = [pt1[0], pt1[1] + self.vec[1] + 3, pt1[2] + FLOORHEIGHT]
        pts = [pt1, pt2, pt3, pt4, pt5, pt6, pt7, pt8]
        livingBox = rs.AddBox(pts)
        return livingBox

    def drawNeuronWorking(self, coordCA):
        pt1 = [coordCA[0], coordCA[1], coordCA[2]]
        pt2 = [pt1[0] + self.vec[2] + 3, pt1[1], pt1[2]]
        pt3 = [pt1[0] + self.vec[2] + 3, pt1[1] + self.vec[3] + 3, pt1[2]]
        pt4 = [pt1[0], pt1[1] + self.vec[3] + 3, pt1[2]]
        pt5 = [coordCA[0], coordCA[1], coordCA[2] + FLOORHEIGHT]
        pt6 = [pt1[0] + self.vec[2] + 3, pt1[1], pt1[2] + FLOORHEIGHT]
        pt7 = [pt1[0] + self.vec[2] + 3, pt1[1] + self.vec[3] + 3, pt1[2] + FLOORHEIGHT]
        pt8 = [pt1[0], pt1[1] + self.vec[3] + 3, pt1[2] + FLOORHEIGHT]
        pts = [pt1, pt2, pt3, pt4, pt5, pt6, pt7, pt8]
        workingBox = rs.AddBox(pts)
        return workingBox

    def drawNeuronResting(self, coordCA):
        pt1 = [coordCA[0], coordCA[1], coordCA[2]]
        pt2 = [pt1[0] + self.vec[4] + 3, pt1[1], pt1[2]]
        pt3 = [pt1[0] + self.vec[4] + 3, pt1[1] + self.vec[5] + 3, pt1[2]]
        pt4 = [pt1[0], pt1[1] + self.vec[5] + 3, pt1[2]]
        pt5 = [coordCA[0], coordCA[1], coordCA[2] + FLOORHEIGHT]
        pt6 = [pt1[0] + self.vec[4] + 3, pt1[1], pt1[2] + FLOORHEIGHT]
        pt7 = [pt1[0] + self.vec[4] + 3, pt1[1] + self.vec[5] + 3, pt1[2] + FLOORHEIGHT]
        pt8 = [pt1[0], pt1[1] + self.vec[5] + 3, pt1[2] + FLOORHEIGHT]
        pts = [pt1, pt2, pt3, pt4, pt5, pt6, pt7, pt8]
        restingBox = rs.AddBox(pts)
        return restingBox


class Input:

    def __init__(self, _pos, _vec, _guid, _colour):
        self.pos = _pos
        self.vec = _vec
        self.guid = _guid
        self.colour = _colour

    def findWinner(self, neuron, population):
        win_dif = 100000
        winner = []
        for u in range(int(round(len(population)*1.5))):
            for v in range(int(round(len(population)*1.5))):
                difference = 0
                # Euclidean distance in feature space:
                for f in range(FNUM):
                    difference += (m.pow((self.vec[f] - neuron[u][v].vec[f]), 2))
                difference = m.sqrt(difference)
                if (difference < win_dif):
                    win_dif = difference
                    winner = [u, v]
        # Return the 2D index of the winner on the map
        return winner
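# Overall control flow of the script: main() seeds the first population and
# calls runGA(). For every generation, runGA() draws the individuals, builds
# a SOM whose inputs are their feature vectors, trains it with runSOM(),
# assesses every neuron's fitness, groups the neurons into colour-coded
# clusters around the original individuals, selects parents cluster by
# cluster, and crosses them over to form the next generation.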
def main():
    # The main list of individuals
    population = []
    oldPop = []
    rs.Command("SelAll")
    rs.Command("Delete")
    rs.AddLayer("fitness curves")
    # Set up a text file for data
    myFile = open('fitnessHistory.txt', 'w')
    # Initialise the GA
    for i in range(initPOPCOUNT):
        colour = [0, 0, 0]
        population.append(Individual(i, colour))
    # Run the process
    runGA(population, oldPop, myFile)
    # Close the text file
    myFile.close()


def runGA(population, oldPop, myFile):
    BetwGens = 0
    # generate two lists that contain all possible combinations of genes and
    # their corresponding values. These are used in the reverse-decode step
    # to convert the self.values of SOM neurons back into chromosomes:
    list0 = []
    list1 = []
    bestPts = []
    allGeneslist = binaryList(geneLength)
    for bit in range(geneLength):
        list0.append(0)
        list1.append(1)
    allGeneslist.append(list0)
    allGeneslist.append(list1)
    allValueslist = binaryDecode(allGeneslist)
    # a gene decode number used for the creation of the initial SOM
    decodeNum = 6
    ####################################
    #########     CA model    ##########
    ####################################
    # run the CA for population[i]; generate two lists containing info for
    # each CA unit, originsCA[] and statesCA[], and pass this info to the
    # drawBodyplan function. In this algorithm the lists are given already
    statesCA = []
    for z in range(CAunitsZ):
        for y in range(CAunitsY):
            for x in range(CAunitsX):
                k = r.random()
                if k > 10 and (x == 0 or y == 0 or x == CAunitsX-1 or y == CAunitsY-1):
                    # would create a void cell on the outer layer of the CA
                    # model; note that r.random() < 1, so with the k > 10 test
                    # this branch never fires and void cells are effectively
                    # disabled here
                    statesCA.append(statesCAall[3])
                else:
                    # assigns states to the other cells
                    s = r.randrange(0, 3)
                    statesCA.append(statesCAall[s])
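    # statesCAall is defined earlier in the listing; from its use here,
    # indices 0-2 appear to correspond to the "living", "working" and
    # "resting" states and index 3 to "void". The CA state pattern is fixed
    # once per run, so every individual and neuron shares the same body-plan
    # layout and only the room dimensions and offsets evolve.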
    #####################################
    ##########  GA Body Plan  ###########
    #####################################
    EVOLUTION = True
    while (EVOLUTION == True):
        newPopcount = len(population)
        for g in range(GENERATIONS):
            print "generation", g+1, ":"
            myFile.write("Generation: ")
            myFile.write(str(g+1) + "\n")
            # Reset values
            totalFitness = 0
            uvCurves = []
            uPoints = []
            uPointsInd = []
            neuron = []
            input = []  # Inputs (samples)
            allIndivNeurons = []
            rs.EnableRedraw(False)
            finished = False
            oldPopcount = newPopcount
            newPopcount = len(population)
            if newPopcount > 1:
                # sets the radius according to the SOM size, which in turn
                # depends on the length of each generation
                UMAX = oldPopcount  # map U size of the next generations
                VMAX = oldPopcount  # map V size of the next generations
                # calculates the distance along Y between generations
                BetwGens = (VMAX + 2)*1.5*(CAunitsY*widthCA + widthCA*2) + BetwGens
                step = round(200/newPopcount) - 10
                # Create the generation
                for i in range(newPopcount):
                    # Look at the current population, sum up all of the fitness
                    # values and check whether an individual has any fitness
                    population[i].decode()
                    population[i].drawBodyplan(i, g, finished, statesCA, BetwGens)
                    # record fitness values as points at certain heights above
                    # the individuals; the fitness surface uses these points to
                    # visualise how the fitness changes across the SOM
                    x = population[i].originIndiv[0] + CAunitsX*widthCA/2
                    y = population[i].originIndiv[1] + CAunitsY*widthCA/2
                    z = population[i].originIndiv[2] - 300 + population[i].fitness/5
                    upt = rs.AddPoint(x, y, z)
                    uPointsInd.append(upt)
                    rs.EnableRedraw(False)
                    rs.ZoomExtents()
                    if g == 0:
                        colour = [30+i*step, 30+(newPopcount-i)*step, 200]
                        population[i].colour = colour
                        rs.ObjectColor(population[i].guid, colour)
                        if population[i].fitness > 0:
                            # check with the designer whether to keep the fit individual
                            rs.SelectObjects(population[i].guid)
                            rs.ZoomSelected()
                            rs.UnselectAllObjects()
                            choice = rs.GetInteger("keep this original individual? If yes, enter 1, if not, enter 2")
                            if choice == 1:
                                myText1, myText2 = population[i].dispalyText(i)
                                rs.ZoomExtents()
                            else:
                                HandPickingIndiv(population, colour, i, g, finished, statesCA, BetwGens)
                        else:
                            HandPickingIndiv(population, colour, i, g, finished, statesCA, BetwGens)
                    else:
                        myText1, myText2 = population[i].dispalyText(i)
                        # a colour is given to each individual only in the first
                        # generation. The colours of the individuals of the
                        # following generations are inherited in parts from
                        # their parents
                        rs.ObjectColor(population[i].guid, population[i].colour)
                ###############################
                #########    SOM    ###########
                ###############################
                # After the individuals are generated, initialise the neural
                # map that will later be trained to achieve parameters close
                # to the individuals of the current generation
                rs.EnableRedraw(False)
                for u in range(int(round(len(population)*1.5))):
                    vDom = []
                    for v in range(int(round(len(population)*1.5))):
                        # generate initial vector parameters for each individual in the map
                        vec = []
                        for i in range(FNUM):
                            # the vectors of the initial map neurons can be
                            # either zero or any number from the decode list.
                            # If they are all zeroes, then as the map trains it
                            # is easier to observe the "growth" of the model as
                            # it tries to match an input
                            vecParam = 0  # r.randrange(0, decodeNum)
                            vec.append(vecParam)
                        pos = [u*uSPACE, v*vSPACE + BetwGens + widthCA*CAunitsY*2, 0]
                        limbo = vec[:]
                        id = [u, v]
                        dist = 0
                        winnersList = []
                        dist2winner = []
                        closestWInner = []
                        colour = [0, 0, 0]
                        objNeuron = Neuron(pos, vec, limbo, id, winnersList, dist2winner, closestWInner, colour)
                        vDom.append(objNeuron)
                    neuron.append(vDom)
                rs.EnableRedraw(False)
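                # The map has round(len(population)*1.5) neurons per side, so
                # the SOM always contains more neurons than there are inputs.
                # The surplus gives every input room to claim a winner of its
                # own plus a neighbourhood of similar neurons, which is what
                # later becomes its cluster.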
                # Initialise the inputs
                rs.EnableRedraw(False)
                for i in range(len(population)):
                    pos = population[i].originIndiv[:]
                    vec = population[i].values[:]
                    guid = population[i].guid[:]
                    colour = population[i].colour
                    objInput = Input(pos, vec, guid, colour)
                    input.append(objInput)
                rs.EnableRedraw(False)
                rs.EnableRedraw(True)
                rs.ZoomExtents()
                # train the neurons on the map
                runSOM(neuron, input, statesCA, g, population, UMAX, VMAX)
                # take all neurons whose fitness is above zero, convert them to
                # individuals and append them to the list allIndivNeurons
                i = 0
                for v in range(int(round(len(population)*1.5))):
                    uPoints = []
                    for u in range(int(round(len(population)*1.5))):
                        # assess the neuron's fitness
                        neuron[u][v].assessNeuronFitness(g)
                        # record fitness values as points at certain heights
                        # above the neurons; the fitness surface uses these
                        # points to visualise how the fitness changes across the SOM
                        x = neuron[u][v].pos[0] + CAunitsX*widthCA/2
                        y = neuron[u][v].pos[1] + CAunitsY*widthCA/2
                        z = neuron[u][v].pos[2] - 300 + neuron[u][v].fitness/5
                        upt = rs.AddPoint(x, y, z)
                        uPoints.append(upt)
                        if neuron[u][v].fitness != 0:
                            colour = [0, 0, 0]
                            allIndivNeurons.append(Individual(i, colour))
                            allIndivNeurons[i].values = neuron[u][v].vec[:]
                            allIndivNeurons[i].originIndiv = neuron[u][v].pos[:]
                            allIndivNeurons[i].fitness = neuron[u][v].fitness
                            allIndivNeurons[i].guid = neuron[u][v].guid[:]
                            allIndivNeurons[i].colour = neuron[u][v].colour
                            allIndivNeurons[i].grid = neuron[u][v].grid
                            # Display the index as text
                            neuron[u][v].dispalyNeuronText(i)
                            allIndivNeurons[i].id = i
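                            # Why the reverse decode below is needed: SOM
                            # training leaves each neuron with real-valued
                            # features, but the GA can only cross over binary
                            # chromosomes. Every value is therefore snapped to
                            # the closest even integer (the decodable values
                            # are all even, see binaryDecode) and matched
                            # against allValueslist to recover the gene bits
                            # that encode it.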
                            # Reverse decode from values to a chromosome for
                            # each neuron of the SOM of each generation
                            allIndivNeurons[i].chromosome = []
                            for p in range(len(allIndivNeurons[i].values)):
                                for b in range(len(allValueslist)):
                                    # each neuron's value is rounded to the
                                    # closest even number so a binary chromosome
                                    # can be decoded from it
                                    roundedValue = closestEven(allIndivNeurons[i].values[p])
                                    allIndivNeurons[i].values[p] = roundedValue
                                    if allIndivNeurons[i].values[p] == allValueslist[b]:
                                        for k in range(geneLength):
                                            # gene by gene, bit by bit, creates
                                            # a binary chromosome decoded from
                                            # the neuron's values
                                            allIndivNeurons[i].chromosome.append(allGeneslist[b][k])
                            i += 1
                    uCrv = rs.AddCurve(uPoints, 1)
                    rs.DeleteObjects(uPoints)
                    uvCurves.append(uCrv)
                    rs.ObjectLayer(uCrv, "fitness curves")
                #fitnSurf = rs.AddLoftSrf(uvCurves)
                #######################################################
                ###########   GA: Selection and Crossover   ##########
                #######################################################
                rs.EnableRedraw(True)
                # CLUSTERS
                # a. separate the list allIndivNeurons into lists of individuals
                #    that form clusters and add an original "teaching" individual
                #    from the current generation (neurons are distinguished by
                #    their colours)
                # b. calculate the total fitness of each cluster and store it in
                #    self.clusterFitness for each individual in the generation
                for i in range(len(population)):
                    population[i].clusterIndList = []
                    population[i].clusterFitness = 0
                parents = []
                clustersLists(allIndivNeurons, population, g, statesCA, BetwGens, parents)
                # SELECTION
                if (g != GENERATIONS-1):
                    for i in range(len(population)):
                        clusterPop = population[i].clusterIndList[:]
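                        # Three selection strategies are available via the
                        # global selectionChoice: 1 runs a Goldberg roulette
                        # wheel inside each cluster, 2 performs optimised
                        # random selection among the fittest cluster members,
                        # and 3 lets the designer pick the parents by hand, so
                        # aesthetic judgement can override the numeric fitness.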
                        # GOLDBERG ROULETTE SELECTION among the fittest:
                        # spin the wheel once to select one future parent from each cluster
                        if selectionChoice == 1:
                            if len(clusterPop) == 1:
                                parent = clusterPop[0]
                            if len(clusterPop) > 1:
                                parentIndex = roulette(clusterPop, population[i].clusterFitness)
                                parent = clusterPop[parentIndex]
                            if len(clusterPop) == 0:
                                print "cluster has no individuals"
                            # mark the parents on the map
                            markParents(parent)
                            parents.append(parent)
                        # RANDOM OPTIMISED SELECTION among the fittest:
                        # filter the cluster with the min fitness cut-off limit,
                        # generate couples and pick one randomly from each cluster
                        if selectionChoice == 2 or selectionChoice == 3:
                            if len(clusterPop) != 0:
                                totClFit = 0
                                ln = len(clusterPop)
                                for s in range(ln):
                                    totClFit = totClFit + clusterPop[s].fitness
                                avFit = totClFit/ln
                                minFit = minFitLevel*avFit/100 + avFit
                                tempNIList = []
                                for j in range(ln):
                                    if clusterPop[j].fitness >= minFit:
                                        tempNIList.append(clusterPop[j])
                                # if the original individual has fitness above
                                # minFit, make it part of the cluster and give
                                # it its own cluster id
                                if population[i].fitness > minFit:
                                    if len(tempNIList) != 0:
                                        ind = len(tempNIList)-1
                                        lastID = tempNIList[ind].id + 1
                                    else:
                                        lastID = 0
                                    population[i].id = lastID
                                    loc = [population[i].originIndiv[0] + 20, population[i].originIndiv[1]-10, population[i].originIndiv[2]]
                                    txt = "cluster id", str(lastID)
                                    myText = rs.AddText(txt, loc, 5)
                                    rs.ObjectColor(myText, population[i].colour)
                                    population[i].clusterIndList.append(population[i])
                                    allIndivNeurons.append(population[i])
                                    tempNIList.append(population[i])
                                    population[i].clusterFitness += population[i].fitness
                                clusterPop = tempNIList[:]
                                ln = len(clusterPop)
                                # if there are two or more fit enough individuals in the cluster
                                if ln >= 2:
                                    if selectionChoice == 2:
                                        # creates a list of couples from the current cluster population
                                        couplesList = []
                                        couple = []
                                        tempList = clusterPop[:]
                                        ln1 = int(round((ln-0.5)/2))
                                        for j in range(ln1):
                                            ln2 = len(tempList)-1
                                            ind = r.randrange(0, ln2)
                                            couple.append(tempList[ind])
                                            tempList.pop(ind)
                                            if ln2 != 1:
                                                ind = r.randrange(0, ln2-1)
                                            else:
                                                ind = 0
                                            couple.append(tempList[ind])
                                            tempList.pop(ind)
                                            couplesList.append(couple)
                                            couple = []
                                        # pick a random couple from the couples list
                                        coupleIndex = r.randrange(0, len(couplesList))
                                        dad = couplesList[coupleIndex][0]
                                        mum = couplesList[coupleIndex][1]
                                    # ARTIFICIAL SELECTION:
                                    # filter the cluster with the min fitness
                                    # cut-off limit, apply the elimination
                                    # criteria if necessary, and manually pick
                                    # the candidates from each cluster
                                    if selectionChoice == 3:
                                        for t in range(len(clusterPop)):
                                            rs.SelectObjects(clusterPop[t].guid)
                                        rs.ZoomSelected()
                                        mumId = rs.GetInteger("please choose mum's id from the selected models")
                                        dadId = rs.GetInteger("please choose dad's id from the selected models")
                                        rs.UnselectAllObjects()
                                        for ind in range(len(clusterPop)):
                                            if clusterPop[ind].id == mumId:
                                                mum = clusterPop[ind]
                                            if clusterPop[ind].id == dadId:
                                                dad = clusterPop[ind]
                                        rs.ZoomExtents()
                                    # mark the parents on the map
                                    #parents.append(dad)
                                    parents.append(mum)
                                # if there is one fit enough individual in the cluster
                                if ln == 1:
                                    parent = clusterPop[0]
                                    parents.append(parent)
                                # if there are no fit enough individuals in the cluster
                                if ln == 0:
                                    # create a fresh random individual with a
                                    # fitness above the minimum fitness
                                    parent = createFreshInd(minFit, population, i, allIndivNeurons, g, statesCA, BetwGens, parents)
                                    parents.append(parent)
                                    # record fitness values as points at certain
                                    # heights above the individuals; the fitness
                                    # surface uses these points to visualise how
                                    # the fitness changes across the SOM
                                    x = parent.originIndiv[0] + CAunitsX*widthCA/2
                                    y = parent.originIndiv[1] + CAunitsY*widthCA/2
                                    z = parent.originIndiv[2] - 300 + parent.fitness/5
                                    upt = rs.AddPoint(x, y, z)
                                    uPointsInd.append(upt)
                            else:
                                minFit = 0
                                parent = createFreshInd(minFit, population, i, allIndivNeurons, g, statesCA, BetwGens, parents)
                                parents.append(parent)
                                # record fitness values as points at certain
                                # heights above the individuals; the fitness
                                # surface uses these points to visualise how
                                # the fitness changes across the SOM
                                x = parent.originIndiv[0] + CAunitsX*widthCA/2
                                y = parent.originIndiv[1] + CAunitsY*widthCA/2
                                z = parent.originIndiv[2] - 300 + parent.fitness/5
                                upt = rs.AddPoint(x, y, z)
                                uPointsInd.append(upt)
                    # PARENTS CROSSOVER:
                    # the parents are chosen from the GA population and its SOM.
                    # However, the offspring replace the individuals of the GA's
                    # generation, so the next SOM is built on the evolved generation
                    newPop = []
                    # couples are formed randomly from the list of the parents
                    # and crossed over, making sure that the number of offspring
                    # does not exceed the maximum allowed POPmax
                    noParents = len(parents)
                    lnth = int(noParents/2)
                    parentsTemp = parents[:]
                    count = 0
                    for q in range(lnth):
                        if len(newPop) <= POPmax:
                            count = count + 1
                            ln = len(parentsTemp)
                            ind = r.randrange(0, ln)
                            dad = parentsTemp[ind]
                            parentsTemp.pop(ind)
                            markParents(dad)
                            ln = len(parentsTemp)
                            ind = r.randrange(0, ln)
                            mum = parentsTemp[ind]
                            parentsTemp.pop(ind)
                            markParents(mum)
                            for j in range(len(parents)):
                                if parents[j] == dad:
                                    dad.id = j
                                if parents[j] == mum:
                                    mum.id = j
                            # append both offspring to the temporary new population list
                            offspring1, offspring2 = crossover(mum.id, dad.id, parents)
                            newPop.append(offspring1)
                            newPop.append(offspring2)
                # Display the fittest among all individuals and neurons
                if len(allIndivNeurons) > 0:
                    bestFitness, totFitness = showBest(allIndivNeurons, myFile, statesCA, g, bestPts)
                    aveFitness = totFitness/len(allIndivNeurons)
                    print "the average fitness of the generation and its SOM is", aveFitness
                    print "the best fitness is ", bestFitness
                    print "there are", len(allIndivNeurons), "fit individuals"
                    #print "there are", len(newPop), "parents"
                    if len(parents) == 4:
                        print "The crossover is performed only between the individuals of the current generation", g + 1
                    myFile.write(" the average fitness of the generation and its SOM is: ")
                    myFile.write(str(aveFitness) + "\n")
                    myFile.write(" number of fit individuals: ")
                    myFile.write(str(len(allIndivNeurons)) + "\n")
                    myFile.write(" number of parents: ")
                    myFile.write(str(len(parents)) + "\n")
                    if totFitness == 0:
                        print "Evolution died out"
                        EVOLUTION = False
                # MUTATION
                #if selectionChoice == 1:
                    #for j in range(len(newPop)):
                        #if r.random() < MUTATION_RATE:
                            #newPop[j].mutate()
                # copy the temporary new population list to the original population list
                if (g != GENERATIONS-1):
                    population = newPop[:]
                else:
                    EVOLUTION = False
                # draws a fitness curve of the individuals of the current generation
                uCrv = rs.AddCurve(uPointsInd, 1)
                rs.DeleteObjects(uPointsInd)
                uvCurves.append(uCrv)
                rs.ObjectLayer(uCrv, "fitness curves")
                rs.EnableRedraw(True)
                rs.ZoomExtents()
    bestPts[0][1] = bestPts[0][1] - widthCA
    bestPts[len(bestPts)-1][1] = bestPts[len(bestPts)-1][1] + CAunitsY*widthCA + widthCA
    rs.AddCurve(bestPts, 1)
    for i in range(len(bestPts)):
        bestPts[i][0] = bestPts[i][0] + CAunitsY*widthCA + widthCA*2
    rs.AddCurve(bestPts, 1)
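# HandPickingIndiv() below is the designer-in-the-loop step: it keeps
# generating random individuals until one both clears the hard-coded fitness
# bar of 850 and is accepted by the designer at the prompt, and then writes
# that individual into the population slot it was asked to fill.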
def HandPickingIndiv(population, colour, i, g, finished, statesCA, BetwGens):
    if population[i].guid is not None:
        rs.DeleteObjects(population[i].guid)
    if population[i].grid is not None:
        rs.DeleteObjects(population[i].grid)
    fitFound = False
    while fitFound == False:
        individual = Individual(i, colour)
        individual.decode()
        individual.drawBodyplan(i, g, finished, statesCA, BetwGens)
        myText1, myText2 = individual.dispalyText(i)
        objs = [myText1, myText2]
        rs.EnableRedraw(True)
        if individual.fitness > 850:
            rs.ZoomExtents()
            # check with the designer whether to keep the new individual
            rs.SelectObjects(objs)
            rs.SelectObjects(individual.guid)
            rs.SelectObjects(individual.grid)
            rs.ZoomSelected()
            rs.UnselectAllObjects()
            choice = rs.GetInteger("keep this new individual? If yes, enter 1, if not, enter 2")
            if choice == 1:
                fitFound = True
                population[i] = individual
                population[i].colour = colour
            else:
                rs.DeleteObjects(objs)
                rs.DeleteObjects(individual.guid)
                rs.DeleteObjects(individual.grid)
        else:
            rs.DeleteObjects(objs)
            rs.DeleteObjects(individual.guid)
            rs.DeleteObjects(individual.grid)


def markParents(parent):
    rad = CAunitsX*widthCA
    pos = [parent.originIndiv[0]+rad/2, parent.originIndiv[1]+rad/2, parent.originIndiv[2]]
    circle = rs.AddCircle(pos, rad*0.8)
    hatch = rs.AddHatch(circle, rs.CurrentHatchPattern())
    rs.ObjectColor(circle, parent.colour)
    # the hatch is drawn in the parent's colour lightened by 70%
    R = (255 - parent.colour[0])*0.7 + parent.colour[0]
    G = (255 - parent.colour[1])*0.7 + parent.colour[1]
    B = (255 - parent.colour[2])*0.7 + parent.colour[2]
    if hatch is not None:
        rs.ObjectColor(hatch, [R, G, B])


def clustersLists(allIndivNeurons, population, g, statesCA, BetwGens, parents):
    minFit = 0
    # separate the list allIndivNeurons into lists of individuals that form
    # clusters and add an original "teaching" individual from the current
    # generation (neurons are distinguished by their colours)
    for i in range(len(population)):
        for j in range(len(allIndivNeurons)):
            if population[i].colour == allIndivNeurons[j].colour:
                population[i].clusterIndList.append(allIndivNeurons[j])
                # calculate the total fitness of each cluster and store it in
                # self.clusterFitness for each individual in the generation
                population[i].clusterFitness += allIndivNeurons[j].fitness
        # if the individual of the current population has fitness, make it
        # part of its own cluster and give it its own cluster id
        if population[i].fitness > minFit:
            population[i].clusterIndList.append(population[i])
            allIndivNeurons.append(population[i])
            population[i].clusterFitness += population[i].fitness
    #for i in range(len(population)):
        # if there are no fit models in the cluster, create a brand new
        # individual that has some fitness and make it part of the cluster;
        # position it alongside the original individuals of the current population
        #if len(population[i].clusterIndList) == 0:
            # add an individual from the current generation to each cluster's list
            #createFreshInd(minFit, population, i, allIndivNeurons, g, statesCA, BetwGens, parents)
        #print "there is/are", len(population[i].clusterIndList), "fit individual/s in cluster", i


def createFreshInd(minFit, list, id, allIndivNeurons, g, statesCA, BetwGens, parents):
    # creates a fresh random individual with a fitness above the specified
    # minFit level, provided the number of parents does not exceed the
    # specified limit POPmax
    fitFound = False
    while fitFound == False:
        colour = [r.randrange(100, 200), r.randrange(100, 200), r.randrange(170, 200)]
        finished = False
        freshID = len(list)
        freshInd = Individual(freshID, colour)
        freshInd.decode()
        freshInd.drawBodyplan(freshID, g, finished, statesCA, BetwGens)
        if freshInd.fitness > minFit:
            myText1, myText2 = freshInd.dispalyText(freshID)
            list[id].clusterIndList.append(freshInd)
            allIndivNeurons.append(freshInd)
            list.append(freshInd)
            fitFound = True
        else:
            if freshInd.guid is not None:
                rs.DeleteObjects(freshInd.guid)
            if freshInd.grid is not None:
                rs.DeleteObjects(freshInd.grid)
    rs.ZoomExtents()
    return freshInd
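# runSOM() below drives the training: each cycle (1) finds the winning neuron
# for every input, (2) lets every neuron organise itself relative to that
# winner and redraws the map, and (3) decays the learning rates and the
# neighbourhood radius. Training stops once WINLEARN drops below the global
# WINLEARN_RATE, i.e. once the map has effectively stopped moving.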
def runSOM(neuron, input, statesCA, g, population, UMAX, VMAX):
    # Initialise the parameters
    WINLEARN = 0.98  # winner learning strength
    LEARN = 0.95     # others' learning strength
    RADIUS = m.sqrt(m.pow((UMAX*uSPACE), 2) + m.pow((VMAX*vSPACE), 2))
    NEIGH = RADIUS
    CONVERGED = False
    cycles = 0
    win = []
    closestWinner = []
    objects = []
    # Now keep going until the system has converged
    while (CONVERGED == False):
        for u in range(int(round(len(population)*1.5))):
            for v in range(int(round(len(population)*1.5))):
                neuron[u][v].winnersList = []
                neuron[u][v].dist2winner = []
                neuron[u][v].closestWInner = []
        # 1. Find the winner and organise
        for i in range(len(population)):
            win = input[i].findWinner(neuron, population)
            for u in range(int(round(len(population)*1.5))):
                for v in range(int(round(len(population)*1.5))):
                    neuron[u][v].organise(neuron, input, i, win, WINLEARN, LEARN, NEIGH)
        # 2. Update the map
        rs.EnableRedraw(False)
        if objects != []:
            rs.DeleteObjects(objects)
        objects = []
        for u in range(int(round(len(population)*1.5))):
            for v in range(int(round(len(population)*1.5))):
                circle = neuron[u][v].update(g, statesCA, population, neuron)
                if circle != None:
                    objects.append(circle)
        # 3. Update the parameters (float division, so the decay is gradual)
        cycles += 1
        WINLEARN = WINLEARN*(1 - (cycles/600.0))  # 0.98
        LEARN = LEARN*(1 - (cycles/400.0))  # 0.95
        NEIGH = RADIUS*(1 - (cycles/100.0))  # 0.95
        if WINLEARN < WINLEARN_RATE:
            CONVERGED = True
        rs.EnableRedraw(True)
        #print WINLEARN
        rs.ZoomExtents()


def binaryList(n):
    list = []
    for perm in getPerms(n):
        gene = map(int, perm)
        list.append(gene)
    return list


def getPerms(n):
    for i in getCandidates(n):
        for perm in set(permutations(i)):
            yield ''.join(perm)


def getCandidates(n):
    for i in range(1, n):
        res = "1"*i + "0"*(n - i)
        yield res


def binaryDecode(allGeneslist):
    localList = []
    for g in range(len(allGeneslist)):
        thisValue = 0
        for i in range(geneLength):
            if allGeneslist[g][i] == 1:
                thisValue += m.pow(2, geneLength-i)
        localList.append(thisValue)
    return localList
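# Note on the encoding: binaryList/getPerms/getCandidates enumerate every
# geneLength-bit string, and binaryDecode turns each one into a value by
# summing 2^(geneLength - i) for every set bit i. With geneLength = 4, for
# example, the gene [1, 0, 1, 0] decodes to 2^4 + 2^2 = 20. Every decodable
# value is therefore an even integer, which is why closestEven() below snaps
# a neuron's real-valued feature to the nearest even number before the
# reverse decode looks it up.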
def closestEven(decimal):
    c = round(decimal)
    d = c/2
    if d.is_integer():
        return c
    else:
        if decimal > c:
            c = c + 1
            return c
        else:
            c = c - 1
            return c


def crossover(mumIndex, dadIndex, oldPop):
    Dcol = oldPop[dadIndex].colour
    Mcol = oldPop[mumIndex].colour
    # Make a copy of ourselves
    dad = Individual(dadIndex, Dcol)
    mum = Individual(mumIndex, Mcol)
    dad.chromosome = oldPop[dadIndex].chromosome[:]
    mum.chromosome = oldPop[mumIndex].chromosome[:]
    # Single-point splice (not right at the ends)
    splice = r.randrange(1, dad.chromLength-1)
    # Get the left and right bits for the dad
    parent01_left = dad.chromosome[0:splice]
    parent01_right = dad.chromosome[splice:dad.chromLength]
    # Get the left and right bits for the mum
    parent02_left = mum.chromosome[0:splice]
    parent02_right = mum.chromosome[splice:mum.chromLength]
    # Now make the children and update the chromosomes for mum and dad.
    # Essentially, mum and dad turn into their children - ahem...
    dad.chromosome = parent01_left + parent02_right
    mum.chromosome = parent02_left + parent01_right
    # the offspring inherit the averaged colours of their parents,
    # split in a randomly chosen proportion
    R = (dad.colour[0] + mum.colour[0])/2
    G = (dad.colour[1] + mum.colour[1])/2
    B = (dad.colour[2] + mum.colour[2])/2
    dColSplice = r.random()
    mColSplice = 1 - dColSplice
    dad.colour = [R*dColSplice, G*dColSplice, B]
    mum.colour = [R*mColSplice, G*mColSplice, B]
    # Return these new offspring
    return dad, mum


def roulette(oldPop, totalFitness):
    # Goldberg's roulette wheel (or pie chart) as described by Mitchell,
    # An Introduction to Genetic Algorithms, p. 166.
    # First find a position on the pie chart
    myRandom = r.random()*totalFitness
    fitSum = 0.0
    # Now keep cycling through the pie until you reach the correct member
    for i in range(len(oldPop)):
        fitSum += oldPop[i].fitness
        if (fitSum > myRandom):
            return i
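# Worked example of the roulette: with cluster fitnesses [200, 300, 500] the
# total is 1000, so r.random()*totalFitness lands in [0, 1000). A draw of 620
# walks the running sum 200 -> 500 -> 1000 and returns index 2; the chance of
# picking an individual is proportional to its share of the total fitness.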
def showBest(allIndivNeurons, myFile, statesCA, gen, bestPts):
    totFitness = 0
    bestFitness = 0
    bestID = 0
    for i in range(len(allIndivNeurons)):
        if (allIndivNeurons[i].fitness > bestFitness):
            bestID = i
            bestFitness = allIndivNeurons[i].fitness
        totFitness += allIndivNeurons[i].fitness
    # copy the fittest model to the side of the map, coloured by unit state
    for i in range(len(allIndivNeurons[bestID].guid)):
        unit = allIndivNeurons[bestID].guid[i]
        if statesCA[i] == "living":
            rs.ObjectColor(unit, [204, 51, 51])
        if statesCA[i] == "working":
            rs.ObjectColor(unit, [51, 51, 204])
        if statesCA[i] == "resting":
            rs.ObjectColor(unit, [51, 204, 51])
        start = allIndivNeurons[bestID].originIndiv
        end = [-210, 1000 + gen*(CAunitsY*widthCA + 80), 0]
        transl = rs.VectorSubtract(end, start)
        rs.CopyObject(unit, transl)
    pt = [end[0] - widthCA, end[1], end[2] - 300 + allIndivNeurons[bestID].fitness/5]
    bestPts.append(pt)
    for i in range(len(allIndivNeurons[bestID].grid)):
        unit = allIndivNeurons[bestID].grid[i]
        start = allIndivNeurons[bestID].originIndiv
        end = [-210, 1000 + gen*(CAunitsY*widthCA + 80), 0]
        transl = rs.VectorSubtract(end, start)
        rs.CopyObject(unit, transl)
    # display text
    loc1 = [end[0], end[1]-10, end[2]]
    myText = rs.AddText(str(gen), loc1, 5)
    rs.ObjectColor(myText, [100, 100, 100])
    loc2 = [end[0], end[1]-3, end[2]]
    myText = rs.AddText(str(allIndivNeurons[bestID].fitness), loc2, 2)
    rs.ObjectColor(myText, [255, 0, 0])
    # Write to the text file
    myFile.write(" the best fitness is: ")
    myFile.write(str(allIndivNeurons[bestID].fitness) + "\n")
    return bestFitness, totFitness


if __name__ == "__main__":
    main()