Neural Networks for Optimization William J. Wolfe California State University Channel Islands


Page 1:

Neural Networks for Optimization

William J. Wolfe

California State University Channel Islands

Page 2:

Neural Models

• Simple processing units
• Lots of them
• Highly interconnected
• Exchange excitatory and inhibitory signals
• Variety of connection architectures/strengths
• "Learning": changes in connection strengths
• "Knowledge": connection architecture
• No central processor: distributed processing

Page 3:

Simple Neural Model

• a_i: activation
• e_i: external input
• w_ij: connection strength

Assume w_ij = w_ji ("symmetric" network): W = (w_ij) is a symmetric matrix.

[Figure: units i and j with activations a_i, a_j, external inputs e_i, e_j, connected by weight w_ij]

Page 4:

Net Input

[Figure: units i and j connected by w_ij, with external input e_i]

net_i = Σ_j w_ij a_j + e_i

In matrix form: net = W a + e
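The net-input computation can be sketched in a few lines of NumPy (the weights, activations, and inputs below are illustrative values, not from the slides):

```python
import numpy as np

# Symmetric weight matrix (w_ij = w_ji), activations, and external inputs
# for a 3-unit network; values are illustrative only.
W = np.array([[ 0.0, -1.0,  0.5],
              [-1.0,  0.0, -1.0],
              [ 0.5, -1.0,  0.0]])
a = np.array([0.2, 0.8, 0.5])
e = np.array([0.1, 0.1, 0.1])

# net_i = sum_j w_ij * a_j + e_i, or in matrix form net = W a + e
net = W @ a + e   # [-0.45, -0.6, -0.6]
```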

Page 5:

Dynamics

• Basic idea: each activation follows its net input:

  net_i > 0  →  a_i increases
  net_i < 0  →  a_i decreases

da_i/dt = net_i

Page 6:

Energy

E(a) = -(1/2) a^T W a - e^T a
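A minimal sketch of this energy function (the 2-unit network and its values are illustrative, not from the slides):

```python
import numpy as np

def energy(W, a, e):
    """Hopfield-style energy E(a) = -1/2 a^T W a - e^T a."""
    return -0.5 * a @ W @ a - e @ a

# Illustrative 2-unit mutually inhibitory network.
W = np.array([[ 0.0, -1.0],
              [-1.0,  0.0]])
e = np.array([0.5, 0.5])

# A corner with a single winner has lower energy than the all-on corner.
print(energy(W, np.array([1.0, 0.0]), e))  # -0.5
print(energy(W, np.array([1.0, 1.0]), e))  # 0.0
```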

Page 7:

grad(E) = (∂E/∂a_1, ..., ∂E/∂a_n)

-∂E/∂a_j = Σ_i w_ij a_i + e_j = net_j,   j = 1, ..., n   (using w_ij = w_ji)

so -grad(E) = W a + e = net

Page 8:

Lower Energy

• da/dt = net = -grad(E) seeks lower energy

[Figure: energy curve; the state a moves along net = -grad(E) toward lower energy]
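The claim that net = -grad(E) can be checked numerically against a finite-difference gradient (random symmetric weights; a sketch, not the author's code):

```python
import numpy as np

def energy(W, a, e):
    """E(a) = -1/2 a^T W a - e^T a, as on the Energy slide."""
    return -0.5 * a @ W @ a - e @ a

rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
W = (M + M.T) / 2                  # random symmetric weights, as the model assumes
np.fill_diagonal(W, 0.0)
a = rng.random(n)
e = rng.random(n)

net = W @ a + e                    # analytic: net = -grad(E)

# Central finite differences as an independent check.
h = 1e-6
num_grad = np.array([
    (energy(W, a + h * np.eye(n)[i], e) - energy(W, a - h * np.eye(n)[i], e)) / (2 * h)
    for i in range(n)
])
assert np.allclose(-num_grad, net, atol=1e-5)
```

The check is exact only because W is symmetric; with an asymmetric W, -grad(E) and net would differ.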

Page 9:

Problem: Divergence

[Figure: energy decreasing without bound; the activations diverge]

Page 10:

A Fix: Saturation

da_i/dt = net_i a_i (1 - a_i),   0 ≤ a_i ≤ 1

The net_i factor seeks lower energy; the a_i (1 - a_i) factor is corner-seeking.

Page 11:

Keeps the activation vector inside the hypercube boundaries

[Figure: energy along one coordinate, restricted to 0 ≤ a ≤ 1]

da_i/dt = net_i a_i (1 - a_i)

The net_i factor seeks lower energy; the a_i (1 - a_i) factor is corner-seeking.

Encourages convergence to corners
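A short Euler-integration sketch of the saturated dynamics, using an illustrative 2-unit mutually inhibitory network (with e = 1/2 a single winner should emerge):

```python
import numpy as np

# Euler integration of da_i/dt = net_i * a_i * (1 - a_i) for a 2-unit
# mutually inhibitory network (illustrative values; e = 1/2 gives k = 1).
W = np.array([[ 0.0, -1.0],
              [-1.0,  0.0]])
e = np.array([0.5, 0.5])
a = np.array([0.6, 0.4])          # unit 0 starts with a slight advantage
dt = 0.05
for _ in range(2000):
    net = W @ a + e
    a += dt * net * a * (1.0 - a)

# The a_i(1 - a_i) factor keeps a inside [0, 1]; the state converges to
# the corner (1, 0): one winner.
```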

Page 12:

Summary: The Neural Model

da_i/dt = net_i a_i (1 - a_i),   0 ≤ a_i ≤ 1

net_i = Σ_j w_ij a_j + e_i

a_i: activation
e_i: external input
w_ij: connection strength
W = (w_ij) symmetric (w_ij = w_ji)

Page 13:

Example: Inhibitory Networks

• Completely inhibitory
  – w_ij = -1 for all i ≠ j
  – k-winner

• Inhibitory Grid
  – neighborhood inhibition
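A sketch of the k-winner behavior, combining the saturated dynamics with w_ij = -1 off the diagonal and the external input e = k - 1/2 used on the later slides (the initial activations are illustrative):

```python
import numpy as np

def k_winner(init, k, steps=20000, dt=0.02):
    """Completely inhibitory network (w_ij = -1 for i != j) run with the
    saturated dynamics and external input e = k - 1/2."""
    n = len(init)
    W = -(np.ones((n, n)) - np.eye(n))   # -1 everywhere except the diagonal
    e = k - 0.5
    a = np.array(init, dtype=float)
    for _ in range(steps):
        net = W @ a + e
        a += dt * net * a * (1.0 - a)
    return a

# The k units with the largest initial activations should win here.
a = k_winner([0.30, 0.70, 0.45, 0.60], k=2)
```

With e = k - 1/2, a corner with exactly k winners is the stable outcome: each winner still receives positive net input and each loser negative.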

Page 14:

Traveling Salesman Problem

• Classic combinatorial optimization problem

• Find the shortest “tour” through n cities

• n!/(2n) distinct tours
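The n!/(2n) count treats tours that differ only by starting city or direction as the same; brute-force enumeration confirms it for small n:

```python
from itertools import permutations
from math import factorial

def distinct_tours(cities):
    """Enumerate tours up to rotation (fix the first city) and direction
    (keep one of each forward/reverse pair)."""
    first, rest = cities[0], cities[1:]
    tours = set()
    for perm in permutations(rest):
        forward = (first,) + perm
        backward = (first,) + perm[::-1]
        tours.add(min(forward, backward))
    return tours

cities = ("A", "B", "C", "D", "E")
tours = distinct_tours(cities)
# n = 5 gives 5!/(2*5) = 12 distinct tours.
```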

[Figure: five cities A–E; two different tours, ABCED and ABECD]

Page 15:

TSP

50 City Example

Page 16:

Random

Page 17:

Nearest-City
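For reference, a minimal sketch of the nearest-city heuristic shown on this slide (not the author's implementation; the sample points are illustrative):

```python
import math

def nearest_city_tour(points, start=0):
    """Greedy nearest-city heuristic: from the current city, always move
    to the closest unvisited city."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        cur = points[tour[-1]]
        nxt = min(unvisited, key=lambda j: math.dist(cur, points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

points = [(0, 0), (1, 0), (2, 0), (2, 1)]
print(nearest_city_tour(points))  # [0, 1, 2, 3]
```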

Page 18:

2-OPT
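Likewise, a minimal 2-OPT sketch: repeatedly reverse a tour segment whenever doing so shortens the tour (illustrative, not the author's code):

```python
import math

def tour_length(points, tour):
    """Total length of the closed tour."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(points, tour):
    """2-OPT: reverse a segment whenever that shortens the tour, until no
    improving reversal remains."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour)):
            for j in range(i + 1, len(tour) + 1):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(points, cand) < tour_length(points, tour) - 1e-12:
                    tour, improved = cand, True
    return tour

# A self-crossing square tour gets uncrossed to the perimeter (length 4).
square = [(0, 0), (1, 1), (1, 0), (0, 1)]
best = two_opt(square, [0, 1, 2, 3])
```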

Page 19:

Centroid

Page 20:

Monotonic

Page 21:

Neural Network Approach

[Figure: grid of neurons, one row per city (A–D) and one column per time stop (1–4); each neuron represents "city x visited at time stop i"]

Page 22:

Tours – Permutation Matrices

[Figure: permutation matrix over cities A–D and time stops 1–4]

tour: CDBA

permutation matrices correspond to the “feasible” states.
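Feasibility is easy to check mechanically: the activation matrix must be a permutation matrix (the helper below is an illustrative sketch):

```python
import numpy as np

def is_feasible(A):
    """Feasible states are exactly the permutation matrices: square, 0/1
    entries, and a single 1 in every row and every column."""
    A = np.asarray(A)
    return (A.ndim == 2 and A.shape[0] == A.shape[1]
            and set(np.unique(A)) <= {0, 1}
            and (A.sum(axis=0) == 1).all()
            and (A.sum(axis=1) == 1).all())

# Tour CDBA from the slide: C at stop 1, D at stop 2, B at stop 3, A at stop 4.
tour_CDBA = [[0, 0, 0, 1],   # A
             [0, 0, 1, 0],   # B
             [1, 0, 0, 0],   # C
             [0, 1, 0, 0]]   # D
```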

Page 23:

Not Allowed

[Figure: a state activating more than one city per time stop, which is not allowed]

Page 24:

Only one city per time stop.
Only one time stop per city.

Inhibitory rows and columns.

Page 25:

Distance Connections:

Inhibit the neighboring cities in proportion to their distances.

[Figure: the neuron for city C at one time stop inhibits cities A, B, D at the adjacent time stops with weights -d_AC, -d_BC, -d_DC]

Page 26:

Putting it all together: combine the row/column inhibition (one city per stop, one stop per city) with the distance-weighted inhibition between adjacent time stops.
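One plausible way to assemble these pieces into a single weight matrix (the constants A and B and the indexing scheme are assumptions for illustration, not the slides' values):

```python
import numpy as np

def tsp_weights(d, A=1.0, B=1.0):
    """Assemble a TSP weight matrix from the pieces on the preceding
    slides: -A between units in the same row (city) or column (time stop),
    and -B*d[x][y] between different cities at adjacent time stops.
    Unit (city x, stop i) maps to index x*n + i. A and B are illustrative
    constants, not the slides' values."""
    n = len(d)
    W = np.zeros((n * n, n * n))
    for x in range(n):
        for i in range(n):
            for y in range(n):
                for j in range(n):
                    if (x, i) == (y, j):
                        continue                      # no self-connection
                    w = 0.0
                    if x == y or i == j:
                        w -= A                        # row/column inhibition
                    if x != y and (j - i) % n in (1, n - 1):
                        w -= B * d[x][y]              # distance term, adjacent stops
                    W[x * n + i, y * n + j] = w
    return W

# Symmetric 3-city distance matrix (illustrative).
d = [[0.0, 1.0, 2.0],
     [1.0, 0.0, 1.0],
     [2.0, 1.0, 0.0]]
W = tsp_weights(d)
```

With symmetric distances the resulting W is symmetric, as the model requires.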

Page 27:

Research Questions

• Which architecture is best?
• Does the network produce:
  – feasible solutions?
  – high quality solutions?
  – optimal solutions?

• How do the initial activations affect network performance?

• Is the network similar to “nearest city” or any other traditional heuristic?

• How does the particular city configuration affect network performance?

• Is there any way to understand the nonlinear dynamics?

Page 28:

[Figure: 7 × 7 grid of activations, cities A–G by time stops 1–7]

typical state of the network before convergence

Page 29:

“Fuzzy Readout”

[Figure: fuzzy activations over cities A–G and time stops 1–7, read out as the tour GAECBFD]
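The slides do not spell out the readout rule; one plausible sketch orders cities by their activation-weighted average time stop:

```python
import numpy as np

def fuzzy_readout(A, cities):
    """One plausible 'fuzzy readout': order cities by the activation-
    weighted average of their time-stop positions. Illustrative only;
    the slides do not give the exact rule."""
    A = np.asarray(A, dtype=float)
    stops = np.arange(A.shape[1])
    center = (A * stops).sum(axis=1) / A.sum(axis=1)
    return "".join(cities[i] for i in np.argsort(center))

# A fuzzy (unconverged) 3-city state: B leans early, A middle, C late.
A = [[0.2, 0.7, 0.3],   # A
     [0.8, 0.3, 0.1],   # B
     [0.1, 0.4, 0.9]]   # C
print(fuzzy_readout(A, "ABC"))  # BAC
```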

Page 30:

Neural Activations / Fuzzy Tour

Initial Phase

Page 31:
Page 32:

Neural Activations / Fuzzy Tour

Monotonic Phase

Page 33:

Neural Activations / Fuzzy Tour

Nearest-City Phase

Page 34:

Fuzzy Tour Lengths (tour length vs. iteration, 0–1000)

[Figure: the fuzzy tour length falls through three phases — centroid phase, monotonic phase, nearest-city phase. Reference lengths: monotonic (19.04), centroid (9.76), nc-worst (9.13), nc-best (7.66), 2-opt (6.94)]

Page 35:

Average Results for n = 10 to n = 70 cities (50 random runs per n)

[Figure: average tour length vs. number of cities for centroid, nc-w (nearest-city worst), nc-b (nearest-city best), neur (neural network), and 2-opt]

Page 36:

DEMO 2

Applet by Darrell Long: http://hawk.cs.csuci.edu/william.wolfe/TSP001/TSP1.html

Page 37:

Conclusions

• Neurons stimulate intriguing computational models.

• The models are complex, nonlinear, and difficult to analyze.

• The interaction of many simple processing units is difficult to visualize.

• The Neural Model for the TSP mimics some of the properties of the nearest-city heuristic.

• Much work to be done to understand these models.

Page 38:

EXTRA SLIDES

Page 39:

Brain

• Approximately 10^10 neurons
• Neurons are relatively simple
• Approximately 10^4 fan-out
• No central processor
• Neurons communicate via excitatory and inhibitory signals
• Learning is associated with modifications of connection strengths between neurons

Page 40:
Page 41:

Fuzzy Tour Lengths

[Figure: tour length vs. iteration]

Page 42:

Average Results for n=10 to n=70 cities

(50 random runs per n)

[Figure: tour length vs. # cities]

Page 43:

For two mutually inhibitory units:

W = [  0  -1 ]
    [ -1   0 ]

Eigenvectors: (1, 1) with eigenvalue -1, and (1, -1) with eigenvalue +1.

[Figure: units a1 and a2 connected by weight -1]

For four completely inhibitory units:

W = - [ 0 1 1 1 ]
      [ 1 0 1 1 ]
      [ 1 1 0 1 ]
      [ 1 1 1 0 ]

Page 44:

[Figure: units a1 and a2 with mutual inhibition -1]

with external input e = 1/2

Page 45:

Perfect k-winner performance: e = k - 1/2

[Figure: completely inhibitory network with every unit receiving the same external input e]

Page 46:

[Figure: initial vs. final activations with e = 1/2 (k = 1)]

[Figure: initial vs. final activations with e = 1 + 1/2 (k = 2)]