Thesis Presentation WITH NOTES

Evolving Artificial Neural Network Controllers for Autonomous Agents Navigating Dynamic Environments
Thesis presented by Robert Lucas
Overview
• Objective
• Prerequisites
• Recent Literature
• Methodology
• Experimentation and Findings
• Conclusions
Objective
• “Evolve an efficient neural network controller that can learn to effectively operate an autonomous agent in multiple different dynamic environments”
Prerequisites
• Artificial Neural Networks
• Genetic Algorithms
• Autonomous Agents
• Dynamic Environments
Recent Literature
• Evolving Artificial Neural Networks
  – Weight Evolving Algorithms
  – Topology Evolving Algorithms
  – Hybrid Evolution Algorithms
  – Other methods
• Artificial Neural Networks for Autonomous Agents
• Simulated Environments and Real-World Environments
Methodology
NEAT Algorithm
• NeuroEvolution of Augmenting Topologies (NEAT)
• Evolves artificial neural networks using:
– Weight mutation
– Structural mutation
– Crossover
– Speciation
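As a rough illustration of the first two operators, here is a minimal Python sketch of weight mutation and a structural (add-connection) mutation on a list-of-connection-genes genome. The gene fields, mutation rates, and perturbation step are illustrative assumptions, not the thesis's actual encoding or parameters.

```python
import random

# Illustrative NEAT-style genome: a list of connection genes, each a dict
# with in/out node ids, a weight, an innovation number, and an enabled flag.

def mutate_weights(genome, rate=0.8, step=0.5):
    """Perturb each connection weight with probability `rate`."""
    for gene in genome:
        if random.random() < rate:
            gene["weight"] += random.uniform(-step, step)
    return genome

def mutate_add_connection(genome, nodes, next_innovation):
    """Structural mutation: try to add a new link between two existing nodes.

    Each new gene gets a fresh innovation number, which is what lets
    NEAT line up genes during crossover and measure compatibility
    for speciation.
    """
    a, b = random.sample(nodes, 2)
    if not any(g["in"] == a and g["out"] == b for g in genome):
        genome.append({"in": a, "out": b,
                       "weight": random.uniform(-1.0, 1.0),
                       "innovation": next_innovation,
                       "enabled": True})
        next_innovation += 1
    return genome, next_innovation
```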
NEAT Algorithm
[Figure: example NEAT genome encoding, showing an array of node genes (1–5) and an array of link genes listing each link's input and output node]
Segmental Duplication NEAT (SDNEAT)
• Based on NEAT
• Inspired by recent research on the human genome
• Duplicated genetic information is essential for the advancement of a species
• Duplicated segments may offer an evolutionary leap for the population
A Segment
• A segment is an array S of n nodes and an array L of m links.
• S contains only hidden nodes.
• The first link in L is connected to an input node.
• The last link in L is connected to an output node.
• The segment is not recurrent.
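The conditions above can be sketched as a validity check. The encoding here (S as a list of hidden-node ids, L as a list of (source, target) pairs) and the simple chain-based acyclicity test are illustrative assumptions, not the thesis's actual representation.

```python
def is_valid_segment(S, L, input_nodes, output_nodes):
    """Check the segment conditions from the definition above (sketch)."""
    hidden = set(S)
    # S contains only hidden nodes.
    if hidden & (set(input_nodes) | set(output_nodes)):
        return False
    # The first link in L is connected to an input node.
    if not L or L[0][0] not in input_nodes:
        return False
    # The last link in L is connected to an output node.
    if L[-1][1] not in output_nodes:
        return False
    # Not recurrent (simplified): no link may point back to a node
    # that already appeared as the source of an earlier link.
    seen = set()
    for src, dst in L:
        if dst in seen:
            return False
        seen.add(src)
    return True
```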
A Segmental Duplication
[Figure: example of a segmental duplication. The parent genome (nodes 1, 2, 3, 6; links 4, 7, 8, 9) gains a duplicated segment (node 10; links 11, 12), producing an offspring genome with nodes 1, 2, 3, 6, 10 and links 4, 7, 8, 9, 11, 12]
Neuroevolutionary Solver
• SIMBAD robot simulation system
• PicoEvo
• PicoNeuro
• NEAT and SDNEAT written into PicoEvo
• Holodeck
Experimentation and Findings
XOR Problem
• XOR is a binary logic function.
• The output of XOR is true only when its inputs differ.
• The XOR output values are not linearly
separable.
• XOR cannot be solved by an artificial
neural network with no hidden nodes.
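A single hidden node is in fact sufficient. The sketch below uses hand-set weights (chosen by hand for illustration, not evolved) to compute XOR with one hidden node plus direct input-to-output connections, the kind of minimal topology NEAT-style methods aim for.

```python
def step(x):
    # Threshold activation; real NEAT networks typically use sigmoids,
    # but a step unit keeps the example exact.
    return 1 if x > 0 else 0

def xor_net(a, b):
    h = step(a + b - 1.5)             # hidden node: fires only on (1, 1)
    return step(a + b - 2 * h - 0.5)  # OR of the inputs, vetoed by the hidden node
```

The direct links compute OR, and the hidden AND-like node suppresses the output when both inputs fire, yielding XOR.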
Most efficient solutions
[Figure: most efficient XOR solution topologies found by NEAT and SDNEAT]
NEAT XOR Performance
[Chart: NEAT XOR solution generation number and number of hidden nodes across 25 experiments; y-axis: generation (0–90), x-axis: experiment (1–25)]
SDNEAT XOR Performance
[Chart: SDNEAT XOR solution generation number and number of hidden nodes across 25 experiments; y-axis: generation (0–90), x-axis: experiment (1–25)]
Dynamic Obstacle Avoidance Problem
• A large problem; a small subset was used for Neuroevolutionary Solver experimentation.
• Three environments: one static and two dynamic.
• The smart agent starts at a fixed location in each.
• The goal is placed at a fixed location.
• The three environments are meant to offer increasing obstacle avoidance complexity.
Experimentation
[Figure: the three experimental environments: Maze, Busy Hallway, and Busy Room]
Fitness Function
• s = speed
• a = angular velocity
• m = maximum sensor value
• d = distance from goal
• Constants c1, c2, c3 were set to 1.0, 1.6, and 1.0 respectively.
[Equation: the fitness function combining these terms is shown as a figure on the slide]
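The fitness equation itself was rendered as an image and is not recoverable from the slide text. Purely as an illustration of how the listed terms could enter such a function (this is NOT the thesis formula), one common mobile-robot shape rewards speed while penalizing turning and obstacle proximity:

```python
def fitness(s, a, m, d, c1=1.0, c2=1.6, c3=1.0):
    # Hypothetical combination only: reward speed (s), penalize angular
    # velocity (a) and high sensor readings (m, obstacle proximity),
    # and reward closing the distance to the goal (d). The thesis's
    # actual formula appears only as a figure and may differ.
    return c1 * s * (1.0 - c2 * abs(a)) * (1.0 - m) + c3 / (1.0 + d)
```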
NEAT Fitness vs. Generation
SDNEAT Fitness vs. Generation
High Fitness Solutions
[Chart: number of solutions above 20,000 fitness per experiment for NEAT and SDNEAT; y-axis: number of high-fitness solutions (0–45), x-axis: experiment (1–40)]
Solution Categorization
[Chart: number of NEAT and SDNEAT solutions in each category; y-axis: number of solutions (0–100), x-axis: category (1–8)]
Solution Topology
[Figure: representative solution network topologies evolved by NEAT and SDNEAT]
The SDNEAT Solution
Conclusions
• NEAT and SDNEAT both identify efficient solutions to XOR; SDNEAT finds them in less time on average.
• SDNEAT evolved a more efficient and better-performing solution to the dynamic obstacle avoidance problem than NEAT did.
• SDNEAT was the only algorithm to evolve a solution to the maze environment.