Module 19: Neuroevolution: Evolving Network Topology - When the Architecture Matters as Much as the Weights
NEAT evolves both the topology and the weights of neural networks, starting from minimal structures and complexifying them over generations. HyperNEAT encodes large networks indirectly: a small CPPN generates connectivity patterns with symmetry and repetition. These algorithms discover network architectures that no human would design - and they work.
Learning Objectives
- Understand direct vs indirect encoding for neural networks
- Implement weight evolution for a fixed-topology network
- Understand NEAT (complexification, speciation, innovation numbers)
- Know HyperNEAT and CPPNs (Compositional Pattern-Producing Networks)
- Use neat-python to evolve a game-playing agent
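The second objective above - evolving weights for a fixed topology - can be sketched in a few lines. This is a minimal, hedged example, not the module's reference implementation: it assumes a tiny hand-wired 2-2-1 tanh network solving XOR, a direct encoding (the genome is just the flat weight vector), and a simple (1+λ) evolution strategy; all hyperparameters are illustrative.

```python
import math
import random

random.seed(0)

# Fixed topology: 2 inputs -> 2 tanh hidden units -> 1 tanh output.
# Direct encoding: the genome is the flat weight vector.
N_WEIGHTS = 9  # 4 hidden weights + 2 hidden biases + 2 output weights + 1 output bias

def forward(w, x1, x2):
    h1 = math.tanh(w[0] * x1 + w[1] * x2 + w[2])
    h2 = math.tanh(w[3] * x1 + w[4] * x2 + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

XOR = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def fitness(w):
    # Negative squared error over the XOR truth table (higher is better).
    return -sum((forward(w, a, b) - t) ** 2 for a, b, t in XOR)

def evolve(generations=400, n_children=20, sigma=0.5):
    best = [random.gauss(0, 1) for _ in range(N_WEIGHTS)]
    best_fit = fitness(best)
    for _ in range(generations):
        # (1 + lambda) ES: mutate the champion, keep any improvement.
        for _ in range(n_children):
            child = [g + random.gauss(0, sigma) for g in best]
            f = fitness(child)
            if f > best_fit:
                best, best_fit = child, f
    return best, best_fit

weights, fit = evolve()
print("best fitness:", round(fit, 3))
```

Note that only the weights change here; the architecture is frozen. NEAT's point is precisely to remove that restriction.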
Concept Explanation
Coming soon.
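Until the full explanation lands, one core NEAT mechanism can be previewed: speciation via a compatibility distance over genomes aligned by innovation numbers. The sketch below assumes a simplified genome representation (a dict mapping innovation number to connection weight) and the coefficient values commonly quoted from the original NEAT paper; it is an illustration of the formula, not a full NEAT implementation.

```python
def compatibility(genome1, genome2, c1=1.0, c2=1.0, c3=0.4):
    # Genomes are dicts {innovation_number: connection_weight}.
    # Innovation numbers let us align genes without topology analysis.
    innov1, innov2 = set(genome1), set(genome2)
    matching = innov1 & innov2
    mismatched = innov1 ^ innov2
    cutoff = min(max(innov1), max(innov2))
    # Excess genes lie beyond the other genome's highest innovation number;
    # the remaining mismatches are disjoint genes.
    excess = sum(1 for i in mismatched if i > cutoff)
    disjoint = len(mismatched) - excess
    n = max(len(genome1), len(genome2))
    # Mean weight difference of matching genes.
    wbar = (sum(abs(genome1[i] - genome2[i]) for i in matching) / len(matching)
            if matching else 0.0)
    return (c1 * excess + c2 * disjoint) / n + c3 * wbar

# Two toy genomes sharing innovations 1 and 2:
g1 = {1: 0.5, 2: -0.3, 4: 0.8}
g2 = {1: 0.6, 2: -0.3, 3: 0.1, 5: 0.9}
delta = compatibility(g1, g2)
print("compatibility distance:", round(delta, 4))
```

Genomes whose distance falls below a threshold are placed in the same species, so novel topologies compete within their niche instead of being culled immediately.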
Code Examples
Coming soon.
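As a placeholder for the indirect-encoding objective, here is a sketch of the HyperNEAT idea: a CPPN is queried with substrate coordinates and its output becomes a connection weight, so one small function encodes the weights of an arbitrarily large network. The CPPN below is hand-wired purely for illustration (in HyperNEAT its topology and weights are themselves evolved by NEAT), and the substrate layout and expression threshold are made-up values.

```python
import math

def cppn(x1, y1, x2, y2):
    # Compose pattern-producing basis functions: a gaussian of distance
    # gives locality/symmetry, a sine gives repetition - the hallmark
    # ingredients of CPPNs. Hand-wired here; normally evolved.
    d = math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2)
    g = math.exp(-d * d)
    s = math.sin(3.0 * (x1 + y2))
    return math.tanh(2.0 * g + s)

# Query the CPPN for every connection between nodes laid out on a line
# (a hypothetical 3-node -> 3-node substrate).
coords = [(-1.0, 0.0), (0.0, 0.0), (1.0, 0.0)]
expressed = {}
for i, (x1, y1) in enumerate(coords):
    for j, (x2, y2) in enumerate(coords):
        w = cppn(x1, y1, x2, y2)
        if abs(w) > 0.2:  # HyperNEAT-style expression threshold
            expressed[(i, j)] = w

print(len(expressed), "of", len(coords) ** 2, "connections expressed")
```

Because the weight is a function of geometry, regularities in the substrate (symmetry, repetition with variation) appear in the phenotype for free - the key trade-off against direct encodings covered in this module.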
Exercises
Coming soon.
Milestone Checklist
- Evolved weights for a fixed network
- Can explain NEAT speciation and innovation numbers
- Used neat-python on a control task
- Can explain direct vs indirect encoding trade-offs