
Module 19: Neuroevolution: Evolving Network Topology - When the Architecture Matters as Much as the Weights

NEAT evolves both the topology and the weights of neural networks, starting from minimal structures and growing them more complex over generations. HyperNEAT goes further, encoding connectivity patterns indirectly through a Compositional Pattern-Producing Network (CPPN). These algorithms discover network architectures that no human would design - and they work.
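Before topology evolution, it helps to see the simplest case: direct encoding with a fixed topology. Here the genome is nothing more than the flat weight vector of a hard-coded 2-2-1 network, mutated with a simple (1+λ) evolution strategy toward solving XOR. This is an illustrative sketch - all names, the network size, and the parameter choices are assumptions, not taken from any particular library.

```python
import random
import math

random.seed(0)  # reproducible run for this sketch

# XOR truth table: a classic test for evolved networks.
XOR = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
       ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]

N_WEIGHTS = 9  # 4 input->hidden weights + 2 hidden biases + 2 hidden->output weights + 1 output bias

def forward(w, x):
    """Fixed 2-2-1 topology; the genome w is just the flat weight vector."""
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h0 + w[7] * h1 + w[8])

def fitness(genome):
    # Negative squared error over the four XOR cases: higher is better.
    return -sum((forward(genome, x) - y) ** 2 for x, y in XOR)

def evolve(generations=300, lam=20, sigma=0.5):
    """(1 + lambda) ES: keep the parent unless a mutated child beats it."""
    parent = [random.uniform(-1, 1) for _ in range(N_WEIGHTS)]
    best = fitness(parent)
    for _ in range(generations):
        for _ in range(lam):
            child = [w + random.gauss(0, sigma) for w in parent]
            f = fitness(child)
            if f > best:
                parent, best = child, f
    return parent, best

genome, score = evolve()
print(f"best fitness: {score:.3f}")
```

Because the (1+λ) scheme only ever accepts improvements, fitness is monotonically non-decreasing; the trade-off, previewed in the objectives below, is that the architecture itself can never change.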

Learning Objectives

  1. Understand direct vs indirect encoding for neural networks
  2. Implement weight evolution for a fixed-topology network
  3. Understand NEAT (complexification, speciation, innovation numbers)
  4. Know HyperNEAT and CPPNs (Compositional Pattern-Producing Networks)
  5. Use neat-python to evolve a game-playing agent
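Objective 3's innovation numbers can be sketched in a few lines. In NEAT, every structural mutation that creates a new connection is stamped with a global historical marker; the same structural mutation arising in different genomes receives the same marker, which is what lets crossover align genes and speciation measure genome distance. The sketch below is illustrative only - the names are made up and this is not the neat-python API.

```python
import itertools

# Global history of structural innovations (hypothetical minimal version).
_counter = itertools.count()
_innovations = {}  # (in_node, out_node) -> innovation number

def innovation(in_node, out_node):
    """Return the historical marker for a connection, minting one if new."""
    key = (in_node, out_node)
    if key not in _innovations:
        _innovations[key] = next(_counter)
    return _innovations[key]

# Two genomes independently add the connection 1 -> 4: both genes get the
# same innovation number, so crossover treats them as matching genes.
a = innovation(1, 4)
b = innovation(1, 4)
c = innovation(2, 4)  # a genuinely new structure gets a fresh number
print(a, b, c)  # -> 0 0 1
```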

Concept Explanation

Coming soon.

Code Examples

Coming soon.

Exercises

Coming soon.

Milestone Checklist

  • Evolved weights for a fixed network
  • Understand NEAT speciation and innovation
  • Used neat-python on a control task
  • Can explain direct vs indirect encoding trade-offs
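To make the last checklist item concrete: under direct encoding the genome stores one value per connection, while under indirect encoding a single coordinate-based function is queried for every connection on a substrate, so a handful of composed basis functions describes an entire regular weight pattern. In real HyperNEAT the querying function is itself a CPPN evolved with NEAT; the hand-written stand-in below is purely illustrative.

```python
import math

def cppn(x1, y1, x2, y2):
    """Hand-written stand-in for an evolved CPPN: maps the coordinates of a
    connection's endpoints to a weight. Composing simple basis functions
    (sine, Gaussian-of-distance) yields regular, symmetric patterns."""
    d = math.hypot(x2 - x1, y2 - y1)
    return math.sin(3.0 * (x1 + x2)) * math.exp(-d * d)

# Query the function for every source/target pair on a tiny 3-node substrate:
# 3 coordinates yield 9 weights, all generated from one compact description.
coords = [(-1.0, 0.0), (0.0, 0.0), (1.0, 0.0)]
weights = {(i, j): cppn(*coords[i], *coords[j])
           for i in range(3) for j in range(3)}

# The encoding bakes in symmetry: swapping a connection's endpoints gives
# the same weight, a regularity a direct encoding would have to rediscover.
print(weights[(0, 1)] == weights[(1, 0)])
```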
