JAX implementation of the NEAT (NeuroEvolution of Augmenting Topologies)
algorithm.
Forward Pass:
- Add connection weights
- Add individual activation functions
- Add conditional activation of output nodes, return output values
- Test forward when a single neuron is linked to multiple receivers
- Test forward pass on larger architectures
Mutations:
- Determine mutation frequency and common practices
- Implement the following mutations:
  - Weight shift
  - Weight reset
  - Add node
  - Add connection
  - Mutate activation
- Wrap all mutations in a single function
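As a minimal sketch of the first two mutations (weight shift and weight reset), assuming weights live in a dense array and using explicit JAX PRNG keys — the function name and rates are illustrative, not the repo's API:

```python
import jax
import jax.numpy as jnp

# Hypothetical sketch: each weight is shifted by Gaussian noise, and with
# probability reset_prob it is instead replaced by a fresh uniform value.
def mutate_weights(key, weights, shift_scale=0.1, reset_prob=0.1):
    k_shift, k_reset, k_mask = jax.random.split(key, 3)
    shifted = weights + shift_scale * jax.random.normal(k_shift, weights.shape)
    reset = jax.random.uniform(k_reset, weights.shape, minval=-1.0, maxval=1.0)
    # Per-weight coin flip picks between the shifted and the reset value.
    use_reset = jax.random.uniform(k_mask, weights.shape) < reset_prob
    return jnp.where(use_reset, reset, shifted)

key = jax.random.PRNGKey(0)
mutated = mutate_weights(key, jnp.zeros(4))
```

Structural mutations (add node, add connection) would additionally extend the sender/receiver arrays, which is why capping `max_nodes` and `max_connections` matters for fixed-shape JAX code.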
Misc:
- Add Hydra config for constant attributes
- Separate max_nodes and max_connections
- Add bias
- Set the minimum sender index to 1 instead of 0
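A Hydra config for these constants might look like the following (file path, group, and key names are all assumptions for illustration):

```yaml
# conf/config.yaml — hypothetical layout
network:
  max_nodes: 64        # kept separate from max_connections
  max_connections: 256
  use_bias: true
  min_sender_index: 1  # index 0 reserved, e.g. for padding connections
```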
Crossing:
- Add novelty fields to Network dataclass
- Implement crossing for two simple networks
- Create a Species dataclass
- Define a distance metric between networks to cluster them into species
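The distance metric in NEAT is a compatibility distance combining disjoint/excess genes and average weight difference on shared genes. A sketch, assuming each genome is a dense weight vector indexed by innovation number with NaN marking absent genes (an assumed encoding; coefficient names are illustrative):

```python
import jax.numpy as jnp

def distance(w1, w2, c_missing=1.0, c_weight=0.4):
    has1, has2 = ~jnp.isnan(w1), ~jnp.isnan(w2)
    shared = has1 & has2
    missing = jnp.sum(has1 ^ has2)           # disjoint + excess genes
    n_shared = jnp.maximum(jnp.sum(shared), 1)
    w_diff = jnp.sum(jnp.where(shared, jnp.abs(w1 - w2), 0.0)) / n_shared
    n = jnp.maximum(jnp.sum(has1), jnp.sum(has2))
    return c_missing * missing / n + c_weight * w_diff

nan = float("nan")
a = jnp.array([0.5, 1.0, nan])   # genome missing innovation 2
b = jnp.array([0.5, 0.0, 2.0])
d = distance(a, b)
```

Genomes within a threshold distance of a species representative would be assigned to that species, which is how the `Species` dataclass above would be populated.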
Reference:
- Efficient Evolution of Neural Network Topologies, Kenneth O. Stanley and Risto Miikkulainen, 2001