NEAT

Well, now I am at GMU and I am working on adding NEAT to ECJ! Back when I was learning about HyperNEAT I neglected to read about NEAT, which I now wish I had taken the time to do. Then I would have known that what I called fault tolerance is called ablation testing, and that this testing was performed on NEAT!

CCSC HyperNEAT Poster

I was accepted to present a poster about my research into the fault tolerance of HyperNEAT at the CCSC 2010 Conference at Juniata College. I put my research paper on my website at http://drew.freehostingcloud.com/research/hyperneat/fault-tolerance-of-hyperneat.html.

HyperNEAT Capture the Flag

I created a capture-the-flag game to explore the capability of cooperative teams in a competitive environment using HyperNEAT.  I want to know if competing teams can learn when it is in their best interest to cooperate with a rival team.  I made a short video of my progress, seen below:

Fault Tolerance Thoughts

Can an evolved hidden layer in the substrate handle faults not only in the output layer but also in the hidden layer itself, without being trained on example faults?  I must first train the network to reach the goal.  Then I should set one of the nodes in the hidden or output layer to zero to test for fault tolerance.

If this doesn’t work, I think I need to move the output layer nodes to compensate, since HyperNEAT’s weights are based on the geometry of the nodes.
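The test described above can be sketched in a few lines. This is a toy illustration, not code from my experiment: the tiny feed-forward substrate, its made-up weights, and the `forward` function are all hypothetical stand-ins for a fully trained network, and the "fault" is simulated by clamping one hidden node's activation to zero.

```python
import math

# Toy substrate: 2 inputs -> 3 hidden -> 2 outputs (weights are made up,
# standing in for a network that has already been trained to the goal).
W_HIDDEN = [[0.5, -0.2], [0.8, 0.3], [-0.4, 0.9]]   # one row per hidden node
W_OUTPUT = [[0.7, -0.5, 0.2], [0.1, 0.6, -0.3]]     # one row per output node

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, dead_hidden=None):
    """Run the net; if dead_hidden is set, clamp that hidden node's
    activation to zero to simulate the fault."""
    hidden = [sigmoid(sum(w * i for w, i in zip(row, inputs)))
              for row in W_HIDDEN]
    if dead_hidden is not None:
        hidden[dead_hidden] = 0.0
    return [sigmoid(sum(w * h for w, h in zip(row, hidden)))
            for row in W_OUTPUT]

# Ablate each hidden node in turn and measure how much the output moves.
healthy = forward([1.0, 0.5])
for node in range(len(W_HIDDEN)):
    faulty = forward([1.0, 0.5], dead_hidden=node)
    change = sum(abs(a - b) for a, b in zip(healthy, faulty))
    print(f"ablating hidden node {node}: output change {change:.3f}")
```

In the real experiment the output-change measurement would be replaced by whatever task fitness the network was trained on, so the test shows how much performance is lost per ablated node.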

HyperNEAT Experiment

I am currently experimenting with HyperNEAT to find out how well it responds to faults, such as mechanical faults in a robot’s actuators.  I want to know whether HyperNEAT’s use of geometry will allow it to adapt to faults faster or slower than a regular feed-forward backprop neural net.  I am using the C# HyperSharpNEAT implementation of HyperNEAT for my experiment.  By conducting this experiment I am not only learning more about the algorithm, but also learning C#.

HyperNEAT

I just recently began to experiment with HyperNEAT (Hypercube-based NeuroEvolution of Augmenting Topologies). It is a new way, created by Dr. Stanley, to evolve neural networks. It uses the geometric coordinates of the neurons placed in a neural network as input to a CPPN (Compositional Pattern Producing Network), which is evolved to produce the connection weights between those neurons.
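The weight-querying step can be sketched as follows. The CPPN here is a fixed, hand-written function purely for illustration (in HyperNEAT it would itself be an evolved network), and the substrate layout is an arbitrary example I chose:

```python
import math

def cppn(x1, y1, x2, y2):
    """Hypothetical CPPN: maps the coordinates of a source and a target
    substrate neuron to a connection weight in [-1, 1]."""
    return math.tanh(math.sin(x1 * x2) + math.cos(y1 - y2))

# Example substrate: neurons laid out on a 2D plane.
input_layer  = [(-1.0, -1.0), (0.0, -1.0), (1.0, -1.0)]
output_layer = [(-1.0,  1.0), (1.0,  1.0)]

# Query the CPPN once per input->output pair to fill in the weight matrix.
weights = {(src, dst): cppn(*src, *dst)
           for src in input_layer for dst in output_layer}

for (src, dst), w in weights.items():
    print(f"{src} -> {dst}: weight {w:+.3f}")
```

Because the CPPN sees coordinates rather than individual connections, neurons that sit in geometrically similar positions get similar weights, which is what makes the geometry of the substrate matter.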