Well, a follow-up on the neural net idea. I was thinking it might be useful as a way of representing a dimension of time. HyperNEAT's network already takes structure and location into account, so I think that by staggering the input nodes (ordering them by depth) you would be adding a notion of time to how the inputs are weighted. To get this effect, all you need to do is connect an input node to a hidden-layer node that also takes other hidden-layer neurons as input, so that one input's signal passes through more hops than another's. How would you show that this is true? I have seen ANN architectures with this kind of structure, but I have not seen anyone argue that structuring the network this way actually adds any information. A rough sketch of the intuition is below.
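
Here is a minimal sketch of the staggering idea, not HyperNEAT itself and not any particular library. It assumes a NEAT-style activation scheme where every node updates once per time step using the previous values of its inputs, plus identity activations and my own toy names (step, a_series, b_series). Input A reaches the output through one hidden node, input B through two, so the extra hop acts like a one-step delay line:

```python
def step(state, a_t, b_t):
    """Advance the toy network one tick; `state` holds last tick's node values."""
    h1_prev, h2_prev, _ = state
    h1 = b_t            # H1 reads input B directly
    h2 = a_t + h1_prev  # H2 reads input A plus H1's value from the previous tick
    out = h2_prev       # the output reads H2 from the previous tick
    return (h1, h2, out), out

# Impulse on both inputs at t = 0 (hypothetical test signals).
a_series = [1, 0, 0, 0, 0, 0]
b_series = [1, 0, 0, 0, 0, 0]

state = (0.0, 0.0, 0.0)
for t, (a_t, b_t) in enumerate(zip(a_series, b_series)):
    state, out = step(state, a_t, b_t)
    print(t, out)
# Output: 0, 1, 1, 0, 0, 0 -- A's impulse reaches the output at t=1,
# B's at t=2, so the staggered wiring encodes a relative time offset.
```

If that's the right reading of the idea, one way to show it "adds information" would be a test like this: feed the same time series into a staggered and a non-staggered network and check whether only the staggered one can solve a task that depends on comparing values from different time steps. Note the effect relies on the one-update-per-tick activation; in a single full feedforward pass the extra depth alone wouldn't introduce time.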