yaricom / goNEAT

A Go implementation of the NeuroEvolution of Augmenting Topologies (NEAT) method for evolving and training artificial neural networks without error back-propagation
MIT License

How would one go about supporting recurrent graphs? #45

Open qwertyuu opened 2 years ago

qwertyuu commented 2 years ago

I would like to give you a hand, or at least try to, by beginning the implementation of recurrent networks in NEAT.

I love the idea of recurrent networks and their emergent properties. That said, have you thought about what steps would be needed for such an implementation? Does another implementation handle it well, or do any papers describe NEAT generating recurrent networks? I remember reading the original NEAT paper religiously and not finding much about the higher-level properties of the graphs NEAT would create, so I am still wondering how one could bring this about (how are time steps counted, how are recurrent links represented, etc.).

Since you explicitly included code paths to enable recurrence, I wonder what your plans were for this.

Thanks again for your nice library!

yaricom commented 2 years ago

The library supports producing recurrent links, but whether such links actually emerge is part of the evolutionary process.
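To illustrate how that can work during evolution (a sketch under my own assumptions, not goNEAT's actual code), an add-link mutation can classify a candidate connection as recurrent by checking whether the target node already reaches the source node, in which case the new link closes a cycle:

```go
package main

import "fmt"

// conn is a hypothetical connection gene: a directed link from -> to.
type conn struct{ from, to int }

// isRecurrent reports whether adding from->to would close a cycle, i.e.
// whether `to` can already reach `from` through the existing connections.
// A depth-first search over the existing graph is enough.
func isRecurrent(conns []conn, from, to int) bool {
	if from == to {
		return true // a self-loop is always recurrent
	}
	adj := map[int][]int{}
	for _, c := range conns {
		adj[c.from] = append(adj[c.from], c.to)
	}
	seen := map[int]bool{}
	var reach func(n int) bool
	reach = func(n int) bool {
		if n == from {
			return true
		}
		if seen[n] {
			return false
		}
		seen[n] = true
		for _, m := range adj[n] {
			if reach(m) {
				return true
			}
		}
		return false
	}
	return reach(to)
}

func main() {
	conns := []conn{{1, 2}, {2, 3}} // existing genome: 1 -> 2 -> 3
	fmt.Println(isRecurrent(conns, 1, 3)) // false: just a forward shortcut
	fmt.Println(isRecurrent(conns, 3, 1)) // true: 1 already reaches 3
}
```

A mutation operator can then flag the gene accordingly, so that activation treats the link as reading the previous time step's value.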

The library right now also supports the idea of modular networks, which can be used to construct an initial genome with a kind of recurrent logic incorporated from the beginning of evolution.

I have not investigated this any further yet, but using genome modules to imitate recurrent networks seems like a very promising idea.

Also, I'm still working on ES-HyperNEAT, which makes it possible to produce even larger and more complex phenotypes. It is still a work in progress.

I'm living in Ukraine, and right now we have a war here, so it is hard to estimate when I will be able to continue working on this library. I hope it will be possible sooner or later.