yaricom / goESHyperNEAT

An implementation of the evolvable-substrate HyperNEAT (ES-HyperNEAT) algorithm in the Go language. ES-HyperNEAT is an extension of the original HyperNEAT method for evolving large-scale artificial neural networks.
MIT License

Finish ESHyperNEAT implementation #1

Open yaricom opened 3 years ago

ghost commented 3 years ago

Is this still an open issue or has ESHyperNEAT been integrated fully into the current code base?

yaricom commented 3 years ago

It is still a work in progress. However, the basic types and routines are already implemented. I need to find some time to finish the integration and to create some experiments as a proof of concept.

ghost commented 3 years ago

How much work is left before the library can be used for some experiments? I've been reading the relevant papers and want to implement ES-HyperNEAT in Go, and this seems to be the project closest to that goal. I am willing to contribute to the effort; is there anything you'd need another developer for?

yaricom commented 3 years ago

The library already has everything related to the HyperNEAT and ES-HyperNEAT algorithms implemented. Now we need to implement some experiments as a proof of concept (POC) to test that the implementation actually works. I'm thinking about implementing the classic retina experiment as described in the original paper.

In my book I mentioned this experiment and provided Python implementation which can be found at https://github.com/PacktPublishing/Hands-on-Neuroevolution-with-Python/tree/master/Chapter8.

In other words, the retina experiment needs to be ported to the Go language, and we need to write an experiment runner similar to this one: https://github.com/yaricom/goNEAT_NS/blob/master/executor.go.

If you have the time and desire to dive into this, it would be very much appreciated.

I will provide detailed experiment description as a draft of related chapter in the book.

ghost commented 3 years ago

I would be interested in developing/porting the retina experiment and executor in Go; I've developed stock-chart RL gyms in Python in the past. I have some time to develop it during the next few weeks. Once that is good to go, would you be able to write up an example of using the library with the experiment executor? Also, how reliable would you say this library is for training models in a very large, distributed context? I'm looking to use this library afterward for a distributed ES-HyperNEAT training algorithm that will run on many computers. The NEAT library was surprisingly fast, and the codebase is rigorous and well commented.

yaricom commented 3 years ago

Hello Vlad,

Unfortunately, right now I have no time to work on this. So, if you are interested, I can provide you with the necessary guidelines to accomplish the POC with less effort.

I expect that during the POC implementation some bugs will be found in the library that will require fixes. I can help with this.

Under the hood, the ES-HyperNEAT library uses the NEAT library. However, it adds the CPPN and the hypercube-based substrate implementations on top. The part related to NEAT has already been tested within several projects. The CPPN and hypercube parts are novel and not battle-tested yet, so we need to prove the reliability of this library with a POC.

My hope is that with several experiments implemented as a POC, the library will become stable enough to be considered for use in other projects.

ghost commented 3 years ago

Hello Iaroslav, I'd be glad to help bring this project to v1.0. I've looked through the codebase and most of it is fairly straightforward. I could use some high-level guidance on how you'd structure the API calls for a generic task. Also, how relevant is the way you build networks in the goNEAT library (with traits, nnodes, and the FastModularNetworkSolver) to this implementation? (I think the quadtree/pruning algorithm is its replacement, and you've already written this in the new version.) Anyway, I'm ready to take on the task if you'd be kind enough to provide me with the aforementioned guidelines and a roadmap of steps for finalizing and training with the library. Afterwards, I could also finalize the GitHub repo if need be.


yaricom commented 3 years ago

The ES-HyperNEAT and HyperNEAT algorithms are based on a Compositional Pattern Producing Network (CPPN), which effectively draws connections within the hypercube substrate. The evolution of the CPPN is controlled by the NEAT algorithm as implemented in the goNEAT library.

We need to use the goNEAT library to evolve the genome of the CPPN. During the evolutionary process, at the end of each epoch, we need to create the network solver (phenotype) from the genotype of the CPPN and use it to generate the substrate network solver by invoking https://github.com/yaricom/goESHyperNEAT/blob/f5607074891691afe1e4209ef790d9b7df7d5372/cppn/evolvable_substrate.go#L42

The generated substrate network solver is an ANN which can be used to find a solution for the experiment at the current epoch of the evolution and to collect relevant statistics.

In other words, we have network solvers, which are optimized ANN implementations (phenotypes), and the evolving CPPN genome (controlled by NEAT).

Quadtree pruning is part of the substrate evolution mechanics, but it is not part of the genetic evolution; it is just a smart trick to automatically place the ANN nodes and arrange the links between them.

Please find some details about ES-HyperNEAT and the retina experiment at https://github.com/yaricom/goESHyperNEAT/blob/master/docs/Chapter_8.pdf

I have created a project with the implementation stages at https://github.com/yaricom/goESHyperNEAT/projects/2. Please take a look.

ghost commented 3 years ago

Thank you for the crystal clear explanation. I understand the objectives and am looking forward to starting to code in the next few days. I also appreciate your quick response time so far and the docs have cleared up all misconceptions. Take care.

yaricom commented 3 years ago

Hello Vlad,

I believe it would be better to use Execute(context *neat.NeatContext, start_genome, gen_eval). You can get the necessary neat.NeatContext from the ESHyperNEATContext; the latter is a composite of neat.NeatContext and HyperNEATContext, so you can get the NeatContext from it directly.
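
The composite relationship described above maps naturally onto Go struct embedding. The sketch below mirrors the names from the comment, but the fields and the `Execute` signature are simplified assumptions, not the library's real definitions.

```go
package main

import "fmt"

// Hypothetical shapes mirroring the contexts mentioned above. The
// ES-HyperNEAT context embeds the plain NEAT context, so the embedded
// value can be pulled out directly and handed to the runner.
type NeatContext struct {
	NumGenerations int
	PopSize        int
}

type HyperNEATContext struct {
	SubstrateActivator string
}

type ESHyperNEATContext struct {
	NeatContext      // embedded: esCtx.NeatContext is directly accessible
	HyperNEATContext // embedded HyperNEAT-specific settings
	InitialDepth     int
}

// Execute stands in for an experiment runner that only needs NEAT settings.
func Execute(ctx *NeatContext) string {
	return fmt.Sprintf("running %d generations, population %d",
		ctx.NumGenerations, ctx.PopSize)
}

func main() {
	esCtx := ESHyperNEATContext{
		NeatContext:      NeatContext{NumGenerations: 100, PopSize: 150},
		HyperNEATContext: HyperNEATContext{SubstrateActivator: "sigmoid"},
		InitialDepth:     3,
	}
	// Extract the embedded NEAT context and pass it on.
	fmt.Println(Execute(&esCtx.NeatContext))
}
```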

ghost commented 3 years ago

Hi Iaroslav, it looks like the retina experiment is done, nice work! Have you gotten it to train and win on your side? I haven't been able to beat retina, but I've been able to beat cart_pole and XOR, which I copied over from goNEAT/experiments. I would like to use these experiments in goESHyperNEAT; the only difference is that OrgEvaluate() doesn't create the ANN solver from our CPPNs. Maybe, for backwards compatibility, we can add that functionality to the goNEAT library? If it's not worth the hassle, I can just push my updated experiment files.

I have one question about the models we evolve. Is there a limit on how large and complex they may grow? This also applies to Traits: we supply a few Traits in the setup genome files, but are more Traits appended or removed over training time, or are just the weights mutated?

It's been a pleasure working on this so far. If there is anything I can code in a short time to help finalize this library, let me know and I'll see if it is within my capacities.

Take care, Vlad

yaricom commented 3 years ago

Hi Vlad,

I made some fixes to the retina experiment and the related parts of ES-HyperNEAT, which are already in master. Now it runs without any issues, but I've not run it long enough to get results. It can take about 1000 epochs of evolution to find a successful solver.

Meanwhile, I'm looking into possible bugs in the Fast Network Solver implementation, to avoid wasting time waiting for the evolution to complete :)

So, basically, I'm back to the baseline goNEAT library and preparing the v2 release, which is compliant with Go module versioning, i.e., having v2 in the import path. I also plan to implement support for the NumPy format when dumping experiment statistics. This will make it possible to use many standard Python ML tools to visualize and analyse experiment results. It can take a while.

The HyperNEAT algorithm is designed to support large networks, but I have not measured the ultimate extremes. I believe it depends on the bare-metal power that you have.

The goNEAT library has many mutators, not only for weights. The different mutator implementations can be seen at genome.go#L699. Which mutations are applied is controlled by the NeatContext configuration parameters.

Traits can be mutated as well. However, despite being part of NEAT, they are not used directly by the evolution right now. Rather, they are reserved for future extensions of the algorithm, allowing Traits to provide additional control over the evolutionary process.

Thank you for your help. I'm planning to finish with goNEAT library enhancements and to update ESHyperNEAT library to use the updated version. After that it would be more clear what else needs to be done.

Best regards, Iaroslav

ghost commented 3 years ago

This all sounds great! I am planning to use this library in an upcoming project as a core component, so let's hope it goes well!

The Python data extension would be great. I was thinking of creating a visualizer in Go that renders .png files from the GraphML output of the GraphBuilder (if it doesn't exist already, and if it does, can you let me know lol).

Good luck with the projects; I think they will garner some well-deserved appreciation soon enough.

yaricom commented 3 years ago

Hello Vlad,

The GraphBuilder allows dumping the graph using the GraphML format. After that, you can use Cytoscape to visualize the saved data.

I wish you good luck with your upcoming project. Let me know when you have some results. I would be interested to see how HyperNEAT performs.

Best regards, Iaroslav