colgreen / sharpneat

SharpNEAT - Evolution of Neural Networks. A C# .NET Framework.
https://sharpneat.sourceforge.io/
Other
386 stars 99 forks source link

Any tutorial? #32

Open Jenscaasen opened 6 years ago

Jenscaasen commented 6 years ago

Hi, sorry for opening an issue for this, but I don't know how else to contact you. Is there any documentation on how to actually use SharpNEAT? There is a collection of examples somewhere on the internet using SharpNEAT, which I cannot find again, and there is an eight-year-old tutorial from someone using it for Tic-Tac-Toe, but it is incompatible with the current NuGet release. Beyond that, how do I use the SharpNEAT classes to feed in inputs and read out outputs?

Best regards
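
[Editor's note] For later readers: in SharpNEAT 2.x (the NuGet release this thread discusses), a decoded genome implements `IBlackBox`, and the input/output flow looks roughly like the sketch below. Member names follow the 2.x API (`InputSignalArray`, `OutputSignalArray`, `Activate`, `ResetState`); verify them against your installed version, as 4.x renamed parts of this interface.

```csharp
using SharpNeat.Phenomes;

public static class NetworkRunner
{
    // Feed inputs into a decoded SharpNEAT network and read back its outputs.
    public static double[] RunNetwork(IBlackBox box, double[] inputs)
    {
        box.ResetState(); // clear any recurrent state left over from a previous run

        // Write inputs into the network's input signal array.
        for (int i = 0; i < inputs.Length; i++)
            box.InputSignalArray[i] = inputs[i];

        box.Activate(); // propagate signals through the network

        // Read the activated outputs back out.
        var outputs = new double[box.OutputCount];
        for (int i = 0; i < outputs.Length; i++)
            outputs[i] = box.OutputSignalArray[i];
        return outputs;
    }
}
```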

polgaro commented 5 years ago

This guy made a tutorial: http://www.nashcoding.com/2010/07/17/tutorial-evolving-neural-networks-with-sharpneat-2-part-1/

Also, here's a simple implementation you can copy from: https://github.com/polgaro/NEAT

t4ccer commented 4 years ago

I created a simple tutorial here: https://t4ccer.com/posts/sharpneat-tutorial/

Jenscaasen commented 4 years ago

> I created simple tutorial here: https://t4ccer.com/posts/sharpneat-tutorial/

Thank you for that; so far it's the easiest tutorial to follow that only uses the NuGet package.

I was wondering if there is any documentation or experience on how to make the algorithm explore the possible outputs more "aggressively". In my case, after 2000 generations it still keeps most outputs at 0.5 (the default, I presume).

t4ccer commented 4 years ago

I'm glad to hear that my tutorial is useful.

In my experience, setting complexityRegulationStrategy to NullComplexityRegulationStrategy made the network evolve faster. You can also increase the number of specimens in each generation. Of course, if you need a deeper understanding of NEAT, you can read the original paper: http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf. Unfortunately I didn't find any SharpNEAT documentation, so I experimented myself.
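
[Editor's note] For context, in SharpNEAT 2.x the complexity regulation strategy is passed to the evolution algorithm when it is constructed. A rough wiring sketch follows; type names match the 2.x API, but `speciationStrategy` is assumed to be defined elsewhere (e.g. a k-means speciation strategy), and the parameter object's defaults are not shown:

```csharp
using SharpNeat.Core;
using SharpNeat.EvolutionAlgorithms;
using SharpNeat.EvolutionAlgorithms.ComplexityRegulation;
using SharpNeat.Genomes.Neat;

// NullComplexityRegulationStrategy disables the complexify/simplify
// phase switching, letting the search complexify freely.
IComplexityRegulationStrategy complexityStrategy =
    new NullComplexityRegulationStrategy();

var ea = new NeatEvolutionAlgorithm<NeatGenome>(
    new NeatEvolutionAlgorithmParameters(), // population/species sizes set here
    speciationStrategy,                     // assumed: created elsewhere
    complexityStrategy);
```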

Also, if your network always ends up with the same fitness, the problem may be in your fitness function.
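
[Editor's note] In SharpNEAT 2.x a fitness function is supplied by implementing `IPhenomeEvaluator<IBlackBox>`, but the scoring logic itself is plain code, and a common cause of "outputs stuck at 0.5" is a flat, pass/fail fitness landscape. A hypothetical graded XOR fitness (not SharpNEAT's own API — `XorFitness` and its delegate signature are invented for illustration) rewards outputs in proportion to how close they are to the target, so partially correct networks still get a gradient to climb:

```csharp
using System;

public static class FitnessSketch
{
    // Graded XOR fitness: each of the four test cases contributes up to 1.0,
    // scaled by how close the network's output is to the expected value.
    public static double XorFitness(Func<double, double, double> net)
    {
        var cases = new[]
        {
            (A: 0.0, B: 0.0, Expected: 0.0),
            (A: 0.0, B: 1.0, Expected: 1.0),
            (A: 1.0, B: 0.0, Expected: 1.0),
            (A: 1.0, B: 1.0, Expected: 0.0),
        };

        double fitness = 0.0;
        foreach (var c in cases)
            fitness += 1.0 - Math.Abs(net(c.A, c.B) - c.Expected);
        return fitness; // 4.0 = perfect; a constant 0.5 output scores only 2.0
    }
}
```

With this shaping, a network that always outputs 0.5 scores 2.0 and any movement toward the correct answers is rewarded, whereas a pass/fail score would report the same fitness for every near-miss.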

garnier77 commented 4 years ago

Hello everyone, thank you for the tutorials; NEAT is very poorly documented and hard to understand and to turn into models. I am willing to pay anyone who can help me build a time-series model in NEAT over a screen share (TeamViewer or Skype), teaching me how NEAT works along the way. Please email me, or reach me on Skype: JGARNIER77


marcus1337 commented 3 years ago


Hello, I have been developing highly efficient NEAT code, available for free through my GitHub page. I am willing to give tutorials.

CopilotCoding commented 1 year ago

I generated my own API reference for SharpNEAT at https://copilotcoding.github.io/ if you want to use it for your own reference.

Most of the information about SharpNEAT is heavily outdated. You could use older, better-documented versions instead, but if you want 4.0.0 documentation you will need to create it yourself or wait for someone else to write it.

2 months ago, January 14, 2023: https://www.youtube.com/watch?v=pqVOAo669n0

Defunct (dead link): http://www.nashcoding.com/2010/07/17/tutorial-evolving-neural-networks-with-sharpneat-2-part-1/

12 years ago, March 23, 2011: https://github.com/tansey/sharpneat-tutorials

4 years ago, March 17, 2019: https://vbstudio.hu/en/blog/20190317-Growing-an-AI-with-NEAT

4 years ago, March 31, 2019: https://github.com/polgaro/NEAT

2 years ago, May 7, 2020: https://t4ccer.com/posts/sharpneat-tutorial/

If anyone has a more updated version of these based on the new version 4 library, please let me know:

4 years ago, Aug 31, 2019: https://github.com/lordjesus/UnityNEAT

3 years ago, Nov 7, 2020: https://github.com/flo-wolf/UnitySharpNEAT