Open morsemind opened 5 years ago
Hi. [Disclaimer: I'm pretty new to GitHub, ANNs, and NEAT.] I was wondering whether running backpropagation under some scheme might help training. Is it currently possible to do so? If not, how should I go about adding backpropagation as a potential way of generating offspring? Any inputs/suggestions would be immensely helpful. Thanks a ton.
I've thought about using backprop, possibly to create improved versions of elites, but never got around to it. You'd have to write the backprop code yourself, or at a minimum convert the network to a format that would let you optimize it in some other framework, then convert it back.
The first place I'd probably try adding it is in the reproduce method of the reproduction class, most likely here: https://github.com/CodeReclaimers/neat-python/blob/master/neat/reproduction.py#L161
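For what it's worth, here's a minimal sketch of the "optimize and write back" idea that sidesteps conversion to another framework entirely: central finite differences over a genome's enabled connection weights. It uses neat-python's real `DefaultGenome`/`FeedForwardNetwork` API, but `finetune_genome`, `loss_fn`, and the learning-rate/step defaults are illustrative placeholders, not anything in the library:

```python
import copy
import neat

def finetune_genome(genome, config, loss_fn, lr=0.05, steps=20, eps=1e-4):
    """Return a copy of `genome` whose enabled connection weights have been
    nudged downhill on `loss_fn` via central finite differences.

    loss_fn(net) -> float, where net is a neat.nn.FeedForwardNetwork.
    Finite differences stand in for true backprop, so this works with any
    topology NEAT evolves and needs no extra dependencies.
    """
    genome = copy.deepcopy(genome)
    keys = [k for k, conn in genome.connections.items() if conn.enabled]

    def loss_of(g):
        return loss_fn(neat.nn.FeedForwardNetwork.create(g, config))

    for _ in range(steps):
        grads = []
        for k in keys:
            w = genome.connections[k].weight
            genome.connections[k].weight = w + eps
            loss_up = loss_of(genome)
            genome.connections[k].weight = w - eps
            loss_down = loss_of(genome)
            genome.connections[k].weight = w  # restore before next probe
            grads.append((loss_up - loss_down) / (2 * eps))
        for k, g in zip(keys, grads):
            genome.connections[k].weight -= lr * g
    return genome
```

Because it only perturbs weights, it tolerates any evolved graph; true backprop would additionally need a topological ordering of the network and derivatives of each node's activation function.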
Hey, thank you! I was thinking the same; reproduction seemed the most natural place for an add-on. I'll share an update when I get around to it.
My limited experience with NNs has led me to believe that backprop is important, if not downright necessary, for problems of a continuous nature. Classification problems tend to allow more slack for sub-threshold errors in the prediction, but that leeway goes out the window when you're after a precise numerical fit. Does this make sense?
@morsemind What about doing this: train with NEAT for some generations, then fine-tune the selected individuals for 5 epochs (e.g., with TRPO) before proceeding to the next generation.
Thoughts?
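A rough sketch of that alternation, reusing the `finetune_genome` helper sketched earlier in this thread, with plain finite-difference gradient descent standing in for TRPO (TRPO would only apply to RL-style tasks). The XOR data, `mse_loss`, config path, and the 5-generations-per-cycle schedule are all illustrative assumptions:

```python
import neat

# Toy supervised task; replace with your own data and loss.
XOR = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
       ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]

def mse_loss(net):
    return sum((net.activate(x)[0] - y) ** 2 for x, y in XOR) / len(XOR)

def eval_genomes(genomes, config):
    for _, genome in genomes:
        net = neat.nn.FeedForwardNetwork.create(genome, config)
        genome.fitness = -mse_loss(net)  # NEAT maximizes fitness

config = neat.Config(neat.DefaultGenome, neat.DefaultReproduction,
                     neat.DefaultSpeciesSet, neat.DefaultStagnation,
                     'config-feedforward')  # path to your NEAT config file
pop = neat.Population(config)

for cycle in range(10):
    pop.run(eval_genomes, 5)  # evolve for a few generations
    # Fine-tune the freshly produced offspring in place before the next
    # cycle evaluates them. In practice you'd likely restrict this to
    # elites, since finite differences are expensive per genome.
    for genome in pop.population.values():
        tuned = finetune_genome(genome, config, mse_loss, steps=3)
        for key, conn in tuned.connections.items():
            genome.connections[key].weight = conn.weight
```

Writing the tuned weights back into the existing genome objects (rather than swapping in copies) keeps the species bookkeeping consistent between `run` calls.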
The first relevant paper for this issue I could think of: Differentiable Pattern Producing Networks (Fernando et al., 2016, "Convolution by Evolution").
Great minds think alike. You guys are awesome; I came up with the same idea half a year ago after I read the NEAT paper, but I've hardly had time to implement it. Would you like to work together on this?
I think it needs to be implemented by someone
I second that. Backpropagation would be a valuable addition to this library.