crisbodnar / TensorFlow-NEAT

TensorFlow Eager implementation of NEAT and Adaptive HyperNEAT
Apache License 2.0

Optimizing NEAT with backpropagation? #2

Open winatawelly opened 5 years ago

winatawelly commented 5 years ago

Do you think it is possible to backpropagate through the NEAT result? If so, what do you think the best way(s) to do that would be? Thanks!

crisbodnar commented 5 years ago

Yes. As long as the evolved network is differentiable, this can be done. It is no different from optimising a regular neural network, so you should be able to find plenty of resources online for doing this.
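
For concreteness, here is a minimal sketch of that idea, assuming the evolved genome has been unrolled into a fixed, differentiable forward pass whose connection weights are stored as `tf.Variable`s. The topology, the `forward` function, and all names below are illustrative, not part of the TensorFlow-NEAT API, and the snippet uses TF 2 / eager-style calls:

```python
import tensorflow as tf

# Hypothetical example: suppose evolution produced a small 2-4-1 topology.
# Its connection weights become the starting point for gradient descent.
w_hidden = tf.Variable(tf.random.normal([2, 4]))   # evolved input->hidden weights
b_hidden = tf.Variable(tf.zeros([4]))
w_out = tf.Variable(tf.random.normal([4, 1]))      # evolved hidden->output weights
b_out = tf.Variable(tf.zeros([1]))

def forward(x):
    # Forward pass through the evolved topology (must be differentiable).
    h = tf.nn.tanh(x @ w_hidden + b_hidden)
    return tf.nn.sigmoid(h @ w_out + b_out)

# XOR as a toy supervised objective standing in for the fitness function.
x = tf.constant([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = tf.constant([[0.], [1.], [1.], [0.]])

optimizer = tf.keras.optimizers.Adam(learning_rate=0.05)
variables = [w_hidden, b_hidden, w_out, b_out]

for step in range(500):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(forward(x) - y))
    grads = tape.gradient(loss, variables)
    optimizer.apply_gradients(zip(grads, variables))
```

The only NEAT-specific step is turning the evolved genome into a forward pass like `forward` above; once the weights are trainable variables, `tf.GradientTape` and any standard optimizer handle the rest.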