winatawelly opened this issue 5 years ago (Open)
Do you think it is possible to backprop through the NEAT result? If so, what do you think the best way(s) to do that would be? Thanks!
Yes. As long as the evolved network is differentiable (i.e. its activation functions are differentiable), this can be done. It is no different from optimising a regular neural network by gradient descent, so you should find plenty of resources online for doing this.
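A minimal sketch of what that looks like: take the evolved topology (a feed-forward DAG of weighted connections), run a forward pass in topological order, and apply the chain rule backwards to get weight gradients. The genome below is a made-up toy example, not the output of any real NEAT run, and the helper names are illustrative rather than part of any NEAT library's API.

```python
import math

# Hypothetical evolved genome: a tiny feed-forward DAG.
# Nodes 0, 1 are inputs; node 2 is hidden; node 3 is the output.
connections = {
    (0, 2): 0.5, (1, 2): -0.3,  # inputs -> hidden
    (0, 3): 0.8, (2, 3): 1.2,   # input / hidden -> output
}
topo_order = [0, 1, 2, 3]       # evaluation order of the DAG
inputs, output_node = [0, 1], 3

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w):
    """Evaluate the network; return activations of every node."""
    a = {0: x[0], 1: x[1]}
    for n in topo_order:
        if n in a:
            continue
        z = sum(a[src] * w[(src, dst)] for (src, dst) in w if dst == n)
        a[n] = sigmoid(z)
    return a

def backward(a, w, target):
    """Backprop MSE loss L = 0.5 * (a_out - target)^2 through the DAG."""
    grad_a = {output_node: a[output_node] - target}  # dL/da at the output
    grad_w = {}
    for n in reversed(topo_order):
        if n in inputs or n not in grad_a:
            continue
        dz = grad_a[n] * a[n] * (1.0 - a[n])  # sigmoid' = a * (1 - a)
        for (src, dst) in w:
            if dst == n:
                grad_w[(src, dst)] = dz * a[src]
                grad_a[src] = grad_a.get(src, 0.0) + dz * w[(src, dst)]
    return grad_w

def loss(w, x, target):
    return 0.5 * (forward(x, w)[output_node] - target) ** 2

# Plain gradient descent on the evolved weights.
x, target, lr = [1.0, 0.5], 1.0, 0.5
loss_before = loss(connections, x, target)
for _ in range(200):
    a = forward(x, connections)
    for k, g in backward(a, connections, target).items():
        connections[k] -= lr * g
loss_after = loss(connections, x, target)
```

In practice you would usually convert the genome into a graph in an autodiff framework instead of writing the backward pass by hand, but the principle is the same: NEAT picks the topology, backprop fine-tunes the weights.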