wboler05 / pso_neural_net


PSO and BP Combined #14

Closed wboler05 closed 7 years ago

wboler05 commented 7 years ago

Last night, I read [1], which discusses combining PSO with RNNs. The authors found a tradeoff between PSO and backpropagation: PSO is good at finding a global maximum/minimum but poor at fine-tuning the error, while backpropagation fine-tunes the error well but gets trapped in local maxima/minima. Their novel idea was to first train the RNN with PSO for a fixed number of epochs/generations (e.g., 50), and then finish the training with backpropagation.

I believe it would be beneficial to implement backpropagation in the training of the neural net, in combination with PSO, as a final pass to fine-tune the accuracy of our best global solution.
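A minimal sketch of the two-phase idea, not tied to this repo's code: PSO explores globally for a fixed number of generations, then a gradient step fine-tunes from the swarm's best position. The toy quadratic loss, the particle counts, and the PSO coefficients (inertia 0.7, cognitive/social 1.5) are all illustrative assumptions standing in for the real network error and backpropagation.

```python
import numpy as np

def loss(w, target):
    # Stand-in for the network error: squared distance to a known optimum.
    return np.sum((w - target) ** 2)

def grad(w, target):
    # Gradient of the toy loss; in the real system this would come from backprop.
    return 2.0 * (w - target)

def pso_then_gradient(target, dim=5, n_particles=20, pso_gens=50,
                      gd_steps=200, lr=0.05, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([loss(p, target) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    gbest_val = pbest_val.min()

    # Phase 1: PSO for a fixed number of generations (global search).
    w_inertia, c1, c2 = 0.7, 1.5, 1.5  # illustrative coefficients
    for _ in range(pso_gens):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        vel = (w_inertia * vel
               + c1 * r1 * (pbest - pos)
               + c2 * r2 * (gbest - pos))
        pos = pos + vel
        vals = np.array([loss(p, target) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if pbest_val.min() < gbest_val:
            gbest_val = pbest_val.min()
            gbest = pbest[np.argmin(pbest_val)].copy()

    pso_loss = gbest_val

    # Phase 2: gradient-based fine-tuning from the swarm's best solution.
    w = gbest.copy()
    for _ in range(gd_steps):
        w -= lr * grad(w, target)

    return pso_loss, loss(w, target)
```

On this toy loss, the fine-tuning phase drives the error well below what the swarm alone reaches, which is the behavior the paper reports for the RNN case.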

[1] P. Xiao, G. K. Venayagamoorthy and K. A. Corzine, "Combined Training of Recurrent Neural Networks with Particle Swarm Optimization and Backpropagation Algorithms for Impedance Identification," 2007 IEEE Swarm Intelligence Symposium, Honolulu, HI, 2007, pp. 9-15. doi: 10.1109/SIS.2007.368020 URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4223149&isnumber=4223144

wboler05 commented 7 years ago

We have decided not to pursue this route.