tkaczenko / GradientDescent

University project to optimize function using gradient descent: Web app based on Flask

Using scipy's genetic algorithm for initial parameter estimation in gradient descent #1

Open zunzun opened 7 years ago

zunzun commented 7 years ago

I see you are writing Python code for gradient descent on GitHub. A general problem for gradient descent and other non-linear algorithms - particularly for more complex equations - is the choice of initial parameters to start the "descent" in error space. Without good starting parameters, the algorithm may stop in a local error minimum. For this reason the scipy authors have added a genetic algorithm for initial parameter estimation, whose result can then be handed to gradient descent. The module is named scipy.optimize.differential_evolution.
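Here is a minimal sketch of that workflow (not code from either of our repositories): differential_evolution searches the parameter bounds globally, and its best result becomes the starting point for a local fit. The model function, synthetic data, and bounds below are hypothetical placeholders.

```python
import numpy as np
from scipy.optimize import differential_evolution, curve_fit

def model(x, a, b, c):
    # hypothetical non-linear model: exponential decay plus an offset
    return a * np.exp(-b * x) + c

# synthetic data standing in for real measurements
rng = np.random.default_rng(0)
x_data = np.linspace(0.0, 4.0, 50)
y_data = model(x_data, 2.5, 1.3, 0.5) + rng.normal(0.0, 0.05, x_data.size)

def sum_of_squared_error(params):
    # objective for the genetic algorithm: total squared residual
    return np.sum((y_data - model(x_data, *params)) ** 2)

# search ranges for a, b, c; in practice derive these from the data
bounds = [(0.0, 10.0), (0.0, 10.0), (-5.0, 5.0)]
ga_result = differential_evolution(sum_of_squared_error, bounds, seed=3)

# refine the genetic-algorithm estimate with a local least-squares fit
popt, pcov = curve_fit(model, x_data, y_data, p0=ga_result.x)
print("genetic algorithm estimate:", ga_result.x)
print("refined fit parameters:", popt)
```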

I have used scipy's Differential Evolution genetic algorithm to determine initial parameters for fitting a double Lorentzian peak equation to Raman spectroscopy data of carbon nanotubes, and the results were excellent. The GitHub project, with a test spectroscopy data file, is:

https://github.com/zunzun/RamanSpectroscopyFit
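For reference, a double Lorentzian peak model looks roughly like the sketch below. This is one common parameterization and may not match the exact form used in RamanSpectroscopyFit.

```python
import numpy as np

def double_lorentzian(x, offset, amp1, cen1, wid1, amp2, cen2, wid2):
    # sum of two Lorentzian peaks on a constant baseline;
    # each peak has an amplitude, a center, and a half-width
    peak1 = amp1 * wid1**2 / ((x - cen1)**2 + wid1**2)
    peak2 = amp2 * wid2**2 / ((x - cen2)**2 + wid2**2)
    return offset + peak1 + peak2
```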

I also have a Python Flask curve fitting and surface fitting web app on GitHub at https://github.com/zunzun/FlaskFit - the fitted 3D surface equation plots from matplotlib look really cool.

If you have any questions, please let me know. My background is in nuclear engineering and industrial radiation physics, and I love Python, so I will be glad to help.

tkaczenko commented 7 years ago

Thanks a lot for your answer. I implemented a trivial algorithm for my course project using this amazing language. Genetic algorithms and other such algorithms are really interesting. Maybe I'll return to this one in the future, but I think it will be after I graduate.