meteficha / nonlinear-optimization

Various iterative algorithms for optimization of nonlinear functions.
GNU General Public License v3.0

How about consolidating with other optimization packages? #10

Open msakai opened 4 years ago

msakai commented 4 years ago

I think it would be nice if there were a package like Python's scipy.optimize in the Haskell world, providing various optimization algorithms behind a single one-stop interface. And I think nonlinear-optimization is a good package name under which to accumulate various optimization algorithms.

Also, I'm developing AD-enabling wrappers for nonlinear-optimization (nonlinear-optimization-ad and nonlinear-optimization-backprop). But writing such a wrapper for each optimization algorithm and package is not economical; writing a single wrapper for the one-stop package would be much better.
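For context, the idea behind such a wrapper is to derive the gradient automatically instead of asking the user to supply it by hand. A minimal sketch using the ad package directly (not the wrapper's actual API):

```haskell
import Numeric.AD (grad)

-- Rosenbrock function, written generically over Num so `ad` can differentiate it.
rosenbrock :: Num a => [a] -> a
rosenbrock [x, y] = (1 - x) ^ 2 + 100 * (y - x * x) ^ 2
rosenbrock _      = error "expected two variables"

main :: IO ()
main = print (grad rosenbrock [1, 1 :: Double])
-- At the global minimum (1, 1) the gradient is [0.0, 0.0]
```

An optimizer wrapper can then pass `grad rosenbrock` wherever the underlying algorithm expects a gradient function.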

meteficha commented 4 years ago

I'm not actively developing this package anymore, so if you have any ideas on how to improve it I'm happy to give you carte blanche for it :).

msakai commented 4 years ago

Thank you very much.

I'd like to

  1. add one or more optimization algorithms, and then
  2. add a module to abstract them.

At the moment, I'm considering adding L-BFGS-B for (1), because I personally often use CG and L-BFGS-B in other languages.

Note that there are already two bindings for L-BFGS-B on Hackage, but I don't think either of them is widely used.

I'll make pull requests for (1) and (2) when I'm ready.
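To illustrate what (2) might look like, here is a hypothetical sketch of a unified front end. All names here are assumptions for illustration, not an agreed design; the gradient-descent body is only a stand-in for the real per-algorithm backends:

```haskell
-- Hypothetical unified interface (illustrative only; not the actual design).
data Method = CG | LBFGSB

data Result = Result
  { solution :: [Double]  -- point found
  , value    :: Double    -- objective at that point
  }

-- One entry point that would dispatch to the per-algorithm packages.
-- Placeholder: plain gradient descent stands in for every method here.
minimize :: Method
         -> ([Double] -> Double)    -- objective
         -> ([Double] -> [Double])  -- gradient
         -> [Double]                -- initial point
         -> Result
minimize _ f g x0 = Result x' (f x')
  where
    x'     = iterate step x0 !! 1000
    step x = zipWith (\xi gi -> xi - 0.01 * gi) x (g x)
```

For example, `minimize CG (\[x] -> (x - 3)^2) (\[x] -> [2 * (x - 3)]) [0]` would converge to a solution near `[3.0]`.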

meteficha commented 4 years ago

Sounds like a good improvement!

I’ve invited you as a collaborator to the GitHub project. Let me know what your username is on Hackage and I’ll give you uploading rights as well. (:

riaqn commented 3 years ago

Hey folks, any news on this? I would be very excited to see some progress in numerical optimization in Haskell; considering we have cool packages like ad, it's a shame that we don't have more NLP (nonlinear programming) libraries in Haskell.

Also, as I'm quite new to this field, is the algorithm in this lib (HagerZhang05) considered "good and effective" or even standard?

msakai commented 1 year ago

@meteficha @riaqn

After three years, I finally went ahead with the plan.

Here are my initial attempts:

They currently support L-BFGS (through the lbfgs package), CG_DESCENT (through this nonlinear-optimization package), and a naive Newton's method as optimization algorithms, and they can be combined with ad and backprop.
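As a rough illustration of what "naive Newton's method" means (a toy 1-D version for minimization, not the package's actual implementation):

```haskell
-- Toy 1-D Newton iteration for minimization: x <- x - f'(x) / f''(x).
newtonMin :: (Double -> Double)  -- first derivative f'
          -> (Double -> Double)  -- second derivative f''
          -> Double              -- starting point
          -> Int                 -- number of iterations
          -> Double
newtonMin f' f'' x0 n = iterate step x0 !! n
  where step x = x - f' x / f'' x

main :: IO ()
main = print (newtonMin (\x -> 2 * (x - 3)) (const 2) 0 1)
-- For the quadratic (x - 3)^2, a single Newton step lands exactly on the minimizer 3.0
```

In a real library the derivatives would come from ad or backprop rather than being written by hand, which is exactly what the wrappers above automate.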

I said that I would extend the nonlinear-optimization package, but I changed the plan. One reason was licensing: I wanted to keep the GPL-licensed code (e.g. CG_DESCENT and nonlinear-optimization) as optional dependencies.

msakai commented 1 year ago

@meteficha

> I’ve invited you as a collaborator to the GitHub project. Let me know what your username is on Hackage and I’ll give you uploading rights as well. (:

Sorry, I forgot to reply for three years. :bow:

My Hackage username is MasahiroSakai.

Although I changed my plan, I am still willing to do some maintenance work on this package, and I am delighted to become one of its maintainers on Hackage.

For now, I would like to get my recent PRs (#11, #12, #13) released on Hackage as well.