Happy to share -- link. This was obviously more of a learning exercise than an attempt at writing efficient code, but I think it is enough to build off of.
I think NM is definitely the right place to start -- it's simple and relatively robust. However, adding a method that allows for box constraints (such as L-BFGS-B) could be very useful.
Thank you for the notebook @cc7768 . I worked my way through it and found it very informative! How would you go about adding bounds on the region over which the NM algorithm searches, if doing so is even possible? It seems to me (I may be mistaken though) that the only sorts of bounds you can place are bounds on the parameters of the NM algorithm itself.
I've been looking around in the scipy optimise libraries, and the algorithms that allow you to specify bounds are L-BFGS-B, TNC and SLSQP, yet they're not quite as robust since they rely on derivative information.
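For example, here is a quick sketch of how L-BFGS-B takes bounds in scipy (the quadratic objective is just for illustration):

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # simple quadratic with minimum at (1.0, 2.5)
    return (x[0] - 1.0)**2 + (x[1] - 2.5)**2

# L-BFGS-B accepts box constraints through the `bounds` argument;
# gradients are approximated by finite differences if not supplied
res = minimize(f, x0=np.array([2.0, 0.0]), method='L-BFGS-B',
               bounds=[(0.0, None), (0.0, 3.0)])
print(res.x)  # roughly [1.0, 2.5]
```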
You are correct that the way I coded up the NM algorithm didn't allow for any constraints (box or otherwise) on the solution. Others have actually found ways to include box constraints -- I haven't read this closely (but I have used their code successfully), but NLopt's NM algorithm allows one to impose box constraints. You can find the details and a couple of citations on their algorithms page (NM is about 3/4 of the way down).
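For what it's worth, imposing box constraints on NLopt's NM from Python looks roughly like this (the objective and bounds here are purely illustrative):

```python
import numpy as np
import nlopt

n = 2
opt = nlopt.opt(nlopt.LN_NELDERMEAD, n)
opt.set_lower_bounds([0.0, 0.0])
opt.set_upper_bounds([2.0, 2.0])

def f(x, grad):
    # grad is ignored -- LN_NELDERMEAD is derivative-free
    return (x[0] - 1.0)**2 + (x[1] - 0.5)**2

opt.set_min_objective(f)
opt.set_xtol_rel(1e-8)
x_opt = opt.optimize(np.array([1.5, 1.5]))
print(x_opt, opt.last_optimum_value())
```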
I definitely recommend starting by seeing if you can get box constraints on the NM algorithm just because it is relatively robust and is simple to write. Accessing these other algorithms would be nice in the future, but, if someone wants one of those, they (1) might be able to call the algorithm from elsewhere or (2) could contribute to QE :smile: (I also keep secretly hoping that someone in the scientific python community will start a package oriented towards jitted optimization algorithms for numba functions now that you can pass functions as arguments)
I think a lot of people are hoping that, @cc7768.
In a month or so they will at least be able to use quantecon.optimize, which will provide the basics --- and perhaps a starting point for a dedicated library.
Thanks for contributing your code.
Hi @cc7768 and @jstac
I've refactored the original Nelder-Mead and jitted it, but I am quite sceptical about whether it works properly (it does give the same solutions as the original one). I've pushed up a notebook at this link.
I've written some test cases at the bottom of it and also included a comparison against scipy's optimise function. Note that this is still minimisation, as it was easier for me to compare and debug that way. I would like some help on whether this is actually giving the right solutions. Also, I noticed that it fails for scalar maximisation routines @cc7768
Additionally, I've added in bound constraints, which work as follows: at each point, check whether any of the decision variables are out of bounds and, if so, shift them to the closest bound. I'd like some thoughts on this too, thanks.
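Something along these lines (a simplified sketch rather than the exact code in the notebook):

```python
import numpy as np
from numba import njit

@njit
def clamp_to_bounds(x, lower, upper):
    # project each decision variable back into [lower[i], upper[i]]:
    # any coordinate outside its box is moved to the nearest bound
    out = np.empty_like(x)
    for i in range(x.shape[0]):
        if x[i] < lower[i]:
            out[i] = lower[i]
        elif x[i] > upper[i]:
            out[i] = upper[i]
        else:
            out[i] = x[i]
    return out
```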
Cheers!
> Additionally, I've added in bound constraints, which work as follows: at each point, check whether any of the decision variables are out of bounds and, if so, shift them to the closest bound. I'd like some thoughts on this too, thanks.
Projection like that can work, and is basically the way it's done in NLopt and elsewhere. You have to be careful though, as you might end up in a situation where your simplex collapses into a lower-dimensional subspace. This means that you have a coordinate that doesn't change unless you end up in the shrink step of Nelder-Mead's algorithm. Even worse, if that bound is at 0, a shrink step can't even help you.
But, it's probably good enough for many problems.
Is there a discussion somewhere about why you want this kind of functionality in QuantEcon.py/.jl rather than just calling out to SciPy/(Optim/NLopt).jl? It seems like a big maintenance burden, especially if you don't just implement vanilla BFGS with a backtracking line search or similar.
Edit: that said, you're more than welcome to copy our Nelder-Mead with adaptive parameters (that is, the values of the Greek letters depend on len(x)) from Optim.jl.
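For reference, the adaptive parameters (from Gao and Han, 2012) in Python terms:

```python
def adaptive_nm_parameters(n):
    # Gao-Han (2012) dimension-dependent Nelder-Mead parameters;
    # for n = 2 they coincide with the classic (1, 2, 0.5, 0.5) values
    alpha = 1.0                     # reflection
    beta = 1.0 + 2.0 / n            # expansion
    gamma = 0.75 - 1.0 / (2.0 * n)  # contraction
    delta = 1.0 - 1.0 / n           # shrink
    return alpha, beta, gamma, delta
```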
I would also like to understand why quantecon should have its own optimize sub-package as opposed to contributing relevant code upstream to SciPy (or the Julia equivalent). The main reason I came up with is that quantecon has numba as a dependency, which allows for substantial efficiency gains over scipy functionality.
Thanks for your comments guys. @davidrpugh It's all about Numba. I want some lightweight optimization routines that can be jit-compiled inside Bellman operators and the like. Now that we have them I'll combine them with @albop's jitted interpolation routines to numba-fy all our dynamic programming code on the lecture site.
Speed in the routine itself is less important than being able to jit compile the calling functions that contain these routines (value function iteration, etc.).
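To give a rough sketch of the pattern (assuming our jitted brent_max with its (maximizer, maximum, info) return convention; the log-utility objective and grid are just placeholders):

```python
import numpy as np
from numba import njit
from quantecon.optimize import brent_max

beta = 0.96
grid = np.linspace(1e-6, 4.0, 100)

@njit
def objective(c, y, v, grid):
    # consumption utility today plus discounted continuation value,
    # with the value function interpolated linearly on the grid
    return np.log(c) + beta * np.interp(y - c, grid, v)

@njit
def bellman_operator(v, grid):
    # because brent_max is itself jit-compiled, the whole operator
    # (loop and all) compiles to machine code
    v_new = np.empty_like(v)
    for i in range(len(grid)):
        y = grid[i]
        c_star, v_max, info = brent_max(objective, 1e-10, y,
                                        args=(y, v, grid))
        v_new[i] = v_max
    return v_new
```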
@pkofod @chrishyland I think any multivariate optimizer we implement should be plain vanilla.
> Speed in the routine itself is less important than being able to jit compile the calling functions that contain these routines (value function iteration, etc.).
Yes, this certainly makes sense in Python, until SciPy -- or maybe an alternative -- comes up with general-purpose optimization code in numba.
Hi @pkofod
I wanted to ask whether you had any guidance on implementing constraints in the Nelder-Mead optimisation routine? In particular, how does one ensure that constraints are satisfied? I'm not certain whether this can be done in a way similar to what I did for the bounds, as mentioned earlier. Sorry for the vague and open question.
What kinds of constraints are we talking about?
Just your usual equality and inequality constraints. Whatever is easier to implement first!
I thought we decided that box constraints (a_n <= x_n <= b_n for all n) were enough last time we discussed this? If you think we need more, then it would have to be inequality constraints, which would also cover box constraints as a special case.
Ah right. The thing is, given what @pkofod mentioned earlier about potential issues with the projection approach, implementing inequality constraints would give us both the benefit of box constraints and of general inequality constraints, which will come in quite handy in the future.
Sure thing. Sounds good.
It would be nice to have at least one multivariate maximizer, akin to our Brent based scalar routine in
https://github.com/QuantEcon/QuantEcon.py/blob/master/quantecon/optimize/scalar_maximization.py
This is a topic I know very little about but I'm thinking of a local routine such as this one:
https://docs.scipy.org/doc/scipy/reference/optimize.minimize-neldermead.html
Or perhaps this one, which seems to have the option to supply a bounding box:
https://docs.scipy.org/doc/scipy/reference/optimize.minimize-lbfgsb.html
One problem is that some of the SciPy code is wrapped C / Fortran.
@cc7768 I seem to remember you had some Nelder-Mead code. Might you be willing to share? Would this be a good starting point? @sglyon @albop @oyamad Any thoughts?
CC @chrishyland @spvdchachan
I should note that @chrishyland and @spvdchachan will do the leg work on this so from others I'm just hoping for a few pointers.