NumEconCopenhagen / IntroProg-lectures

Introductory programming lectures
MIT License

Different solutions with scipy.optimize.minimize #8

Closed: johanoelgaard closed this issue 8 months ago

johanoelgaard commented 1 year ago

We are trying to solve Q3 and believe we have done so. However, when solving the continuous problem with scipy.optimize.minimize, we get different solutions depending on whether or not we explicitly state `method='SLSQP'`. The same happens if we add a constraint to the problem, even though the solution found without the constraint does not violate the constraint itself. Our constraint is `constraints = ({'type': 'ineq', 'fun': lambda x: 24 - x[0] - x[1]}, {'type': 'ineq', 'fun': lambda x: 24 - x[2] - x[3]})`. Why do different methods give drastically different solutions, and is it possible to know what the 'right' solution is without checking the actual utility in each of the solutions?
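For reference, a minimal sketch of the setup being described; `obj` is a hypothetical placeholder, since the actual Q3 utility is not shown in this thread:

```python
import numpy as np
from scipy import optimize

def obj(x):
    # hypothetical placeholder for (minus) the Q3 utility; its unconstrained
    # optimum at (8, 8, 8, 8) satisfies both time constraints
    return (x[0] - 8)**2 + (x[1] - 8)**2 + (x[2] - 8)**2 + (x[3] - 8)**2

constraints = ({'type': 'ineq', 'fun': lambda x: 24 - x[0] - x[1]},
               {'type': 'ineq', 'fun': lambda x: 24 - x[2] - x[3]})
x0 = np.array([6.0, 6.0, 6.0, 6.0])  # arbitrary initial guess

res_default = optimize.minimize(obj, x0)  # no method given: scipy picks BFGS here
res_slsqp = optimize.minimize(obj, x0, method='SLSQP', constraints=constraints)
print(res_default.x, res_slsqp.x)
```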

AskerNC commented 1 year ago

I think you're maybe getting caught in local optima. I've been tinkering a bit with solving the problem and noticed that this happens sometimes (especially with SLSQP). Using Nelder-Mead seems to work best for me. (You then have to add a penalty to your objective function when a constraint is violated, since Nelder-Mead doesn't take explicit constraints; see the sketch below.)
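A minimal sketch of that penalty approach, assuming a hypothetical placeholder objective in place of the actual Q3 utility:

```python
from scipy import optimize

def obj(x):
    # hypothetical placeholder for the Q3 objective (minus utility)
    return (x[0] - 8)**2 + (x[1] - 8)**2 + (x[2] - 8)**2 + (x[3] - 8)**2

def obj_penalized(x):
    # Nelder-Mead ignores the constraints argument, so constraint violations
    # are penalized directly inside the objective instead
    penalty = 1000.0 * max(x[0] + x[1] - 24, 0.0)   # first time constraint
    penalty += 1000.0 * max(x[2] + x[3] - 24, 0.0)  # second time constraint
    return obj(x) + penalty

res = optimize.minimize(obj_penalized, [6.0, 6.0, 6.0, 6.0], method='Nelder-Mead')
print(res.x, res.fun)
```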

It's impossible to know with certainty that you have the right solution. What I would do is this: try different initial guesses in your solver, and perhaps different solvers, and then choose the run that minimizes your objective function. If you did `results = optimize.minimize(...)`, you can read off the achieved value of the objective function with `results.fun` ('fun' for function), and then select the run with the lowest value.
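A sketch of that multistart strategy, using the same hypothetical placeholder objective; the number of draws and the sampling range are arbitrary choices here:

```python
import numpy as np
from scipy import optimize

def obj(x):
    # hypothetical placeholder for the Q3 objective
    return (x[0] - 8)**2 + (x[1] - 8)**2 + (x[2] - 8)**2 + (x[3] - 8)**2

constraints = ({'type': 'ineq', 'fun': lambda x: 24 - x[0] - x[1]},
               {'type': 'ineq', 'fun': lambda x: 24 - x[2] - x[3]})

best = None
rng = np.random.default_rng(1234)
for _ in range(20):
    x0 = rng.uniform(0, 24, size=4)  # random initial guess in the relevant range
    results = optimize.minimize(obj, x0, method='SLSQP', constraints=constraints)
    if results.success and (best is None or results.fun < best.fun):
        best = results  # keep the run with the lowest objective value
print(best.x, best.fun)
```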

AskerNC commented 1 year ago

Another tip would be to use the solutions from the discrete problem as initial guesses, as they are likely to be close to the true answer.
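A sketch of that idea; the grid here is hypothetical (the actual discrete problem in Q3 may use a different one), and the objective is again a placeholder:

```python
import itertools
import numpy as np
from scipy import optimize

def obj(x):
    # hypothetical placeholder for the Q3 objective
    return (x[0] - 8)**2 + (x[1] - 8)**2 + (x[2] - 8)**2 + (x[3] - 8)**2

constraints = ({'type': 'ineq', 'fun': lambda x: 24 - x[0] - x[1]},
               {'type': 'ineq', 'fun': lambda x: 24 - x[2] - x[3]})

# coarse discrete search over feasible grid points
grid = np.arange(0, 25, 4.0)
feasible = (x for x in itertools.product(grid, repeat=4)
            if x[0] + x[1] <= 24 and x[2] + x[3] <= 24)
x0 = min(feasible, key=obj)  # best discrete point becomes the initial guess

res = optimize.minimize(obj, x0, method='SLSQP', constraints=constraints)
print(res.x, res.fun)
```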

AskerNC commented 1 year ago

Also: the method COBYLA, which you have not learned about, seems to work better than SLSQP for me. It takes explicit constraints.
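A minimal sketch with COBYLA, again with a placeholder objective; COBYLA is derivative-free and accepts 'ineq' constraints directly:

```python
from scipy import optimize

def obj(x):
    # hypothetical placeholder for the Q3 objective
    return (x[0] - 8)**2 + (x[1] - 8)**2 + (x[2] - 8)**2 + (x[3] - 8)**2

constraints = ({'type': 'ineq', 'fun': lambda x: 24 - x[0] - x[1]},
               {'type': 'ineq', 'fun': lambda x: 24 - x[2] - x[3]})

res = optimize.minimize(obj, [6.0, 6.0, 6.0, 6.0], method='COBYLA',
                        constraints=constraints)
print(res.x, res.fun)
```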

AskerNC commented 1 year ago

My (hopefully) final comment: Jeppe suggested this might be a scaling issue due to numerical inaccuracy for small numbers. If you multiply the return of your objective function by something like 100 (which is a monotone transformation, so the answer will be the same), you'll get consistent answers across the different methods. But great question! It's good that you're testing your solutions.
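A sketch of that rescaling, with a placeholder objective that deliberately returns very small values:

```python
from scipy import optimize

def obj(x):
    # hypothetical placeholder whose values are very small, mimicking the
    # numerical-accuracy issue described above
    return 1e-6 * ((x[0] - 8)**2 + (x[1] - 8)**2 + (x[2] - 8)**2 + (x[3] - 8)**2)

def obj_scaled(x):
    # multiplying by a positive constant is a monotone transformation:
    # the argmin is unchanged, but gradient and termination tolerances
    # are no longer dominated by floating-point noise
    return 100.0 * obj(x)

res = optimize.minimize(obj_scaled, [6.0, 6.0, 6.0, 6.0], method='SLSQP')
print(res.x)
```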