joaquimg / BilevelJuMP.jl

Bilevel optimization in JuMP

Implementation of Iterative ProductMode() #184

Open LukasBarner opened 2 years ago

LukasBarner commented 2 years ago

As discussed on Discourse, an iterative solution strategy for ProductMode() would be nice. I have given this a first (and probably naive) shot. Changes include:

I would be happy about a brief discussion, especially in case anyone sees a more elegant approach. Warnings/errors are not yet implemented, and the code is not thoroughly tested on my end. This is meant more to test the waters than as a final approach :D It's my first PR, so I would be happy about some feedback!
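For illustration, here is a rough sketch of the naive outer loop a user could already script by hand: solve the regularized problem repeatedly with a shrinking ProductMode() parameter, starting each solve from scratch. The toy problem and the ε schedule are just placeholders; the point of the PR is to run this iteration inside BilevelJuMP, reusing the loaded solver and warm starts instead of rebuilding the model each time.

```julia
using BilevelJuMP, JuMP, Ipopt

# Build and solve a small bilevel toy problem with a given regularization ε.
function solve_regularized(ε)
    model = BilevelModel(Ipopt.Optimizer, mode = BilevelJuMP.ProductMode(ε))
    set_optimizer_attribute(model, "print_level", 0)
    @variable(Upper(model), 0 <= y <= 8)
    @variable(Lower(model), x >= 0)
    @objective(Upper(model), Min, 3x + y)
    @objective(Lower(model), Min, -x)
    @constraint(Lower(model), x + y <= 8)
    optimize!(model)
    return value(x), value(y), objective_value(model)
end

# Naive outer loop: re-solve from scratch with a shrinking regularization.
for ε in (1e-2, 1e-4, 1e-6, 1e-8)
    x, y, obj = solve_regularized(ε)
    @info "ProductMode iteration" ε x y obj
end
```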

LukasBarner commented 2 years ago

I edited a bit of the code to make it more concise and a little more performant.

For practical usage, it is important to set the right solver attributes before solving the model. I guess this can hardly be done inside BilevelJuMP ex ante, as the code should work independently of the solver. For example, with Ipopt the following settings should be made:

```julia
set_optimizer_attribute(model, "warm_start_init_point", "yes")
set_optimizer_attribute(model, "warm_start_bound_push", 1e-12)
set_optimizer_attribute(model, "warm_start_bound_frac", 1e-12)
set_optimizer_attribute(model, "warm_start_slack_bound_frac", 1e-12)
set_optimizer_attribute(model, "warm_start_slack_bound_push", 1e-12)
set_optimizer_attribute(model, "warm_start_mult_bound_push", 1e-12)
```

Additionally, not printing the solver log gives a higher-level overview:

```julia
set_optimizer_attribute(model, "print_level", 0)
```

In the future, some of the warm-start-related settings may be applied automatically for popular solvers, but I think leaving this to the user should be sufficient for now :D
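As a sketch of what "automatically for popular solvers" could look like (a hypothetical helper, not part of BilevelJuMP, and it assumes JuMP's solver_name works for the model in question):

```julia
using JuMP

# Hypothetical convenience helper: apply the Ipopt warm-start settings from
# above only if the attached optimizer actually is Ipopt.
function maybe_set_ipopt_warmstart!(model)
    occursin("Ipopt", solver_name(model)) || return nothing
    for (attr, val) in (
        "warm_start_init_point" => "yes",
        "warm_start_bound_push" => 1e-12,
        "warm_start_bound_frac" => 1e-12,
        "warm_start_slack_bound_frac" => 1e-12,
        "warm_start_slack_bound_push" => 1e-12,
        "warm_start_mult_bound_push" => 1e-12,
    )
        set_optimizer_attribute(model, attr, val)
    end
    return nothing
end
```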

LukasBarner commented 2 years ago

Hey @joaquimg, I have now also implemented a default solution strategy that does not iteratively use `copy_to` (this seems to be considerably better from a performance point of view). I have tested my code on bigger problems on a cluster computer, and it allowed me to solve problems that were not solvable with sufficiently low regularization via the one-shot strategy. There are still a few points that I'm not really happy with (for example the `comp_idxs_in_solver` returned by `get_solver_comp_idxs`), but after some testing I believe the code works as intended and doesn't break anything existing. In case you are testing with Ipopt, note that there are still a few problems.

LukasBarner commented 2 years ago

Think this should be good to go for now. @joaquimg, could you give some feedback?

joaquimg commented 2 years ago

Will look into it!

LukasBarner commented 2 years ago

This now includes tests and a mutable log as well. Note that I had to update the Ipopt compat entry, because Ipopt.jl did not have methods to alter the rhs prior to https://github.com/jump-dev/Ipopt.jl/pull/336.
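For context, this is the kind of in-place modification the iterative mode relies on. A minimal JuMP-level sketch (the toy model and values are just for illustration, not from the PR) that tightens the right-hand side of a regularized product constraint and re-solves without rebuilding the model:

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, x >= 0)
@variable(model, y >= 0)
# Stand-in for a regularized complementarity product x * y <= ε.
@constraint(model, reg, x * y <= 1e-2)
@objective(model, Min, (x - 1)^2 + (y - 1)^2)
optimize!(model)

for ε in (1e-4, 1e-6, 1e-8)
    set_normalized_rhs(reg, ε)  # alter the rhs in place instead of copying the model
    optimize!(model)            # re-solve (the warm-start settings above help here)
end
```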

joaquimg commented 2 years ago

A formatter was added to master, so you can just run it and it will fix part of the style.

LukasBarner commented 2 years ago

Can you check if this is better?

LukasBarner commented 1 year ago

Just realized that the following line is problematic if different types of complements (greater/less) exist within the same model: https://github.com/LukasBarner/BilevelJuMP.jl/blob/88561d99bb8045feef1926bf8f86b9733d7acdb7/src/jump.jl#L1060

To keep everything as simple as possible, I flipped the second complementarity constraint for conic problems. In case this is not a solution you prefer, I could also do something else (like a conditional, or something more type-stable).
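For intuition (my own illustration, not code from the PR): when one regularized complementarity product is bounded from above and another from below, multiplying the second by -1 puts both into the same LessThan form, so a single code path can update all the regularization right-hand sides:

```math
q_1(x) \le \varepsilon, \quad q_2(x) \ge -\varepsilon
\quad\Longleftrightarrow\quad
q_1(x) \le \varepsilon, \quad -q_2(x) \le \varepsilon
```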

Further, the previous attempt did not work correctly with slacks, which is also fixed now.

LukasBarner commented 1 year ago

@joaquimg, is there anything I can do/fix atm?