AI4OPT / OPFGenerator

Instance generator for OPF problems

Update loads using `set_normalized_rhs` #56

Closed. klamike closed this PR 5 months ago.

klamike commented 7 months ago
  1. Adds an `update!` method for each OPF formulation that changes the RHS of the power balance constraints (a sketch of the pattern follows this list)
  2. Updates `sampler.jl` to call `update!` instead of re-building a model for each seed
  3. Adds a test for `update!` that checks that an updated model has the same optimal objective as a re-built model
  4. Adds a test for `sampler.jl` that checks that the HDF5 outputs exist and that the solutions are OPTIMAL/FEASIBLE_POINT. (Note this takes ~9 minutes on GitHub!)
  5. Adds a test for `update!` that checks that all the normalized RHS values match those of a re-built model with the same data.
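For reference, a minimal sketch of the general pattern behind `update!`. The function name `update_demands!` and the constraint registry name `:kcl_p` are placeholders; the actual constraint references and data layout live in the package's model containers.

```julia
using JuMP

# Hedged sketch, not the actual OPFGenerator implementation: reuse the JuMP
# model and only overwrite the RHS of the nodal power balance constraints.
function update_demands!(model::JuMP.Model, pd::Vector{Float64})
    for (i, c) in enumerate(model[:kcl_p])   # power balance constraint at each bus (placeholder name)
        set_normalized_rhs(c, pd[i])         # overwrite the load at bus i
    end
    return model
end
```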

CI is expected to fail until MathOptInterface 1.28 is released (see MathOptInterface.jl#2452); specifically, test_sampler_script and test_update will fail until then.

TODO:

klamike commented 7 months ago

I was getting some weird behavior with this branch: for the same seed, the Float128 model ended up infeasible after my changes but was feasible before. I should write a test that builds a new model and checks whether it is the same as an updated model. If that passes, the difference must come from rearranging the terms of the power balance constraints?
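Roughly, a sketch of that check. `build_opf_model`, `update!`, `data`, `old_model`, and the `:kcl_p` constraint name are placeholders, not the package's actual entry points.

```julia
using JuMP, Test

# Build one model from scratch on the perturbed data, update another in place,
# then compare the RHS of every power balance constraint.
fresh   = build_opf_model(data)      # hypothetical constructor, re-built from the data
updated = update!(old_model, data)   # previously built model with its RHS overwritten

for (cf, cu) in zip(fresh[:kcl_p], updated[:kcl_p])
    @test normalized_rhs(cf) ≈ normalized_rhs(cu)
end
```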

mtanneau commented 7 months ago

This is weird indeed. Are you able to share the case & seed?

klamike commented 7 months ago

It was with the config.toml in this branch (I edited the one in main to match), with several seeds from 1-10. I believe 9 and 10 both exhibited it.

mtanneau commented 7 months ago

I do not fully follow the logic in the current exp/sampler.jl. It looks like we're solving every OPF model here https://github.com/AI4OPT/OPFGenerator/blob/b1e54101a64ef6d8eb0682579d0abc0e40b8d104/exp/sampler.jl#L101, which happens before we actually start generating data. Do we need to solve all those OPF models up front?

I can add the possibility to perform in-place sampling of new OPF instances. The syntax would be `rand!(rng, opf_sampler, data_to_be_updated)`, which would complement the existing `new_data = rand(rng, opf_sampler)`. This, in turn, would allow us to write something like

rand!(rng, opf_sampler, opf_data)
update!(opf_model, opf_data)
solve!(opf_model)

which would then allow us to move the seed-iteration loop inside the main function.
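A minimal sketch of what such an in-place method could look like. The sampler type `LoadNoiseSampler` and its fields (`spread`, `pd_ref`, `qd_ref`) are placeholders, not OPFGenerator's actual sampler.

```julia
using Random

# Placeholder sampler type; the real sampler in OPFGenerator differs.
struct LoadNoiseSampler
    spread::Float64                    # relative width of the load perturbation
    pd_ref::Dict{String,Float64}       # reference active loads, keyed like data["load"]
    qd_ref::Dict{String,Float64}       # reference reactive loads
end

function Random.rand!(rng::AbstractRNG, s::LoadNoiseSampler, data::Dict)
    # Overwrite the load entries of the existing data dict in place,
    # instead of allocating a new dictionary as `rand(rng, s)` would.
    for (key, load) in data["load"]
        α = 1 + s.spread * (2 * rand(rng) - 1)   # uniform noise in [1 - spread, 1 + spread]
        load["pd"] = s.pd_ref[key] * α
        load["qd"] = s.qd_ref[key] * α
    end
    return data
end
```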

klamike commented 7 months ago

The script is by no means final. That solve was just for my own checks and should be removed prior to merging.

I agree with your syntax suggestion.

klamike commented 7 months ago

I added a test that checks the objective value of an updated model against a freshly created model; it is currently failing (a sketch of the check is below). Not sure why `set_normalized_rhs` isn't working as expected.
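For reference, the shape of the check, with `build_opf_model`, `update!`, `model`, and `data` as placeholders for the package's actual entry points and the perturbed instance data.

```julia
using JuMP, Test

fresh = build_opf_model(data)        # fresh model built from the perturbed data
optimize!(fresh)

update!(model, data)                 # previously built model, RHS updated in place
optimize!(model)

@test isapprox(objective_value(fresh), objective_value(model); rtol = 1e-6)
```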

mtanneau commented 6 months ago

The infeasibility issue appears to be caused by something between JuMP and Clarabel. I opened an issue upstream: https://github.com/oxfordcontrol/Clarabel.jl/issues/160

[EDIT]: the issue is caused by a bug in JuMP: https://github.com/jump-dev/MathOptInterface.jl/issues/2452 Oscar is usually pretty fast at fixing those, but I don't know how long it's going to take.

We will also need to bump the JuMP compat entry once they release a bug fix and tag a new version.

klamike commented 6 months ago

Try comparing the old sampler with this one. On small cases on my Mac, I didn't see much of a speedup. A rough timing sketch is below.
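Something along these lines, with `build_opf_model`, `update!`, and `opf_sampler` as placeholders for the package's actual functions and sampler.

```julia
using Random, JuMP

# Baseline: re-build and solve a fresh model for every seed.
t_rebuild = @elapsed for seed in 1:10
    d = rand(MersenneTwister(seed), opf_sampler)
    m = build_opf_model(d)
    optimize!(m)
end

# This branch: build once, then only update the RHS for each seed.
data  = rand(MersenneTwister(1), opf_sampler)
model = build_opf_model(data)
t_update = @elapsed for seed in 1:10
    rand!(MersenneTwister(seed), opf_sampler, data)   # in-place sampling
    update!(model, data)
    optimize!(model)
end

@show t_rebuild t_update
```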

mtanneau commented 6 months ago

I am putting this PR on hold until the upstream issue gets resolved. This is definitely a feature we want to integrate, but we can't do that given the current bug in JuMP's bridges.

klamike commented 5 months ago

~~@mtanneau This PR is ready for your review. It will be a pain to merge with #77...~~

I need to update this branch still

klamike commented 5 months ago

@mtanneau Ok, now it's ready :)

mtanneau commented 5 months ago

I'm running a full data-generation run locally. If all works fine, we're ready to go

mtanneau commented 5 months ago

LGTM!