Closed rb004f closed 5 years ago
The above proposal should go into bound_propagation() as a special feature. When bound_propagation() runs, m.l_var_tight and m.u_var_tight will be updated. These keep the tightened bounds (computed by any method: bound propagation, BT, PBT, ...) used for constructing the local NLP model and the bounding model.
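As a rough illustration of what "keeping the tightened bounds" means, here is a minimal sketch (in Python, not the solver's actual Julia API; the function name and signature are hypothetical): each tightening pass is folded into the stored bound vectors by intersection, so the stored bounds are always the tightest known.

```python
# Hypothetical sketch: fold a new round of bounds (from any tightening
# method) into the stored tight bounds by intersecting the intervals.
def update_tight_bounds(l_tight, u_tight, l_new, u_new):
    """Keep the tightest known bounds: max of lower bounds, min of uppers."""
    for i in range(len(l_tight)):
        l_tight[i] = max(l_tight[i], l_new[i])
        u_tight[i] = min(u_tight[i], u_new[i])
    return l_tight, u_tight
```

For example, intersecting stored bounds [0, 10] with a newly derived [1, 3] leaves [1, 3]; a looser new interval changes nothing.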
For discrete variables, if ub == lb, replace the variable with a parameter during model construction, exploiting the affine constraint structure.
For continuous variables, if isapprox(ub, lb; atol=10e-8) == true, assign the variable the value ub/2 + lb/2. This might need more discussion.
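The two rules above can be sketched as a single decision function (a Python illustration only; the name `fixed_value` and the flag `is_discrete` are hypothetical, and the tolerance mirrors the `atol=10e-8` mentioned above):

```python
def fixed_value(lb, ub, is_discrete, atol=10e-8):
    """Return a constant to substitute for the variable, or None if it stays free.
    Discrete variables are fixed only on exact equality of bounds; continuous
    variables are fixed at the midpoint when the bounds agree to within atol."""
    if is_discrete:
        return lb if ub == lb else None
    if abs(ub - lb) <= atol:
        return ub / 2 + lb / 2  # midpoint, as proposed above
    return None
```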
Another comment: doing this directly is not as easy as it seems. Since the model is parsed into MPB for the upper-bound search, deleting a variable is difficult because the abstraction layer is fairly raw. But this definitely motivates a different data-structure design with more control over the raw model data.
This can lead to numerical issues. Leave the onus on the user to remove redundancies.
In discussions with @jac0320 we noticed that in some cases bound tightening reduces variables to constants. To improve performance, we could replace those variables with constants.
Suggested architecture: we can reuse our relaxation pattern-matching architecture, which already replaces constraints with their relaxations. Here, we look for any constraint containing a variable whose ub == lb and replace that constraint with the "relaxed" constraint in which the variable is replaced by a constant.
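A minimal sketch of that substitution pass, assuming affine constraints stored as a coefficient map plus a constant term (Python for illustration; the representation and function name are assumptions, not the project's actual code):

```python
# Hypothetical sketch of the suggested pattern-matching pass: scan an affine
# constraint's terms and fold any variable with lb == ub into the constant.
def substitute_fixed_vars(terms, constant, lb, ub):
    """terms: {var: coefficient}; lb/ub: {var: bound}. Returns the rewritten
    terms and the updated constant term of the constraint."""
    new_terms = {}
    for var, coef in terms.items():
        if lb[var] == ub[var]:          # bound tightening reduced var to a constant
            constant += coef * lb[var]  # fold coef * value into the constant term
        else:
            new_terms[var] = coef       # variable remains free
    return new_terms, constant
```

For example, the constraint 2x + 3y + 1 with x fixed at 5 rewrites to 3y + 11, eliminating x from the model entirely.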