lanl-ansi / Alpine.jl

A Julia/JuMP-based Global Optimization Solver for Non-convex Programs
https://lanl-ansi.github.io/Alpine.jl/latest/

Bugs in Lower bounding iterations!! #108

Closed: krishpat closed this issue 2 years ago

krishpat commented 5 years ago

Case 1 - Consider this NLP:

@variable(m, -2 <= x[1:2] <= 2)
@variable(m, y[1:4])
setlowerbound(y[1], 0)
setlowerbound(y[3], 0)
@NLobjective(m, Min, 30 + y[3]*y[4] + 30*y[1]*y[2] + y[1]*y[2]*y[3]*y[4])
@NLconstraint(m, y[1] == (x[1] + x[2] + 1)^2 )
@NLconstraint(m, y[2] == 19 - 14*x[1] + 3*x[1]^2 - 14*x[2] + 6*x[1]*x[2] + 3*x[2]^2)
@NLconstraint(m, y[3] == (2*x[1] - 3*x[2])^2)
@NLconstraint(m, y[4] == 18 - 32*x[1] + 12*x[1]^2 + 48*x[2] - 36*x[1]*x[2] + 27*x[2]^2)
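
The snippet above assumes a JuMP 0.18-style model m; the surrounding driver looks roughly like this (a sketch only; the Ipopt/CPLEX choices are illustrative, and the exact configuration used while debugging appears in the last comment of this thread):

using JuMP, Alpine, Ipopt, CPLEX

# Illustrative solver setup; see the last comment of this thread for the
# exact settings used when this issue was debugged.
m = Model(solver = AlpineSolver(
        nlp_solver = IpoptSolver(print_level = 0, sb = "yes"),
        mip_solver = CplexSolver(CPX_PARAM_SCRIND = 0)))

# ... the @variable / setlowerbound / @NLobjective / @NLconstraint lines above ...

status = solve(m)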

In this case, Alpine's lower bounding iterations look like

LOWER-BOUNDING ITERATIONS 

| Iter   | Incumbent       | Best Incumbent      | Lower Bound        | Gap (%)         | Time      
| 1      | 840.0           | 840.0               | -5.8117041151e6    | LARGE           | 2.13s            
| 2      | -               | 840.0               | -539358.4477       | LARGE           | 2.28s            
| 3      | 25484.9863      | 840.0               | -371263.2724       | LARGE           | 2.54s            
| 4      | 974.7453        | 840.0               | -19315.1917        | LARGE           | 4.05s            
| 5      | 130.763         | 130.763             | -7143.2366         | LARGE           | 7.26s            
| 6      | 90.1609         | 90.1609             | -1329.1636         | LARGE           | 18.94s           
| 7      | 91.7004         | 90.1609             | 424.8675           | 371.23243       | 34.69s           
| 8      | 112.9848        | 90.1609             | 424.8675           | 371.23243       | 62.79s           
| 9      | -               | 90.1609             | 913.9391           | 913.67539       | 107.2s      

In the above output there is clearly a problem: at iteration 7 the lower bound jumps above the best incumbent, which is impossible for a minimization problem. So there is a bug in either the incumbent evaluation or the lower bounds.

Case 2 - Consider the same NLP as above but with constraints 1 and 3 expanded out as shown here:

@NLconstraint(m, y[1] == x[1]^2 + x[2]^2 + 1 + 2*x[1]*x[2] + 2*x[1] + 2*x[2])
@NLconstraint(m, y[3] == 4*x[1]^2 + 9*x[2]^2 - 12*x[1]*x[2])

In Case 2, Alpine's output, starting from the presolve, looks like:

=======================================================================
PRESOLVE 
  Doing local search
  Local solver returns a feasible point
  Starting bound-tightening
  Variables whose bounds were tightened:
    VAR 1: 5.0% contraction |-2.0 --> | -1.8125 - 2.0 | <-- 2.0 |
    VAR 2: 26.0% contraction |-2.0 --> | -0.9609 - 2.0 | <-- 2.0 |
    VAR 3: 100.0% contraction |0.0 --> | 0.0 - 25.0 | <-- 10000.0 |
    VAR 4: 100.0% contraction |-10000.0 --> | -5.4214 - 80.9028 | <-- 10000.0 |
    VAR 5: 99.0% contraction |0.0 --> | 0.0 - 92.641 | <-- 10000.0 |
    VAR 6: 98.0% contraction |-10000.0 --> | -27.6143 - 449.9234 | <-- 10000.0 |
  Completed presolve in 1.74s (10 iterations).

====================================================================================================
LOWER-BOUNDING ITERATIONS 

| Iter   | Incumbent       | Best Incumbent      | Lower Bound        | Gap (%)         | Time      
 Warning: Infeasibility detected via convex relaxation Infeasibility
| finish | 840.0           | 840.0               | -Inf               | LARGE           | 1.74s            
====================================================================================================

*** Alpine ended with status Infeasible ***
Warning: Not solved to optimality, status: Infeasible

The warning "Infeasibility detected via convex relaxation" cannot be true, since the local solver has already returned a feasible point during presolve. A lower bound of -Inf, even after the variable bounds were tightened quite substantially in the presolve, is also surprising. There could be a bug in the way bounds are assigned to the variables, but I am not sure.

Thanks in advance.

harshangrjn commented 5 years ago

@krishpat Did you try projecting out y[1], y[2], y[3], and y[4] and running your problem as an unconstrained NLP?
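
For reference, the projected formulation would look roughly like this (a sketch only: a fresh model m with the same x bounds, and the y definitions substituted directly into the Case 1 objective):

# m here is a fresh model, constructed with AlpineSolver as elsewhere in this thread
@variable(m, -2 <= x[1:2] <= 2)
# Case 1 objective with y[1]..y[4] replaced by their defining expressions
@NLobjective(m, Min,
    30 +
    (2*x[1] - 3*x[2])^2 * (18 - 32*x[1] + 12*x[1]^2 + 48*x[2] - 36*x[1]*x[2] + 27*x[2]^2) +
    30*(x[1] + x[2] + 1)^2 * (19 - 14*x[1] + 3*x[1]^2 - 14*x[2] + 6*x[1]*x[2] + 3*x[2]^2) +
    (x[1] + x[2] + 1)^2 * (19 - 14*x[1] + 3*x[1]^2 - 14*x[2] + 6*x[1]*x[2] + 3*x[2]^2) *
        (2*x[1] - 3*x[2])^2 * (18 - 32*x[1] + 12*x[1]^2 + 48*x[2] - 36*x[1]*x[2] + 27*x[2]^2))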

That said, there is surely some bug in the lower bounding which needs to be fixed. Thanks for catching this.

krishpat commented 5 years ago

I already tried that, but the lower-bounding iterations get stuck at large bounds. Hence I took this route of lifting the expressions into the y variables.

harshangrjn commented 5 years ago

Here is a simpler version of the NLP, which exhibits the same kind of issue, to help us debug:

@variable(m, -2 <= x[1:2] <= 2)
@variable(m, y[1:3])
@NLobjective(m, Min, y[2] + y[1]*y[2]*y[3])
@constraint(m, y[1] == x[1])
@constraint(m, y[2] == x[2])
@NLconstraint(m, y[3] == x[2]^2)

harshangrjn commented 5 years ago

@krishpat A bug has been fixed which partly resolved the issue. But it took us some time to realize that the real problem in the lower bounding was CPLEX's presolve being turned on, which was producing incorrect bounding MIQCPs. Once CPLEX's presolve is turned off, Alpine runs fine and the lower bounds are correct. As expected, the run times are slower since the presolve is off. Here are the iterations:

| Iter   | Incumbent       | Best Incumbent      | Lower Bound        | Gap (%)         | Time      
| 1      | 3.0             | 3.0                 | -3.2690800891e6    | LARGE           | 11.11s           
| 2      | 1967.0625       | 3.0                 | -769128.4          | LARGE           | 11.25s           
| 3      | 3.0             | 3.0                 | -463842.5838       | LARGE           | 11.56s           
| 4      | 7282.0377       | 3.0                 | -411704.55         | LARGE           | 11.92s           
| 5      | 14752.2544      | 3.0                 | -243471.2477       | LARGE           | 12.54s           
| 6      | 2667.5681       | 3.0                 | -141049.9399       | LARGE           | 13.34s           
| 7      | 1264.8148       | 3.0                 | -58872.1104        | LARGE           | 15.71s           
| 8      | 2068.1399       | 3.0                 | -36590.4597        | LARGE           | 18.87s           
| 9      | 14752.2544      | 3.0                 | -24685.4696        | LARGE           | 24.19s           
| 10     | 7282.0377       | 3.0                 | -16704.5316        | LARGE           | 30.51s           
| 11     | 1193.3596       | 3.0                 | -7597.4989         | LARGE           | 40.49s           
| 12     | 5612.5218       | 3.0                 | -6369.169          | LARGE           | 57.05s           
| 13     | 574.8677        | 3.0                 | -2941.0098         | LARGE           | 69.37s           
| 14     | 838.1052        | 3.0                 | -1551.3039         | LARGE           | 100.91s          
| 15     | -               | 3.0                 | -664.1935          | LARGE           | 145.78s          
| 16     | -               | 3.0                 | -208.0886          | LARGE           | 204.89s          
| 17     | -               | 3.0                 | -181.3196          | LARGE           | 280.32s          
| 18     | -               | 3.0                 | -104.8134          | LARGE           | 674.98s          
| 19     | -               | 3.0                 | -56.6786           | LARGE           | 873.89s          
| 20     | 878.8997        | 3.0                 | -48.787            | LARGE           | 958.95s         
| 21     | -               | 3.0                 | -9.9553            | 431.84216       | 1062.09s         
| 22     | 26.1238         | 3.0                 | -5.362             | 278.73186       | 1155.94s         
| 23     | -               | 3.0                 | -4.0276            | 234.25452       | 1441.69s         
| 24     | -               | 3.0                 | 1.7177             | 42.74334        | 1810.92s         
| 25     | 3.1321          | 3.0                 | 2.5969             | 13.437          | 1975.91s         
| 26     | 3.0083          | 3.0                 | 2.8924             | 3.58576         | 2015.32s         
| 27     | 3.0083          | 3.0                 | 2.9743             | 0.85659         | 2145.04s        
| 28     | 3.0009          | 3.0                 | 2.9989             | 0.03184         | 2256.24s  
krishpat commented 5 years ago

@harshangrjn Thanks for fixing this. It's very surprising that CPLEX's presolve is buggy. Does this happen particularly with MIQCPs, or with MILPs too?

And can you show me how to turn off the presolve?

harshangrjn commented 5 years ago

@krishpat So far, I have observed presolve issues only with MIQCPs. This needs to be reported to CPLEX.

Here is what I used for creating the model:

m = Model(solver = AlpineSolver(
        nlp_solver = IpoptSolver(print_level = 0, sb = "yes"),
        mip_solver = CplexSolver(CPX_PARAM_SCRIND = 0,    # suppress CPLEX's screen output
                                 CPX_PARAM_PREIND = 0)))  # turn off CPLEX's presolve
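
With that model m (and JuMP, Alpine, Ipopt, CPLEX loaded), the smaller reproduction posted above can be run end-to-end as a quick check; a sketch in the same JuMP 0.18-era syntax used throughout this thread:

@variable(m, -2 <= x[1:2] <= 2)
@variable(m, y[1:3])
@NLobjective(m, Min, y[2] + y[1]*y[2]*y[3])
@constraint(m, y[1] == x[1])
@constraint(m, y[2] == x[2])
@NLconstraint(m, y[3] == x[2]^2)

# With CPX_PARAM_PREIND = 0 the bounding MIQCPs are solved without CPLEX's
# presolve, and the reported lower bounds stay below the best incumbent,
# as in the iteration log above.
status = solve(m)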