lanl-ansi / Alpine.jl

A Julia/JuMP-based Global Optimization Solver for Non-convex Programs
https://lanl-ansi.github.io/Alpine.jl/latest/

Support for multi-linear term with binary variables inside #72

Closed jac0320 closed 6 years ago

jac0320 commented 6 years ago

Meant to address #55 with several improvements.

Data structure changes:

Algorithm changes and function additions:

Error schemes and default changes:

Test changes:

-------------- OLD COMMENTS --------------

This PR aims to detect, organize, and construct an exact reformulation of multi-linear terms with binary variables inside.

The change targets the generic term x[1]x[2]...x[M]y[1]y[2]...y[N], where the x[.] are binary variables and the y[.] are continuous variables. The algorithm is simple: a lifted variable \hat{x} is built for all the x[.], and similarly \hat{y} for all the y[.]. After this, the term reduces to \hat{x}\hat{y}, which is recognized as a <:binlin> operator and reformulated using a McCormick relaxation. Since the x[.] are all binary, \hat{x} can be relaxed exactly using a generalized form. For \hat{y}, the treatment is exactly the <:multilinear> one. In special cases the choice of <:multilinear> is not unique, since the term could also be a <:bilinear> or a <:monomial>.
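
A minimal sketch of this reformulation in JuMP for the M = N = 2 case. The model, bounds, and variable names here are illustrative only, not POD's actual code:

```julia
using JuMP

# Sketch of the reformulation of x1*x2*y1*y2 with x1, x2 binary
# and y1, y2 continuous in [yl, yu] (assumed nonnegative bounds).
m = Model()
yl, yu = 1.0, 10.0
@variable(m, x[1:2], Bin)
@variable(m, yl <= y[1:2] <= yu)

# Lifted variable xhat = x1*x2: exact linearization of a binary product.
# xhat may be declared continuous; the constraints force it to 0/1.
@variable(m, 0 <= xhat <= 1)
@constraint(m, [i in 1:2], xhat <= x[i])      # xhat <= xi for each i
@constraint(m, xhat >= sum(x) - 1)            # xhat >= x1 + x2 - (M - 1)

# Lifted variable yhat = y1*y2: McCormick relaxation (the bounds on
# yhat follow from the bounds on y1, y2 since both are nonnegative).
@variable(m, yl^2 <= yhat <= yu^2)
@constraint(m, yhat >= yl*y[1] + yl*y[2] - yl^2)
@constraint(m, yhat >= yu*y[1] + yu*y[2] - yu^2)
@constraint(m, yhat <= yl*y[1] + yu*y[2] - yl*yu)
@constraint(m, yhat <= yu*y[1] + yl*y[2] - yl*yu)

# The original term reduces to z = xhat*yhat, a <:binlin> product.
# Because xhat is 0/1, the McCormick relaxation of z is exact:
@variable(m, z)
@constraint(m, z >= yl^2 * xhat)              # z = 0 when xhat = 0
@constraint(m, z <= yu^2 * xhat)
@constraint(m, z >= yhat - yu^2 * (1 - xhat)) # z = yhat when xhat = 1
@constraint(m, z <= yhat - yl^2 * (1 - xhat))
```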

There are also significant changes in operator.jl, aimed at better readability and easier maintenance. The expression operator procedures have a new look while still taking user inputs into consideration.

codecov[bot] commented 6 years ago

Codecov Report

Merging #72 into master will increase coverage by 10.49%. The diff coverage is 88.41%.


@@             Coverage Diff             @@
##           master      #72       +/-   ##
===========================================
+ Coverage   67.71%   78.21%   +10.49%     
===========================================
  Files          27       27               
  Lines        5957     5696      -261     
===========================================
+ Hits         4034     4455      +421     
+ Misses       1923     1241      -682
Impacted Files                 Coverage Δ
test/examples/linearlift.jl    91.66% <ø> (ø)
test/examples/nlp.jl           89.28% <ø> (ø)
test/examples/exprstest.jl     8.66% <ø> (ø)
src/log.jl                     74.52% <ø> (-0.24%) ↓
src/solver.jl                  60.15% <ø> (ø) ↑
test/examples/convex.jl        0% <ø> (ø)
test/examples/binprod.jl       0% <0%> (ø)
test/examples/eniplac.jl       0% <0%> (ø)
test/algorithm.jl              100% <100%> (ø) ↑
test/examples/castro2m2.jl     99.82% <100%> (ø)
... and 21 more

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update 6e57313...4b282f0.

harshangrjn commented 6 years ago

Take this example: x1*x2*y1*y2. I assume you are doing x = <x1,x2>^{MC-bin} and y = <y1,y2>^{lambda}, and then <x,y>^{MC}. From your description, it looks like you are not partitioning y1 or y2. But if we need tighter relaxations for y, we will need to partition y1 or y2 or both. Is this supported in this PR?
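
For clarity, a sketch of the lambda (convex-combination) form I mean for a single partitioned variable y with partition points p1 < p2 < ... < pK; as I understand it, the multilinear version works over a grid of such points:

    y = lambda[1]*p1 + ... + lambda[K]*pK
    lambda[1] + ... + lambda[K] = 1
    lambda[k] >= 0, with an SOS-2 condition on lambda
    (at most two adjacent lambda[k] are nonzero)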

jac0320 commented 6 years ago

It is not supported: no partitions will be added to the binary variables x1, x2, since x = <x1,x2>^{MC-bin} is recognized as a binary product term, which is "relaxed" exactly using the generalized linearization method you wrote.

I wonder why we would want to add partitions to them? But do note that the way relaxations are added to binary product terms is modularized. If you need a different relaxation, you can easily change it in POD.
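
For reference, the generalized form I mean for xhat = x1*x2*...*xM with binary x[.] (a sketch; the lifted variable can even be declared continuous, since these constraints force it to 0/1 whenever the x[.] are binary):

    xhat <= x[i]  for i = 1, ..., M
    xhat >= x[1] + ... + x[M] - (M - 1)
    0 <= xhat <= 1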

harshangrjn commented 6 years ago

Sorry! It was a typo in my comment; I corrected it above. I obviously meant partitioning only the continuous variables, which appear in the lambda relaxation form.

jac0320 commented 6 years ago

The individual y variables (y1, y2) are still partitioned, while the lifted variable y is not.

jac0320 commented 6 years ago

@kaarthiksundar Any issue with this PR?