convexengineering / gpkit

Geometric programming for engineers
http://gpkit.readthedocs.org
MIT License

How gpkit fits into modeling package options #94

Closed whoburg closed 6 years ago

whoburg commented 9 years ago

We need to understand and document similarities and differences between gpkit and other convex optimization modeling packages under development -- especially cvxpy (cvxpy github) and JuMP (JuMP github). Neither of those projects currently supports GP, so there is a definite need for what we are doing. But, if our paradigm aligns well with either of these projects, we should consider joining them.

Below is a working document to keep track of similarities and differences.

Problem Types gpkit handles geometric programs only. cvxpy handles a wide variety of convex optimization problem classes (though notably not GP), with an emphasis on cone programming. JuMP has an OR flavor and currently appears to support LP and QP, with plans to support SOCP and/or SDP. It can also model general NLPs.
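For reference (background, not from the thread itself), the problem class gpkit targets is the standard-form geometric program:

```latex
\begin{aligned}
\text{minimize} \quad & f_0(x) \\
\text{subject to} \quad & f_i(x) \le 1, \quad i = 1, \dots, m, \\
& g_j(x) = 1, \quad j = 1, \dots, p,
\end{aligned}
```

where each $f_i$ is a posynomial, $f_i(x) = \sum_k c_k \, x_1^{a_{1k}} \cdots x_n^{a_{nk}}$ with $c_k > 0$, and each $g_j$ is a monomial (a single such term). A change of variables $y_j = \log x_j$ makes the problem convex, which is why dedicated GP solvers exist.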

Solvers gpkit interfaces with mosek and cvxopt. cvxpy interfaces with ECOS, CVXOPT, and SCS. JuMP currently interfaces with Clp, Cbc, GLPK, Gurobi, MOSEK, and CPLEX, along with several general nonlinear solvers.

Programming Language cvxpy and gpkit are written in Python. JuMP is written in Julia.

Development teams JuMP and gpkit are being developed at MIT. cvxpy is being developed at Stanford in Stephen Boyd's group.

Constraint generation cvxpy and gpkit use operator overloading. JuMP uses julia macros.
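To make the operator-overloading approach concrete, here is a minimal toy sketch (hypothetical classes, not gpkit's or cvxpy's actual internals): each arithmetic operator returns an expression object rather than a number, so writing `x * y**2 <= 1` produces a constraint that records the expression tree.

```python
class Monomial:
    """Toy monomial c * prod(var**exp); operators build expressions, not numbers."""
    def __init__(self, exps, c=1.0):
        self.exps = dict(exps)  # variable name -> exponent
        self.c = c              # positive coefficient

    def __mul__(self, other):
        exps = dict(self.exps)
        for var, a in other.exps.items():
            exps[var] = exps.get(var, 0) + a
        return Monomial(exps, self.c * other.c)

    def __pow__(self, p):
        return Monomial({v: a * p for v, a in self.exps.items()}, self.c ** p)

    def __le__(self, other):
        return Constraint(self, other)  # comparison yields a constraint object

class Constraint:
    def __init__(self, left, right):
        self.left, self.right = left, right

x = Monomial({"x": 1})
y = Monomial({"y": 1})
con = x * y**2 <= Monomial({}, 1.0)
print(con.left.exps)  # {'x': 1, 'y': 2}
```

This also illustrates @mlubin's point below: every intermediate operation (`y**2`, then the product) allocates a temporary Python object.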

bqpd commented 9 years ago

Syntaxwise:

In terms of language built-ins and capabilities, Julia is probably better suited to this kind of thing than Python.

mlubin commented 9 years ago

Another modeling package to add to the list is https://github.com/cvxgrp/Convex.jl, also from Stephen Boyd's group. They don't support GP either; I'm not sure what their plans are.

I'm guessing you're aware that the original MATLAB CVX supports GP, though curiously it solves GPs with "successive approximation" instead of calling dedicated GP solvers. YALMIP does as well.

JuMP's syntax for declaring variables is really just syntactic sugar, and you can use something like Variable() instead. The biggest difference between JuMP and CVX derivatives at this point is that JuMP:

  1. Is mostly based on scalar operations, while CVX, from its MATLAB heritage, is all about matrices
  2. Doesn't (currently) perform automatic transformations in the style of disciplined convex programming. I'm not familiar enough with GP to know to what extent this is important.
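To illustrate what a DCP-style automatic transformation means (a toy sketch, not any library's actual API): a nonsmooth convex constraint like `max(x, y) <= t` can be rewritten automatically into the equivalent pair of linear constraints `x <= t` and `y <= t`, which a conic solver can handle directly.

```python
def rewrite_max_constraint(args, bound):
    """Rewrite max(args) <= bound as the equivalent list [a <= bound for a in args]."""
    return [(a, "<=", bound) for a in args]

constraints = rewrite_max_constraint(["x", "y"], "t")
print(constraints)  # [('x', '<=', 't'), ('y', '<=', 't')]
```

JuMP, as described above, passes expressions through without this kind of rewriting; CVX-family tools perform it as part of verifying disciplined convexity.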

+1 for Julia over Python as far as technical features of the language are concerned (though I'm biased).

bqpd commented 9 years ago

@mlubin, thanks for dropping by! From your paper Computing in Operations Research using Julia, Julia macros look like an awesome way to simulate a custom AML (and sizehint! is quite neat), and your point that any operator overloading in Python will involve the creation of temporary objects is well-taken.

IainNZ commented 9 years ago

+1 to the notion that YALMIP and CVX are the best-supported GP modeling tools at this point.

Also, some things about JuMP not mentioned in the OP:

On GP: I just whipped this up in the past hour or so: https://github.com/IainNZ/GPTest

Basically, you can already model a GP in JuMP (JuMP can model nonlinear functions, although the documentation in the link above pre-dates that, as does the paper), but nothing will come of it since you can only fire it off to a general nonlinear solver like Ipopt. We also have the JuliaOpt MathProgBase layer, which basically lets you plug new solvers into the JuliaOpt architecture. So what I've whipped up is a pseudo-solver that takes in a nonlinear problem, walks the expression tree, and converts the constraints into GP standard form (it doesn't try to send it anywhere). It's very basic -- it only handles a simple problem that's not too far from standard form.
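The conversion step described above can be sketched in a few lines (a hedged Python illustration of the idea; the linked pseudo-solver is Julia, and the function here is hypothetical): "GP standard form" for a posynomial means collecting each monomial term's coefficient into a vector `c` and its exponents into a matrix `A` over a fixed variable ordering.

```python
def to_standard_form(posynomial, varorder):
    """posynomial: list of (coefficient, {var: exponent}) monomial terms.

    Returns (c, A) where c[k] is the k-th coefficient and A[k][j] is the
    exponent of variable varorder[j] in the k-th monomial term.
    """
    c = [coef for coef, _ in posynomial]
    A = [[exps.get(v, 0) for v in varorder] for _, exps in posynomial]
    return c, A

# The posynomial x*y + 2/x has two monomial terms:
terms = [(1.0, {"x": 1, "y": 1}), (2.0, {"x": -1})]
c, A = to_standard_form(terms, ["x", "y"])
print(c)  # [1.0, 2.0]
print(A)  # [[1, 1], [-1, 0]]
```

A GP solver like MOSEK consumes essentially this `(c, A)` data after the log change of variables.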

bqpd commented 9 years ago

@IainNZ: whoa! I was just reading the Julia docs and thinking of implementing a posynomial parser myself. Methods are music to my eyes.

IainNZ commented 9 years ago

Hah, yeah, it's really easy compared to other languages. You can mash together operator overloading and syntax trees into a lot of things. I envision a very rapid path to a GP modeling proof-of-concept in JuMP: a) make a pseudo-solver for GP, e.g. GPSolver; b) have GPSolver take as a construction argument the actual GP solver to use, e.g. GPSolver(solver=:Mosek).

Then proceed as normal in JuMP, e.g.

```julia
m = Model(solver=GPSolver(solver=:Mosek))
#...
solve(m)
```

Longer term we can have a discussion about how to abstract away the connection to MOSEK a bit.

whoburg commented 9 years ago

Removed from Release 0.1.0 milestone.