JuliaGraphs / LightGraphsFlows.jl

Flow algorithms on LightGraphs

Mincost jump version #35

Closed matbesancon closed 4 years ago

matbesancon commented 4 years ago

For now we'll use JuMP; one pain point is that the numeric type is forced to be Float64. Not a big deal as long as there are no type-generic LP solvers.
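For reference, a minimal sketch of what such a JuMP formulation might look like. The graph data, variable names, and the choice of Tulip as the solver are all illustrative assumptions here, not this package's actual code:

```julia
using JuMP
using Tulip

# Hypothetical 4-node example: arc (src, dst) => (capacity, cost)
arcs = Dict((1, 2) => (2.0, 1.0), (1, 3) => (2.0, 2.0),
            (2, 4) => (2.0, 1.0), (3, 4) => (2.0, 1.0))
source, sink, demand = 1, 4, 3.0

model = Model(Tulip.Optimizer)   # JuMP builds a Float64 model here
@variable(model, 0 <= f[a in keys(arcs)] <= arcs[a][1])
# flow conservation at the intermediate nodes
for v in (2, 3)
    @constraint(model,
        sum(f[a] for a in keys(arcs) if a[2] == v) ==
        sum(f[a] for a in keys(arcs) if a[1] == v))
end
# ship `demand` units out of the source
@constraint(model, sum(f[a] for a in keys(arcs) if a[1] == source) == demand)
@objective(model, Min, sum(arcs[a][2] * f[a] for a in keys(arcs)))
optimize!(model)
```

Regardless of which element type the solver supports internally, the values going in and coming out of a model built this way are Float64.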

ericphanson commented 4 years ago

> there are no type-generic LP solver

Tulip.jl?

matbesancon commented 4 years ago

You're right on Tulip I think, but not all real types can be used because it relies on non-generic LinearAlgebra, if I'm not mistaken. @mtanneau, is that correct?

ericphanson commented 4 years ago

I believe that used to be true, but he added support for more generic linear algebra via LDLFactorizations, and now at least BigFloats can be used (as shown in the README).

matbesancon commented 4 years ago

I think at some point this will be fixed at the JuMP level

ericphanson commented 4 years ago

That would be great! To be clear, I wasn't really suggesting rewriting it with MOI, just pointing out that there is a generic solver :). I made the other choice in Convex.jl, but that's because I really wanted to solve some SDPs in high precision, and I think either option is pretty defensible.

matbesancon commented 4 years ago

No trouble, and yes, I'm impatiently waiting for eltype-generic LP and MIP solvers for many use cases.

matbesancon commented 4 years ago

closes #16

mtanneau commented 4 years ago

> not all real types can be used because it relies on non-generic LinearAlgebra

Any real type should work. If it's not Float64, then the linear algebra is generic and it will use LDLFactorizations.jl (if your matrix is sparse) or generic Cholesky (if your matrix is dense).
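The dense path is easy to see with just the standard library: Julia's `cholesky` falls back to a generic (non-BLAS) implementation for element types like BigFloat. A small sketch, unrelated to Tulip's internals:

```julia
using LinearAlgebra

# A small symmetric positive-definite system in extended precision.
A = BigFloat[4 2; 2 3]
b = BigFloat[1, 1]

F = cholesky(Symmetric(A))  # generic factorization; no BLAS for BigFloat
x = F \ b                   # x ≈ [1/8, 1/4]

# The solve stays in BigFloat throughout.
@show eltype(x)
```

LDLFactorizations.jl plays the analogous role when the matrix is sparse.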

That being said, for min-cost flow, network simplex is likely to be much more efficient than interior-point.

matbesancon commented 4 years ago

Yes, still good to know: say you want to solve a flow problem symbolically, or differentiate through a network flow, etc.