lanl-ansi / MathOptAI.jl

Embed trained machine learning predictors in JuMP
https://lanl-ansi.github.io/MathOptAI.jl/

DNM: add hooks for JuMP extensions #101

Closed: odow closed this 3 days ago

odow commented 1 month ago

Closes #92

@pulsipher: what should we do about set_name, set_binary, etc.?

odow commented 1 month ago

This just feels wrong. It's an abstraction on top of JuMP to make two extensions work together. Can we not come up with a way where this just works normally?

pulsipher commented 1 month ago

what should we do about set_name, set_binary, etc.?

We could leave them as kwargs... or support them explicitly. I believe only set_name and set_binary are needed for MOAI.
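For concreteness, here is a minimal sketch of the kwargs option. The hook name add_variables and its signature are hypothetical illustrations, not MOAI's actual API:

```julia
import JuMP

# Hypothetical hook: create `n` decision variables in any AbstractModel,
# applying the name and binary attributes discussed above via kwargs.
function add_variables(
    model::JuMP.AbstractModel,
    n::Int;
    base_name::String = "moai_z",
    binary::Bool = false,
)
    return map(1:n) do i
        x = JuMP.@variable(model)
        JuMP.set_name(x, "$(base_name)[$i]")
        # Only call set_binary when requested, so extensions that do not
        # support binary variables are not forced to implement it.
        binary && JuMP.set_binary(x)
        return x
    end
end
```

Dispatching on JuMP.AbstractModel rather than JuMP.Model is what would let InfiniteOpt's InfiniteModel reuse the same code path.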

This just feels wrong. It's an abstraction on top of JuMP to make two extensions work together. Can we not come up with a way where this just works normally?

I agree that it would be nice for these extensions to "just work" together more naturally. I believe the core issue is that extensions at the JuMP level do not inherently compose all that well, whereas MOI-level ones do.
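As a simplified illustration of the difference (not code from this PR): at the MOI level, every variable is a plain MOI.VariableIndex, so layered code agrees on types by construction:

```julia
import MathOptInterface as MOI

model = MOI.Utilities.Model{Float64}()

# At the MOI level a variable is always a MOI.VariableIndex, regardless
# of which optimizer or layer created it.
x = MOI.add_variable(model)
MOI.set(model, MOI.VariableName(), x, "x")
```

At the JuMP level, by contrast, each extension introduces its own AbstractVariableRef subtype, so generic code must dispatch on the abstract types and rely on the extension implementing every method it needs (set_name, set_binary, and so on).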

With all that said, I would still very much like to avoid duplicating much of MOAI to make it work with InfiniteOpt. This PR may or may not be the right approach. I am open to alternative strategies to facilitate the use of JuMP.AbstractVariables, including ones that might require changes to InfiniteOpt. There is no need to rush this.

odow commented 1 month ago

I believe the core issue is that extensions at the JuMP level do not inherently compose

Yeah. I had mainly thought this applied to pairs of extensions that each define their own AbstractVariableRef subtype. It didn't occur to me that it would be a problem for basic @variable usage.
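A small sketch of what I mean, assuming InfiniteOpt is loaded; the reference types (as I read the two packages) are the crux:

```julia
using JuMP, InfiniteOpt

jump_model = Model()
inf_model = InfiniteModel()

# The same anonymous macro call yields a different reference type
# depending on the model, so any MOAI internals that assume
# JuMP.VariableRef will not compose with the extension.
x1 = @variable(jump_model)  # ::VariableRef
x2 = @variable(inf_model)   # ::InfiniteOpt.GeneralVariableRef
```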

odow commented 3 days ago

Closing because this is out of date, and I don't particularly like it. I'll leave the issue open and we can have another think.