This pull request includes considerable code refactoring and improves the modeling interface. The main additions are:
Use a MathOptInterface (MOI) backend for an optigraph. The biggest reason to do this is that it should make development of new Plasmo.jl features more flexible: we don't HAVE to go through JuMP to optimize a model. Optimizing an optigraph now aggregates the MOI backends on its optinodes instead of aggregating a new JuMP Model. My initial tests don't show this being any slower, and it should help for problems with many nonlinear constraints. It should also reduce the memory footprint.
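As a rough sketch of the intended workflow (the solver choice and the exact graph construction here are illustrative, not part of this PR):

```julia
using Plasmo, Ipopt

graph = OptiGraph()
@optinode(graph, n1)
@optinode(graph, n2)
@variable(n1, x >= 0)
@variable(n2, y >= 1)
@linkconstraint(graph, n1[:x] + n2[:y] == 3)
@objective(n1, Min, n1[:x])

# Optimizing the graph now aggregates the optinode MOI backends
# directly into the graph backend instead of building a new JuMP Model.
optimize!(graph, Ipopt.Optimizer)
```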
Use a custom NodeOptimizer model backend on optinodes. This makes it possible to populate optinode solutions and allows `value` to be called on optinode variables, which aligns better with JuMP.
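A hedged sketch of what this enables (the solver and model here are placeholders):

```julia
using Plasmo, Ipopt

graph = OptiGraph()
@optinode(graph, node)
@variable(node, x >= 2)
@objective(node, Min, x)
optimize!(graph, Ipopt.Optimizer)

# The NodeOptimizer backend populates the node solution,
# so `value` works on optinode variables just like in JuMP.
value(node[:x])
```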
An OptiGraphNLPEvaluator. This currently produces the same results as the JuMP.NLPEvaluator. Aggregating the optinode backends does not capture nonlinear constraints or objective functions, so we pass the evaluator to an NLPBlock in the graph backend.
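Since the evaluator follows MOI's `AbstractNLPEvaluator` interface, using it directly might look like the following; the constructor signature is assumed here and may differ from the actual implementation:

```julia
using Plasmo
import MathOptInterface
const MOI = MathOptInterface

graph = OptiGraph()
@optinode(graph, n)
@variable(n, x >= 1)
@NLobjective(n, Min, x^3)

# Assumed constructor: build the evaluator from the whole graph.
d = OptiGraphNLPEvaluator(graph)
MOI.initialize(d, [:Grad])

# Evaluate the graph objective at a point, as a JuMP.NLPEvaluator would.
MOI.eval_objective(d, [2.0])
```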
@NLconstraint and @NLobjective now work directly on optinodes, so all Plasmo.jl syntax should align with JuMP.
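For example, the nonlinear macros can now be applied to a node exactly as they would be to a JuMP model (the constraint and objective below are illustrative):

```julia
using Plasmo

graph = OptiGraph()
@optinode(graph, n)
@variable(n, x >= 1)
@NLconstraint(n, x^3 + x >= 5)  # nonlinear constraint directly on the optinode
@NLobjective(n, Min, x^2)       # nonlinear objective directly on the optinode
```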