JuliaOpt / MathProgBase.jl

DEPRECATED: Solver-independent functions (i.e. linprog and mixintprog) and low-level interface for Mathematical Programming

Is it possible to only evaluate select gradient terms? #66

Closed: rgiordan closed this issue 8 years ago

rgiordan commented 9 years ago

I am exploring using eval_grad_f to perform coordinate ascent steps in mean field variational models. At each step, I only need a few terms of the gradient, but eval_grad_f always returns the whole thing. Is there a way to save computation by only requesting a few select terms of the gradient to evaluate?

mlubin commented 9 years ago

With the reverse-mode AD used in JuMP, you basically get the whole gradient all at once, so there isn't much room to save computations. If you only need a "few" coordinates of the gradient, then you're best off performing forward-mode AD using DualNumbers a la Optim (https://github.com/JuliaOpt/Optim.jl/blob/fe546a68c6034bf8828b10aa2517c1f4db8e5c60/src/autodiff.jl). You could try calling eval_f with a Vector{Dual{Float64}} and I think it will just work if JuMP is computing the derivatives or if you've defined it yourself using pure Julia code. You'll make one call for each component of the gradient you want. I don't think we can require that eval_f works with arbitrary number types in the MPB standard though.
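The per-coordinate forward-mode idea mlubin describes can be sketched with a minimal hand-rolled dual number, so the example is self-contained rather than depending on a particular DualNumbers.jl version. The objective `f` below is a hypothetical stand-in for `eval_f` on a three-variable model; one call yields one gradient component:

```julia
# Minimal forward-mode AD sketch: a dual number carries a value and a
# derivative ("epsilon") part, and overloaded arithmetic propagates both.
struct Dual
    val::Float64  # function value
    eps::Float64  # derivative part
end

Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.eps + b.eps)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.val * b.eps + a.eps * b.val)

# One forward pass per requested coordinate: seed eps = 1 at index i,
# 0 elsewhere, and read the derivative part of the result.
function grad_coordinate(f, x::Vector{Float64}, i::Int)
    duals = [Dual(x[j], j == i ? 1.0 : 0.0) for j in eachindex(x)]
    f(duals).eps
end

f(z) = z[1] * z[2] + z[3] * z[3]   # hypothetical objective

x = [2.0, 3.0, 4.0]
grad_coordinate(f, x, 2)   # ∂f/∂x₂ = x₁ = 2.0
```

Each `grad_coordinate` call costs roughly one function evaluation, which is why this only pays off when you want a few components rather than the whole gradient.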

rgiordan commented 9 years ago

For what it's worth, this doesn't work with JuMP because of a type assertion:

```julia
# ... define a JuMP model in the variable m ...
m_const_mat = JuMP.prepConstrMatrix(m);
m_eval = JuMP.JuMPNLPEvaluator(m, m_const_mat);
MathProgBase.initialize(m_eval, [:ExprGraph, :Grad, :Hess])
z_par = zeros(length(m.colVal))
function f(z_par)
    MathProgBase.eval_f(m_eval, z_par)
end
dualvec = Array(DualNumbers.Dual{Float64}, length(z_par))
f(dualvec)
# ERROR: type: eval_f: in typeassert, expected Float64, got Dual{Float64}
#  in eval_f at /home/rgiordan/.julia/v0.3/JuMP/src/nlp.jl:232
#  in f at none:2
```
mlubin commented 9 years ago

If you're going this deep, you can just call m_eval.eval_f_nl directly, which doesn't have the type assertion.

mlubin commented 9 years ago

No guarantees that it won't break in the future though.

mlubin commented 8 years ago

This trick has certainly broken with the latest NLP rewrite. Feel free to open an issue on the julia-opt list for further discussion.