JuliaDiff / ForwardDiff.jl

Forward Mode Automatic Differentiation for Julia

Provide way to specify which partial derivatives are computed for sparse Jacobians and Hessians #43

Open mlubin opened 8 years ago

mlubin commented 8 years ago

It would be useful for JuMP to have a method to compute Jacobian-vector and Jacobian-matrix products. The current approach for computing Jacobians can be interpreted as a Jacobian-matrix product with the identity matrix.

Here's the implementation in ReverseDiffSparse: https://github.com/mlubin/ReverseDiffSparse.jl/blob/2530b758bb341d3c51e8c1195134922193e1cfb2/src/hessian.jl#L231
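As a rough sketch of the seeding idea (using only ForwardDiff's public `derivative`/`jacobian` functions; the `jvp` helper below is hypothetical, not part of ForwardDiff's API), a Jacobian-vector product can be computed as a directional derivative without ever forming the Jacobian:

```julia
using ForwardDiff

# Hypothetical helper (not ForwardDiff API): compute J(x) * v without forming J,
# as the directional derivative d/dt f(x + t*v) evaluated at t = 0.
jvp(f, x::AbstractVector, v::AbstractVector) =
    ForwardDiff.derivative(t -> f(x .+ t .* v), 0.0)

f(x) = [x[1]^2 + x[2], sin(x[1]) * x[2]]
x = [1.0, 2.0]
v = [0.5, -1.0]

jvp(f, x, v) ≈ ForwardDiff.jacobian(f, x) * v   # true
```

Seeding with a matrix of directions instead of a single vector would give a Jacobian-matrix product, with the identity matrix recovering the full Jacobian as noted above.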

ChrisRackauckas commented 7 years ago

Could this be done easily with the in-place methods? If it's in-place, the user could provide the sparse matrix, and the differentiation could just loop over the indices that are non-zero. That would make it easy to provide the sparsity pattern and directly allow for in-place updates.
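A minimal sketch of that idea, assuming the caller passes a `SparseMatrixCSC` whose stored entries encode the sparsity pattern (the `sparse_jacobian!` helper is hypothetical and fills one column at a time, without any chunking or column coloring):

```julia
using ForwardDiff, SparseArrays

# Hypothetical helper (not ForwardDiff API): fill only the stored entries of J,
# computing one Jacobian column per structurally non-empty column.
function sparse_jacobian!(J::SparseMatrixCSC, f, x::AbstractVector)
    rows, vals = rowvals(J), nonzeros(J)
    for col in 1:size(J, 2)
        isempty(nzrange(J, col)) && continue
        e = zeros(length(x)); e[col] = 1.0
        # directional derivative along e gives column `col` of the Jacobian
        Jcol = ForwardDiff.derivative(t -> f(x .+ t .* e), 0.0)
        for k in nzrange(J, col)
            vals[k] = Jcol[rows[k]]
        end
    end
    return J
end
```

Columns with no stored entries are skipped entirely, which is where the user-provided pattern pays off.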

jrevels commented 7 years ago

> If it's in-place, the user could provide the sparse matrix, and the differentiation could just loop over the indices that are non-zero.

This does sound like it would be useful.

I think this is slightly different from what was originally requested here, which is a way to compute Jacobian-vector/Jacobian-matrix/Hessian-vector etc. products by providing a seed vector/matrix of perturbation coefficients. I was probably overzealous with the issue retitling here...
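Along those lines, a Hessian-vector product can be sketched by seeding a single direction through the gradient (forward-over-forward; the `hvp` helper below is hypothetical, not part of ForwardDiff's API):

```julia
using ForwardDiff

# Hypothetical helper (not ForwardDiff API): compute H(x) * v without forming H,
# as d/dt ∇f(x + t*v) evaluated at t = 0.
hvp(f, x::AbstractVector, v::AbstractVector) =
    ForwardDiff.derivative(t -> ForwardDiff.gradient(f, x .+ t .* v), 0.0)

rosenbrock(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
x = [1.2, 1.0]
v = [1.0, 0.0]

hvp(rosenbrock, x, v) ≈ ForwardDiff.hessian(rosenbrock, x) * v   # true
```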

antoine-levitt commented 6 years ago

It would indeed be great to expose a way to compute Jacobian-vector products. See https://discourse.julialang.org/t/complex-dual-or-dual-complex/5455/5 for a usage example; Newton-Krylov methods also come to mind. ReverseDiff should likewise expose Jacobian-transpose-times-vector methods, which would be very useful as well.
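For the transpose products, a sketch is possible on top of ReverseDiff's existing `gradient`, using the identity J(x)ᵀ w = ∇ₓ (w ⋅ f(x)) (the `vjp` helper below is hypothetical, not part of ReverseDiff's API):

```julia
using ReverseDiff, LinearAlgebra

# Hypothetical helper (not ReverseDiff API): compute J(x)' * w without forming J,
# via the identity J(x)' * w == ∇_x (w ⋅ f(x)).
vjp(f, x::AbstractVector, w::AbstractVector) =
    ReverseDiff.gradient(y -> dot(w, f(y)), x)

f(x) = [x[1]^2 + x[2], sin(x[1]) * x[2]]
x = [1.0, 2.0]
w = [0.5, -1.0]

vjp(f, x, w) ≈ ReverseDiff.jacobian(f, x)' * w   # true
```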