roboptim / roboptim-core

RobOptim Core Layer: interface and basic mathematical tools
http://www.roboptim.net
GNU Lesser General Public License v3.0

Implement Hessian matrices for vector-valued functions #58

Open bchretien opened 10 years ago

bchretien commented 10 years ago

If we have f: R^n -> R^m, then Jac(f) is an m x n matrix and Hess(f) is a tensor of order 3 (m x n x n). Currently, Hessian matrices are stored as 2D matrices, which implicitly assumes that we are dealing with scalar-valued functions (m = 1).
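
For example, take f(x1, x2) = (x1^2 + x2, x1 * x2). Then Jac(f) is the 2 x 2 matrix [[2*x1, 1], [x2, x1]], while Hess(f) is the 2 x 2 x 2 tensor whose two slices are Hess(f_1) = [[2, 0], [0, 0]] and Hess(f_2) = [[0, 1], [1, 0]].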

Note that Hessian matrices are symmetric, so we could maybe use Eigen's self-adjoint view.
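
A minimal sketch of what that could look like (assuming a dense `Eigen::MatrixXd` Hessian; the RobOptim wrapper types are omitted here):

```cpp
#include <Eigen/Dense>

int main ()
{
  const int n = 3;
  Eigen::MatrixXd hess = Eigen::MatrixXd::Zero (n, n);

  // Fill only the lower triangle; symmetry implies the upper one.
  hess (0, 0) = 2.;
  hess (1, 0) = 1.;
  hess (2, 2) = 4.;

  // selfadjointView reads the lower triangle as a full symmetric matrix,
  // e.g. for Hessian-vector products.
  Eigen::VectorXd x = Eigen::VectorXd::Ones (n);
  Eigen::VectorXd y = hess.selfadjointView<Eigen::Lower> () * x;

  return 0;
}
```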

thomas-moulard commented 10 years ago

Actually, if you use the same strategy as impl_gradient, then impl_hessian can be represented with matrices. That is, you consider non-scalar functions as the concatenation of m scalar functions and you choose one in the list using an index (cf. the impl_gradient prototype).
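
A rough sketch of that strategy (class and type names here are placeholders, not the actual RobOptim interface):

```cpp
#include <Eigen/Core>

// Placeholder sketch: like impl_gradient, impl_hessian takes a functionId
// selecting one of the m scalar components of f, and fills its n x n Hessian.
class TwiceDifferentiableFunctionSketch
{
public:
  typedef Eigen::MatrixXd hessian_t;
  typedef Eigen::VectorXd argument_t;
  typedef Eigen::VectorXd::Index size_type;

  virtual ~TwiceDifferentiableFunctionSketch () {}

  // Fill `hessian` (n x n) with the Hessian of the functionId-th
  // component of f, evaluated at `argument`.
  virtual void impl_hessian (hessian_t& hessian,
                             const argument_t& argument,
                             size_type functionId = 0) const = 0;
};
```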

I would go for this first. One reason is that it keeps the interface consistent and uses matrices only. Yes, I was thinking of using a better matrix type, but so far I haven't implemented this.

Another alternative would be to use Eigen's unsupported module for tensor representation. Right now, I don't know enough about this module to really want to rely on it, but it is worth mentioning anyway.
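
For reference, a minimal example with `Eigen::Tensor` from the unsupported module (availability and API depend on the Eigen version):

```cpp
#include <unsupported/Eigen/CXX11/Tensor>

int main ()
{
  const int m = 2, n = 3;

  // Order-3 Hessian tensor: hess(i, j, k) = d^2 f_i / (dx_j dx_k).
  Eigen::Tensor<double, 3> hess (m, n, n);
  hess.setZero ();

  hess (0, 1, 2) = 1.;
  hess (0, 2, 1) = 1.; // symmetric in the last two indices

  return 0;
}
```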

bchretien commented 10 years ago

I've never tried Eigen's unsupported tensors either. I guess they're still doing lots of work on it, so the API will probably change quite a lot in the near future. And apparently they're adding support for symmetries.

I guess we can indeed start concatenating everything into a 2D matrix, and leave any optimization for later if someone really uses that feature.
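
One possible layout for that concatenation (just an assumption on the stacking order, not a settled design): stack the m per-component n x n Hessians vertically into a single (m*n) x n matrix.

```cpp
#include <Eigen/Core>

int main ()
{
  const int m = 2, n = 3;

  // All m Hessians stacked vertically: rows [i*n, (i+1)*n) hold Hess(f_i).
  Eigen::MatrixXd hessians = Eigen::MatrixXd::Zero (m * n, n);

  // Write into the Hessian of component i = 1...
  hessians.block (1 * n, 0, n, n) (0, 0) = 2.;

  // ...and read it back as a plain n x n matrix.
  Eigen::MatrixXd hess_1 = hessians.block (1 * n, 0, n, n);

  return 0;
}
```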