techoe / ceres-solver

Automatically exported from code.google.com/p/ceres-solver

Add support for Gauss-Newton Hessians besides Jacobians to Ceres #102

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
Jacobians are nice, but for problems where the number of variables is MUCH 
smaller than the number of residuals, it makes more sense to directly compute 
J'J at eval time instead of storing J in memory.

We could start by just supporting the CRS and dense solvers via this route.
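A minimal sketch of the idea being proposed (plain Python for illustration, since Ceres itself is C++; `accumulate_normal_equations` is a hypothetical name, not a Ceres API): the normal equations H = J'J and g = J'r are accumulated one residual block at a time, so the full num_residuals x num_params Jacobian is never materialized.

```python
def accumulate_normal_equations(residual_blocks, num_params):
    """Accumulate the Gauss-Newton Hessian H = J'J and gradient g = J'r.

    residual_blocks yields (J_block, r_block) pairs, where J_block is a
    list of Jacobian rows (each of length num_params) and r_block is the
    corresponding list of residuals. Only the num_params x num_params
    matrix H is stored, never the full Jacobian.
    """
    H = [[0.0] * num_params for _ in range(num_params)]
    g = [0.0] * num_params
    for J_block, r_block in residual_blocks:
        for row, r in zip(J_block, r_block):
            for i in range(num_params):
                g[i] += row[i] * r
                for j in range(num_params):
                    H[i][j] += row[i] * row[j]
    return H, g
```

When the residual count dwarfs the parameter count, this keeps memory at O(num_params^2) instead of O(num_residuals * num_params).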

The first step here would be to introduce the notion of a Model, with two subclasses: JacobianModel and GaussNewtonModel.

The former will store a Jacobian and a residual vector; the latter will store a Gauss-Newton Hessian matrix and a gradient vector.

The various trust region and line search algorithms would then be built in terms of the Model object.
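One way the proposed abstraction might look (a hedged sketch in Python; the class names come from the issue text, but the methods here are guesses at what a trust region or line search loop would need, not actual Ceres interfaces): both subclasses expose the gradient and a Hessian-vector product, so the algorithms on top never care whether J or J'J is stored underneath.

```python
class Model:
    """Interface consumed by trust region / line search algorithms."""
    def gradient(self):
        raise NotImplementedError          # returns J'r (equivalently, g)
    def hessian_times(self, v):
        raise NotImplementedError          # returns (J'J) v (equivalently, H v)


class JacobianModel(Model):
    """Stores the Jacobian J and residual vector r explicitly."""
    def __init__(self, J, r):
        self.J, self.r = J, r

    def gradient(self):
        return [sum(row[i] * ri for row, ri in zip(self.J, self.r))
                for i in range(len(self.J[0]))]

    def hessian_times(self, v):
        # J'(Jv), computed without ever forming J'J.
        Jv = [sum(row[i] * v[i] for i in range(len(v))) for row in self.J]
        return [sum(self.J[k][i] * Jv[k] for k in range(len(Jv)))
                for i in range(len(v))]


class GaussNewtonModel(Model):
    """Stores the collapsed Gauss-Newton Hessian H = J'J and gradient g = J'r."""
    def __init__(self, H, g):
        self.H, self.g = H, g

    def gradient(self):
        return list(self.g)

    def hessian_times(self, v):
        return [sum(Hrow[j] * v[j] for j in range(len(v))) for Hrow in self.H]
```

For the same problem, both models return identical gradients and Hessian-vector products; only the storage differs.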

Original issue reported on code.google.com by sameerag...@google.com on 10 May 2013 at 6:29

GoogleCodeExporter commented 9 years ago
This seems like it would require an API change, right?  Since the current 
CostFunction API is defined in terms of J?  Or at least perhaps an expansion of 
the semantics of what CostFunction computes?

If I'm understanding correctly, this would be plenty for me--I wouldn't need to 
write my own evaluator anymore if I could do this.

Original comment by th...@google.com on 10 May 2013 at 6:48

GoogleCodeExporter commented 9 years ago
No, the user still evaluates Jacobians, but internally, right after an eval, we update the Hessian instead of writing into a Jacobian.

The change you are interested in is a combination of this issue and the next one.

Original comment by sameerag...@google.com on 10 May 2013 at 7:16

GoogleCodeExporter commented 9 years ago
The only case where this is of real interest is when the number of residuals far outnumbers the number of parameters, so much so that working with the Jacobian is a pain and the user wishes to collapse it into a Gauss-Newton Hessian. Generally speaking, this seems to be the case for large dense problems. So perhaps to begin with we only implement a dense evaluator and punt on the sparse solvers for this case?

Original comment by sameerag...@google.com on 2 Jul 2013 at 3:26

GoogleCodeExporter commented 9 years ago
That sounds completely reasonable to me.  I haven't been able to find a way to 
make use of the sparse solvers yet, so I've just been using the dense solvers.  
I think that would address my needs completely for now.

Original comment by th...@google.com on 5 Jul 2013 at 11:27

GoogleCodeExporter commented 9 years ago
There are subtle issues related to how loss functions will interact with something like this. In conversation with Thad, it seems that just computing the Gauss-Newton Hessian is not fun for his case either.
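To illustrate one such subtlety (a sketch, not the Ceres implementation; the issue itself does not spell out the details): with a robust loss rho applied to the squared residual norm s = ||r||^2, the standard second-order approximation of the Hessian is no longer plain J'J but rho'(s) J'J + 2 rho''(s) (J'r)(J'r)'. Collapsing J into J'J up front therefore discards information needed to form the rank-one correction term.

```python
def robust_gauss_newton_hessian(J, r, rho1, rho2):
    """Gauss-Newton Hessian under a robust loss rho applied to s = ||r||^2.

    J: list of Jacobian rows; r: list of residuals;
    rho1, rho2: the derivatives rho'(s) and rho''(s) evaluated at s.
    Returns H = rho1 * J'J + 2 * rho2 * (J'r)(J'r)'.
    """
    n = len(J[0])
    m = len(r)
    # J'r is needed for the rank-one correction term.
    Jtr = [sum(J[k][i] * r[k] for k in range(m)) for i in range(n)]
    return [[rho1 * sum(J[k][i] * J[k][j] for k in range(m))
             + 2.0 * rho2 * Jtr[i] * Jtr[j]
             for j in range(n)] for i in range(n)]
```

With the trivial loss rho(s) = s (rho' = 1, rho'' = 0), this reduces to the plain J'J case; any nontrivial loss couples the Hessian to J'r, which is exactly what makes a pre-collapsed J'J awkward.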

Given the amount of architectural change this will require internally, I am 
inclined to punt on this for now. We will revisit this if needed.

Original comment by sameerag...@google.com on 12 Aug 2013 at 8:30