gadomski / cpd

C++ implementation of the Coherent Point Drift point set registration algorithm.
http://www.gadom.ski/cpd
GNU General Public License v2.0

Uninformative error when affinity matrix creation throws bad_alloc #104

Closed alexsmartens closed 7 years ago

alexsmartens commented 7 years ago

Hi Pete,

I have successfully installed the latest version of CPD (on my laptop). I've compiled a nonrigid program based on the rigid example that ships with the distribution.

I've tested the nonrigid example first on small artificial datasets (10 points each) and then on real datasets (2M points each). The small datasets were processed fine, but I got this error with the large ones:

```
./cpd-nonrigid pt_in.txt pt_target.txt
terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
Aborted (core dumped)
```

What might be the problem? Have you had a chance to test the algorithm on large datasets?

gadomski commented 7 years ago

Yeah, two million points is most likely too much for the nonrigid. The nonrigid registration creates an MxM affinity matrix, where M is the number of points in the source dataset. The nonrigid lowrank transformation might be able to work with that many points, but it does not yet exist in this version of the library (see #57).

In my experience, I usually chop up datasets into ~10-20 thousand point chunks and run each chunk through CPD, then re-aggregate the result. I'd be very interested to hear about successful runs with larger datasets.

I'm going to keep this issue open because I should add better error reporting when the affinity matrix fails to allocate; this is likely to be a common choke point. Thanks for the report.