xiemeilong opened 1 week ago
Hi, thanks for your interest. The error comes from Ceres. We have not tried any datasets of this size, and the original limit was set to a value we assumed could never be reached. You can either decrease the allowed number of points by lowering --TrackEstablishment.max_num_tracks, or install the latest version of Ceres: in the current Ceres repo, the constraint has been lifted from 10000000 to std::numeric_limits<std::int32_t>::max(). That should be large enough that this error no longer occurs.
I haven't modified the --TrackEstablishment.max_num_tracks parameter. However, the track count reaches 10,000,001, blocks.size() becomes 10,027,739, and the process aborts with the error.