Closed · Milos9304 closed this issue 5 years ago
Yes, the GitHub version is what I used to generate the dataset.
1: Yes, I'm aware of this issue. Previous versions of MATLAB accepted a 1xN reflectance vector without errors, but the latest version throws an error. I'll fix this in a later commit, after I've checked that no other problems occur in the latest version.
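For context, the transpose matters because the intensity argument must supply one value per point, in the same orientation as the point list. Below is a rough Python/NumPy sketch of that shape constraint (the function and names here are illustrative, not from the repository or from MATLAB's actual `pointCloud` implementation):

```python
import numpy as np

def make_point_cloud(points, intensity):
    """Illustrative shape check: `points` is 3xN, as in the MATLAB code
    (where pcloud' transposes it to Nx3); `intensity` must then provide
    exactly one value per point."""
    pts = points.T                             # N x 3, like pcloud' in MATLAB
    inten = np.asarray(intensity).reshape(-1)  # flatten 1xN or Nx1 to length N
    if inten.shape[0] != pts.shape[0]:
        raise ValueError("intensity must have one entry per point")
    return {"points": pts, "intensity": inten}

# A 1xN reflectance row vector is accepted once flattened, which is the
# effect of the reflectance' fix discussed in this thread:
pc = make_point_cloud(np.zeros((3, 5)), np.ones((1, 5)))
```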
2, 3: I can't reproduce this issue in my MATLAB; I'll try again once I get hold of and install the latest version. In the meantime, can you tell me which traversals cause the issue for you?
The issue is present when processing these datasets:
- 2014-11-18-13-20-12
- 2015-02-10-11-58-05
- 2015-02-24-12-32-19
- 2015-03-24-13-47-33
- 2015-05-22-11-14-30
- 2015-07-08-13-37-17
I found that it is enough to increase the parameter in `compute_subsequent_offsets(ins_positions, 5000);` from 5000 to 8000 (except for 2015-02-10-11-58-05, where 15000 is required). I hope this doesn't affect the quality of the output.
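To make the role of that parameter concrete, here is a hedged Python sketch of the window search it appears to govern (the function and variable names are guesses, not the repository's actual code): for each start frame, only the next `window` INS positions are scanned for the point roughly 60 m of travel ahead, so a slow-moving car can need more than 5000 frames to cover that distance, which is where a larger window helps.

```python
import numpy as np

ACCUMULATE_DISTANCE = 60.0  # metres of travel to accumulate (from the thread)

def subsequent_offset(ins_positions, start_idx, window=5000):
    """Find the first index within `window` of start_idx whose cumulative
    travelled distance from start_idx reaches ACCUMULATE_DISTANCE.
    Returns None when the window is too small -- the case this thread hits."""
    end = min(start_idx + window, len(ins_positions))
    # Per-frame step lengths, then cumulative distance along the trajectory.
    steps = np.linalg.norm(np.diff(ins_positions[start_idx:end], axis=0), axis=1)
    cumdist = np.cumsum(steps)
    hits = np.nonzero(cumdist >= ACCUMULATE_DISTANCE)[0]
    return start_idx + 1 + int(hits[0]) if hits.size else None

# A car crawling ~1 cm per frame needs ~6000 frames to cover 60 m,
# so a window of 5000 fails where 8000 succeeds:
slow = np.stack([np.arange(10000) * 0.01, np.zeros(10000)], axis=1)
assert subsequent_offset(slow, 0, window=5000) is None
assert subsequent_offset(slow, 0, window=8000) is not None
```

This also matches the later comment in the thread: the limit exists for speed, and enlarging it only widens the search, so it should not degrade the output.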
I also added the following lines before the check `if distances(startIdx, endIdx-startIdx-1) > ACCUMULATE_DISTANCE - 5`:

```matlab
if startIdx > numFrames || endIdx > numFrames
    warning(sprintf('Skipping %i - %i frames due to out of array bounds.', startIdx, endIdx))
    break
end
```
This skips a few frames, but it ensures the script doesn't crash. The warning appears for every single traversal that is processed.
Hi @Milos9304, I just tested the code on MATLAB R2019a, but I still didn't get those errors. Weird. The 5000 is there mainly for speed (so we don't have to compare with all N other INS positions), and to avoid parts where the car is moving so slowly that many frames are required to cover 60 m. The warnings appear for frames where 5000 is insufficient, and they can be safely ignored. That said, there shouldn't be any problem with increasing this limit to 8000.
@yewzijian Thanks for checking that! It's really weird, but I found ways to work around the issues. Hopefully it's a problem on my side only and others won't run into it, so you can close this issue for now.
You're welcome :) Will close this issue for now.
Hi, I am following your guide to preprocess the Oxford dataset for training, and I am running into several issues. Are you sure the GitHub version is the tested one?

The issues include:
1. `pcloud = pointCloud(pcloud', 'Intensity', reflectance');` is needed instead of `pcloud = pointCloud(pcloud', 'Intensity', reflectance);`
2. The `assert(startIdx + 5000 > length(ins_positions))` assertion fails.