facebookresearch / votenet

Deep Hough Voting for 3D Object Detection in Point Clouds
MIT License
1.7k stars 376 forks

Extract point clouds and annotations from SUN RGB-D #90

Open Z-Jeff opened 4 years ago

Z-Jeff commented 4 years ago

After running extract_split.m, extract_rgbd_data_v2.m and extract_rgbd_data_v1.m, the number of files in 'calib/' is 6425, in 'depth/' 8786, in 'image/' 6425, in 'label/' 6425, and in 'label_v1/' 10335. I suppose all of these folders should contain 10335 files. Why does this happen? Am I wrong?

What's more, when I run python sunrgbd_data.py --gen_v1_data, an error occurs:

------------- 5956
------------- 5957
Traceback (most recent call last):
  File "sunrgbd_data.py", line 326, in <module>
    save_votes=True, num_point=50000, use_v1=True, skip_empty_scene=False)
  File "sunrgbd_data.py", line 225, in extract_sunrgbd_data
    pc_upright_depth = dataset.get_depth(data_idx)
  File "sunrgbd_data.py", line 64, in get_depth
    return sunrgbd_utils.load_depth_points_mat(depth_filename)
  File "/home/aistudio/work/votenet/sunrgbd/sunrgbd_utils.py", line 196, in load_depth_points_mat
    depth = sio.loadmat(depthfilename)['instance']
  File "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/scipy/io/matlab/mio.py", line 217, in loadmat
    MR, _ = mat_reader_factory(f, **kwargs)
  File "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/scipy/io/matlab/mio.py", line 72, in mat_reader_factory
    mjv, mnv = get_matfile_version(byte_stream)
  File "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/scipy/io/matlab/miobase.py", line 224, in get_matfile_version
    raise MatReadError("Mat file appears to be empty")
scipy.io.matlab.miobase.MatReadError: Mat file appears to be empty

How should I fix this problem? Thank you.

Z-Jeff commented 4 years ago

It seems that extract_rgbd_data_v2.m has skipped some data.
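One way to check which files were skipped, before running the Python extraction, is to scan the output folder for missing or zero-byte .mat files. This is a minimal stand-alone sketch, not part of the repo; the '%06d.mat' naming and the 10335 total are taken from this thread, and the helper name is mine:

```python
import os

def find_bad_mats(depth_dir, expected_total=10335):
    """Return indices whose .mat file is missing or empty (zero bytes)."""
    missing, empty = [], []
    for idx in range(1, expected_total + 1):
        path = os.path.join(depth_dir, '%06d.mat' % idx)
        if not os.path.isfile(path):
            missing.append(idx)
        elif os.path.getsize(path) == 0:
            empty.append(idx)
    return missing, empty
```

Any index reported here would trigger the "Mat file appears to be empty" error above when scipy tries to load it.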

bigeyesung commented 4 years ago

Hi all, I was wondering if I can directly download the output files from extract_split.m, extract_rgbd_data_v2.m and extract_rgbd_data_v1.m, as I have limited access to MATLAB.

hadarsh7798 commented 4 years ago

@bigeyesung did you find a way to directly get the output files?

alar0330 commented 4 years ago

I used Octave to run the MATLAB scripts to preprocess the data. You will have to make one or two edits to run them under Octave, though.

hadarsh7798 commented 4 years ago

@alar0330 Can you please tell me what edits you made? I also tried using Octave, but running extract_rgbd_data_v2.m just produces an empty folder. Please help!

alar0330 commented 4 years ago

I don't recall exactly, but rest assured it was a minor edit.

If I am not mistaken, using custom Java libraries works a bit differently in Octave. One place you ought to edit:

In extract_split.m, change this

hash_train = java.util.Hashtable;
hash_val = java.util.Hashtable;

to this

hash_train = javaObject("java.util.Hashtable");
hash_val = javaObject("java.util.Hashtable");

You might also have to change something else, though.

hadarsh7798 commented 4 years ago

@alar0330 Yeah, this was the problem in the extract_split.m file. I'm facing issues in the extract_rgbd_data_v2.m file.

Arnavdas commented 3 years ago

Where did you all find the files extract_split.m, extract_rgbd_data_v2.m and extract_rgbd_data_v1.m? In the SUNRGBD folder, the SUNRGBDtoolbox folder, or somewhere else? It was written that those files are available in a matlab folder, which I can't find among all of the SUN RGB-D files I have collected.

IliasMAOUDJ commented 3 years ago

@alar0330 Yeah, this was the problem in the extract_split.m file. I'm facing issues in the extract_rgbd_data_v2.m file.

In extract_rgbd_data_v2.m, I changed the parfor loop to a for loop, since Octave (without additional packages) struggles with parallelized operations and parfor has no effect. I also changed parsave to save:

parsave(strcat(depth_folder, mat_filename), points3d_rgb);

to

save(strcat(depth_folder, mat_filename), 'points3d_rgb');

This takes much more time (~2 hours for me), but no data is lost in the process.
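To confirm that nothing was lost, you can compare the file counts of the output folders against the expected total, which would immediately surface a mismatch like the 8786-vs-10335 one in the first post. A minimal sketch (my own helper, not from the repo; folder names come from this thread):

```python
import os

def count_outputs(root, expected=10335,
                  folders=('calib', 'depth', 'image', 'label')):
    """Report the file count in each output folder; return the ones that fall short."""
    short = {}
    for name in folders:
        folder = os.path.join(root, name)
        n = len(os.listdir(folder)) if os.path.isdir(folder) else 0
        print('%s: %d files' % (name, n))
        if n != expected:
            short[name] = n
    return short
```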

For the next operations, since scipy.io.loadmat (used to load the .mat files in the depth folder) doesn't work with Octave's native files, you have two choices:

  • Go into sunrgbd_data.py and change line 64:

def get_depth(self, idx): 
    depth_filename = os.path.join(self.depth_dir, '%06d.mat'%(idx))
    return sunrgbd_utils.load_depth_points_mat(depth_filename)

to

def get_depth(self, idx): 
    depth_filename = os.path.join(self.depth_dir, '%06d.txt'%(idx))
    return sunrgbd_utils.load_depth_points(depth_filename)

I used this and it seems to work fine.

  • When saving the .mat files, you can add the option -v7 (check the Octave documentation) to write the data in MATLAB's format. It should work, but I haven't tried this one.
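For reference, the .txt path relies on a plain-text point loader. A stand-alone equivalent can be sketched as follows; this is my own sketch, not the repo's load_depth_points, and the one-point-per-row layout with x y z followed by color columns is an assumption:

```python
import numpy as np

def load_depth_points_txt(path):
    """Load an N x C point array from a whitespace-separated text file.

    Assumed layout: one point per row, e.g. x y z r g b (six columns).
    """
    pts = np.loadtxt(path)
    # np.loadtxt returns a 1-D array for a single-row file; normalize to 2-D.
    return pts.reshape(1, -1) if pts.ndim == 1 else pts
```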

daxiongpro commented 3 years ago

Hi all, I was wondering if I can directly download the output files from extract_split.m, extract_rgbd_data_v2.m and extract_rgbd_data_v1.m, as I have limited access to MATLAB.

Did you find a way to download the extracted data without using MATLAB?

GradiusTwinbee commented 1 year ago

@alar0330 Yeah, this was the problem in the extract_split.m file. I'm facing issues in the extract_rgbd_data_v2.m file.

In extract_rgbd_data_v2.m, I changed the parfor loop to a for loop, since Octave (without additional packages) struggles with parallelized operations and parfor has no effect. I also changed parsave to save:

parsave(strcat(depth_folder, mat_filename), points3d_rgb);

to

save(strcat(depth_folder, mat_filename), 'points3d_rgb');

This takes much more time (~2 hours for me), but no data is lost in the process.

For the next operations, since scipy.io.loadmat (used to load the .mat files in the depth folder) doesn't work with Octave's native files, you have two choices:

  • Go into sunrgbd_data.py and change line 64:

def get_depth(self, idx): 
    depth_filename = os.path.join(self.depth_dir, '%06d.mat'%(idx))
    return sunrgbd_utils.load_depth_points_mat(depth_filename)

to

def get_depth(self, idx): 
    depth_filename = os.path.join(self.depth_dir, '%06d.txt'%(idx))
    return sunrgbd_utils.load_depth_points(depth_filename)

I used this and it seems to work fine.

  • When saving the .mat files, you can add the option -v7 (check the Octave documentation) to write the data in MATLAB's format. It should work, but I haven't tried this one.

Even if you have saved the depth data in Octave's own .mat format, you can still use the first choice here; just modify "%06d.txt" to "%06d.mat". In this way, you don't need to run extract_rgbd_data_v2.m again.
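For completeness, the -v7 route works because scipy's MAT reader and writer target the same MATLAB v5/v7 binary format. A small round-trip sketch (my own code, not from the repo; the 'instance' key matches what load_depth_points_mat reads per the traceback above):

```python
import os
import tempfile
import numpy as np
from scipy.io import loadmat, savemat

def save_depth_mat(path, points3d_rgb):
    """Write the point array under the 'instance' key that load_depth_points_mat expects."""
    savemat(path, {'instance': points3d_rgb})

# Round trip: savemat writes a MATLAB-compatible binary .mat file that
# scipy's loadmat can read back, unlike Octave's default text format.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, '000001.mat')
    save_depth_mat(path, np.arange(12.0).reshape(4, 3))
    assert np.allclose(loadmat(path)['instance'], np.arange(12.0).reshape(4, 3))
```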