Open Z-Jeff opened 4 years ago
It seems that extract_rgbd_data_v2.m skips some data.
Hi all, I was wondering if I can directly download the output files of extract_split.m, extract_rgbd_data_v2.m and extract_rgbd_data_v1.m, as I have limited access to MATLAB.
@bigeyesung did you find a way to directly get the output files?
I used Octave to run the MATLAB scripts to preprocess the data. You will have to make one or two edits to run them under Octave, though.
@alar0330 Can you please tell me what edits you made? I also tried using Octave, but running extract_rgbd_data_v2.m just produces an empty folder. Please help!
I don't recall exactly, but rest assured that it was really a minor edit.
If I am not mistaken, using custom Java libraries works a bit differently in Octave. One place you need to edit is
in extract_split.m
change this
hash_train = java.util.Hashtable;
hash_val = java.util.Hashtable;
with this
hash_train = javaObject("java.util.Hashtable");
hash_val = javaObject("java.util.Hashtable");
You might also have to change something else, though.
@alar0330 Yeah, this was the problem in the extract_split.m file. I'm facing issues in the extract_rgbd_data_v2.m file.
Where did you all find the files extract_split.m, extract_rgbd_data_v2.m and extract_rgbd_data_v1.m? I mean, are they in the SUNRGBD folder, the SUNRGBDtoolbox folder, or somewhere else? It was written that those files are available in a matlab folder, which I can't find among all of the SUN RGB-D files I have collected.
Did you find a way to download the extracted data without using MATLAB?
@alar0330 Yeah, this was the problem in the extract_split.m file. I'm facing issues in the extract_rgbd_data_v2.m file.
In extract_rgbd_data_v2.m, I changed the parfor loop to a plain for loop, since Octave (without additional packages) struggles with parallelized operations and parfor has no effect. I also changed parsave to save:
parsave(strcat(depth_folder, mat_filename), points3d_rgb);
--> save(strcat(depth_folder, mat_filename), 'points3d_rgb');
This takes much more time (~2 hours for me), but no data is lost in the process.
For the next operations, since scipy.io.loadmat (used to load the .mat files in the depth folder) doesn't work with Octave's native files, you have two choices:
- go to sunrgbd_data.py and change line 64:
def get_depth(self, idx):
    depth_filename = os.path.join(self.depth_dir, '%06d.mat'%(idx))
    return sunrgbd_utils.load_depth_points_mat(depth_filename)
to
def get_depth(self, idx):
    depth_filename = os.path.join(self.depth_dir, '%06d.txt'%(idx))
    return sunrgbd_utils.load_depth_points(depth_filename)
I used this and it seems to work fine.
- when saving the .mat files, you can add the option
-v7
Check the documentation for save to write the data in MATLAB format. It should work, but I haven't tried this one.
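For reference, the thread doesn't show load_depth_points itself. A minimal sketch of what the first choice assumes, using numpy's loadtxt (the six-numbers-per-line x y z r g b layout is my assumption, not confirmed here):

```python
import numpy as np

def load_depth_points(depth_filename):
    # Hypothetical sketch: each line of the .txt file holds one point
    # as "x y z r g b" separated by whitespace.
    return np.loadtxt(depth_filename).reshape(-1, 6)
```

The reshape(-1, 6) guards the single-point case, where np.loadtxt would otherwise return a 1-D array.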
Even if you have saved the depth data in Octave's .mat format, you can still use the first choice here; just change "%06d.txt" to "%06d.mat". That way, you don't need to run extract_rgbd_data_v2.m again.
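As context for why this works: Octave's default save writes a plain-text file whose header lines begin with '#', and a text parser such as numpy's loadtxt skips those lines by default (comments='#'). A minimal sketch, assuming the file holds a single numeric matrix (the function name is mine):

```python
import numpy as np

def load_octave_text_mat(path):
    # Octave's default text format: '#'-prefixed header lines
    # (name, type, rows, columns), then rows of numbers.
    # loadtxt skips '#' lines via its default comments='#'.
    return np.loadtxt(path, comments='#')
```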
After running extract_split.m, extract_rgbd_data_v2.m and extract_rgbd_data_v1.m, the number of files is 6425 in 'calib/', 8786 in 'depth/', 6425 in 'image/', 6425 in 'label/', and 10335 in 'label_v1/'. I suppose all of these should be 10335; why does this happen? Am I wrong?
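To see exactly which scene indices were skipped, one option is to compare the '%06d' filename stems in each folder against the expected range (the folder layout and the 10335 total are taken from the comment above; the helper name is mine):

```python
import os

def missing_indices(folder, total=10335, ext='.mat'):
    # Return the indices in 1..total that have no '%06d<ext>'
    # file in the given folder.
    present = set()
    for name in os.listdir(folder):
        stem, e = os.path.splitext(name)
        if e == ext and stem.isdigit():
            present.add(int(stem))
    return sorted(set(range(1, total + 1)) - present)
```

Running it on 'depth/' (with ext='.mat' or '.txt' as appropriate) would list the indices the extraction skipped.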
What's more, when I run python sunrgbd_data.py --gen_v1_data, an error occurs:

------------- 5956
------------- 5957
Traceback (most recent call last):
  File "sunrgbd_data.py", line 326, in
    save_votes=True, num_point=50000, use_v1=True, skip_empty_scene=False)
  File "sunrgbd_data.py", line 225, in extract_sunrgbd_data
    pc_upright_depth = dataset.get_depth(data_idx)
  File "sunrgbd_data.py", line 64, in get_depth
    return sunrgbd_utils.load_depth_points_mat(depth_filename)
  File "/home/aistudio/work/votenet/sunrgbd/sunrgbd_utils.py", line 196, in load_depth_points_mat
    depth = sio.loadmat(depthfilename)['instance']
  File "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/scipy/io/matlab/mio.py", line 217, in loadmat
    MR, = mat_reader_factory(f, **kwargs)
  File "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/scipy/io/matlab/mio.py", line 72, in mat_reader_factory
    mjv, mnv = get_matfile_version(byte_stream)
  File "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/scipy/io/matlab/miobase.py", line 224, in get_matfile_version
    raise MatReadError("Mat file appears to be empty")
scipy.io.matlab.miobase.MatReadError: Mat file appears to be empty
How should I fix this problem? Thank you.
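One observation that may help: scipy raises "Mat file appears to be empty" when the .mat file it opens contains no bytes, so a quick way to narrow this down is to scan the depth folder for zero-byte files left behind by an interrupted or failed extraction (a sketch; point it at your depth/ output folder):

```python
import os

def find_empty_mats(folder, ext='.mat'):
    # Zero-byte files are what trigger scipy.io.loadmat's
    # "Mat file appears to be empty" MatReadError.
    return sorted(
        name for name in os.listdir(folder)
        if name.endswith(ext)
        and os.path.getsize(os.path.join(folder, name)) == 0
    )
```

Any filenames it returns would need to be re-extracted (or skipped) before sunrgbd_data.py can run through.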