-
Hi!
Could you please clarify how it is possible to convert the network's depth output, bounded between 0 and 1, to real-world values in meters?
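For reference, a minimal sketch of one such conversion, assuming the normalized output represents depth divided by a fixed maximum depth in meters (the `MAX_DEPTH_M = 10.0` below is a hypothetical value for indoor NYU-style scenes, not necessarily what this repo used for training):

```python
import numpy as np

# Hypothetical maximum depth used during training (meters);
# check the actual training configuration before using this value.
MAX_DEPTH_M = 10.0

def to_meters(pred, max_depth_m=MAX_DEPTH_M):
    """Map a normalized prediction in (0, 1] back to meters.

    Assumes the target was normalized as depth / max_depth_m during
    training, so the inverse is a simple rescale plus clipping.
    """
    pred = np.asarray(pred, dtype=np.float32)
    return np.clip(pred * max_depth_m, 0.0, max_depth_m)

# Example: a 240x320 prediction with values in [0, 1]
pred = np.random.rand(240, 320).astype(np.float32)
depth_m = to_meters(pred)
print(depth_m.min(), depth_m.max())
```

If the network instead predicts a normalized inverse depth, the mapping back to meters would involve dividing by the prediction rather than multiplying.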
-
Thanks for your great work.
I can't understand why shape_rgb is 640*480 while shape_depth is 320*240.
If I use kitti.h5 for fine-tuning, my dataset is 1280*720. First I resized all RGB and depth images to 37…
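For what it's worth, a minimal resizing sketch under the assumption that the network takes full-resolution RGB and regresses depth at half resolution (hence the 640*480 vs. 320*240 shapes); the file paths and interpolation choices below are placeholders, not the repo's preprocessing:

```python
import numpy as np
from PIL import Image

RGB_SIZE = (640, 480)     # (width, height) fed to the encoder
DEPTH_SIZE = (320, 240)   # half-resolution target, matching the decoder output

def prepare_pair(rgb_path, depth_path):
    """Resize an RGB/depth pair to match the 640x480 / 320x240 setup.

    Bilinear for RGB; nearest-neighbor for depth to avoid inventing
    depth values along object boundaries.
    """
    rgb = Image.open(rgb_path).convert('RGB').resize(RGB_SIZE, Image.BILINEAR)
    depth = Image.open(depth_path).resize(DEPTH_SIZE, Image.NEAREST)
    return np.asarray(rgb), np.asarray(depth)

# e.g. for a 1280x720 dataset (hypothetical paths):
# x, y = prepare_pair('rgb/0001.png', 'depth/0001.png')
```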
-
I notice that the missing regions (holes) in the ground-truth depth of the NYU dataset have been filled. I'd like to know which inpainting method you used on the dataset and whether it helps the network's performance. Also, is the sa…
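Not the author's pipeline, but as a point of comparison, one common way to fill Kinect depth holes is plain image inpainting over the zero-valued pixels (the official NYU toolbox uses a colorization-based fill instead); a rough sketch with OpenCV:

```python
import cv2
import numpy as np

def fill_depth_holes(depth, inpaint_radius=3):
    """Fill zero-valued (missing) depth pixels with cv2.inpaint.

    `depth` is a float32 array in meters; missing measurements are 0.
    This is only an approximation of a cross-bilateral or
    colorization-based fill, but it removes the holes.
    """
    mask = (depth == 0).astype(np.uint8)
    # cv2.inpaint works on single-channel 8-bit images; normalize,
    # fill, then rescale back to the original range.
    max_d = depth.max() if depth.max() > 0 else 1.0
    depth_8u = np.uint8(np.clip(depth / max_d, 0, 1) * 255)
    filled_8u = cv2.inpaint(depth_8u, mask, inpaint_radius, cv2.INPAINT_NS)
    return filled_8u.astype(np.float32) / 255.0 * max_d
```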
-
Could you also provide the lists of files you use for training on the SUNCG and NYU datasets?
Thanks.
-
Hello,
I want to train on my own data.
```python
if args.data == 'nyu':
    train_generator, test_generator = get_nyu_train_test_data( args.bs )
```
How should I modify the code?
Sincerely,
HuBoni
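Not an official answer, but a minimal sketch of one way to hook a custom dataset in next to the existing branch quoted above; `get_custom_train_test_data` is a hypothetical helper you would have to implement so that it returns generators with the same (RGB batch, depth batch) interface as the NYU ones:

```python
if args.data == 'nyu':
    train_generator, test_generator = get_nyu_train_test_data( args.bs )
elif args.data == 'custom':
    # Hypothetical helper: should yield (rgb_batch, depth_batch) pairs
    # with the same shapes the NYU generators produce.
    train_generator, test_generator = get_custom_train_test_data( args.bs )
else:
    raise ValueError('Unknown dataset: {}'.format(args.data))
```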
-
I was using the main_nyu_posereg_embedding code in TensorFlow. The output is:
Training
epoch 100, batch_num 1135, Minibatch Loss= 1.9201
Testing ...
Mean error: 483.2774520085452mm, max error: 682…
-
Hi, I have a few questions, mostly regarding maxdepth:
1. The maxdepth value seems to be set to 1000 throughout the code; however, I noticed that the depth values in the NYU dataset range all the wa…
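As a rough illustration of why a maxdepth of 1000 can coexist with NYU depths of only a few meters, one common convention is to store depth in centimeters (or some other scaled unit) and clip/normalize targets against maxdepth before training; a sketch, with the unit assumption made explicit (this is an assumption, not confirmation of what the code does):

```python
import numpy as np

MAX_DEPTH = 1000.0  # assumed to be centimeters here, i.e. 10 m
MIN_DEPTH = 10.0    # 0.1 m; avoids degenerate near-zero targets

def normalize_target(depth_cm):
    """Clip raw depth to [MIN_DEPTH, MAX_DEPTH] and scale to (0, 1]."""
    depth_cm = np.clip(depth_cm, MIN_DEPTH, MAX_DEPTH)
    return depth_cm / MAX_DEPTH

# e.g. a 4.5 m measurement stored as 450 cm maps to 0.45
print(normalize_target(np.array([450.0])))
```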
-
I downloaded the NYU dataset from your link. As I can see, the image size is not 640*480, since you have removed the regions where label==255. How can I do the same thing to the depth images? Or could you give m…
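A minimal sketch of applying the same crop to the depth maps, assuming the RGB images were cropped to a fixed bounding box that excludes the border region; the crop bounds below are placeholders, not the values used for the released data:

```python
import numpy as np

# Hypothetical crop bounds (top, bottom, left, right) matching whatever
# was applied to the RGB images; replace with the actual values.
CROP = (8, 472, 8, 632)

def crop_like_rgb(depth):
    """Apply the same border crop to a 480x640 depth map."""
    top, bottom, left, right = CROP
    return depth[top:bottom, left:right]

depth = np.zeros((480, 640), dtype=np.float32)
print(crop_like_rgb(depth).shape)  # (464, 624) with the bounds above
```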
-
Thanks for publishing the repo and the dataset with the dense labels.
A while ago I wrapped the dataset, which can be downloaded from https://s3-eu-west-1.amazonaws.com/densedepth/nyu_data.zip, into th…
-
I open the nyu_depth_v2_labeled.mat file and access its label attribute. When I take the max over the labels, I get 894. How should I extract semantic labels from it? Shouldn't the labels be indexing 0~12 c…
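For context, the raw labels in nyu_depth_v2_labeled.mat index several hundred fine-grained class names (with 0 as unlabeled), so a maximum around 894 is expected; collapsing them to 13 or 40 classes requires an extra mapping table from one of the NYU toolboxes. A reading sketch with h5py, where `class_mapping` is a hypothetical lookup you would load from such a toolbox:

```python
import h5py
import numpy as np

with h5py.File('nyu_depth_v2_labeled.mat', 'r') as f:
    labels = np.array(f['labels'])   # raw per-pixel class indices, 0 = unlabeled

print(labels.shape, labels.max())    # max is the number of raw classes (~894)

# Hypothetical mapping: class_mapping[raw_index] -> coarse class in 0..12,
# loaded from a 13-class (or 40-class) mapping file, keeping 0 as "void".
class_mapping = np.zeros(int(labels.max()) + 1, dtype=np.int32)
coarse_labels = class_mapping[labels]
```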