-
In this file "FuseNet_Class_Plots.ipynb": it shows up the below information.
""
Best global pixel-wise accuracy: 0.777 mean class-wise IoU accuracy: 0.273 mean accuracy: 0.479
Org. global pixel-wi…
-
I am getting the error below; kindly guide me.
795 training images
654 test images
using filled depth images
reading ./dataset/nyu_depth_v2_labeled.mat
processing images
Traceback (most recent call…
-
@shurans Hello, I have got the test results with the NYU dataset, and now I want to test with my own dataset, but I do not know how to generate the .bin file from the depth data. Could you help me? Thanks and re…
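For illustration only, here is a minimal sketch of dumping a depth image to a flat binary file. It assumes, purely hypothetically, that the .bin holds raw float32 depth values in meters, row-major with no header; the repository's actual .bin layout may differ, so check its data-loading code before using this.

```python
# Hypothetical sketch: write a depth map as a flat float32 .bin file.
# ASSUMPTION: raw float32 meters, row-major, no header -- verify against
# the repository's own loader, which defines the real format.
import numpy as np
import imageio.v2 as imageio

def depth_to_bin(depth_png_path, bin_path, depth_scale=1000.0):
    """Convert a 16-bit depth PNG (millimeters) to a raw float32 .bin file."""
    depth_mm = imageio.imread(depth_png_path).astype(np.float32)
    depth_m = depth_mm / depth_scale   # millimeters -> meters
    depth_m.tofile(bin_path)           # flat, row-major float32

# depth_to_bin("my_depth_0001.png", "my_depth_0001.bin")
```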
-
Hi, thank you for your research. I want to run your code on my own computer. My OS is Ubuntu 14.04 and my camera is a Kinect v1. Does your code work with this camera, or must it be used with the Creative Senz3D?
-
1. How do we get the depth of a scene? And can we get a matting map with real distances rather than fractional values?
2. How does the RESIDE database generate its outdoor and indoor images?
@MintcakeDotCom
-
During training, how many images did you use as a validation set?
Since I'm still coding the training framework, I'm working with the plain NYU Depth dataset (795 images in total, 639 for training an…
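As a purely illustrative sketch of the split implied by the question (795 official NYU training images, 639 for training, the remaining 156 held out for validation; not necessarily the split the authors used):

```python
# Minimal train/validation split over the 795 NYU training images.
import random

indices = list(range(795))
random.seed(0)                       # fixed seed so the split is reproducible
random.shuffle(indices)

train_idx = indices[:639]
val_idx = indices[639:]              # remaining 156 images for validation
print(len(train_idx), len(val_idx))  # 639 156
```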
-
Hi,
I want to use your code to run predictions on my own data.
It seems that there is only the network structure in this code.
Is it an unfinished version?
And where is main.py, then?
Pan
-
Hi,
I noticed that in your paper, Table 2, your WKDR is less than both WKDR_eq and WKDR_neq. But I think that WKDR should be a weighted average of them. How do you get the WKDR?
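The reasoning behind the question, written out: assuming WKDR is the disagreement rate over the union of the equality and inequality pair sets (with N_= and N_≠ pairs respectively), it decomposes as a weighted average and therefore cannot lie below both per-type rates.

```latex
\[
\mathrm{WKDR}
  = \frac{N_{=}\,\mathrm{WKDR}_{=} + N_{\neq}\,\mathrm{WKDR}_{\neq}}
         {N_{=} + N_{\neq}},
\qquad
\min\!\big(\mathrm{WKDR}_{=},\,\mathrm{WKDR}_{\neq}\big)
  \;\le\; \mathrm{WKDR} \;\le\;
\max\!\big(\mathrm{WKDR}_{=},\,\mathrm{WKDR}_{\neq}\big).
\]
```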
-
I am trying to reproduce this paper in Caffe.
I am using the NYU2 raw depth dataset, about 12K~13K sampled depth images.
For your training task, did you use filtered images (cross-bilateral filter or colorization) or not (raw de…
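For context, a common practice when training directly on raw (unfilled) Kinect depth, not necessarily what this paper does, is to mask out the missing-depth pixels so they do not contribute to the loss. A minimal framework-neutral sketch:

```python
# Sketch of common practice: ignore pixels with no depth reading (value 0)
# when computing a depth loss on raw Kinect maps.
import numpy as np

def masked_l1(pred, target):
    """Mean absolute depth error over valid pixels only (0 = missing depth)."""
    valid = target > 0
    return float(np.abs(pred[valid] - target[valid]).mean()) if valid.any() else 0.0
```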
-
Can I estimate the real distance after prediction? Can I get some depth information besides color?
Thank you!
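Whether the prediction is already metric depends on how the network was trained. Assuming it outputs a per-pixel depth map in meters (common for NYU-trained models), each pixel value is a distance; if it predicts log-depth or a normalized value, invert that transform first. A small sketch under that assumption (the file name is hypothetical):

```python
# Sketch, assuming the saved prediction is an HxW depth map in meters.
import numpy as np

pred = np.load("prediction.npy")   # hypothetical saved prediction, HxW
u, v = 320, 240                    # pixel of interest (column, row)
print(f"estimated distance at ({u}, {v}): {pred[v, u]:.2f} m")
```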