nischnei opened this issue 6 years ago
@nischnei Hi nischnei, do you think this is a correct implementation of sparse convolution in TensorFlow? The code from the paper has not been released yet.
@buldajs Hey buldajs,
I'm one of the authors of the paper and I got the code working yesterday. There are minor improvements to be made, which I will address in a separate post. Furthermore, we should maybe add an example, especially if you want to use it for the depth completion task we just released:
http://www.cvlibs.net/datasets/kitti/eval_depth.php?benchmark=depth_completion
We will also release the Caffe version of our code on the webpage, but we have to extract it first, as it is part of a large deep learning framework that our company does not allow us to release.
Hello,
Yes, of course you can link it to your homepage! I'm very open to any changes/issues.
EDIT: I've changed my username to my abbreviation, in case of confusion.
@nischnei Hi nischnei, thanks for your answer! I am doing research on depth completion, and I implemented the sparse convolution operator in TensorFlow as the authors did. I also ran some experiments on the depth completion task and observed that sparse convolution can handle different scales of input data, but the result in your paper is still very impressive (multi-scale training and multi-scale testing with the same 0.99 MAE). It's amazing: we can observe the sparsity-invariant features, although they may still change slightly with scale. So I am wondering whether I missed some small details.
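For reference, here is a minimal sketch of the sparsity-invariant convolution as I understand it from the paper, written in TF 1.x style (the variable names, initializers, and regularizer weight are my own assumptions, not the official code):

```python
import tensorflow as tf

def sparse_conv(features, mask, num_filters, kernel_size, name):
    """Sparsity-invariant convolution: convolve only valid pixels and
    normalise by the number of valid pixels in each receptive field."""
    in_channels = features.get_shape().as_list()[-1]
    with tf.variable_scope(name):
        weights = tf.get_variable(
            'weights', [kernel_size, kernel_size, in_channels, num_filters],
            initializer=tf.contrib.layers.xavier_initializer(),
            regularizer=tf.contrib.layers.l2_regularizer(0.001))
        bias = tf.get_variable('bias', [num_filters],
                               initializer=tf.zeros_initializer())

        # Zero out unobserved pixels before convolving.
        conv = tf.nn.conv2d(features * mask, weights,
                            strides=[1, 1, 1, 1], padding='SAME')

        # Count the valid pixels under each kernel window.
        ones = tf.ones([kernel_size, kernel_size, 1, 1])
        counts = tf.nn.conv2d(mask, ones, strides=[1, 1, 1, 1], padding='SAME')

        # Normalise by the valid-pixel count and add the bias;
        # the small epsilon avoids division by zero in empty windows.
        output = conv / (counts + 1e-8) + bias

        # Propagate the mask: a location is valid if any input in its window was.
        new_mask = tf.nn.max_pool(mask, ksize=[1, kernel_size, kernel_size, 1],
                                  strides=[1, 1, 1, 1], padding='SAME')
    return output, new_mask
```

The normalisation by the valid-pixel count is what should make the output largely independent of the input sparsity.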
I have only had TensorFlow running since yesterday and haven't worked with it so far. I will need a bit more time to understand all the concepts, but hopefully I can give you better feedback by the end of the week. Currently I see that kernel regularization seems to be turned off and that the bias is initialized with 0.01 instead of zero (assuming you have mean-free images). Fixing both seems to lead to reasonable convergence on the mentioned dataset.
@nischnei I've added the kernel_regularization and changed the bias init. to zero.
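For anyone following along, the two changes could look roughly like this in a `tf.layers`-based setup (illustrative only; the regularizer weight, filter count, and input shape are placeholders):

```python
import tensorflow as tf

# Placeholder for a single-channel sparse depth map (shape is hypothetical).
inputs = tf.placeholder(tf.float32, [None, None, None, 1])

conv = tf.layers.conv2d(
    inputs,
    filters=16,                # hypothetical channel count
    kernel_size=3,
    padding='same',
    # Enable L2 kernel regularization (was previously turned off).
    kernel_regularizer=tf.contrib.layers.l2_regularizer(0.001),
    # Initialise the bias with zero (was previously 0.01).
    bias_initializer=tf.zeros_initializer())
```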
Hi @PeterTor
I am working on a single-image depth estimation project, and I would like to use dense depth maps for KITTI training as a starting point. I'm relatively new to the TensorFlow world, so it would help me to have the complete code rather than just the core function. Do you have a version you can share now?
Thank you very much.
Regards, Balaji
Hi @nischnei Your work is very detailed and excellent, especially the generation of the data sets. Although the paper describes the method for creating the data sets, unfortunately no related code can be found online. Could you share the code for generating the data sets? I am a student and want to reproduce the results of this article and explore the impact of dynamic objects on the data sets. Thank you very much!
Hi @PeterTor @nischnei Thanks for this excellent work! I have not been able to find the code for creating the depth ground-truth data. Could you share the code for generating the data set? I want to use it to conduct more in-depth research. Thank you very much!
Dear author,
thank you for the TensorFlow version of our paper, Sparsity Invariant CNNs. We would actually like to link your code on the KITTI homepage - would this be fine with you?
Thanks!
Nick