PeterTor / sparse_convolution

Sparse convolution implementation
MIT License

Link code on KITTI homepage #1

Open nischnei opened 6 years ago

nischnei commented 6 years ago

Dear author,

thank you for the TensorFlow version of our paper, Sparsity Invariant CNNs. We would actually like to link your code on the KITTI homepage; would this be fine with you?

Thanks!

Nick

buldajs commented 6 years ago

@nischnei Hi nischnei, do you think this is a correct implementation of sparse convolution in TensorFlow? The code for the paper has not been released yet.
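For concreteness, my understanding of the operation from the paper is the following (a minimal single-channel NumPy sketch of the idea, not the code from this repository; the function and variable names are mine):

```python
import numpy as np

def sparse_conv2d(x, mask, weights, eps=1e-8):
    """Single-channel sparsity-invariant convolution: the weighted sum
    over valid pixels is normalized by the number of valid pixels in
    each window, and the mask is propagated with a max operation."""
    k = weights.shape[0]
    pad = k // 2
    xp = np.pad(x * mask, pad)   # zero out invalid pixels, then pad
    mp = np.pad(mask, pad)
    h, w = x.shape
    out = np.zeros_like(x, dtype=float)
    new_mask = np.zeros_like(mask, dtype=float)
    for i in range(h):
        for j in range(w):
            xw = xp[i:i + k, j:j + k]
            mw = mp[i:i + k, j:j + k]
            # normalize by the count of valid pixels in the window
            out[i, j] = (weights * xw).sum() / (mw.sum() + eps)
            # output is valid if any input pixel in the window was valid
            new_mask[i, j] = mw.max()
    return out, new_mask
```

With a box kernel of ones and a constant depth map, the output stays at the input value wherever at least one valid pixel falls in the window, regardless of how sparse the mask is, which is the invariance property the paper describes.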

nischnei commented 6 years ago

@buldajs Hey buldajs,

I'm one of the authors of the paper and I got the code working yesterday. There are minor improvements to be made, which I will address in a separate post. Furthermore, we should maybe add an example, especially if you want to use it for the depth completion task we just released:

http://www.cvlibs.net/datasets/kitti/eval_depth.php?benchmark=depth_completion

We will also release the Caffe version of our code on the webpage, but we have to extract it first, as it is part of a large deep learning framework that our company does not allow us to release.

PeterTor commented 6 years ago

Hello,

Yes, of course you can link it from your homepage! I'm very open to any changes/issues.

EDIT: I've changed my username to my abbreviation, in case of confusion.

buldajs commented 6 years ago

@nischnei Hi nischnei, thanks for your answer! I am doing research on depth completion and implemented the sparse convolution operator in TensorFlow the way the authors describe. In my depth completion experiments I observed that sparse convolution can handle varying input scales to some extent, but the result in your paper is very impressive (the same 0.99 MAE with multi-scale training and multi-scale testing). I can observe the sparsity-invariant behavior in my features as well, although they change slightly with scale, so I am wondering whether I missed some small detail.

nischnei commented 6 years ago

I have had TensorFlow running since yesterday and haven't worked with it much so far. I will need a bit more time to understand all the concepts, but hopefully I can give you better feedback by the end of the week. Currently I see that kernel regularization seems to be turned off and that the bias is initialized with 0.01 instead of zero (assuming mean-free images). Fixing both seems to lead to reasonable convergence on the mentioned dataset.
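In the TensorFlow 1.x API of that time, the two fixes could look roughly like this (a configuration sketch only; the layer arguments in this repository may differ, and the regularization scale is a placeholder):

```python
import tensorflow as tf

# Placeholder input; shape is illustrative only.
inputs = tf.placeholder(tf.float32, [None, 64, 64, 1])

features = tf.layers.conv2d(
    inputs,
    filters=16,
    kernel_size=3,
    padding='same',
    # was missing: enable weight decay on the kernel
    kernel_regularizer=tf.contrib.layers.l2_regularizer(scale=1e-4),
    # was tf.constant_initializer(0.01): initialize the bias to zero
    bias_initializer=tf.zeros_initializer(),
)
```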

PeterTor commented 6 years ago

@nischnei I've added the kernel regularization and changed the bias initialization to zero.

balajipsk commented 6 years ago

Hi @PeterTor

I am working on a 'Single Image Depth Estimation' project, and I would like to use dense depth maps for KITTI training as a starting point. I'm relatively new to the TensorFlow world, so it would help me to have the complete code rather than just the core function. Do you have a version now that you can share?

Thank you very much.

Regards, Balaji

AbnerCSZ commented 5 years ago

Hi @nischnei Your work is very detailed and excellent, especially the generation of the data sets. Although the paper describes the method for creating the data sets, unfortunately no related code can be found online. Could you share the code for generating the data sets? I am a student and want to reproduce the results of this article and explore the impact of dynamic objects on the data sets. Thank you very much!

USTC-Keyanjie commented 3 years ago

Hi @PeterTor @nischnei Thanks for this excellent work! I cannot find the code for generating the depth ground-truth data. Can you share the code for generating the data set? I want to use it to conduct more in-depth research. Thank you very much!