MPI-IS / bilateralNN

Learning Sparse High Dimensional Filters with Neural Networks
http://bilateralnn.is.tue.mpg.de
BSD 3-Clause "New" or "Revised" License

Question: Why is scaled_back_data shared across samples within a batch? #16

Closed hzxie closed 5 years ago

hzxie commented 5 years ago

https://github.com/MPI-IS/bilateralNN/blob/e174e4e96bf5aa51b4adca53526feb7107e3136b/bilateralnn_code/src/caffe/layers/permutohedral_layer.cu#L134-L148

According to the code above, `scaled_back_data` is shared across all samples within a batch. Could you tell me the reason?

I think `scaled_back_data` should be declared inside the for loop.

hzxie commented 5 years ago

Because the variable only holds the result for the current sample: each iteration fully overwrites it before the result is copied out, so a single buffer declared outside the loop can be safely reused across samples.