Closed hzxie closed 5 years ago
https://github.com/MPI-IS/bilateralNN/blob/e174e4e96bf5aa51b4adca53526feb7107e3136b/bilateralnn_code/src/caffe/layers/permutohedral_layer.cu#L134-L148
According to the code above, `scaled_back_data` is shared across samples within a batch. Could you tell me the reason? I think `scaled_back_data` should be declared within the for loop, because the variable is used to save the result for the current sample only.