Hi, I want to port this code to DenseNet and other models, and I have run into a problem: when running `guided_backprop` on DenseNet, should I also replace the ReLUs inside the DenseBlocks and DenseLayers? I know we need to replace the ReLU layers in VGG, but in a DenseBlock the connectivity is different and more complex, so do those ReLUs need replacing as well?
It's tricky. I would try replacing every ReLU first, then only the ones inside the DenseBlocks, then only the ones in the DenseLayers, and see what each variant produces.
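For reference, here is a minimal sketch of what that replacement could look like in PyTorch (not this repo's own implementation; `GuidedReLU` and `replace_relus` are names made up for illustration). It walks torchvision's `densenet121` recursively and swaps every `nn.ReLU` for a guided variant. Passing the whole model replaces everything; passing a single submodule, e.g. `model.features.denseblock1`, restricts the swap to that block, which makes it easy to test the three variants above.

```python
import torch
import torch.nn as nn
from torchvision import models


class GuidedReLU(torch.autograd.Function):
    """ReLU whose backward pass also zeroes negative gradients (guided backprop)."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # Pass gradient only where the forward input was positive (normal
        # ReLU rule) AND the incoming gradient is positive (the "guided" part).
        return grad_out.clamp(min=0) * (x > 0).float()


class GuidedReLUModule(nn.Module):
    def forward(self, x):
        return GuidedReLU.apply(x)


def replace_relus(module):
    # named_children() recursion reaches the ReLUs inside every _DenseBlock
    # and _DenseLayer, so one call covers the whole module tree.
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            setattr(module, name, GuidedReLUModule())
        else:
            replace_relus(child)


model = models.densenet121(pretrained=True)  # newer torchvision uses weights=...
model.eval()
replace_relus(model)

img = torch.randn(1, 3, 224, 224, requires_grad=True)  # stand-in for a real image
out = model(img)
out[0, out.argmax()].backward()
saliency = img.grad  # guided gradients w.r.t. the input pixels
```

One caveat: depending on the torchvision version, DenseNet's `forward()` also applies a functional `F.relu` after the feature extractor, which module replacement cannot reach; that call would need patching separately.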
I have tried that, but the results don't seem very good. I will keep experimenting.
Hey @nessieyang, any progress on getting this to work with DenseNet?
DenseNet also has BN layers; do they affect the results?
@dontLoveBugs I'm not sure; there may be studies that have analyzed the effect of BN on backprop-based techniques.
Let us know if you find any.
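In the meantime, one thing that is easy to rule out: make sure the model is in `eval()` mode before backpropagating, so the BN layers normalize with their stored running statistics. In `train()` mode the normalization depends on the current batch, so the gradients (and the resulting visualization) would vary with batch composition. A minimal check:

```python
from torchvision import models

model = models.densenet121(pretrained=True)
model.eval()  # BN uses running mean/var; per-image gradients become deterministic
```

This does not settle whether BN qualitatively changes guided backprop results; it only removes the batch dependence.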