In the official DenseNet implementations (Caffe, Torch, PyTorch) and others (MXNet, torchvision), a ReLU activation is applied after the last batch norm and before the global pooling. However, this activation is missing from this Keras implementation.
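For illustration, here is a minimal sketch of where that activation would go in the network tail (this is not the repository's actual code; the layer names and the 7x7x1024 feature-map shape are assumptions for the example):

```python
import tensorflow as tf
from tensorflow.keras import layers

def network_tail(x):
    """Tail of a DenseNet after the last dense block."""
    x = layers.BatchNormalization(epsilon=1.001e-5, name='final_bn')(x)
    # The official implementations apply a ReLU here, between the final
    # batch norm and the global pooling; this Keras port skips it.
    x = layers.Activation('relu', name='final_relu')(x)
    x = layers.GlobalAveragePooling2D(name='avg_pool')(x)
    return x

# Hypothetical usage on a dummy feature map of the shape DenseNet-121
# produces after its last dense block:
inputs = layers.Input(shape=(7, 7, 1024))
model = tf.keras.Model(inputs, network_tail(inputs))
model.summary()
```

Without the ReLU, the global average pooling runs over raw batch-norm outputs, which can be negative, so the pooled features (and any pretrained weights converted from the official models) will not match the reference implementations.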