guanfuchen / DeepNetModel

Notes on the characteristics of common deep network architectures (diagrams and code)

Wide Residual Networks #2

Open guanfuchen opened 5 years ago

guanfuchen commented 5 years ago

Related paper: Wide Residual Networks, Sergey Zagoruyko and Nikos Komodakis (arXiv:1605.07146)

Abstract
Deep residual networks were shown to be able to scale up to thousands of layers and still have improving performance. However, each fraction of a percent of improved accuracy costs nearly doubling the number of layers, and so training very deep residual networks has a problem of diminishing feature reuse, which makes these networks very slow to train. To tackle these problems, in this paper we conduct a detailed experimental study on the architecture of ResNet blocks, based on which we propose a novel architecture where we decrease depth and increase width of residual networks. We call the resulting network structures wide residual networks (WRNs) and show that these are far superior over their commonly used thin and very deep counterparts. For example, we demonstrate that even a simple 16-layer-deep wide residual network outperforms in accuracy and efficiency all previous deep residual networks, including thousand-layer-deep networks, achieving new state-of-the-art results on CIFAR, SVHN, COCO, and significant improvements on ImageNet. Our code and models are available at https://github.com/szagoruyko/wide-residual-networks.
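To make the abstract's main idea concrete, here is a minimal PyTorch sketch of the pre-activation wide residual block B(3,3) the paper builds on, with dropout between the two convolutions. The class name `WideBasicBlock` and its argument names are my own for illustration, not taken from the paper's code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WideBasicBlock(nn.Module):
    """Pre-activation residual block B(3,3): BN-ReLU-conv twice,
    with optional dropout between the two convolutions (WRN style)."""

    def __init__(self, in_planes, out_planes, stride=1, dropout=0.0):
        super().__init__()
        self.bn1 = nn.BatchNorm2d(in_planes)
        self.conv1 = nn.Conv2d(in_planes, out_planes, kernel_size=3,
                               stride=stride, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_planes)
        self.dropout = nn.Dropout(p=dropout)
        self.conv2 = nn.Conv2d(out_planes, out_planes, kernel_size=3,
                               stride=1, padding=1, bias=False)
        # 1x1 projection when the shape changes, identity shortcut otherwise
        self.shortcut = None
        if stride != 1 or in_planes != out_planes:
            self.shortcut = nn.Conv2d(in_planes, out_planes, kernel_size=1,
                                      stride=stride, bias=False)

    def forward(self, x):
        out = F.relu(self.bn1(x))
        shortcut = self.shortcut(out) if self.shortcut is not None else x
        out = self.conv1(out)
        out = self.dropout(F.relu(self.bn2(out)))
        out = self.conv2(out)
        return out + shortcut
```

Widening just means choosing `out_planes` as a multiple k of the baseline width; the block itself is unchanged, which is why the paper can sweep depth and width independently.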
guanfuchen commented 5 years ago

(images not preserved)

Architecture details

(images not preserved)
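Under the paper's CIFAR configuration (total depth = 6N + 4; a 16-channel stem followed by three groups of widths 16k, 32k, 64k), the blocks assemble as below. This is a sketch reusing `WideBasicBlock` from the comment above; the class name `WideResNet` and its parameter names are hypothetical.

```python
class WideResNet(nn.Module):
    """WRN-depth-k for 32x32 inputs; reuses WideBasicBlock defined above."""

    def __init__(self, depth=28, k=10, num_classes=10, dropout=0.0):
        super().__init__()
        assert (depth - 4) % 6 == 0, "depth must be 6*N + 4"
        n = (depth - 4) // 6
        widths = [16, 16 * k, 32 * k, 64 * k]
        self.conv1 = nn.Conv2d(3, widths[0], kernel_size=3,
                               padding=1, bias=False)
        self.group1 = self._make_group(widths[0], widths[1], n, 1, dropout)
        self.group2 = self._make_group(widths[1], widths[2], n, 2, dropout)
        self.group3 = self._make_group(widths[2], widths[3], n, 2, dropout)
        self.bn = nn.BatchNorm2d(widths[3])
        self.fc = nn.Linear(widths[3], num_classes)

    @staticmethod
    def _make_group(in_planes, out_planes, n_blocks, stride, dropout):
        # First block widens/downsamples; the rest keep the shape.
        layers = [WideBasicBlock(in_planes, out_planes, stride, dropout)]
        layers += [WideBasicBlock(out_planes, out_planes, 1, dropout)
                   for _ in range(n_blocks - 1)]
        return nn.Sequential(*layers)

    def forward(self, x):
        out = self.conv1(x)
        out = self.group3(self.group2(self.group1(out)))
        out = F.relu(self.bn(out))
        out = F.adaptive_avg_pool2d(out, 1).flatten(1)
        return self.fc(out)

model = WideResNet(depth=28, k=10, num_classes=10, dropout=0.3)
print(model(torch.randn(2, 3, 32, 32)).shape)      # torch.Size([2, 10])
print(sum(p.numel() for p in model.parameters()))  # ~36.5M for WRN-28-10
```

The parameter count printed for WRN-28-10 should land near the roughly 36.5M the paper reports for that configuration.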

guanfuchen commented 5 years ago

Performance comparison

(image not preserved)

guanfuchen commented 5 years ago

Conclusions

(image not preserved)