Lyken17 / SparseNet

[ECCV 2018] Sparsely Aggregated Convolutional Networks https://arxiv.org/abs/1801.05895
MIT License

data augmentation in pytorch #8

Closed Sirius083 closed 5 years ago

Sirius083 commented 5 years ago

Can you post your PyTorch implementation of the data augmentation/training details? (Is it the same as the TensorFlow one?) I ask because the PyTorch data augmentation uses a per-channel mean while the TensorFlow implementation uses a per-pixel mean. Thanks in advance.
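
For context, a minimal sketch of the two normalization schemes being compared (illustrative only, not the repo's actual preprocessing code; the data array is a placeholder):

```python
import numpy as np

# Placeholder CIFAR-like training set: (N, 32, 32, 3), values in [0, 255].
train_images = np.random.rand(100, 32, 32, 3).astype(np.float32) * 255

# Per-channel mean/std: one scalar per RGB channel (common in PyTorch pipelines).
channel_mean = train_images.mean(axis=(0, 1, 2))   # shape (3,)
channel_std = train_images.std(axis=(0, 1, 2))     # shape (3,)
per_channel = (train_images - channel_mean) / channel_std

# Per-pixel mean: a full 32x32x3 mean image (DenseNet/tensorpack style).
pixel_mean = train_images.mean(axis=0)              # shape (32, 32, 3)
per_pixel = train_images - pixel_mean
```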

Lyken17 commented 5 years ago

I don't think these two will make much difference.

Lyken17 commented 5 years ago

Let me know if this detail makes a significant change in performance. If that is true, I will rewrite the data pre-processing after NIPS.

Sirius083 commented 5 years ago

[attached results screenshot]
These are the results I got with the TensorFlow implementation of SparseNet using the DenseNet preprocessing. I did not get the same accuracy as DenseNet either (about 1% lower); maybe this is due to optimization differences between TensorFlow and PyTorch. I will look into it further. Thank you very much.
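
For reference, a minimal sketch of the standard DenseNet-style CIFAR augmentation (4-pixel padding, random 32x32 crop, horizontal flip) written with torchvision; the normalization statistics below are the commonly quoted CIFAR-10 values, not numbers taken from this repository:

```python
import torchvision.transforms as T

# Assumed CIFAR-10 per-channel statistics (not from this repo).
CIFAR10_MEAN = (0.4914, 0.4822, 0.4465)
CIFAR10_STD = (0.2470, 0.2435, 0.2616)

train_transform = T.Compose([
    T.RandomCrop(32, padding=4),       # pad 4 pixels, then random 32x32 crop
    T.RandomHorizontalFlip(),
    T.ToTensor(),
    T.Normalize(CIFAR10_MEAN, CIFAR10_STD),
])

test_transform = T.Compose([
    T.ToTensor(),
    T.Normalize(CIFAR10_MEAN, CIFAR10_STD),
])
```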

Sirius083 commented 5 years ago

> I don't think these two will make much difference.

Thanks for your reply. I finally found that it is because the TensorFlow (tensorpack) implementation does not include the batch normalization parameters in the L2 weight decay. After matching that, the model gives better results.
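
For anyone hitting the same gap, a minimal PyTorch sketch of excluding batch-normalization parameters (and biases) from L2 weight decay via optimizer parameter groups; `split_weight_decay_params` is an illustrative helper, not code from this repository:

```python
import torch.nn as nn

def split_weight_decay_params(model: nn.Module, weight_decay: float = 1e-4):
    """Build optimizer parameter groups so that batch-norm parameters
    (and biases) receive no L2 weight decay, matching the behaviour
    described above for the tensorpack implementation."""
    decay, no_decay = [], []
    for module in model.modules():
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            # BN gamma/beta go into the no-decay group.
            no_decay.extend(p for p in module.parameters(recurse=False) if p.requires_grad)
        else:
            for name, p in module.named_parameters(recurse=False):
                if not p.requires_grad:
                    continue
                (no_decay if name == "bias" else decay).append(p)
    return [
        {"params": decay, "weight_decay": weight_decay},
        {"params": no_decay, "weight_decay": 0.0},
    ]

# Usage with a hypothetical model:
# optimizer = torch.optim.SGD(split_weight_decay_params(model, 1e-4),
#                             lr=0.1, momentum=0.9, nesterov=True)
```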