HRNet / DEKR

This is an official implementation of our CVPR 2021 paper "Bottom-Up Human Pose Estimation Via Disentangled Keypoint Regression" (https://arxiv.org/abs/2104.02300)

set_epoch for DistributedSampler #13

Closed ArchNew closed 3 years ago

ArchNew commented 3 years ago

Describe the bug The PyTorch ImageNet example suggests calling the set_epoch function of the DistributedSampler class before each epoch starts. I could not find such a call anywhere in your code.

https://github.com/pytorch/examples/blob/master/imagenet/main.py Line 232-234

As can be seen from the DistributedSampler class code (https://github.com/pytorch/pytorch/blob/master/torch/utils/data/distributed.py), set_epoch is required to set the seed used by each __iter__ call, so that the shuffling order changes from epoch to epoch.
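For context, this is the pattern from the linked ImageNet example: the sampler's epoch counter seeds the shuffle, so the call has to happen before iterating the loader each epoch. A minimal, self-contained sketch (the toy dataset stands in for the real training data, and num_replicas/rank are hard-coded only so it runs without an initialized process group):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

# Toy dataset standing in for the real training data.
dataset = TensorDataset(torch.arange(100, dtype=torch.float32))

# num_replicas/rank are fixed here only so the snippet runs without
# torch.distributed being initialized; normally they are inferred.
sampler = DistributedSampler(dataset, num_replicas=1, rank=0, shuffle=True)
loader = DataLoader(dataset, batch_size=8, sampler=sampler)

for epoch in range(2):
    # Without this call, DistributedSampler shuffles with the same seed
    # every epoch, so every epoch replays the identical sample order.
    sampler.set_epoch(epoch)
    for (batch,) in loader:
        pass  # forward/backward would go here
```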

Can you confirm if this function has been called on DistributedSampler (for training dataset) at some point in your code?

Copyright Claim: I am asking the same question that @ananyahjha93 did, so I copied and slightly modified his post here: https://github.com/PyTorchLightning/pytorch-lightning/issues/224#issue-493778958

Gengzigang commented 3 years ago

Hi, you can find the DistributedSampler in lib/dataset/build.py.

ArchNew commented 3 years ago

> Hi, you can find the DistributedSampler in lib/dataset/build.py.

Either you don't understand what set_epoch() does or you don't understand what I'm saying.

Gengzigang commented 3 years ago

I am very sorry for misunderstanding you. I indeed did not call this function and I will fix it and test the results.
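For reference, a sketch of the kind of change being discussed, under the assumption that training loops over epochs with a loader whose sampler is the DistributedSampler built in lib/dataset/build.py. The names below (train_loader, train_one_epoch, begin_epoch, end_epoch) are illustrative only, not identifiers from this repository:

```python
from torch.utils.data.distributed import DistributedSampler

def run_training(train_loader, train_one_epoch, begin_epoch, end_epoch):
    for epoch in range(begin_epoch, end_epoch):
        if isinstance(train_loader.sampler, DistributedSampler):
            # Re-seed the sampler so the shuffle order differs per epoch.
            train_loader.sampler.set_epoch(epoch)
        train_one_epoch(train_loader, epoch)
```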

ArchNew commented 3 years ago

> I am very sorry for misunderstanding you. I indeed did not call this function and I will fix it and test the results.

I'm sure you can get better results than those reported in your paper. :D