JihwanEom opened 2 years ago
Hi, although these operations are currently non-deterministic, the results of the different interpolate modes should not fluctuate widely. You could give it a try; we look forward to your feedback. If the results are still unacceptable, we will try to fix it.
Okay, are there any existing experiments on the deterministic option? If not, I'd like to conduct them and share the results :)
Not yet. You may find that --deterministic
(which switches on "deterministic" mode, slowing down training but making the results reproducible)
is implemented by:
if deterministic:
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
in ./mmseg/apis/train.py. So if you want to conduct experiments on interpolations, you can search the codebase for F.interpolate and modify its mode argument.
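As a minimal sketch of what changing the mode looks like (the actual call sites in mmsegmentation may differ; this is just a standalone illustration), note that the mode argument is a one-line swap, and that modes such as 'nearest' avoid the non-deterministic CUDA backward of 'bilinear':

```python
import torch
import torch.nn.functional as F

# A dummy 1x1x4x4 feature map standing in for a real network output.
x = torch.arange(16, dtype=torch.float32).reshape(1, 1, 4, 4)

# 'bilinear' has a non-deterministic backward on CUDA;
# 'nearest' is deterministic but less smooth.
up_bilinear = F.interpolate(x, scale_factor=2, mode='bilinear', align_corners=False)
up_nearest = F.interpolate(x, scale_factor=2, mode='nearest')

print(up_bilinear.shape)  # torch.Size([1, 1, 8, 8])
print(up_nearest.shape)   # torch.Size([1, 1, 8, 8])
```

Forward passes on CPU are deterministic either way; the difference matters for gradients on GPU.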
You could pick one config file at random and first compare the different interpolation modes against each other.
Okay, thank you for your kind comment. I'll conduct experiments on this when I have time.
Hi, I have a question about how to run reproducible experiments.
Here is PyTorch documentation about deterministic option : https://pytorch.org/docs/stable/generated/torch.use_deterministic_algorithms.html
In this document, I confirmed that torch.nn.functional.interpolate() [linear, bilinear, bicubic, trilinear] is currently listed as a non-deterministic operation. But I can't give up these interpolation modes, for the sake of training efficiency.
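For context, one partial workaround I noticed in the linked documentation (assuming a reasonably recent PyTorch, 1.11 or later, where the warn_only keyword exists) is to enable deterministic algorithms in warn-only mode, so the non-deterministic interpolate backward only emits a warning instead of raising an error:

```python
import torch

# warn_only=True: non-deterministic ops (e.g. the CUDA backward of
# bilinear interpolate) trigger a UserWarning instead of a RuntimeError,
# so training keeps running at full speed while flagging the issue.
torch.use_deterministic_algorithms(True, warn_only=True)

print(torch.are_deterministic_algorithms_enabled())  # True
```

This does not make the interpolation itself deterministic, of course; it only surfaces where the non-determinism occurs.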
Is there any good solution for my situation?
Thank you. Jihwan