Closed: Moreland-cas closed this issue 10 months ago
Hi!
Have you already looked at `torch.use_deterministic_algorithms`?

Another possible source is the dataloaders fetching data in a random order. I think if you set the number of workers in the dataloader to 0, it should be deterministic.
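A minimal sketch of both suggestions, assuming PyTorch 1.8+; the toy dataset, batch size, and seed value are placeholders rather than anything from this repository:

```python
import os

import torch
from torch.utils.data import DataLoader, TensorDataset

# Needed for deterministic cuBLAS (CUDA >= 10.2); without it, CUDA matmuls
# raise a RuntimeError once deterministic mode is enabled.
os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"

# Ask PyTorch to use deterministic kernels (and error out where none exists).
torch.use_deterministic_algorithms(True)
torch.backends.cudnn.benchmark = False  # benchmark mode may pick different kernels per run

# Toy dataset purely to illustrate the DataLoader settings.
dataset = TensorDataset(torch.arange(16).float().unsqueeze(1))
loader = DataLoader(
    dataset,
    batch_size=4,
    shuffle=True,
    num_workers=0,                               # single-process loading, no worker-ordering races
    generator=torch.Generator().manual_seed(0),  # fixes the shuffle order across runs
)

for (batch,) in loader:
    print(batch.squeeze(1).tolist())  # same batch order on every run
```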
Thanks a lot! Setting `deterministic = True` perfectly addresses my problem :)
Hello,
I am encountering an issue with inconsistent results when running the code, despite setting the random seed for reproducibility. I have tried using the original random-seed-setting code provided in your repository; however, I am still getting different outcomes across repeated runs.
The original seed-setting code in your repository is:

```python
def _set_random_seed(seed) -> None:
    """Set randomness seed in torch and numpy"""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
```

I am using a single GPU for experiments on static scenes from the nerf-synthetic dataset. I suspect there might be other parts of the code or dependencies that introduce randomness.
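For reference, the extra seeding I was considering on my side looks roughly like this; the CUDA and cuDNN lines are my own guess at additional sources of randomness, not code taken from your repository:

```python
import random

import numpy as np
import torch


def _set_random_seed(seed) -> None:
    """Set randomness seed in torch and numpy (plus, tentatively, the CUDA side)."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    # Guessed additions beyond the repository version:
    torch.cuda.manual_seed_all(seed)           # seed all GPU generators
    torch.backends.cudnn.deterministic = True  # force deterministic cuDNN kernels
    torch.backends.cudnn.benchmark = False     # disable autotuning, which can vary per run
```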
Could you please help me identify any potential sources of variability or suggest further modifications to ensure consistent results?
Thank you for your assistance.