Closed · themurtazanazir closed this 1 week ago
Good heavens. We can either not worry about determinism, or have it always generate on CPU (since this only happens once).
Yes, we can generate the transform on the CPU and move it to the GPU afterwards. However, our current experiments initialized it on the GPU, so we can ignore this for now and fix it later on.
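A minimal sketch of the CPU-first approach described above (the helper name `make_transform` and the square-matrix shape are assumptions for illustration, not the actual code): seed a CPU `torch.Generator`, sample the transform from the CPU stream so the values are device-independent, then move the result to the GPU.

```python
import torch

def make_transform(dim: int, seed: int = 0) -> torch.Tensor:
    # Hypothetical sketch: build the random transform on CPU so the
    # values depend only on the seed, not on which device we run on.
    g = torch.Generator(device="cpu")
    g.manual_seed(seed)
    w = torch.randn(dim, dim, generator=g)  # always drawn from the CPU stream
    # Move to GPU afterwards (only happens once, so the copy is cheap).
    return w.cuda() if torch.cuda.is_available() else w
```

Because sampling always happens on the CPU stream, two runs with the same seed produce identical transforms regardless of whether a GPU is present.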
Shall we close this? It doesn't seem essential to do.
Yes... seems non-essential at this point. Can reopen when needed.
`torch.Generator` creates different numbers for the same seed on CPU and GPU. Replication code:
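The original replication snippet isn't included here, so the following is a hedged reconstruction of what it likely showed (the helper name `sample` is an assumption): seed a `torch.Generator` on each device with the same value and compare the draws. The GPU branch only runs when CUDA is available.

```python
import torch

def sample(device: str, seed: int = 0, n: int = 5) -> torch.Tensor:
    # Draw n values from a freshly seeded Generator on the given device.
    g = torch.Generator(device=device)
    g.manual_seed(seed)
    return torch.randn(n, generator=g, device=device)

cpu_vals = sample("cpu")
print(cpu_vals)

if torch.cuda.is_available():
    gpu_vals = sample("cuda")
    print(gpu_vals.cpu())
    # Same seed, different device: CPU and CUDA use different RNG
    # algorithms, so these streams generally do not match.
    print(torch.allclose(cpu_vals, gpu_vals.cpu()))
```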
So our random `transform` in clr will have different values on CPU and GPU.