We are seeing non-determinism in training in spite of the deterministic seeds in the code. We need to figure out what the source is (versions of libraries, etc.)
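For reference, a minimal sketch of seeding the RNG sources one typically controls from Python (the helper name and the NumPy-based setup are assumptions, not this project's actual code):

```python
import os
import random

import numpy as np

def set_all_seeds(seed: int) -> None:
    """Seed the Python-level RNG sources we control directly.

    Note: this alone does not guarantee determinism. GPU kernels,
    multi-threaded reductions, and library versions can still
    introduce run-to-run variation.
    """
    # Only affects string hashing if set before the interpreter starts.
    os.environ["PYTHONHASHSEED"] = str(seed)
    random.seed(seed)       # stdlib random
    np.random.seed(seed)    # NumPy global RNG

set_all_seeds(42)
```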
@jrwalsh1 realized that there are small, but not necessarily insignificant, floating-point errors that can accumulate in the calculations.
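To see the effect concretely, here is a small illustration (not from the issue itself): summing the same float32 values in two different orders generally gives slightly different results, which is how a change in reduction order across library versions or thread counts can break bit-for-bit reproducibility even with fixed seeds.

```python
import numpy as np

# Illustrative only: the values and sizes here are arbitrary.
rng = np.random.default_rng(0)
xs = rng.standard_normal(1_000_000).astype(np.float32)

s_forward = np.sum(xs)           # sum in the stored order
s_sorted = np.sum(np.sort(xs))   # same values, different order
print(s_forward, s_sorted)       # typically differ in the low bits
print(s_forward == s_sorted)     # usually False
```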