Random operations (e.g. tf.random.uniform(...)) have an underlying state that controls the sequence of random numbers generated. Setting the TensorFlow seed (tf.random.set_seed(n)) does not reset that state, so e.g.
```python
tf.random.set_seed(0)
x = tf.random.uniform(...)
sess.run(x)  # --> produces some number A
tf.random.set_seed(0)
sess.run(x)  # --> produces a different number B
```
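For reference, here is a runnable sketch of that snippet using the TF1-style graph/session API via tf.compat.v1 (which is where this stateful-op behaviour shows up); the shapes and seed here are just illustrative:

```python
import tensorflow as tf

with tf.Graph().as_default():
    # The graph-level seed has to be set before the op is created.
    tf.compat.v1.set_random_seed(0)
    x = tf.random.uniform(())

    with tf.compat.v1.Session() as sess:
        a = sess.run(x)  # produces some number A

        # Re-setting the seed does not reset the op's internal state...
        tf.compat.v1.set_random_seed(0)
        b = sess.run(x)  # ...so this produces a different number B

print(a != b)
```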
More concretely for NengoDL, this means that calling Simulator.reset() does not reset that internal RNG state either, so
```python
sim.run(...)
sim.reset()
sim.run(...)
```
may produce different random sequences in the two runs. This will probably surprise most users. Note, however, that there is currently no TensorFlow randomness in a standard Nengo model (all the randomness goes through NumPy), so the only way this can occur is if someone has built a TensorNode that contains random ops like tf.random.uniform.
There is currently no way to reset the TensorFlow RNG short of completely rebuilding the graph/simulator. However, tf.random.experimental (https://www.tensorflow.org/api_docs/python/tf/random/experimental) has experimental support for a different RNG implementation that does support resetting. I looked into supporting this (which would basically mean resetting tf.random.experimental.global_generator on Simulator.reset), but it still seems a bit buggy. We should investigate this more if/when that API leaves "experimental" status, though.
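As a rough sketch of why that API helps, here is the resettable-generator pattern using tf.random.Generator (where the tf.random.experimental generators ended up in current TensorFlow); this is my reading of that API, not anything NengoDL does today:

```python
import numpy as np
import tensorflow as tf

# An explicit, stateful generator object (formerly tf.random.experimental.Generator).
gen = tf.random.Generator.from_seed(0)

first = gen.uniform((3,)).numpy()
again = gen.uniform((3,)).numpy()  # state has advanced: differs from `first`

# Unlike the op-level RNG, this state can be reset in place --
# roughly what Simulator.reset() would do under this approach.
gen.reset_from_seed(0)
replay = gen.uniform((3,)).numpy()

print(np.array_equal(first, replay))  # True: the sequence restarts
```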