OmidSaremi closed this issue 4 years ago
TF Lattice itself provides options to fix the random seed of its internal operations, as well as deterministic initializers for layer weights. For example, [CalibratedLatticeEnsembleConfig](https://www.tensorflow.org/lattice/api_docs/python/tfl/configs/CalibratedLatticeEnsembleConfig) has a `random_seed` parameter for the random feature arrangement in the lattice ensemble. However, TF as a whole has several other sources of randomness: for example, the data might be shuffled and tensor initialization might be randomized. If you are using the estimator API, you may be able to get deterministic training (assuming the machine architecture and TF version are kept fixed) by setting `tf_random_seed` in the `RunConfig` passed to the estimator init, as is done in this tutorial:
estimator = tfl.estimators.CannedClassifier(..., config=tf.estimator.RunConfig(tf_random_seed=42))
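For the ensemble arrangement itself, fixing `random_seed` in the model config looks roughly like this (a sketch, not a complete model: `feature_configs` and the ensemble sizes are placeholders you would fill in with your own values):

```python
import tensorflow_lattice as tfl

# Sketch: random_seed fixes the random feature-to-lattice arrangement
# chosen when the ensemble is built (parameter names per the TFL docs).
model_config = tfl.configs.CalibratedLatticeEnsembleConfig(
    feature_configs=feature_configs,  # your list of tfl.configs.FeatureConfig
    num_lattices=6,                   # illustrative values
    lattice_rank=2,
    random_seed=42,                   # deterministic feature arrangement
)
```

With the arrangement seeded here and `tf_random_seed` set in the `RunConfig`, repeated runs on the same machine and TF version should produce the same ensemble structure.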
Keras training has a different randomness mechanism, so you'll need to look into the details or consult the TF or Keras teams on how to force deterministic training.
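For Keras-style training, a minimal sketch of global seeding (assuming a recent TF 2.x; these are general TF utilities, not TFL-specific ones):

```python
import tensorflow as tf

# Seeds Python's `random`, NumPy, and TensorFlow's global generator
# in one call (available in TF >= 2.7).
tf.keras.utils.set_random_seed(42)

# Optionally force deterministic op implementations (TF >= 2.8);
# ops with no deterministic kernel will then raise an error
# instead of silently varying between runs.
tf.config.experimental.enable_op_determinism()
```

Note that this still assumes a fixed machine architecture and TF version, and that your input pipeline (e.g. `Dataset.shuffle`) is also given a fixed seed.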
Hi there, a general question here: how do we do reproducible lattice training? Do you have examples?