plooney opened this issue 2 years ago
I have two machines: one with multiple GPUs and one with a single GPU. I do something like

```python
strategy = tf.distribute.MirroredStrategy() if multi_gpu else tf.distribute.get_strategy()
```

then in training I do

```python
with strategy.scope():
    optimizer.swap_weights()
```

This works fine on the multi-GPU machine, but on the single-GPU machine it raises:

```
ValueError: Swapping weights must occur under a tf.distribute.Strategy
```

Is there a way to use this on a single GPU?
MirroredStrategy supports a single GPU too, so you can use the same strategy on both machines.
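A minimal sketch of that suggestion, assuming `swap_weights()` comes from TensorFlow Addons' `MovingAverage` wrapper (the original post doesn't say which optimizer it uses): drop the `get_strategy()` fallback and create a `MirroredStrategy` unconditionally, which uses all visible GPUs, whether that's one or many.

```python
import tensorflow as tf
import tensorflow_addons as tfa  # assumed source of swap_weights()

# MirroredStrategy also works with a single visible GPU, so the same
# code path can serve both machines.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)  # 1 on the single-GPU machine

with strategy.scope():
    # Hypothetical model/optimizer for illustration.
    model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
    optimizer = tfa.optimizers.MovingAverage(tf.keras.optimizers.SGD())
    model.compile(optimizer=optimizer, loss="mse")

# ... train ...

# Swapping to the averaged weights now runs under a real strategy
# on either machine, so the ValueError is not raised.
with strategy.scope():
    optimizer.swap_weights()
```

The key point is that `tf.distribute.get_strategy()` outside any scope returns the default (no-op) strategy, which `swap_weights()` rejects, whereas `MirroredStrategy` is a real strategy even with one device.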