WailordHe opened 4 weeks ago
Your question: Is it possible to load an optimizer that was previously saved with a distributed optimizer configuration, and then continue training without using a distributed optimizer?
Not without a checkpoint converter. The distributed optimizer shards its state across data-parallel ranks, so the saved per-rank shards must first be merged back into full state tensors before a non-distributed optimizer can load them.
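As a rough illustration of what such a converter has to do, the toy sketch below (plain PyTorch, not an actual converter for this project; `merge_sharded_state` and the two-rank sharding are hypothetical) reassembles per-rank shards of one optimizer state tensor and installs the result into an ordinary `torch.optim.Adam` instance:

```python
import torch

def merge_sharded_state(shards):
    """Concatenate per-rank flat shards of one state tensor into a full tensor.

    A real converter would do this for every parameter's state entries
    (e.g. Adam's exp_avg and exp_avg_sq), following the sharding layout
    recorded in the distributed checkpoint.
    """
    return torch.cat(shards, dim=0)

# Toy example: a 1-D parameter of 8 elements whose exp_avg state was
# sharded across 2 hypothetical ranks.
param = torch.zeros(8, requires_grad=True)
full_exp_avg = torch.arange(8, dtype=torch.float32)
rank_shards = [full_exp_avg[:4].clone(), full_exp_avg[4:].clone()]

# Create a plain (non-distributed) optimizer and take one step with a zero
# gradient so that its state entries exist, then overwrite exp_avg with the
# merged tensor.
opt = torch.optim.Adam([param])
param.grad = torch.zeros_like(param)
opt.step()
opt.state[param]["exp_avg"] = merge_sharded_state(rank_shards).view_as(param)

assert torch.equal(opt.state[param]["exp_avg"], full_exp_avg)
```

The sketch only shows the merge step; a real converter must also map flattened parameter ranges back to named parameters and rewrite the checkpoint's metadata so it matches the non-distributed optimizer's `state_dict` format.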