Open · Jason-Hong-Child opened 3 weeks ago
Is it because I trained multiple times?
If you run multiple groups of parameters in parallel, they may overwrite each other's checkpoints. To avoid this, please modify "model_root: " in the config.
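A minimal sketch of that workaround: give each parallel run its own checkpoint directory instead of a shared model_root. The helper name and the uuid-based suffix below are illustrative assumptions, not part of the project's actual API:

```python
import os
import uuid

def unique_model_root(base_dir="checkpoints"):
    """Build a per-run checkpoint directory so parallel runs with
    different parameter groups never overwrite each other's files.

    Note: unique_model_root is a hypothetical helper; the real fix is
    simply pointing "model_root:" in each run's config at a distinct path.
    """
    run_id = uuid.uuid4().hex[:8]  # short unique suffix for this run
    root = os.path.join(base_dir, f"run_{run_id}")
    os.makedirs(root, exist_ok=True)
    return root

# Each run then saves under its own directory, e.g.
# checkpoints/run_1a2b3c4d/, rather than a shared model_root.
```

Equivalently, you can hard-code a distinct "model_root: " value per config file before launching the runs.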
When executing submit.py, the model parameters loaded from
"checkpoints/ebnerd_large_x1_2ed787f6/DIN_ebnerd_large_x1_001_1860e41e.model" are inconsistent with the DIN model structure.