Closed mathhyphen closed 8 months ago
Hello, to avoid this error you can just use the ADAM optimizer without the EMA: simply delete --use_ema (and therefore --ema_decay 0.999) from the "python3 train.py [...]" command line.
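To make the suggested change concrete, here is a sketch of the command line before and after removing the EMA flags. The other arguments (shown as [...] above) depend on your setup and are left elided:

```shell
# Before: training with EMA enabled (triggers the attribute error)
python3 train.py [...] --use_ema --ema_decay 0.999

# After: plain ADAM optimizer, no EMA — just drop both flags
python3 train.py [...]
```

Note that dropping EMA changes training behavior slightly: the saved weights will be the raw optimizer weights rather than an exponential moving average of them.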
Thank you for your answer; I will try your method.
Hi, thank you so much for your contributions. I was wondering: what can I do to avoid errors like 'EMA' object has no attribute '_optimizer_state_dict_pre_hooks'?