RihardsVitols opened this issue 4 years ago
Only the train_step-X.model checkpoints contain both generator and discriminator weights. The other checkpoints contain only the EMA weights of the generator.
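Based on that layout, you can tell the two kinds of checkpoint apart by their top-level keys before passing one to --ckpt. A minimal sketch, assuming the key names described in this thread ('generator'/'discriminator' for resumable checkpoints, EMA-only otherwise); `checkpoint_kind` is a hypothetical helper, not part of the repository:

```python
def checkpoint_kind(ckpt):
    """Classify a loaded checkpoint dict by its top-level keys.

    'full' checkpoints (train_step-X.model) can resume training;
    'ema_only' checkpoints (e.g. 010000.model) hold only the EMA
    generator weights. Key names are assumptions taken from this
    thread's description of the save format.
    """
    if "generator" in ckpt and "discriminator" in ckpt:
        return "full"
    return "ema_only"


# Dummy dicts standing in for the result of torch.load(path):
full = {"generator": {}, "discriminator": {}, "g_optimizer": {}}
ema = {"g_running": {}}
print(checkpoint_kind(full))   # full
print(checkpoint_kind(ema))    # ema_only
```

Loading an EMA-only checkpoint with code that expects a 'generator' key is what produces the KeyError reported further down in this thread.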
Thank you, and thank you for sharing this code.
@rosinality I don't understand the purpose of saving both the 010000.model checkpoints and train_step-X.model, because when I resumed training after a break, the model started training from the beginning (from 8×8 resolution) instead of from the saved state dict.
Thank you very much for your code.
@quyet0nguyen You can use the --init_size argument to start training at a larger resolution.
@rosinality thanks for your answer.
Dear all,
can someone tell me how I can load a checkpoint in train.py?
I tried this: python train.py --mixing lmdb --ckpt checkpoint/010000.model
result: File "train.py", line 321, in
generator.module.load_state_dict(ckpt['generator'])
KeyError: 'generator'
If I try without the .model file I get: PermissionError: [Errno 13] Permission denied: 'checkpoint'
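The two failures above have different causes: the KeyError comes from passing a 010000.model checkpoint (which, per the first comment, has no 'generator' key), and the PermissionError comes from passing the 'checkpoint' directory itself instead of a file inside it. A small pre-flight sketch; `validate_ckpt_path` is a hypothetical helper, not part of the repository:

```python
import os


def validate_ckpt_path(path):
    """Check a --ckpt argument before handing it to torch.load.

    Passing the 'checkpoint' directory itself is what triggers the
    PermissionError above; a missing file would fail later instead.
    """
    if os.path.isdir(path):
        raise IsADirectoryError(
            f"{path} is a directory; pass a .model file inside it, "
            f"e.g. a train_step-X.model checkpoint to resume training"
        )
    if not os.path.isfile(path):
        raise FileNotFoundError(path)
    return path
```

With a check like this, `--ckpt checkpoint` fails with a clear message instead of the opaque PermissionError, and the remaining KeyError is resolved by pointing --ckpt at a train_step-X.model file rather than a numbered EMA-only checkpoint.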
thank you