as-ideas / DeepPhonemizer

Grapheme to phoneme conversion with deep learning.
MIT License

Optimizer is None when trying to finetune a pretrained model #28

Open inigo-casanueva opened 1 year ago

inigo-casanueva commented 1 year ago

Thanks for the repo!

When trying to finetune one of the provided pretrained models, I got an unintuitive error. The cause: the models were saved without optimizer state, so when loading a checkpoint (line 76 in training/trainer.py) the check wouldn't prevent the optimizer state from being loaded, because `checkpoint['optimizer']` existed in the dict with a `None` value:

```python
optimizer = Adam(model.parameters())
if 'optimizer' in checkpoint:
    optimizer.load_state_dict(checkpoint['optimizer'])
for g in optimizer.param_groups:
    g['lr'] = config['training']['learning_rate']
```

Changing the condition to `if 'optimizer' in checkpoint and checkpoint['optimizer']:` should fix it.
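To illustrate the guard change without pulling in torch, here is a minimal sketch (the `load_optimizer_state` helper and the checkpoint dicts are hypothetical, for demonstration only): a key lookup with `in` succeeds even when the stored value is `None`, so the load must also require a truthy value.

```python
def load_optimizer_state(checkpoint):
    """Return the saved optimizer state only if it is actually present.

    The buggy check was `if 'optimizer' in checkpoint:`, which passes
    for checkpoints saved with `{'optimizer': None}` and then crashes
    on `load_state_dict(None)`. Adding the truthiness test avoids that.
    """
    if 'optimizer' in checkpoint and checkpoint['optimizer']:
        return checkpoint['optimizer']
    return None


# Pretrained checkpoint saved without optimizer state: key exists, value is None.
pretrained_ckpt = {'model': {'w': [0.1, 0.2]}, 'optimizer': None}

# Checkpoint from a normal training run: optimizer state was saved.
training_ckpt = {'model': {'w': [0.3, 0.4]}, 'optimizer': {'lr': 1e-4}}
```

With the corrected condition, `load_optimizer_state(pretrained_ckpt)` returns `None` (so the trainer can keep its freshly constructed Adam optimizer), while `load_optimizer_state(training_ckpt)` returns the saved state for restoring.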

cschaefer26 commented 1 year ago

Hi, thanks for the hint. I will update this if I have time :)