Closed FeU-aKlos closed 1 year ago
If I try to call prepare2 with the german-HuBERT-quantizer_14_epoch.pth checkpoint, the following error emerges:
File "/usr/local/lib/python3.10/dist-packages/fairseq/checkpoint_utils.py", line 585, in _upgrade_state_dict
    {"criterion_name": "CrossEntropyCriterion", "best_loss": state["best_loss"]}
KeyError: 'best_loss'
If the quantifier_hubert_base_ls960.pth checkpoint is used, everything works fine.
Is there a checkpoint of german-HuBERT-quantizer_14_epoch.pth available somewhere with the necessary metadata?
I get the same error when running in test mode.
You're not supposed to load the quantizer as the HuBERT checkpoint; load it as a quantizer model instead.
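The error itself hints at the mix-up: fairseq's loader expects a full checkpoint dict with metadata keys (like best_loss), while a quantizer .pth is typically just a bare state dict of weights. A minimal sketch of how one might tell the two apart before picking a loader; the key names here (model, best_loss, args, cfg) are inferred from the traceback and the usual fairseq checkpoint layout, so treat them as assumptions:

```python
# Sketch: distinguish a full fairseq checkpoint from a bare quantizer
# state dict. Key names ("model", "best_loss", "args", "cfg") are
# assumptions based on the traceback above, not a verified fairseq API.

def looks_like_fairseq_checkpoint(state: dict) -> bool:
    """A full fairseq checkpoint wraps the weights in metadata keys;
    a quantizer .pth is usually just the raw parameter dict."""
    has_metadata = any(k in state for k in ("best_loss", "args", "cfg"))
    return "model" in state and has_metadata

# Hypothetical contents for illustration only (real checkpoints hold tensors):
hubert_ckpt = {"model": {"w": [0.0]}, "best_loss": 1.23, "args": {}}
quantizer_ckpt = {"layer.weight": [0.1], "layer.bias": [0.0]}
```

If `looks_like_fairseq_checkpoint` returns False, the file should go through the quantizer's own loading path rather than fairseq's `checkpoint_utils`.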