Open rowntreerob opened 1 week ago
Hard-coding `ban_expressivity_tokens=True`, without the rest of the `if`/`then` at line 233, works fine.
Hey @rowntreerob! Thanks for raising this. It is indeed a bug: the variable `is_expressive_model`
is not initialised correctly here when the model name is a path (as in your example, "/content/drive/MyDrive/data/checkpoints/spiritlm_model/spirit-lm-expressive-7b").
We can reuse the function `_ensure_model_name`
from here to get the correct model name.
You are welcome to create a PR to fix it.
Fix in `spiritlm/model/spiritlm_model.py`: at lines 184 and 187, change `if name ==` to `if os.path.basename(name) ==`.
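A minimal sketch of that change, assuming the surrounding code compares the configured model name against the known checkpoint names (the helper `is_expressive` below is hypothetical, written only to illustrate the comparison, not the actual code in `spiritlm_model.py`):

```python
import os

def is_expressive(name: str) -> bool:
    # Hypothetical illustration of the suggested fix: compare only the last
    # path component, so a full checkpoint path still matches the model name.
    # before: if name == "spirit-lm-expressive-7b": ...
    # after:  if os.path.basename(name) == "spirit-lm-expressive-7b": ...
    return os.path.basename(os.path.normpath(name)) == "spirit-lm-expressive-7b"

# A bare model name and a full Drive path now both match.
print(is_expressive("spirit-lm-expressive-7b"))
print(is_expressive(
    "/content/drive/MyDrive/data/checkpoints/spiritlm_model/spirit-lm-expressive-7b"
))
```

`os.path.normpath` is used so a trailing slash on the path does not make `basename` return an empty string.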
Running in Colab and loading the model from Drive with:
spirit_lm = Spiritlm("/content/drive/MyDrive/data/checkpoints/spiritlm_model/spirit-lm-expressive-7b")
then running the generation step from the standard example raises the error below:
and the logger shows the following:
INFO:root:test
INFO:spiritlm.model.spiritlm_model:Loading SPIRIT-LM model from the path /content/drive/MyDrive/data/checkpoints/spiritlm_model/spirit-lm-expressive-7b...
INFO:spiritlm.model.spiritlm_model:SPIRIT-LM model is loaded.
INFO:spiritlm.model.spiritlm_model:Loading SPIRIT-LM speech tokenizers ...
INFO:spiritlm.model.spiritlm_model:SPIRIT-LM speech tokenizers are loaded.
INFO:DET:testRR