Fixed by adding sentencepiece==0.1.91 dependency to requirements.txt
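For reference, the fix amounts to a single pinned line in requirements.txt (version exactly as stated above):

```
sentencepiece==0.1.91
```

After updating the file, reinstalling with `pip install -r requirements.txt` picks up the new dependency.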
Error logs --
Traceback (most recent call last):
  File "train_joint_bert.py", line 58, in <module>
    bert_vectorizer = BERTVectorizer(is_bert, bert_model_hub_path)
  File "/content/drive/.shortcut-targets-by-id/1iSRbCLJ-3Wm3LawGU5ih2XMs0JTZe2zL/Intent-Classification-Slot-Filling/dialog-nlu/vectorizers/bert_vectorizer.py", line 17, in __init__
    self.create_tokenizer_from_hub_module(is_bert=is_bert)
  File "/content/drive/.shortcut-targets-by-id/1iSRbCLJ-3Wm3LawGU5ih2XMs0JTZe2zL/Intent-Classification-Slot-Filling/dialog-nlu/vectorizers/bert_vectorizer.py", line 27, in create_tokenizer_from_hub_module
    from vectorizers.tokenization import FullTokenizer
  File "/content/drive/.shortcut-targets-by-id/1iSRbCLJ-3Wm3LawGU5ih2XMs0JTZe2zL/Intent-Classification-Slot-Filling/dialog-nlu/vectorizers/tokenization.py", line 32, in <module>
    import sentencepiece as spm
ModuleNotFoundError: No module named 'sentencepiece'
The missing sentencepiece dependency caused a ModuleNotFoundError when running:
python train_joint_bert.py --train=data/snips/train --val=data/snips/valid --save=saved_models/joint_bert_model --epochs=5 --batch=64 --type=bert