Traceback (most recent call last):
  File "/mnt/dolphinfs/ssd_pool/docker/user/hadoop-mlm/by/train_llava/build_model.py", line 65, in <module>
    test()
  File "/mnt/dolphinfs/ssd_pool/docker/user/hadoop-mlm/by/train_llava/build_model.py", line 53, in test
    prompt = llava_processor.tokenizer.apply_chat_template(
  File "/mnt/dolphinfs/ssd_pool/docker/user/hadoop-mlm/by/conda_env/by/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1803, in apply_chat_template
    chat_template = self.get_chat_template(chat_template, tools)
  File "/mnt/dolphinfs/ssd_pool/docker/user/hadoop-mlm/by/conda_env/by/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1964, in get_chat_template
    raise ValueError(
ValueError: Cannot use chat template functions because tokenizer.chat_template is not set and no template argument was passed! For information about writing templates and setting the tokenizer.chat_template attribute, please see the documentation at https://huggingface.co/docs/transformers/main/en/chat_templating
Thanks for your code. I followed along and wrote my own version, but I'm currently stuck on the `apply_chat_template` line, with the error shown in the traceback above. Could you help me see where I went wrong? I honestly can't figure out why this happens.
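For context, the `ValueError` says `tokenizer.chat_template` is unset and no `chat_template` argument was passed, so `apply_chat_template` has no template to render. A minimal sketch of the idea (the template string and `render_chat` helper below are illustrative stand-ins, not the official LLaVA template; check the model card for the real one):

```python
# apply_chat_template renders a list of {"role", "content"} dicts into one
# prompt string using a Jinja template. A tiny pure-Python equivalent of
# what a simple template would produce:
def render_chat(messages):
    # Illustrative only -- real templates add model-specific special tokens.
    return "".join(f"{m['role']}: {m['content']}\n" for m in messages)

# Hypothetical minimal Jinja template (NOT the official LLaVA one):
SIMPLE_TEMPLATE = (
    "{% for message in messages %}"
    "{{ message['role'] }}: {{ message['content'] }}\n"
    "{% endfor %}"
)

messages = [{"role": "user", "content": "What is in the image?"}]
prompt = render_chat(messages)

# Two possible ways to resolve the error (llava_processor is from the
# question's own code, assumed to exist):
#
# 1) Pass a template per call:
#    prompt = llava_processor.tokenizer.apply_chat_template(
#        messages, chat_template=SIMPLE_TEMPLATE, tokenize=False
#    )
#
# 2) Set it once on the tokenizer before calling:
#    llava_processor.tokenizer.chat_template = SIMPLE_TEMPLATE
```

Another common cause is loading a base-model tokenizer instead of the chat/instruct variant whose `tokenizer_config.json` ships a `chat_template` entry; that is worth double-checking too.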