This issue was created on DagsHub by: Tizzzzy
I am getting this error:
Traceback (most recent call last):
  File "//train/LLaMA-8bit-LoRA/finetune_peft_8bit.py", line 261, in <module>
    main()
  File "//train/LLaMA-8bit-LoRA/finetune_peft_8bit.py", line 188, in main
    "unk_token": tokenizer.convert_ids_to_tokens(
  File "/opt/conda/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 388, in convert_ids_to_tokens
    for index in ids:
TypeError: 'NoneType' object is not iterable
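For context, the frame `for index in ids:` means `convert_ids_to_tokens` received `None` where it expected an id or a list of ids. A likely cause (an assumption, not confirmed by the traceback alone) is that the script passes `tokenizer.unk_token_id`, which can be `None` for some LLaMA tokenizers. The stand-in function below is a hypothetical simplification to reproduce the failure mode, not the actual transformers implementation:

```python
# Minimal sketch of the failure mode. Assumption: the script feeds
# tokenizer.unk_token_id (None on some LLaMA tokenizers) into
# convert_ids_to_tokens, which iterates over its argument.

def convert_ids_to_tokens(ids):
    # Hypothetical stand-in for the fast-tokenizer method: when given
    # a list of ids it iterates over them, which fails on None.
    return [f"<token_{index}>" for index in ids]

convert_ids_to_tokens([1, 2])  # works on a real list of ids

try:
    convert_ids_to_tokens(None)  # unk_token_id resolved to None
except TypeError as e:
    print(e)  # 'NoneType' object is not iterable
```

A common workaround is to guard the call, e.g. only look up the token when the id is not `None`.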