serp-ai / LLaMA-8bit-LoRA

Repository for Chat LLaMA - training a LoRA for the LLaMA (1 or 2) models on HuggingFace with 8-bit or 4-bit quantization. Research only.
https://serp.ai/tools/chat-llama/

Error on "unk_token": tokenizer.convert_ids_to_tokens #6

Open dagshub[bot] opened 8 months ago

dagshub[bot] commented 8 months ago

This issue was created on DagsHub by: Tizzzzy

I am getting this error:

Traceback (most recent call last):
  File "//train/LLaMA-8bit-LoRA/finetune_peft_8bit.py", line 261, in <module>
    main()
  File "//train/LLaMA-8bit-LoRA/finetune_peft_8bit.py", line 188, in main
    "unk_token": tokenizer.convert_ids_to_tokens(
  File "/opt/conda/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 388, in convert_ids_to_tokens
    for index in ids:
TypeError: 'NoneType' object is not iterable
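
The TypeError means convert_ids_to_tokens was handed None instead of a token id, so the fast tokenizer falls through to the branch that iterates over a list of ids. Below is a minimal defensive sketch, assuming the script derives the unk id from the model config or tokenizer, which can be None for some LLaMA checkpoints; the model path and fallback token are illustrative, not taken from the repository's code at line 188.

from transformers import AutoTokenizer

# Illustrative checkpoint; substitute whatever model the script actually loads.
tokenizer = AutoTokenizer.from_pretrained("huggyllama/llama-7b")

# convert_ids_to_tokens iterates over its argument when it is not an int,
# so passing None raises "TypeError: 'NoneType' object is not iterable".
unk_id = tokenizer.unk_token_id
if unk_id is not None:
    unk_token = tokenizer.convert_ids_to_tokens(unk_id)
else:
    unk_token = "<unk>"  # conventional LLaMA unk token; an assumption, verify for your checkpoint

tokenizer.add_special_tokens({"unk_token": unk_token})

Guarding the id before the convert_ids_to_tokens call (or hard-coding a known unk token) avoids the crash; the underlying cause is that the loaded config does not define the id the script expects.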
