When I loaded the model with `model = GenericLoraKbitModel('aleksickx/llama-7b-hf')` in examples/features/int4_finetuning/LLaMA_lora_int4.ipynb, I got the following error:

RecursionError: maximum recursion depth exceeded while calling a Python object
My System Info
According to the issue mentioned in https://github.com/huggingface/transformers/issues/22762, it seems that the tokenizer of llama-7b-hf may not be compatible with the latest version of transformers.
Could you share the versions of transformers (and any other relevant libraries) that successfully run the code in your notebook?
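To make it easier to compare environments, here is a small diagnostic sketch that prints the installed versions of the libraries most likely involved. The package list is an assumption on my part (transformers, tokenizers, peft, bitsandbytes, xturing); adjust it to whatever your notebook actually imports.

```python
from importlib.metadata import version, PackageNotFoundError

def report_versions(packages):
    """Return a mapping of package name -> installed version string,
    or None when the package is not installed."""
    report = {}
    for name in packages:
        try:
            report[name] = version(name)
        except PackageNotFoundError:
            report[name] = None  # not installed in this environment
    return report

# Assumed list of relevant packages; edit as needed for your setup.
print(report_versions(["transformers", "tokenizers", "peft", "bitsandbytes", "xturing"]))
```

Posting this output alongside the traceback should make it straightforward to spot any version mismatch with the environment the notebook was tested on.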