stochasticai / xTuring

Build, customize and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our discord community: https://discord.gg/TgHXuSJEk6
https://xturing.stochastic.ai

RecursionError: maximum recursion depth exceeded while calling a Python object #287

Open ForAxel opened 1 month ago

ForAxel commented 1 month ago

My System Info

- Python 3.10.15
- torch 2.4.1
- transformers 4.31.0
- xturing 0.1.8
- sentencepiece 0.1.99

When I loaded the model with `model = GenericLoraKbitModel('aleksickx/llama-7b-hf')` in examples/features/int4_finetuning/LLaMA_lora_int4.ipynb, I got the following error message:

RecursionError: maximum recursion depth exceeded while calling a Python object
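For reference, a minimal sketch of the code path that triggers the error, assuming the setup from the notebook above (the import path follows xTuring's published examples; versions are the ones listed under My System Info):

```python
# Minimal reproduction sketch for the RecursionError described above.
# Assumes xturing 0.1.8 and transformers 4.31.0, as in my environment.
from xturing.models import GenericLoraKbitModel

# The error is raised while this constructor loads the base model/tokenizer.
model = GenericLoraKbitModel('aleksickx/llama-7b-hf')
```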

According to the discussion in https://github.com/huggingface/transformers/issues/22762, it seems that the tokenizer of llama-7b-hf is not compatible with recent versions of transformers.

Could you provide the versions of transformers (and other libraries) that can successfully run the code in your notebook?
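In case it helps narrow things down, here is a small sketch I would use to check whether the tokenizer alone reproduces the recursion, independent of xTuring. This is only an isolation test under the assumption that the slow-to-fast tokenizer conversion is the culprit, as suggested in the linked transformers issue:

```python
# Isolation sketch: does loading the tokenizer by itself already recurse?
# Model name taken from the issue; the fallback to the slow (sentencepiece)
# tokenizer is an assumption based on the linked transformers issue.
from transformers import AutoTokenizer, LlamaTokenizer

try:
    # Default path: AutoTokenizer converts to the fast tokenizer.
    tok = AutoTokenizer.from_pretrained("aleksickx/llama-7b-hf")
except RecursionError:
    # Fallback: the slow tokenizer skips the conversion step suspected
    # of recursing.
    tok = LlamaTokenizer.from_pretrained("aleksickx/llama-7b-hf")

print(tok("hello world"))
```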