Reminder

[X] I have read the README and searched the existing issues.
System Info
llamafactory-cli env
[2024-06-28 02:51:51,234] [INFO] [real_accelerator.py:203:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[WARNING] Please specify the CUTLASS repo directory as environment variable $CUTLASS_PATH
[WARNING] sparse_attn requires a torch version >= 1.5 and < 2.0 but detected 2.3
[WARNING] using untested triton version (2.3.0), only 1.0.0 is known to be compatible
Traceback (most recent call last):
File "/home/kissoul/miniconda3/envs/lf/bin/llamafactory-cli", line 8, in <module>
sys.exit(main())
^^^^^^
File "/home/kissoul/WORKDIR/LLaMA-Factory/src/llamafactory/cli.py", line 111, in main
run_exp()
File "/home/kissoul/WORKDIR/LLaMA-Factory/src/llamafactory/train/tuner.py", line 56, in run_exp
run_dpo(model_args, data_args, training_args, finetuning_args, callbacks)
File "/home/kissoul/WORKDIR/LLaMA-Factory/src/llamafactory/train/dpo/workflow.py", line 45, in run_dpo
model = load_model(tokenizer, model_args, finetuning_args, training_args.do_train)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/kissoul/WORKDIR/LLaMA-Factory/src/llamafactory/model/loader.py", line 139, in load_model
model = load_unsloth_pretrained_model(config, model_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/kissoul/WORKDIR/LLaMA-Factory/src/llamafactory/model/model_utils/unsloth.py", line 53, in load_unsloth_pretrained_model
from unsloth import FastLanguageModel
File "/home/kissoul/WORKDIR/unsloth/unsloth/__init__.py", line 26, in <module>
raise ImportError(f"Unsloth: Please import Unsloth before {module}.")
ImportError: Unsloth: Please import Unsloth before bitsandbytes.
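The final ImportError comes from an import-order guard in unsloth/__init__.py: Unsloth wants to be imported before bitsandbytes, presumably so its patches apply first. A minimal sketch of how such a guard can work via sys.modules (illustrative names only, not the actual unsloth source):

```python
import sys

# Sketch of an import-order guard like the one unsloth/__init__.py raises
# from. Names such as check_import_order and the fake_* modules are
# illustrative assumptions, not real unsloth code.
def check_import_order(required_first: str, offending: str) -> None:
    """Raise ImportError if `offending` was imported before `required_first`."""
    if offending in sys.modules and required_first not in sys.modules:
        raise ImportError(f"Unsloth: Please import Unsloth before {offending}.")

# Simulate the failure mode: a bitsandbytes-like module is loaded first.
sys.modules.setdefault("fake_bitsandbytes", sys)  # stand-in module entry
try:
    check_import_order("fake_unsloth", "fake_bitsandbytes")
except ImportError as e:
    print(e)  # the guard fires because fake_unsloth is not loaded yet

# Loading the required module first makes the same check pass silently.
sys.modules.setdefault("fake_unsloth", sys)
check_import_order("fake_unsloth", "fake_bitsandbytes")
```

The workaround the message implies is to make sure nothing in the process imports bitsandbytes ahead of Unsloth.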
llamafactory version: 0.8.3.dev0

Reproduction
Expected behavior
Latest main branch.
Others
No response