hiyouga / LLaMA-Factory

Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
https://arxiv.org/abs/2403.13372
Apache License 2.0

Latest LLaMA-Factory repo forces Torch 2.4, which clashes with Unsloth/XFormers #5431

Open thusinh1969 opened 1 month ago

thusinh1969 commented 1 month ago

Reminder

System Info

The latest LLaMA-Factory repo (12 Sept 2024) forces Torch 2.4, which clashes with Unsloth/XFormers.

Reproduction

Expected behavior

The latest LLaMA-Factory should use Torch 2.3 so that it works with Unsloth.
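One way to get there is to pin the stack explicitly before installing LLaMA-Factory. A minimal sketch, assuming CUDA 12.1 wheels and the torch 2.3.0 / torchvision 0.18.0 / xformers 0.0.26.post1 release pairing (the exact versions are assumptions; adjust them to your build):

# Pin Torch 2.3 and an xformers build that passes Unsloth's version check
# (versions are assumptions for CUDA 12.1 wheels; adjust to your setup)
pip install "torch==2.3.0" "torchvision==0.18.0"
pip install --no-deps "xformers==0.0.26.post1" "trl<0.9.0" peft accelerate bitsandbytes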

Others


hiyouga commented 1 month ago

Could you share the crash info?

thusinh1969 commented 1 month ago

# Standard run w/ Unsloth:

# Installs Unsloth, Xformers (Flash Attention) and all other packages!
!pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
!pip install --no-deps "xformers<0.0.27" "trl<0.9.0" peft accelerate bitsandbytes

Otherwise, on local machines, Unsloth reports that xformers 0.0.27.post2 is too new and asks to
downgrade xformers via `pip install --force-reinstall "xformers<0.0.27"`.

# Re-installing with `pip install --force-reinstall "xformers<0.0.27"` pulls Torch back to 2.3, and then:

Traceback (most recent call last):
  File "/home/steve/env/bin/llamafactory-cli", line 5, in <module>
    from llamafactory.cli import main
  File "/mnt/data01/LLaMA-Factory/src/llamafactory/cli.py", line 21, in <module>
    from . import launcher
  File "/mnt/data01/LLaMA-Factory/src/llamafactory/launcher.py", line 15, in <module>
    from llamafactory.train.tuner import run_exp  # use absolute import
  File "/mnt/data01/LLaMA-Factory/src/llamafactory/train/tuner.py", line 22, in <module>
    from ..data import get_template_and_fix_tokenizer
  File "/mnt/data01/LLaMA-Factory/src/llamafactory/data/__init__.py", line 22, in <module>
    from .loader import get_dataset
  File "/mnt/data01/LLaMA-Factory/src/llamafactory/data/loader.py", line 30, in <module>
    from .template import get_template_and_fix_tokenizer
  File "/mnt/data01/LLaMA-Factory/src/llamafactory/data/template.py", line 24, in <module>
    from .mm_plugin import get_mm_plugin
  File "/mnt/data01/LLaMA-Factory/src/llamafactory/data/mm_plugin.py", line 5, in <module>
    from transformers import ProcessorMixin
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/home/steve/env/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1735, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/steve/env/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1747, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.processing_utils because of the following error (look up to see its traceback):
module 'torch.library' has no attribute 'register_fake'
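
For context: torch.library.register_fake exists only in Torch 2.4 and later, so this traceback usually means some package in the environment (most often a torchvision wheel built for Torch 2.4) still calls it while Torch itself has been downgraded to 2.3. A quick diagnostic sketch (my addition, not from the thread):

# Do torch and torchvision come from the same release pair?
python -c "import torch, torchvision; print(torch.__version__, torchvision.__version__)"
# register_fake is present only on Torch >= 2.4
python -c "import torch; print(hasattr(torch.library, 'register_fake'))"
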
hiyouga commented 1 month ago

see https://github.com/huggingface/diffusers/issues/8958
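
That diffusers issue traces the same register_fake error to a torch/torchvision version mismatch. If you stay on Torch 2.3 for Unsloth, a minimal repair sketch is to force-reinstall a matched pair (the version pairing is an assumption; pick the pair for your CUDA build):

# torch 2.3.0 pairs with torchvision 0.18.0; xformers 0.0.26.post1 targets torch 2.3.0
pip install --force-reinstall "torch==2.3.0" "torchvision==0.18.0"
pip install --no-deps "xformers==0.0.26.post1"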