bigcode-project / starcoder2

Home of StarCoder2!
Apache License 2.0

Clash in requirements for finetuning Starcoder2 #12

Open Exorust opened 6 months ago

Exorust commented 6 months ago

Facing the following error while trying to finetune Starcoder2 with the given script.

Description:

For transformers.AutoModelForCausalLM to recognize Starcoder2, transformers>=4.39.0 is required.

But trl still pins transformers==4.38.2. Even if I build trl from source and use trl==0.7.12.dev0, I still get an error.
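The clash in one place (a minimal sketch; the 4.38.2 pin and the 4.39.0 minimum come from this issue, and the dotted-version parsing is simplified, not full PEP 440):

```python
# Simplified dotted-version parsing for illustration only.
def parse(version):
    return tuple(int(part) for part in version.split("."))

starcoder2_min = parse("4.39.0")  # first transformers release with the starcoder2 model type
trl_pin = parse("4.38.2")         # transformers version pinned by trl at the time

# trl's pin is strictly below what Starcoder2 needs, so no single
# transformers release satisfies both constraints.
print(trl_pin < starcoder2_min)  # True
```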

Here is the error when using transformers==4.38.2:

KeyError                                  Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1127             try:
-> 1128                 config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1129             except KeyError:

KeyError: 'starcoder2'

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1128                 config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1129             except KeyError:
-> 1130                 raise ValueError(
   1131                     f"The checkpoint you are trying to load has model type `{config_dict['model_type']}` "
   1132                     "but Transformers does not recognize this architecture. This could be because of an "

ValueError: The checkpoint you are trying to load has model type `starcoder2` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date. 

Here is the error when using transformers==4.39.0:

ImportError                               Traceback (most recent call last)
<ipython-input-2-3ef713ffd06d> in <cell line: 1>()
----> 1 from trl import SFTTrainer
      2 print("trl version:", trl.__version__)

/usr/local/lib/python3.10/dist-packages/trl/__init__.py in <module>
      3 __version__ = "0.7.12.dev0"
      4 
----> 5 from .core import set_seed
      6 from .environment import TextEnvironment, TextHistory
      7 from .extras import BestOfNSampler

/usr/local/lib/python3.10/dist-packages/trl/core.py in <module>
     23 import torch.nn.functional as F
     24 from torch.nn.utils.rnn import pad_sequence
---> 25 from transformers import top_k_top_p_filtering
     26 
     27 from .import_utils import is_npu_available, is_xpu_available

ImportError: cannot import name 'top_k_top_p_filtering' from 'transformers' (/usr/local/lib/python3.10/dist-packages/transformers/__init__.py)

Exorust commented 6 months ago

The underlying issue is still open upstream, where it is reported as well: https://github.com/huggingface/trl/issues/1409

Exorust commented 5 months ago

Any update? @loubnabnl, any suggestions? The default finetuning script doesn't work out of the box.

mrmattwright-mt commented 5 months ago

If you change requirements.txt to reference the latest (git) versions of transformers and trl, this will work fine:

git+https://github.com/huggingface/transformers.git
accelerate==0.27.1
datasets>=2.16.1
bitsandbytes==0.41.3
peft==0.8.2
git+https://github.com/huggingface/trl.git
wandb==0.16.3
huggingface_hub==0.20.3

That should do it. I'm using Rye to install dependencies, and the same change worked for me; the Rye dependencies in pyproject.toml look like this:

dependencies = [
    "transformers @ git+https://github.com/huggingface/transformers.git",
    "accelerate==0.27.1",
    "datasets>=2.16.1",
    "bitsandbytes==0.41.3",
    "peft==0.8.2",
    "trl @ git+https://github.com/huggingface/trl.git",
    "wandb==0.16.3",
    "huggingface_hub==0.20.3",
]