Lightning-AI / litgpt

Pretrain, finetune, deploy 20+ LLMs on your own data. Uses state-of-the-art techniques: flash attention, FSDP, 4-bit, LoRA, and more.
https://lightning.ai
Apache License 2.0

ValueError: 'Meta-Llama-3-8B-Instruct' is not a supported config name #1351

Closed · BZandi closed this issue 3 weeks ago

BZandi commented 3 weeks ago

I receive the following error when using the download.py script to download the Llama 3 8B Instruct weights:

....
checkpoints/meta-llama/Meta-Llama-3-70B-Instruct/model-00006-of-00030.safetensors --> checkpoints/meta-llama/Meta-Llama-3-70B-Instruct/model-00006-of-00030.bin
checkpoints/meta-llama/Meta-Llama-3-70B-Instruct/model-00009-of-00030.safetensors --> checkpoints/meta-llama/Meta-Llama-3-70B-Instruct/model-00009-of-00030.bin
Converting checkpoint files to LitGPT format.
Traceback (most recent call last):
....
conf_dict = next(config for config in configs if name == config["hf_config"]["name"])
StopIteration
During handling of the above exception, another exception occurred:
....
ValueError: 'Meta-Llama-3-8B-Instruct' is not a supported config name

Steps to reproduce:

git clone https://github.com/Lightning-AI/litgpt 
cd litgpt
python litgpt/scripts/download.py --repo_id meta-llama/Meta-Llama-3-8B-Instruct --access_token=<TOKEN>

Any idea how to solve this error? Thanks in advance!

rasbt commented 3 weeks ago

Hi there,

Do you perhaps have an older version of litgpt installed? Maybe try:

git clone https://github.com/Lightning-AI/litgpt 
cd litgpt
pip install -e ".[all]"
litgpt download --repo_id meta-llama/Meta-Llama-3-8B-Instruct --access_token=<TOKEN>
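As a general way to diagnose this kind of stale-install problem (not specific to litgpt), you can ask Python which version of a distribution is actually installed in the active environment before and after reinstalling; a minimal sketch using only the standard library:

```python
# Sketch: report the installed version of a distribution, or None if it
# is not pip-installed in the active environment. The package name
# "litgpt" below is the one from this issue; any distribution name works.
from importlib.metadata import version, PackageNotFoundError
from typing import Optional

def installed_version(package: str) -> Optional[str]:
    """Return the installed distribution's version string, or None if absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

print(installed_version("litgpt"))  # None means litgpt is not installed here
```

If this prints an older version than the clone you are running from, the stale site-packages copy is the one being imported, which matches the behavior in this issue.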

carmocca commented 3 weeks ago

@BZandi Can you share the complete error stacktrace? You excluded the real exception, which would have appeared before the "During handling of the above exception, another exception occurred:" line.

BZandi commented 3 weeks ago

Thank you for the swift response! Yes, there was an older version of litgpt installed in my environment. Creating a new environment and installing litgpt as advised by @rasbt solved the issue!

Here is the full output of the previous error message as requested by @carmocca:

Converting checkpoint files to LitGPT format.
Traceback (most recent call last):
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/litgpt/config.py", line 99, in from_name
    conf_dict = next(config for config in configs if name == config["hf_config"]["name"])
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/pa/Desktop/litgpt/litgpt/scripts/download.py", line 137, in <module>
    CLI(download_from_hub)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/litgpt/utils.py", line 398, in CLI
    return CLI(*args, **kwargs)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/jsonargparse/_cli.py", line 96, in CLI
    return _run_component(components, cfg_init)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/jsonargparse/_cli.py", line 193, in _run_component
    return component(**cfg)
  File "/Users/pa/Desktop/litgpt/litgpt/scripts/download.py", line 107, in download_from_hub
    convert_hf_checkpoint(checkpoint_dir=directory, dtype=dtype, model_name=model_name)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/litgpt/scripts/convert_hf_checkpoint.py", line 301, in convert_hf_checkpoint
    config = Config.from_name(model_name)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/litgpt/config.py", line 101, in from_name
    raise ValueError(f"{name!r} is not a supported config name")
ValueError: 'Meta-Llama-3-8B-Instruct' is not a supported config name
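
For readers who hit the same trace: the failure mode is a plain Python pattern. Config.from_name searches a list of known configs with next(); when no entry matches (here, because the old installed version predates Llama 3 support), next() raises StopIteration, which litgpt then converts into the ValueError above. A minimal sketch with hypothetical config entries (litgpt's real config list is much larger):

```python
# Hypothetical stand-in for litgpt's config registry.
SUPPORTED_CONFIGS = [
    {"hf_config": {"name": "Llama-2-7b-hf"}},
    {"hf_config": {"name": "Meta-Llama-3-8B-Instruct"}},
]

def from_name(name: str) -> dict:
    """Return the config dict matching `name`, mirroring the lookup in the trace."""
    try:
        # next() on an exhausted generator raises StopIteration ...
        return next(c for c in SUPPORTED_CONFIGS if name == c["hf_config"]["name"])
    except StopIteration:
        # ... which is turned into the ValueError shown in this issue.
        raise ValueError(f"{name!r} is not a supported config name") from None
```

An install new enough to list the model name in its registry never reaches the except branch, which is why reinstalling fixed it.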

rasbt commented 3 weeks ago

Glad to hear that you were able to find & address the issue!