BZandi closed this issue 3 weeks ago
Hi there,
do you perhaps have an older version of litgpt installed? Maybe try:

```bash
git clone https://github.com/Lightning-AI/litgpt
cd litgpt
pip install -e ".[all]"
litgpt download --repo_id meta-llama/Meta-Llama-3-8B-Instruct --access_token=<TOKEN>
```
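Before reinstalling, it can also help to confirm which litgpt version the active environment actually resolves to. This is a small stdlib-only sketch for that check (it is not part of the litgpt CLI):

```python
# Quick sanity check: report which litgpt version (if any) is installed
# in the active environment. importlib.metadata is in the stdlib.
from importlib import metadata
from typing import Optional

def installed_version(package: str) -> Optional[str]:
    """Return the installed version of `package`, or None if it is absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

print(installed_version("litgpt"))
```

If this prints a version older than the one that added the Llama 3 configs, the editable install above should replace it.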
@BZandi Can you share the complete error stack trace? You excluded the real exception, which would have appeared before the "During handling of the above exception, another exception occurred:" line.
Thank you for the swift response! Yes, there was an older version of litgpt installed in my environment. Creating a new environment and installing litgpt as advised by @rasbt solved the issue!
Here is the full output of the previous error message as requested by @carmocca:
Converting checkpoint files to LitGPT format.

```
Traceback (most recent call last):
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/litgpt/config.py", line 99, in from_name
    conf_dict = next(config for config in configs if name == config["hf_config"]["name"])
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/pa/Desktop/litgpt/litgpt/scripts/download.py", line 137, in <module>
    CLI(download_from_hub)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/litgpt/utils.py", line 398, in CLI
    return CLI(*args, **kwargs)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/jsonargparse/_cli.py", line 96, in CLI
    return _run_component(components, cfg_init)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/jsonargparse/_cli.py", line 193, in _run_component
    return component(**cfg)
  File "/Users/pa/Desktop/litgpt/litgpt/scripts/download.py", line 107, in download_from_hub
    convert_hf_checkpoint(checkpoint_dir=directory, dtype=dtype, model_name=model_name)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/litgpt/scripts/convert_hf_checkpoint.py", line 301, in convert_hf_checkpoint
    config = Config.from_name(model_name)
  File "/Users/pa/anaconda3/envs/litgpt/lib/python3.9/site-packages/litgpt/config.py", line 101, in from_name
    raise ValueError(f"{name!r} is not a supported config name")
ValueError: 'Meta-Llama-3-8B-Instruct' is not a supported config name
```
Glad to hear that you were able to find & address the issue!
I receive the following error when using the `download.py` script to load the llama3-8b weights.

Steps to reproduce:
Any idea how to solve this error? Thanks in advance!