meta-llama / llama-models

Utilities intended for use with Llama models.

Download script fails with "403 Forbidden" error #70

Open nikollgjokaj opened 3 months ago

nikollgjokaj commented 3 months ago

I encountered an issue while attempting to download specific models using the provided download.sh script. The script successfully downloads the LICENSE file but fails to download the Use Policy file, resulting in a "403 Forbidden" error. Below are the details of the error and the steps I followed.

Steps to Reproduce:

Run the download.sh script.
Enter the provided URL: XXX
Enter the list of models to download: 8B,8B-instruct

Expected Result:

The script should successfully download the LICENSE and Use Policy files and proceed to download the specified models.

Actual Result:

The LICENSE file is downloaded successfully.
The script fails to download the Use Policy file, returning a "403 Forbidden" error.

Logs/Output:

```shell
--2024-07-28 11:29:57--  https://llama3-1.llamameta.net/LICENSE?XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Resolving llama3-1.llamameta.net (llama3-1.llamameta.net)... 18.239.69.95, 18.239.69.68, 18.239.69.73, ...
Connecting to llama3-1.llamameta.net (llama3-1.llamameta.net)|18.239.69.95|:443... connected.
HTTP request sent, awaiting response... 416 Requested Range Not Satisfiable

The file is already fully retrieved; nothing to do.

--2024-07-28 11:29:57--  https://llama3-1.llamameta.net/USE_POLICY?XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Resolving llama3-1.llamameta.net (llama3-1.llamameta.net)... 18.239.69.68, 18.239.69.95, 18.239.69.73, ...
Connecting to llama3-1.llamameta.net (llama3-1.llamameta.net)|18.239.69.68|:443... connected.
HTTP request sent, awaiting response... 403 Forbidden
2024-07-28 11:29:58 ERROR 403: Forbidden.
```

Environment:

OS: LINUX UBUNTU

Please let me know if any further information is required or if there are additional steps I can take to troubleshoot this issue.
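One thing worth checking locally before filing a report like this: whether the signed link has simply expired. The sketch below assumes the link is a CloudFront-style signed URL whose `Policy` query parameter is URL-safe base64 (`-`, `_`, `~` standing in for `+`, `=`, `/`) wrapping a JSON policy that carries an `AWS:EpochTime` expiry; that encoding is an assumption about how these links are built, not documented behaviour of llamameta.net.

```shell
# Check whether a CloudFront-style signed URL has already expired.
# Assumption: the Policy parameter is URL-safe base64 over a JSON
# policy containing "AWS:EpochTime" (standard CloudFront layout).
link_expired() {
    # Pull out the Policy query parameter.
    policy=$(printf '%s' "$1" | sed -n 's/.*[?&]Policy=\([^&]*\).*/\1/p')
    # Undo CloudFront's URL-safe base64 substitutions and decode.
    json=$(printf '%s' "$policy" | tr -- '-_~' '+=/' | base64 -d 2>/dev/null)
    # Extract the expiry epoch, if present.
    expiry=$(printf '%s' "$json" | grep -o '"AWS:EpochTime":[0-9]*' | grep -o '[0-9]*$')
    if [ -n "$expiry" ] && [ "$(date +%s)" -gt "$expiry" ]; then
        echo expired
    else
        echo ok-or-unknown
    fi
}
```

If this prints `expired`, re-request a fresh link from the application page rather than debugging the script.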

jeffreydunaway commented 3 months ago

Any updates? It appears another person asked the same question a few hours ago.

samuelselvan commented 3 months ago

@nikollgjokaj can I have the last URL parameter, Download-Request-ID? Please note that these URLs expire in 24 hours.
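For anyone who wants to share just that parameter without pasting the whole signed link, it can be extracted locally. A small sketch (the parameter name is taken from the URLs shown later in this thread; the example URL is a placeholder, not a working link):

```shell
# Extract the Download-Request-ID query parameter from a signed URL.
extract_request_id() {
    printf '%s\n' "$1" | sed -n 's/.*[?&]Download-Request-ID=\([^&]*\).*/\1/p'
}

# Placeholder URL for illustration only.
extract_request_id 'https://llama3-1.llamameta.net/*?Policy=X&Signature=Y&Key-Pair-Id=Z&Download-Request-ID=W'
# prints: W
```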

gachez commented 3 months ago

Use the download.sh script from this directory models/llama3_1/download.sh. That worked for me!

AmahAjavon commented 3 months ago

I'm having the same issue, any update? Using models/llama3_1/download.sh did not work for me either.

Piotr-rogal commented 3 months ago

I use this method and it works.

  1. Go to your llama folder (e.g. cd llama3)
  2. List files (ls)
  3. Remove download.sh (rm download.sh)
  4. Go to https://github.com/meta-llama/llama-models/blob/main/models/llama3_1/download.sh
  5. Copy data from download.sh (copy raw file)
  6. Use nano to create download.sh (nano download.sh)
  7. Paste data from memory (ctrl+v)
  8. End nano and write to file (ctrl+x)
  9. Add permission to download.sh (chmod +x download.sh)
  10. Run download.sh (./download.sh)

Everything else is the same as before: paste the link from the Meta email.

FryUoh commented 3 months ago

I tried the same method (replacing download.sh with the one from models/llama3_1, as described above), but it did not work for me (still reported 403).

rattrey commented 2 months ago

I'm getting the same issue. It looks like the core Python scripts install when running download.sh, but the weight files are missing.

AmahAjavon commented 2 months ago

There also seems to be an issue even once you download it: to actually use the model with tools like LangChain, a config.json is missing when you get the model from the official site versus Hugging Face, so there are some mismatches...

ashwinb commented 2 months ago

Could you try using the llama CLI, which you can obtain with pip install llama-toolchain? We have significantly upgraded its llama download functionality; please let us know if that ends up working better for downloading the models.

jstmn commented 2 months ago

@ashwinb I'm getting the following error when using llama download, any ideas?

(venv) jstm@yggdrasil:[~/Libraries/llama]: llama download --source meta --model-id="Llama-3-70B-Instruct" --hf-token=$url
Please provide the signed URL you received via email (e.g., https://llama3-1.llamameta.net/*?Policy...): https://llama3-1.llamameta.net/*?Policy=X&Signature=Y&Key-Pair-Id=Z&Download-Request-ID=W
Downloading `checklist.chk`...
Traceback (most recent call last):
  File "/home/jstm/Libraries/llama/venv/bin/llama", line 8, in <module>
    sys.exit(main())
  File "/home/jstm/Libraries/llama/venv/lib/python3.10/site-packages/llama_toolchain/cli/llama.py", line 54, in main
    parser.run(args)
  File "/home/jstm/Libraries/llama/venv/lib/python3.10/site-packages/llama_toolchain/cli/llama.py", line 48, in run
    args.func(args)
  File "/home/jstm/Libraries/llama/venv/lib/python3.10/site-packages/llama_toolchain/cli/download.py", line 156, in run_download_cmd
    _meta_download(model, meta_url)
  File "/home/jstm/Libraries/llama/venv/lib/python3.10/site-packages/llama_toolchain/cli/download.py", line 133, in _meta_download
    asyncio.run(downloader.download())
  File "/usr/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/home/jstm/Libraries/llama/venv/lib/python3.10/site-packages/llama_toolchain/cli/download.py", line 194, in download
    await self.get_file_info(client)
  File "/home/jstm/Libraries/llama/venv/lib/python3.10/site-packages/llama_toolchain/cli/download.py", line 183, in get_file_info
    response.raise_for_status()
  File "/home/jstm/Libraries/llama/venv/lib/python3.10/site-packages/httpx/_models.py", line 761, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '403 Forbidden' for url 'https://llama3-1.llamameta.net/*?Policy=X&Signature=Y&Key-Pair-Id=Z&Download-Request-ID=W'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403

edit: to clarify, this was with a fresh link

ved4589ved commented 2 months ago

I am facing the same issue after trying llama download --source meta --model-id Meta-Llama3.1-8B

AlienegraGeek commented 1 month ago

Note that when using download.sh for Llama 3, you need to use the matching URL from your application page to download successfully. Do not use links beginning with https://llama3-1.llamameta.net, which are for 3.1; use links beginning with https://download6.llamameta.net.

okellos commented 1 month ago

I was facing the same issue after trying llama download --source meta --model-id Meta-Llama3.1-8B.

Your command enabled me to download.