meta-llama / llama-stack-client-python

Python SDK for Llama Stack
Apache License 2.0

405 Error only on 70B-Instruct #4

Closed dnns92 closed 1 month ago

dnns92 commented 1 month ago
  1. Description

I want to download 3.1-70B-Instruct using the Python llama stack client. Downloading the 8B-Instruct model works fine, as do the other models I tested. 70B-Instruct fails while downloading `consolidated.01.pth`; `consolidated.00.pth` runs through. Also see the scrollback below.

Environment: WSL, Ubuntu 24.04 LTS, RTX 3090 GPU

  2. Steps to reproduce: `python3 -m venv .venv && source .venv/bin/activate`, then `llama download --source meta --model-id Llama3.1-70B-Instruct`, paste the signed URL, and wait

  3. Expected behaviour: successful download

  4. Actual behaviour: the download fails on `consolidated.01.pth` with error 405

  5. Scrollback

    llama download --source meta --model-id Llama3.1-70B-Instruct
    Please provide the signed URL you received via email (e.g., https://llama3-1.llamameta.net/*?Policy...): 
    <... redacted ...>
    Downloading `checklist.chk`...
    Already downloaded `/home/nano/.llama/checkpoints/Llama3.1-70B-Instruct/checklist.chk`, skipping...
    Downloading `tokenizer.model`...
    Already downloaded `/home/nano/.llama/checkpoints/Llama3.1-70B-Instruct/tokenizer.model`, skipping...
    Downloading `params.json`...
    Already downloaded `/home/nano/.llama/checkpoints/Llama3.1-70B-Instruct/params.json`, skipping...
    Downloading `consolidated.00.pth`...
    Already downloaded `/home/nano/.llama/checkpoints/Llama3.1-70B-Instruct/consolidated.00.pth`, skipping...
    Downloading `consolidated.01.pth`...
    Traceback (most recent call last):
      File "/mnt/d/repos/llama/llama3/.venv/bin/llama", line 8, in <module>
        sys.exit(main())
                 ^^^^^^
      File "/mnt/d/repos/llama/llama3/.venv/lib/python3.12/site-packages/llama_stack/cli/llama.py", line 44, in main
        parser.run(args)
      File "/mnt/d/repos/llama/llama3/.venv/lib/python3.12/site-packages/llama_stack/cli/llama.py", line 38, in run
        args.func(args)
      File "/mnt/d/repos/llama/llama3/.venv/lib/python3.12/site-packages/llama_stack/cli/download.py", line 174, in run_download_cmd
        _meta_download(model, meta_url)
      File "/mnt/d/repos/llama/llama3/.venv/lib/python3.12/site-packages/llama_stack/cli/download.py", line 143, in _meta_download
        asyncio.run(downloader.download())
      File "/usr/lib/python3.12/asyncio/runners.py", line 194, in run
        return runner.run(main)
               ^^^^^^^^^^^^^^^^
      File "/usr/lib/python3.12/asyncio/runners.py", line 118, in run
        return self._loop.run_until_complete(task)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/lib/python3.12/asyncio/base_events.py", line 687, in run_until_complete
        return future.result()
               ^^^^^^^^^^^^^^^
      File "/mnt/d/repos/llama/llama3/.venv/lib/python3.12/site-packages/llama_stack/cli/download.py", line 263, in download
        await self.get_file_info(client)
      File "/mnt/d/repos/llama/llama3/.venv/lib/python3.12/site-packages/llama_stack/cli/download.py", line 252, in get_file_info
        response.raise_for_status()
      File "/mnt/d/repos/llama/llama3/.venv/lib/python3.12/site-packages/httpx/_models.py", line 763, in raise_for_status
        raise HTTPStatusError(message, request=request, response=self)
    httpx.HTTPStatusError: Client error '405 Forbidden.' for url 'https://llama3-1.llamameta.net/Llama-3.1-70B-Instruct/consolidated.01.pth?Policy=<redacted>'
    For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/405
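The rejection here is consistent with the signed URL lapsing rather than anything specific to the 70B weights: the larger shards take long enough that the time-limited link can expire mid-run, and the follow-up comment confirms a fresh URL fixed it. Assuming the links are CloudFront-style signed URLs (which the `Policy` query parameter suggests; this is not code from the CLI itself, and the URL format is an assumption), the embedded expiry can be inspected with a short standard-library sketch:

```python
import base64
import json
import time
from urllib.parse import parse_qs, urlparse


def policy_expiry(signed_url: str) -> int:
    """Return the epoch-seconds expiry embedded in a CloudFront-style
    `Policy` query parameter (URL-safe base64-encoded JSON)."""
    raw = parse_qs(urlparse(signed_url).query)["Policy"][0]
    # CloudFront substitutes -, _, ~ for +, =, / in its base64 alphabet;
    # reverse that before decoding.
    raw = raw.replace("-", "+").replace("_", "=").replace("~", "/")
    policy = json.loads(base64.b64decode(raw))
    return policy["Statement"][0]["Condition"]["DateLessThan"]["AWS:EpochTime"]


def is_expired(signed_url: str) -> bool:
    """True if the URL's embedded policy deadline has already passed."""
    return policy_expiry(signed_url) < time.time()
```

If `is_expired` returns True before (or partway through) the larger shards finish, requesting a fresh link is the fix.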
dnns92 commented 1 month ago

Re-registering for a new URL solved the issue.
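For anyone scripting around this: since the failure surfaces as an `httpx.HTTPStatusError` with a 4xx status, the code can be mapped to an actionable hint instead of a bare traceback. A minimal stdlib-only sketch (the helper name and message are mine, not the CLI's):

```python
def explain_http_failure(status_code: int) -> str:
    """Map a failing HTTP status on the signed URL to an actionable hint.
    403/405 here typically mean the time-limited signed URL was rejected."""
    if status_code in (403, 405):
        return (
            f"HTTP {status_code}: the signed URL was rejected. Meta's links "
            "are time-limited; request a fresh URL and re-run "
            "`llama download --source meta --model-id Llama3.1-70B-Instruct`."
        )
    return f"HTTP {status_code}: unexpected download failure."
```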