Closed: vinilvadakkepurakkal closed this issue 1 week ago
Try downloading the model via Hugging Face instead: https://huggingface.co/meta-llama/Llama-2-13b-chat or https://huggingface.co/meta-llama/Llama-2-13b-chat-hf
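For the Hugging Face route, here is a minimal sketch using the `huggingface_hub` client. It assumes you have requested access to the gated `meta-llama` repo on its model page and created an access token; the token value shown is a placeholder, not a real credential:

```python
# pip install -U huggingface_hub
from huggingface_hub import snapshot_download


def fetch_llama2_chat(token: str) -> str:
    """Download the full Llama-2-13b-chat-hf snapshot to the local
    Hugging Face cache and return the path it was stored under.

    The repo is gated, so the token must belong to an account whose
    access request has been approved on the model page."""
    return snapshot_download(
        repo_id="meta-llama/Llama-2-13b-chat-hf",
        token=token,
    )


# Usage (needs approved access and a real token):
# local_dir = fetch_llama2_chat("hf_xxx")  # "hf_xxx" is a placeholder
# print(local_dir)
```

Unlike the signed Meta download URL, Hugging Face access does not expire after 24 hours, which sidesteps the 403 problem below.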
Alternatively, your signed URL might simply have expired. Can you try the following:
pip install llama-stack -U
llama model list
Outside of this, we encourage using the latest models from the Llama 3, 3.1, and 3.2 releases.
Please re-open the issue if this doesn't resolve it.
I also have this problem, even with the most current URL. Have you fixed it?
(llama2) C:\Users\vinilv>llama model download --source meta --model-id Llama-2-13b-chat
Please provide the signed URL for model Llama-2-13b-chat you received via email after visiting https://www.llama.com/llama-downloads/ (e.g., https://llama3-1.llamameta.net/*?Policy...): https://download.llamameta.net/*?Policy=eyJTdGF0ZW1lbnQiOlt7InVuaXF1ZV9oYXNoIjoid3F4OWk5cGJwejd2dzQyMXM5NGtrbjBzIiwiUmVzb3VyY2UiOiJodHRwczpcL1wvZG93bmxvYWQubGxhbWFtZXRhLm5ldFwvKiIsIkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTczMTI5NjgyNn19fV19&Signature=e%7Ejb4MWOSFJFti2ifc0c-ERN47IAOwtebku1Z%7EstnAxrRq2UUTFToCossmtbfAJQnMz7jeYuF5pbPeEJP1nFcjus46QrnnaOPI2HlwXfavrjSiF-bFdu38o0snbrRW7UpFK8UossjCeofUyXFQL43YymaPCdsehOZoDDWdg1ISG1nsnakSLtl2rOJt3FrWjEwi5XS6zS6c%7EeajqSSzPTueaobCDDfqf7-%7EenNCfoNi90Iwr18Vp%7ExzUxVSbSdhTqmHerQORwQ8BJMtWyN1MwSSy0ZENZu8H9s1YX7Vhhf8yOuc8oCFwP7QC0qbFvgMvAUF8NHaAh8-rLkZAXh%7EYhNg__&Key-Pair-Id=K15QRJLYKIFSLZ&Download-Request-ID=1180414533621479
Downloading checklist.chk ...
Already downloaded C:\Users\vinilv\.llama\checkpoints\Llama-2-13b-chat\checklist.chk, skipping...
Downloading tokenizer.model ...
Traceback (most recent call last):
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\Scripts\llama.exe\__main__.py", line 7, in <module>
    sys.exit(main())
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\lib\site-packages\llama_stack\cli\llama.py", line 44, in main
    parser.run(args)
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\lib\site-packages\llama_stack\cli\llama.py", line 38, in run
    args.func(args)
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\lib\site-packages\llama_stack\cli\download.py", line 177, in run_download_cmd
    _meta_download(model, meta_url, info)
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\lib\site-packages\llama_stack\cli\download.py", line 136, in _meta_download
    asyncio.run(downloader.download())
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\lib\asyncio\base_events.py", line 649, in run_until_complete
    return future.result()
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\lib\site-packages\llama_stack\cli\download.py", line 266, in download
    await self.get_file_info(client)
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\lib\site-packages\llama_stack\cli\download.py", line 255, in get_file_info
    response.raise_for_status()
  File "C:\Users\vinilv\AppData\Local\anaconda3\envs\llama2\lib\site-packages\httpx\_models.py", line 761, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '403 Forbidden' for url 'https://download.llamameta.net/llama-2-13b-chat/tokenizer.model?Policy=eyJTdGF0ZW1lbnQiOlt7InVuaXF1ZV9oYXNoIjoid3F4OWk5cGJwejd2dzQyMXM5NGtrbjBzIiwiUmVzb3VyY2UiOiJodHRwczpcL1wvZG93bmxvYWQubGxhbWFtZXRhLm5ldFwvKiIsIkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTczMTI5NjgyNn19fV19&Signature=e%7Ejb4MWOSFJFti2ifc0c-ERN47IAOwtebku1Z%7EstnAxrRq2UUTFToCossmtbfAJQnMz7jeYuF5pbPeEJP1nFcjus46QrnnaOPI2HlwXfavrjSiF-bFdu38o0snbrRW7UpFK8UossjCeofUyXFQL43YymaPCdsehOZoDDWdg1ISG1nsnakSLtl2rOJt3FrWjEwi5XS6zS6c%7EeajqSSzPTueaobCDDfqf7-%7EenNCfoNi90Iwr18Vp%7ExzUxVSbSdhTqmHerQORwQ8BJMtWyN1MwSSy0ZENZu8H9s1YX7Vhhf8yOuc8oCFwP7QC0qbFvgMvAUF8NHaAh8-rLkZAXh%7EYhNg__&Key-Pair-Id=K15QRJLYKIFSLZ&Download-Request-ID=1180414533621479'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/403

(llama2) C:\Users\vinilv>
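A 403 on this endpoint usually means the signed URL has expired. You can check this yourself: the `Policy` query parameter is base64-encoded JSON whose `DateLessThan` condition holds the expiry epoch. A stdlib-only sketch, assuming the URL uses the CloudFront-style base64 alphabet (`-`, `_`, `~` standing in for `+`, `=`, `/`); the `policy_expiry` helper name is my own:

```python
import base64
import json
import time
from urllib.parse import parse_qs, urlparse


def policy_expiry(signed_url: str) -> int:
    """Return the AWS:EpochTime expiry embedded in a signed download URL."""
    policy_b64 = parse_qs(urlparse(signed_url).query)["Policy"][0]
    # CloudFront-style URLs swap +, =, / for -, _, ~ to stay URL-safe; undo that.
    policy_json = base64.b64decode(policy_b64.translate(str.maketrans("-_~", "+=/")))
    policy = json.loads(policy_json)
    return policy["Statement"][0]["Condition"]["DateLessThan"]["AWS:EpochTime"]


# The Policy parameter from the failing URL above (Signature and the other
# query parameters are omitted here; only Policy matters for the expiry).
url = ("https://download.llamameta.net/*?Policy="
       "eyJTdGF0ZW1lbnQiOlt7InVuaXF1ZV9oYXNoIjoi"
       "d3F4OWk5cGJwejd2dzQyMXM5NGtrbjBzIiwiUmVzb3VyY2UiOiJodHRwczpcL1wvZG93bmxvYWQubGxh"
       "bWFtZXRhLm5ldFwvKiIsIkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6"
       "MTczMTI5NjgyNn19fV19"
       "&Key-Pair-Id=K15QRJLYKIFSLZ")

expiry = policy_expiry(url)
print(expiry, "expired" if time.time() > expiry else "still valid")
```

If the printed epoch is in the past, re-request a fresh link from https://www.llama.com/llama-downloads/ rather than retrying the old one; the links are only valid for a limited window after they are issued.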