Open sihot opened 3 weeks ago
Hi, could you please provide more information on the issue: are you seeing this when you run the download.sh file, are you using WSL or the Windows command line, and what are your hardware specifications?
Please note that the example scripts in this repo are for running inference on single-GPU (8B) and multi-GPU (70B) setups using CUDA; Windows is not currently supported.
You might also like to check out these examples for running Llama locally / without distributed inference via Hugging Face or Ollama: https://github.com/meta-llama/llama-recipes/tree/main/recipes/quickstart/Running_Llama2_Anywhere
Hello, I have input the URL, but it shows this issue.
Now I cannot download the models because of a 403 error.
Adding a proxy so your request comes from a different region will fix your problem.
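A minimal sketch of what that could look like, assuming a hypothetical proxy at `proxy.example.com:8080` (substitute one reachable from your network): exporting the standard proxy environment variables before re-running the script should make the `wget`/`curl` calls inside `download.sh` go through the proxy.

```shell
# Hypothetical proxy address -- replace with a real proxy you have access to.
export https_proxy="http://proxy.example.com:8080"
export http_proxy="http://proxy.example.com:8080"

# Then re-run the download script in the same shell session:
# bash download.sh
```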
> Now I cannot download the models because of a 403 error.
I ran `bash download.sh` in the Windows command line and got this error:
```
Processing fstab with mount -a failed.
Failed to mount C:\, see dmesg for more details.
Failed to mount D:\, see dmesg for more details.
Failed to mount G:\, see dmesg for more details.
<3>WSL (11) ERROR: CreateProcessEntryCommon:334: getpwuid(0) failed 2
<3>WSL (11) ERROR: CreateProcessEntryCommon:505: execvpe /bin/bash failed 2
<3>WSL (11) ERROR: CreateProcessEntryCommon:508: Create process not expected to return
```

![image](https://github.com/meta-llama/llama3/assets/22848341/f0726be6-0406-41e4-8982-73ec166ac210)