meta-llama / llama3

The official Meta Llama 3 GitHub site

Help Needed: Installing Llama 2 70B, Llama 3 70B & LLaMA 2 30B (FP16) on Windows Locally #235

Open kirushake opened 4 months ago

kirushake commented 4 months ago

I'm trying to install Llama 2 13B Chat HF, Llama 3 8B, and Llama 2 13B (FP16) locally on my Windows gaming rig, which has dual RTX 4090 GPUs. I aim to access and run these models from the terminal, offline. I've hit a few roadblocks and could really use some help.

Here are the specifics of my setup:

- Windows 10
- Dual MSI RTX 4090 Suprim Liquid X 24GB GPUs
- Intel Core i9 14900K 14th Gen Desktop Processor
- 64GB DDR5 RAM
- 2x Samsung 990 Pro 2TB Gen4 NVMe SSD

Has anyone successfully installed and run these models on a similar setup? If so, could you provide detailed steps or point me to relevant resources? Any tips on optimizing the installation for dual GPUs would be greatly appreciated as well.

Thanks in advance for your assistance!

albertodepaola commented 4 months ago

Hi @kirushake, what issues are you seeing? The code in this repository is written for Linux, but with some modifications it could work natively on Windows. You can also use WSL on Windows to run it. Thanks for asking, and feel free to provide additional information.
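For reference, a rough sketch of the WSL route, following the setup steps in the meta-llama/llama3 README (model directory names are examples; adjust to wherever you downloaded the weights):

```shell
# From an elevated PowerShell on Windows: install WSL with the default Ubuntu distro
wsl --install

# Inside the WSL shell: clone the repo and install its dependencies
git clone https://github.com/meta-llama/llama3.git
cd llama3
pip install -e .

# After downloading weights (via the signed URL from llama.meta.com),
# run the chat example; --nproc_per_node must match the model's
# parallelism (1 for 8B, 8 for 70B per the README)
torchrun --nproc_per_node 1 example_chat_completion.py \
    --ckpt_dir Meta-Llama-3-8B-Instruct/ \
    --tokenizer_path Meta-Llama-3-8B-Instruct/tokenizer.model \
    --max_seq_len 512 --max_batch_size 6
```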

kirushake commented 4 months ago

> Hi @kirushake, what issues are you seeing? The code in this repository is written for Linux, but with some modifications it could work natively on Windows. You can also use WSL on Windows to run it. Thanks for asking, and feel free to provide additional information.

Hi @albertodepaola, I managed to install timdettmers/guanaco-7b on Windows, but my system ran slowly since I could not use the power of both GPUs; only one GPU was handling the load. The main goal is to install and run Llama 2 70B, Llama 3 70B, LLaMA 2 30B (FP16), or smaller sizes that would work flawlessly on my machine. I recently installed WSL; could you please walk me through the process?
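A side note on feasibility before going further: the weights alone for the larger models may not fit in 2x24 GB of VRAM at FP16. A back-of-the-envelope check (2 bytes/param for FP16, roughly 0.5 bytes/param for 4-bit quantization, ignoring KV cache and activations):

```shell
# Weights-only footprint in GB: params-in-billions x bytes-per-param
echo "Llama 3 70B @ FP16:   $((70 * 2)) GB"   # 140 GB -> far exceeds 48 GB total VRAM
echo "Llama 3 8B  @ FP16:   $((8 * 2)) GB"    # 16 GB  -> fits on a single 24 GB GPU
echo "Llama 3 70B @ 4-bit: ~$((70 / 2)) GB"   # ~35 GB -> could span both 24 GB GPUs
```

So on this rig, the 70B models realistically require quantization (or CPU offload) to run at all, while the 8B model should fit comfortably on one card.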

ajayspatil7 commented 2 months ago

Hey, can you try using it with Ollama?
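For context, Ollama serves quantized builds of these models locally and runs on Windows as well as Linux/WSL; a minimal sketch (model tags are Ollama's, so check ollama.com/library for the current ones):

```shell
# Install on Linux/WSL (Windows has its own installer from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a quantized Llama 3 build from the terminal
ollama run llama3        # 8B default tag
ollama run llama3:70b    # 70B quantized; needs substantial RAM/VRAM
```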