lxe / simple-llm-finetuner

Simple UI for LLM Model Finetuning

(WSL2) - No GPU / Cuda detected.... #13

Closed · Gitterman69 closed this issue 1 year ago

Gitterman69 commented 1 year ago
===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
CUDA SETUP: WARNING! libcuda.so not found! Do you have a CUDA driver installed? If you are on a cluster, make sure you are on a CUDA machine!
CUDA SETUP: CUDA runtime path found: /home/user/anaconda3/envs/llama/lib/libcudart.so
/home/user/anaconda3/envs/llama/lib/python3.10/site-packages/bitsandbytes/cuda_setup/main.py:136: UserWarning: WARNING: No GPU detected! Check your CUDA paths. Proceeding to load CPU-only library...
  warn(msg)
CUDA SETUP: Loading binary /home/user/anaconda3/envs/llama/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cpu.so...
/home/user/anaconda3/envs/llama/lib/python3.10/site-packages/bitsandbytes/cextension.py:31: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
calz1 commented 1 year ago

Do you have CUDA installed?

Gitterman69 commented 1 year ago

Yes, I do. CUDA itself works, so it might be a WSL2 problem?


calz1 commented 1 year ago

As a courtesy to the maintainer, you should first run a hello-world example showing that you can use your GPU from within WSL2.
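
For example, a minimal check from a Python shell (a sketch, assuming PyTorch is installed in the same conda environment) would be:

import torch

# True only if the CUDA driver and runtime are visible from inside WSL2
print(torch.cuda.is_available())
if torch.cuda.is_available():
    # name of the detected NVIDIA card
    print(torch.cuda.get_device_name(0))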

lxe commented 1 year ago

For WSL, you'll need to install CUDA yourself using these steps:

https://developer.nvidia.com/cuda-downloads?target_os=Linux&target_arch=x86_64&Distribution=WSL-Ubuntu&target_version=2.0&target_type=deb_local

Then export the library path before running code:

export LD_LIBRARY_PATH=/usr/lib/wsl/lib
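
To confirm the export took effect, a quick check along these lines (a sketch; it simply mirrors the "libcuda.so not found" warning from the original report) can be run from the same shell:

import ctypes, os

# LD_LIBRARY_PATH should now include /usr/lib/wsl/lib
print(os.environ.get("LD_LIBRARY_PATH", ""))

# dlopen honors LD_LIBRARY_PATH; this raises OSError if the driver library is still missing
ctypes.CDLL("libcuda.so")
print("libcuda.so loaded; bitsandbytes should now pick up its GPU build")
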
lxe commented 1 year ago

Added this to the README.

Gitterman69 commented 1 year ago

Works now!!!! Thanks
