NickLucche / stable-diffusion-nvidia-docker

GPU-ready Dockerfile to run the Stability AI Stable Diffusion v2 model with a simple web interface. Includes multi-GPU support.
MIT License

Tesla GPUs need vGPU license to pass through to Docker #18

Open angelmankel opened 1 year ago

angelmankel commented 1 year ago

Hi, I recently got two Tesla P40 GPUs which I was hoping to use with this. From my understanding, the Tesla P40s need a vGPU license in order to pass through via WSL. I am also using my Tesla cards locally for other applications, and I basically use this machine as a graphics/machine-learning server running Windows 11, so I don't really want to install Linux on the PC itself.

Do you see any easy way to run this without Docker? Hopefully I'm wrong about the licensing. I tried to export the container and run the scripts locally, but I honestly don't know what I'm doing with that and didn't make much progress.
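For reference, running the app outside Docker would roughly mean reproducing the container's steps on the host. This is only a sketch: the dependency file and entrypoint names below are assumptions, and the repo's Dockerfile (its `COPY`/`RUN`/`CMD` lines) is the authoritative source for the real ones.

```shell
# Hedged sketch: reproducing the container's steps directly on the host.
# "requirements.txt" and "app.py" are assumed names; check the Dockerfile
# for the actual dependency file and entrypoint (CMD/ENTRYPOINT).
REPO=https://github.com/NickLucche/stable-diffusion-nvidia-docker
git clone "$REPO"
cd stable-diffusion-nvidia-docker
pip install -r requirements.txt   # assumed dependency file
python app.py                     # assumed entrypoint; see Dockerfile CMD
```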

NickLucche commented 1 year ago

Hmm, I don't know of any particular license requirement between the P40s and WSL, but I am not a Windows user; maybe someone else can clarify this. I think Windows 11 should have more integrated support for WSL. Can you run commands like nvidia-smi by opening up a console or PowerShell?
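If it helps, a quick probe from a WSL shell can tell whether the Windows driver is exposing the GPU at all. A small sketch (on a working WSL2 setup the driver normally surfaces `nvidia-smi` via `/usr/lib/wsl/lib`):

```shell
# Probe whether the Windows NVIDIA driver is exposing the GPU inside WSL2.
# If nvidia-smi is missing from PATH, driver passthrough is not active.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=name,driver_version --format=csv,noheader
else
  echo "nvidia-smi not on PATH: driver passthrough is not active in this WSL distro"
fi
```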

angelmankel commented 1 year ago

According to the documentation, "Tesla GPUs aren't supported yet", so maybe it's not exactly a licensing issue after all. But from what I read, the Tesla P40s can't be put into WDDM mode without a license (I tried), which is required for WSL, unfortunately. https://docs.nvidia.com/cuda/wsl-user-guide/index.html [screenshot]

When I run nvidia-smi in PowerShell, this is what I get: [screenshot]

If I run nvidia-smi in WSL, this is what I get: [screenshot]
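For anyone else hitting this: the WDDM/TCC driver model that the licensing gates can be inspected and (where permitted) switched with nvidia-smi's driver-model flag, run from an elevated shell on the Windows side. A sketch (written as POSIX shell for the guard, but the nvidia-smi lines are identical in PowerShell; on Tesla cards like the P40 the switch is reportedly refused, which matches the above):

```shell
if command -v nvidia-smi >/dev/null 2>&1; then
  # Show the current driver model (WDDM vs TCC) per GPU; Windows-only field.
  nvidia-smi --query-gpu=index,name,driver_model.current --format=csv
  # Attempt to put GPU 0 into WDDM mode (0 = WDDM, 1 = TCC); needs admin
  # rights, and on Tesla cards this is where the licensing gate bites.
  nvidia-smi -i 0 -dm 0
else
  echo "nvidia-smi not available here"
fi
```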

NickLucche commented 1 year ago

Any chance you can run `docker run ...` from PowerShell? I am not aware of the changes introduced in Win11, sorry. I'll mark this "help wanted".
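In case it's useful, the shape of the command from PowerShell would be something like the sketch below. The image name and published port are assumptions on my part; the repo's README has the exact invocation.

```shell
# Hedged sketch: launching the container with GPU access from PowerShell.
# Image name/tag and port 7860 are assumptions; check the README.
if command -v docker >/dev/null 2>&1; then
  docker run --gpus all -p 7860:7860 nicklucche/stable-diffusion
else
  echo "docker not found on PATH"
fi
```

The `--gpus all` flag requires the NVIDIA runtime to be wired into Docker (Docker Desktop handles this on WSL2 backends when the driver passthrough works).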

huotarih commented 1 year ago
[Screenshot 2022-11-16 at 22:55:53]

Tesla T4 works fine.

Anonym0us33 commented 1 year ago

> According to the documentation, "Tesla GPUs aren't supported yet", so maybe it's not exactly a licensing issue after all. But from what I read, the Tesla P40s can't be put into WDDM mode without a license (I tried), which is required for WSL, unfortunately. https://docs.nvidia.com/cuda/wsl-user-guide/index.html
>
> When I run nvidia-smi in PowerShell, this is what I get: [screenshot]
>
> If I run nvidia-smi in WSL, this is what I get: [screenshot]

Similar problem, different error, with a Tesla K80 and an RTX 3070. [screenshot]

vleeuwenmenno commented 1 year ago

Wasn't this restriction removed relatively recently in the newer drivers? I remember being able to run 2 VMs on my RTX 2080.

helyxzion50943 commented 11 months ago

I have the very same setup and would like to know as much as possible. I wish there was a way to get in contact.