NVIDIA / Stable-Diffusion-WebUI-TensorRT

TensorRT Extension for Stable Diffusion Web UI
MIT License

TensorRT doesn't install on a fresh A1111 install on Ubuntu, stuck at install after copy/pasting the git link #329

Open · tomakorea opened this issue 6 months ago

tomakorea commented 6 months ago

I have an RTX 3090, and I'm using Ubuntu over SSH from another computer. On a fresh install of A1111, I went to Extensions, added the git link, and clicked Install, but it stayed at "processing" for over 10 minutes, even though my drive is a high-performance NVMe drive.

No debug info shows in the terminal, but the extensions folder now contains a Stable-Diffusion-WebUI-TensorRT folder. After more than 10 minutes I tried to restart A1111, but it refuses to load while the TensorRT folder is present.

Setup:
RTX 3090
A1111 version 1.9.3 (also tested on SD Forge with the same issue)
Conda with a Python 3.10.12 environment
32 GB RAM, i7 8800K
Latest Ubuntu in command-line/terminal mode, controlled via SSH from another computer

FurkanGozukara commented 5 months ago

Same on Ubuntu, it gets stuck forever.

Ornatrix commented 5 months ago

same

sup2069 commented 5 months ago

I am running into the same issue but on Debian Sid:

RTX 3090
A1111 version 1.9.0-RC-7-gradadb4e3
python: 3.10.10
torch: 2.3.0+cu121
xformers: 0.0.26.post1
gradio: 3.41.2
64 GB RAM
12th Gen Intel(R) Core(TM) i9-12900K

FurkanGozukara commented 5 months ago

I solved this problem and published installers for RunPod and Massed Compute (Ubuntu).

Runpod installer : https://www.patreon.com/posts/1-click-for-sd-86438018

Massed compute installer (Ubuntu Linux) : https://www.patreon.com/posts/massed-compute-105735932

It is a shame that a 2-trillion-dollar company doesn't care.

Also, my installers install a fork that fully supports LoRA with TensorRT.

tomakorea commented 5 months ago

Too bad your stuff is behind a paywall... it's not very open-source friendly, IMHO.

CedricHwong commented 5 months ago

Manually install, in your Python virtual environment, all of the pip packages mentioned in the TRT extension's install.py, then comment out the part of install.py that pip-uninstalls cudnn before starting webui.sh. This method works for me.
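
To make this concrete, here is a rough sketch of that workaround on Linux. The repository URL matches this extension, but the package set and index URL are illustrative assumptions; check the exact list in your copy of install.py, and Conda users should activate their Conda environment instead of the default venv.

```bash
# Sketch of the manual workaround described above (not an official procedure).
cd stable-diffusion-webui/extensions

# Clone the extension manually instead of using the Extensions tab
git clone https://github.com/NVIDIA/Stable-Diffusion-WebUI-TensorRT.git

# Activate the same virtual environment the WebUI uses
# (Conda users: activate your Conda env instead)
source ../venv/bin/activate

# Install the dependencies referenced in the extension's install.py
# (example package set -- verify against your copy of install.py)
pip install --extra-index-url https://pypi.nvidia.com tensorrt polygraphy onnx-graphsurgeon

# Finally, comment out the cudnn-uninstall block in
# extensions/Stable-Diffusion-WebUI-TensorRT/install.py, then start ./webui.sh
```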

tomakorea commented 5 months ago

> Manually install, in your Python virtual environment, all of the pip packages mentioned in the TRT extension's install.py, then comment out the part of install.py that pip-uninstalls cudnn before starting webui.sh. This method works for me.

So do you mean I should install this extension manually by doing a git clone of this repository into the extensions folder, then manually install the dependencies, then comment out the pip uninstall of cudnn in the install.py located in the newly created folder inside the extensions folder? I tried that, but the WebUI never started; it stops after basic initialization and doesn't even start the Gradio server interface.

sup2069 commented 5 months ago

> Manually install, in your Python virtual environment, all of the pip packages mentioned in the TRT extension's install.py, then comment out the part of install.py that pip-uninstalls cudnn before starting webui.sh. This method works for me.

> So do you mean I should install this extension manually by doing a git clone of this repository into the extensions folder, then manually install the dependencies, then comment out the pip uninstall of cudnn in the install.py located in the newly created folder inside the extensions folder? I tried that, but the WebUI never started; it stops after basic initialization and doesn't even start the Gradio server interface.

I went the extension route, then after a few minutes killed webui.sh, followed the instructions, and reran webui.sh. Or you can run the git clone command in the extensions folder to save time. Got it working!

CatBut commented 5 months ago

> Manually install, in your Python virtual environment, all of the pip packages mentioned in the TRT extension's install.py, then comment out the part of install.py that pip-uninstalls cudnn before starting webui.sh. This method works for me.

> So do you mean I should install this extension manually by doing a git clone of this repository into the extensions folder, then manually install the dependencies, then comment out the pip uninstall of cudnn in the install.py located in the newly created folder inside the extensions folder? I tried that, but the WebUI never started; it stops after basic initialization and doesn't even start the Gradio server interface.

> I went the extension route, then after a few minutes killed webui.sh, followed the instructions, and reran webui.sh. Or you can run the git clone command in the extensions folder to save time. Got it working!

What @sup2069 is saying is to go into the extensions folder and run the git clone on the repo, then comment out the problem code. I think the problem comes from Nvidia making this extension for Windows users: when you run webui.sh, it tries running some .exe commands that work natively on Windows, but when using WSL or Linux they just cause problems. I'm by no means a pro, and this might not actually be the reason it's not working without some modifications, but this is my leading theory.

My fix

When you open install.py with a text editor like VS Code or Notepad, you'll find a line (line 34) that says "if launch.is_installed("nvidia-cudnn-cu11"):". This is the part @sup2069 wants you to comment out (place a # in front of it to comment it out).
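
For reference, the disabled block might look like this afterwards. Only the `if launch.is_installed(...)` line is quoted from the comment above; the uninstall call underneath it is an assumption and will differ between versions of install.py.

```python
# Illustrative excerpt of extensions/Stable-Diffusion-WebUI-TensorRT/install.py
# after the workaround: the cudnn-uninstall check is commented out so the
# extension no longer removes cudnn from the environment on startup.

# if launch.is_installed("nvidia-cudnn-cu11"):
#     launch.run_pip("uninstall -y nvidia-cudnn-cu11", "removing nvidia-cudnn-cu11")
```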

After I got it fully loaded and brought up the web interface, I ran into a new error: when I tried to make some TRTs, it started saying that some of the packages were not installed and would soft-fail. I tried to fix this by going back into install.py and forcing it to install some of the packages. As of now, I cannot make any TRTs. I tried several fixes here on GitHub, but after several hours I have returned empty-handed.

I'm running this on Ubuntu in WSL, using Forge (RIP).

Saniel0 commented 5 months ago

I found this post, which helped me fix the issue: https://github.com/NVIDIA/Stable-Diffusion-WebUI-TensorRT/issues/204#issuecomment-1879870974