erew123 / alltalk_tts

AllTalk is based on the Coqui TTS engine, similar to the Coqui_tts extension for Text generation webUI, but it supports a variety of advanced features, such as a settings page, low VRAM support, DeepSpeed, a narrator, model finetuning, custom models, and wav file maintenance. It can also be used with 3rd party software via JSON calls.

Issue with installing nvidia-cudnn-cu116 #182

Closed: allenhs closed this issue 4 months ago

allenhs commented 4 months ago

Hello, I have an updated version of oobabooga and have encountered the following error:

[screenshot: the installation error, showing the failure installing nvidia-cudnn-cu116]

My system GPU is an Nvidia 4090 and I am running Windows 11.

erew123 commented 4 months ago

Hi @allenhs

Based on the above, I can see it's trying to install cuDNN for CUDA 11.6 (cu116).

If you are using this within text-gen-webui, it should be using cu118 or cu121 (it asks at the time of installation). Did you start the text-gen-webui Python environment before installing?


Full details are here https://github.com/erew123/alltalk_tts/tree/main?tab=readme-ov-file#installation-and-setup-issues
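
For reference, the usual order is roughly as follows (a sketch that assumes AllTalk is installed as an extension under text-generation-webui and that its text-gen requirements file is installed manually; the link above has the exact steps):

    rem start the text-gen-webui Python environment from its folder
    cmd_windows.bat
    cd extensions\alltalk_tts
    pip install -r system\requirements\requirements_textgen.txt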

If you DID start the Python environment with cmd_windows.bat and you are still experiencing the issue, could you please generate a diagnostics.log file so I can try to understand what could be wrong in the environment.

https://github.com/erew123/alltalk_tts/tree/main?tab=readme-ov-file#-help-with-problems
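
Roughly, generating that file looks like this (same folder layout as the sketch above; the diagnostics script name is taken from the linked page and may differ in your install):

    rem start the text-gen-webui Python environment first
    cmd_windows.bat
    cd extensions\alltalk_tts
    python diagnostics.py
    rem this writes a diagnostics.log file into the alltalk_tts folder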

Thanks

allenhs commented 4 months ago

Thanks for the reply,

Yeah, I manually start the environment that Pinokio creates for oobabooga; I install AllTalk for oobabooga with this method all the time.

[screenshot: the Pinokio-created environment activated in a terminal]

Oh, and I also ran the environment with cmd_windows.bat; it activates the same environment and results in the same issue.

[screenshot: the same error after activating the environment with cmd_windows.bat]

Here are the results when trying to generate the diagnostic file after starting the environment with cmd_windows.bat:

[screenshot: output from the attempt to generate the diagnostics file]

erew123 commented 4 months ago

Hi @allenhs

I don't know why Pinokio is (presumably) set to install PyTorch CUDA 11.6, but I don't support PyTorch CUDA 11.6. I only support the base requirements that text-gen-webui installs, which is either PyTorch CUDA 11.8 or PyTorch CUDA 12.1.

You can confirm your PyTorch version in the following way:

[screenshot: checking the installed PyTorch version]

The above example shows PyTorch 2.2.1 with cu121, which is CUDA 12.1. Below is an explainer:

[screenshot: breakdown of the PyTorch version string]
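
If the screenshots are hard to read, you can print the same information from inside the environment started by cmd_windows.bat:

    python -c "import torch; print(torch.__version__)"
    rem prints something like 2.2.1+cu121, i.e. PyTorch 2.2.1 built against CUDA 12.1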

Not knowing Pinokio, I can't say if the PyTorch version is something you (the end user) set up, or if it's something Pinokio does as part of its installation routine, either as a system-wide setting for Pinokio OR specific to each application it installs.

I can suggest three possible routes (in no particular order):


1) Upgrade your PyTorch within the Pinokio Python environment to CUDA 11.8 or 12.1: https://pytorch.org/get-started/locally/

I don't personally use Pinokio, so I don't know if this would break anything else. Again, I'm not sure what installed/set up the current PyTorch w/ CUDA 11.6 environment.

You would run cmd_windows.bat and then run the correct PyTorch command from the PyTorch website.
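
As an illustration only (take the exact command for your setup from the PyTorch site), the CUDA 12.1 build is typically installed along these lines:

    rem run inside the environment started by cmd_windows.bat
    pip install --upgrade torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121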


2) Edit the \alltalk_tts\system\requirements\requirements_textgen.txt file, removing:

nvidia-cublas-cu11>=11.11.3.6
nvidia-cudnn-cu11>=9.0.0.312

from the file. The net effect of doing this is that you wouldn't be able to use finetuning without performing a manual installation of the Nvidia CUDA Toolkit (as detailed in the finetuning documentation).
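
Commenting the two lines out works just as well as deleting them, since pip ignores lines that start with # in a requirements file:

    # nvidia-cublas-cu11>=11.11.3.6
    # nvidia-cudnn-cu11>=9.0.0.312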


3) Speak to the person who developed/set up the Pinokio text-gen-webui package and ask them why it appears to be using PyTorch CUDA 11.6.


As I say, my best guess is that you somehow have PyTorch with CUDA 11.6 on your system, due to the fact that the pip installer is trying to install nvidia-cudnn-cu11 for PyTorch with CUDA 11.6 (cu116, as below).

[screenshot: pip attempting to install the cu116 packages]

If you had PyTorch with CUDA 11.8 (cu118) or CUDA 12.1 (cu121) it would be attempting to install those versions.

Just as an addition, not using PyTorch with CUDA 12.1 will give you quite a performance drop, especially if you are using an RTX-based GPU.

allenhs commented 4 months ago

Hi @erew123,

It looks like oobabooga is using 2.2.1+cu121. Maybe Pinokio is doing some other weird thing under the hood.

[screenshot: the text-gen-webui environment reporting PyTorch 2.2.1+cu121]

Given that it was reporting a good version, I decided to go with option 3 and it seems to have worked out great. Thank you for helping, especially given that the root of the problem is outside of AllTalk.