IAHispano / Applio

VITS-based Voice Conversion focused on simplicity, quality and performance.
https://applio.org
MIT License

[BUG] Cuda out of memory on Linux during inference #406

Closed shirounanashi closed 1 month ago

shirounanashi commented 3 months ago

Before You Report a Bug My setup is a GTX 1660 Super and a Ryzen 5 5600G with 16 GB of RAM.

Bug Description I use Applio on both Windows 11 and Linux (Arch), but on Linux it throws a CUDA out of memory error, on both the latest release and the latest commit.

Steps to Reproduce

  1. Run inference with the default settings

Expected Behavior Inference completes without a CUDA out of memory error.

Assets: screenshot attached


Additional Context I'm not using IAHispano's fairseq.

aitronssesin commented 3 months ago

It could be due to a lot of things, but I think it's one of these:

  1. Your GPU driver is outdated: https://docs.nvidia.com/deeplearning/cudnn/latest/reference/support-matrix.html
  2. Your GPU only has 6 GB of VRAM, and it's also a GTX card, which some people have had issues with (see the quick check sketched below)
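
Not part of the original exchange, but a quick way to check both points from the Python environment Applio runs in (assuming PyTorch is already installed there) could look like this:

import torch

# Report what the installed torch build can actually see.
print("torch version:       ", torch.__version__)
print("CUDA available:      ", torch.cuda.is_available())
print("torch built for CUDA:", torch.version.cuda)
print("cuDNN available:     ", torch.backends.cudnn.is_available())
print("cuDNN version:       ", torch.backends.cudnn.version())

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print("GPU:             ", props.name)
    print("Total VRAM (GiB):", round(props.total_memory / 1024**3, 2))
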
shirounanashi commented 3 months ago

Thank you, it really was a driver problem, but not because it was outdated: CUDA and cuDNN weren't installed at all. I installed them and that solved the problem.

shirounanashi commented 2 months ago

Testing further, I discovered that it is a problem with Applio on Linux. The problem does not happen in RVC WebUI, so it has nothing to do with the driver, as I thought it did when closing the issue.

aitronssesin commented 2 months ago

Applio uses the same code as RVC WebUI for GPU detection and utilization. We only changed the Torch version, hence I'm sharing this link https://docs.nvidia.com/deeplearning/cudnn/latest/reference/support-matrix.html so you can verify compatibility with our current setup.

shirounanashi commented 2 months ago

@aitronssesin In theory, my GPU should run smoothly. But even with the latest drivers and cuDNN on a clean Arch installation, it still gives me this CUDA out of memory problem, which doesn't happen with the RVC WebUI. Honestly, I don't know why this happens, since it doesn't happen on Windows on the same PC.
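
One way to narrow this down (a generic sketch, not something from this thread) would be to log peak VRAM around the inference call on both OSes and compare the numbers; run_inference() below is a hypothetical stand-in for whatever actually performs the conversion:

import torch

# Reset the peak counter, run one conversion, then read back the peak.
torch.cuda.reset_peak_memory_stats()

run_inference()  # hypothetical stand-in for the actual Applio/RVC inference call

peak_gib = torch.cuda.max_memory_allocated() / 1024**3
print(f"Peak allocated VRAM: {peak_gib:.2f} GiB")
print(torch.cuda.memory_summary(abbreviated=True))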

aitronssesin commented 2 months ago

It could be an issue with Arch, because on my Ubuntu server it works without any issues.

shirounanashi commented 2 months ago

It may be, but then it doesn't make sense that the RVC WebUI works without problems.

aitronssesin commented 2 months ago

Yes, maybe because of the torch version.

shirounanashi commented 2 months ago

I tried updating torch, torchaudio and torchvision, but the CUDA out of memory problem still occurred.

aitronssesin commented 2 months ago

Sorry, I didn't explain myself well. I was saying that the newer torch version we are using is probably broken on Arch, but try this:

pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url https://download.pytorch.org/whl/cu121
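
As a quick follow-up check (a sketch, not part of the original reply), the pinned build can be verified from Python; the expected CUDA tag for the cu121 wheels is 12.1:

import torch
import torchaudio
import torchvision

# Confirm the pinned versions are installed and the GPU is visible.
print(torch.__version__, torchvision.__version__, torchaudio.__version__)
print("built for CUDA:", torch.version.cuda)   # "12.1" for the cu121 wheels
print("GPU visible:   ", torch.cuda.is_available())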

shirounanashi commented 2 months ago

I didn't explain myself well either: I had updated to the version I was using with the RVC WebUI. But I tested the version you sent and the problem still exists. I also noticed that Applio doesn't seem to release the VRAM until I close its terminal window.
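
On the VRAM not being released: a generic PyTorch pattern (a sketch, not Applio's actual code; the object names in the usage note are hypothetical) is to drop references to the model and outputs and then empty the allocator cache:

import gc
import torch

def release_vram():
    # Only memory no longer referenced by live tensors can be returned,
    # so drop model/output references before calling this.
    gc.collect()
    torch.cuda.empty_cache()
    torch.cuda.reset_peak_memory_stats()

# Usage sketch (hypothetical names, not Applio's actual objects):
# del model, converted_audio
# release_vram()
# print(torch.cuda.memory_allocated() / 1024**3, "GiB still allocated")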