wladradchenko / wunjo.wladradchenko.ru

Wunjo CE: Face Swap, Lip Sync, Control Remove Objects & Text & Background, Restyling, Audio Separator, Clone Voice, Video Generation. Open Source, Local & Free.
https://wunjo.online
MIT License

wunjo ai v2 zluda #68

Closed brcisna closed 3 weeks ago

brcisna commented 3 weeks ago

Debian 13

I'm trying to get Wunjo v2 to work with the ZLUDA backend.

I have symlinked the ZLUDA directory into the Wunjo directory, copied the three ZLUDA files into the torch/lib dir of the venv, and used the following command, as suggested on the ZLUDA GitHub (for Linux), to start `briefcase dev`:

LD_LIBRARY_PATH=":$LD_LIBRARY_PATH"
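For reference, the general shape of that invocation is below (a sketch only: `ZLUDA_DIR` is a placeholder for wherever ZLUDA was built or symlinked, and the final `briefcase` line is commented out because it has to be run from the Wunjo project root):

```shell
# Placeholder path; point this at your ZLUDA build or symlink.
ZLUDA_DIR="$HOME/zluda"

# Prepend ZLUDA's CUDA shim libraries so the dynamic linker finds them first.
export LD_LIBRARY_PATH="$ZLUDA_DIR${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "first search entry: ${LD_LIBRARY_PATH%%:*}"

# Then, from the wunjo project root:
# briefcase dev
```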

What I don't understand is that when I go into Settings in Wunjo, I do not have an option for GPU usage. I saw the snippet about launching google-chrome with a command to enable the GPU, but that always just says "site not found". Can anyone give better details on this particular command? I don't understand how the command that was shown can launch an instance of Wunjo.

TIA

wladradchenko commented 3 weeks ago

Hi @brcisna. Can you show the output of some diagnostic code? Activate the python3.10 venv, then start `python` and run:

import torch
import sys
import os
from subprocess import call
print('_____Python, Pytorch, Cuda info____')
print('__Python VERSION:', sys.version)
print('__pyTorch VERSION:', torch.__version__)
print('__CUDA RUNTIME API VERSION')
#os.system('nvcc --version')
print('__CUDNN VERSION:', torch.backends.cudnn.version())
print('_____nvidia-smi GPU details____')
call(["nvidia-smi", "--format=csv", "--query-gpu=index,name,driver_version,memory.total,memory.used,memory.free"])
print('_____Device assignments____')
print('Number CUDA Devices:', torch.cuda.device_count())
print('Current cuda device:', torch.cuda.current_device(), '**May not correspond to nvidia-smi ID above; check visibility parameter')
print('Device name:', torch.cuda.get_device_name(torch.cuda.current_device()))

Anyway, you can also add lines to app.py to check whether CUDA is available when running through your ZLUDA.
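For example, a minimal check like the following (the `importlib` guard is my addition so the snippet also runs where torch is not installed; ZLUDA presents itself to torch as a CUDA device):

```python
import importlib.util

def cuda_status():
    """Report whether torch sees a CUDA device, without assuming torch is installed."""
    if importlib.util.find_spec("torch") is None:
        return "torch not installed"
    import torch
    if torch.cuda.is_available():
        return "CUDA available: " + torch.cuda.get_device_name(0)
    return "CUDA NOT available"

print(cuda_status())
```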

Additionally, you can write in app.py:

import os
os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID"
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

Also, you need to change line 67 from os.environ['DEBUG'] = 'False' to os.environ['DEBUG'] = 'True' to see errors in the console, or you can open the log folder and find the log file with the error.

One more thing: there is no need to open new topics separately from this question when they are interconnected; it only makes it more difficult for users to navigate.

brcisna commented 3 weeks ago

@wladradchenko

Success! After adding those extra lines I could see that no devices were being found. There was a lot of rooting around trying to find out why. Purely by accident it hit me: I needed to install PyTorch with ROCm!

As an aside, ComfyUI was showing no NVIDIA device detected as well. So I found a link to the right command for PyTorch with ROCm.

In wunjo I ran `pip uninstall torch torchaudio torchvision`, then:

pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/rocm6.0
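A quick way to confirm which backend the installed wheel actually targets is below (a sketch: ROCm builds of torch expose `torch.version.hip`, while CUDA builds expose `torch.version.cuda`; the guard is only so the snippet runs anywhere):

```python
import importlib.util

def torch_backend():
    """Report which GPU backend the installed torch build targets."""
    if importlib.util.find_spec("torch") is None:
        return "torch not installed"
    import torch
    if getattr(torch.version, "hip", None):
        return "ROCm " + torch.version.hip
    if torch.version.cuda:
        return "CUDA " + torch.version.cuda
    return "CPU-only build"

print(torch_backend())
```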

Wunjo now detected a GPU device!! But I got this error: `ModuleNotFoundError: No module named 'torchvision.transforms.functional_tensor'`

So I did this: open `.../venv/lib/python3.10/site-packages/basicsr/data/degradations.py` and on line 8, simply change:

`from torchvision.transforms.functional_tensor import rgb_to_grayscale`

to:

`from torchvision.transforms.functional import rgb_to_grayscale`
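An alternative sketch of the same fix that tolerates both old and new torchvision layouts (torchvision removed `transforms.functional_tensor`, so newer builds only provide the function under `transforms.functional`):

```python
import importlib

def import_rgb_to_grayscale():
    """Resolve rgb_to_grayscale from whichever torchvision module provides it."""
    for mod_name in ("torchvision.transforms.functional_tensor",
                     "torchvision.transforms.functional"):
        try:
            return getattr(importlib.import_module(mod_name), "rgb_to_grayscale")
        except (ImportError, AttributeError):
            continue
    raise ImportError("rgb_to_grayscale not found; is torchvision installed?")
```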

[screenshot: wunjo_gpu]

I think I may be able to get ZLUDA working now. Will report back.

Also, you should make a note for AMD GPU users to do the same, i.e. remove torch, torchvision, and torchaudio, then pip install torch with ROCm. The edit I had to do is, I think, fixed in later versions of torch-rocm; I remember having to do this same edit a year ago when I first started on this journey. Thanks for the help.

wladradchenko commented 3 weeks ago

@brcisna Nice to hear that it worked for you. Good job.

brcisna commented 3 weeks ago

Adding on: I got an error that xformers was not installed and a couple of features would not be available, so I did the following:

pip3 install -U xformers --index-url https://download.pytorch.org/whl/rocm6.1

Now there are no errors at all in the console.