chaiNNer-org / chaiNNer

A node-based image processing GUI aimed at making chaining image processing tasks easy and customizable. Born as an AI upscaling application, chaiNNer has grown into an extremely flexible and powerful programmatic image processing application.
https://chaiNNer.app
GNU General Public License v3.0

Add support for Pytorch ROCm #1399

Open RobotRoss opened 1 year ago

RobotRoss commented 1 year ago

Motivation
Some of the models I have will not convert to NCNN due to unsupported opsets. I have an AMD GPU, so running PyTorch on the CPU is infuriatingly slow.

Description
Add support for PyTorch's ROCm workflow, which enables GPU acceleration on AMD GPUs.
https://pytorch.org/blog/pytorch-for-amd-rocm-platform-now-available-as-python-package/
https://news.ycombinator.com/item?id=32679671

Alternatives
Use NCNN; however, this does not work with many PyTorch models.
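
For context, here is a minimal sketch of how to check whether a given PyTorch install is a ROCm (HIP) build rather than a CUDA build. In ROCm wheels the torch.cuda API is reused for AMD GPUs, and torch.version.hip carries the HIP version string; the snippet is only illustrative.

```python
# Illustrative check: is this PyTorch build ROCm (HIP) or CUDA?
# ROCm wheels reuse the torch.cuda.* API for AMD GPUs and set
# torch.version.hip instead of torch.version.cuda.
import torch

print("GPU available:", torch.cuda.is_available())
print("HIP/ROCm version:", torch.version.hip)   # None on CUDA builds
print("CUDA version:", torch.version.cuda)      # None on ROCm builds
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```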

RunDevelopment commented 1 year ago

Do you know how compatible ROCm versions are with newer Linux distros? The PyTorch version we are targeting only supports ROCm 4.2 (May 2021), so newer distros aren't officially supported.

Also, we currently assume that PyTorch GPU support is CUDA-only, so we'll need to do some refactoring throughout the frontend.


It should be mentioned that ROCm is Linux-only. This isn't really a problem, since improving GPU support on any platform is a good thing, but I wanted to point it out because it hadn't been mentioned before.
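
As a rough illustration of the refactoring mentioned above, the backend could detect which GPU stack is present at runtime instead of assuming CUDA. The helper below is hypothetical (not an existing chaiNNer function); it only relies on torch.version.hip being set in ROCm builds.

```python
# Hypothetical sketch of GPU-backend detection that doesn't assume CUDA.
# gpu_backend() is an illustrative helper, not part of chaiNNer today.
import torch

def gpu_backend() -> str:
    if not torch.cuda.is_available():
        return "cpu"
    # ROCm builds reuse the torch.cuda API but report a HIP version instead.
    if getattr(torch.version, "hip", None):
        return "rocm"
    return "cuda"
```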

0x4E69676874466F78 commented 1 year ago

@RunDevelopment this may be useful: on Windows there is another alternative, ONNX with DirectML, e.g. https://github.com/huggingface/diffusers/compare/main...harishanand95:diffusers:dml

joeyballentine commented 1 year ago

People have already gotten ROCm to work. Just use system Python and install the ROCm build of PyTorch to your system.
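
For anyone trying that route, installing the ROCm wheels into the system Python looks roughly like the sketch below. The index URL and the rocm5.2 tag are assumptions that depend on the PyTorch release; check https://pytorch.org/get-started/locally/ for the combination matching your ROCm install.

```python
# Rough sketch: install the ROCm build of PyTorch into the current Python
# environment. The rocm5.2 tag is an assumption; use the tag that matches
# the PyTorch/ROCm combination listed on pytorch.org.
import subprocess
import sys

subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "torch", "torchvision",
    "--extra-index-url", "https://download.pytorch.org/whl/rocm5.2",
])
```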

theflyingzamboni commented 1 year ago

@RobotRoss What AMD GPU do you have? The list of GPUs supported by ROCm is small. It's pretty much just their productivity GPUs, so RDNA cards won't be able to use it.

RobotRoss commented 1 year ago

> @RobotRoss What AMD GPU do you have? The list of GPUs supported by ROCm is small. It's pretty much just their productivity GPUs, so RDNA cards won't be able to use it.

RX 6900 XT. While it's not on the official support list (only the Pro GPUs are), ROCm does work with it, as verified by myself and others. Supposedly, the support list is so short because of the extensive testing requirements, but that doesn't mean ROCm won't work on consumer GPUs.

joeyballentine commented 1 year ago

In that case, try using it with system Python and installing ROCm PyTorch to your system. It should just work.

RobotRoss commented 1 year ago

> In that case, try using it with system Python and installing ROCm PyTorch to your system. It should just work.

Unfortunately, attempting to use system Python causes chaiNNer to crash on start - logfile: main.log

System Python: Python 3.11.0 (main, Oct 24 2022, 00:00:00) [GCC 12.2.1 20220819 (Red Hat 12.2.1-2)] on linux

Edit: Fixed; the python3-devel package was missing. It might be worth showing a warning about that missing package when using system Python (the error thrown right now is generic and quite unhelpful).
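
A warning like the one suggested could be a simple preflight check for the Python development headers. The helper below is purely hypothetical and only looks for Python.h in the interpreter's include directory.

```python
# Hypothetical preflight check for the Python development headers
# (python3-devel on Fedora/RHEL, python3-dev on Debian/Ubuntu), which some
# compiled dependencies need when running under system Python.
import os
import sysconfig

def has_python_headers() -> bool:
    include_dir = sysconfig.get_path("include")
    return os.path.isfile(os.path.join(include_dir, "Python.h"))

if not has_python_headers():
    print("Warning: Python development headers not found. "
          "Install python3-devel (or python3-dev) and try again.")
```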

joeyballentine commented 1 year ago

That's a known issue. It's supposed to warn you about that, but I accidentally broke that a while back.

RobotRoss commented 1 year ago

Okay, I've had some time to tinker. While it is possible to get chaiNNer to render using ROCm, I disagree with the characterization that "it just works". It's not a fault in chaiNNer itself, but rather that the dependencies chaiNNer requires don't support modern Python versions, so I couldn't get a full install, though I got far enough to produce an output. Adding ROCm support to the portable Python that ships with chaiNNer would be preferable.

joeyballentine commented 1 year ago

You could just use an older Python version; I'm pretty sure PyTorch has a ROCm build for 3.9. Yes, it would probably be preferable to offer an option in the UI to install the ROCm version, but I don't think we'll be doing that any time soon.