stopspazzing closed this issue 9 months ago
ZLUDA is a very interesting and admirable effort, but its status and capabilities have been widely misunderstood in various circles, media, etc.
If you review the README, it's full of caveats, and the only recent commit from the past three years is specific support for llama.cpp. Given that the project is three years old, it doesn't support CUDA 12 (we use CUDA 12), and it certainly doesn't support most (if any) of the newer accelerated operations we use with WIS. The author also includes this troublesome note:
"Realistically, it's now abandoned and will only possibly receive updates to run workloads I am personally interested in (DLSS)."
As I said, it's an interesting effort, but I've been puzzled by why it's being hailed and reported on in various circles. For the time being and the foreseeable future, it's useless for anything other than very old CUDA applications, and even then it's littered with caveats and issues.
Wow, thank you for clarifying all of that. I was not aware. Well, hopefully all this publicity will help push its development toward supporting newer versions of CUDA. Guess I gotta go get a 1070.
Has anyone tested using ZLUDA with this yet? Would love to jump on board. Thanks.