The PyTorch ecosystem is "dependency hell" at best, and rarely works well on platforms other than Linux, especially for tasks with many dependencies like peft and bitsandbytes.
The llama.cpp ecosystem uses CMake and is easy to get working on Linux, Windows, macOS, and even WASM!
We're only using PyTorch for fine-tuning because llama.cpp doesn't support GPU training.
This is a big task, but it's important.
The same will apply to stable-diffusion too!
this ticket is for