louisgv / local.ai

🎒 local.ai - Run AI locally on your PC!
https://localai.app

Adding StableLM #15

Open louisgv opened 1 year ago

step21 commented 1 year ago

Sorry for potential ignorance, but does local.ai need GGML-type models? Or can it use PyTorch or similar directly?

louisgv commented 1 year ago

@step21 it runs GGML-converted models at the moment, since the backend uses the llm crate.

Would love your help in investigating ways for us to run PyTorch models directly. I wonder whether that's doable with just Rust and the build process, such that it works out of the box on all platforms.

Or... perhaps people would be satisfied with a button that installs everything they need (Python/torch etc.) and runs a Python backend as an alternative to llm? Though supporting and maintaining that pipeline across multiple platforms sounds painful ;d
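
For concreteness, the "button" route on the Rust side could be as simple as spawning the installed interpreter as a child process and streaming its output back. A rough sketch only; the script name, flags, and model name below are placeholders, not real local.ai paths, and the hard part would still be shipping Python/torch per platform:

```rust
use std::io::{BufRead, BufReader};
use std::process::{Command, Stdio};

fn main() -> std::io::Result<()> {
    // Hypothetical: launch a bundled Python backend as a child process.
    // "backend/server.py" and the flags are made-up placeholders.
    let mut child = Command::new("python3")
        .arg("backend/server.py")
        .args(["--model", "stablelm-base-alpha-3b"])
        .stdout(Stdio::piped())
        .stderr(Stdio::inherit())
        .spawn()?;

    // Forward whatever the Python side prints (e.g. generated tokens) to our stdout.
    if let Some(stdout) = child.stdout.take() {
        for line in BufReader::new(stdout).lines() {
            println!("{}", line?);
        }
    }

    let status = child.wait()?;
    println!("python backend exited with {status}");
    Ok(())
}
```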

step21 commented 1 year ago

Thanks for the clarification. I see several options, though I haven't done enough research yet to properly evaluate them. I looked into how pytorch/torch saves its files: they are just serialized Python objects, and as luck would have it, there is a Rust crate that can supposedly read them: https://github.com/birkenfeld/serde-pickle. I haven't tried it, however, and I'm not sure it supports all the types etc. that PyTorch uses (and then the result would probably still have to be converted to GGML...).

Re installing PyTorch: if people have the required permissions, this should be painless with something like conda-forge, which supports basically all the architectures you want and some you might not even know about.
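
For illustration, reading a raw pickle stream with serde-pickle might look roughly like this. Untested sketch, assuming the 1.x API where the deserializers take a `DeOptions` argument; the file name is a placeholder, and note that newer `torch.save` outputs are actually zip archives wrapping the pickle, so this only covers a bare pickle stream:

```rust
use std::fs::File;
use std::io::BufReader;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Placeholder path: a state dict dumped to a plain pickle file.
    let file = File::open("model_state_dict.pkl")?;
    let reader = BufReader::new(file);

    // Deserialize into serde-pickle's generic Value type just to inspect
    // the structure; mapping tensors into something GGML-compatible would
    // be a separate step.
    let value: serde_pickle::Value =
        serde_pickle::value_from_reader(reader, serde_pickle::DeOptions::new())?;
    println!("{:#?}", value);

    Ok(())
}
```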