victordibia / llmx

An API for Chat Fine-Tuned Large Language Models (llm)

Support for Llama.cpp LLMs #3

Open mskyttner opened 11 months ago

mskyttner commented 11 months ago

It would be interesting to try out the recently released lida library with LLMs running locally using Llama.cpp.

Could llmx support such "offline"/embedded or standalone, more resource-constrained scenarios where the LLM runs on CPUs only?

If so, can you provide an outline of steps required?
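For reference, this is roughly what the target scenario looks like with llama-cpp-python's chat completion API (a sketch only; the model path, thread count, and prompts are placeholders, and how this would be wrapped as an llmx provider is exactly the open question):

```python
# Sketch: CPU-only chat completion via llama-cpp-python, the kind of local
# backend an llmx provider would need to wrap. Model path and parameters
# below are placeholders.
from llama_cpp import Llama

# Load a locally downloaded quantized model; n_threads controls CPU usage.
llm = Llama(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",
    n_ctx=2048,
    n_threads=4,
)

# llama-cpp-python exposes an OpenAI-style chat completion call.
response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what lida does in one sentence."},
    ],
    max_tokens=128,
    temperature=0.2,
)

print(response["choices"][0]["message"]["content"])
```

Alternatively, llama-cpp-python ships an OpenAI-compatible server (`python -m llama_cpp.server --model <path>`), so perhaps pointing llmx's existing OpenAI provider at that local endpoint could be a shortcut, if the base URL is configurable.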