alexrozanski / LlamaChat

Chat with your favourite LLaMA models in a native macOS app
https://llamachat.app
MIT License

Feature Request: Add API endpoint #26

Open mkellerman opened 1 year ago

mkellerman commented 1 year ago

I love this! Nice work!

This allows me to quickly test my local models and see how I want to configure them. Once I've confirmed (through the GUI) that everything is how I want it, it would be great if we could expose this as an API, so it's not only a CLIENT but could also operate as a LlamaServer.

Is this something that you'd be willing to consider?

alexrozanski commented 1 year ago

@mkellerman interesting use case, nice! Do you mean a local server, or one running remotely that you can query? I'm actually working on some new Swift bindings to power LlamaChat v2, which you can find here: https://github.com/CameLLM

mkellerman commented 1 year ago

The same binary, but you can interact with the models through API requests instead of UI clicks.
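
To make the request concrete, a server mode might look something like the sketch below. Everything here is hypothetical: the port, the `/v1/chat` route mentioned in the comments, and the JSON shape are illustrative assumptions, not anything LlamaChat currently exposes. A minimal local listener using Apple's Network framework could be:

```swift
import Foundation
import Network

// Hypothetical sketch: serve the already-loaded model over a local TCP port.
// Port number, route, and payload shape are assumptions for illustration.
let listener = try NWListener(using: .tcp, on: 8080)

listener.newConnectionHandler = { connection in
    connection.start(queue: .main)
    connection.receive(minimumIncompleteLength: 1, maximumLength: 65536) { data, _, _, _ in
        // A real server would parse the HTTP request here and route
        // e.g. POST /v1/chat to the model backing the GUI session.
        let body = #"{"response": "placeholder model output"}"#
        let response = """
        HTTP/1.1 200 OK\r
        Content-Type: application/json\r
        Content-Length: \(body.utf8.count)\r
        \r
        \(body)
        """
        connection.send(content: response.data(using: .utf8),
                        completion: .contentProcessed { _ in connection.cancel() })
    }
}

listener.start(queue: .main)
dispatchMain()
```

With something like this running, the models configured in the GUI could be queried from scripts, e.g. `curl -s localhost:8080 -d '{"prompt": "hello"}'`, instead of through UI clicks.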

alexrozanski commented 1 year ago

@mkellerman sorry for the late reply on this. Not something I'm actively considering at the moment, but I'll track it for the future! It might be something that could wrap CameLLM and be released as a separate project.