jitsi / skynet

AI core services for Jitsi
Apache License 2.0

feat: move openai api to separate process altogether #84

Closed quitrk closed 3 months ago

saghul commented 4 months ago

What's the rationale here?

quitrk commented 4 months ago

> What's the rationale here?

We have no good way of determining whether the llama.cpp server process has started, failed, or terminated. This approach lets us establish a strong dependency on it: if either process dies, both exit. @rpurdel could probably advocate more for this; I'm somewhat torn, since we're duplicating some env defaults.

saghul commented 4 months ago

We could spawn it like so, which is probably why we struggled to get the logs: https://docs.python.org/3/library/asyncio-subprocess.html

And monitor it in a task with .wait(). If it ends, we make skynet commit seppuku.
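A minimal sketch of that suggestion, using `asyncio.create_subprocess_exec` from the linked docs. The `llama-server` command and its flags below are hypothetical stand-ins for the real llama.cpp invocation; the point is the supervision pattern, not the exact command:

```python
import asyncio
import sys


async def supervise(*cmd: str) -> int:
    """Spawn a child process, stream its logs, and return its exit code."""
    proc = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT,
    )
    assert proc.stdout is not None
    # Forward the child's output so its logs are not lost.
    async for line in proc.stdout:
        print(line.decode().rstrip())
    # .wait() returns once the child has terminated.
    return await proc.wait()


async def main() -> None:
    # Hypothetical llama.cpp invocation; the real binary and flags differ.
    code = await supervise("llama-server", "--port", "8080")
    # The child died, so take skynet down with it.
    sys.exit(code or 1)
```

In the real service, `supervise` would run as a background task alongside the API handlers, and a non-zero exit of the llama.cpp child would terminate the whole process, giving the hard dependency discussed above.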