Closed: jmtatsch closed this issue 1 year ago.
I want the same thing. Maybe we could use https://github.com/cocktailpeanut/dalai as the Llama API?
@matigumma Dalai does not provide an OpenAI-compatible API. The OpenAI API is a nice abstraction over models that all behave slightly differently.
I made my own test here: https://github.com/matigumma/bb.agi/blob/main/baby_agi%20.ipynb
Hey @jmtatsch and @matigumma, we added support for a custom base URL via an environment variable for local development; I imagine that's what you mean by host (https://github.com/reworkd/AgentGPT/pull/543).
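For reference, a minimal sketch of what that kind of override looks like on the client side, assuming the variable is named OPENAI_API_BASE (as used later in this thread) and the legacy pre-1.0 openai Python package; the exact variable name and wiring inside AgentGPT are defined by the PR above:

```python
import os
import openai  # legacy (pre-1.0) openai client, which exposes api_base

# Use the local endpoint when OPENAI_API_BASE is set, otherwise the default
openai.api_base = os.getenv("OPENAI_API_BASE", "https://api.openai.com/v1")
openai.api_key = os.getenv("OPENAI_API_KEY", "sk-not-needed-for-local-servers")
```

One common gotcha: llama-cpp-python serves its OpenAI-style routes under /v1, so the base URL typically needs to be http://localhost:8000/v1 rather than just the host and port.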
Hello @matigumma @jmtatsch, I will be closing this as it seems to be resolved. Please feel free to reopen it if you still have any other questions.
I'm setting export OPENAI_API_BASE="http://localhost:8000", but it's still using the OpenAI servers. Am I doing something wrong?
Hello,
I would love to run AgentGPT with local models using https://github.com/abetlen/llama-cpp-python, which provides an emulated OpenAI API.
Can the API host be made user-settable via an OPENAI_API_HOST env variable?
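For anyone landing here, a rough sketch of the end-to-end idea, assuming llama-cpp-python's bundled server and the legacy pre-1.0 openai client; the model path, port, and model name below are placeholders:

```python
# Start the emulated OpenAI server in another terminal first, e.g.:
#   python -m llama_cpp.server --model ./models/your-model.bin
# (flags and model file format depend on the installed llama-cpp-python version)

import openai

openai.api_base = "http://localhost:8000/v1"  # llama-cpp-python's default port
openai.api_key = "sk-unused"  # typically not checked by the local server

resp = openai.ChatCompletion.create(
    model="local",  # usually ignored or mapped to the single loaded model
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(resp["choices"][0]["message"]["content"])
```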