Closed kirkog86 closed 8 months ago
Please add an option to point the API at a local OpenAI-compatible API server (for example, llama-cpp-python's server).
@kirkog86 does being able to change the API URL resolve this?
In 0.3.6 I just added support for the environment variable OPENAI_CHAT_URL.
I'm assuming you could target http://localhost:[your-llama-port] to use it with local models.
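As a sketch of that setup (the model path and port here are placeholders, and the exact endpoint path `OPENAI_CHAT_URL` expects may differ; the thread only confirms targeting `http://localhost:[your-llama-port]`):

```shell
# Start llama-cpp-python's OpenAI-compatible server (hypothetical model path)
python -m llama_cpp.server --model ./models/your-model.gguf --port 8000

# Point the client at the local server via the env var added in 0.3.6.
# Whether a bare base URL or the full /v1/chat/completions path is needed
# is an assumption here; try the base URL first if this doesn't work.
export OPENAI_CHAT_URL="http://localhost:8000/v1/chat/completions"
```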