Open webdevatn opened 3 weeks ago
The OpenAI LLM plugin allows a custom host and HTTP session if your LLM is API-compatible with OpenAI.
Please see minimal_assistant.py for an example.
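To illustrate what "API-compatible with OpenAI" means here, below is a minimal stdlib-only sketch of the request shape such a server must accept: a POST to a `/chat/completions` route under your own base URL, with an OpenAI-style JSON body. The `BASE_URL`, `API_KEY`, and model name are hypothetical placeholders for your deployment; if your server speaks this protocol, the OpenAI LLM class can simply be pointed at your host instead of api.openai.com.

```python
import json
import urllib.request

# Hypothetical values -- replace with your own deployment's details.
BASE_URL = "https://my-llm.example.com/v1"  # assumption: server exposes an OpenAI-style /v1 API
API_KEY = "sk-anything"  # many self-hosted servers accept an arbitrary token

def build_chat_request(messages, model="my-model"):
    """Build an OpenAI-compatible /chat/completions request (sketch, not sent here)."""
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_chat_request([{"role": "user", "content": "hello"}])
```

Calling `urllib.request.urlopen(req)` against a compatible server would return the familiar OpenAI-style completion JSON; the point is only that the wire format is identical, so no custom LLM module is strictly required.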
Yes you can! I had the same problem and solved it by developing my own additional LLM module, and I also passed the language through. It's not the best way, but it worked for me.
+1 for the LiveKit devs to implement this
I have an LLM ready and deployed on a server. Can I use my own API to get the responses instead of openAI.llm()?