livekit / agents

Build real-time multimodal AI applications 🤖🎙️📹
https://docs.livekit.io/agents
Apache License 2.0
1.16k stars 230 forks

Use a custom LLM #700

Open webdevatn opened 3 weeks ago

webdevatn commented 3 weeks ago

I have an LLM already deployed on a server. Can I use my own APIs to get responses instead of `openai.LLM()`?

keepingitneil commented 3 weeks ago

The OpenAI LLM plugin accepts a custom host and HTTP session, so it works with any server whose API is OpenAI-compatible.
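Concretely, "API-compatible with OpenAI" means the server accepts the chat-completions request shape and returns the matching response shape. A minimal stdlib sketch of that wire format (the model name is a placeholder, not from this thread):

```python
import json


def build_chat_request(model: str, user_text: str) -> bytes:
    # Body for POST {base_url}/chat/completions -- the shape an
    # OpenAI-compatible server must accept.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
    }
    return json.dumps(payload).encode("utf-8")


def extract_reply(response_body: bytes) -> str:
    # Pull the assistant text out of an OpenAI-style response body.
    data = json.loads(response_body)
    return data["choices"][0]["message"]["content"]
```

If your deployment speaks this format, the plugin only needs to be pointed at your host instead of api.openai.com.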

lenage commented 3 weeks ago

Please see minimal_assistant.py as an example.

ChrisFeldmeier commented 3 weeks ago

Yes, you can! I had the same problem and solved it by developing my own additional LLM module, through which I also passed the language. It's not the best way, but it worked for me. (screenshot: SCR-20240909-aj)
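For anyone taking the same route, here is a minimal, stdlib-only sketch of such an adapter, with the language passed through as a system instruction. The class name, endpoint URL, and prompt wording are all illustrative, not the actual module from the screenshot:

```python
import json
import urllib.request


class CustomLLM:
    """Adapter for a self-hosted, OpenAI-compatible LLM endpoint."""

    def __init__(self, base_url: str, model: str, language: str = "en"):
        self.base_url = base_url.rstrip("/")
        self.model = model
        self.language = language  # forwarded with every request

    def build_request(self, user_text: str) -> bytes:
        # The language is passed through as a system message.
        payload = {
            "model": self.model,
            "messages": [
                {"role": "system", "content": f"Answer in language: {self.language}"},
                {"role": "user", "content": user_text},
            ],
        }
        return json.dumps(payload).encode("utf-8")

    def chat(self, user_text: str) -> str:
        # Blocking round-trip to the custom server.
        req = urllib.request.Request(
            f"{self.base_url}/chat/completions",
            data=self.build_request(user_text),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["choices"][0]["message"]["content"]
```

A real plugin for the agents framework would additionally need to stream tokens rather than return a single string, but the request/response shape is the same.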

+1 for the LiveKit devs to implement this.