phidatahq / phidata

Build AI Assistants with memory, knowledge and tools.
https://docs.phidata.com
Mozilla Public License 2.0

Question about using local LLM models #1009

Open xujia-wang opened 2 weeks ago

xujia-wang commented 2 weeks ago

Hello, when I tried to use a local LLM model (Qwen2-chat-72B) with OpenAILike for the LLM OS, I ran into this bug (screenshot: 20240613-141155). I did not have any problems when using GPT models. Could anyone explain or solve this problem for me? Thanks a lot!
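
For reference, this is roughly how the model is wired up (a minimal sketch outside the full LLM OS app; the model name, api_key, and base_url are placeholders for my local OpenAI-compatible deployment, e.g. a vLLM server):

```python
from phi.assistant import Assistant
from phi.llm.openai.like import OpenAILike

# Point OpenAILike at a local OpenAI-compatible endpoint.
# Model name and base_url are placeholders for a local Qwen2-chat-72B deployment.
local_llm = OpenAILike(
    model="Qwen2-72B-Instruct",           # name the local server registers the model under
    api_key="not-needed",                 # many local servers ignore the key
    base_url="http://localhost:8000/v1",  # local OpenAI-compatible endpoint
)

assistant = Assistant(llm=local_llm)
assistant.print_response("Hello!")
```

Plain chat works with this setup; the error only appears once the assistant tries to call a tool.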

ysolanky commented 2 weeks ago

Hello @xujia-wang!

It looks like function calling with Qwen2-chat-72B is not fully compatible with OpenAILike. To make function calling work, we will have to create a custom class. I will share updates soon :)
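
Roughly, the custom class would look something like the skeleton below. This is only a sketch: OpenAILike expects responses to follow OpenAI's tool/function-call format exactly, and the adapter would need to override the response-parsing hooks in phidata's OpenAIChat to map the tool calls emitted by the local Qwen2 endpoint into that format. Those hooks are internal and may change, so nothing is overridden here yet.

```python
from phi.llm.openai.like import OpenAILike


class Qwen2Chat(OpenAILike):
    """Skeleton adapter for Qwen2-chat-72B behind an OpenAI-compatible server.

    A real implementation would override the tool-call parsing so that
    the model's function-call output is translated into the structure
    OpenAILike expects. The exact methods to override depend on phidata
    internals, so this class only sets identifying attributes for now.
    """

    name: str = "Qwen2Chat"
    model: str = "Qwen2-72B-Instruct"  # placeholder model id
```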