Closed cccc11231 closed 1 month ago
After all that, which may or may not be worth reading, I found this by doing some more searching: https://microsoft.github.io/autogen/blog/2023/07/14/Local-LLMs/ From what I've seen (and I look for this a lot), current local LLMs are not as smart as ChatGPT (even 3.5) and therefore do poorly at handling the queries involved. Another part of the issue, I believe, is that local LLMs are not trained to handle APIs. My thought is that just having the basics in place would be a good starting point for enhancement: you could have Python code that handles the APIs, or even NLP that generates them, or you could write them yourself to provide the tools that the LLM (one of its agents, actually) would search for. I'm not nearly knowledgeable enough on this or I'd be doing it myself. $.02
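The approach in that blog post boils down to running a local model behind an OpenAI-compatible server and pointing the client config at it instead of at OpenAI. A minimal sketch of that idea (the model name, port, and key fields are placeholders, not a verified setup; adjust them to whatever your local server actually exposes):

```python
# Sketch: an OpenAI-style config that targets a local OpenAI-compatible
# server (e.g. one started with FastChat). All values are placeholders.
config_list = [
    {
        "model": "chatglm2-6b",                  # placeholder local model name
        "api_base": "http://localhost:8000/v1",  # local OpenAI-compatible endpoint
        "api_key": "NULL",                       # most local servers ignore the key
    }
]

# An AutoGen agent would then take this in place of an OpenAI config, e.g.:
# assistant = autogen.AssistantAgent("assistant",
#                                    llm_config={"config_list": config_list})
```

The point is that nothing in the agent code changes; only the endpoint the requests go to does.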
I just had a similar problem
I am also looking for ways to use local models, but I haven't found one. Many agent frameworks face this issue: they are designed around the OpenAI ChatGPT interface with its functions parameter, so switching to another model is troublesome.
Right now I am using the glaive-function-calling model from Hugging Face as the function-calling agent LLM. It's not perfect, but it works at least, and it's compatible with the OpenAI format: https://huggingface.co/glaiveai/glaive-function-calling-v1
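"Compatible with the OpenAI format" here means the model is prompted with a function schema and emits a structured function-call reply that you parse yourself. A rough sketch of that parsing step (the `<functioncall>` prefix, function name, and schema are assumptions for illustration; check the model card for the exact format the model emits):

```python
import json

# Assumed reply shape: a prefix token followed by a JSON payload.
# The exact prefix a given model emits may differ.
PREFIX = "<functioncall>"

def parse_function_call(raw: str):
    """Extract (name, arguments) from a function-call style model reply."""
    payload = json.loads(raw[len(PREFIX):].strip())
    return payload["name"], payload["arguments"]

# Fabricated example reply, for illustration only:
reply = '<functioncall> {"name": "get_weather", "arguments": {"city": "Paris"}}'
name, args = parse_function_call(reply)
```

Once parsed, `name` and `args` can be dispatched to a local Python function the same way an OpenAI `function_call` response would be.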
Has anyone used AutoGen together with a local LLM to create an agent that manages function calls? The function-calling feature seems to integrate seamlessly with OpenAI, but I'm facing challenges with local LLMs. Does anyone have suggestions or experience to share? I was using CodeLlama.