Open lucascampodonico opened 5 months ago
Thanks for the issue. This could be a PR. @thinkall for awareness
@lucascampodonico which version of autogen are you using? Which model are you using? If OpenAI on Azure, which api_version? I see you are using a deprecated form of llm_config, with functions rather than tools. That might cause a clash.
@davorrunje
I am using pyautogen==0.2.9
```python
config_list = [
    {
        "model": "gpt-4",
        "api_key": "sk-.........",
    },
]
```
What is the new way to declare tools in llm_config?
@lucascampodonico you can use the decorators @register_for_llm and @register_for_execution to automatically generate function specifications and add them to your llm_config. OpenAI recently changed their API: function declarations are now wrapped in a tools JSON array. You are using the old style without tools, but the decorators will generate the correct JSON for you.
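For illustration, the old-vs-new JSON shape can be sketched in plain Python. The function spec below is hypothetical; in practice the decorators produce this wrapping for you, so you would not write it by hand:

```python
def functions_to_tools(functions):
    """Wrap legacy OpenAI 'functions' specs in the newer 'tools' format,
    where each entry is tagged with "type": "function"."""
    return [{"type": "function", "function": f} for f in functions]


# Hypothetical legacy-style spec, as it would appear under llm_config["functions"]
legacy_functions = [
    {
        "name": "retrieve_content",
        "description": "Retrieve documents relevant to a question.",
        "parameters": {
            "type": "object",
            "properties": {"message": {"type": "string"}},
            "required": ["message"],
        },
    }
]

# New-style spec, as it should appear under llm_config["tools"]
tools = functions_to_tools(legacy_functions)
print(tools[0]["type"])              # "function"
print(tools[0]["function"]["name"])  # "retrieve_content"
```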
@thinkall please take a note of this issue and make sure you include @lucascampodonico in your RAG refactor issue/PR.
Hi @lucascampodonico, have you tried the new APIs that @ekzhu and @davorrunje suggested? In #1661 you can also find an updated example of using RAG with functions.
Describe the bug
Hello, I get an error when I use a group chat of agents with RAG and functions. Without RAG it works perfectly, but with RAG it throws an error related to the context handling.
I modified the code in autogen/agentchat/contrib/retrieve_user_proxy_agent.py to make it work. I don't know if that is the right fix, but for the moment it is working well for me.
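The workaround is roughly along these lines (a simplified sketch, not the exact patch; `should_update_context` is a hypothetical helper name): skip the RAG context refresh when the assistant's last message is a function/tool call, so the suggested call gets executed instead of being overwritten by retrieved context.

```python
def should_update_context(message: dict) -> bool:
    """Hypothetical helper: decide whether the retrieval proxy agent should
    refresh the retrieved context for an incoming message.

    Skips the refresh when the assistant is suggesting a function/tool
    call, so the call can be executed instead of being swallowed by the
    context-update path."""
    if message.get("function_call") or message.get("tool_calls"):
        return False
    # Only plain text replies with actual content trigger a context update.
    return bool(message.get("content"))


print(should_update_context({"function_call": {"name": "retrieve_content"}}))  # False
print(should_update_context({"content": "UPDATE CONTEXT"}))                    # True
```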
Steps to reproduce
Expected Behavior
Let the functions suggested by an agent work.
Screenshots and logs
Before
After
Additional Information
No response