[Open] seam-ctooley opened this issue 2 months ago
Hey @seam-ctooley, what version of autogen is this? I can't seem to run your script.
Also, can you run your proxy with --detailed_debug? It should print the raw request being made, which should help with repro.
I'm on the latest Autogen version. Here is a repo that reproduces the issue I'm seeing: https://github.com/seam-ctooley/litellm-bedrock-bug-repro
I've got a detailed debug log, but it contains AWS creds, so I'll share it tomorrow once my session expires. Sharing it over Discord as well would be greatly appreciated; I'm "christiant_47581" on the LiteLLM server.
Here is the full log file, @krrishdholakia: stderr.txt
Same issue here with latest LiteLLM running locally, Autogen and Claude 3 Haiku.
same here
> Hey @seam-ctooley what version of autogen is this? I can't seem to run your script

This usually occurs when you install "autogen" instead of "pyautogen".
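For anyone hitting this: the PyPI package named "autogen" is an unrelated project, while the AutoGen framework ships as "pyautogen" (but imports as `autogen`). A quick sanity check, sketched here as a hypothetical helper, for which one is in your environment:

```python
from importlib.metadata import distributions

def autogen_dists() -> set[str]:
    """Return which of the two similarly named PyPI packages are installed.
    'pyautogen' is the AutoGen framework; 'autogen' is a different project."""
    found = set()
    for dist in distributions():
        name = (dist.metadata["Name"] or "").lower()
        if name in {"autogen", "pyautogen"}:
            found.add(name)
    return found

# If this prints {'autogen'} without 'pyautogen', uninstall it and
# install pyautogen instead (pip uninstall autogen && pip install pyautogen).
print(autogen_dists())
```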
I've been able to get around the issues mentioned here by using Autogen directly with a custom client: https://gist.github.com/seam-ctooley/d22f8319f313bc160388ae5949cc20b8
So I imagine the issue lies in the translation layer to Bedrock: specific tool-calling format requirements that aren't being met.
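For context on why the translation layer is a plausible culprit: OpenAI-style tool definitions and Bedrock's Converse API use different envelopes, and the proxy has to map between them. A rough sketch of that mapping (illustrative only, not LiteLLM's actual code; field names follow Bedrock's Converse toolSpec shape):

```python
def openai_tool_to_bedrock(tool: dict) -> dict:
    """Translate an OpenAI-style tool definition into the shape expected by
    Bedrock's Converse API (toolSpec). Illustrative sketch only."""
    fn = tool["function"]
    return {
        "toolSpec": {
            "name": fn["name"],
            "description": fn.get("description", ""),
            # Bedrock nests the JSON Schema under an extra "json" key --
            # exactly the kind of subtle requirement a translation layer
            # can get wrong for tool calling.
            "inputSchema": {"json": fn.get("parameters", {"type": "object"})},
        }
    }

# Example OpenAI-style tool, as Autogen would register it:
openai_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}
bedrock_tool = openai_tool_to_bedrock(openai_tool)
```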
What happened?
Setup:
Autogen Agent:
LiteLLM Proxy Config
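For reference, a minimal LiteLLM proxy config for Bedrock Claude 3 Haiku typically looks like the following (the model id, alias, and region here are assumptions for illustration, not the reporter's actual config):

```yaml
model_list:
  - model_name: claude-3-haiku          # alias that clients call
    litellm_params:
      model: bedrock/anthropic.claude-3-haiku-20240307-v1:0
      aws_region_name: us-east-1        # assumed region
```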
Minimal Reproducible Autogen setup:
Relevant log output