microsoft / autogen

A programming framework for agentic AI. Discord: https://aka.ms/autogen-dc. Roadmap: https://aka.ms/autogen-roadmap
https://microsoft.github.io/autogen/

[Bug]: Ignores my specified models and always goes to openai #2889

Open charltonh opened 1 month ago

charltonh commented 1 month ago

Describe the bug

This is a fresh install on Linux, and I'm running into a couple of problems:

1) When setting up a new model and model name, the agent can point to that model name, but when you hover over it in the dropdown you can see the info coming from another model. It's as if it is still using the OpenAI model. THIS IS A BUG.

2) OK, so you delete the model and re-add it as the agents and workflow pages suggest, and now it appears to point to the model you really want, in my case https://api.groq.com/openai/v1 with a Groq API key. The model tests out successfully. BUT it still appears to be hitting the OpenAI base_url, because it gives me this error:

Error occurred while processing message: Error code: 404 - {'error': {'message': 'The model llama3-70b-8192 does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

I tried exporting the OPENAI_API_BASE and OPENAI_API_KEY environment variables to point to groq.com to see if that helped, and this is where I learned it is still going to OpenAI no matter what I do, because then I got this error:

openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: gsk_k6LK****ej4u. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

So problem #2 may be something that can be configured around, but it is still a problem: the UI says it is using a model that tests out successfully, yet when you actually run the workflow it goes to OpenAI no matter which model is selected.
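For reference, here is a minimal sketch of what I expect the equivalent configuration to look like in the underlying autogen library, assuming the standard config_list format (the api_key value is a placeholder, not something taken from the Studio UI):

```python
from autogen import AssistantAgent

# Minimal sketch (not the Studio UI itself): an OpenAI-compatible model entry
# pointing at Groq, expressed in autogen's config_list convention.
config_list = [
    {
        "model": "llama3-70b-8192",
        "base_url": "https://api.groq.com/openai/v1",
        "api_key": "gsk_...",  # placeholder Groq key, not an OpenAI key
    }
]

# An agent built with this llm_config should call the Groq endpoint above,
# not api.openai.com.
assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
```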

Steps to reproduce

No response

Model Used

llama3-70b-8192 on https://api.groq.com/openai/v1

Expected Behavior

Autogen (the agent) should call the model it says it is using, as identified by the configured model name/identifier and base_url.

Screenshots and logs

No response

Additional Information

Python 3.12, autogenstudio 0.56, fresh install

charltonh commented 1 month ago

Apologies, this was meant to be filed as a bug against AutoGen Studio, which I thought was a newer beta version of this project.

qingyun-wu commented 3 weeks ago

@charltonh, I wonder if this has been addressed?

charltonh commented 2 weeks ago

Well, sort of. The instructions need to include setting both the OPENAI_BASE_URL and OPENAI_API_BASE environment variables, because AutoGen Studio doesn't override them even when the selected model points to an entirely different URL and model.
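A minimal sketch of that workaround, assuming the two variable names correspond to the newer (1.x) and legacy (0.x) openai Python clients respectively; the Groq URL and key value are placeholders for illustration:

```python
import os

# Set both names before launching AutoGen Studio: the openai 1.x client reads
# OPENAI_BASE_URL, while older 0.x-style code reads OPENAI_API_BASE.
os.environ["OPENAI_BASE_URL"] = "https://api.groq.com/openai/v1"
os.environ["OPENAI_API_BASE"] = "https://api.groq.com/openai/v1"
os.environ["OPENAI_API_KEY"] = "gsk_..."  # placeholder Groq API key
```

Exporting the same three variables in the shell before running `autogenstudio ui` should have the equivalent effect.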