Closed JLJuradoDeloitte closed 5 months ago
Thanks for the request! I think this should work already. We use LiteLLM for support of Azure and many other providers. https://docs.litellm.ai/docs/providers/azure
If you set the environment variables as documented above, does it work?
@enyst It looks like a few changes are required in OpenDevin to support Azure OpenAI. I was able to run OpenDevin with the changes and hacks below:
When the litellm instance is created, the api_version parameter is not passed. I had to add a few lines to llm.py to support this (see Screenshot-1 below).
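A minimal sketch of the kind of change described (the helper name and wrapper shape here are hypothetical, not OpenDevin's actual llm.py; `api_version` is a real litellm/Azure keyword argument):

```python
import os

# Hypothetical sketch: build the kwargs passed to litellm.completion() so
# that an Azure api_version, when configured, is forwarded to litellm.
# Azure OpenAI rejects requests without an api-version query parameter.
def build_completion_kwargs(model, messages):
    kwargs = {"model": model, "messages": messages}
    api_version = os.environ.get("AZURE_API_VERSION")  # e.g. "2024-02-15-preview"
    if api_version:
        kwargs["api_version"] = api_version
    return kwargs

# Usage (illustrative): litellm.completion(**build_completion_kwargs(
#     "azure/gpt-35-turbo", [{"role": "user", "content": "hello"}]))
```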
OpenDevin expects two models: chat completion (gpt-35-turbo) and embedding (text-embedding-ada-002), but config.toml allows only one deployment name. In Azure, each model requires its own deployment (this does not apply to plain OpenAI, which works with just an API key and does not require a deployment name). If you supply only the gpt-35-turbo model and its deployment name, it throws "embedding operation does not work with specified model", and vice versa. To fix this I made the changes below:
Screenshot-1 (changes to fix the api_version issue)
Screenshot-2 (config.toml with a new deployment param for the embedding model)
Screenshot-3 (addition of the embedding deployment config in config.py)
Screenshot-4 (changes in the Makefile)
Screenshot-5 (referencing the embedding deployment in memory.py)
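A hedged sketch of the config.toml shape the screenshots describe, with a separate deployment name for the embedding model. The key names here are illustrative assumptions, not necessarily the exact keys OpenDevin reads:

```toml
# Illustrative sketch only; key names are assumptions.
LLM_MODEL = "azure/gpt-35-turbo"
LLM_API_KEY = "<your-azure-api-key>"
LLM_BASE_URL = "https://<your-resource>.openai.azure.com"
LLM_API_VERSION = "2024-02-15-preview"
LLM_DEPLOYMENT_NAME = "gpt-35-turbo"
# New param so the embedding model can use its own Azure deployment
LLM_EMBEDDING_DEPLOYMENT_NAME = "text-embedding-ada-002"
```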
What is wrong with my config.toml? Why am I running into these errors:
@JLJuradoDeloitte @VishalHaibatpure1 Added documentation for Azure: https://github.com/OpenDevin/OpenDevin/pull/1035
Feel free to comment, correct or add to it!
What problem or use case are you trying to solve? I am developing an application that leverages the capabilities of OpenAI for AI-driven content generation and interaction. Currently, my application uses the simple OpenAI API, but I am looking to enhance its capabilities by integrating Azure OpenAI. This integration would allow my application to take advantage of Azure's advanced features, such as improved scalability, security, and compliance with specific regulatory requirements.
Describe the UX of the solution you'd like The ideal solution would allow developers to easily switch between the simple OpenAI API and Azure OpenAI within the OpenDevin project. This could be achieved through a configuration setting or an environment variable that specifies the API endpoint to be used. The user experience should remain consistent, with minimal changes required in the application code to switch between the two APIs.
Do you have thoughts on the technical implementation? Based on the examples provided in the Azure OpenAI documentation, integrating Azure OpenAI into the OpenDevin project would involve modifying the API client to support Azure's specific authentication and endpoint requirements. This could include adjusting the base URL to point to Azure's OpenAI service and ensuring that the API key and other necessary headers are correctly set for Azure's API.
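As a rough illustration of that idea, switching providers via configuration can be as small as changing the model string, since litellm routes `azure/<deployment>` names to the Azure OpenAI endpoint. The environment-variable names and helper below are hypothetical, not OpenDevin's actual code:

```python
import os

# Hypothetical sketch: pick the litellm model string from configuration so
# the rest of the application code is identical for both providers.
def select_model() -> str:
    if os.environ.get("LLM_PROVIDER") == "azure":  # hypothetical switch variable
        # litellm routes "azure/<deployment>" to the Azure OpenAI service
        return "azure/" + os.environ.get("AZURE_DEPLOYMENT_NAME", "gpt-35-turbo")
    return "gpt-3.5-turbo"  # plain OpenAI model name

# litellm.completion(model=select_model(), messages=[...]) would then work
# unchanged for either provider, given the matching api_key/api_base env vars.
```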
Describe alternatives you've considered
Additional context
Hi, I'm having the same issue setting up the Azure OpenAI API with OpenDevin. Let me know how you got it done, many thanks!
@LeoYoungChina check out https://github.com/OpenDevin/OpenDevin/issues/1027#issuecomment-2050443706, just above, if you want to see a working example.
Having looked into it, I believe it includes some things it doesn't need, and it will be difficult to maintain when you need to upgrade. But it explains the variables, which is very useful. 👍
I'll close this in favor of #1033. Let's continue the discussion there. @JLJuradoDeloitte please try it and let us know if it works!