Aider-AI / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0

Azure OpenAI model does not work properly #50

Closed tjroamer closed 1 year ago

tjroamer commented 1 year ago

It seems that setting up an Azure OpenAI model does not work. I passed the --openai-api-base and --openai-api-key arguments on the command line when I ran aider. The output is as follows:

Model: gpt-4
Git repo: none
Repo-map: disabled
Use /help to see in-chat commands.

It appears that my model has been recognized, but when I issued prompts, nothing happened:

────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
> create me a snake game

────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
> create me a fabonacci function.

────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
> /exit

Using regular OpenAI models works fine, however.

paul-gauthier commented 1 year ago

Thanks for trying aider and reporting this issue.

Do you get any output if you run with --no-pretty or --no-stream or both options applied?

philipsd6 commented 1 year ago

From looking at the code, it's pretty clear that aider does not support Azure OpenAI endpoints at all. I'm not sure what the OP passed to --openai-api-base, but the documentation for the openai-python package states that api_type must be set to azure or azure_ad, and the aider codebase doesn't seem to support setting that.

Additionally, you have to pass in the Azure deployment name as the engine, which may or may not be the same name as the model.
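
For reference, this is roughly what the openai-python (0.x) package expects for Azure; the endpoint, key, API version, and deployment name below are placeholders:

import openai

# Sketch of an Azure OpenAI setup with the legacy openai-python 0.x API.
openai.api_type = "azure"        # or "azure_ad" when authenticating with Azure AD
openai.api_base = "https://my-test-instance.openai.azure.com"
openai.api_version = "2023-03-15-preview"
openai.api_key = "<azure-api-key>"

# Requests are routed by deployment name, passed as `engine`:
response = openai.ChatCompletion.create(
    engine="GPT35",              # deployment name, not necessarily the model name
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)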

philipsd6 commented 1 year ago

FWIW, I made a quick hack to the code in base_coder.py:

if "azure.com" in openai_api_base:
    openai.api_type = "azure"

and now when I run aider --openai-api-base="https://my-test-instance.openai.azure.com" --openai-api-key $OPENAI_API_KEY I get:

Traceback (most recent call last):
  File "/home/philipsd6/.local/bin/aider", line 33, in <module>
    sys.exit(load_entry_point('aider-chat', 'console_scripts', 'aider')())
  File "/home/philipsd6/src/aider/aider/main.py", line 272, in main
    coder = Coder.create(
  File "/home/philipsd6/src/aider/aider/coders/base_coder.py", line 77, in create
    if not check_model_availability(main_model):
  File "/home/philipsd6/src/aider/aider/coders/base_coder.py", line 1059, in check_model_availability
    available_models = openai.Model.list()
  File "/home/philipsd6/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/openai/api_resources/abstract/listable_api_resource.py", line 60, in list
    response, _, api_key = requestor.request(
  File "/home/philipsd6/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/openai/api_requestor.py", line 230, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/home/philipsd6/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/openai/api_requestor.py", line 624, in _interpret_response
    self._interpret_response_line(
  File "/home/philipsd6/.local/pipx/venvs/aider-chat/lib/python3.10/site-packages/openai/api_requestor.py", line 687, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: Resource not found

And if I add --model GPT35 (which is the name of the deployment in this instance, and should be passed as the engine parameter to openai.ChatCompletion.create()), I get this:

Traceback (most recent call last):
  File "/home/philipsd6/.local/bin/aider", line 33, in <module>
    sys.exit(load_entry_point('aider-chat', 'console_scripts', 'aider')())
  File "/home/philipsd6/src/aider/aider/main.py", line 270, in main
    main_model = models.Model(args.model)
  File "/home/philipsd6/src/aider/aider/models.py", line 31, in __init__
    raise ValueError(f"Unknown context window size for model: {name}")
ValueError: Unknown context window size for model: GPT35

philipsd6 commented 1 year ago

OK, I figured out that if I pass --model gpt-3.5-turbo it loads just fine, but as the OP noted, it doesn't actually do anything. That's because, as I noted above, the engine parameter is not being set. When I forced the engine parameter into the openai.ChatCompletion.create() params, it worked great. So aider needs to support setting the api_type (since we may want either azure or azure_ad), and then it either has to map models to deployments (or vice versa) or take a separate parameter for the Azure deployment name.
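
Roughly, the forced-engine workaround looks like this (a hypothetical sketch of the call against the openai-python 0.x API, not aider's actual send code):

import openai

# Hypothetical sketch: thread the Azure deployment name through to
# openai.ChatCompletion.create() as `engine`.
def send_chat(messages, model="gpt-3.5-turbo", deployment="GPT35", stream=True):
    kwargs = dict(model=model, messages=messages, stream=stream)
    if openai.api_type in ("azure", "azure_ad"):
        # Azure routes requests by deployment, which may or may not share
        # a name with the underlying model.
        kwargs["engine"] = deployment
    return openai.ChatCompletion.create(**kwargs)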

tjroamer commented 1 year ago

Thanks a lot for trying this out. Yes, I tried the solution you provided and was able to load my Azure gpt-4 model. Here is what I added in base_coder.py:

In the Coder.create function:

openai.api_type = "azure"
openai.api_version = "2023-03-15-preview"  # the Azure OpenAI REST API version

In the Coder.send_with_retries function, add this to the kwargs dict:

engine="gpt-4"

Then it works. Not sure when Azure will be supported natively.

paul-gauthier commented 1 year ago

Thank you for taking the time to work out the changes needed to support azure. This sounds like a worthwhile improvement to aider. I'll update this issue when I am able to complete this change.

Or feel free to file a PR if you'd like.

paul-gauthier commented 1 year ago

It looks like Azure has an application process for access to the OpenAI models, so I don't have access to do any testing.

Is one of you able to install aider from the azure branch and test it out?

pip install git+https://github.com/paul-gauthier/aider.git@azure

There are docs in the FAQ on how to set the args:

https://github.com/paul-gauthier/aider/blob/azure/docs/faq.md#azure

Note that the FAQ doesn't discuss the engine arg to openai.ChatCompletion.create(), but instead specifies a deployment_id arg. After a quick look at the source for the openai client, they appear to be interchangeable, so I built in support for both. You'll see --openai-api-engine listed in aider --help.
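
For illustration, an invocation using the flags mentioned in this thread might look something like the line below; the FAQ covers the remaining settings (such as the api type and version), so check aider --help for the exact names:

aider --openai-api-base https://my-test-instance.openai.azure.com --openai-api-key $OPENAI_API_KEY --openai-api-engine GPT35 --model gpt-3.5-turbo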

salilcbi commented 1 year ago

@paul-gauthier - I can confirm that I was able to use your azure branch to successfully connect to Azure OpenAI gpt-35-turbo-16k, using a yml file populated with the required information. I asked aider to "create a streamlit app that displays Hello World in the file called app.py" and it successfully generated the file.

paul-gauthier commented 1 year ago

Great, thanks for confirming that it works. I have merged the PR into main.

I'm going to close this issue for now, but feel free to re-open or file a new issue if you have any further problems.

paul-gauthier commented 1 year ago

FYI, we just added an #llm-integrations channel on the discord, as a place to discuss using aider with alternative or local LLMs.

https://discord.gg/X9Sq56tsaR