Open dufeng1010 opened 1 month ago
I haven't tested aider with ollama yet, but following https://aider.chat/docs/llms/ollama.html you should be able to point it at a local Ollama model by passing the model name as an argument.
This might work, and if it does you could set that additional argument as the default in the settings.
Alternatively, you could try to configure aider to use ollama by default, independent of the plugin: set the API key in any supported `.env` location (https://aider.chat/docs/config/dotenv.html) and set a default value for `model` in any supported YAML location (https://aider.chat/docs/config/aider_conf.html).
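For reference, here is a minimal sketch of what that could look like, based on the linked aider docs. The model name `llama3` is just an example; substitute whatever model you have pulled in Ollama, and adjust the API base if your Ollama server is not on the default port:

```yaml
# .aider.conf.yml (home directory or project root)
# Default model, using aider's ollama_chat/ prefix for local Ollama models
model: ollama_chat/llama3
```

```shell
# .env (home directory or project root) -- Ollama needs no API key,
# only the base URL of the local server
OLLAMA_API_BASE=http://127.0.0.1:11434
```

With both files in place, running plain `aider` should use the local model by default, without any per-invocation arguments.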
I see anthropic_api_key, openai_api_key, and deepseek_api_key settings, but I don't find a setting for a local model such as ollama. Does it support local models?