paul-gauthier / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0

Does repo-map work with local models? #1597

Closed robotdad closed 6 hours ago

robotdad commented 7 hours ago

Issue

I decided to try running aider with local models only, and I noticed that the repo-map is disabled. I couldn't answer this from the docs: does that feature require a GPT-4o endpoint? This is how I'm starting aider; I wasn't sure how to specify a local model as the default in the .env.

aider --model ollama/llama3.1

Version and model info

aider 0.56.0 ollama/llama3.1

fry69 commented 6 hours ago

You can enable repository maps for any model, but local models might not be clever enough to understand them. Add `--map-tokens <number>` to the command line to force-enable it.
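With that suggestion, the launch command from the original report becomes something like the following. This is a sketch; the 1024-token budget is an illustrative value, not a recommendation from this thread:

```shell
# Start aider against a local Ollama model and force-enable the repo-map
# with a 1024-token budget (pick a budget your model's context can afford)
aider --model ollama/llama3.1 --map-tokens 1024
```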

robotdad commented 6 hours ago

That did it, thank you. Responses were immediately better, but I need more time playing with this model to see how capable it is.

I did sort out the other .env vars for the local model, so I'm sharing them here.

## Specify the model to use for the main chat
AIDER_MODEL=ollama/llama3.1

## Local Ollama base
OLLAMA_API_BASE=http://127.0.0.1:11434

## Force enable repo-map for local model
AIDER_MAP_TOKENS=1024
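With those variables in a .env file, aider can be launched with no flags at all. A minimal sketch, assuming the .env sits in the directory where aider is started (one of the locations aider checks for .env files):

```shell
# aider picks up AIDER_MODEL, OLLAMA_API_BASE, and AIDER_MAP_TOKENS
# from the .env in the current working directory
aider
```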