robotdad closed this issue 6 hours ago
You can enable repository maps for any model, but local models might not be clever enough to understand them. Add `--map-tokens <number>` to the command line to force-enable it.
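Putting that flag together with the model option from the issue, a full invocation might look like this (the 1024-token budget is just an example value, not a recommendation from the thread):

```shell
# Example only: start aider against a local Ollama model
# and force-enable the repo map with a 1024-token budget
aider --model ollama/llama3.1 --map-tokens 1024
```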
That did it, thank you. Responses were immediately better, but I need more time playing with this model to see how capable it is. I also sorted out the other .env vars for the local model, so I'm sharing them here.
```
# Specify the model to use for the main chat
AIDER_MODEL=ollama/llama3.1
# Local Ollama base
OLLAMA_API_BASE=http://127.0.0.1:11434
# Force-enable repo-map for the local model
AIDER_MAP_TOKENS=1024
```
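For reference, a minimal sketch of how a dotenv-style loader would read those lines; `parse_env` here is a hypothetical helper for illustration, not aider's actual loader:

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # ignore comments and empty lines
        key, _, value = line.partition("=")  # split on the first '=' only
        env[key.strip()] = value.strip()
    return env

dotenv = """\
# Specify the model to use for the main chat
AIDER_MODEL=ollama/llama3.1
# Local Ollama base
OLLAMA_API_BASE=http://127.0.0.1:11434
# Force-enable repo-map for the local model
AIDER_MAP_TOKENS=1024
"""

env = parse_env(dotenv)
# env["AIDER_MAP_TOKENS"] → "1024"
```

Splitting on the first `=` only matters for values like the base URL, which can contain further characters after the key separator.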
Issue
I decided to try running aider with local models only, and I'm noticing that the repo-map is disabled. I couldn't find an answer in the docs: does that feature require a gpt-4o endpoint? This is how I'm starting aider; I wasn't sure how to specify a local model in the .env as the default.
```
aider --model ollama/llama3.1
```
Version and model info
aider 0.56.0 ollama/llama3.1