Open MiscellaneousStuff opened 5 months ago
Seconding this, since this project might come in handy in the near future. There are also ideas about LiteLLM being able to call multiple models from Ollama, which is worth anticipating. (AutoGen and other tools have similar support as well.)
It turns out that inside SWE-agent you can already call Ollama models using `ollama:<OLLAMA_MODEL_NAME_GOES_HERE>`.
Add support for LiteLLM within SWE-agent so people can use any open-source model they want for the agent, rather than just the OpenAI or Anthropic APIs.
Edit: Switched from Ollama to LiteLLM, as LiteLLM is a more abstract layer than Ollama and supports a wider range of closed-source and open-source LLMs, giving the community more freedom over what they use.
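For context, here is a minimal sketch of why LiteLLM fits this request: it routes any provider through a single `completion()` call using a `provider/model` string, so the same code path covers OpenAI, Anthropic, Ollama, and others. The model name `llama3` and the localhost endpoint below are illustrative assumptions, not part of SWE-agent.

```python
def to_litellm_model(provider: str, model: str) -> str:
    """Build the provider-prefixed model string LiteLLM expects,
    e.g. ("ollama", "llama3") -> "ollama/llama3"."""
    return f"{provider}/{model}"


def main() -> None:
    # Assumes `pip install litellm` and a local Ollama server;
    # swapping the prefix (e.g. "anthropic/...") targets another provider
    # without changing the surrounding agent code.
    import litellm

    response = litellm.completion(
        model=to_litellm_model("ollama", "llama3"),  # illustrative model name
        messages=[{"role": "user", "content": "Hello"}],
        api_base="http://localhost:11434",  # default Ollama endpoint
    )
    print(response.choices[0].message.content)


if __name__ == "__main__":
    main()
```

The agent would only need to construct the prefixed model string from its config; everything else stays provider-agnostic.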