skitiz closed this issue 10 months ago.
Though personally I think there are better models for code than Llama 😉
@mrT23 how do I use it, and where do I make changes?
Use what?
A local LLM, or running PR-Agent locally with an Ollama model.
How to set up a local model is out of scope for pr-agent; there are guides online (e.g. TGI).
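That said, for the Ollama case specifically, getting a model served locally is usually just a couple of commands. A rough sketch, assuming Ollama is already installed (the model name below is only an example):

```bash
# Download a model from the Ollama library (codellama is just an example)
ollama pull codellama

# Start the local API server; by default it listens on http://localhost:11434
# (on some installs Ollama already runs as a background service, so this may not be needed)
ollama serve
```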
Thank you so much!
Could we re-open this issue? I can't find any documentation on how to use locally run models with something like Ollama or LMStudio. @mrT23
@fredrikburmester are you sure you looked for the documentation?
https://pr-agent-docs.codium.ai/usage-guide/additional_configurations/#hugging-face
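For reference, the setup described there boils down to something like the following. This is a minimal sketch only; the exact keys are in the linked page, and the model name and port here are just examples:

```toml
# configuration.toml -- select the local model via the "ollama/" prefix
[config]
model = "ollama/codellama"
fallback_models = ["ollama/codellama"]

# .secrets.toml -- point pr-agent at the local Ollama server (Ollama's default port is 11434)
[ollama]
api_base = "http://localhost:11434"
```

With something like that in place, requests should go to the local Ollama server instead of the OpenAI API, so no OpenAI key is needed.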
If you think something is missing or out of date, open a PR with updated documentation. I do expect community contributions in these areas.
I was wondering if we could have the ability to run PR-Agent locally with an Ollama model instead of using an OpenAI key.