Codium-ai / pr-agent

🚀CodiumAI PR-Agent: An AI-Powered 🤖 Tool for Automated Pull Request Analysis, Feedback, Suggestions and More! 💻🔍
Apache License 2.0

Allow Running PR-Agent with Local Models running on Ollama #535

Closed — skitiz closed this issue 10 months ago

skitiz commented 11 months ago

I was wondering if we could have the ability to run PR-Agent locally with an Ollama model instead of using an OpenAI key.

mrT23 commented 11 months ago

Yes, see:

https://github.com/Codium-ai/pr-agent/blob/main/Usage.md#huggingface
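For reference, the guide linked above configures alternative models through LiteLLM-style provider prefixes. A minimal sketch for pointing PR-Agent at a local Ollama server might look like the following (the section and key names here are assumptions for illustration; verify the exact settings against Usage.md for your version):

```toml
# configuration.toml -- illustrative sketch only
[config]
model = "ollama/llama2"  # any model you have pulled into your local Ollama

# .secrets.toml / environment -- illustrative sketch only
[ollama]
api_base = "http://localhost:11434"  # Ollama's default local endpoint
```

With a setup along these lines, no OpenAI key is needed; requests are routed to the local endpoint instead.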

mrT23 commented 11 months ago

Although, personally, I think there are better models for code than Llama 😉

Sammindinventory commented 10 months ago

@mrT23 How do I use it, and where do I make the changes?

mrT23 commented 10 months ago

Use what?

Sammindinventory commented 10 months ago

A local LLM, i.e. running PR-Agent locally with an Ollama model.

mrT23 commented 10 months ago

How to set up a local model is out of scope for PR-Agent. There are guides online (e.g. TGI).

skitiz commented 10 months ago

Thank you so much!

fredrikburmester commented 4 months ago

Could we re-open this issue? I can't find any documentation on how to use locally run models with something like Ollama or LMStudio. @mrT23

mrT23 commented 4 months ago

@fredrikburmester Are you sure you looked for the documentation?

[screenshot of the docs page] https://pr-agent-docs.codium.ai/usage-guide/additional_configurations/#hugging-face

If you think something is missing or out of date, open a PR with updated documentation. I do expect community contributions in these areas.