Codium-ai / pr-agent

🚀CodiumAI PR-Agent: An AI-Powered 🤖 Tool for Automated Pull Request Analysis, Feedback, Suggestions and More! 💻🔍
Apache License 2.0

Ollama local integration not working, and need help integrating with bitbucket.org using a personal token #1361

Open akashsenta13 opened 4 days ago

akashsenta13 commented 4 days ago

Hello there,

I am running Ollama locally on my machine via the CLI, and the API endpoint is running with the configuration below:

[ollama]
api_base = "http://127.0.0.1:11434/" 
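
(As a sanity check, here is a minimal sketch, assuming the requests package, that confirms the endpoint is reachable and lists which model names the server actually serves; the target model must appear in this list, typically as "llama3.2:latest", for the config below to work:)

import requests  # assumption: requests is installed (pip install requests)

# Sanity check (sketch, not from the original post): /api/tags is Ollama's
# standard model-listing endpoint.
resp = requests.get("http://127.0.0.1:11434/api/tags", timeout=5)
resp.raise_for_status()
print([m["name"] for m in resp.json().get("models", [])])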

I have also configured the models as follows:

[config]
# models
model="ollama/llama3.2"
model_turbo="ollama/llama3.2"
fallback_models=["ollama/llama3.2"]
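
(For reference, a review with these settings is invoked through pr-agent's documented CLI entry point; the PR URL below is a placeholder:)

python -m pr_agent.cli --pr_url "https://bitbucket.org/<workspace>/<repo>/pull-requests/<id>" review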

When I try to run a PR review with this configuration, it does not complete successfully. I am getting the errors below:

[screenshot of the error output]

I also need help obtaining a Bitbucket (bitbucket.org) bearer_token for reviewing PRs across multiple repositories. I can create a token for a single repository, but I need a generic token that works for all repositories in the account.
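
(For reference, and worth verifying against current Bitbucket documentation: repository access tokens are scoped to a single repository, so for one token covering all repositories you would need a workspace access token, whose availability depends on your Bitbucket Cloud plan, or an account-level app password. Whichever credential you use, my understanding is that pr-agent's secrets template expects it under the bitbucket section, e.g.:)

[bitbucket]
# <your_token_here> is a placeholder; use a workspace-level credential
# rather than a repository access token to cover every repo.
bearer_token = "<your_token_here>"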

mrT23 commented 3 days ago

We cannot debug your personal deployment; something is wrong there.

Triple- and quadruple-check that you have deployed a valid model and that you are able to access it, for example directly from litellm, as in the sketch below.
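
A minimal litellm check, assuming the setup from the original post (a local Ollama server on port 11434 serving llama3.2); if this call fails, the problem is in the deployment rather than in pr-agent:

import litellm  # assumption: litellm is installed (pip install litellm)

# Call the local Ollama model directly through litellm, the same layer
# pr-agent uses under the hood. A successful one-word reply means the
# model is deployed and reachable.
response = litellm.completion(
    model="ollama/llama3.2",
    api_base="http://127.0.0.1:11434",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response.choices[0].message.content)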