stitionai / devika

Devika is an Agentic AI Software Engineer that can understand high-level human instructions, break them down into steps, research relevant information, and write code to achieve the given objective. Devika aims to be a competitive open-source alternative to Devin by Cognition AI.

No ollama LLM found #373

Open suoko opened 4 months ago

suoko commented 4 months ago

I ran the docker compose up command and everything installed correctly. I entered the Ollama Docker container and installed llama2, but when I run Devika, no LLM is found for Ollama. Should I configure something? Or are only some LLMs supported? Starcoder is not seen either.

Thanks (screenshot attached)

cpAtor commented 4 months ago

Adding a reference to a similar existing issue: #300.

I am also facing the same issue.

heartsiddharth1 commented 4 months ago

Any update on this one, please? I am not able to select the local model.

ARajgor commented 4 months ago

If you are using the Ollama Docker container, then check the serve host of Ollama. I guess you have to change it from the default one.
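As a quick way to check that serve host, here is a minimal sketch (not part of Devika itself) that asks Ollama which models it can see. It assumes Ollama's default port 11434 and its standard /api/tags listing route; change OLLAMA_HOST to whatever endpoint Devika's config points at.

```python
# Minimal Ollama reachability check (illustrative sketch, not Devika code).
# Assumes Ollama's default REST port 11434 and its /api/tags route, which
# lists locally installed models. Adjust OLLAMA_HOST to match your setup,
# e.g. http://ollama:11434 when Ollama runs as a separate compose service,
# or http://host.docker.internal:11434 when it runs on the Docker host.
import requests

OLLAMA_HOST = "http://localhost:11434"

try:
    resp = requests.get(f"{OLLAMA_HOST}/api/tags", timeout=5)
    resp.raise_for_status()
    models = resp.json().get("models", [])
    if models:
        print("Ollama is reachable; models it reports:")
        for model in models:
            print(" -", model.get("name"))
    else:
        print("Ollama is reachable but reports no models; pull one first, e.g. ollama pull llama2")
except requests.RequestException as exc:
    print(f"Could not reach Ollama at {OLLAMA_HOST}: {exc}")
```

If this only succeeds with the container name or host.docker.internal rather than localhost, that is the serve host Devika's configuration needs to point at.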

ChanghongYangR commented 4 months ago

I have the same problem.

cpAtor commented 4 months ago

> If you are using the Ollama Docker container, then check the serve host of Ollama. I guess you have to change it from the default one.

The following worked for me:
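In outline, the change is to point Devika's Ollama endpoint at the Ollama container instead of localhost. A minimal sketch, assuming the endpoint is read from an OLLAMA key under [API_ENDPOINTS] in config.toml; the exact section and key names may differ in your copy, so match them against your own file.

```toml
# Sketch of the relevant config.toml entry (section and key names assumed;
# check your own config.toml). Inside the Devika container, "localhost"
# refers to the Devika container itself, so the endpoint must name the
# Ollama container or the Docker host instead.
[API_ENDPOINTS]
# Default, for Devika and Ollama running directly on the same machine:
# OLLAMA = "http://127.0.0.1:11434"

# When Ollama runs as a separate compose service (service name assumed to be "ollama"):
OLLAMA = "http://ollama:11434"

# When Ollama runs on the Docker host and only Devika is containerized:
# OLLAMA = "http://host.docker.internal:11434"
```

Restarting the Devika backend after the edit is usually needed for the new endpoint to be picked up.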

Ahmet0691 commented 4 months ago

Which language model in ollama works properly for this project?

ChanghongYangR commented 4 months ago

When I turned off my VPN connection, it worked.

kuendeee commented 3 months ago

Any updates here? I'm running Ollama but Devika still cannot recognize it.

(Screenshots attached: ollama serve running in an Administrator command prompt, and config.toml open in Visual Studio Code.)