stitionai / devika

Devika is an Agentic AI Software Engineer that can understand high-level human instructions, break them down into steps, research relevant information, and write code to achieve the given objective. Devika aims to be a competitive open-source alternative to Devin by Cognition AI.

Mind adding generic openAI support? #45

Open Ph0rk0z opened 3 months ago

Ph0rk0z commented 3 months ago

A lot of local servers use the OpenAI API spec: tabbyAPI, textgen, llama.cpp server, etc. Is it possible to add support for that? ollama is very limiting and I want to use this with 70B+ models. I'm of course going to kludge the code to try, but you may be able to support many projects all at once by generalizing one API.
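For reference, these backends all expose the same chat-completions request shape, so a single client could drive any of them. A minimal sketch of that shared shape, assuming a local OpenAI-compatible endpoint (the URL, port, and model name below are placeholders, not Devika settings):

```python
# Hedged sketch: one request shape works against tabbyAPI, textgen,
# llama.cpp server, etc. The base URL below is an assumption about a
# local deployment, not a value taken from Devika's config.
import requests

BASE_URL = "http://127.0.0.1:5000/v1"  # local OpenAI-compatible server

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "local-model",  # many local servers ignore or loosely match this
        "messages": [{"role": "user", "content": "Write a hello-world in Python."}],
        "temperature": 0.7,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```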

Sandoz25 commented 3 months ago

Couldn't you use litellm to act as a passthrough?
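A rough sketch of what that passthrough could look like with litellm's `completion` call, assuming a local OpenAI-compatible server; the URL, key, and model name are placeholders:

```python
# Hedged sketch: litellm with an explicit api_base routes the call to a
# local OpenAI-compatible server instead of api.openai.com.
import litellm

response = litellm.completion(
    model="openai/local-model",           # "openai/" prefix: treat as OpenAI-compatible
    api_base="http://127.0.0.1:5000/v1",  # local tabbyAPI / textgen / llama.cpp endpoint
    api_key="not-needed",                 # most local servers accept any key
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```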

Ph0rk0z commented 3 months ago

It was easier to edit the code, but I got stuck on the search. Now someone has made it happen, so we can finally use this: https://github.com/stitionai/devika/pull/70

thiswillbeyourgithub commented 3 months ago

Not sure I understand. Why not use litellm? It's easier to handle models with litellm than having to deal with changing the URL. Mistral Large and its function calling seem promising.

Ph0rk0z commented 3 months ago

Because I'm not using a hosted API; I'm running my own model through textgen.

thiswillbeyourgithub commented 3 months ago

Apparently you can use litellm to speak the OpenAI API format and just change the URL so it points at the local textgen URL.

Ph0rk0z commented 3 months ago

It's 10x easier to just add the endpoint parameter to the OpenAI client initialization than to run another proxy server.
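For illustration, a hedged sketch of what "adding the endpoint parameter" might look like with the openai Python SDK (v1+); the base_url, api_key, and model name are assumptions about a local server, not Devika's actual code:

```python
# Hedged sketch: the openai SDK accepts a base_url, so the same client code
# can target a local textgen / tabbyAPI / llama.cpp server.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:5000/v1",  # local OpenAI-compatible endpoint
    api_key="not-needed",                 # dummy key; local servers rarely check it
)

completion = client.chat.completions.create(
    model="local-model",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(completion.choices[0].message.content)
```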