SmartManoj / Kevin

⚡ Kevin: Code Quick, Create Fast
MIT License

Problem with using Ollama/codellama as local LLM engine #11

Closed by kevin-support-bot[bot] 1 month ago

kevin-support-bot[bot] commented 2 months ago

https://github.com/OpenDevin/OpenDevin/pull/3196 Issue created by @HenrikBach1

kevin-support-bot[bot] commented 2 months ago

Duplicate of https://github.com/OpenDevin/OpenDevin/issues/2844#issuecomment-2213794500?

HenrikBach1 commented 2 months ago

It may well be a duplicate of OpenDevin#2844.

However, I'm not (yet) interested in setting up a local development environment to get things running.

SmartManoj commented 2 months ago

The easiest way to run OpenDevin is to open it in GitHub Codespaces.

HenrikBach1 commented 2 months ago

@SmartManoj Thank you for your hint.

But does that also include access to an Ollama model, and how or where is that documented?

SmartManoj commented 2 months ago

You can use ngrok to expose your local Ollama API, or you can use Groq's free API.
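
For illustration, here is a minimal Python sketch of the Ollama-over-ngrok option, using litellm (the routing library OpenDevin uses under the hood). The tunnel URL is a placeholder you would copy from the ngrok console:

```python
# Prerequisites, both assumed to be running locally:
#   ollama serve        # Ollama listening on its default port 11434
#   ngrok http 11434    # prints a public forwarding URL for that port
import litellm

# Placeholder: replace with the forwarding URL printed by ngrok.
OLLAMA_BASE_URL = "https://your-tunnel.ngrok-free.app"

response = litellm.completion(
    model="ollama/codellama",      # Ollama model, prefixed for litellm routing
    api_base=OLLAMA_BASE_URL,      # point litellm at the tunneled Ollama server
    messages=[{"role": "user", "content": "Write a hello-world in Python."}],
)
print(response.choices[0].message.content)
```

The same base URL and model name are what you would enter in the OpenDevin settings instead of a Groq key.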

HenrikBach1 commented 2 months ago

ok.

But I don't know (yet) how to connect those tools to OpenDevin, and admittedly I don't have much time to figure it out by reading their documentation and making guesses.

SmartManoj commented 2 months ago

You can get the Groq API key here.

Model name: groq/llama3-8b-8192

Sample settings:

[screenshot: sample settings]
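
As a quick sanity check before pointing OpenDevin at it, the same model name and key can be exercised directly with litellm; a minimal sketch (the key value is a placeholder read from the environment):

```python
import os

import litellm

# Placeholder: export GROQ_API_KEY with the key obtained from the Groq console.
groq_api_key = os.environ["GROQ_API_KEY"]

response = litellm.completion(
    model="groq/llama3-8b-8192",   # same model name as in the settings above
    api_key=groq_api_key,
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

If this prints a reply, the same model name and API key should work in the OpenDevin settings dialog.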

HenrikBach1 commented 2 months ago

That seems easy.

Thank you.

HenrikBach1 commented 2 months ago

But isn't opendevin:main their unstable integration or development branch? I need a (more) stable branch.

SmartManoj commented 2 months ago

You can use the latest release, v0.8.3.

github-actions[bot] commented 1 month ago

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

github-actions[bot] commented 1 month ago

This issue was closed because it has been stalled for over 30 days with no activity.