abonetee opened 3 months ago
You need to install Ollama, then run `ollama pull mistral-nemo` in your terminal.
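If it helps, here's a minimal Python sketch (assuming the default Ollama endpoint at `http://localhost:11434` and the `requests` package) that checks whether `mistral-nemo` is already in the local store and pulls it if not:

```python
# Minimal sketch: check the local Ollama server for mistral-nemo and pull it if missing.
# Assumes Ollama is installed and running on the default port (11434) and that the
# `requests` package is available (pip install requests).
import json
import requests

OLLAMA = "http://localhost:11434"
MODEL = "mistral-nemo"

# List models already downloaded to the local Ollama store.
tags = requests.get(f"{OLLAMA}/api/tags").json()
local_models = [m["name"] for m in tags.get("models", [])]

if not any(name.startswith(MODEL) for name in local_models):
    print(f"{MODEL} not found locally, pulling it (this downloads several GB)...")
    # /api/pull streams progress as one JSON object per line.
    with requests.post(f"{OLLAMA}/api/pull", json={"name": MODEL}, stream=True) as resp:
        for line in resp.iter_lines():
            if line:
                print(json.loads(line).get("status", ""))
else:
    print(f"{MODEL} is already available: {local_models}")
```

Once the pull finishes, running `python3 ollama-eng.py` should be able to find the model.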
I eventually did so after researching a lot of YouTube videos. The other issue I've started facing is that when I run Ollama Engineer with my local LLM, it doesn't use the tools the way Claude Engineer would. Is there a tweak for this?
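I can't say how `ollama-eng.py` wires up its tools internally, but for reference, Ollama's `/api/chat` endpoint does accept a `tools` list for models with function-calling support, and mistral-nemo is one of them. Here is a rough sketch of what a tool-enabled request looks like, again assuming the default local endpoint; `get_current_weather` is just a made-up example tool:

```python
# Rough sketch of Ollama tool calling via the REST API (POST /api/chat).
# Assumes a recent Ollama version with tool support running locally;
# `get_current_weather` is a hypothetical tool used only for illustration.
import requests

payload = {
    "model": "mistral-nemo",
    "stream": False,
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string", "description": "City name"},
                    },
                    "required": ["city"],
                },
            },
        }
    ],
}

resp = requests.post("http://localhost:11434/api/chat", json=payload).json()
# If the model decided to call a tool, the call shows up here instead of plain text.
print(resp["message"].get("tool_calls") or resp["message"]["content"])
```

Whether Ollama Engineer exposes its tools this way is a separate question; this only shows the raw interface the local model can work with.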
After I run `python3 ollama-eng.py`, I see Ollama Engineer active, but once I ask the LLM a question I get an `API Error: model "mistral-nemo" not found, try pulling it first` error.
I even went and got an API key from Mistral, do I need to create another .env file and put the API key in there?
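For what it's worth, Ollama serves the model entirely on your local machine, so no Mistral API key should be needed for this path; the "not found" error just means the model hasn't been pulled into the local Ollama store yet. Whether `ollama-eng.py` reads anything else from a `.env` file I can't say, but this quick sanity check (assuming the default local endpoint and that the model has been pulled) runs without any key at all:

```python
# Quick end-to-end sanity check: no API key or extra .env entry should be needed,
# since Ollama serves the model locally. Assumes mistral-nemo has been pulled
# and the Ollama server is running on the default port.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "mistral-nemo", "prompt": "Say hello in one sentence.", "stream": False},
)
resp.raise_for_status()  # A 404 here usually means the model still isn't pulled.
print(resp.json()["response"])
```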