di-sukharev / opencommit

just a GPT wrapper for git — generate commit messages by an LLM in 1 sec — works best with Claude 3.5 — supports local models too
https://www.npmjs.com/package/opencommit
MIT License
6.19k stars · 329 forks

[Bug]: Unable to use oco with ollama running mistral #310

Open jagadish-k opened 8 months ago

jagadish-k commented 8 months ago

Opencommit Version

3.0.11

Node Version

18.15.0

NPM Version

9.5.0

What OS are you seeing the problem on?

Mac

What happened?

I am unable to use opencommit to generate commit message for my staged files using locally running ollama.

I get the following error :

 ✖ local model issues. details: connect ECONNREFUSED ::1:11434

Expected Behavior

I expect opencommit to work with locally running ollama

Current Behavior

I am running ollama with mistral in one terminal (ollama run mistral). In the other terminal, where I have the staged files, I was able to curl it:

curl http://127.0.0.1:11434
Ollama is running
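
Note that the curl above targets the IPv4 loopback, while the error message mentions ::1, the IPv6 loopback. A quick way to check both (a diagnostic sketch; the IPv6 form uses curl's bracketed-address syntax):

# works when ollama listens on the IPv4 loopback
curl http://127.0.0.1:11434

# reproduces the ECONNREFUSED when ollama does not listen on IPv6
curl "http://[::1]:11434"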

This is the config I have

OCO_OPENAI_API_KEY=undefined
OCO_TOKENS_MAX_INPUT=undefined
OCO_TOKENS_MAX_OUTPUT=undefined
OCO_OPENAI_BASE_PATH=undefined
OCO_DESCRIPTION=false
OCO_EMOJI=false
OCO_MODEL=gpt-3.5-turbo-16k
OCO_LANGUAGE=en
OCO_MESSAGE_TEMPLATE_PLACEHOLDER=$msg
OCO_PROMPT_MODULE=conventional-commit
OCO_AI_PROVIDER=ollama

Possible Solution

No response

Steps to Reproduce

No response

Relevant log output

> OCO_AI_PROVIDER='ollama' opencommit
┌  open-commit
│
◇  30 staged files:
...
◇  📝 Commit message generated
│
└  ✖ local model issues. details: connect ECONNREFUSED ::1:11434
di-sukharev commented 8 months ago

@jaroslaw-weber hi man, could you take a look please?

di-sukharev commented 8 months ago

@jagadish-k could you try a different OCO_MODEL config? It is currently set to gpt-3.5-turbo-16k, which is not mistral.
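
For reference, switching to the ollama provider and model might look like this (a sketch, assuming the oco config set command documented in the README):

# point opencommit at the local ollama provider and the mistral model
oco config set OCO_AI_PROVIDER=ollama
oco config set OCO_MODEL=mistral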

SebastienElet commented 8 months ago

Same for me, even with OCO_MODEL set to mistral:

❯ OCO_AI_PROVIDER="ollama" OCO_MODEL=mistral opencommit
┌  open-commit
│
◇  2 staged files:
  .zshrc
  Makefile
│
◇  📝 Commit message generated
│
└  ✖ local model issues. details: connect ECONNREFUSED ::1:11434
Abir-Tx commented 7 months ago

Hi there, I am also getting the same error. I think the docs should add some more details on setting up an ollama model with open-commit.

I have this bare-minimum config, and ollama is running fine on port 11434 on localhost:

OCO_AI_PROVIDER='ollama'
OCO_DESCRIPTION=false
OCO_EMOJI=false
Abir-Tx commented 7 months ago

Guys, I have found a fix for this. opencommit is requesting the ollama API on the IPv6 loopback address (::1), and by default ollama does not listen on IPv6.

So to fix the issue you can run ollama on IPv6 too, or, as in my case, configure ollama to listen on all interfaces by setting the OLLAMA_HOST='0.0.0.0' environment variable.
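
A minimal sketch of that fix, assuming ollama is started from a shell where the variable is exported:

# bind ollama to all interfaces instead of only the IPv4 loopback
export OLLAMA_HOST='0.0.0.0'
ollama serve

# verify it now answers on the IPv6 loopback that opencommit dials
curl "http://[::1]:11434"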

@di-sukharev I can enhance the documentation by adding a guide for setting this up; if you like, I will submit a PR.

I hope this helps, thanks

di-sukharev commented 3 months ago

@Abir-Tx please do enhance the documentation when you have time for a PR and thank you for the help ❤️

Paficent commented 3 months ago

> Guys, I have found a fix for this. [...] configure ollama to listen on all interfaces by setting the OLLAMA_HOST='0.0.0.0' environment variable.

I'm currently trying to get opencommit to run in a GitHub Actions workflow; where would I set this variable?
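
(A sketch of one possible place, assuming the workflow starts ollama itself inside a run step; the waits and names are illustrative, not a verified setup:)

# inside a workflow step's run block
export OLLAMA_HOST='0.0.0.0'    # make ollama listen on all interfaces
ollama serve &                  # start the server in the background
sleep 5                         # crude wait for the server to come up
OCO_AI_PROVIDER='ollama' OCO_MODEL='mistral' npx opencommit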

victorbiga commented 3 months ago

> Guys, I have found a fix for this. [...] configure ollama to listen on all interfaces by setting the OLLAMA_HOST='0.0.0.0' environment variable.

Added export OLLAMA_HOST='0.0.0.0', then closed ollama, re-opened it, and it worked.
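
Worth noting for Mac users: if ollama runs as the macOS menu-bar app rather than from a shell, exporting the variable in a terminal does not reach the app; if I recall the ollama FAQ correctly, it is set with launchctl before restarting the app:

# set the variable for GUI apps on macOS, then restart the Ollama app
launchctl setenv OLLAMA_HOST "0.0.0.0"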

di-sukharev commented 3 months ago

@victorbiga @Abir-Tx thank you guys, do you want to make a PR for the fix?

Abir-Tx commented 3 months ago

> @victorbiga @Abir-Tx thank you guys, do you want to make a PR for the fix?

Yeah I would love to. I will try to submit a PR with enhanced documentation as soon as I get some time. Thank you

di-sukharev commented 2 months ago

@Abir-Tx nice, let me know if you need any help ❤️

Abir-Tx commented 2 months ago

Thank you @di-sukharev. Give me some time; I will submit the PR soon. A bit busy right now.

KNIGHTCORE47 commented 2 months ago

Problem: I am unable to use opencommit to generate a commit message for my staged files using locally running ollama.

BUG: ✖ Failed to generate the commit message Error: Ollama provider error: Invalid URL

Opencommit Version: 3.2.2

Node Version: 20.14.0

NPM Version: 10.8.3

Local Machine OS Version: Windows 11 (23H2)

Ollama Model: mistral

SOLUTION: