di-sukharev / opencommit

Auto-generate impressive commits with AI in 1 second πŸ€―πŸ”«
https://www.npmjs.com/package/opencommit
MIT License
5.67k stars 292 forks

[Bug]: Unable to use oco with ollama running mistral #310

Open jagadish-k opened 4 months ago

jagadish-k commented 4 months ago

Opencommit Version

3.0.11

Node Version

18.15.0

NPM Version

9.5.0

What OS are you seeing the problem on?

Mac

What happened?

I am unable to use opencommit to generate a commit message for my staged files using a locally running ollama.

I get the following error:

 βœ– local model issues. details: connect ECONNREFUSED ::1:11434

Expected Behavior

I expect opencommit to work with a locally running ollama.

Current Behavior

I am running ollama with mistral in one terminal: ollama run mistral. In the other terminal, where I have the staged files, I was able to curl to it:

curl http://127.0.0.1:11434
Ollama is running%
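As a diagnostic sketch (bracketed IPv6 literals are standard curl syntax), the same probe can be pointed at the IPv6 loopback that the error message reports:

```shell
# 127.0.0.1 (IPv4) answers, so check the IPv6 loopback that
# opencommit's error mentions; curl needs brackets for IPv6 literals:
curl http://[::1]:11434
# "Connection refused" here, while the IPv4 curl succeeds, would mean
# Ollama is listening on IPv4 only.
```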

This is the config I have

OCO_OPENAI_API_KEY=undefined
OCO_TOKENS_MAX_INPUT=undefined
OCO_TOKENS_MAX_OUTPUT=undefined
OCO_OPENAI_BASE_PATH=undefined
OCO_DESCRIPTION=false
OCO_EMOJI=false
OCO_MODEL=gpt-3.5-turbo-16k
OCO_LANGUAGE=en
OCO_MESSAGE_TEMPLATE_PLACEHOLDER=$msg
OCO_PROMPT_MODULE=conventional-commit
OCO_AI_PROVIDER=ollama

Possible Solution

No response

Steps to Reproduce

No response

Relevant log output

> OCO_AI_PROVIDER='ollama' opencommit
β”Œ  open-commit
β”‚
β—‡  30 staged files:
...
β—‡  πŸ“ Commit message generated
β”‚
β””  βœ– local model issues. details: connect ECONNREFUSED ::1:11434
di-sukharev commented 4 months ago

@jaroslaw-weber hi man, could you take a look please?

di-sukharev commented 4 months ago

@jagadish-k could you try with a different OCO_MODEL config? Right now it's set to gpt-3.5-turbo-16k, which is not mistral.
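For reference, opencommit's documented config command would make that change persistent (using mistral as the model name, matching what ollama serves in this thread):

```shell
# Switch opencommit's configured model from gpt-3.5-turbo-16k
# to the locally served mistral model:
oco config set OCO_MODEL=mistral
```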

SebastienElet commented 3 months ago

Same for me, even with OCO_MODEL set to mistral:

❯ OCO_AI_PROVIDER="ollama" OCO_MODEL=mistral opencommit
β”Œ  open-commit
β”‚
β—‡  2 staged files:
  .zshrc
  Makefile
β”‚
β—‡  πŸ“ Commit message generated
β”‚
β””  βœ– local model issues. details: connect ECONNREFUSED ::1:11434
Abir-Tx commented 3 months ago

Hi there, I am also getting this same error. I think the docs should add more details on setting up an ollama model with open-commit.

I have this bare-minimum config, and ollama is running fine on port 11434 on localhost:

OCO_AI_PROVIDER='ollama'
OCO_DESCRIPTION=false
OCO_EMOJI=false
Abir-Tx commented 3 months ago

Guys, I have found a fix for this. oco is requesting the ollama API on the IPv6 loopback (::1), and by default ollama does not listen on an IPv6 address.

So to fix the issue you can run ollama on IPv6 too, or, as in my case, configure ollama to listen on all interfaces by setting the OLLAMA_HOST='0.0.0.0' env variable.
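Spelled out, the workaround looks like this (OLLAMA_HOST is Ollama's documented environment variable for the server's bind address):

```shell
# Workaround from this thread: bind the Ollama server to all
# interfaces instead of the default loopback, then restart it:
OLLAMA_HOST='0.0.0.0' ollama serve
```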

@di-sukharev I can enhance the documentation by adding a guide for setting this up; if you like, I will submit a PR.

I hope this helps, thanks!