jagadish-k opened 4 months ago
@jaroslaw-weber hi man, could you take a look please?
@jagadish-k could you try a different OCO_MODEL config? Right now it is set to gpt-3.5-turbo-16k, which is not mistral.
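For reference, the provider and model can also be set persistently rather than inline; a quick sketch, assuming a global install where the oco alias is available and the global config lives in ~/.opencommit:

oco config set OCO_AI_PROVIDER=ollama
oco config set OCO_MODEL=mistral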
Same for me even with OCO_MODEL set to mistral
❯ OCO_AI_PROVIDER="ollama" OCO_MODEL=mistral opencommit
┌  open-commit
│
◇  2 staged files:
   .zshrc
   Makefile
│
◇  📝 Commit message generated
│
└  ✖ local model issues. details: connect ECONNREFUSED ::1:11434
Hi there, I am also getting this same error. I think the docs should add more details on setting up an Ollama model with open-commit.
I have this bare-minimum config, and Ollama is running fine on port 11434 on localhost:
OCO_AI_PROVIDER='ollama'
OCO_DESCRIPTION=false
OCO_EMOJI=false
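For what it's worth, a quick way to check whether Ollama answers on IPv4 but not on IPv6 (which turned out to be the culprit, see below) is to curl both loopback addresses; the port and the "Ollama is running" reply are Ollama's defaults and may differ in your setup:

curl http://127.0.0.1:11434      # IPv4 loopback, should reply "Ollama is running"
curl "http://[::1]:11434"        # IPv6 loopback, connection refused if Ollama is not listening on ::1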
Guys, I have found a fix for this. oco is requesting the Ollama API on the IPv6 loopback address (::1), and by default Ollama does not listen on an IPv6 address. To fix the issue you can either run Ollama on IPv6 as well, or, as in my case, configure Ollama to listen on all interfaces by setting the OLLAMA_HOST='0.0.0.0' environment variable.
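A minimal sketch of that workaround, assuming you start Ollama from a shell (if you use the desktop app, the variable has to be set before the app launches):

export OLLAMA_HOST=0.0.0.0   # bind Ollama to all interfaces instead of the default 127.0.0.1
ollama serve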
@di-sukharev I can enhance the documentation by adding a guide for setting this up, if you like, and will submit a PR.
I hope this helps, thanks.
Opencommit Version
3.0.11
Node Version
18.15.0
NPM Version
9.5.0
What OS are you seeing the problem on?
Mac
What happened?
I am unable to use opencommit to generate a commit message for my staged files using a locally running Ollama.
I get the following error:
Expected Behavior
I expect opencommit to work with a locally running Ollama.
Current Behavior
I am running ollama with mistral in one terminal.
ollama run mistral
In the other terminal, where I have the staged files, I was able to curl to it. This is the config I have.
Possible Solution
No response
Steps to Reproduce
No response
Relevant log output