[Open] jagadish-k opened this issue 8 months ago
@jaroslaw-weber hi man, could you take a look please?
@jagadish-k could you try another OCO_MODEL config? Right now it's set to gpt-3.5-turbo-16k, which is not mistral.
Same for me, even with OCO_MODEL set to mistral:
❯ OCO_AI_PROVIDER="ollama" OCO_MODEL=mistral opencommit
┌ open-commit
│
◇ 2 staged files:
.zshrc
Makefile
│
◇ 📝 Commit message generated
│
└ ✖ local model issues. details: connect ECONNREFUSED ::1:11434
Hi there, I am also getting this same error. I think the docs should include more details on setting up an ollama model with open-commit.
I have this bare-minimum config, and ollama is running fine on port 11434 on localhost:
OCO_AI_PROVIDER='ollama'
OCO_DESCRIPTION=false
OCO_EMOJI=false
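For reference, a quick sanity check that ollama is actually reachable (a minimal sketch; by default ollama answers on IPv4 localhost, port 11434):

curl http://127.0.0.1:11434
# expected to print something like: Ollama is running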
Guys, I have found a fix for this. You see, oco is requesting the ollama API on the IPv6 loopback address, which is ::1, and by default ollama does not listen on an IPv6 address. So to fix the issue you can run ollama on IPv6 too, or, as in my case, configure ollama to listen on all interfaces by setting the OLLAMA_HOST='0.0.0.0' env variable.
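A minimal sketch of the diagnosis and the fix, assuming ollama was started manually from a shell (the restart steps differ depending on how ollama is installed):

# default ollama answers on the IPv4 loopback...
curl http://127.0.0.1:11434
# ...but not on the IPv6 loopback that oco is hitting
curl http://[::1]:11434

# make ollama listen on all interfaces, then restart it
export OLLAMA_HOST='0.0.0.0'
ollama serve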
@di-sukharev I can enhance the documentation on this by adding a guide for setting this up; if you like, I will submit a PR.
I hope this helps, thanks!
@Abir-Tx please do enhance the documentation when you have time for a PR and thank you for the help ❤️
I'm currently trying to get opencommit to run in a GitHub Actions workflow; where would I set this variable?
Added export OLLAMA_HOST='0.0.0.0', then closed ollama, re-opened it, and it worked.
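For managed installs, the restart step is different; these are the usual steps from the ollama docs at the time of writing (a sketch, so double-check the current FAQ):

# macOS (Ollama.app): set the variable for launchd, then restart the app
launchctl setenv OLLAMA_HOST "0.0.0.0"

# Linux (systemd): add Environment="OLLAMA_HOST=0.0.0.0" under [Service], then restart
systemctl edit ollama.service
systemctl daemon-reload
systemctl restart ollama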
@victorbiga @Abir-Tx thank you guys, do you want to make a PR for the fix?
Yeah I would love to. I will try to submit a PR with enhanced documentation as soon as I get some time. Thank you
@Abir-Tx nice, let me know if you need any help ❤️
Thank you @di-sukharev. Give me some time; I will submit the PR soon. A bit busy right now.
Problem: I am unable to use opencommit to generate a commit message for my staged files using locally running ollama.
BUG: ✖ Failed to generate the commit message Error: Ollama provider error: Invalid URL
Opencommit Version: 3.2.2
Node Version: 20.14.0
NPM Version: 10.8.3
Local Machine OS Version: Windows 11 (23H2)
Ollama Model: mistral
SOLUTION:
Opencommit Version: 3.0.11
Node Version: 18.15.0
NPM Version: 9.5.0
What OS are you seeing the problem on?
Mac
What happened?
I am unable to use opencommit to generate a commit message for my staged files using locally running ollama.
I get the following error:
Expected Behavior
I expect opencommit to work with locally running ollama.
Current Behavior
I am running ollama with mistral in one terminal:
ollama run mistral
In the other terminal where I have staged files, I was able to curl to it. This is the config I have:
Possible Solution
No response
Steps to Reproduce
No response
Relevant log output