Closed FernandoAurelius closed 2 months ago
@FernandoAurelius could you try setting OCO_TOKENS_MAX_INPUT and OCO_TOKENS_MAX_OUTPUT and let me know if it works, please?
Sure, but just to be sure, how many tokens should I set for max input and max output?
Try 10000 each
I set them to 10000 each, but got this error:
So I set them to 4096 and it worked!
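For anyone hitting the same error, the settings that resolved it in this thread can be applied like this (4096 is the value that worked here; 10000 produced a different error, and the right limit may vary by model):

```shell
# Workaround from this thread: set explicit token limits for opencommit.
# 4096 worked with gemini-1.5-flash; 10000 caused a different error.
oco config set OCO_TOKENS_MAX_INPUT=4096
oco config set OCO_TOKENS_MAX_OUTPUT=4096
```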
I'm happy that worked, but I had used an older version of opencommit (I think 3.10) and it worked properly, without needing this extra configuration. Anyway, it's fine, thank you for your help.
Opencommit Version
3.1.2
Node Version
22.7.0
NPM Version
10.8.2
What OS are you seeing the problem on?
Windows
What happened?
After the 3.1.1 update, which npm alerted me about yesterday, every time I try to run "oco" I get the following error:
✖ [GoogleGenerativeAI Error]: Error fetching from https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent: [400 Bad Request] Invalid value at 'generation_config.max_output_tokens' (TYPE_INT32), "undefined" [{"@type":"type.googleapis.com/google.rpc.BadRequest","fieldViolations":[{"field":"generation_config.max_output_tokens","description":"Invalid value at 'generation_config.max_output_tokens' (TYPE_INT32), \"undefined\""}]}].
I've already installed the new version of OpenCommit (3.1.2), but the problem continues.
Expected Behavior
I expected the software to work as usual, because it was working normally before the update. I don't use either LLAMA or OpenAI to generate the commit message; I found it very comfortable to work with Gemini, and it was the one I chose to use with OpenCommit.
Current Behavior
See "What happened?" above.
Possible Solution
I don't have one, unfortunately.
Steps to Reproduce
oco config set OCO_GEMINI_API_KEY="your_key_here"
oco config set OCO_AI_PROVIDER=gemini
oco config set OCO_MODEL=gemini-1.5-flash
git add .
oco
Relevant log output