Open LalaDK opened 2 weeks ago
Same here
Make sure you have the `mistral` model available locally, because ollama-commit uses the `mistral` model by default. Run:

ollama run mistral
Hello,
It seems that the issue is caused by the AI model not responding in a fixed JSON format.
We are planning to introduce structured output to solve this issue.
Unfortunately, due to my personal schedule, I will not be able to work on this quickly.
Thank you.
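For anyone curious what the structured-output fix could look like: Ollama's `/api/chat` endpoint accepts a JSON schema in the `format` field, which constrains the model's reply to valid JSON matching that schema. The sketch below is only an illustration, not ollama-commit's actual code; the commit-message schema and field names (`type`, `subject`, `body`) are my own assumptions.

```python
import json

# Hypothetical commit-message schema; the field names are illustrative,
# not ollama-commit's real schema.
COMMIT_SCHEMA = {
    "type": "object",
    "properties": {
        "type": {"type": "string"},
        "subject": {"type": "string"},
        "body": {"type": "string"},
    },
    "required": ["type", "subject"],
}

def build_request(diff: str) -> dict:
    """Build an Ollama /api/chat payload whose `format` field asks the
    server to constrain the reply to COMMIT_SCHEMA (structured output)."""
    return {
        "model": "mistral",
        "messages": [
            {"role": "user",
             "content": f"Write a commit message for this diff:\n{diff}"},
        ],
        "format": COMMIT_SCHEMA,  # server-side constrained decoding
        "stream": False,
    }

def parse_reply(raw: str) -> dict:
    """With structured output enabled, the reply content should be valid
    JSON, so json.loads no longer fails on free-form prose. We still
    verify the required fields defensively."""
    msg = json.loads(raw)
    missing = [k for k in COMMIT_SCHEMA["required"] if k not in msg]
    if missing:
        raise ValueError(f"model reply missing fields: {missing}")
    return msg
```

Sending `build_request(...)` to a running Ollama instance at `http://localhost:11434/api/chat` (e.g. with `requests.post`) should then yield a reply that `parse_reply` can consume without the "not valid JSON" failures reported in this issue.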
Hello.
I'd like to try out your tool, but I get this error message:
I've tried with different changes, but I still get the error.