**Closed** by jhanca-robotecai 3 weeks ago
I've tested this with the Ollama chat model, and the questions do not mix. The problem is likely with the generate model, since it receives all of the previous questions and answers combined into a single prompt.
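To illustrate the difference described above, here is a minimal sketch (illustrative only, not the actual ai-core-gem code) contrasting the two context styles: the generate-style approach flattens every previous Q&A pair into one prompt string, which is where conversations can bleed together, while the chat-style approach keeps the history as structured, role-tagged messages in the format Ollama's chat endpoint expects.

```python
def generate_style_prompt(history, question):
    """Generate-style context: all previous Q&A pairs are flattened
    into a single prompt string, so separate conversations can mix."""
    lines = [f"Q: {q}\nA: {a}" for q, a in history]
    lines.append(f"Q: {question}\nA:")
    return "\n".join(lines)


def chat_style_messages(history, question):
    """Chat-style context: history is kept as role-tagged messages,
    matching the message format used by Ollama's chat endpoint."""
    messages = []
    for q, a in history:
        messages.append({"role": "user", "content": q})
        messages.append({"role": "assistant", "content": a})
    messages.append({"role": "user", "content": question})
    return messages


history = [("What is O3DE?", "An open-source 3D engine.")]
print(generate_style_prompt(history, "Does it support robotics?"))
print(chat_style_messages(history, "Does it support robotics?"))
```

With the chat-style structure, each exchange stays a distinct message, so the model can tell turns apart instead of parsing one concatenated blob.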
Closing the issue as resolved: an incorrect configuration had been selected.
Please rethink and implement the context system for Ollama llama3. See the example below:

![image](https://github.com/RobotecAI/ai-core-gem/assets/134940295/319a2eae-9aed-44ef-960d-71feecde0a83)