jmont-dev / ollama-hpp

Modern, Header-only C++ bindings for the Ollama API.
MIT License

Crash when using context generated by another model. #30

Open JG-Adams opened 1 week ago

JG-Adams commented 1 week ago

I've been using generate() and then passing the returned context back in to give the AI memory. Then I tried seeing what happens when I switch to a different AI model while reusing that context, and an assertion failed with this message:

Assertion failed: it != m_data.m_value.object->end(), file F:/MyStudio/Technology/JG_SDL/engine/OllamaHTTP/json.hpp, line 21472

I don't know whether this is even supposed to be possible, but I was able to do the same thing in WebUI and it worked fine there. Is this supported? If not, is there a way to do it?
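For reference, a minimal reproduction sketch of what I'm doing. The model names are placeholders, and the generate(model, prompt, context) overload is assumed from the ollama-hpp README examples:

```cpp
// Minimal sketch of the crashing workflow (model names are assumptions).
#include "ollama.hpp"
#include <iostream>

int main() {
    // First generation; the response carries the "context" tokens
    // returned by the Ollama server.
    ollama::response first = ollama::generate("llama3", "Remember that my name is Adam.");
    std::cout << first << std::endl;

    // Reusing that context with the same model works fine.
    ollama::response same = ollama::generate("llama3", "What is my name?", first);
    std::cout << same << std::endl;

    // Reusing it with a *different* model is where the assertion fires
    // inside json.hpp (it != m_data.m_value.object->end()).
    ollama::response crash = ollama::generate("mistral", "What is my name?", first);
    std::cout << crash << std::endl;
}
```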

JG-Adams commented 1 week ago

I think this is supposed to work, so a fix would be very welcome. It does not crash when a model reuses context that it generated itself; it only crashes when you switch models and use context generated by a different model. Maybe it has something to do with how the context data is passed in the JSON? Maybe it was expecting it to be a certain length? I'm not sure. A rough sketch of what I mean is below.
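To illustrate the speculation about the JSON, here is roughly what I understand the /api/generate request body to look like when context is reused, and the kind of lookup that could trip the assertion if a field is missing. This is only a sketch; the field names follow the Ollama API docs, and how ollama-hpp actually builds and reads the JSON may differ:

```cpp
// Sketch only: shows the request shape and a guarded field lookup.
// "json.hpp" is the nlohmann::json single header bundled with ollama-hpp.
#include "json.hpp"
#include <iostream>

int main() {
    // The "context" field is an array of token IDs returned by the previous
    // generation, and those IDs come from the model that produced them.
    nlohmann::json request;
    request["model"]   = "mistral";                  // the newly selected model
    request["prompt"]  = "What is my name?";
    request["context"] = {123, 456, 789};            // tokens from the old model
    std::cout << request.dump(2) << std::endl;

    // A lookup on a reply that lacks the expected key is the kind of access
    // that triggers the json.hpp assertion; checking first avoids it.
    nlohmann::json reply = {{"response", "..."}};    // hypothetical reply with no "context"
    if (reply.contains("context")) {
        auto ctx = reply["context"];                 // safe only after the check
        std::cout << ctx.dump() << std::endl;
    }
}
```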