ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

ollama-python equivalent of /clear and context summarization in a chat #191

Closed JiahuiKChen closed 2 days ago

JiahuiKChen commented 1 week ago

When using ollama run <model>, there's a /clear command to "clear session context". How can this be done in the ollama-python library? I can't figure out if it's possible when looking at client.py.

My use case is that I want to chat with the same model in a script but clear the context occasionally.

I'm also interested in how to summarize older conversations to enable longer history with less context, mentioned in this example. Would I have to explicitly ask the model to summarize its context so far, clear the context, then prompt the model with the saved summary -- or is there a built-in way for the model to summarize its context so far and retain the summary, while clearing the rest of its context?

JICA98 commented 1 week ago

+1

militu commented 5 days ago

For the first question, it seems this is handled natively; see https://github.com/ollama/ollama/issues/4564

pdevine commented 2 days ago

@JiahuiKChen You have to handle this yourself in the client. You can just empty the messages list, which is effectively what /clear does. You'll then have to repopulate the system message (although we have been looking at doing this automatically in the future).
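A minimal sketch of what this looks like in practice: keep your own messages list, and reset it to just the system message when you want a /clear. The model name `llama3` and the system prompt here are placeholders, not anything the library mandates.

```python
# Maintain the conversation history yourself; "clearing" is just resetting
# this list back to the system message.
SYSTEM_MESSAGE = {"role": "system", "content": "You are a helpful assistant."}

messages = [SYSTEM_MESSAGE.copy()]

def clear_context(messages):
    """Equivalent of /clear: drop all history, then repopulate the system message."""
    messages.clear()
    messages.append(SYSTEM_MESSAGE.copy())

# Usage with the client (assumes a local Ollama server is running):
#
#   import ollama
#   messages.append({"role": "user", "content": "Hello!"})
#   reply = ollama.chat(model="llama3", messages=messages)
#   messages.append(reply["message"])
#   ...
#   clear_context(messages)   # back to just the system message
```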

You definitely can ask the LLM to summarize the current conversation and then substitute that into the messages if you want the LLM to not "lose its brain". We haven't added that capability to Ollama yet.
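The summarize-then-substitute approach could be sketched like this. The chat function is passed in as a parameter so the history logic is testable; in real use you'd pass `ollama.chat`, and the `llama3` model name is an assumption you'd replace with your own.

```python
def summarize_and_clear(messages, chat_fn, model="llama3"):
    """Replace the conversation history with a model-written summary.

    chat_fn is called like ollama.chat(model=..., messages=...) and is
    expected to return a mapping with a "message" entry containing
    {"role": ..., "content": ...}.
    """
    prompt = {
        "role": "user",
        "content": "Summarize our conversation so far in a few sentences.",
    }
    reply = chat_fn(model=model, messages=messages + [prompt])
    summary = reply["message"]["content"]

    # Keep the system message(s), drop everything else, and seed the new
    # context with the summary so the model retains the gist.
    system = [m for m in messages if m["role"] == "system"]
    messages[:] = system + [
        {"role": "user", "content": f"Summary of our conversation so far: {summary}"}
    ]

# Usage (assuming a running Ollama server):
#   import ollama
#   summarize_and_clear(messages, ollama.chat)
```

Whether the summary goes in as a user message (as here) or gets folded into the system prompt is a design choice; either way the model only sees the condensed history on the next turn.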

I'll go ahead and close the issue.