The recommended way of using GPT4All such that it manages prompt templates correctly is this:
```python
from gpt4all import GPT4All

model = GPT4All("Phi-3-mini-4k-instruct.Q4_0.gguf")

with model.chat_session():
    print(model.generate("2 fun names for a pelican"))
    print(model.generate("2 more"))

with model.chat_session():
    print("-- should have reset --")
    print(model.generate("2 more"))
```
Problem: there isn't currently a way to instantiate a new `chat_session()` with existing logged messages - which LLM needs, because it stores state between `llm` and `llm -c` calls in a SQLite database.
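That persistence pattern can be illustrated with a minimal sketch. The table layout and helper names below are illustrative only, not LLM's actual schema or API:

```python
import sqlite3

# Simplified sketch of how conversation state could persist between
# `llm` and `llm -c` invocations. The schema here is hypothetical;
# LLM's real database layout differs.
db = sqlite3.connect(":memory:")  # LLM uses an on-disk database
db.execute(
    "CREATE TABLE messages (conversation_id TEXT, role TEXT, content TEXT)"
)

def log_message(conversation_id, role, content):
    # Each turn gets logged as it happens
    db.execute(
        "INSERT INTO messages VALUES (?, ?, ?)",
        (conversation_id, role, content),
    )

def load_history(conversation_id):
    # What `llm -c` needs: the prior turns, to seed a fresh chat
    # session - exactly the hook chat_session() doesn't expose
    return db.execute(
        "SELECT role, content FROM messages WHERE conversation_id = ?",
        (conversation_id,),
    ).fetchall()

log_message("c1", "user", "2 fun names for a pelican")
log_message("c1", "assistant", "Percy and Gulliver")
history = load_history("c1")
print(history)
```

The missing piece is on the GPT4All side: a new `chat_session()` always starts empty, so there is nowhere to feed `history` back in.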
This is the cause of: #30
I filed an issue about this here: