run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Question]: How can I use SimpleChatStore to keep a conversation record permanently? #13314

Closed: union-cmd closed this issue 1 month ago

union-cmd commented 4 months ago

Question Validation

Question

from llama_index.core import Settings, SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.storage.chat_store import SimpleChatStore
from llama_index.core.storage.docstore import SimpleDocumentStore

chat_store = SimpleChatStore()

chat_store = SimpleChatStore.from_persist_path("./chat_stores.json")

memory = ChatMemoryBuffer.from_defaults(token_limit=5000, chat_store=chat_store, chat_store_key="code")

documents = SimpleDirectoryReader('./data_test/paul_graham').load_data()
Settings.chunk_size = 512
nodes = Settings.node_parser.get_nodes_from_documents(documents)
docstore = SimpleDocumentStore()
docstore.add_documents(nodes)
storage_context = StorageContext.from_defaults(docstore=docstore)
index = VectorStoreIndex(nodes, storage_context=storage_context)

chat_engine = index.as_chat_engine(
    chat_mode="condense_plus_context",
    memory=memory
)

response = chat_engine.chat("hello")

After I executed the above code, my chat history was not saved in the chat_stores.json file

dosubot[bot] commented 4 months ago

To ensure your conversation records are permanently saved in the chat_stores.json file with SimpleChatStore, you need to explicitly call the persist method after your chat interactions. Here's how you can do it:

chat_store.persist(persist_path="chat_stores.json")

This method should be called at points in your application where you want to save the chat history, such as after a conversation session ends or at regular intervals.
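As an illustration only, here is a minimal sketch (not from the llama_index API) of persisting after every turn. The helper name chat_and_save is made up, and it assumes the chat_engine and chat_store objects from the snippet above:

# Hypothetical helper (not part of llama_index): persist after each chat turn
# so the history in chat_stores.json stays current across restarts.
def chat_and_save(message: str, persist_path: str = "./chat_stores.json"):
    response = chat_engine.chat(message)           # run the chat turn as usual
    chat_store.persist(persist_path=persist_path)  # write the chat store to disk
    return response

chat_and_save("hello")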

Additionally, there's a critical detail to be aware of regarding the persist method implementation. If you're encountering issues with the chat history not being saved, it might be due to incorrect indentation in the persist method within the SimpleChatStore class. Ensure that the block of code responsible for writing to the file is correctly indented to execute outside of any conditional directory existence checks. This adjustment ensures that the chat store data is written to the chat_stores.json file as intended.

For more detailed implementation and to verify the correct structure, you might want to review the source code of the SimpleChatStore class. This will help you ensure that your chat history is being saved correctly.

#### Details

**Sources**

- [docs/docs/module_guides/storing/chat_stores.md](https://github.com/run-llama/llama_index/blob/main/docs/docs/module_guides/storing/chat_stores.md)
- [llama-index-core/llama_index/core/storage/chat_store/simple_chat_store.py](https://github.com/run-llama/llama_index/blob/main/llama-index-core/llama_index/core/storage/chat_store/simple_chat_store.py)
- [llama-index-legacy/llama_index/legacy/storage/chat_store/simple_chat_store.py](https://github.com/run-llama/llama_index/blob/main/llama-index-legacy/llama_index/legacy/storage/chat_store/simple_chat_store.py)
- [llama-index-core/llama_index/core/memory/chat_memory_buffer.py](https://github.com/run-llama/llama_index/blob/main/llama-index-core/llama_index/core/memory/chat_memory_buffer.py)
- [llama-index-legacy/llama_index/legacy/memory/chat_memory_buffer.py](https://github.com/run-llama/llama_index/blob/main/llama-index-legacy/llama_index/legacy/memory/chat_memory_buffer.py)

union-cmd commented 4 months ago

The data stored in chat_stores.json is as follows:

"{\"store\": {}, \"class_name\": \"SimpleChatStore\"}"

union-cmd commented 4 months ago

This was not saved successfully

dosubot[bot] commented 4 months ago

The chat_stores.json file not containing your conversation records appears to come down to how the store attribute of SimpleChatStore is serialized. To make sure the records are written out, serialize the store dictionary directly with json.dumps(self.store) before writing to the file, rather than relying on a potentially undefined self.json() method.

Here's how you can adjust the persist method to correctly handle the serialization:

# Note: this is a method of SimpleChatStore; it assumes module-level imports
# of os, json, fsspec, and typing.Optional.
def persist(
    self,
    persist_path: str = "chat_store.json",
    fs: Optional[fsspec.AbstractFileSystem] = None,
) -> None:
    """Persist the chat store to a file."""
    fs = fs or fsspec.filesystem("file")
    dirpath = os.path.dirname(persist_path)
    if not fs.exists(dirpath):
        fs.makedirs(dirpath)
    with fs.open(persist_path, "w") as f:
        f.write(json.dumps(self.store))  # Serialize the store dictionary correctly

This modification ensures that the store dictionary is properly converted to a JSON string and written to the specified file, addressing the issue with saving conversation records.
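As a quick sanity check (just a standard-library sketch, not part of llama_index), you can print the raw contents of the file after calling persist to see whether the store actually contains messages under your chat_store_key:

from pathlib import Path

# Inspect what persist actually wrote; a bare {"store": {}} means no messages
# were recorded under the chat_store_key before persisting.
print(Path("chat_stores.json").read_text())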

#### Details

**Sources**

- [llama-index-core/llama_index/core/storage/chat_store/simple_chat_store.py](https://github.com/run-llama/llama_index/blob/main/llama-index-core/llama_index/core/storage/chat_store/simple_chat_store.py)

logan-markewich commented 4 months ago

@union-cmd You didn't persist the chat store.

This works fine for me

from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.storage.chat_store import SimpleChatStore
from llama_index.core import Document, VectorStoreIndex

index = VectorStoreIndex.from_documents([Document.example()])

chat_store = SimpleChatStore()
memory = ChatMemoryBuffer.from_defaults(
    token_limit=5000, 
    chat_store=chat_store, 
    chat_store_key="user1"
)

chat_engine = index.as_chat_engine(
    chat_mode="condense_plus_context",
    memory=memory
)

response = chat_engine.chat("hello!")
chat_store.persist("chat_store.json")  # write the chat history to disk

# simulate a new session: reload the chat store from disk
chat_store = SimpleChatStore.from_persist_path("chat_store.json")
memory = ChatMemoryBuffer.from_defaults(
    token_limit=5000, 
    chat_store=chat_store, 
    chat_store_key="user1"
)

print(memory.get_all())