run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: TypeError when saving message to ChatStore if return_direct=True was used in QueryEngineTool #16405

Closed · MarkHmnv closed this issue 2 weeks ago

MarkHmnv commented 2 weeks ago

Bug Description

This error occurs in the finalize_task method of the OpenAIAgentWorker class (used internally by OpenAIAgent) when it tries to save messages to the Azure chat store. While serializing the metadata describing the tools used during the agent's run, it raises TypeError: 'MockValSer' object cannot be converted to 'SchemaSerializer'. By and large, the error appears to come from a bug inside AzureChatStore. Version of llama-index-storage-chat-store-azure: 0.2.0

Version

0.11.16

Steps to Reproduce

from llama_index.core import VectorStoreIndex
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.tools import QueryEngineTool
from llama_index.agent.openai import OpenAIAgent
from llama_index.storage.chat_store.azure import AzureChatStore

index = VectorStoreIndex.from_documents([<your docs>])

tool = QueryEngineTool.from_defaults(
    name='name',
    query_engine=index.as_query_engine(),
    description='description',
    return_direct=True,
)

chat_store = AzureChatStore.from_account_and_key(
    account_name='name',
    account_key='key',
)

memory = ChatMemoryBuffer.from_defaults(
    token_limit=5000,
    chat_store=chat_store,
    chat_store_key='1',
)

agent = OpenAIAgent.from_tools(
    [tool],
    memory=memory,
)

await agent.achat('tool-related query')

Relevant Logs/Tracebacks

2024-10-07 16:57:44,320 - main - ERROR - Traceback (most recent call last):
  File "...\.venv\Lib\site-packages\botbuilder\core\bot_adapter.py", line 174, in run_pipeline
    return await self._middleware.receive_activity_with_status(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\botbuilder\core\middleware_set.py", line 69, in receive_activity_with_status
    return await self.receive_activity_internal(context, callback)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\botbuilder\core\middleware_set.py", line 79, in receive_activity_internal
    return await callback(context)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\botbuilder\core\activity_handler.py", line 70, in on_turn
    await self.on_message_activity(turn_context)
  File "......\common\azure_message_logger.py", line 29, in wrapper
    response, rag_message = await func(self, turn_context, *args, **kwargs)
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "......\bot\hr_bot.py", line 55, in on_message_activity
    response = await self.chat_service.agent_chat(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "......\common\chat_service.py", line 53, in agent_chat
    return await agent.achat(query)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 353, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\llama_index\core\callbacks\utils.py", line 56, in async_wrapper
    return await func(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\llama_index\core\agent\runner\base.py", line 672, in achat
    chat_response = await self._achat(
                    ^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 353, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\llama_index\core\agent\runner\base.py", line 625, in _achat
    result = self.finalize_response(
             ^^^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 307, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\llama_index\core\agent\runner\base.py", line 549, in finalize_response
    self.agent_worker.finalize_task(self.state.get_task(task_id))
  File "...\.venv\Lib\site-packages\llama_index\core\instrumentation\dispatcher.py", line 307, in wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\llama_index\agent\openai\step.py", line 784, in finalize_task
    task.memory.put_messages(task.extra_state["new_memory"].get_all())
  File "...\.venv\Lib\site-packages\llama_index\core\memory\types.py", line 51, in put_messages
    self.put(message)
  File "...\.venv\Lib\site-packages\llama_index\core\memory\types.py", line 99, in put
    self.chat_store.add_message(self.chat_store_key, message)
  File "...\.venv\Lib\site-packages\llama_index\storage\chat_store\azure\base.py", line 237, in add_message
    **message.dict(),
      ^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\llama_index\core\base\llms\types.py", line 110, in dict
    return self.model_dump(**kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\llama_index\core\base\llms\types.py", line 114, in model_dump
    msg = super().model_dump(**kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "...\.venv\Lib\site-packages\pydantic\main.py", line 390, in model_dump
    return self.__pydantic_serializer__.to_python(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: 'MockValSer' object cannot be converted to 'SchemaSerializer'
dosubot[bot] commented 2 weeks ago

The error you're encountering, TypeError: 'MockValSer' object cannot be converted to 'SchemaSerializer', seems to be related to the serialization process in the AzureChatStore when using the finalize_task function with return_direct=True in QueryEngineTool. This issue might be due to a bug in the llama-index-storage-chat-store-azure version 0.2.0, which affects how messages are serialized and stored.

The return_direct=True parameter in QueryEngineTool ensures that the response from the tool is returned directly, without being interpreted and rewritten by the agent. This can affect the execution flow and how messages are handled in the finalize_task function [1].

To address this issue, you might want to check if there are any updates or patches available for the llama-index-storage-chat-store-azure package that address this serialization problem. Additionally, reviewing the implementation of the MockValSer and SchemaSerializer objects in your code or dependencies might provide insights into resolving the type conversion issue.
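Until a fix lands, one possible stopgap is to serialize the message defensively before it reaches the chat store. The helper below is only a sketch: `safe_message_dict` and its fallback field names are illustrative assumptions, not part of the llama_index API.

```python
# Hypothetical helper: try pydantic serialization first, and fall back to
# pulling the common ChatMessage fields by hand when the serializer raises
# (e.g. the MockValSer TypeError from this issue).

def safe_message_dict(message):
    """Serialize a chat message, tolerating broken pydantic serializers."""
    try:
        return message.model_dump()
    except TypeError:
        # Manual fallback; field names assumed from ChatMessage's shape.
        return {
            "role": str(getattr(message, "role", "user")),
            "content": getattr(message, "content", None),
            "additional_kwargs": dict(getattr(message, "additional_kwargs", {}) or {}),
        }
```

A wrapper like this could be applied in a custom chat store subclass before the row is written, at the cost of losing any extra fields the manual fallback does not copy.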

To continue talking to Dosu, mention @dosu.

logan-markewich commented 2 weeks ago

The issue is coming from calling message.dict() on a message in the chat history (likely a response from the LLM). This is unrelated to the chat store being used tbh

Seems like an issue with the serialization logic in general with pydantic v2
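For context on the v1/v2 split, here is a minimal illustration with plain pydantic (deliberately unrelated to llama_index's own models): in pydantic v2, .dict() is a deprecated v1-style shim that delegates to model_dump(), so a broken serializer surfaces through both entry points.

```python
from pydantic import BaseModel

class Msg(BaseModel):
    role: str
    content: str

m = Msg(role="assistant", content="hi")

# v2-native serialization API
print(m.model_dump())  # {'role': 'assistant', 'content': 'hi'}

# .dict() still works but is a deprecated v1 shim that calls model_dump()
# under the hood, so it fails in exactly the same cases (with an extra
# DeprecationWarning on top).
```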

logan-markewich commented 2 weeks ago

It's a long-standing issue it seems 🤔 I guess the switch to full pydantic v2 support brought this on

https://github.com/pydantic/pydantic/issues/7713
https://github.com/openai/openai-python/issues/1306