Closed gich2009 closed 6 months ago
Thank you for bringing this issue to our attention. It seems that the ChatCompletionMessageToolCall class is not currently designed to be JSON serializable.

As a workaround, you could implement a to_dict method in the ChatCompletionMessageToolCall class that converts the object into a dictionary that can be serialized. Here's a general example of how you might implement this:
```python
class ChatCompletionMessageToolCall:
    # ... existing methods and properties ...

    def to_dict(self):
        return {
            'property1': self.property1,
            'property2': self.property2,
            # ... and so on for all properties that need to be serialized ...
        }
```
Then, when you need to serialize an instance of ChatCompletionMessageToolCall, you can call this method and pass the result to json.dumps:
```python
tool_call = ChatCompletionMessageToolCall()
serializable_dict = tool_call.to_dict()
json_str = json.dumps(serializable_dict)
```
To load the object back from JSON, you would need to implement a corresponding method that creates a ChatCompletionMessageToolCall instance from a dictionary. This could be a from_dict class method:
```python
class ChatCompletionMessageToolCall:
    # ... existing methods and properties ...

    @classmethod
    def from_dict(cls, data):
        instance = cls()
        instance.property1 = data['property1']
        instance.property2 = data['property2']
        # ... and so on for all properties ...
        return instance
```
And you would use it like this:

```python
data = json.loads(json_str)
tool_call = ChatCompletionMessageToolCall.from_dict(data)
```
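As a runnable illustration of this round trip, here is the same pattern with a hypothetical ToolCallExample class whose id and name properties are made up for the sketch (this is not the real openai class):

```python
import json


class ToolCallExample:
    """Hypothetical stand-in for a class with a to_dict/from_dict pair."""

    def __init__(self, id=None, name=None):
        self.id = id
        self.name = name

    def to_dict(self):
        # Only plain JSON-friendly values go into the dict
        return {"id": self.id, "name": self.name}

    @classmethod
    def from_dict(cls, data):
        instance = cls()
        instance.id = data["id"]
        instance.name = data["name"]
        return instance


# object -> dict -> JSON string -> dict -> object
call = ToolCallExample(id="call_abc", name="multiply")
json_str = json.dumps(call.to_dict())
restored = ToolCallExample.from_dict(json.loads(json_str))
print(restored.id, restored.name)  # call_abc multiply
```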
Please note that this is a general approach; the actual implementation will depend on the specific properties and structure of the ChatCompletionMessageToolCall class.

Unfortunately, I wasn't able to find the specific structure and properties of the ChatCompletionMessageToolCall class in the LlamaIndex codebase. Could you provide more information about this class or point me to the part of the codebase where it's defined? This would help me give a more accurate answer.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
@gich2009 can you kindly provide code to replicate the issue? ChatCompletionMessageToolCall is an openai class and a Pydantic BaseModel; I just checked the source code, and one should be able to convert that class to a dict by using .dict() (Pydantic v1).
@nerdai I think the issue is that json.dumps() doesn't know about pydantic objects, and therefore doesn't know how to convert them to a dict/string?
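That is exactly the behaviour of the standard library: json.dumps only knows primitives, lists, and dicts, and raises TypeError on anything else unless given a `default` hook. A minimal sketch with a plain stand-in class (not the real Pydantic model, which isn't assumed to be installed here):

```python
import json


class ToolCallStandIn:
    """Stand-in for a non-JSON-serializable object such as
    ChatCompletionMessageToolCall (a Pydantic v1 model in openai)."""

    def __init__(self, id, name):
        self.id = id
        self.name = name

    def dict(self):  # Pydantic v1 models expose .dict()
        return {"id": self.id, "name": self.name}


payload = {"tool_calls": [ToolCallStandIn("call_123", "multiply")]}

# Plain json.dumps has no encoder for arbitrary objects
try:
    json.dumps(payload)
except TypeError as err:
    print(err)  # Object of type ToolCallStandIn is not JSON serializable

# A `default` hook that falls back to .dict() fixes the dump
json_str = json.dumps(payload, default=lambda o: o.dict())
print(json_str)  # {"tool_calls": [{"id": "call_123", "name": "multiply"}]}
```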
Hi @nerdai, converting with .to_dict() works as expected. The issue is that the .to_string() conversion does not work because of its reliance on json.dumps. @logan-markewich is likely onto something with json.dumps not knowing about pydantic objects.
The issue with this is that it is not consistent with other modules that have memory: the ReactAgent and the ChatEngines all allow you to convert their memory objects to a string with .to_string().
Hi @nerdai, if the class is on the openai side, then we can just leave it as is for now. It allows you to convert it to a dict, and from there you can pickle it: `memory_dict = OpenAIAgent().memory.to_dict()` and then `pickle.dumps(memory_dict)`.
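The pickle route works because pickle serializes arbitrary Python objects (enums, Pydantic models, and all), sidestepping the JSON limitation. A sketch using a hand-written stand-in for the dict that memory.to_dict() might return (the keys are illustrative, modeled on the outputs shown in this thread):

```python
import pickle

# Illustrative stand-in for OpenAIAgent().memory.to_dict() output;
# the real dict nests chat messages (and tool-call objects) under the store.
memory_dict = {
    "token_limit": 3000,
    "chat_store": {"store": {"user1": [{"role": "user", "content": "hi"}]}},
    "chat_store_key": "user1",
    "class_name": "ChatMemoryBuffer",
}

blob = pickle.dumps(memory_dict)  # bytes, safe to write to disk
restored = pickle.loads(blob)     # round-trips without any JSON encoder
print(restored == memory_dict)  # True
```

Note that pickled blobs should only be loaded from trusted sources, since pickle.loads can execute arbitrary code.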
Ah okay, thanks @gich2009 and @logan-markewich, this makes sense. I guess I missed the part about json.dumps in my initial read of "Steps to Reproduce", my bad 😅.
On that note, I was able to use the to_string method on an OpenAIAgent, however. In fact, memory.to_string() also eventually calls to_dict(): to_string() -> json() -> to_json() -> to_dict() -> dict().
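To make the failure point in that chain concrete, here is a toy mirror of the delegation (hypothetical class names; the real implementation lives in llama_index's BaseComponent and ChatMemoryBuffer): everything works until json.dumps meets a raw object inside the dict.

```python
import json


class MemoryStandIn:
    """Toy mirror of the chain to_string() -> json() -> to_json() -> to_dict()."""

    def __init__(self, data):
        self.data = data

    def to_dict(self):
        return dict(self.data)  # shallow: nested objects pass through untouched

    def to_json(self):
        return json.dumps(self.to_dict())  # raises on non-serializable values

    def json(self):
        return self.to_json()

    def to_string(self):
        return self.json()


class Opaque:
    """Stands in for ChatCompletionMessageToolCall."""


ok = MemoryStandIn({"content": "hello"})
print(ok.to_string())  # {"content": "hello"}

bad = MemoryStandIn({"tool_calls": [Opaque()]})
print(bad.to_dict())   # fine: to_dict never touches json.dumps
try:
    bad.to_string()
except TypeError as err:
    print(err)  # Object of type Opaque is not JSON serializable
```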
Here's a snippet of code (taken from our parallel function calling notebook):
```python
from llama_index.agent import OpenAIAgent
from llama_index.llms import OpenAI
from llama_index.tools import BaseTool, FunctionTool


def multiply(a: int, b: int) -> int:
    """Multiply two integers and returns the result integer"""
    return a * b


def add(a: int, b: int) -> int:
    """Add two integers and returns the result integer"""
    return a + b


# Define tools
add_tool = FunctionTool.from_defaults(fn=add)
multiply_tool = FunctionTool.from_defaults(fn=multiply)

# Define agent
llm = OpenAI(model="gpt-3.5-turbo-1106")
agent = OpenAIAgent.from_tools(
    [multiply_tool, add_tool], llm=llm, verbose=True
)

# Query agent
response = agent.chat("What is (121 * 3) + 42?")

# Invoke to_string()
agent.memory.to_string()
```
Produces:

```
'{"token_limit": 12288, "chat_store": {"store": {"chat_history": [{"role": "user", "content": "What is (121 * 3) + 42?", "additional_kwargs": {}}, {"role": "assistant", "content": null, "additional_kwargs": {"tool_calls": [{"id": "call_TV5lHpfLBu87BQBteXlSLTDm", "function": {"arguments": "{\\"a\\": 121, \\"b\\": 3}", "name": "multiply"}, "type": "function"}, {"id": "call_jP6lJUxWunpm2qlvcmo3WvkT", "function": {"arguments": "{\\"a\\": 363, \\"b\\": 42}", "name": "add"}, "type": "function"}]}}, {"role": "tool", "content": "363", "additional_kwargs": {"name": "multiply", "tool_call_id": "call_TV5lHpfLBu87BQBteXlSLTDm"}}, {"role": "tool", "content": "405", "additional_kwargs": {"name": "add", "tool_call_id": "call_jP6lJUxWunpm2qlvcmo3WvkT"}}, {"role": "assistant", "content": "The result of (121 * 3) is 363, and when we add 42 to it, we get 405.", "additional_kwargs": {}}]}, "class_name": "SimpleChatStore"}, "chat_store_key": "chat_history", "class_name": "ChatMemoryBuffer"}'
```
I should also confirm that within agent.memory there is a ChatCompletionMessageToolCall:

```
ChatMemoryBuffer(token_limit=12288, tokenizer_fn=functools.partial(<bound method Encoding.encode of <Encoding 'cl100k_base'>>, allowed_special='all'), chat_store=SimpleChatStore(store={'chat_history': [ChatMessage(role=<MessageRole.USER: 'user'>, content='What is (121 * 3) + 42?', additional_kwargs={}), ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content=None, additional_kwargs={'tool_calls': [ChatCompletionMessageToolCall(id='call_TV5lHpfLBu87BQBteXlSLTDm', function=Function(arguments='{"a": 121, "b": 3}', name='multiply'), type='function'), ChatCompletionMessageToolCall(id='call_jP6lJUxWunpm2qlvcmo3WvkT', function=Function(arguments='{"a": 363, "b": 42}', name='add'), type='function')]}), ChatMessage(role=<MessageRole.TOOL: 'tool'>, content='363', additional_kwargs={'name': 'multiply', 'tool_call_id': 'call_TV5lHpfLBu87BQBteXlSLTDm'}), ChatMessage(role=<MessageRole.TOOL: 'tool'>, content='405', additional_kwargs={'name': 'add', 'tool_call_id': 'call_jP6lJUxWunpm2qlvcmo3WvkT'}), ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content='The result of (121 * 3) is 363, and when we add 42 to it, we get 405.', additional_kwargs={})]}), chat_store_key='chat_history')
```
This is quite strange @nerdai, it seems to be working on your side. This is what I get on my end, even with 0.9.40:
```python
string_memory = agent.memory.to_string()
```

```
  File "/home/gich2009/Work/PA/venv-new/lib/python3.10/site-packages/llama_index/memory/chat_memory_buffer.py", line 77, in to_string
    return self.json()
  File "/home/gich2009/Work/PA/venv-new/lib/python3.10/site-packages/llama_index/schema.py", line 58, in json
    return self.to_json(**kwargs)
  File "/home/gich2009/Work/PA/venv-new/lib/python3.10/site-packages/llama_index/schema.py", line 103, in to_json
    return json.dumps(data)
  File "/usr/lib/python3.10/json/__init__.py", line 231, in dumps
    return _default_encoder.encode(obj)
  File "/usr/lib/python3.10/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/usr/lib/python3.10/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "/usr/lib/python3.10/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type ChatCompletionMessageToolCall is not JSON serializable
```
Let me look into it further
Though my tool list includes a QueryEngine tool as well. Could it be the response format of the QueryEngine tool that causes an issue?
@nerdai even with your script I get the same exception. Here is the log:

```
Traceback (most recent call last):
  File "/home/gich2009/Work/PA/futureplans/query_faiss.py", line 94, in
```
I've got a similar problem:
```python
from llama_index.llms import OpenAI, ChatMessage
from llama_index.tools import BaseTool, FunctionTool
from llama_index.agent import OpenAIAgent
from llama_index import set_global_handler
from traceloop.sdk import Traceloop
from dotenv import load_dotenv
from llama_index.storage.chat_store import SimpleChatStore
from llama_index.memory import ChatMemoryBuffer

load_dotenv()


def unrelated_query(query: str) -> None:
    """Logs that the query was unrelated to any of the tools and capabilities provided for the agent."""
    print(f"Unrelated query: {query}")


unrelated_query_tool = FunctionTool.from_defaults(fn=unrelated_query)


def multiply(a: int, b: int) -> int:
    """Multiply two integers and returns the result integer"""
    return a * b


multiply_tool = FunctionTool.from_defaults(fn=multiply)


def add(a: int, b: int) -> int:
    """Add two integers and returns the result integer"""
    return a + b


add_tool = FunctionTool.from_defaults(fn=add)

llm = OpenAI(model="gpt-4-turbo-preview")

prompt = """You will answer only to queries that can be responded with your tools.
Make sure to say is beyond your capacity to provide answers to anything unrelated and report any unrelated queries with the tool provided.
Do not answer to queries that are unrelated to your tools just say sorry I can't answer that.
You will be fined with 100$ dollars for every unrelated query you answer to."""

chat_store = SimpleChatStore()
chat_store.add_message(
    key="user1",
    message=ChatMessage(role="system", content=prompt),
    idx=1,
)

chat_memory = ChatMemoryBuffer.from_defaults(
    chat_store=chat_store,
    chat_store_key="user1",
)

agent = OpenAIAgent(
    tools=[multiply_tool, add_tool, unrelated_query_tool],
    llm=llm,
    verbose=True,
    memory=chat_memory,
    prefix_messages=[ChatMessage(role="system", content=prompt)],
)

response = agent.chat("Why do we exist?")
```
Outputs:

```
Added user message to memory: Why do we exist?
=== Calling Function ===
Calling function: unrelated_query with args: {"query":"Why do we exist?"}
Unrelated query: Why do we exist?
Got output: None
========================
```
And when you do:

```python
agent.memory.to_dict()
```

it outputs:
```python
{'token_limit': 3000,
 'chat_store': {'store': {'user1': [{'role': <MessageRole.SYSTEM: 'system'>,
     'content': "You will answer only to queries that can be responded with your tools. \nMake sure to say is beyond your capacity to provide answers to anything unrelated and report any unrelated queries with the tool provided.\nDo not answer to queries that are unrelated to your tools just say sorry I can't answer that.\nYou will be fined with 100$ dollars for every unrelated query you answer to.",
     'additional_kwargs': {}},
    {'role': <MessageRole.USER: 'user'>,
     'content': 'Why do we exist?',
     'additional_kwargs': {}},
    {'role': <MessageRole.ASSISTANT: 'assistant'>,
     'content': None,
     'additional_kwargs': {'tool_calls': [ChatCompletionMessageToolCall(id='call_rzA8911y2D2174WbgCbOHFgB', function=Function(arguments='{"query":"Why do we exist?"}', name='unrelated_query'), type='function')]}},
    {'role': <MessageRole.TOOL: 'tool'>,
     'content': 'None',
     'additional_kwargs': {'name': 'unrelated_query',
      'tool_call_id': 'call_rzA8911y2D2174WbgCbOHFgB'}},
    {'role': <MessageRole.ASSISTANT: 'assistant'>,
     'content': "Sorry, I can't answer that.",
     'additional_kwargs': {}}]},
  'class_name': 'SimpleChatStore'},
 'chat_store_key': 'user1',
 'class_name': 'ChatMemoryBuffer'}
```
But when you try to get the JSON serialization:

```python
agent.memory.json()
```

it throws:
```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[84], line 9
      1 # chat_store.add_message(message=response, key="user1")
      4 # chat_store.get_messages()
      6 # chat_store.delete_last_message("user1")
----> 9 agent.memory.json()

File ~/.pyenv/versions/3.11.7/envs/tvc_chatbot_api_llamaindex/lib/python3.11/site-packages/llama_index/schema.py:58, in BaseComponent.json(self, **kwargs)
     57 def json(self, **kwargs: Any) -> str:
---> 58     return self.to_json(**kwargs)

File ~/.pyenv/versions/3.11.7/envs/tvc_chatbot_api_llamaindex/lib/python3.11/site-packages/llama_index/schema.py:103, in BaseComponent.to_json(self, **kwargs)
    101 def to_json(self, **kwargs: Any) -> str:
    102     data = self.to_dict(**kwargs)
--> 103     return json.dumps(data)

File ~/.pyenv/versions/3.11.7/lib/python3.11/json/__init__.py:231, in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
    226 # cached encoder
    227 if (not skipkeys and ensure_ascii and
    228     check_circular and allow_nan and
    229     cls is None and indent is None and separators is None and
...
    179 """
--> 180 raise TypeError(f'Object of type {o.__class__.__name__} '
    181                 f'is not JSON serializable')

TypeError: Object of type ChatCompletionMessageToolCall is not JSON serializable
```
Bug Description
Any attempt to load or dump the memory state from an OpenAI agent fails because ChatCompletionMessageToolCall is not JSON serializable.
Version
0.9.39
Steps to Reproduce
Try converting a memory object belonging to an OpenAIAgent to a string: OpenAIAgent().memory.to_string(). Since this requires json.dumps under the hood, and ChatCompletionMessageToolCall is not JSON serializable, the behaviour breaks. You could try to circumvent this problem by using OpenAIAgent().memory.to_dict(), but at some stage in your program you may need json.dumps(), which will lead you to the same issue.
Relevant Logs/Tracebacks