microsoft / semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps
https://aka.ms/semantic-kernel

Python: Function calling breaks ChatHistory.serialize #7903

Open mlindemu opened 1 month ago

mlindemu commented 1 month ago

Describe the bug
I'm using the new FunctionChoiceBehavior.Auto() class for my execution_settings with GPT-4o, and it works great! However, calling the serialize() method on a ChatHistory instance raises an exception: semantic_kernel.exceptions.content_exceptions.ContentSerializationError: Unable to serialize ChatHistory to JSON: Error serializing to JSON: TypeError: 'MockValSer' object cannot be converted to 'SchemaSerializer'

Stepping through each message in the ChatHistory, it looks like any message related to a function call cannot be serialized. Example message extracted from the debugger:

{'message': StreamingChatMessageContent(choice_index=0, inner_content=[ChatCompletionChunk(id='chatcmpl-9tGtMB4gmtRdR9W9B4xc4rp8Xiexq', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, role='assistant', tool_calls=[ChoiceDeltaToolCall(index=0, id='call_nVgttAq1pU4Tz3zRFEf5kaln', function=ChoiceDeltaToolCallFunction(arguments='', name='hr_policy_plugin-search_hr_policies'), type='function')]), finish_reason=None, index=0, logprobs=None, content_filter_results={})], created=1722960000, model='gpt-4o-2024-05-13', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_abc28019ad', usage=None), ChatCompletionChunk(id='chatcmpl-9tGtMB4gmtRdR9W9B4xc4rp8Xiexq', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, role=None, tool_calls=[ChoiceDeltaToolCall(index=0, id=None, function=ChoiceDeltaToolCallFunction(arguments='{"', name=None), type=None)]), finish_reason=None, index=0, logprobs=None, content_filter_results={})], created=1722960000, model='gpt-4o-2024-05-13', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_abc28019ad', usage=None), ChatCompletionChunk(id='chatcmpl-9tGtMB4gmtRdR9W9B4xc4rp8Xiexq', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, role=None, tool_calls=[ChoiceDeltaToolCall(index=0, id=None, function=ChoiceDeltaToolCallFunction(arguments='query', name=None), type=None)]), finish_reason=None, index=0, logprobs=None, content_filter_results={})], created=1722960000, model='gpt-4o-2024-05-13', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_abc28019ad', usage=None), ChatCompletionChunk(id='chatcmpl-9tGtMB4gmtRdR9W9B4xc4rp8Xiexq', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, role=None, tool_calls=[ChoiceDeltaToolCall(index=0, id=None, function=ChoiceDeltaToolCallFunction(arguments='":"', name=None), type=None)]), finish_reason=None, index=0, logprobs=None, content_filter_results={})], created=1722960000, model='gpt-4o-2024-05-13', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_abc28019ad', usage=None), ChatCompletionChunk(id='chatcmpl-9tGtMB4gmtRdR9W9B4xc4rp8Xiexq', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, role=None, tool_calls=[ChoiceDeltaToolCall(index=0, id=None, function=ChoiceDeltaToolCallFunction(arguments='short', name=None), type=None)]), finish_reason=None, index=0, logprobs=None, content_filter_results={})], created=1722960000, model='gpt-4o-2024-05-13', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_abc28019ad', usage=None), ChatCompletionChunk(id='chatcmpl-9tGtMB4gmtRdR9W9B4xc4rp8Xiexq', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, role=None, tool_calls=[ChoiceDeltaToolCall(index=0, id=None, function=ChoiceDeltaToolCallFunction(arguments='s', name=None), type=None)]), finish_reason=None, index=0, logprobs=None, content_filter_results={})], created=1722960000, model='gpt-4o-2024-05-13', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_abc28019ad', usage=None), ChatCompletionChunk(id='chatcmpl-9tGtMB4gmtRdR9W9B4xc4rp8Xiexq', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, role=None, tool_calls=[ChoiceDeltaToolCall(index=0, id=None, function=ChoiceDeltaToolCallFunction(arguments=' dress', name=None), type=None)]), finish_reason=None, index=0, logprobs=None, content_filter_results={})], created=1722960000, model='gpt-4o-2024-05-13', object='chat.completion.chunk', service_tier=None, 
system_fingerprint='fp_abc28019ad', usage=None), ChatCompletionChunk(id='chatcmpl-9tGtMB4gmtRdR9W9B4xc4rp8Xiexq', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, role=None, tool_calls=[ChoiceDeltaToolCall(index=0, id=None, function=ChoiceDeltaToolCallFunction(arguments=' code', name=None), type=None)]), finish_reason=None, index=0, logprobs=None, content_filter_results={})], created=1722960000, model='gpt-4o-2024-05-13', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_abc28019ad', usage=None), ChatCompletionChunk(id='chatcmpl-9tGtMB4gmtRdR9W9B4xc4rp8Xiexq', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, role=None, tool_calls=[ChoiceDeltaToolCall(index=0, id=None, function=ChoiceDeltaToolCallFunction(arguments='"}', name=None), type=None)]), finish_reason=None, index=0, logprobs=None, content_filter_results={})], created=1722960000, model='gpt-4o-2024-05-13', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_abc28019ad', usage=None), ChatCompletionChunk(id='chatcmpl-9tGtMB4gmtRdR9W9B4xc4rp8Xiexq', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, role=None, tool_calls=None), finish_reason='tool_calls', index=0, logprobs=None, content_filter_results={})], created=1722960000, model='gpt-4o-2024-05-13', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_abc28019ad', usage=None)], ai_model_id='gpt-4o', metadata={'logprobs': None, 'id': 'chatcmpl-9tGtMB4gmtRdR9W9B4xc4rp8Xiexq', 'created': 1722960000, 'system_fingerprint': 'fp_abc28019ad'}, content_type='message', role=<AuthorRole.ASSISTANT: 'assistant'>, name=None, items=[FunctionCallContent(inner_content=None, ai_model_id=None, metadata={}, content_type=<ContentTypes.FUNCTION_CALL_CONTENT: 'function_call'>, id='call_nVgttAq1pU4Tz3zRFEf5kaln', index=0, name='hr_policy_plugin-search_hr_policies', function_name='search_hr_policies', plugin_name='hr_policy_plugin', arguments='{"query":"shorts dress code"}')], encoding=None, finish_reason=<FinishReason.TOOL_CALLS: 'tool_calls'>),

'exception': PydanticSerializationError(Error serializing to JSON: TypeError: 'MockValSer' object cannot be converted to 'SchemaSerializer')}
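For reference, a minimal sketch of that per-message check, assuming the history's messages are Pydantic models exposing model_dump_json() and that chat_history is the history produced by the steps below:

```python
# Hedged diagnostic sketch: walk the history and report which messages fail
# to serialize on their own, mirroring the debugger output shown above.
from pydantic_core import PydanticSerializationError

for i, msg in enumerate(chat_history.messages):
    try:
        msg.model_dump_json()
    except PydanticSerializationError as exc:
        print(f"message {i} (role={msg.role}): {exc}")
```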

To Reproduce
Steps to reproduce the behavior (a minimal sketch follows the list):

  1. Initialize a kernel that can automatically select a function if necessary
  2. Initialize an instance of ChatHistory
  3. Pass the ChatHistory to the kernel's chat_completion.get_streaming_chat_message_contents() method
  4. Serialize the ChatHistory instance by calling instance.serialize()
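A minimal repro sketch of those steps, assuming the OpenAI connector: the import paths, the OpenAIChatPromptExecutionSettings class, and the commented-out hr_policy_plugin registration follow the semantic-kernel Python package conventions rather than this report, and credentials are expected in the environment.

```python
# Minimal repro sketch. Assumptions: OPENAI_API_KEY is set in the environment,
# and the hr_policy_plugin registration is a hypothetical placeholder.
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.open_ai import (
    OpenAIChatCompletion,
    OpenAIChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory


async def main() -> None:
    # 1. Kernel with a chat service and at least one plugin it may auto-select.
    kernel = Kernel()
    chat_service = OpenAIChatCompletion(ai_model_id="gpt-4o")
    kernel.add_service(chat_service)
    # kernel.add_plugin(HrPolicyPlugin(), plugin_name="hr_policy_plugin")  # hypothetical plugin

    settings = OpenAIChatPromptExecutionSettings(
        function_choice_behavior=FunctionChoiceBehavior.Auto()
    )

    # 2. Chat history with a user turn that should trigger a function call.
    chat_history = ChatHistory()
    chat_history.add_user_message("Are shorts allowed under the dress code?")

    # 3. Stream the response; with auto function calling, function-call-related
    #    messages end up in the chat history.
    async for _ in chat_service.get_streaming_chat_message_contents(
        chat_history=chat_history, settings=settings, kernel=kernel
    ):
        pass

    # 4. Serializing the history now raises ContentSerializationError.
    print(chat_history.serialize())


asyncio.run(main())
```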

Expected behavior
The serialize() method should return standard JSON.

Screenshots
N/A

Platform

Additional context
Possibly related to Issue #7340

tnaber commented 1 week ago

Also experiencing this; still using the workaround: [msg.to_dict() for msg in chat_history.messages]
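For context, a sketch of how that workaround can produce a JSON string in place of ChatHistory.serialize(); the json.dumps call and the default=str fallback are assumptions added here, not part of the original comment:

```python
# Hedged workaround sketch: convert each message to a plain dict via to_dict()
# and dump the resulting list with the standard json module.
import json

messages_as_dicts = [msg.to_dict() for msg in chat_history.messages]
history_json = json.dumps(messages_as_dicts, indent=2, default=str)
print(history_json)
```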