JuHyung-Son opened this issue 1 week ago
To serialize a chat response containing `Choice` and `Usage` objects to JSON without encountering a `TypeError`, you need to implement a custom JSON encoder. Here is an example of how you can do this:
```python
import json
from typing import Any

# `Choice` and `Usage` refer to the objects attached to the chat response;
# import them from wherever they are defined in your setup.


class CustomJSONEncoder(json.JSONEncoder):
    def default(self, obj: Any) -> Any:
        if isinstance(obj, Choice):
            return obj.__dict__
        if isinstance(obj, Usage):
            return obj.__dict__
        return super().default(obj)


# Assuming `response` is the chat response object containing `Choice` and `Usage` objects.
# Note: json.dumps() also calls default() for any other non-serializable object
# it meets, including the response itself if it isn't a plain dict.
response_json = json.dumps(response, cls=CustomJSONEncoder)
print(response_json)
```
In this example, `CustomJSONEncoder` converts `Choice` and `Usage` objects to dictionaries using their `__dict__` attribute, making them JSON serializable [1].
To continue talking to Dosu, mention @dosu.
There's a similar fix applied for the `ChatMessage` class that we could borrow to fix this.
(But also, an easy workaround is just taking the fields you want; see the sketch below.)
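For illustration, here is a minimal sketch of that field-picking workaround, assuming `response` is a LlamaIndex `ChatResponse` from `llm.chat(...)` and that its `raw` field holds the OpenAI completion object (the exact shape of `raw` varies between versions, so adjust the attribute access accordingly):

```python
import json

# Hypothetical workaround: build a plain dict from only the fields you want
# to log, instead of serializing the whole ChatResponse.
log_record = {
    "role": response.message.role.value,
    "content": response.message.content,
    # response.raw is assumed to be the OpenAI completion object here;
    # model_dump() turns its nested pydantic models into plain dicts.
    "usage": response.raw.usage.model_dump()
    if getattr(response.raw, "usage", None)
    else None,
}
print(json.dumps(log_record))
```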
First, I don't recommend calling `response.json()` directly. The response is LlamaIndex's `ChatResponse` object, which isn't the best choice if you want to log something like an interface log.
Secondly, if it's really necessary, you can implement it yourself with a custom encoder, like this:
```python
import json
from typing import Any

from llama_index.core.base.llms.types import ChatMessage
from llama_index.llms.openai import OpenAI
from openai.types.chat.chat_completion import Choice
from openai.types.completion_usage import CompletionUsage


def openai_json_encoder(obj: Any) -> Any:
    # Dump the OpenAI pydantic objects to plain dicts; fall back to the
    # default encoder (which raises TypeError) for anything else.
    if isinstance(obj, (Choice, CompletionUsage)):
        return obj.model_dump()
    return json.JSONEncoder().default(obj)


llm = OpenAI()
response = llm.chat(
    messages=[
        ChatMessage(role="system", content="You are a helpful assistant."),
        ChatMessage(role="user", content="Hi, how are you?"),
    ]
)

print(response.json(encoder=openai_json_encoder))
```
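For context: assuming `ChatResponse` here is still a pydantic v1 model, the `encoder` argument of `.json()` is forwarded to `json.dumps` as its `default` hook, so `openai_json_encoder` only runs for objects the standard encoder can't handle (such as `Choice` and `CompletionUsage`).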
Lastly, if you have any other questions, we can continue the discussion.
@JuHyung-Son @logan-markewich
Of course, we can extend `ChatResponse`, implement an `OpenAIChatResponse`, and then use a pydantic config like this:
```python
class OpenAIChatResponse(ChatResponse):
    class Config:
        json_encoders = {
            Choice: lambda x: x.dict(),
            CompletionUsage: lambda x: x.dict(),
        }
```
However, I don't think this is necessary.
Okay, I understand. Thanks.
But if the `json` method doesn't actually work and isn't recommended, do we even need a `json` method?
The `json` method is pydantic `BaseModel`'s method. My first reaction was to add a global converter, like Spring Framework does in Java.
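To make that concrete, here is a minimal, self-contained sketch (pydantic v1 style, unrelated to the LlamaIndex classes) showing that `.json()` is simply inherited from `BaseModel`:

```python
from pydantic import BaseModel


class Point(BaseModel):
    x: int
    y: int


# .json() is defined on BaseModel, so every subclass (ChatResponse included)
# gets it for free; it only breaks when a field holds an object the default
# JSON encoder can't serialize, which is exactly the Choice/Usage situation.
print(Point(x=1, y=2).json())  # {"x": 1, "y": 2}
```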
Bug Description
The chat response has `Choice` and `Usage` objects; these should be JSON serializable, but serializing currently fails with a TypeError.

Version
any version

Steps to Reproduce

Relevant Logs/Tracebacks
No response