🪢 Langfuse Python SDK - Instrument your LLM app with decorators or low-level SDK and get detailed tracing/observability. Works with any LLM or framework
When I use Langfuse with LlamaIndex, I get this error output:

```
received error response: {'message': 'Invalid request data', 'errors': ['Expected object, received string']}
Received 400 error by Langfuse server, not retrying: {'message': 'Invalid request data', 'errors': ['Expected object, received string']}
```
I then debugged the task_manager file and found this line: `data = json.dumps(kwargs, cls=EventSerializer)`. In my case, the output object in the body is of type `<class 'llama_index.core.base.llms.types.CompletionResponse'>`.
When this branch of `default()` in `EventSerializer` executes, a `TypeError` is thrown:

```python
if isinstance(obj, BaseModel):
    return obj.dict()
```

```
TypeError: 'MockValSer' object cannot be converted to 'SchemaSerializer'
```

(pydantic 2.8.2)
I just realized that this issue is caused by the type coming from LlamaIndex. Would there be any risks if I handled all `CompletionResponse` objects here?
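One defensive option is to make the encoder's fallback try several serialization strategies and never raise, so a single unserializable object doesn't drop the whole event. The sketch below is an assumption, not Langfuse's actual `EventSerializer`: `SafeEventSerializer` and `FakeCompletionResponse` are hypothetical stand-ins (the fake class just reproduces the `.dict()` TypeError seen above without needing llama_index installed).

```python
import json
from dataclasses import is_dataclass, asdict


class SafeEventSerializer(json.JSONEncoder):
    """Hypothetical fallback encoder; NOT Langfuse's real EventSerializer."""

    def default(self, obj):
        # Pydantic v2 models expose model_dump(); v1 models expose dict().
        for attr in ("model_dump", "dict"):
            fn = getattr(obj, attr, None)
            if callable(fn):
                try:
                    return fn()
                except TypeError:
                    # e.g. the 'MockValSer' error seen with pydantic 2.8.2
                    pass
        if is_dataclass(obj):
            return asdict(obj)
        # Last resort: stringify instead of raising, so the event still ships.
        return str(obj)


class FakeCompletionResponse:
    """Stand-in for llama_index's CompletionResponse (assumption)."""

    def __init__(self, text):
        self.text = text

    def dict(self):
        # Reproduces the failure mode from the report.
        raise TypeError(
            "'MockValSer' object cannot be converted to 'SchemaSerializer'"
        )


payload = {"output": FakeCompletionResponse("hello")}
print(json.dumps(payload, cls=SafeEventSerializer))
```

The trade-off of a catch-all fallback like this is that the server receives a plain string (e.g. `repr`) instead of a structured object for types it can't decompose, so traces stay intact but may lose field-level detail.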