rohanbalkondekar opened this issue 2 days ago
The initial issue of traces not appearing in the Phoenix dashboard was caused by package version mismatches in the environment. Specifically, strict version constraints in the `requirements.txt` file led to incompatibilities between the tracing packages and other dependencies.

By removing the strict version constraints and letting pip resolve compatible versions, the package mismatches were fixed. This ensured that `arize-phoenix-otel`, `openinference-instrumentation-litellm`, and the other dependencies worked together smoothly:
```diff
- pandas~=2.2.0
- openai~=1.54.0
- pydantic~=2.7.0
- litellm~=1.52.0
- uvicorn~=0.30.0
- fastapi~=0.111.0
- gunicorn~=22.0.0
- qdrant-client~=1.9.0
+ pydantic
+ litellm
+ uvicorn
+ fastapi
+ gunicorn
+ qdrant-client
+ python-dotenv
+ arize-phoenix
+ arize-phoenix-otel
+ openinference-instrumentation-litellm
```
After updating the packages, a serialization error occurred due to the inclusion of `Message` objects (which are not JSON serializable by default) in the `messages` list processed by the tracing instrumentation.
The root cause was the tracing instrumentation from Phoenix and OpenInference in our FastAPI application: specifically, the `openinference.instrumentation.litellm` module and its `LiteLLMInstrumentor` failed when processing `Message` objects within the `messages` list sent to the LLM.
When the tracing instrumentation is active, it wraps litellm's `completion` function and attempts to serialize the `messages` list for tracing purposes. However, the `Message` objects in that list are not JSON serializable by default, which leads to serialization errors and prevents traces from being exported to the Phoenix dashboard.
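To illustrate, here is a toy reproduction (assuming `Message` behaves like a Pydantic model, which litellm's does):

```python
import json
from pydantic import BaseModel

# Stand-in for litellm's Message type (a Pydantic model)
class Message(BaseModel):
    role: str
    content: str

history = [
    {"role": "user", "content": "Hello"},
    Message(role="assistant", content="Hi there!"),  # e.g. appended from a prior response
]

json.dumps(history)
# TypeError: Object of type Message is not JSON serializable
```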
Fix: Converted `Message` objects to dictionaries using the `.model_dump()` method provided by Pydantic (which `Message` objects inherit from). This ensured all messages were JSON serializable and compatible with the tracing mechanism.
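A minimal sketch of the conversion (the helper name is illustrative; since `Message` inherits from Pydantic's `BaseModel`, the `isinstance` check catches it):

```python
from pydantic import BaseModel

def to_jsonable(messages: list) -> list:
    """Dump any Pydantic models (e.g. Message) to plain dicts so the
    tracing instrumentation can JSON-serialize the messages list."""
    return [m.model_dump() if isinstance(m, BaseModel) else m for m in messages]

# Before calling the LLM:
# response = litellm.completion(model="gpt-4o-mini", messages=to_jsonable(history))
```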
When using tracing instrumentation that processes data structures like `messages`, ensure all included objects are JSON serializable. For custom objects like `Message`, use methods like `.model_dump()` to convert them into dictionaries.
Reopened the issue because the package feels unstable. If there is some version mismatch or similar, it simply stops working, with no errors whatsoever. I need to delete the virtual environment and recreate a new one with all the latest packages, and then it starts working again.

Improved error logs would be helpful.
I'm using Phoenix to trace LLM calls in a FastAPI application that uses LiteLLM. When running the application, LLM calls work correctly and responses are returned as expected. However, traces are not appearing in the Phoenix dashboard when using the hosted Phoenix instance via `arize-phoenix-otel`.

When I run similar code in a Jupyter notebook or a synchronous script against the local Phoenix instance (`arize-phoenix`), tracing works correctly and I can see the traces in the dashboard. This leads me to believe the issue might be related to context propagation in the asynchronous FastAPI application, or to the hosted Phoenix tracing itself.

To Reproduce
Steps to reproduce the behavior:
Create a Minimal FastAPI Application
Directory Structure:

```
main.py
utils.py
test.py
requirements.txt
```
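The application, in simplified form (a sketch: the endpoint path, model, and project name are illustrative, and `call_llm` actually lives in `utils.py`):

```python
# main.py -- simplified; in the real layout call_llm lives in utils.py
from phoenix.otel import register
from openinference.instrumentation.litellm import LiteLLMInstrumentor

# Register the hosted Phoenix tracer before litellm is imported or used.
# PHOENIX_CLIENT_HEADERS / PHOENIX_COLLECTOR_ENDPOINT are set in the environment.
tracer_provider = register(project_name="my-llm-app", set_global_tracer_provider=True)
LiteLLMInstrumentor().instrument(tracer_provider=tracer_provider)

import litellm
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    message: str

async def call_llm(messages: list) -> str:
    response = await litellm.acompletion(model="gpt-4o-mini", messages=messages)
    return response.choices[0].message.content

@app.post("/chat")
async def chat(req: ChatRequest):
    return {"response": await call_llm([{"role": "user", "content": req.message}])}
```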
Install Dependencies
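In a fresh virtual environment:

```bash
pip install -r requirements.txt
```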
Run the FastAPI Application
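For example (host and port illustrative):

```bash
uvicorn main:app --host 0.0.0.0 --port 8000
```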
Send a Test Request
In another terminal, run:
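```bash
python test.py  # the test script from the directory listing above
```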
This should print the LLM response (e.g., "Hello! How can I assist you today?") to the console, indicating that the LLM call was successful.
Check the Phoenix Dashboard
Open the Phoenix dashboard and look for traces under the configured project (e.g., `my-llm-app`).

Expected Behavior

Traces for each LLM call should appear in the Phoenix dashboard under that project.
Actual Behavior

LLM calls succeed and responses are returned, but no traces appear in the Phoenix dashboard when using the hosted instance via `arize-phoenix-otel`.

Environment
OS: Ubuntu 22.04.4 LTS
Python Version: 3.10.12
Packages and Versions: All latest packages
The old approach (now deemed legacy) with `arize-phoenix~=4.35.0` worked just fine.
Additional Context
Working Scenario: When using local tracing with `arize-phoenix` and `OpenAIInstrumentor`, traces appear in the local Phoenix dashboard. A sketch of that working setup follows.
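Roughly (a sketch; model and project name illustrative):

```python
# Local tracing variant that works
import phoenix as px
from phoenix.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

px.launch_app()  # local Phoenix instance at http://localhost:6006
tracer_provider = register(project_name="my-llm-app")
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

from openai import OpenAI

client = OpenAI()
client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
# The trace for this call shows up in the local dashboard.
```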
Attempted Solutions:

- Set Global Tracer Provider: used `set_global_tracer_provider=True` in the `register()` function.
- Instrument Order: ensured that instrumentation occurs before importing and using `litellm`.
- Debug Logging: enabled debug logging for OpenTelemetry.
- Network Connectivity: confirmed that the application can reach https://app.phoenix.arize.com/v1/traces (no firewall issues).
- API Key Verification: ensured the Phoenix API key is correct and properly set.
- FastAPI Instrumentation: attempted to instrument FastAPI for context propagation (see the sketch after this list).
- Manual Spans: tried adding manual spans in the `call_llm` function.
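The FastAPI instrumentation attempt looked roughly like this (a sketch, assuming the `opentelemetry-instrumentation-fastapi` package):

```python
from fastapi import FastAPI
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
from phoenix.otel import register

app = FastAPI()
tracer_provider = register(project_name="my-llm-app")

# Wrap the app so incoming requests create server spans and propagate context
FastAPIInstrumentor.instrument_app(app, tracer_provider=tracer_provider)
```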
Observations:

- Tracing works with the local Phoenix instance (`arize-phoenix`), but no traces are exported when using the hosted instance via `arize-phoenix-otel`.
Questions
- Is there additional configuration required when using `arize-phoenix-otel` with a FastAPI application that uses asynchronous code?
- Could this issue be related to context propagation in asynchronous applications, and if so, how can it be addressed?
- Are there any known issues or additional steps needed to get Phoenix tracing working in this setup with FastAPI and LiteLLM?
- Are there compatibility issues with the versions of OpenTelemetry, Phoenix, or LiteLLM I am using?
Logs
Application Startup Logs:
Request Handling Logs:
No additional OpenTelemetry logs are generated during request handling, even with debug logging enabled.
Conclusion
I'm looking for guidance on how to resolve this issue. Any assistance or suggestions would be greatly appreciated. If there are any examples or documentation on integrating `arize-phoenix-otel` with FastAPI and LiteLLM, that would be very helpful.

Thank You
Thank you for your time and support.