Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
[Bug]: Langsmith async integration doesn't register traces #6862
Open
tomukmatthews opened 10 hours ago
What happened?
I ran this, following the docs, but it didn't register any trace. Using langsmith's `@traceable` decorator does pick up the traces, but I currently can't send metadata in dynamically with litellm.

Relevant log output
No response
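Since neither the original snippet nor any log output was captured, here is a toy sketch (not litellm's actual implementation, and `ToyClient` is a hypothetical class) of the failure mode the title describes: success callbacks that fire on the sync path but are silently skipped on the async path, so the tracing integration never sees the call.

```python
import asyncio

# Toy illustration only -- not litellm's real code. A client fires its
# registered success callbacks after each completion; if the async path
# forgets to invoke them, traces are silently dropped, which matches
# the symptom in this issue.
class ToyClient:
    def __init__(self):
        self.success_callbacks = []  # e.g. ["langsmith"] in litellm
        self.traces = []             # stand-in for the tracing backend

    def _log_success(self, response):
        for cb in self.success_callbacks:
            self.traces.append((cb, response))

    def completion(self, prompt):
        response = f"echo: {prompt}"
        self._log_success(response)       # sync path logs the trace
        return response

    async def acompletion_buggy(self, prompt):
        return f"echo: {prompt}"          # bug: never calls _log_success

    async def acompletion_fixed(self, prompt):
        response = f"echo: {prompt}"
        self._log_success(response)       # fix: async path logs too
        return response

client = ToyClient()
client.success_callbacks = ["langsmith"]
client.completion("hi")                       # registers a trace
asyncio.run(client.acompletion_buggy("hi"))   # registers nothing
asyncio.run(client.acompletion_fixed("hi"))   # registers a trace
print(len(client.traces))  # → 2
```

The fix, in this toy form, is simply making the async code path call the same callback hook as the sync one; whether litellm's langsmith logger has an equivalent gap is exactly what this issue asks.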
Twitter / LinkedIn details
No response