Open VfBfoerst opened 1 month ago

Hey :)
I wanted to ask for an integration of LiteLLM, as we use it as a gateway to different AI providers. It would be very cool to trace the LiteLLM requests. Is there already a way to do so, or could it be implemented?

Hey @VfBfoerst, we are working on LiteLLM support and hoping to release it soon, probably in about a week from now. I will keep you posted. Are there any specific LiteLLM calls you are hoping to capture? We would love to learn more so it meets your expectations.

Awesome! I would like to add Langtrace as a kind of middleware in front of the LiteLLM proxy, so that all tokens and metrics are traced, ideally on an API-key basis to get statistics and metrics per key. The end goal is to detect bottlenecks, track usage, and find dysfunctional parts.

Thanks for this information, it is helpful. I will give you an update once the support is live.

@VfBfoerst - Just wanted to provide an update. We are still working through the support and should have an update for you next week. Thanks for your patience.