microsoft / promptflow

Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.
https://microsoft.github.io/promptflow/

[Feature Request] Tracing tool for inference #1015

Open ArtyomZemlyak opened 10 months ago

ArtyomZemlyak commented 10 months ago

Is your feature request related to a problem? Please describe.
We now have good tracing capabilities in the VS Code Extension, but they are not recorded in the output service (as I understand it, though I could be wrong).

Describe the solution you'd like
It would be very useful if we could get traces from an inference endpoint and inspect them with a tracing tool (like LangSmith). Alternatively, we could download the logs from the endpoint via some REST function and inspect them in the VS Code Extension - this could also be very helpful (ideally as a back-end integration, without having to download anything manually).
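To illustrate what I mean, here is a minimal sketch of the "download logs via REST" idea. The /v1/traces path, the run_id parameter, and the response shape are all hypothetical - this is not an existing promptflow API, just the kind of workflow I have in mind:

```python
# Illustrative sketch only: the /v1/traces route and its response format are
# assumptions, not an existing promptflow endpoint.
import requests

ENDPOINT = "http://localhost:8080"  # deployed flow endpoint (assumed)


def fetch_traces(run_id: str) -> list[dict]:
    """Download trace spans for a given run from the (hypothetical) inference service."""
    resp = requests.get(
        f"{ENDPOINT}/v1/traces", params={"run_id": run_id}, timeout=30
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    # Print a quick summary; each span is assumed to carry a name and duration.
    for span in fetch_traces("example-run-id"):
        print(span.get("name"), span.get("duration_ms"), "ms")
```

The downloaded spans could then be loaded into a local tracing/visualization tool (or the VS Code Extension) for inspection.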

jiaochenlu commented 9 months ago

Hi @ArtyomZemlyak, we already have the trace graph in the VS Code extension for single runs, as well as visualization for batch runs - not sure if this is the "tracing capability" you mentioned. 😄 (screenshots: single-run trace graph and batch-run visualization)

From your description of the solution, it seems that you have deployed a flow as an endpoint and are now looking for a feature that lets you retrieve or download traces and logs from inference instances, so that you can visualize and analyze them in your preferred system. That said, direct visualization of inference traces within promptflow would be the ideal solution.

This can indeed be seen as a form of 'post-deploy monitoring'. Do I understand correctly?

ArtyomZemlyak commented 9 months ago

@jiaochenlu Thanks for the reply! Yes, the visualizations in the VS Code Extension are very good and helpful!

And yes, for inference services built with promptflow this could be a form of 'post-deploy monitoring'. I saw an issue that could be one possible (or partial) solution: https://github.com/microsoft/promptflow/issues/1149