trallnag / prometheus-fastapi-instrumentator

Instrument your FastAPI with Prometheus metrics.
ISC License

feat: Instrument latency without streaming duration #290

Closed · dosuken123 closed this 7 months ago

dosuken123 commented 7 months ago

What does this do?

This PR adds an option to track HTTP request duration without the streaming duration, i.e. the time until the response starts rather than the time until the last chunk is sent.

Config Example:

    instrumentator.add(
        metrics.latency(
            should_include_handler=True,
            should_include_method=True,
            should_include_status=True,
            buckets=(0.5, 1, 2.5, 5, 10, 30, 60),
        ),
        metrics.latency(
            metric_name="http_request_duration_without_streaming_seconds",
            should_include_handler=True,
            should_include_method=True,
            should_include_status=True,
            buckets=(0.5, 1, 2.5, 5, 10, 30, 60),
            should_exclude_streaming_duration=True,               # <= New option
        )
    )

https://gitlab.com/gitlab-org/modelops/applied-ml/code-suggestions/ai-assist/-/blob/main/ai_gateway/app.py?ref_type=heads#L51-58
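
For reference, here is a minimal, self-contained sketch of how this option could be wired into a FastAPI app with a streaming endpoint. The handler path and metric names are illustrative (mirroring the config example above); `should_exclude_streaming_duration` is the option introduced by this PR:

    from fastapi import FastAPI
    from fastapi.responses import StreamingResponse
    from prometheus_fastapi_instrumentator import Instrumentator, metrics

    app = FastAPI()


    async def generate():
        # Time spent yielding chunks counts towards the default latency metric,
        # but not towards the histogram created with
        # should_exclude_streaming_duration=True.
        for token in ("hello", " ", "world"):
            yield token


    @app.post("/v2/code/generations")
    async def generations():
        return StreamingResponse(generate(), media_type="text/plain")


    instrumentator = Instrumentator()
    instrumentator.add(
        metrics.latency(buckets=(0.5, 1, 2.5, 5, 10, 30, 60)),
        metrics.latency(
            metric_name="http_request_duration_without_streaming_seconds",
            buckets=(0.5, 1, 2.5, 5, 10, 30, 60),
            should_exclude_streaming_duration=True,
        ),
    )
    instrumentator.instrument(app).expose(app)

With this setup, time spent streaming chunks inflates the default histogram, while the `http_request_duration_without_streaming_seconds` histogram only reflects the time until the response starts.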

Output example:

    # HELP http_request_response_start_duration_seconds Duration of HTTP requests in seconds
    # TYPE http_request_response_start_duration_seconds histogram
    http_request_response_start_duration_seconds_bucket{handler="/v2/code/generations",le="0.5",method="POST",status="2xx"} 0.0
    http_request_response_start_duration_seconds_bucket{handler="/v2/code/generations",le="1.0",method="POST",status="2xx"} 1.0
    http_request_response_start_duration_seconds_bucket{handler="/v2/code/generations",le="2.5",method="POST",status="2xx"} 1.0
    http_request_response_start_duration_seconds_bucket{handler="/v2/code/generations",le="5.0",method="POST",status="2xx"} 1.0
    http_request_response_start_duration_seconds_bucket{handler="/v2/code/generations",le="10.0",method="POST",status="2xx"} 1.0
    http_request_response_start_duration_seconds_bucket{handler="/v2/code/generations",le="30.0",method="POST",status="2xx"} 1.0
    http_request_response_start_duration_seconds_bucket{handler="/v2/code/generations",le="60.0",method="POST",status="2xx"} 1.0
    http_request_response_start_duration_seconds_bucket{handler="/v2/code/generations",le="+Inf",method="POST",status="2xx"} 1.0
    http_request_response_start_duration_seconds_count{handler="/v2/code/generations",method="POST",status="2xx"} 1.0
    http_request_response_start_duration_seconds_sum{handler="/v2/code/generations",method="POST",status="2xx"} 0.6706487989995367
    # HELP http_request_response_start_duration_seconds_created Duration of HTTP requests in seconds
    # TYPE http_request_response_start_duration_seconds_created gauge
    http_request_response_start_duration_seconds_created{handler="/v2/code/generations",method="POST",status="2xx"} 1.7095186511967359e+09
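
A quick way to check that both series are exported (a sketch, assuming the `app` from the snippet above and the default `/metrics` path used by `expose()`; note it uses the metric names from the config example rather than the `http_request_response_start_duration_seconds` name shown in the output above):

    from fastapi.testclient import TestClient

    client = TestClient(app)  # `app` as defined in the sketch above
    client.post("/v2/code/generations")  # trigger the instrumented handler

    metrics_text = client.get("/metrics").text
    assert "http_request_duration_seconds_bucket" in metrics_text
    assert "http_request_duration_without_streaming_seconds_bucket" in metrics_text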

Fixes https://github.com/trallnag/prometheus-fastapi-instrumentator/issues/291

Why do we need it?

LLM inference APIs usually stream their HTTP responses to improve the UX, so users perceive latency as the time to the arrival of the first chunk rather than the last chunk. We want to instrument that duration.

Who is this for?

GitLab, software developers, and teams optimizing LLM applications.

Linked issues

Related to https://gitlab.com/gitlab-com/runbooks/-/merge_requests/6928#note_1796949998

codecov[bot] commented 7 months ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 95.98%. Comparing base (c608c4e) to head (3ae762e).

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##           master     #290      +/-   ##
==========================================
+ Coverage   95.79%   95.98%   +0.19%
==========================================
  Files           5        5
  Lines         357      374      +17
==========================================
+ Hits          342      359      +17
  Misses         15       15
```

trallnag commented 7 months ago

Hi @dosuken123, thanks for the proposal and implementation. It will be included in the next version, which will be released sometime this week.

dosuken123 commented 7 months ago

@trallnag Thanks for the help! Much appreciated :bow: