abetlen / llama-cpp-python

Python bindings for llama.cpp
https://llama-cpp-python.readthedocs.io
MIT License

Allow any format for X-Request-Id #1337

Open ging-dev opened 6 months ago

ging-dev commented 6 months ago


Expected Behavior

Normal response (200)

Current Behavior

400 Bad Request

Environment and Context

Hugging Face Spaces (Docker)

Failure Information (for bugs)

This is caused by the request-id feature introduced in https://github.com/abetlen/llama-cpp-python/pull/703. Hugging Face Spaces already adds an x-request-id header to incoming requests, but its value is not the valid UUID that starlette-context expects (see https://github.com/tomwojcik/starlette-context/blob/992ab9401a9f557994379053fbd98471c601eda9/tests/test_plugins/test_request_id.py#L37), so the server rejects the request with 400 Bad Request.
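
For reference, the failure can be reproduced by sending any non-UUID X-Request-Id to the server. A minimal sketch (the local base URL and the /v1/models endpoint are assumptions for illustration, not taken from this issue):

```python
# Reproduction sketch: a non-UUID X-Request-Id triggers the 400 response.
# The base URL and endpoint below are assumptions for illustration.
import httpx

response = httpx.get(
    "http://localhost:8000/v1/models",
    headers={"X-Request-Id": "not-a-uuid"},  # similar to the value Spaces injects
)
print(response.status_code)  # 400 with the UUID-validating request-id plugin
```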

abetlen commented 6 months ago

@ging-dev I wasn't aware that the default starlette-context middleware only accepts UUIDs as request IDs.

I think the better change is to write a custom request-id middleware rather than adding an option to disable it. That should fix your issue and give users more flexibility in ID formats.
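
One possible shape for that (a sketch only, not what the repo currently does): subclass starlette-context's base `Plugin` instead of the UUID-validating `RequestIdPlugin`, so any header value is accepted and a UUID is only generated when the header is missing. The name `RawRequestIdPlugin` is made up here.

```python
# Sketch of a permissive request-id plugin for starlette-context.
# RawRequestIdPlugin is a hypothetical name; it is not part of llama-cpp-python.
import uuid
from typing import Any, Optional, Union

from starlette.requests import HTTPConnection, Request
from starlette_context.middleware import RawContextMiddleware
from starlette_context.plugins import Plugin


class RawRequestIdPlugin(Plugin):
    """Accept any X-Request-Id value; generate a UUID only when it is absent."""

    key = "x-request-id"

    async def process_request(
        self, request: Union[Request, HTTPConnection]
    ) -> Optional[Any]:
        # The base Plugin only reads the header named by `key`, no validation.
        value = await super().process_request(request)
        return value or uuid.uuid4().hex


# Usage on the FastAPI/Starlette app, replacing the default RequestIdPlugin:
# app.add_middleware(RawContextMiddleware, plugins=(RawRequestIdPlugin(),))
```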