allenporter / llama-cpp-server

Docker images for easier running of the llama-cpp-python server
Apache License 2.0

Update dependency uvicorn to v0.31.0 #130

Closed · renovate[bot] closed this 1 month ago

renovate[bot] commented 1 month ago

This PR contains the following updates:

| Package | Change | Age | Adoption | Passing | Confidence |
| --- | --- | --- | --- | --- | --- |
| uvicorn (changelog) | `==0.30.6` -> `==0.31.0` | age | adoption | passing | confidence |

Release Notes

encode/uvicorn (uvicorn)

### [`v0.31.0`](https://redirect.github.com/encode/uvicorn/blob/HEAD/CHANGELOG.md#0310-2024-09-27)

[Compare Source](https://redirect.github.com/encode/uvicorn/compare/0.30.6...0.31.0)

##### Added

Improve `ProxyHeadersMiddleware` ([#2468](https://redirect.github.com/encode/uvicorn/issues/2468) and [#2231](https://redirect.github.com/encode/uvicorn/issues/2231)):

- Fix the host for requests from clients running on the proxy server itself.
- Fall back to the host that was already set for empty `x-forwarded-for` headers.
- Allow IP networks to be specified as trusted hosts. This greatly simplifies deployments on Docker Swarm/Kubernetes, where the reverse proxy may have a dynamic IP.
- This includes support for IPv6 addresses/networks.
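
The last two points change how trusted proxies are configured. As a minimal sketch of what that looks like from the server side (the app target, bind address, port, and `10.0.0.0/8` network below are illustrative assumptions, not values from this repository), with uvicorn 0.31.0 the `forwarded_allow_ips` setting can name a whole network instead of a fixed proxy IP:

```python
# Minimal sketch with assumed values; not this repository's actual entrypoint.
# With uvicorn >= 0.31.0, forwarded_allow_ips accepts IP networks (CIDR),
# so a reverse proxy with a dynamic address inside a known subnet is trusted.
import uvicorn

if __name__ == "__main__":
    uvicorn.run(
        "main:app",                        # hypothetical ASGI app "module:attribute"
        host="0.0.0.0",
        port=8000,
        proxy_headers=True,                # honor X-Forwarded-For / X-Forwarded-Proto
        forwarded_allow_ips="10.0.0.0/8",  # example IPv4 network; IPv6 networks also work
    )
```

The equivalent CLI form would be `uvicorn main:app --proxy-headers --forwarded-allow-ips 10.0.0.0/8`.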

Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Enabled.

♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.



This PR was generated by Mend Renovate. View the repository job log.