BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Add various security headers #3677

Open · Manouchehri opened this issue 6 months ago

Manouchehri commented 6 months ago

The Feature

LiteLLM should add the following headers:

  1. content-security-policy
  2. cross-origin-resource-policy
  3. cross-origin-opener-policy
  4. cross-origin-embedder-policy
  5. x-frame-options
  6. x-content-type-options
  7. access-control-allow-origin

IMO, LiteLLM should use PROXY_BASE_URL as the default for calculating these headers.

Reasonable defaults for all requests/paths would be (assuming PROXY_BASE_URL="https://example.com/"):

x-content-type-options: nosniff
x-frame-options: DENY
cross-origin-resource-policy: same-origin
cross-origin-opener-policy: same-origin
cross-origin-embedder-policy: require-corp
access-control-allow-origin: https://example.com
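
LiteLLM's proxy is built on FastAPI, so one way to bolt these defaults on would be a small Starlette middleware. This is a rough sketch only, not LiteLLM's actual implementation; the PROXY_BASE_URL handling and the idea of not clobbering headers set elsewhere are my assumptions:

```python
# Rough sketch -- not LiteLLM's actual implementation. Assumes the proxy is a
# FastAPI app and that PROXY_BASE_URL is exposed as an environment variable.
import os
from urllib.parse import urlparse

from fastapi import FastAPI
from starlette.middleware.base import BaseHTTPMiddleware
from starlette.requests import Request
from starlette.responses import Response

app = FastAPI()

# Derive the allowed origin from PROXY_BASE_URL (e.g. "https://example.com/").
_parsed = urlparse(os.getenv("PROXY_BASE_URL", ""))
ALLOWED_ORIGIN = f"{_parsed.scheme}://{_parsed.netloc}" if _parsed.netloc else ""

SECURITY_HEADERS = {
    "x-content-type-options": "nosniff",
    "x-frame-options": "DENY",
    "cross-origin-resource-policy": "same-origin",
    "cross-origin-opener-policy": "same-origin",
    "cross-origin-embedder-policy": "require-corp",
}
if ALLOWED_ORIGIN:
    SECURITY_HEADERS["access-control-allow-origin"] = ALLOWED_ORIGIN


class SecurityHeadersMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request: Request, call_next) -> Response:
        response = await call_next(request)
        for name, value in SECURITY_HEADERS.items():
            # Only fill in a header if the route hasn't already set it.
            if name not in response.headers:
                response.headers[name] = value
        return response


app.add_middleware(SecurityHeadersMiddleware)
```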

The CSP is a bit more complicated. For example, the CSP for https://example.com/v1/chat/completions and https://example.com/ui will be completely different.

This is an UNSAFE/bad example of a CSP for LiteLLM:

content-security-policy: default-src * 'unsafe-inline'; img-src * 'self' data:
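
To illustrate the per-path point, a hypothetical helper could pick a strict CSP for API routes (which only return JSON) and a looser one for the /ui dashboard (which has to load scripts, styles, and images). The policy strings below are placeholders for discussion, not audited values, and the helper would slot into a middleware like the sketch above:

```python
# Hypothetical per-path CSP selection -- policy values are illustrative
# placeholders, not audited recommendations.
API_CSP = "default-src 'none'; frame-ancestors 'none'"
UI_CSP = "default-src 'self'; img-src 'self' data:; style-src 'self' 'unsafe-inline'"


def csp_for_path(path: str) -> str:
    # /ui serves the admin dashboard; everything else (e.g.
    # /v1/chat/completions) only ever returns JSON.
    return UI_CSP if path.startswith("/ui") else API_CSP
```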

Motivation, pitch

Security hardening of LiteLLM is always a good idea imo. =)

Twitter / LinkedIn details

https://twitter.com/DaveManouchehri

krrishdholakia commented 6 months ago

@Manouchehri help me understand this more.

Are these headers part of the request, which we then need to return in the response headers?

Manouchehri commented 6 months ago

These headers should be added to all responses; the request itself shouldn't have an impact.

krrishdholakia commented 6 months ago
content-security-policy
cross-origin-resource-policy
cross-origin-opener-policy
cross-origin-embedder-policy
x-frame-options
x-content-type-options
access-control-allow-origin

what do these mean exactly? @Manouchehri

and how might they change between requests?