caddyserver / cache-handler

Distributed HTTP caching module for Caddy
Apache License 2.0

Redis with Large Cached Content Fails Resulting in Memory Leak #81

Closed: 0xEmma closed this issue 4 months ago

0xEmma commented 5 months ago

Using Redis v7.2.4

Files of >=512 MB result in the error: `Impossible to set value into Redis, write tcp 10.69.42.43:57726->10.69.42.42:6379: write: connection reset by peer`

This is most likely a result of Redis's 512 MB cap on string values. Files of ~900 MB get stuck partway through the download when served through the cache handler.

Large files (a few gigabytes) result in an HTTP timeout that never allows the download to complete, and cause a memory leak that consumes all of the system's memory.

Suggested fix: if the Redis write fails, bypass the cache on first load, send the response to the client, and cache in a non-blocking manner.
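
A minimal sketch of that idea, assuming a go-redis client and standard net/http types (the `writeThrough` helper and its signature are illustrative, not the module's actual API):

```go
package cachesketch

import (
	"context"
	"log"
	"net/http"
	"time"

	"github.com/redis/go-redis/v9"
)

// writeThrough serves the response body to the client immediately and
// attempts to store it in Redis in the background. If the Redis write
// fails (e.g. the value exceeds proto-max-bulk-len), the request is not
// blocked and nothing is held waiting on the cache.
func writeThrough(w http.ResponseWriter, rdb *redis.Client, key string, body []byte, ttl time.Duration) {
	// Send the response first so the client never waits on the cache store.
	w.Write(body)

	// Cache asynchronously; a failure only produces a log line.
	go func() {
		ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
		defer cancel()
		if err := rdb.Set(ctx, key, body, ttl).Err(); err != nil {
			log.Printf("cache bypassed for %q: %v", key, err)
		}
	}()
}
```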

0xEmma commented 5 months ago

Setting `proto-max-bulk-len` in the Redis config fixes the issue for files that don't exceed the HTTP timeout (aside from a long delay before the download starts).
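
For reference, the relevant redis.conf line; the default is 512mb, and the 1gb value below is only an example:

```
# redis.conf: raise the maximum size of a single bulk string the server accepts
proto-max-bulk-len 1gb
```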

However, the HTTP timeout still occurs for larger files and causes a memory leak.

darkweak commented 4 months ago

Hello @0xEmma, IMHO you may use the max_body_bytes directive to prevent these files from being cached.
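
Something along these lines, as a sketch only: the `max_body_bytes` name comes from the comment above, but the option placement, the size value, and the backend address are assumptions to be checked against the cache-handler README:

```
{
	cache {
		# illustrative limit: skip caching for responses larger than ~100 MB
		max_body_bytes 104857600
	}
}

example.com {
	cache
	reverse_proxy backend:8080
}
```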