fastify / fastify-compress

Fastify compression utils
MIT License

Unsupported compression of streams #309

Open ludovicm67 opened 3 months ago

ludovicm67 commented 3 months ago

Prerequisites

Fastify version

4.28.1

Plugin version

7.0.3

Node.js version

20.15.1

Operating system

macOS

Operating system version (e.g. 20.04, 11.3, 10)

14.6.1

Description

In an async route handler, I return a stream as the response body. If the client doesn't request compression, everything works as expected. But if the client asks for compression, it gets a 500 with the following error:

{"statusCode":500,"code":"ERR_INVALID_ARG_TYPE","error":"Internal Server Error","message":"The \"string\" argument must be of type string or an instance of Buffer or ArrayBuffer. Received an instance of ReadableStream"}

Minimal reproducible example

Configure a basic project and install required modules:

mkdir mre && cd mre # Create a new directory
npm init -y # Create a new package.json file
npm pkg set type="module" # Make sure type=module in package.json
npm i fastify @fastify/compress # Install required dependencies for the repro

index.js:

import fastify from 'fastify'
import fastifyCompress from '@fastify/compress'

const app = fastify()
await app.register(fastifyCompress)

const apiUrl = 'https://jsonplaceholder.typicode.com/posts'

// Here it's working fine, but we lose the streaming capability
app.get('/', async (_req, reply) => {
  const res = await fetch(apiUrl);
  const data = await res.text(); // We need to wait for the whole response to be downloaded

  reply
    .type('application/json')
    .send(data)
  return reply
})

// Here it breaks if we ask for compression, because we can't send a ReadableStream
app.get('/broken', async (_req, reply) => {
  const res = await fetch(apiUrl);
  const data = res.body; // We get a ReadableStream (see: https://developer.mozilla.org/en-US/docs/Web/API/Request/body)

  reply
    .type('application/json')
    .send(data)
  return reply
})

await app.listen({ port: 3000 })

Example of requests:

# Compression not asked by the client
curl -v http://localhost:3000/ # OK
curl -v http://localhost:3000/broken # OK (even though the path is called "broken")

# Compression asked by the client
curl -v --compressed http://localhost:3000/ # OK
curl -v --compressed http://localhost:3000/broken # Not OK

All requests work as expected, except the last one.

Here is the output I'm getting:

> curl -v --compressed http://localhost:3000/broken

* Host localhost:3000 was resolved.
* IPv6: ::1
* IPv4: 127.0.0.1
*   Trying [::1]:3000...
* Connected to localhost (::1) port 3000
> GET /broken HTTP/1.1
> Host: localhost:3000
> User-Agent: curl/8.7.1
> Accept: */*
> Accept-Encoding: deflate, gzip
>
* Request completely sent off
< HTTP/1.1 500 Internal Server Error
< content-type: application/json; charset=utf-8
< content-length: 219
< Date: Wed, 21 Aug 2024 15:13:36 GMT
< Connection: keep-alive
< Keep-Alive: timeout=72
<
* Connection #0 to host localhost left intact
{"statusCode":500,"code":"ERR_INVALID_ARG_TYPE","error":"Internal Server Error","message":"The \"string\" argument must be of type string or an instance of Buffer or ArrayBuffer. Received an instance of ReadableStream"}

Link to code that reproduces the bug

No response

Expected Behavior

I expect the curl -v --compressed http://localhost:3000/broken command in the reproduction example to return the expected output to the user, not an error.

Maybe it's a missing feature, and implementing chunked encoding for streams would solve this?

mcollina commented 3 months ago

Apparently globalThis.ReadableStream is not supported yet in this plugin. I think it should be easy enough to add if we switch from pump to require('stream').pipeline in this module; pipeline should handle these out of the box.
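
For context, here is a stand-alone sketch (not the plugin's actual code; the output file name is made up for illustration) of what the suggested switch relies on: stream.pipeline accepts a web ReadableStream as its source in Node 18.13+ / 19.1+, while pump only understands Node streams.

import { pipeline } from 'node:stream'
import { createWriteStream } from 'node:fs'
import zlib from 'node:zlib'

// Illustration only: pipe a fetch() body (a web ReadableStream) through
// gzip into a file. pipeline accepts the web stream as the source directly.
const res = await fetch('https://jsonplaceholder.typicode.com/posts')
pipeline(res.body, zlib.createGzip(), createWriteStream('posts.json.gz'), (err) => {
  if (err) console.error('pipeline failed', err)
})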

(As a workaround, you can call require('node:stream').Readable.fromWeb(res.body))
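
A minimal sketch of that workaround applied to the reproduction above (the /workaround route name is made up for illustration):

import { Readable } from 'node:stream'

// Convert the web ReadableStream returned by fetch() into a Node.js
// Readable before sending it, so the plugin can pipe it through its
// compression stream.
app.get('/workaround', async (_req, reply) => {
  const res = await fetch(apiUrl)

  reply
    .type('application/json')
    .send(Readable.fromWeb(res.body))
  return reply
})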

mcollina commented 3 months ago

A PR with a fix would be highly welcomed!

ludovicm67 commented 3 months ago

@mcollina Thanks a lot for the workaround; it works well 👍

Unfortunately, replacing pump with require('stream').pipeline did not work; the same issue is still present.

bencoder commented 2 months ago

I attempted to solve this in #312, but in the end I think solving it would require a bigger re-architecture to more modern standards, as also described by @mcollina here: https://github.com/fastify/fastify-compress/issues/297#issuecomment-2119902305. There is a test case I added on that branch if anyone else wants to pick this up.