whatwg / compression

Compression Standard
https://compression.spec.whatwg.org/

Resolved Promises of WritableStreamDefaultWriter Methods #55

Closed Offroaders123 closed 1 year ago

Offroaders123 commented 1 year ago

I discovered that the methods of WritableStreamDefaultWriter (at least when used with CompressionStream and DecompressionStream) don't resolve their promises consistently across implementations. Is there a defined standard behavior for when they should resolve? I tracked my findings leading up to this in this issue.

As mentioned in the WICG draft report, you can use the Compression Streams API with an ArrayBuffer by getting a WritableStreamDefaultWriter and using its writer.write() and writer.close() methods to feed the ArrayBuffer in as chunks.

Deflate-compress an ArrayBuffer to a Uint8Array
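
For reference, that pattern looks roughly like the following (adapted from the explainer's example, not copied verbatim):

async function compressArrayBuffer(input: ArrayBuffer, format: "deflate" | "deflate-raw" | "gzip"): Promise<Uint8Array> {
  const cs = new CompressionStream(format);
  const writer = cs.writable.getWriter();
  // Note: these two calls are intentionally not awaited here; see below.
  writer.write(input);
  writer.close();
  // Collect the compressed chunks from the readable side.
  const chunks: Uint8Array[] = [];
  let totalSize = 0;
  const reader = cs.readable.getReader();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    chunks.push(value);
    totalSize += value.byteLength;
  }
  // Concatenate the chunks into a single Uint8Array.
  const result = new Uint8Array(totalSize);
  let offset = 0;
  for (const chunk of chunks) {
    result.set(chunk, offset);
    offset += chunk.byteLength;
  }
  return result;
}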

These two methods each return a Promise, however, so I thought it would make sense to await them, so that any errors from the calls would bubble back up to the enclosing async function. This doesn't appear to work across all platform implementations, though: adding these await calls works correctly in Node.js's implementation, but not in Chrome's, Safari's, or Firefox's.

When awaited, writer.write() and writer.close() never resolve in any of the three browsers, while in Node.js they resolve promptly with undefined. Is this because Node.js has a stream implementation distinct from the browsers'? I don't see why these never resolve in the browsers.
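
A minimal reproduction of the hang (the data value is just example input):

const data = new TextEncoder().encode("Hello!");
const cs = new CompressionStream("gzip");
const writer = cs.writable.getWriter();
// In Chrome, Safari, and Firefox these awaits never settle, since nothing
// is reading from cs.readable yet; in Node.js they resolve with undefined.
await writer.write(data);
await writer.close();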

The interesting part is that they all do throw errors where applicable, say if the data is being decompressed with a format that doesn't match the one it was actually compressed with.

So, without being able to await these methods, you have to catch the errors manually with your own .catch() handlers, much like what you have to do when using for await loops, which Jake Archibald covered in an article on his blog.
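
In other words, the workaround looks something like this (handleError is a stand-in for whatever error handling you need):

const data = new TextEncoder().encode("Hello!");
const ds = new DecompressionStream("gzip");
const writer = ds.writable.getWriter();
// A stand-in for whatever error handling the caller needs.
const handleError = (reason: unknown) => console.error(reason);
// Attach rejection handlers instead of awaiting, so a failed
// write()/close() doesn't surface as an unhandled rejection.
writer.write(data).catch(handleError);
writer.close().catch(handleError);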

I mainly discovered this in my own project, where I use the Compression Streams API to discern the compression format of a given ArrayBuffer file, with nested try-catch statements to find out which format the file was (or wasn't) compressed with. (Example code below, similar to that in my project.)

declare type CompressionFormat = "deflate" | "deflate-raw" | "gzip";

// This is where the `WritableStreamDefaultWriter` implementation is located.
declare function decompress(data: Uint8Array, format: CompressionFormat): Promise<Uint8Array>;

export type Compression = CompressionFormat | null;

export interface ReadOptions {
  compression?: Compression;
}

export async function read(data: Uint8Array, { compression }: ReadOptions = {}): Promise<Uint8Array> {

  // Simplified from the full function body in my project.

  if (compression === undefined){
    try {
      return await read(data,{ compression: null });
    } catch (error){
      try {
        return await read(data,{ compression: "gzip" });
      } catch {
        try {
          return await read(data,{ compression: "deflate" });
        } catch {
          try {
            return await read(data,{ compression: "deflate-raw" });
          } catch {
            throw error;
          }
        }
      }
    }
  }

  // From this point on, `compression` is narrowed to anything except `undefined`.

  if (compression !== null){
    data = await decompress(data,compression);
  }

  // `data` is now uncompressed; no compression format had to be known ahead of time.
  return data;
}
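
For illustration, calling it could look something like this (the fetched file is a stand-in, not from my project):

const response = await fetch("./example.bin");
const bytes = new Uint8Array(await response.arrayBuffer());
// Tries no compression first, then gzip, deflate, and deflate-raw.
const result = await read(bytes);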
ricea commented 1 year ago

The promise returned by write() doesn't resolve until the chunk is actually decompressed, which doesn't happen until something reads from the readable side. Since using await prevents the function from reaching the reader.read() line, nothing ever reads, and the function cannot make progress. It's stuck waiting forever for something to read.

The behaviour of Chrome, Safari and Firefox is correct.
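
To satisfy that constraint while still awaiting the promises for error propagation, one option (a sketch, not from the thread) is to drain the readable side concurrently and await both sides together:

async function decompress(data: Uint8Array, format: "deflate" | "deflate-raw" | "gzip"): Promise<Uint8Array> {
  const ds = new DecompressionStream(format);
  const writer = ds.writable.getWriter();

  // Start the write and the close, keeping the promise instead of awaiting
  // it here, so the read loop below can make progress.
  const writes = writer.write(data).then(() => writer.close());

  // Drain the readable side concurrently; this is what allows the
  // write() promise to settle.
  const chunks: Uint8Array[] = [];
  const reads = (async () => {
    const reader = ds.readable.getReader();
    while (true) {
      const { value, done } = await reader.read();
      if (done) break;
      chunks.push(value);
    }
  })();

  // Await both sides together; a rejection from either (for example, a
  // format mismatch) bubbles up to the caller.
  await Promise.all([writes, reads]);

  // Concatenate the chunks into a single Uint8Array.
  const totalSize = chunks.reduce((size, chunk) => size + chunk.byteLength, 0);
  const result = new Uint8Array(totalSize);
  let offset = 0;
  for (const chunk of chunks) {
    result.set(chunk, offset);
    offset += chunk.byteLength;
  }
  return result;
}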