jimmywarting / StreamSaver.js

StreamSaver writes streams to the filesystem directly and asynchronously
https://jimmywarting.github.io/StreamSaver.js/example.html
MIT License

High CPU usage with Firefox on macOS #179

Open andgest opened 4 years ago

andgest commented 4 years ago

Hello, I have a problem with StreamSaver on macOS. I tested Firefox 79 on three Macs with different major OS versions, downloading large files (2 GB), and in each case the CPU usage climbs very high.

My code is below:

import streamSaver from "streamsaver";
import {WritableStream} from "web-streams-polyfill/ponyfill";
import {from, Observable} from "rxjs";

....

download(url: string, fileName: string, size?: number): Observable<number> {
    if (isNullOrUndefined(streamSaver.WritableStream)) {
      streamSaver.WritableStream = WritableStream;
    }

    if (isTrue(environment.useSelfHostedStreamSaverMitm)) {
      // Use a self-hosted mitm.html so the download does not appear to come from https://jimmywarting.github.io/StreamSaver.js
      streamSaver.mitm = window.location.origin + window.location.pathname.replace(/\/[^\/]+$/, "/") + "mitm.html";
    }
    const fileStream = streamSaver.createWriteStream(fileName, {
      size: size,
      writableStrategy: undefined,
      readableStrategy: undefined,
    });

    const requestHeaders: HeadersInit = new Headers();
    requestHeaders.set("Content-Type", "application/json");
    const token = this.oauth2Service.getAccessToken();
    if (!isNullOrUndefined(token)) {
      requestHeaders.set("Authorization", "Bearer " + token);
    }

    const fetchResult = fetch(url, {
      headers: requestHeaders,
    }).then(res => {
      if (res.status !== HttpStatus.OK) {
        return res.status;
      }
      const readableStream = res.body;
      const writableStream = streamSaver.WritableStream;
      // Optimized path for browsers that support pipeTo (like Chrome)
      if (writableStream && readableStream.pipeTo) {
        return readableStream.pipeTo(fileStream)
          .then(() => HttpStatus.OK);
      }

      // Fallback for other browsers (like Firefox): pump chunks manually
      const writer = fileStream.getWriter();
      const reader = res.body.getReader();
      const pump = (): Promise<number> => reader.read()
        .then(result => {
          if (result.done) {
            // Close the writer, then report success
            return writer.close().then(() => HttpStatus.OK);
          }
          // Wait for each chunk to be written before reading the next one, so
          // the returned promise resolves only after the whole body is written
          return writer.write(result.value).then(pump);
        });
      return pump();
    });

    return from(fetchResult);
  }

Thank you in advance for your help! :)

jimmywarting commented 4 years ago

Hmm, not exactly sure what it could be. One theory: a known issue with StreamSaver is that it doesn't have solid backpressure to the service worker, so it could be that you are writing things too fast. This problem goes away in browsers that support transferable streams. The second issue is that chunks are not transferable during the postMessage (they all get cloned).
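To make that concrete, here is a rough sketch of the difference between cloning and transferring when posting data to a service worker. This is illustrative only, not StreamSaver's actual internals; the channel setup and names are assumptions:

// Hypothetical setup: a MessageChannel whose second port is handed to the
// service worker, similar in spirit to how mitm-based savers communicate.
const channel = new MessageChannel();
navigator.serviceWorker.controller!.postMessage({ port: channel.port2 }, [channel.port2]);

function sendChunk(chunk: Uint8Array): void {
  // Default structured clone: every chunk's bytes are copied.
  // channel.port1.postMessage(chunk);

  // Transferring the underlying ArrayBuffer moves it instead of copying it;
  // `chunk` becomes unusable afterwards, but no copy is made.
  channel.port1.postMessage(chunk, [chunk.buffer]);
}

// In browsers with transferable streams, the whole ReadableStream can be
// handed over once, and the browser manages backpressure between the two
// sides natively (this line only works where streams are transferable):
declare const body: ReadableStream<Uint8Array>;
navigator.serviceWorker.controller!.postMessage({ readableStream: body }, [body]);

Without the transfer list, each postMessage copy plus the lack of backpressure is exactly the kind of thing that can pin a CPU on large downloads.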

My advice to everyone downloading something from a server is to always try to solve it on the server side instead of mimicking it in a service worker.
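For example, solving it purely server-side could look roughly like this (a minimal sketch, assuming a Node/Express backend; the route, directory, and port are made up for illustration):

import express from "express";
import { createReadStream } from "fs";
import { basename } from "path";

const app = express();

app.get("/download/:name", (req, res) => {
  // basename() guards against path traversal; validate further in real code.
  const name = basename(req.params.name);
  // Content-Disposition: attachment triggers the browser's native save dialog
  // and download manager, so no service worker or mitm page is involved.
  res.setHeader("Content-Disposition", `attachment; filename="${name}"`);
  res.setHeader("Content-Type", "application/octet-stream");
  // "/files" is a hypothetical directory for this sketch.
  createReadStream(`/files/${name}`).pipe(res);
});

app.listen(3000);

With this, the browser handles the whole transfer itself, with proper backpressure and no extra CPU work in page JavaScript.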

It may be more related to #145. If you like, you could try out my native-file-system-adapter instead. It also has a way to download things via a service worker (and doesn't involve any mitm). The backpressure problem is solved over there, since the service worker can talk back to the main thread when it is ready to accept more data, and the data sent to the service worker is transferable instead.
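Usage is roughly like this (a sketch based on the adapter's File System Access-style API; check the adapter's README for the exact options it supports):

import { showSaveFilePicker } from "native-file-system-adapter";

async function saveFromUrl(url: string, fileName: string): Promise<void> {
  const res = await fetch(url);
  if (!res.ok || !res.body) throw new Error(`HTTP ${res.status}`);

  // Uses the native picker where available, otherwise the adapter's fallback.
  const handle = await showSaveFilePicker({ suggestedName: fileName });
  const writable = await handle.createWritable();

  // pipeTo lets the browser manage backpressure end to end.
  await res.body.pipeTo(writable);
}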