jimmywarting / StreamSaver.js

StreamSaver writes streams to the filesystem directly and asynchronously
https://jimmywarting.github.io/StreamSaver.js/example.html
MIT License

Version 102 works normally. After Firefox is upgraded to 103 or later, exporting a file fails because the stream write aborts, leaving the download incomplete. #292

Closed · davidpan123 closed 1 year ago

davidpan123 commented 1 year ago

Compared with version 102, version 103 fixes the following security vulnerabilities: https://www.mozilla.org/en-US/security/advisories/mfsa2022-28/

ISummerRainI commented 1 year ago

I'm facing this issue too. I tried the solution from https://github.com/jimmywarting/StreamSaver.js/issues/290, but the problem remains.

davidpan123 commented 1 year ago

Console error: Failed to load ''. A ServiceWorker intercepted the request and encountered an unknown error.

TommyBacco commented 1 year ago

Any update on this? Has anyone managed to work around it?

Harshit-Pratap-Singh commented 1 year ago

The download stops at 1.5 GB in Firefox.

yxq-neuralgalaxy commented 1 year ago

When I download a 500 MB file in Firefox, the pipeTo promise never settles: neither the then nor the catch callback runs.

const fileStream = streamsaver.createWriteStream(title);
const readableStream = response.body;
if (readableStream) {
  if (readableStream.pipeTo) {
    return readableStream.pipeTo(fileStream)
      .then(() => {
        // When downloading big files, this never runs
      })
      .catch(error => {
        // ...and neither does this
      });
  }
  // Fallback without pipeTo; only reached when downloading small files
  const writer = fileStream.getWriter();
  const reader = readableStream.getReader();
  const pump = async () => reader.read()
    .then((res: any) => res.done
      ? writer.close()
      : writer.write(res.value).then(pump));
  await pump();
}
Firefox v107.0.1, StreamSaver v2.0.6, web-streams-polyfill v3.2.1

The same code works normally in Chrome.

jimmywarting commented 1 year ago

seems like browsers are taking extra measures to prevent third-party origins from downloading stuff through a service worker coming from another domain... firefox may be even more restrictive nowadays...

guess the best course of action is to provide a guide on how to download stuff using a self-hosted service worker... I have found one potential iframe sandbox flag that might help...

https://developer.mozilla.org/en-US/docs/Web/HTML/Element/iframe#attr-sandbox

  • allow-downloads-without-user-activation (Experimental): allows downloads to occur without a gesture from the user.
  • allow-downloads: allows downloads to occur with a gesture from the user.

maybe this ☝️ can help solve the Firefox problem?
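
something like the sketch below might work (untested): load the man-in-the-middle page in a sandboxed iframe that explicitly grants the download permission. The mitm.html URL here is StreamSaver's hosted man-in-the-middle page; the sandbox flags are the ones from the MDN page above.

  // Untested sketch: sandbox the man-in-the-middle iframe and
  // explicitly allow downloads from it
  const iframe = document.createElement('iframe')
  iframe.hidden = true
  iframe.setAttribute('sandbox', 'allow-scripts allow-same-origin allow-downloads')
  iframe.src = 'https://jimmywarting.github.io/StreamSaver.js/mitm.html'
  document.body.appendChild(iframe)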

Harshit-Pratap-Singh commented 1 year ago

I am using this code to write data coming in chunks from another user over a peer-to-peer connection. It works fine in Chrome and can handle large files, but when Firefox is the receiving end the download fails at 1.5 GB.

// Called for every incoming data-channel message. The first "start"
// message carries the file metadata; everything after it is file data.
function helperReceiveFile(e) {
  if (!fileStream.current && e.data.toString().includes("start")) {
    const meta = JSON.parse(e.data);
    fileStream.current = streamsaver.createWriteStream(meta.fileName, { size: meta.size });
    fileStreamWriter.current = fileStream.current.getWriter();
    return;
  }
  handleReceiveFile(e);
}

function handleReceiveFile(e) {
  if (e.data.toString().includes("done")) {
    // Final message: close the writer and reset for the next transfer
    setGotFile(true);
    fileStreamWriter.current.close();
    fileStream.current = null;
    fileName.current = JSON.parse(e.data).fileName;
  } else {
    // Binary chunk: write it straight to disk via the stream writer
    fileStreamWriter.current.write(new Uint8Array(e.data)).catch(err => {
      console.log("error", err);
    });
  }
}

I want to handle large files, so I have to write the data directly to disk. @jimmywarting

arcsun commented 1 year ago

@Harshit-Pratap-Singh if your server supports HTTP Range requests, you can try splitting the download into multiple requests and still write all of the responses to one file. I'm using 50 MB chunks and it works fine.
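
Roughly what I mean (assuming the server honors the Range header; url, title and totalSize are placeholders):

  // Download a file in 50 MB Range requests, piping every partial
  // response into one StreamSaver write stream
  const CHUNK = 50 * 1024 * 1024;
  async function downloadInRanges(url, title, totalSize) {
    const fileStream = streamsaver.createWriteStream(title, { size: totalSize });
    const writer = fileStream.getWriter();
    for (let start = 0; start < totalSize; start += CHUNK) {
      const end = Math.min(start + CHUNK, totalSize) - 1;
      const res = await fetch(url, { headers: { Range: `bytes=${start}-${end}` } });
      const reader = res.body.getReader();
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        await writer.write(value);
      }
    }
    await writer.close();
  }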

jeremyckahn commented 1 year ago

I've noticed that Firefox doesn't prematurely kill the Service Worker process while the Firefox DevTools have the sw.js file open.
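
That suggests Firefox is terminating the service worker as idle mid-download. A periodic keep-alive ping like this might buy time (just a sketch, not StreamSaver's actual mechanism):

  // Message the service worker on an interval so the browser doesn't
  // treat it as idle and kill it while a download is in flight
  navigator.serviceWorker.ready.then(reg => {
    const keepAlive = setInterval(() => {
      if (reg.active) reg.active.postMessage('ping');
      else clearInterval(keepAlive);
    }, 10000);
  });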

jeremyckahn commented 1 year ago

It seems that https://github.com/jimmywarting/StreamSaver.js/pull/305 should fix this issue.