googleapis / nodejs-storage

Node.js client for Google Cloud Storage: unified object storage for developers and enterprises, from live data serving to data analytics/ML to data archiving.
https://cloud.google.com/storage/
Apache License 2.0

NodeError: Cannot call write after a stream was destroyed #2379

Closed arnavzek closed 6 months ago

arnavzek commented 10 months ago

Environment details

Steps to reproduce

The library works fine locally and on an AWS EC2 instance, but it produces the following error when deployed to AWS Elastic Beanstalk:

web[12135]: NodeError: Cannot call write after a stream was destroyed
Nov 30 21:22:58 ip-172-31-39-94 web[12135]:    at doWrite (/var/app/current/node_modules/readable-stream/lib/_stream_writable.js:390:38)
Nov 30 21:22:58 ip-172-31-39-94 web[12135]:    at writeOrBuffer (/var/app/current/node_modules/readable-stream/lib/_stream_writable.js:381:5)
Nov 30 21:22:58 ip-172-31-39-94 web[12135]:    at Writable.write (/var/app/current/node_modules/readable-stream/lib/_stream_writable.js:302:11)
Nov 30 21:22:58 ip-172-31-39-94 web[12135]:    at HashStreamValidator.ondata (node:internal/streams/readable:777:22)
Nov 30 21:22:58 ip-172-31-39-94 web[12135]:    at HashStreamValidator.emit (node:events:517:28)
Nov 30 21:22:58 ip-172-31-39-94 web[12135]:    at addChunk (node:internal/streams/readable:335:12)
Nov 30 21:22:58 ip-172-31-39-94 web[12135]:    at readableAddChunk (node:internal/streams/readable:308:9)
Nov 30 21:22:58 ip-172-31-39-94 web[12135]:    at Readable.push (node:internal/streams/readable:245:10)
Nov 30 21:22:58 ip-172-31-39-94 web[12135]:    at HashStreamValidator._transform (/var/app/current/node_modules/@google-cloud/storage/build/src/hash-stream-validator.js:83:14)
Nov 30 21:22:58 ip-172-31-39-94 web[12135]:    at Transform._write (node:internal/streams/transform:175:8)

Code

    // Runs inside `new Promise((resolve, reject) => { ... })`; `file` is a
    // multer-style upload object ({ buffer, originalname, size }) and
    // `bucket` is a @google-cloud/storage Bucket. Imports used below:
    // const path = require("path"); const { nanoid } = require("nanoid");
    const { buffer } = file;

    let name = nanoid();

    if (file.originalname) {
      name = name + path.extname(file.originalname).toLowerCase();
    }

    const blob = bucket.file(
      `${process.env.ENV_TYPE}/inwire/user-uploads/${name}`
    );
    const blobStream = blob.createWriteStream({
      resumable: false,
    });

    blobStream
      .on("finish", () => {
        // const publicUrl = util.format(
        //   `https://storage.googleapis.com/${bucket.name}/${blob.name}`
        // );
        resolve({ fileName: name });
      })
      .on("error", (err) => {
        console.log(err);
        reject(
          `Unable to upload image, something went wrong. Size:${
            file.size
          } File:${file.originalname} ${JSON.stringify(err)}`
        );
      })
      .end(buffer);
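
For comparison, a minimal sketch of the same upload routed through Node's stream.pipeline, which handles error/close ordering instead of a manual .end() call (the helper name and Promise shape here are assumptions, not part of the report):

const { pipeline } = require("node:stream/promises");
const { Readable } = require("node:stream");

// Hypothetical helper: wrap the buffer in a one-element array so it is
// emitted as a single Buffer chunk, then let pipeline() propagate errors
// between the source and the write stream.
async function uploadBuffer(bucket, name, buffer) {
  const blob = bucket.file(name);
  await pipeline(
    Readable.from([buffer]),
    blob.createWriteStream({ resumable: false })
  );
  return { fileName: name };
}
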
andrewsg commented 10 months ago

Hi Arnav, thanks for your report. I have some follow-up questions about your environment.

Is this an intermittent error or does it happen every single time on Beanstalk? Does it only happen on media operations (upload/download) or does it also happen on metadata operations (object.get, bucket.get, etc.)? Is it possible your workload is timing out in some way, causing Beanstalk to terminate open connections? Do you have free access to local disk in the location you are reading or writing?
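
For anyone isolating this, a metadata-only call exercises the API without any media stream (the bucket and object names below are placeholders):

// If this also fails on Beanstalk, the problem is likely the runtime or
// network rather than the write stream itself.
const [metadata] = await storage
  .bucket("my-bucket")
  .file("some-object")
  .getMetadata();
console.log(metadata.size);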

MadeinFrance commented 10 months ago

@andrewsg I'm experiencing the same error when the function runs on GCP Cloud Functions (Node.js). It happens every single time for uploads.

7.5.0 is the last working version.
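
If pinning helps while this is investigated, the last known-good release can be targeted explicitly, e.g. npm install @google-cloud/storage@7.5.0.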

arnavzek commented 10 months ago

It happens every single time. I don't think I can help isolate this issue further because I have switched to S3.

ddelgrosso1 commented 10 months ago

@arnavzek does it occur when resumable: true or only when resumable: false as in your example above?
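
For reference, the flag in question (a sketch; other options as in the snippet above):

// resumable: true (the library default) uploads via a resumable session;
// resumable: false sends the object in a single multipart request.
const blobStream = blob.createWriteStream({
  resumable: true, // flip to false to match the failing case above
});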

ddelgrosso1 commented 10 months ago

@MadeinFrance would you be able to provide a code snippet that demonstrates the behavior on Cloud Functions? What Node version are you utilizing?

MadeinFrance commented 10 months ago

@ddelgrosso1 same as @arnavzek provided, with resumable: false, on Node.js v20.

import { Storage } from "@google-cloud/storage";

const storage = new Storage({
  projectId: 'id',
});
const bucket = storage.bucket('bucket_id');

// `name` is computed elsewhere in the handler.
const blob = bucket.file(name);
const stream = blob.createWriteStream({
  metadata: {
    contentType: 'image/webp', // MIME type for WebP
  },
  resumable: false,
});
// The buffer is then written as in the snippet above, e.g. stream.end(buffer);

ddelgrosso1 commented 6 months ago

Apologies for the late reply; I circled back on this today, @MadeinFrance. I tried this in GCP Cloud Functions using v7.8.0 of the library and was not able to reproduce the error. Are there any other steps or configuration details you could provide that would help in recreating the issue?
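
For anyone else trying to recreate it, a hypothetical minimal Cloud Functions harness along these lines (the function name, BUCKET env var, and payload are assumptions):

const functions = require("@google-cloud/functions-framework");
const { Storage } = require("@google-cloud/storage");

// One small non-resumable upload per request; surfaces stream errors
// in the HTTP response so they are easy to spot in logs.
functions.http("upload", (req, res) => {
  const bucket = new Storage().bucket(process.env.BUCKET);
  const stream = bucket
    .file("repro.txt")
    .createWriteStream({ resumable: false });
  stream.on("error", (err) => res.status(500).send(String(err)));
  stream.on("finish", () => res.status(200).send("ok"));
  stream.end(Buffer.from("hello"));
});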

MadeinFrance commented 6 months ago

@ddelgrosso1 hi there, it's no longer reproducible on v7.9.0; I will stay on the latest version.

ddelgrosso1 commented 6 months ago

Going to close this out. If additional information becomes available please feel free to reopen / create a new issue.