birdofpreyru / react-native-fs

File system access for React Native
https://dr.pogodin.studio/docs/react-native-file-system

How can I speed up the reading of my file from my device? #34

Closed Rossella-Mascia-Neosyn closed 2 months ago

Rossella-Mascia-Neosyn commented 2 months ago

Excuse me if this is the wrong place for this question, but I didn't know where else to post it.

I have to upload videos to AWS/Cloudflare, but the upload is really slow. This is due to two factors:

  1. Reading the file takes 20 seconds for a 23 MB file.
  2. It takes a very long time between the first request, which returns the UploadId, and the upload of the first chunk.

What I want to know is: how can I optimize reading the file from my device?

import { Buffer } from 'buffer';
import RNFS from '@dr.pogodin/react-native-fs'; // or 'react-native-fs', depending on the installed package
import {
  S3Client,
  CreateMultipartUploadCommand,
  UploadPartCommand,
  CompleteMultipartUploadCommand,
  AbortMultipartUploadCommand,
  type CompleteMultipartUploadCommandOutput,
  type UploadPartCommandOutput,
} from '@aws-sdk/client-s3';

// Reads the whole file into memory as a Buffer (this is the slow part).
const readFileFromDevice = async (filepath: string): Promise<Buffer> => {
  const fileData = await RNFS.readFile(filepath, 'base64');
  return Buffer.from(fileData, 'base64');
};

const sendChunk = async (
  s3Client: S3Client,
  fileName: string,
  bucketName: string,
  filepath: string,
): Promise<CompleteMultipartUploadCommandOutput> => {
  let uploadId;
  try {
    const multipartUpload = await s3Client.send(
      new CreateMultipartUploadCommand({
        Bucket: bucketName,
        Key: fileName,
      }),
    );
    uploadId = multipartUpload.UploadId;

    const buffer = await readFileFromDevice(filepath);

    // Multipart uploads require parts of at least 5 MB, except the last one.
    const partSize = 5 * 1024 * 1024;
    const totalParts = Math.ceil(buffer.length / partSize);
    const uploadPromises: Promise<UploadPartCommandOutput>[] = [];

    // Upload each part.
    for (let i = 0; i < totalParts; i++) {
      const start = i * partSize;
      const end = start + partSize;
      const bufferChunk = buffer.subarray(start, end);

      uploadPromises.push(
        s3Client.send(
          new UploadPartCommand({
            Bucket: bucketName,
            Key: fileName,
            UploadId: uploadId,
            Body: bufferChunk,
            PartNumber: i + 1,
          }),
        ),
      );
    }
    const uploadResults = await Promise.all(uploadPromises);

    return await s3Client.send(
      new CompleteMultipartUploadCommand({
        Bucket: bucketName,
        Key: fileName,
        UploadId: uploadId,
        MultipartUpload: {
          Parts: uploadResults.map(({ ETag }, i) => ({
            ETag,
            PartNumber: i + 1,
          })),
        },
      }),
    );
  } catch (err) {
    console.error(err);
    if (uploadId) {
      await s3Client.send(
        new AbortMultipartUploadCommand({
          Bucket: bucketName,
          Key: fileName,
          UploadId: uploadId,
        }),
      );
    }
    throw new Error(err as string);
  }
};
birdofpreyru commented 2 months ago

[screenshot]

In general, the current implementation of many functions in this library is not optimized for large chunks / files: data passed between the JS and native layers is encoded / decoded as Base64, which is a performance and memory hit. So read your file in chunks, and perhaps also decrease the size of the upload chunks if you need a faster result for each of them.
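
For what it's worth, here is a minimal sketch of the chunked-read approach, assuming the library's read(path, length, position, encoding) and stat(path) functions and a 5 MB part size; the helper names (readPartFromDevice, uploadFileInParts) and the sequential one-part-at-a-time loop are only illustrative, not something prescribed by this thread:

import { Buffer } from 'buffer';
import RNFS from '@dr.pogodin/react-native-fs';
import { S3Client, UploadPartCommand } from '@aws-sdk/client-s3';

const PART_SIZE = 5 * 1024 * 1024; // 5 MB, the minimum S3 part size.

// Reads one part of the file: `length` bytes starting at `position`,
// decoded from the Base64 string returned by the native layer.
const readPartFromDevice = async (
  filepath: string,
  position: number,
  length: number,
): Promise<Buffer> => {
  const data = await RNFS.read(filepath, length, position, 'base64');
  return Buffer.from(data, 'base64');
};

// Hypothetical upload loop: reads and uploads one part at a time, so only
// about one part of file data is held in JS memory at any moment.
const uploadFileInParts = async (
  s3Client: S3Client,
  bucketName: string,
  fileName: string,
  filepath: string,
  uploadId: string,
): Promise<{ ETag?: string; PartNumber: number }[]> => {
  const stat = await RNFS.stat(filepath);
  const size = Number(stat.size);
  const totalParts = Math.ceil(size / PART_SIZE);
  const parts: { ETag?: string; PartNumber: number }[] = [];

  for (let i = 0; i < totalParts; i++) {
    const body = await readPartFromDevice(filepath, i * PART_SIZE, PART_SIZE);
    const res = await s3Client.send(
      new UploadPartCommand({
        Bucket: bucketName,
        Key: fileName,
        UploadId: uploadId,
        Body: body,
        PartNumber: i + 1,
      }),
    );
    parts.push({ ETag: res.ETag, PartNumber: i + 1 });
  }
  return parts;
};

Reading and uploading one part at a time keeps memory bounded to roughly a single part; a few parts could still be uploaded concurrently if the network, rather than the Base64 round-trip, turns out to be the bottleneck.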