atataru23 opened this issue 3 weeks ago
Hi @atataru23 - thanks for reaching out.
The error indicates that the entire file is being loaded into memory before it can be split into smaller chunks. This approach can lead to memory issues, especially on devices with limited resources such as iOS devices.
The key distinction between AWS SDK for JavaScript V2 and V3 lies in how they handle large file uploads. In V2, the S3 client provides the ManagedUpload class, whose upload() operation supports uploading large objects using S3's multipart upload feature: the file is split into smaller chunks, which are uploaded in parallel.
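For reference, a minimal sketch of that V2 path (not the reporter's code; the bucket name, region, and `file` variable are placeholder assumptions):

```typescript
// Illustrative V2 sketch: s3.upload() returns a ManagedUpload that performs
// multipart upload under the hood. Bucket/region/file are assumptions.
import S3 from "aws-sdk/clients/s3";

declare const file: File; // e.g. from an <input type="file"> in the browser

const s3 = new S3({ region: "us-east-1" });
const upload = s3.upload(
  { Bucket: "my-bucket", Key: file.name, Body: file },
  // partSize / queueSize control chunk size and upload parallelism
  { partSize: 5 * 1024 * 1024, queueSize: 4 }
);
upload.on("httpUploadProgress", (p) => console.log(`${p.loaded}/${p.total}`));
upload.promise().then(() => console.log("upload complete"));
```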
In V3, the @aws-sdk/lib-storage package provides functionality similar to the upload() operation in V2, with additional features and support for both Node.js and browser runtimes. It is designed to handle large file uploads efficiently, including multipart uploads and automatic chunk splitting.
Depending on the runtime you're using, you can either use the Upload class from @aws-sdk/lib-storage or follow S3's multipart upload documentation to handle large file uploads efficiently while minimizing memory usage.
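For the browser runtime, a minimal sketch of the Upload path might look like this (the bucket name, region, and `file` source are placeholder assumptions; partSize and queueSize are the options lib-storage exposes for chunk size and parallelism):

```typescript
// Minimal V3 sketch, not production code. Bucket/region are assumptions.
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

declare const file: File; // e.g. from an <input type="file">

async function uploadLargeFile(): Promise<void> {
  const upload = new Upload({
    client: new S3Client({ region: "us-east-1" }),
    params: { Bucket: "my-bucket", Key: file.name, Body: file },
    partSize: 5 * 1024 * 1024, // 5 MiB is S3's minimum multipart part size
    queueSize: 4,              // number of parts uploaded in parallel
  });
  upload.on("httpUploadProgress", (p) => console.log(p.loaded, p.total));
  await upload.done();
}
```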
Hope that helps, John
Hello @aBurmeseDev
This is my implementation using @aws-sdk/lib-storage in V3. I'll leave only the relevant parts of the code.
Runtime: Browser
package.json
"@angular/core": "16.2.7",
"@aws-sdk/client-s3": "^3.617.0",
"@aws-sdk/lib-storage": "^3.617.0",
"@aws-sdk/types": "^3.609.0",
"@smithy/fetch-http-handler": "^3.2.4",
Script responsible for uploading files to S3. (relevant parts only)
import { PutObjectCommand, S3Client } from "@aws-sdk/client-s3";
import { FetchHttpHandler } from "@smithy/fetch-http-handler";
import { Upload } from "@aws-sdk/lib-storage";
import { from, Observable } from "rxjs";
export class S3ClientClass {
  constructor() {}

  uploadMedia(trackProgress: (progress: number) => void): Observable<any> {
    this.currentUpload = new Upload({
      client: this.generateClient(),
      params: this.generateClientCommand().input,
    });
    this.currentUpload.on('httpUploadProgress', (progress) => {
      trackProgress(Math.round((progress.loaded / progress.total) * 100));
    });
    return from(this.currentUpload.done());
  }

  private generateClient(): S3Client {
    return new S3Client({
      region: ENV.s3.region,
      credentials: this.credentials,
      useAccelerateEndpoint: ENV.s3.useAccelerated,
      requestHandler: new FetchHttpHandler({
        requestTimeout: 0,
      }),
      maxAttempts: 10000
    });
  }

  private generateClientCommand(): PutObjectCommand {
    return new PutObjectCommand({
      Bucket: ENV.s3.bucket,
      Key: this.generateBucketKey(),
      Body: this.file,
      ACL: 'private',
      ContentType: contentType
    });
  }
}
And where I want to use the upload, I instantiate the class with the relevant data, then call the uploadMedia() method like so:
this.$upload = this.s3Client.uploadMedia(
  (progress: number) => this.progress = progress
).subscribe(() => { /* other logic */ });
This implementation works as expected on desktop and Android devices: large files are automatically split into chunks and uploaded to S3.
But on iOS devices, it makes a blob request that loads the entire file into memory before it is split into chunks, which causes the error "RangeError: Out of Memory". The same happens on macOS: the entire file is loaded into memory and then split into chunks, though no error occurs there because there is enough memory. In some cases on iOS, the browser crashes.
The error always seems to lead here, in chunker.js from @aws-sdk/lib-storage, when using Safari:
if (typeof data.stream === "function") {
  return getChunkStream(data.stream(), partSize, getDataReadableStream);
}
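One possible workaround (an untested assumption on my part, not verified on iOS Safari): instead of letting chunker.js call Blob.stream() on the whole File, hand Upload a ReadableStream that reads the file one slice at a time, so only one chunk-sized slice is ever materialized in memory. The helper name and chunk size below are illustrative:

```typescript
// Hypothetical workaround sketch (not verified on iOS Safari): wrap the File
// in a ReadableStream that pulls fixed-size slices on demand. Blob.slice()
// is lazy; only slice.arrayBuffer() reads that one slice into memory.
function blobToChunkedStream(
  blob: Blob,
  chunkSize = 5 * 1024 * 1024 // matches S3's minimum multipart part size
): ReadableStream<Uint8Array> {
  let offset = 0;
  return new ReadableStream<Uint8Array>({
    async pull(controller) {
      if (offset >= blob.size) {
        controller.close();
        return;
      }
      // Read only the current slice into memory, then enqueue it.
      const buf = await blob.slice(offset, offset + chunkSize).arrayBuffer();
      offset += chunkSize;
      controller.enqueue(new Uint8Array(buf));
    },
  });
}
```

You would then pass `blobToChunkedStream(this.file)` as the Body instead of the File itself; whether lib-storage handles a streamed Body on your ACL/content-type setup without further changes is something to verify.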
Should I move this thread to bugs instead of migration? Or is it something I overlooked?
I appreciate your help @aBurmeseDev !
Pre-Migration Checklist
UPGRADING.md
Which JavaScript Runtime is this issue in?
Browser
AWS Lambda Usage
Describe the Migration Issue
On V3, whenever I try to upload a large file on an iOS device, the AWS upload fails with the error "RangeError: Out of Memory". The entire file is loaded into memory before being split into chunks. macOS does the same thing, but having a lot more memory, the error doesn't happen there. On Windows it works fine, and the same goes for Android devices. On V2 this issue does not happen.
chunker.js
The issue seems to happen here, on data.stream(), which behaves differently on Safari: it makes a blob request that loads the file into memory, and on iPhone/iPad this causes the page to crash or the upload to fail before chunking.
Code Comparison
V3 Code:
V2 Code:
Observed Differences/Errors
Additional Context
Angular app. "@angular/core": "16.2.7"
AWS library versions for v3: "@aws-sdk/client-s3": "^3.617.0", "@aws-sdk/lib-storage": "^3.617.0", "@aws-sdk/types": "^3.609.0", "@smithy/fetch-http-handler": "^3.2.4"
AWS library version for v2: "aws-sdk": "2.1466.0"