What version of Bun is running?
1.0.0+822a00c4d508b54f650933a73ca5f4a3af9a7983
What platform is your computer?
Darwin 22.5.0 x86_64 i386
What steps can reproduce the bug?
The following code:
Results in the following output:
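Neither the snippet nor its output survived in this copy of the report. A minimal sketch of the reproduction it describes, assuming Bun's `Bun.file(...).arrayBuffer()` API; the path is a hypothetical placeholder for a file larger than 4 GiB:

```javascript
// Sketch of the kind of reproduction described (the original snippet was
// not preserved in this copy of the report). BIG_FILE is a hypothetical
// path to a file larger than 4 GiB.
const BIG_FILE = "./big-video.mp4";

async function repro() {
  if (typeof Bun === "undefined") {
    console.log("run this under Bun");
    return;
  }
  const file = Bun.file(BIG_FILE);
  const buf = await file.arrayBuffer();
  // Expected: a runtime error, or buf.byteLength === file.size.
  // Reported: buf.byteLength comes back roughly 4 GiB short, with no error.
  console.log(file.size, buf.byteLength);
}

repro();
```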
The size of the resulting ArrayBuffer corresponds to how much larger the file is than the 4GB limit.
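One plausible reading of that relationship, sketched as arithmetic (the 32-bit-truncation interpretation is my inference, not something confirmed in the report), using the 4,898,508,041-byte file size mentioned in this issue:

```javascript
// An ArrayBuffer size tracking "how much larger than 4 GB" is consistent
// with a length truncated modulo 2^32. Using the 4,898,508,041-byte file
// size reported in this issue:
const fileSize = 4898508041;
const FOUR_GIB = 2 ** 32; // 4294967296

const truncated = fileSize % FOUR_GIB;
console.log(truncated); // 603540745 — exactly fileSize minus 4 GiB
```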
What is the expected behavior?
I would expect a runtime error or the entire file to load into a buffer.
Node has a 2GB limit for readFileSync, which is fine in many cases. The difference is that Node informs me at runtime that my file is too large:

```
node:fs:427
  throw new ERR_FS_FILE_TOO_LARGE(size);
  ^

RangeError [ERR_FS_FILE_TOO_LARGE]: File size (4898508041) is greater than 2 GiB
```

What do you see instead?
When I ran into this issue last week, I was working on an uploader for Cloudflare video streams. Most of the videos are < 4GB. Even though Bun could not load the entire file into a buffer, it failed silently: I only discovered that files > 4GB were not being fully uploaded because of a lucky spot check.
I resolved the issue using a stream instead, which is how I would have solved a similar problem with Node.
Additional information
I'm not sure what the limit of an ArrayBuffer should be. I'm happy to use streams, but creating a handle on a file larger than 4GB shouldn't let you build a partial ArrayBuffer without warning.