joltup / rn-fetch-blob

A project committed to making file access and data transfer easier and more efficient for React Native developers.
MIT License

read chunk for large files, 250mb + #80

Open rexjrs opened 6 years ago

rexjrs commented 6 years ago

Is there a way to read chunks of an asset using a start/end offset? This is must-have functionality when handling large files.

Traviskn commented 6 years ago

Not that I am aware of. I am curious, what is your use case?

Reading large files into memory in JS is something I usually try to avoid, and I simply use rn-fetch-blob for downloading, uploading, and finding the paths to files. Then I delegate the actual file handling to other components.

For example, with a video I would never try to read the file into memory; I would pass the path to a component from react-native-video that handles it in natively implemented code. For a PDF I would do the same, but with a PDF viewing component whose native code can handle the large file.

rexjrs commented 6 years ago

Some upload APIs let you specify which byte range of a file you are sending. E.g., if a file is 10 MB, we might perform 10 requests of 1 MB each. That lets us build an uploader that can resume from its last checkpoint, say if the user's wifi fails or the user kills the app; when they open it back up, they can just continue uploading.

By reading files with a start/end offset, we can make sure we only bring 1 MB of data into memory at a time.

bulats commented 6 years ago

I'm also looking for similar functionality.

bulats commented 6 years ago

@rexjrs Btw, if you still need this you can use RNFetchBlob.fs.slice(path_to_source, path_to_write, firstByte, lastByte);
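Putting fs.slice together with RNFetchBlob.fetch gives the resumable, chunked upload rexjrs described above. A minimal sketch, assuming a server endpoint that accepts Content-Range headers; uploadUrl and the range handling on the server side are placeholders, while fs.stat, fs.slice, wrap, and fs.unlink are existing rn-fetch-blob APIs:

```js
import RNFetchBlob from 'rn-fetch-blob';

const CHUNK_SIZE = 1024 * 1024; // 1 MB per request

// Upload `path` in 1 MB chunks, starting from `resumeFrom` (a byte offset
// persisted from a previous, interrupted attempt).
async function uploadInChunks(path, uploadUrl, resumeFrom = 0) {
  const stat = await RNFetchBlob.fs.stat(path);
  const total = Number(stat.size);

  for (let start = resumeFrom; start < total; start += CHUNK_SIZE) {
    const end = Math.min(start + CHUNK_SIZE, total);
    const chunkPath = `${RNFetchBlob.fs.dirs.CacheDir}/upload_chunk`;

    // slice() copies the byte range into a temporary file, so only one
    // chunk at a time exists on disk beside the source file.
    await RNFetchBlob.fs.slice(path, chunkPath, start, end);

    await RNFetchBlob.fetch(
      'POST',
      uploadUrl,
      {
        'Content-Type': 'application/octet-stream',
        'Content-Range': `bytes ${start}-${end - 1}/${total}`,
      },
      RNFetchBlob.wrap(chunkPath) // stream the chunk file, not a JS string
    );

    await RNFetchBlob.fs.unlink(chunkPath);
    // Persist `end` (e.g. in AsyncStorage) so a later run can resume here.
  }
}
```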

dkaushik95 commented 5 years ago

I have tried loading a large payload from the server (~70 MB). If I load it directly into the UI, it gives an out-of-memory error. So I cached it to the file system and read it back, but that gives an out-of-memory error too. I am looking for something that would page it internally, since it is a JSON array.

dkaushik95 commented 5 years ago

On further investigation, I realized that it is not a good idea to keep such large files in JS memory: it will cause out-of-memory errors and the app will crash. A good fix can be to use a database (like RealmDB or WatermelonDB) that processes large data natively and gives us only as much data as we need to work with.
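For reading without holding the whole file in JS memory, rn-fetch-blob's fs.readStream already delivers a file in fixed-size chunks. A minimal sketch; processChunk is a hypothetical callback (for a JSON array you would feed the chunks to a streaming JSON parser rather than JSON.parse the whole payload):

```js
import RNFetchBlob from 'rn-fetch-blob';

// Scan a large file in 64 KB chunks instead of one fs.readFile() call,
// so peak JS memory stays at one chunk rather than the whole file.
function scanLargeFile(path, processChunk) {
  return RNFetchBlob.fs.readStream(path, 'utf8', 64 * 1024).then(
    (stream) =>
      new Promise((resolve, reject) => {
        stream.open();
        stream.onData((chunk) => processChunk(chunk));
        stream.onError(reject);
        stream.onEnd(resolve);
      })
  );
}
```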

ihusak commented 5 years ago

Can somebody show the right implementation of this task in React Native code? How do you download files larger than 50 MB without out-of-memory errors?
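For plain downloads, the usual answer is to never let the body enter JS memory at all: with config({ fileCache: true }) (or an explicit path), rn-fetch-blob streams the response straight to disk and only hands JS the file path. A minimal sketch; fileUrl is a placeholder:

```js
import RNFetchBlob from 'rn-fetch-blob';

// The response body is written to disk natively; JS only sees the path,
// so a 50 MB+ download never has to fit in JS memory.
async function downloadLargeFile(fileUrl) {
  const res = await RNFetchBlob.config({
    fileCache: true,
    // or: path: RNFetchBlob.fs.dirs.DocumentDir + '/large-file.bin',
  }).fetch('GET', fileUrl);
  // Hand this path to a native component (video player, PDF viewer, ...)
  // instead of reading the file back into JS.
  return res.path();
}
```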

RangerMauve commented 5 years ago

.slice() is promising! Is there any chance we could get an equivalent for writing chunks to a file at a given offset?

This would be useful so that I could download data from a P2P network and store the chunks locally as they're downloaded. https://datproject.org/
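There is still no write-at-offset API (hence this request), but when chunks arrive in order, or can be buffered by the caller until they are, fs.writeStream opened in append mode gets close. A minimal sketch, assuming base64Chunks is an async iterable of base64 strings produced by the P2P transport:

```js
import RNFetchBlob from 'rn-fetch-blob';

// Append chunks to the file as they land; out-of-order chunks would need
// to be held back by the caller until their turn comes.
async function writeChunksSequentially(path, base64Chunks) {
  const stream = await RNFetchBlob.fs.writeStream(path, 'base64', true);
  for await (const chunk of base64Chunks) {
    await stream.write(chunk); // write() returns a promise in recent versions
  }
  stream.close();
}
```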

jonathangreco commented 5 years ago

I suppose there is no news on this?

pke commented 3 years ago

One use case would be to decrypt downloaded files on the fly in block-size chunks. What do you think?
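A minimal sketch of that idea, pairing fs.readStream with fs.writeStream; decryptBlock is hypothetical (plug in your own cipher), and with base64 encoding the buffer size should be a multiple of 3 so chunk boundaries stay valid base64:

```js
import RNFetchBlob from 'rn-fetch-blob';

// Decrypt a downloaded file block by block, without ever holding the
// whole ciphertext in JS memory.
async function decryptFile(srcPath, destPath, decryptBlock) {
  const out = await RNFetchBlob.fs.writeStream(destPath, 'base64');
  const input = await RNFetchBlob.fs.readStream(srcPath, 'base64', 4096 * 3);
  await new Promise((resolve, reject) => {
    input.open();
    input.onData((chunk) => out.write(decryptBlock(chunk)));
    input.onError(reject);
    input.onEnd(resolve);
  });
  out.close();
}
```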

ainnotate commented 6 months ago

This is an important feature. Any plans for this feature?