Closed qiuwei closed 4 years ago
Yes, that should be fine. With HSDS, only the dataset chunks that have been updated need to be written to S3, so updates shouldn't take any longer to run as the file size increases.
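As a minimal sketch of the daily-update pattern described above, the example below uses h5py locally; h5pyd, the HSDS client, mirrors this API, so with HSDS you would open a domain path against the service endpoint instead of a local file. The file name `timeseries.h5` and dataset name `readings` are hypothetical.

```python
import h5py  # h5pyd (the HSDS client library) mirrors this API
import numpy as np

with h5py.File("timeseries.h5", "a") as f:
    if "readings" not in f:
        # Unlimited first axis so the dataset can grow over time;
        # an explicit chunk shape so each daily append only touches
        # a few chunks (with HSDS, only those chunks go to S3).
        f.create_dataset(
            "readings",
            shape=(0, 24),
            maxshape=(None, 24),
            chunks=(7, 24),
            dtype="f8",
        )
    dset = f["readings"]
    today = np.random.rand(1, 24)  # one day of hourly values
    dset.resize(dset.shape[0] + 1, axis=0)
    dset[-1] = today  # only the chunk(s) covering this row are rewritten
```

Because the dataset is chunked, each append rewrites only the chunks that intersect the new row, which is why update cost stays roughly constant as the file grows.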
Closing - please reopen if you have additional questions along this line.
I have read the documents and examples. From my understanding, HSDS is excellent for serving a huge static HDF5 file and supports operations such as slicing and downsampling efficiently.
In our use case, we need to update the HDF5 file frequently, e.g. on a daily basis. I am wondering whether we can use HSDS for this scenario?
Thanks in advance!