jbentvelsen opened 1 year ago
I really like where you are going with this! This is exactly what I was brainstorming as well, regarding the datastore/generator-like functionality (see the sketch after the quoted issue text below).
Quoting the original issue, "Read files in chunks from remote storage" (#36), opened by Joris:
Currently, all workflows require the complete dataset to be available on the local disk. However, services like AWS may also support reading single frames from a dataset at a time. With this 'streaming' approach, the user can start training right away, without having to download the full file first.
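For reference, here is a minimal sketch of that datastore-style streaming, assuming the movie is stored as per-frame TIFF files in a hypothetical S3 bucket (the "s3://my-bucket/ophys_movie/" path below is made up). MATLAB datastores accept remote locations such as s3:// paths, so each read fetches data on demand instead of requiring the whole file to be downloaded first:

```matlab
% Hypothetical bucket and folder layout; MATLAB datastores can point directly
% at an S3 location, given AWS credentials in the usual environment variables.
ds = imageDatastore("s3://my-bucket/ophys_movie/", "FileExtensions", ".tif");

while hasdata(ds)
    frame = read(ds);   % fetches the next frame (one file) from remote storage
    % ... hand 'frame' to the DeepInterpolation training/inference loop ...
end
```

If the data live in one large multi-frame file rather than per-frame files, fileDatastore with a custom ReadFcn would be the analogous route, with the chunking logic living inside that read function.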