Presently, the raw image data can only be downloaded one image at a time, using URLs of the form http://spectralworkbench.org/system/photos/<id>/original/capture.png. (If that format is wrong, the correct filename is stored in each spectrum's metadata.)
If a bulk download of 60+ images is needed for offline processing experiments, iterating over this URL format puts a major strain on the server. Worse still, if the data isn't saved locally, every modification to an experiment triggers a fresh download of all 60+ images.
What we want is a system that gathers the raw images for a set of spectrum ids submitted by the user. The backend would fetch the images, compress them, and send them to the user in a single push.
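A minimal sketch of that backend logic might look like the following. This assumes the URL pattern quoted above is correct and names the helper functions (`image_url`, `build_zip`) hypothetically; a real implementation would live in the Rails app, stream rather than buffer in memory, and handle missing ids.

```python
# Hypothetical sketch: fetch raw images for a list of spectrum ids
# and pack them into one zip archive for a single download.
import io
import zipfile
from urllib.request import urlopen

# URL pattern taken from the description above.
BASE = "http://spectralworkbench.org/system/photos/{id}/original/capture.png"

def image_url(spectrum_id):
    """Build the raw-image URL for one spectrum id."""
    return BASE.format(id=spectrum_id)

def build_zip(ids, fetch=lambda url: urlopen(url).read()):
    """Fetch each image once and pack all of them into an in-memory zip.

    `fetch` is injectable so the network call can be stubbed out in tests.
    Returns the zip archive as bytes, ready to send in a single response.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for sid in ids:
            zf.writestr(f"{sid}.png", fetch(image_url(sid)))
    return buf.getvalue()
```

Each image is fetched exactly once on the server side, so the user's repeated offline experiments never re-hit the server.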
For the full image archive, it might be good to back the images up to S3, Dropbox, or a photo-sharing site. Many external services provide a bulk-download button that compresses a large set of files and sends them to the user as a single file, which would certainly save effort on our part.
I don't think S3 has that capability built in by default, but it can statically host files. Monthly caches could be pushed there, though that feels far too arbitrary.