[Closed] eroux closed this issue 6 years ago
Thanks Élie,
This is something that we could run on the machines that are making the batches, as a background task after the batch has been built.
(There are other, less intrusive architectures as well.) Some comments:

a. Will the running machine need virtualenv?
b. I (or the running background process) will need to be in your cool AWS membership group.
Jim Katz
Buddhist Digital Resource Center
jimk@tbrc.org
+1 781.254.7537
From: Elie Roux [mailto:notifications@github.com]
Sent: Wednesday, February 28, 2018 11:19 AM
To: BuddhistDigitalResourceCenter/drs-deposit drs-deposit@noreply.github.com
Cc: Subscribed subscribed@noreply.github.com
Subject: [BuddhistDigitalResourceCenter/drs-deposit] triggering json export to s3 (#41)
I think it could be a nice workflow to execute the tojsondimensions.py script (https://github.com/BuddhistDigitalResourceCenter/drs-deposit/blob/master/contrib/tojsondimensions.py) on the RS3 server directly, as it would minimize the overhead of sending the METS data to me. It uses the AWS credentials of the user (stored in ~/.aws/). Could that be envisioned?
boto3 magically looks at ~/.aws/
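For reference, the "magic" is boto3's default credential chain, which reads the standard INI-style files under ~/.aws/. A minimal sketch of the format it expects (the profile and key names are the standard ones; the values here are placeholders, and the parsing below just illustrates the file layout boto3 consumes):

```python
import configparser

# Sample of what boto3 reads from ~/.aws/credentials (values are placeholders).
SAMPLE_CREDENTIALS = """\
[default]
aws_access_key_id = AKIAEXAMPLE
aws_secret_access_key = example-secret-key
"""

def read_default_profile(text):
    """Parse the INI-style credentials text the way boto3's default chain does."""
    parser = configparser.ConfigParser()
    parser.read_string(text)
    return dict(parser["default"])

creds = read_default_profile(SAMPLE_CREDENTIALS)
print(creds["aws_access_key_id"])
```

With a real ~/.aws/credentials file in place, `boto3.client("s3")` picks the keys up automatically, so the script needs no explicit credential arguments.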
Out of curiosity, why the need to access S3? The same image set is available locally to Windhorse.
Oh, S3 would be where the output .json file will be uploaded, not where the images would be read from.
Ah. I forget these little things
I inserted a call to Elie's contrib/tojsondimensions.py into make-drs-batch.sh, which automatically creates the JSON files and uploads them to AWS.
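The hook might look roughly like the sketch below, placed near the end of make-drs-batch.sh so the export runs after the batch is built without blocking the pipeline. The batch-directory variable and the invocation are assumptions for illustration; tojsondimensions.py's real command line may differ, so a stand-in command shows only the background-task pattern:

```shell
#!/bin/sh
# Sketch of a post-batch hook for make-drs-batch.sh (illustrative only).

BATCH_DIR="${1:-./batch}"

# In the real script this would be something like:
#   python contrib/tojsondimensions.py "$BATCH_DIR" &
# Here a stand-in command demonstrates the backgrounded step.
( echo "exporting dimensions for $BATCH_DIR" > export.log ) &
wait  # only so the demo can read the log; the real hook would not wait

result=$(cat export.log)
echo "$result"
```

Backgrounding the call keeps the batch build's wall-clock time unchanged, at the cost of not surfacing export failures in the batch script's exit status.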