nickdos opened this issue 2 weeks ago
For bonus marks, I want to sync the s3 bucket with files on the local disk for sensitive-data-service (`/data/sds`), and it seems the awscli tools can do this, e.g. via `aws s3 sync s3://your-bucket-name /path/to/local/directory` (s3 being the source). I'm thinking the command could be put in a crontab entry (which would need to be added to Ansible). Ideally, the sensitive-data-service would read files from s3 directly, but that is not likely to happen soon, as the app is written in a framework no one knows (except Doug). So this seems like a reasonable workaround, but I have no experience with it, so it might be a bad idea. Comments welcome.
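A rough sketch of what that might look like, assuming awscli is installed on the servers (the bucket name is a placeholder, and `--delete` is only wanted if local files should mirror deletions in the bucket):

```sh
# Dry run first: shows what would be copied/deleted without changing anything
aws s3 sync s3://your-bucket-name /data/sds --dryrun

# Hypothetical crontab entry: sync the bucket down to /data/sds hourly.
# --delete removes local files that no longer exist in the bucket;
# --quiet suppresses per-file output so cron only mails on errors.
15 * * * * aws s3 sync s3://your-bucket-name /data/sds --delete --quiet
```

On the Ansible side this could be managed with the stock `ansible.builtin.cron` module rather than templating a crontab file, which keeps the entry idempotent across runs.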
As part of the decommissioning of the `sds-webapp2` applications, the generation of the `sds` XML file (which defines all sensitive taxa and is used by other apps) is being moved to a stand-alone process (likely run in Airflow). The ala-sensitive-data-service uses the XML file and also serves it as a static resource to other applications via `/data/sds/sensitive-species-data.xml` on the `sensitive-data-service` servers:

- ala-sds-dev
- ala-sds-test
- ala-sds-prod
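If the sync approach goes ahead, a quick sanity check on any of those servers could be to compare the local copy against the bucket object (the object key below is a guess; adjust to the real bucket layout):

```sh
# Compare the bucket object's size/timestamp against the local file
# (bucket name and key are assumptions, not the real values)
aws s3 ls s3://your-bucket-name/sensitive-species-data.xml
ls -l /data/sds/sensitive-species-data.xml
```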