mr-niels-christensen closed this issue 9 years ago
If the data load operation takes more than 60 seconds (the maximum for serving a user request on GAE), an on-demand solution can use the deferred library: https://cloud.google.com/appengine/articles/deferred?hl=en
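The point of the deferred library is that a long load can be split into short tasks, each re-enqueuing the remainder. Below is a minimal sketch of that batching pattern; on App Engine the re-enqueue would be `deferred.defer(load_batch, triples, start + BATCH_SIZE)`, but here it is simulated with a direct call so the sketch runs anywhere. All names (`load_batch`, `store_triple`, `BATCH_SIZE`) are illustrative, not from the project.

```python
# Sketch: splitting a long vocabulary load into short batches, the
# pattern the GAE deferred library supports. Each invocation handles
# one slice and hands the rest to the next task.

BATCH_SIZE = 100
stored = []  # stand-in for the datastore


def store_triple(triple):
    stored.append(triple)


def load_batch(triples, start=0):
    """Process one batch; pass the remainder on to the next task."""
    for triple in triples[start:start + BATCH_SIZE]:
        store_triple(triple)
    if start + BATCH_SIZE < len(triples):
        # On GAE: deferred.defer(load_batch, triples, start + BATCH_SIZE)
        load_batch(triples, start + BATCH_SIZE)


load_batch([("s%d" % i, "p", "o") for i in range(250)])
```

Each deferred task then stays well under the 60-second limit, regardless of the total vocabulary size.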
There is no need, at least for now, to use a deferred task or a cron job, as changes to the vocabularies are pretty rare and the script can be triggered from a local machine.
I still think that in the long run, it would be more robust to pull the data, i.e. get rid of the local machine and let the app urlfetch the data itself.
I agree, but first we need to have things working; then we can think about optimization and full automation. The script now pulls from the remote repository. For the automation part we can open another issue with lower priority.
In the current version, the RDF vocabularies (semantic web definitions of e.g. "Saturn") are pushed from a developer's command line, by pulling `.ntriples` files from https://github.com/SpaceAppsXploration/RDFvocab and then running `uploadvocabularies.py`. Most of this data is already online at http://ontology.projectchronos.eu/. A more robust solution would be to let the application itself pull all data from that site, either on demand ("push of a button" from an admin) or at a fixed interval (using cron jobs).
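The on-demand pull could look roughly like the sketch below: map each file in the repository to its raw-download URL and fetch it from inside the app. On App Engine the fetch would go through the urlfetch service; stdlib `urllib` is used here so the sketch is self-contained. The branch name and helper names are assumptions, not part of the project.

```python
# Sketch of the proposed pull: the app fetches the .ntriples files
# itself instead of a developer pushing them from a local machine.
from urllib.request import urlopen

RAW_BASE = "https://raw.githubusercontent.com/SpaceAppsXploration/RDFvocab"
BRANCH = "master"  # assumption: default branch name


def raw_url(path, base=RAW_BASE, branch=BRANCH):
    """Map a file path inside the repo to its raw-download URL."""
    return "%s/%s/%s" % (base, branch, path)


def pull_vocabulary(path):
    """Fetch one .ntriples file and return its lines (one triple each)."""
    with urlopen(raw_url(path)) as resp:
        return resp.read().decode("utf-8").splitlines()
```

For the fixed-interval variant, a cron job would simply hit a handler that calls something like `pull_vocabulary` for each known file and stores the result.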