Closed: aqw closed this issue 3 years ago
Uhhh this would be cool!
okay, quick update: I will tackle this issue and #10 in conjunction. The plan is to write a tool that scrapes PubMed for publications that cite Studyforrest and writes them in a structured way (I propose JSON) to a database (i.e., a file) if they do not exist in there yet. The structured data can then be categorized by a human into "provides data", "uses data", or "citation only". The tool also reads the structured metadata back in and outputs it as HTML to include on the webpage.
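The scrape-then-deduplicate step could be sketched roughly like this. This is only an illustration of the proposed JSON database layout; the field names, function names, and `"uncategorized"` default are hypothetical, not what tools/pubs.py actually uses:

```python
import json
from pathlib import Path

def load_db(path):
    """Read the JSON publication database; an absent file means an empty list."""
    p = Path(path)
    if p.exists():
        return json.loads(p.read_text())
    return []

def add_publication(db, pmid, title, year, category="uncategorized"):
    """Append a record only if its PMID is not already in the database."""
    if any(rec["pmid"] == pmid for rec in db):
        return False
    db.append({
        "pmid": pmid,
        "title": title,
        "year": year,
        # a human later sets this to "provides data", "uses data",
        # or "citation only"
        "category": category,
    })
    return True

def save_db(db, path):
    """Write the database back as pretty-printed JSON."""
    Path(path).write_text(json.dumps(db, indent=2))
```

The key property is that re-running the scraper never duplicates an entry, so human categorization work is preserved across runs.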
IMO, this is sufficiently covered by the tools/pubs.py tool that Adina wrote. Inserting this into the webpage itself can be manual for now, and is still a leap forward compared to the previous status quo.
Thanks Adina!
This should be part of the website build, rather than rendered client-side in the browser (for SEO purposes).
The purpose is to have a single source of truth for publications, rather than updating multiple locations.
I am told that GitHub Actions has a cron-like feature.
The website build should run automatically on a schedule, in order to keep the publication list up to date.
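For what it's worth, GitHub Actions does support this via scheduled workflows, which use a cron expression under the `on: schedule` trigger. A minimal sketch of a periodic rebuild (the workflow name, schedule, and build command are placeholders, not this repository's actual setup):

```yaml
name: rebuild-website
on:
  schedule:
    # run once a week, Monday 03:00 UTC (placeholder schedule)
    - cron: '0 3 * * 1'
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: make build   # placeholder for the actual site build command
```

Note that scheduled workflows run against the default branch, so the publication-updating step would need to live there.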