Our website prominently showcases our data sharing activities, and we have good metrics that we often quote in reporting and public presentations.
We could consider doing the same for AI development support activities.
Like for data sharing, any advertisement/metrics efforts should be value-creating (to ensure we benefit) and low-effort (to ensure they actually get done).
We could consider:
- Put up "event pages" for events, with a description, registration, links to materials, etc., tagged according to some scheme so we can list them and generate statistics from them. (Do we want to make such a page for the #426 AIDA Data Hub seminar in the SciLifeLab Linköping seminar series?)
- Put up a list of "support projects", each with start/end dates, the organisation receiving support, and optionally the name of the person or PI receiving support.
This way we can simply pull the figures from the website instead of having to rethink them every time someone wants statistics, and we get continuous advertising of our efforts as a bonus.
The event pages should have metadata on them compatible with SciLifeLab reporting.
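To illustrate the idea, here is a minimal sketch of how statistics could be pulled from tagged event-page metadata. All field names (`date`, `tags`, `attendees`) are hypothetical placeholders; the real schema would follow whatever SciLifeLab reporting requires.

```python
# Sketch: aggregate reporting numbers from event-page metadata.
# Field names below are hypothetical, not an agreed schema.
from collections import Counter

events = [
    {"title": "AI seminar A", "date": "2023-03-14",
     "tags": ["seminar", "training"], "attendees": 40},
    {"title": "Hands-on workshop", "date": "2023-09-02",
     "tags": ["workshop"], "attendees": 25},
    {"title": "AI seminar B", "date": "2024-01-20",
     "tags": ["seminar"], "attendees": 55},
]

def event_stats(events):
    """Aggregate simple reporting numbers from event metadata."""
    per_year = Counter(e["date"][:4] for e in events)
    per_tag = Counter(tag for e in events for tag in e["tags"])
    total_attendees = sum(e.get("attendees", 0) for e in events)
    return {"per_year": dict(per_year), "per_tag": dict(per_tag),
            "total_attendees": total_attendees}

print(event_stats(events))
```

With something like this, answering "how many seminars did we run in 2023?" becomes a one-liner over data we maintain anyway.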
We should also look into integrating with the Training Metrics Database: https://github.com/elixir-europe-training/Training-Metrics-Database