SciCodes / Planning

Repository for planning

best practices for metrics gathering #11

Open alee opened 2 years ago

alee commented 2 years ago

Identify and develop strategies for gathering and using metrics for repository / registry managers.

These include, but are not limited to: software citation metrics (who's using the software, at what career stage, in what industry, where, etc.) while remaining respectful of privacy concerns, GDPR, and the like; best practices for adopting analytics packages (Matomo vs. Google Analytics, for example); and UI/UX best practices for gathering analytics in the least intrusive way possible.

  1. what are some general-purpose things we should be looking to collect?
  2. what can repositories/registries do to help support software citation metrics more broadly (use DOIs, dissemination via COinS, Zotero support, ...)?
  3. anything else?
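For 1, here is a rough sketch of the kind of privacy-respecting usage event I have in mind. The field names and the IP-truncation approach are illustrative assumptions, not an agreed schema (last-octet zeroing is the same idea Matomo and GA use for IP anonymization):

```python
from datetime import datetime, timezone

def anonymize_ip(ip: str) -> str:
    """Zero the last octet of an IPv4 address so the stored
    value can no longer identify an individual host."""
    parts = ip.split(".")
    parts[-1] = "0"
    return ".".join(parts)

def make_event(ip: str, resource_pid: str, action: str) -> dict:
    """Build a minimal usage event: identify the resource, not the user.
    Deliberately absent: user id, full referrer, browser fingerprint."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(timespec="seconds"),
        "action": action,              # e.g. "view" or "download"
        "resource": resource_pid,      # DOI or other PID of the record
        "ip": anonymize_ip(ip),        # never store the full address
    }

event = make_event("203.0.113.42", "10.5281/zenodo.1234567", "download")
print(event["ip"])  # 203.0.113.0
```

The point of the sketch is the omissions: if a field isn't needed for aggregate counts, don't collect it in the first place.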
alee commented 2 years ago

I'm happy to help organize, but if someone would like to step up I'd be glad to defer to them, as I have a conflicting meeting for the Open Modeling Foundation that week.

Still, I should be able to sneak away for a morning or afternoon. I'll update this availability slot ASAP once the agenda for that meeting is finalized.

I only have availability between 1000 and 1600 MST (UTC-7), unfortunately.

mbjones commented 2 years ago

My unsolicited 2 cents: this is closely related to the effort to standardize the collection and reporting of data citation and usage metrics that has been shepherded via Make Data Count and Scholix. Metrics are challenging, and adoption of comparable methods for collecting them is even more challenging, as evidenced by the uptake issues with the COUNTER Code of Practice for Research Data.

You might be interested to know that several of the folks from Make Data Count (e.g., @chodacki, @mfenner, @dlowenberg) are currently involved in the effort to create v2 of the COUNTER Code for RD. Some of the main issues being considered are how to handle repositories that hold a mix of data, publications, and software, and how to handle mixed research objects that contain data, software, text, and other scholarly products. This seems like an opportunity for collaboration between the data and software worlds, and for some degree of consistency in metrics between them.

alee commented 2 years ago

Thanks @mbjones! We've been loosely following Make Data Count and the other initiatives you mentioned (https://github.com/comses/comses.net/issues/374) and this is an excellent idea to harmonize our efforts.

Hopefully some of the interested parties will be able to participate in this session!

mutanthumb commented 2 years ago

Hi, I'm interested in this topic! I'm in the Software Citation #Hackathon.

alee commented 2 years ago

Great! Apologies for the late reply; we'll be starting at 1000 MST (1700 UTC).

Zoom link: https://asu.zoom.us/j/88391150994?pwd=T0FZTXZOc0JZZzc3L3VIK1BqRWJ0Zz09

alee commented 2 years ago

Shared notes: https://docs.google.com/document/d/1-2y-0qvdpzRctRQMuTBBxCgB_QtyG9t1ZQ1bPchr7sA/edit?usp=sharing

alee commented 2 years ago

Key takeaways / action items

It's important to ensure that if someone takes the time to properly archive their work that they receive credit when downstream products cite that work. Getting these scholarly metrics right is critical to establish incentives for people to Do the Right Thing.

However, some of this appears to be out of our hands and more in the purview of the aggregators (DataCite / Crossref / Scopus / PubMed Central / ISI Web of Science / Google Scholar / etc.). A core goal of the items below is to identify concrete things scientific software registries and repositories can do to make the aggregators' jobs easier and to help build the citation / knowledge graph for the scientific publications of the future.

  1. focus on scholarly metrics related to citation, references, reuse
  2. come up with a short checklist for registries / repository maintainers on how to integrate with make data count / project counter
  3. create a support group / collect user experiences from early adopters trying to make this work with DataCite, Crossref, etc
  4. identify how we as repository / registry managers can share things like views and download metrics with DataCite e.g., https://support.datacite.org/docs/displaying-usage-and-citations-in-your-repository
  5. identify the minimal set of scholarly metadata necessary for aggregators: ORCID for all contributors (along with role? https://github.com/codemeta/codemeta/issues/240), ROR, DOI or other PID to referenced resource, resource type, license, open-access vs closed-source open-metadata, others?
  6. establish workflows and guidance for authors, curators, and journals to ensure that publications cite their data and software properly, along with mechanisms to augment published articles with new citations to data or software: https://support.datacite.org/docs/contributing-data-citations#
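To make item 5 concrete, here's a sketch of what that minimal metadata set might look like as a codemeta.json fragment. All identifier values below are placeholders or public example IDs (the ORCID is the ORCID docs' example ID, the DOI and ROR are illustrative), not real records:

```python
import json

# Minimal scholarly metadata for aggregators, expressed in CodeMeta
# (JSON-LD). Covers: PID, license, contributor ORCID, and affiliation ROR.
codemeta = {
    "@context": "https://doi.org/10.5063/schema/codemeta-2.0",
    "@type": "SoftwareSourceCode",
    "identifier": "https://doi.org/10.5281/zenodo.1234567",   # DOI / PID (placeholder)
    "license": "https://spdx.org/licenses/MIT",               # SPDX license URI
    "author": [
        {
            "@type": "Person",
            "@id": "https://orcid.org/0000-0002-1825-0097",   # ORCID (example ID)
            "affiliation": {
                "@type": "Organization",
                "@id": "https://ror.org/05a28rw58",           # ROR (example ID)
            },
        }
    ],
}

print(json.dumps(codemeta, indent=2))
```

Whether contributor *roles* belong here too is still open upstream (see codemeta/codemeta#240), so this sketch leaves them out.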
mutanthumb commented 2 years ago

Hi again Allen, I'm not seeing a way to officially join SciCodes. Any advice would be great! -susan