CLARIAH / clariah-plus

This is the project planning repository for the CLARIAH-PLUS project. It groups all technical documents and discussions pertaining to CLARIAH-PLUS in a central place and should facilitate findability, transparency and project planning for the project as a whole.

User assessment of tools and their metadata #144

Open proycon opened 1 year ago

proycon commented 1 year ago

At the tech day today @roelandordelman raised the point that the current metadata descriptions for the tools at https://tools.clariah.nl are still rather inconsistent and not necessarily informative enough for end users. (This is aside from the fact that the current interface can be improved with faceted search to help reduce the noise, but that is tracked separately in https://github.com/proycon/codemetapy/issues/23)

This problem is to be remedied first and foremost in the call to all developers to check, correct and enrich their software metadata (#143).
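To make this concrete, here is a minimal codemeta.json sketch of the kind of record developers are asked to check and enrich (property names follow the CodeMeta vocabulary; the tool name, URLs and values are purely hypothetical examples, not taken from any actual registry entry):

```json
{
  "@context": "https://doi.org/10.5281/zenodo.6505105",
  "@type": "SoftwareSourceCode",
  "name": "example-tool",
  "description": "A clear, end-user oriented description of what the tool does and for whom.",
  "codeRepository": "https://github.com/example-org/example-tool",
  "issueTracker": "https://github.com/example-org/example-tool/issues",
  "license": "https://spdx.org/licenses/GPL-3.0-only",
  "developmentStatus": "https://www.repostatus.org/#active",
  "author": [
    {
      "@type": "Person",
      "givenName": "Jane",
      "familyName": "Doe"
    }
  ]
}
```

The `description` and `issueTracker` fields matter most for the discussion below: the former is what end users see in the registry, the latter is where feedback on the tool itself should end up.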

But even after that, we may feel the need to qualify the metadata further from a user perspective. We even discussed the option of users offering feedback, for instance via a review system, but this would be overkill for the tool discovery registry and would be more at home in something like Ineo (though even there I'm doubtful it would really be used by users; I don't think it's worth considering until the system gains a certain traction).

We also need to distinguish two types of feedback:

  1. Feedback for the tool developers on the metadata or the tool in general. This is best submitted to the tool developers themselves via their software's issue tracker (the link to which should be in the metadata). After all, we merely harvest the metadata and are not a party in editing it.
  2. Feedback that helps other users in their tool discovery process, such as an assessment of the tool's usefulness for a specific purpose. These would be more like 'reviews'. This does open up a can of worms regarding moderation (as @menzowindhouwer rightly pointed out). As said, I think this makes more sense at the level of Ineo.

If we can't get metadata of sufficient quality after our call to developers (#143), I'd be more in favour of selecting one or two communication people to look over the metadata descriptions and actively poke the respective tool developers to improve them. They could even submit pull requests to the upstream tool developers to help them along.

Of course, in this entire process we can also identify areas where the harvester itself could do a better job, and improve it accordingly.

ddeboer commented 1 year ago

> We also need to distinguish two types of feedback:

Agreed. Perhaps we can call them:

  1. report an issue with the tool’s description
  2. tool review

jblom commented 1 year ago

Agreed on all points.

Any ideas on finding/electing these communication people? I'd say that's food for thought for @roelandordelman.

roelandordelman commented 1 year ago

Opening the can of worms at the right time is the challenge, I think. First, I would like to see a first batch of information flowing into INEO from the tools registry. We already anticipate the need for manual curation, but we need some hands-on experience of what is actually happening before we can decide (and appoint communication people to tasks).

proycon commented 1 year ago

> Opening the can of worms at the right time is the challenge, I think.

Indeed... and I don't think the right time has come yet. At this stage, it's already hard enough to get developers themselves to check and amend their metadata at the source (#143). I've started to explicitly poke some people here and there now to see if that helps 😉

> First, I would like to see a first batch of information flowing into INEO from the tools registry.

Yes, I completely agree; so far there is nothing there yet from the Ineo side (#35). Fortunately, work on the flow to the CLARIN VLO is progressing steadily (#37).