pkp / pkp-lib

The library used by PKP's applications OJS, OMP and OPS, open source software for scholarly publishing.
https://pkp.sfu.ca
GNU General Public License v3.0

Provide display for distribution and journal integrity services #6042

Open NateWr opened 4 years ago

NateWr commented 4 years ago

A comment from @willinksy on https://github.com/pkp/pkp-lib/issues/5980.

I agree with James and Mike that this would be a great addition for managing journals. I'd also like to consider whether it could become part of our "journal integrity" initiative, which is intended to help journals establish, for authors, researcher-readers, and the public, the steps they have taken to publish to the highest academic standards and to preserve the journal's integrity as a source of knowledge.

In the case of this new Distribution feature, for each service and tool that a journal was operating (if not always complete), OJS could provide a tastefully designed "Journal Services" box on the journal's homepage (perhaps as an option for each service or as a whole), which would include a list with a text along these lines, "[title of journal] distributes article and journal data to the following scholarly publishing services: DOAJ, CrossRef, ORCID, PKP Preservation Network, PKP Beacon, Google Scholar..." (with hyperlinks to each).
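The "Journal Services" box described above could be mocked up as a simple HTML fragment. This is an illustrative sketch only: the function name, the `(name, url)` service list, and the markup are assumptions, not an existing OJS API.

```python
# Hypothetical sketch of rendering the proposed "Journal Services" box.
# Service names and URLs come from the journal's configuration; the
# shape of that configuration here is an assumption for illustration.
from html import escape

def journal_services_html(journal_title, services):
    """Render a 'Journal Services' box as an HTML fragment.

    `services` is a list of (name, url) tuples for the services the
    journal has enabled, each rendered as a hyperlink.
    """
    links = ", ".join(
        '<a href="{}">{}</a>'.format(escape(url, quote=True), escape(name))
        for name, url in services
    )
    return (
        '<div class="journal-services">'
        "{} distributes article and journal data to the following "
        "scholarly publishing services: {}."
        "</div>"
    ).format(escape(journal_title), links)

print(journal_services_html(
    "Journal of Examples",
    [("DOAJ", "https://doaj.org"), ("Crossref", "https://www.crossref.org")],
))
```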

Now, I realize there is never going to be a fool-proof integrity indicator. Yet part of this effort is for us to take more of a lead with OJS in demonstrating to the public what journals do to maintain high academic standards. A big step in that direction might be achieved, I want to believe, by three steps: displaying these services on the journal's homepage; providing, on article landing pages, the peer-review dates and reviewer numbers (with content optionally dependent on authors and reviewers); and displaying ORCIDs for all editors and editorial board members (via an optional "board" generator to which journal managers can add users with ORCIDs for display under About).

NateWr commented 4 years ago

I think this is a good idea. I'd like to keep the display decoupled from the plugins, because there are likely to be cases where journals deposit to Crossref and are part of DOAJ even if they don't use our plugins to do so. A decoupled approach gives them the flexibility to accurately describe their services.

That said, for the integrity part, it is probably important that there is an automated component so the journal can demonstrate that they are being honest. Perhaps there are ways that we can do that through showing some of the results. Some initial ideas:
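One such automated component could be a lookup against DOAJ's public search API to confirm the journal's ISSN really is listed. The endpoint shape below follows DOAJ's published search API, but treat the exact URL and response fields as assumptions to verify against current DOAJ documentation; the parsing is kept in a separate function so it can be exercised without a network call.

```python
# Sketch of one automated integrity check: verifying a journal's ISSN
# is listed in DOAJ. URL pattern and response fields are assumptions
# based on DOAJ's public search API.
import json
from urllib.request import urlopen

DOAJ_SEARCH = "https://doaj.org/api/search/journals/issn:{issn}"

def parse_doaj_listing(response):
    """Return True if a parsed DOAJ search response contains any journal."""
    return response.get("total", 0) > 0

def journal_in_doaj(issn):
    """Query DOAJ for an ISSN (network call; may raise on failure)."""
    with urlopen(DOAJ_SEARCH.format(issn=issn)) as resp:
        return parse_doaj_listing(json.load(resp))

# Offline example against a canned response:
print(parse_doaj_listing({"total": 1, "results": [{"id": "abc"}]}))  # True
```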

The other recommendations, such as displaying peer-review dates and reviewer numbers, and editorial board ORCIDs, sound really good to me. And I agree that the right way to package them in the UI is through the language of journal integrity. Perhaps we can take some inspiration from existing website/SEO auditing tools and build a kind of "Journal Integrity Checklist".

Here's a URL inspection audit from Google Search Console: [screenshot: url-report]

And a more complex one from Chrome's Lighthouse auditing tool: [screenshot: lighthouse]

Both are probably more complex than we want to get. But a similar kind of page which undertook a list of checks to see what was running, and recommended improvements, could be useful. Some quick examples:
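As a rough sketch of that checklist idea: run a set of checks against a journal's settings and report pass/fail with a recommendation for each failure. The check names, the settings dict, and its keys are all illustrative assumptions; OJS stores this data differently.

```python
# Sketch of a "Journal Integrity Checklist" runner. Each entry pairs a
# human-readable check with a predicate over the journal's settings and
# a recommendation shown when the check fails. All names are illustrative.
CHECKS = [
    ("DOI plugin enabled", lambda j: j.get("doi_enabled", False),
     "Enable the DOI plugin and register a Crossref prefix."),
    ("PKP PN preservation", lambda j: j.get("preservation", False),
     "Enable the PKP Preservation Network plugin."),
    ("Editorial board ORCIDs", lambda j: j.get("board_orcids", 0) > 0,
     "Ask editorial board members to link their ORCID iDs."),
]

def run_checklist(journal):
    """Return (name, passed, recommendation-or-None) for each check."""
    report = []
    for name, check, recommendation in CHECKS:
        passed = bool(check(journal))
        report.append((name, passed, None if passed else recommendation))
    return report

for name, passed, rec in run_checklist({"doi_enabled": True}):
    print("[{}] {}{}".format("ok" if passed else "!!", name,
                             "" if passed else " -> " + rec))
```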

NateWr commented 4 years ago

Related post in the forum on bad-faith actors mimicking journals: https://forum.pkp.sfu.ca/t/about-journal-predation/61755

marcbria commented 4 years ago

Opening a hackmd page to capture ideas for items that could be taken into consideration: https://hackmd.io/@marcbria/quality-indicators/edit

Feel free to edit or extend.

marcbria commented 4 years ago

A recent article publishes a summarized list of criteria for journals according to Plan S. This list could be a good basis for developing the "journal integrity" plugin.

I visualize a "checklist" (or list of traffic lights) with all these elements (and more) as a dashboard for the editors. We could add some graphics at the top (Lighthouse style) indicating whether you meet the Plan S, AmeliCA, PKP, or any other criteria. And to make it cooler, a Kiviat diagram showing your journal's strengths and weaknesses.

1.1 Common requirements for all publishing environments
Basic mandatory conditions
1.1.1 Quality review standards following the Committee on Publication Ethics (COPE) and others
1.1.2 Detailed description of editorial policies, with annual statistics
1.1.3 Author retention of copyright
1.1.4 Allowing immediate publication under an open license and allowing repository deposit
Mandatory technical conditions
1.1.5 Use of persistent identifiers such as DOI (digital object identifier), URN (uniform resource name), or Handle
1.1.6 Deposit of content in digital preservation environments
1.1.7 High-quality article metadata, in interoperable, non-proprietary formats, including funding
1.1.8 Machine-readable information on the OA status and the license embedded in the article
Highly recommended additional criteria
1.1.9 Support for persistent identifiers for authors and entities, such as ORCID
1.1.10 Sherpa/Romeo registration of self-archiving policy
1.1.11 Allowing download of full text in standard computer-readable formats such as JATS XML
1.1.12 Direct deposit of publication by publisher to an OA Plan repository
1.1.13 OpenAIRE-compliant metadata
1.1.14 Links to data, code, and other outputs in external repositories
1.1.15 Open citation data according to Initiative for Open Citations (I4OC) standards
1.2 Specific conditions required for open access journals and platforms:
1.2.1 Journal/platform must be part of the Directory of Open Access Journals (DOAJ) or in the process of being registered
1.2.2 No parallel paid replicas
1.2.3 Transparent costs and pricing
1.2.4 Exemptions and discounts for authors from low-income economies
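The dashboard scoring above could be sketched as follows: given which criteria a journal meets, compute a per-framework compliance percentage (Plan S, AmeliCA, PKP, ...). The criterion IDs reuse the Plan S numbering from the list; which criteria belong to which framework grouping is an illustrative assumption.

```python
# Sketch of the compliance dashboard: percentage of each framework's
# criteria that the journal meets. Framework groupings are illustrative.
FRAMEWORKS = {
    "Plan S mandatory": ["1.1.1", "1.1.2", "1.1.3", "1.1.4",
                         "1.1.5", "1.1.6", "1.1.7", "1.1.8"],
    "Plan S recommended": ["1.1.9", "1.1.10", "1.1.11", "1.1.12",
                           "1.1.13", "1.1.14", "1.1.15"],
}

def compliance(met, frameworks=FRAMEWORKS):
    """Return {framework: percentage of its criteria the journal meets}."""
    met = set(met)
    return {
        name: 100.0 * len(met & set(criteria)) / len(criteria)
        for name, criteria in frameworks.items()
    }

scores = compliance(["1.1.1", "1.1.3", "1.1.5", "1.1.6"])
print(scores["Plan S mandatory"])  # 50.0
```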

The statistics requirements according to Plan-S are discussed here: https://github.com/pkp/pkp-lib/issues/6130