eleventh19 opened this issue 2 years ago
We met as a group on Monday and created a DM group for the project team to keep the conversation going. Next steps are looking into various open peer review projects so that we can define what we mean by "open" and what basic process we want to use for peer review.
Links we've come across so far. We're still gathering info on peer review, so if you know of any good research, studies, etc., please add them here! We'll be working on a list of notable works in peer review and will explore adding any of the links we have here.
What do we mean by open peer review?
We have been thinking abstractly about the purpose of peer review in order to determine how best to implement it. We have outlined the steps of peer review that we need to address and have been documenting how those steps should be implemented, as well as capturing notes about other efforts at open peer review.
Details can be found in our Coda document
The steps of the peer review process that we have identified are:
Our next objective is to define how each of these steps will occur (the tooling and processes required).
We are also doing a community call swap with ResearchHub this week. We joined their call on Monday, and they will join our call this Thursday. Also, with the help of Cent at Metagov, we got the call info shared with their community.
This study examines how transparency of identity affects perception within a learning environment: https://www.frontiersin.org/articles/10.3389/feduc.2019.00129/full
"Strengthening, Hiding or Relinquishing Ethnic Identity in Response to Threat: Implications for Intercultural Relations" https://www-s3-live.kent.edu/s3fs-root/s3fs-public/file/09-Rosita-Albert-Adina-Schneeweis-Iva-Knobbe.pdf
Adding articles to support why having some level of pseudonymity may encourage participation by demographics that feel their identities may unfairly threaten the perception of their work. In other words, does "transparency" necessitate the publicizing of all identities, or is there a viable alternative that maintains some level of occlusion for a peer reviewer's identity without compromising the integrity of the transparency?
We have written a proposed flow for the peer review process. We have focused on discussing and refining issues of transparency, payment, and tooling. We are looking into different ways we could ensure good data collection that will allow us to develop this process intelligently across iterations.
> adding articles to support why having some level of pseudonymity may encourage participation by demographics that feel their identities may threaten the perception of their work unfairly: in other words, does "transparency" necessitate the publicizing of all identities or is there a viable alternative that does not compromise the integrity of the transparency while still maintaining some level of occlusion for a peer reviewer's identity?
This is a great framing of the question. Allowing the reviewer to maintain some privacy while also ensuring there are no ethical conflicts is a challenge normally handled by a trusted third party (journal editors). In lieu of taking on an editorial role, there are important questions about whether we could replace that trusted third party with cryptographic proofs. I look forward to digging into whether and how this is possible.
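As a minimal illustrative sketch (not a design the project has settled on), a hash-based commitment scheme shows one way a reviewer could stay pseudonymous in public while remaining identifiable to a trusted auditor in a conflict-of-interest check. The function names, identity string, and nonce handling below are all hypothetical:

```python
import hashlib
import secrets

def commit(identity: str) -> tuple[str, str]:
    """Create a hiding commitment to a reviewer's identity.
    The commitment can be published alongside the review while the
    identity itself stays private; the nonce is kept by the reviewer."""
    nonce = secrets.token_hex(32)
    digest = hashlib.sha256(f"{identity}:{nonce}".encode()).hexdigest()
    return digest, nonce

def verify(identity: str, nonce: str, commitment: str) -> bool:
    """Check a revealed (identity, nonce) pair against a published
    commitment, e.g. during a conflict-of-interest audit."""
    return hashlib.sha256(f"{identity}:{nonce}".encode()).hexdigest() == commitment

# Example: the reviewer publishes only the commitment with their review.
c, n = commit("reviewer@example.org")
assert verify("reviewer@example.org", n, c)        # reviewer can later prove identity
assert not verify("someone-else@example.org", n, c)  # nobody else can claim it
```

This only binds a pseudonym to an identity after the fact; proving properties like "this reviewer has no conflict with the author" without any reveal would need heavier machinery (e.g. zero-knowledge proofs), which is exactly the open question above.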
This week we met with Nihar Shah from CMU and were able to ask questions + get feedback on our peer review process. We dug deeper into the literature around peer review and refined the questionnaire for reviewers and authors as well as our final survey.
We set an intention to be diligent about collecting data and releasing it publicly. We hope others will also be able to use the data to uncover insights.
Last week, we ran a community call where we shared our initial draft of the project and received valuable community feedback. We are very grateful to our community for taking the time to share their ideas with us, and we realized we needed more feedback. Video: https://www.youtube.com/watch?v=kbVtu3Yshsk Slides: https://docs.google.com/presentation/d/1HDGs5Foq_04c0NbC2Ze1QMxBNHVbkhG1uZ1jd8JTHT4/edit#slide=id.g12640c561fe_0_74
This week, we drafted documents to share with the community to request online engagement and feedback on the forum. Document: https://docs.google.com/document/d/1Poyw3LFmjGmM5pMyEvtLU7Gp9rOsTNcsegh3DkLSi7o/edit
This week we are reaching out to researchers who are looking for peer review so that we will have research papers that can be reviewed.
We are working on the meta post that will give context for individuals who are just learning about our peer review process: https://docs.google.com/document/d/1Poyw3LFmjGmM5pMyEvtLU7Gp9rOsTNcsegh3DkLSi7o/edit
Part of the meta post includes our "Open Questions in Peer Review," which should hopefully spark a discussion around our central questions. https://docs.google.com/document/d/14zH9f6sL_1EKd4HMkR3tevJogabTCniIBEo-K4P0pWc/edit
We have one potential candidate, and I'm reaching out to a few others about who will receive the review. We hope to have more clarity here by the end of next week.
Also, want to link to some relevant videos from our YouTube:
I reached out to the potential first three folks getting peer reviewed. Umar made a doc for us to start tracking the decision variables we can play around with.
Working on documentation that would be used by multiple parties to curate and share knowledge about peer review. A first view of what an implementation for this collective document might look like can be found here: https://coda.io/@nick-linck/scrf/landscape-survey-34
Are there other ways we could convey this information? For instance, should we make a DeSci Peer Review Wikipedia page and edit the content there?
Reached out to the first three potential folks: two maybes and one no. We need to think of another person to ask about receiving a review, and then move on to finding reviewers.
An example of an "open-access, peer-reviewed journal" that charges genuinely exorbitant fees for submitting an article, with no transparency about whether the fees go to reviewers: https://blockchainhealthcaretoday.com/index.php/journal/about/submissions "BHTY’s Article Processing Charge (APC) for the international specialty journal is $1450.00 USD for original research articles, $1650.00 USD for Special Reports (15,000 words and upwards of 100 references), and $800.00 USD for students currently enrolled at an academic institution. Please plan accordingly. THERE IS NO FEE FOR BLOGS.
BHTY is pleased to offer authors the option of paying their APC with digital currency (Bitcoin). Contact the publisher for more information if you would like to use this payment method at t.cenaj@partnersindigitalhealth.com
If changes are required post publication, the author(s) will be charged a flat $350.00 USD for revision(s)."
https://blockchainhealthcaretoday.com/index.php/journal/authors-journal in their statement of process, there is no clarification on whether or not reviewers are paid.
Thanks for sharing. That's good for us to know as we start putting together different resources for peer review.
The team met today and @isss111 joined us. I am following up with more of my outreach in trying to line up the first projects to review.
Here is the current state of the knowledge repo
To clean up our knowledge repo, I moved some old pages to a Coda doc called "Prior Work at SCRF".
The cleaned-up version of the knowledge repo still lives in a Coda doc called "Peer Review Knowledge Repo".
We are joined by @erikvanwinkle, who led the team in writing out a project north star, deliverables, and tasks with sub-tasks in a slide deck.
Completed 4 research summaries:
Two more summaries in progress:
We are tracking tasks on our GitHub board: https://github.com/orgs/smartcontractresearchforum/projects/27
Upcoming Peer Review Community Call on July 6th
Last week, we ran our first community call and had a great turnout with folks from ResearchHub, MetaGov, SkepSci, DeSci Labs, and Atoms.
We also completed two research summaries:
- An Overview of Challenges, Experiments, and Computational Solutions in Peer Review (Extended Version)
- Current Market Rates for Scholarly Publishing Services
We identified the first cut of reviewers for outreach and are trying to lock them down.
Last week, thanks to @eleventh19 and @nicklinck we locked down our first four reviewers!
We also started modifying our plans to more rapidly move toward running the experiment.
We're working on documentation and timelines this week, as well as identifying further papers and reviewers.
Last week, we drafted interview questions for participants, started identifying participants for the next round, and created multiple potential project timelines
Since our last update, we have:
Project Description
The goal of this project is to explore what an open peer review experiment for independent researchers can look like. This project will entail:
Work Plan
Resources
General Tips