Research question:
We need to define our key topic (which we had planned to do anyway, i.e., defining accessibility/transparency)
The “determine best practices” part of our research question is problematic, as this would require assessing which practices are good and which are bad
Suggestion: change the research question to something like “Provide an overview of current practices”
Our aim should be to capture the most important material and describe everything once, rather than capturing all material and counting the number of times mentioned (as in a systematic review)
Searches:
We need to begin by testing our search terms and search strategy
We should start with 4-5 articles
Searches could pick up some noise from the data availability statements that some journals require (e.g., a document matches the search term “open data” because its data availability statement says the authors can be contacted for the dataset, even though the article itself is unrelated to open science/collaboration)
Data sources:
Primary: Journal articles and books
Supplementary searches: Websites, hand searching
Emailing subject specialists and major university libraries to ask for guidelines on open science/collaboration is a search strategy in itself
Hannah’s idea: contact Lex Bouter?
Screening:
Multiple co-authors should test the search strategy
Since we are likely to be working with so many websites, Lasse suggests having one main screener (the first author) who does all title/abstract screening
We could then have multiple screeners for full-text screening
Data synthesis:
Could use a descriptive analytical approach
Some other scoping reviews do some sort of thematic analysis using software like NVivo
Lasse uses a well-known method for data synthesis in scoping reviews, originally proposed by Arksey & O’Malley (2005) and since extended by Levac, Colquhoun, & O’Brien (2010)
Resources:
ACTION POINTS: