alintheopen / SCOPE

A repository for open science communication projects

Public Perceptions of Science and Experiences of Online Science Events - Updates #10

Open olivia-mc opened 3 years ago

jesiathe commented 3 years ago

STUDY: Public Perceptions of Science and Experiences of Online Science Events

What is this study about? As part of the SCOPE group, Ellie and Olivia are researching motivations for participating in online events at National Science Week, and experiences of presenters and audiences of these events. The study was endorsed by Inspiring Australia and run nationally as part of National Science Week.

The study involved two stages of data collection. Stage one was an online questionnaire for people who attended events during National Science Week, and people who presented events during National Science Week. Participants of the questionnaire nominated themselves for stage two of the study, which was a semi-structured interview.

What's this page for? This is where the researchers will share information and publications about the study as it progresses. You can ask questions or comment on the research as we share, or simply follow along as the research progresses.

Participant Information Statements for this study:

Audience

Presenter

If you are concerned about the way this study is being conducted or wish to make a complaint to someone independent from the study, please contact the university using the details outlined below. Please quote the study title and project number (listed below).

The Manager, Ethics Administration, University of Sydney:

Telephone: +61 2 8627 8176
Email: ro.humanethics@sydney.edu.au
Fax: +61 2 8627 8177
Study name: Public Perceptions of Science and Experiences of Online Science Events
Project number: 2020/508

olivia-mc commented 3 years ago

A quick update on what’s been happening:

After a big effort during National Science Week and the month after, we closed the questionnaire portion of our study at the end of September. After removing incomplete records, we were able to include 622 audience responses and 87 presenter responses in our final dataset. Thank you to everyone who filled out our survey or helped distribute it to get us as many responses as we did!

This dataset is significant because it is the first time an Australia-wide survey has been run for National Science Week, and the responses capture this year's experiences of National Science Week, which ran predominantly online. The trends and findings from this study are still to come, but we do know that the data will let us create a snapshot of this unique approach to Science Week.

Over the past month, we have been conducting interviews with audience members and presenters to further explore their experiences of online events. Thank you to everyone who volunteered to do an interview with us. We received many responses from audience members, meaning we couldn’t interview everyone who replied, but we are nonetheless incredibly grateful for your willingness to participate. We have now finished all our interviews with audience members and are hoping to finish up our last interviews with presenters over the next week. Once we’ve completed all our interviews, we will be sending off all the recordings to be transcribed so that we can begin analysing our interview data.

We’ve also been busy analysing the data we collected during the survey. We’re excited to share some of our preliminary results from the surveys soon, so keep your eyes peeled for that.

If you would like to become part of a newsletter list receiving updates on the study, please get in touch with Ellie Downing at edow8720@uni.sydney.edu.au

jesiathe commented 3 years ago

Hello! We hope the end of the year hasn’t been too hectic for you, and you’ve got some breaks coming up as 2020 comes to a close.

We’ve finished stage 2 of the data collection for the project: a huge thank you to all interview participants for sharing your time, knowledge and experience with us. We appreciate it greatly! We’re now getting the interviews transcribed and will start analysing them in the new year.

In the meantime, we’ve been analysing data from the questionnaires. A short presentation was given at an Inspiring Australia NSW briefing which you can see here if you like. The recording is of the entire briefing, and there are some excellent presentations about online events and collaborations which are worth a look. We’ve also provided a few snapshots of what we think are the most interesting bits below.

Where did people participate from?

A quick note on the data we collected – our data for both audience records and presenters were skewed towards the east coast, with a large portion of records coming from NSW. This, along with the comparatively low number of responses we collected, does make it difficult for us to get an accurate nationwide picture of what National Science Week looked like this year. The data we do have will, however, give a small snapshot of the pivot to online. This will inform future studies and, importantly, marks the first time a standardised National Science Week evaluation has been run nationally with results made publicly available.

[Image: NSW IA Briefing – participant locations]

We’re getting some interesting results from the questionnaire data and are looking forward to sharing it as we continue our analysis. The geographic accessibility afforded by online events changed the way people participated in National Science Week 2020 and is something we look forward to understanding more about. The graphic above shows where people were participating from; we’re currently working on a graphic to show the relationship between participant place and event place to reflect that many people attended or presented at interstate events. Excitingly, this year’s National Science Week also featured both presenters and audiences from overseas!

Attitudes to science

In general, audiences had very positive attitudes towards science. The graphic below shows the audience responses to the questions "How important do you think science is for society?" and "Science is interesting to me", both answered on a scale from 1 – 100. Approximately half the audience sample gave a rating of 100 in response to at least one of these questions, and 39% of respondents selected 100 for both. Perhaps unsurprisingly, presenters rated community science events as very important, with 100% of respondents giving a rating of 70 or more. Almost half the presenters work in scientific research, showing that National Science Week gives the community a direct line to researchers.

[Image: NSW IA Briefing – attitude ratings]

We will be looking at how these base ratings relate to other questionnaire responses to understand more about the value of, and attitudes towards, online science outreach events.

Enjoyment

Overall, the data shows that audiences really enjoyed the events that they went to. These numbers are based on a sliding scale where we asked people to rate their enjoyment from 0 – 100. Agreement with statements was collected using a scale from strongly disagree to strongly agree. Agree ratings combine people who selected agree or strongly agree in response to the questions (and exclude people who selected somewhat agree or lower).

[Image: enjoyment ratings]
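For anyone curious, the "agree" aggregation described above can be sketched in a few lines; note the responses below are invented for illustration, not actual study data:

```python
# Collapse a Likert-style scale into a single "agree" rate, counting
# only the top two categories ("agree" and "strongly agree").
# These responses are invented, not from the study dataset.
responses = ["strongly agree", "agree", "somewhat agree",
             "agree", "neutral", "strongly agree"]

agree = sum(r in ("agree", "strongly agree") for r in responses)
print(f"{100 * agree / len(responses):.0f}% agree")  # → 67% agree
```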

Overall impressions

Overall, audiences and presenters have positive views of National Science Week. 95% of presenters were satisfied with their experience, and 100% indicated they would present at a future National Science Week. One barrier for online events is connection problems and other technical issues; these affect both audiences and presenters, with 12% and 17% respectively reporting issues during their event. Data from the interviews will potentially reveal more about the barriers to, and opportunities created by, online events.

[Image: NSW IA Briefing – satisfaction and technical issues]

What now?

The above snippets and data are from our initial analysis, focused on the quantitative questions answered. We are now going through the qualitative (open-ended) questions to understand overarching themes and trends and will be looking at the relationships between responses to understand more.
Both these analyses will be combined with analysis of the interviews to help us answer our study questions, which at a glance are:

Audiences
• Are online events more accessible (reaching additional, new audiences), or do they attract a different audience?
• What was the significance of these events for audiences?
• Are audiences inspired to take follow-up actions after attending?
• How does the online format change audience participation in science outreach events?

Presenters
• Who is running science outreach events?
• What do presenters perceive as the non-monetary value of science outreach events?
• What motivates presenters to run events?
• How does online delivery change and/or impact the presentation of science outreach events?

We’ll be making both deidentified datasets publicly accessible in the first half of 2021 – let us know if you’d like to be notified when they go up by emailing Olivia at omcr7514@uni.sydney.edu.au. Leave us a comment if you have any questions, otherwise keep an eye out in 2021 when we are planning to share data, findings and more.

Olivia and Ellie

jesiathe commented 2 years ago

We've finished the analysis and are in the midst of writing up our findings. Before we get lost down that rabbit hole though, we thought we'd make sure the questions from last year were available for those who'd like to use them, as well as share some reflections on the evaluation process.

Questionnaires from the 2020 study, Public Perceptions of Science and Experiences of Online Science Events:

Checklist to consider when designing and developing questionnaires:

Additional resources:

olivia-mc commented 2 years ago

Thank you to Inspiring Australia NSW for inviting us to share our findings yesterday. We hope you found them as interesting as we do and useful for future online events!

For those who missed the presentation, or would like to refresh their memories, we've shared the slides here.

We're looking forward to sharing more of our findings soon, as well as presenting some additional evaluation insights to the ACT chapters of Inspiring Australia and the Australian Science Communicators on Monday.

olivia-mc commented 2 years ago

Thematic analysis and agreement in social science research

Throughout this project, we have been looking at ways to bring rigour to the qualitative work we do.

You can find the full methods section in our paper, which will be out in JCOM shortly, but we wanted to share a brief outline of our methods, including the workings of how we calculated Cohen’s kappa. To understand a little more about what Cohen’s kappa is, we need to explain a bit about our process for analysing our qualitative data.

For our open-ended questions (qualitative data), we used a technique called thematic analysis. Essentially, this is a process for identifying common themes across a data set, and Braun and Clarke have an excellent paper summarising the uses of thematic analysis and the process itself. To determine our themes, we took a sample of the data and had multiple researchers identify what they saw as the themes (also referred to as codes). We would then discuss the themes, identifying which themes were the same or very similar, and which themes were not. Through this process, we were able to create a set of themes that we agreed on, as well as a clear definition for each of these themes, which we summarised in a document called a codebook. This codebook acts as the reference for coding the remainder of the data. You can see the codebooks we used for the data analysis in this paper here.

In addition to discussing the themes to ensure consensus, we also used Cohen’s kappa to determine intercoder reliability. This is a statistical measure of agreement and can be used to ensure that researchers are interpreting and applying the themes and definitions in the codebook in the same way. The higher the value, the higher the level of agreement between the researchers. For this paper, we calculated Cohen’s kappa through SPSS and validated it through manual calculations that you can see here. We’ve also made a visualisation below of how we determined agreement and disagreement. We recorded the frequency with which researchers 1 and 2 coded to the same theme (shown in pink along the diagonal); disagreements are shown in yellow. We used this table to calculate % agreement, expected agreement, and Cohen’s kappa, shown on the left.

Cohen's kappa


Representation of agreement and disagreement between two coders, used to calculate Cohen’s kappa.
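For anyone who wants to reproduce the calculation, here is a minimal sketch of the agreement statistics described above. The agreement table is invented for illustration and is not the study's actual coding data:

```python
# Minimal sketch of % agreement, expected agreement, and Cohen's kappa
# from a coder-agreement table. table[i][j] = number of responses that
# coder 1 assigned theme i and coder 2 assigned theme j.
# The counts are invented for illustration, not the study's data.

def agreement_stats(table):
    n = sum(sum(row) for row in table)
    # Observed agreement: the diagonal, where both coders chose the same theme.
    p_observed = sum(table[i][i] for i in range(len(table))) / n
    # Expected (chance) agreement, from each coder's marginal totals.
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    p_expected = sum(r * c for r, c in zip(row_totals, col_totals)) / n ** 2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return p_observed, p_expected, kappa

# Rows = coder 1, columns = coder 2; three hypothetical themes.
table = [
    [20, 2, 1],
    [3, 15, 2],
    [1, 2, 14],
]
p_o, p_e, kappa = agreement_stats(table)
print(f"% agreement: {p_o:.1%}, expected: {p_e:.1%}, kappa: {kappa:.3f}")
# → % agreement: 81.7%, expected: 33.9%, kappa: 0.723
```

Kappa corrects the raw % agreement for the agreement two coders would reach by chance, which is why it is preferred over a simple percentage for intercoder reliability.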

Questionnaire data analysis

After data cleaning, we had 611 audience records and 88 presenter records from the questionnaires. In the dot points below, we outline our process for identifying our codes and analysing the data from questions found in the audience sample (full details in the paper Methods here). Each open-ended question had one codebook associated with it, meaning we had three codebooks overall (two for the audience questions about the online format and enjoyment, and one for the presenter question about the online format).

Interview data analysis

Follow-up interviews were conducted with 22 audience members and 17 presenters. Interviews were analysed through thematic analysis using NVivo Qualitative Data Analysis Software (QSR International Pty Ltd. 2018).

olivia-mc commented 2 years ago

We've uploaded the Supporting Information for our new paper titled "Easy to join in your pyjamas": benefits and barriers of online science engagement at Australia's 2020 National Science Week.

Download the Supporting Information

The paper has been accepted for publication in the Journal of Science Communication, and is available to read on their website here.