Closed · asp82 closed this 6 years ago
Thanks for the feedback! To answer some of your questions: we can view some data over time, but other surveys have been run in such a way that the data sits in different places and has never been aggregated (something I've been working on recently). Some reports exist only as PDFs, and our old survey tool doesn't export data in a logical or readable format, so it requires a lot of reformatting and recoding. Fixing that would, I suppose, be part of my proposal in an overall sense.
For privacy, there are some concerns. How the data is stored and linked is something I need to discuss with our database admin team. There should be a way to link the data via an opaque "person ID" and use that to pull demographic data while leaving off names and emails. But this requires cross-team work in areas I'm unfamiliar with, so I haven't fleshed out many details.
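To make the "person ID" idea concrete, here is a minimal sketch of one common approach: deriving a stable, opaque ID from an email with a keyed hash, so records can be linked without the identity traveling with the survey data. The key name and ID length are assumptions for illustration, not anything from the actual system.

```python
# Sketch of an opaque "person ID": a keyed hash of the email, so linked
# records never carry the name or email itself. All names are hypothetical.
import hashlib
import hmac

SECRET_KEY = b"held-by-dba-team"  # hypothetical; kept separate from survey data

def person_id(email: str) -> str:
    """Derive a stable, opaque ID from an email without storing the email."""
    digest = hmac.new(SECRET_KEY, email.strip().lower().encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

# The same email always yields the same ID, so surveys link up across time,
# but the ID itself reveals nothing about the person.
assert person_id("student@example.edu") == person_id("Student@example.edu ")
```

Whoever holds the key can re-identify people if ever needed (e.g. for a data-deletion request), which is one reason this would sit with the database admin team rather than with the survey data.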
Intervention is something to consider. My first inclination is no, because our surveys are confidential (though not anonymous), and we would only intervene at the program or cohort level. This is definitely something I should address with my team before proposing it to the rest of the organization. We would need to decide on a stance and justify it: I'm sure other departments would love to reach out, but in my mind that violates the terms of our surveys, and if students receive contact based on their survey responses, it could hurt our response rates or the honesty with which students respond.
Great questions! Thank you!
Your problem is that you collect a lot of survey data at different points in a student's journey but are currently unable to draw meaningful trends across time from this data. Your solution is to link survey data together across time and with other stored data using a common identifier for each student.
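A toy sketch of what that linking buys you, assuming each survey wave is keyed by the opaque person ID (the IDs and the `satisfaction` field are made up for illustration):

```python
# Two survey waves keyed by an opaque person_id; values are hypothetical.
wave1 = {"a1f3": {"satisfaction": 4}, "b2c9": {"satisfaction": 5}}
wave2 = {"a1f3": {"satisfaction": 2}, "b2c9": {"satisfaction": 5}}

def trajectory(pid: str) -> list:
    """One student's responses across waves, with no name or email involved."""
    return [wave.get(pid, {}).get("satisfaction") for wave in (wave1, wave2)]

print(trajectory("a1f3"))  # → [4, 2]: a drop that unlinked data would never show
```

Without the common identifier, both waves would show roughly the same average and the individual decline would be invisible, which is exactly the trend-across-time problem described above.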
This is a very simple idea. But it is an extremely powerful one. Data is not useful in a vacuum. I like the overall concept and the arc of your narrative is logical and flows well.
A few questions that maybe I'm not grasping as an outsider to your system:
Can't you currently view trends over time, since you have survey data at various points in time? Or is the complication that all the data is mixed, such that you can't tell the responses of one cohort from another if the surveys were completed in the same timeframe? I'd imagine you could already tell when a program is suffering, because survey responses should start to turn negative regardless of whether you can identify a specific respondent.
Are there privacy concerns here? Or rather, now that the surveys can be linked to personal data and identified, is there a risk that the participation rate declines or the answers become skewed?
Would you ever intervene? E.g., you see a student having a successful experience, then suddenly their survey responses drop dramatically. Would you reach out to the student?