IMLS / public-libraries-survey

FY 2026 IMLS Public Libraries Survey: Solicitation of Data Elements Changes

Addition - Program Sessions/Attendance by Format #37

Closed · enielsen-air closed this issue 3 years ago

enielsen-air commented 4 years ago

SUBMITTER: Evan Nielsen (AIR) on behalf of IMLS and the LSWG PLS Methods Ad Hoc Subcommittee

NEW DATA ELEMENTS:

RATIONALE: This proposal is one of three interrelated proposals for changes and additions to the library programs data elements. The SDCs and the LSWG have indicated a need to gather additional data about library programming, since it is an expanding and changing library service. In particular, in response to the COVID-19 pandemic, SLAAs and libraries have been asking for guidance on how to count live-virtual and recorded programs. A series of data elements was developed based on a review of existing state library surveys and the IMLS State Program Report system. The potential data elements were vetted in one-hour interviews with nine respondents from public libraries of various sizes in seven different states. The interviews informed the work of an ad hoc subcommittee of LSWG members and staff from IMLS and AIR to refine the revised data element definitions proposed here.

The eight data elements presented in this proposal represent four exhaustive and mutually exclusive format categories that would sum to the Total data elements for programs and attendance. Thus, they should be approved or rejected as a set.

PROPOSED DEFINITIONS: See attached file (link below) for the proposed definitions. PLS Program Items_Formats.docx

timrohe commented 4 years ago

I am very opposed to making libraries double-count programs and attendance by both age range and format. That is so much extra work and my directors will riot. I can foresee getting lots of questions/complaints any time the instructions say one data element is a total of this AND a total of that.

This would require four summations: one each for number of programs and attendance by age, and one each for number of programs and attendance by format. Then there would have to be an edit check to make sure the two sets of totals that are supposed to match actually add up to the same number. I foresee a multitude of instances where those two totals are not the same, and that is going to be a huge headache to untangle.
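To make that concrete, here's a minimal sketch of the reconciliation a survey tool would have to run on every record (element names and counts are hypothetical, not actual PLS codes), once for programs and again for attendance:

```python
# Minimal sketch of the cross-check described above: the age-group
# breakdown and the format breakdown must reconcile to the same total.
# Element names and counts are hypothetical, not actual PLS codes.

programs_by_age = {
    "children": 120,
    "young_adult": 40,
    "adult": 90,
    "general": 30,
}

programs_by_format = {
    "in_person_onsite": 150,
    "in_person_offsite": 50,
    "live_virtual": 60,
    "prerecorded": 20,
}

def edit_check(by_age: dict, by_format: dict) -> bool:
    """Return True only if both breakdowns sum to the same total."""
    return sum(by_age.values()) == sum(by_format.values())

if not edit_check(programs_by_age, programs_by_format):
    print("Edit check failed: age-group total != format total")
```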

timrohe commented 4 years ago

Again, I'm opposed to adding prerecorded videos into these numbers and I think taking them out and making them their own separate data elements (number and views) would help alleviate this issue of double counting. However, regarding the question of whether to count views within the fiscal year or only for a limited time after posting, I would be more in favor of counting views at the end of the fiscal year. I believe that to be the lesser of two evils.

Only counting views for a set period of time would cause libraries to feel like they’re not getting credit for views after that time period, although any views after the fiscal year wouldn’t count either, and we all know how much libraries love getting credit for things. I still anticipate getting questions like, “What about that video I posted on December 31?” Live programs, and even live-streamed synchronous programs, don’t have this issue of having views spaced out over time, which is one of the reasons why I think prerecorded videos are fundamentally different and should not be included in the programming numbers.

sdermont commented 4 years ago

I would rather keep the current data elements intact and use them as subtotals. So, for example, data element 601 (Number of Children's Programs) would be the sum of two new data elements: (Number of in-person children's programs) + (Number of live virtual children's programs). That way we aren't asking them to sum things twice - just one count for in person and one count for live virtual, added up to a total that equals the existing element. This is how I've done it for my FY20 survey, and we'll see how that works out.
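A minimal sketch of that roll-up, with hypothetical sub-element names and counts:

```python
# The roll-up described above: libraries report only the two new
# sub-elements, and the existing element 601 (Number of Children's
# Programs) is derived rather than entered a second time.
# Names and counts are hypothetical.
children_in_person = 85
children_live_virtual = 35

children_programs_601 = children_in_person + children_live_virtual  # 120
```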

I'm agreeing with Tim a lot today - I would definitely keep prerecorded program content as a separate data element and would not include it in 601 or the rest of the existing programming elements.

yanademireva commented 4 years ago

While I think the reporting burden here is incredibly high, I do think this could yield some really solid numbers and allow for a more rigorous analysis of what programming really looks like nationwide. I think libraries want to go more in-depth and may be willing to do the work. My main concern for virtual programming is that, depending on the platform, stats are not available beyond a certain date, so *counting them for the entire fiscal year is not possible*. As such, we need to set a suggested limit. In Maryland we chose 24 hours because, again, it keeps the reporting burden low. A month or so would be OK too - further out than I'd prefer, but much more doable than the entire fiscal year.

I agree with Scott's suggestion to break out by age group then mode.

Regarding the onsite/offsite distinction, I do see incredible value in collecting this data. Going along with Scott's suggestion, this could be another sub-element in each age group that contributes to the total for that age group. It would be straightforward to then also auto-sum all onsite/all offsite programs, etc.
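A minimal sketch of that nested structure, with hypothetical labels and counts (prerecorded content left out, per the comments above):

```python
# Each age group is reported by mode, with in-person split into
# onsite/offsite; the age-group totals and the all-onsite/all-offsite
# totals are derived by summation rather than entered separately.
# Labels and counts are hypothetical.
programs = {
    "children":    {"onsite": 70, "offsite": 15, "live_virtual": 35},
    "young_adult": {"onsite": 25, "offsite": 5,  "live_virtual": 10},
    "adult":       {"onsite": 55, "offsite": 30, "live_virtual": 15},
}

age_group_totals = {age: sum(modes.values()) for age, modes in programs.items()}
all_onsite = sum(modes["onsite"] for modes in programs.values())    # 150
all_offsite = sum(modes["offsite"] for modes in programs.values())  # 50
```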

timrohe commented 4 years ago

@yanademireva You said that, depending on the program/application, stats are not available beyond a certain date. For "live-virtual program sessions," we would just be counting the attendance while it's live, right? So, if stats for prerecorded program views are not available past a certain date for some applications, in my mind, that would just be another reason not to include prerecorded programs in the stats for programs. They are fundamentally different.

timrohe commented 4 years ago

Just to lay out the additional reporting burden: it's been suggested a few times that we make these new format data elements sub-elements in each proposed age group and then sum them. We currently track 6 data elements related to programs (600-605). Most of us, even myself, have a handful more. However, sticking to the official numbers, making each of these 8 proposed format-related data elements sub-elements of the 10 age group categories would bring the official number to 80 data elements, plus their accompanying summations, at least one per format. That's an increase of 1,367%, from 6 data elements to 88, and would increase the number of questions on my report by almost 40%. I can't sell that here in NH. My libraries won't do it. I have no carrot or stick to make them do it either. I just don't think that approach is feasible.
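For reference, the arithmetic behind those figures, as a quick sanity check rather than part of any proposal:

```python
# 8 format elements nested under 10 age groups, plus one summation per format.
formats = 8
age_groups = 10
cross_tab = formats * age_groups         # 80 data elements
with_summations = cross_tab + formats    # 88 total
current = 6
pct_increase = (with_summations - current) / current * 100
print(round(pct_increase))  # 1367
```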

yanademireva commented 4 years ago

@timrohe That's an excellent point, and I agree that tracking pre-recorded programs can get messy. I will note that, depending on the platform, we allow libraries to pull "live" stats at 24 hours because there isn't always someone who is able to pull the stats while the program is literally "live." If we wanted to get the most bang for our buck, we would ask for "peak live views," but this can't be replicated across platforms (Facebook/YouTube/Crowdcast/etc.), so we compromised and said 24 hours counts as live.

enielsen-air commented 4 years ago

@yanademireva, can you be more specific about which platforms do not provide a count of peak live views? We looked into Facebook and YouTube specifically, and they both have that metric. I just looked into Crowdcast, and it appears they also have a count of the number of live attendees, which seems like the same thing. I'm not sure how long people have to view the live stream to be counted, but I imagine different libraries have different thresholds for how long people have to stay in the event space to be counted as an attendee for in-person events, so that seems comparable to the current inherent variability in program attendance measurement.

yanademireva commented 4 years ago

@enielsen-air We did not find peak live views specifically with Crowdcast. One aspect of our reasoning was that YouTube and Facebook have different thresholds for what counts as a view - YouTube's, I believe, is 30 seconds, while Facebook's is 3 seconds - so peak live views means different things on different platforms. As such, while peak live views is the easiest metric to get, we felt more confident in one-minute views measured at 24 hours, even for live programs. One of our main goals was to measure engagement consistently across platforms. And I suspect this could all change in the blink of an eye, since we're at the mercy of each platform's metrics.

angelakfox commented 4 years ago

As far as breaking each individual programming format into multiple age groups (which, as Tim pointed out, would bring our programming elements to 88), I do wonder if the data collected will be worth the burden of collecting it. Who is looking for this level of granular breakdown, and why? I don't get the feeling that our directors want this - and if they do, I would certainly encourage them to track it for their own library. But who is going to ask how their live-virtual program attendance numbers for kids ages 6-11 stack up against their peers'?