Simpson Final Research Proposal
Public libraries have long provided more for their users than just physical items, historically books. They have also become gathering centers providing programming for the community. These programs run the gamut from traditional library offerings like book clubs and storytimes to cutting-edge programs focusing on STEM topics and wellness activities like yoga. It can be hard for public libraries to determine what metric is most appropriate to decide whether these programs are worth the time and resources it takes to produce them. One of the more straightforward metrics is attendance.
This metric then raises the question of what measures can be taken to ensure that programs are well attended. It is a mystery that many libraries have tried to solve. The proposed study will attempt to resolve one facet of that mystery. In this study, the following research question will be addressed:
1. Are public library patrons more likely to attend a program or event that requires
registration?
a. The proposed study hypothesizes that patrons are more likely to attend programs that require registration than programs that do not.
This question is important to public libraries because, as stewards of the public’s tax dollars, they have a responsibility to utilize those dollars well. Underattended programs can be considered a waste of taxpayer funds and of staff time; therefore, knowing how to maximize library program attendance would be valuable.
Literature Review
PROGRAM REGISTRATION EFFECTS ON ATTENDANCE 2
As stated above, the struggle to determine the best way to consistently produce well-attended programs is not a new challenge for public libraries. Prior to the early 2000s, very little research had been done on library program evaluation. Nitecki (2004) discusses this gap in her work on relating library operations to clients.
The 2011 Survey of Public Libraries conducted by the Institute of Museum and Library Services (IMLS) noted that library programs, in general, were being better attended than in the past. After the survey results’ release in 2014, Chant summarized the findings with regard to program attendance. The IMLS survey notes the rising rate of library programming attendance, and Chant (2014) attributes this to higher levels of funding. This means that libraries are able to create more programs for patrons to attend, and with more programs available, overall attendance rises.
The number of programs offered and program attendance, however, are not growing at the same rate. In 2015, Morley compared the results of the same IMLS study that Chant (2014) summarized and dug into the nitty-gritty of the actual programming numbers. There he discovered that “the 61 percent surge in programming nationwide seems lopsided when placed in context with smaller gains made in program attendance” (Morley, 2015); the data reveal that attendance at library programs nationwide grew far more slowly.
Morley (2015) also discusses an ongoing study initiated in 2014 by the American Library Association (ALA) with a grant from IMLS. The National Impact of Library Public Programs Assessment (NILPPA) is currently in its first phase, with its findings to be published in a white paper (American Library Association, 2017).
Essentially, as libraries have become better and more consistently funded, there has been a boom of programs offered, and while attendance has also risen, it has not kept pace with the growth in offerings. Beyond simply offering more programs, a number of studies explore alternate methods to ensure well-attended programs. One such method, discussed by Brandehoff (1997), is framing libraries and their programs as cultural centers to expand the horizons of the community. Traveling exhibitions booked through the ALA helped increase cultural program attendance.
A librarian in Canada published a paper in 2000 about her library’s unconventional choice to charge for library programs. The library began charging $2 to $6 for library programming in 1995 and saw such a consistent show of support and attendance that most programs generated waitlists (Saczkowski, 2000). The article also discusses how colleagues at libraries across North America seemed alarmed by this concept. Charging a minimal fee for programming is standard at libraries around Ontario but is foreign and alarming elsewhere. As Saczkowski (2000) explains:
“Since we started to charge fees, the level of commitment to our programs has soared. Our attendance rate is now 90 percent of those registered, compared to 50 to 60 percent before.”
One additional alternative method to improve program attendance is collaborating with community partners. In the example discussed by Laddusaw and Wilhelm (2018), the Texas A&M library shared items, promotional materials, and planned programs with a nearby public library, and the public library saw a 60% attendance increase at its adult programs. Partnerships have proven to lend a meaningful boost to program attendance.
There is also the consideration that attendance is not necessarily the only measure of a program’s success. As Kendzior (2015) notes:
“At the end of the day, program attendance numbers don't always tell the whole story. A program with a mediocre turnout can still be very worthwhile for those who attended. Listen and watch the exchange between the audience and the presenters during the events. Are the attendees enjoying themselves? Are they learning something? Are they engaged with the speaker? Programs with great interaction and social connectivity can be considered a great success.”
Methodology
The proposed study will quantitatively examine the effect of registration requirements on program attendance at public libraries. Data will be collected via attendance records taken at the programs by researchers, as well as via the repository of registrations for these programs. The analysis will be conducted by comparing program attendance across sets of identical programs.
This method is appropriate for this study because it allows for organic collection of data through natural behaviors at a public library. It also interferes minimally with research subjects by adding only the registration element. The design should reveal any correlation between program registration and program attendance, even if it is not the correlation predicted in the hypothesis.
The study population will depend entirely on who shows up to programs. Since there is no control over which people attend these programs, the programs to be evaluated will be selected as a stratified random sample. If the library used in the proposed study hosts 100 programs a quarter, broken down as 30% Adult, 15% Teen, 35% School-Aged Children, and 20% Preschool/Baby, a selection of programs matching that stratification will be chosen to duplicate and apply the registration requirement.
Although this will take time to identify, a stratified sample lends more reliability, minimizes bias, and helps obtain as representative a sample population as possible (Dr. Nic’s Maths and Stats, 2012).
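As an illustrative sketch only, the stratified selection described above could be performed as follows. The program counts and the 30/15/35/20 split are the hypothetical figures from the example; the program names and sample size of 20 are placeholders, not study parameters.

```python
import random

# Hypothetical quarterly program list, keyed by age-group stratum.
programs = {
    "Adult": [f"adult_{i}" for i in range(30)],
    "Teen": [f"teen_{i}" for i in range(15)],
    "School-Aged": [f"school_{i}" for i in range(35)],
    "Preschool/Baby": [f"preschool_{i}" for i in range(20)],
}

def stratified_sample(programs, sample_size):
    """Pick programs from each stratum in proportion to its share of the total."""
    total = sum(len(group) for group in programs.values())
    sample = {}
    for stratum, group in programs.items():
        # Round each stratum's allocation to the nearest whole program.
        n = round(sample_size * len(group) / total)
        sample[stratum] = random.sample(group, n)
    return sample

random.seed(1)
selection = stratified_sample(programs, sample_size=20)
# With 100 programs and a sample of 20, the 30/15/35/20 split yields
# 6 Adult, 3 Teen, 7 School-Aged, and 4 Preschool/Baby programs.
print({stratum: len(chosen) for stratum, chosen in selection.items()})
```

Each selected program would then be duplicated, with one occurrence randomly assigned the registration requirement.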
Each program selected will take place twice, and upon random assignment, the first
occurrence or the second one will require that patrons who want to attend pre-register for the
program. Programs that require registration will collect the registration via Google Forms and
will export it to a Google Sheet. Registration will open for programs six weeks before the
programs take place. The form will be available on the library’s web calendar and librarians will
have access to it to manually add patrons who would like to register in person or over the
phone. At both instances of each program, attendance will be taken via a sign-in sheet and a
headcount from the research team member in attendance. Data from the programs will be transcribed into separate tabs on the Google Sheet, which will allow for easy access by researchers and will make analysis easier as well. Calculations will be tabulated by hand, and results will be put into a table for analysis.
To analyze the data, the attendance rates of programs with registration requirements will be compared to the attendance rates of programs without registration requirements. This will be done in multiple comparisons: overall attendance across all programs will be compared, as well as attendance within the preidentified segments of adults, teens, school-aged children, and preschoolers. Some age groups may respond to registration more than others. As stated above, the data will be reported in charts or bar graphs.
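A minimal sketch of the planned comparisons is below. Every attendance figure is an invented placeholder, not study data; the record layout simply mirrors the paired design of one registration-required run and one open run per program.

```python
# Each record pairs the two runs of one program: attendance at the
# registration-required session vs. the open (no-registration) session.
# All numbers are hypothetical placeholders.
results = [
    {"group": "Adult", "registered_run": 18, "open_run": 12},
    {"group": "Teen", "registered_run": 9, "open_run": 11},
    {"group": "School-Aged", "registered_run": 25, "open_run": 20},
    {"group": "Preschool/Baby", "registered_run": 15, "open_run": 14},
]

def totals(records):
    """Sum attendance for each condition across the given records."""
    registered = sum(r["registered_run"] for r in records)
    open_run = sum(r["open_run"] for r in records)
    return registered, open_run

# Overall comparison across all programs.
overall_registered, overall_open = totals(results)
print(f"All programs: {overall_registered} registered vs. {overall_open} open")

# Per-age-group comparison within the preidentified segments.
for group in ("Adult", "Teen", "School-Aged", "Preschool/Baby"):
    records = [r for r in results if r["group"] == group]
    registered, open_run = totals(records)
    print(f"{group}: {registered} registered vs. {open_run} open")
```

The same totals, tabulated by hand from the Google Sheet, would feed the charts and bar graphs described above.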
Hopefully, this research will help public libraries begin to solve the mystery of program attendance and will serve librarians as a practical aid in planning well-attended programs.
Timeline
Week 1:
● Identify and select the stratified sample of programs
Weeks 2 - 8:
● Create Google Form sign-up sheets to embed on the library’s web calendar
Weeks 9 - 16:
● Run programs
Week 17:
● Create graphs
There are a few obvious limitations to the proposed study. First and foremost, there is no guarantee that both occurrences of a program will be truly identical. There may not be enough interest in a program to fill both sessions of it. There are also outside factors like weather and other large community events. These could impact how many people want or are able to attend the library’s program and could outweigh registration as the deciding factor in attendance. The proposed study will attempt to mitigate this limitation by giving a month’s time between programs, at a time of year when weather is unlikely to change radically (the early summer, for example), and at the same time of the month and day of the week. Programs will be identical in content and schedule as
well, but there is a chance staff could be ill or leave the library’s employ for some reason within the proposed study’s time frame. To account for these possibilities, researchers will note weather conditions for each program, as well as noteworthy community events that may pull away the audience for programs. There is also a built-in element of the proposed study that may help mitigate these occurrences. The stratified sample will hopefully include a representative number of regularly occurring programs like book clubs or tai chi classes. Because these types of programs happen regularly, it is less likely that weather or another factor would affect attendance.
The minimal prior research on this topic also poses a limitation to the proposed study. This means, however, that there is a gap in the collective knowledge about what truly increases program attendance at public libraries. The fact that the NILPPA research is still in its first phase underscores how recent this line of inquiry is.
Aside from the limitations mentioned above, the quality of the research will be ensured through several measures. The proposed study will involve practically no interaction with the patrons being studied, so there will be minimal to zero researcher interference in the results. The data collected are objective, as the study gathers quantitative attendance counts and aims to avoid subjective interpretation.
Quality is also guarded by the sampling process for choosing which programs to include in the proposed study. A stratified random sample of programs should result in a reliable sample of patrons who attend library programs. This sampling should lead to highly reliable data for the analysis.
References
American Library Association. (2017, October 10). NILPPA. Retrieved from https://nilppa.org/
Brandehoff, S. (1997). Turning libraries into cultural centers. American Libraries, 28(3), 41. Retrieved from https://proxy.library.kent.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=9703134548&site=ehost-live
Chant, I. (2014). Public: IMLS: Program attendance still rising. Library Journal, 139(13), 13. Retrieved from https://proxy.library.kent.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=llf&AN=97350084&site=ehost-live
Dr. Nic’s Maths and Stats. (2012, March 13). Sampling: Simple random, convenience, systematic, cluster, stratified - Statistics help [Video file]. Retrieved from https://www.youtube.com/watch?time_continue=293&v=be9e-Q-jC-0
Kendzior, N. (2015). Programmed for success. Public Libraries, 54(6), 9–10. Retrieved from http://search.ebscohost.com.proxy.library.kent.edu/login.aspx?direct=true&db=llf&AN=111946645&site=ehost-live
Laddusaw, S., & Wilhelm, J. (2018). Yours, mine, ours: A study of a successful academic & public library collaboration. Collaborative Librarianship, 10(1), 30–46. Retrieved from https://proxy.library.kent.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=llf&AN=130353130&site=ehost-live
Morley, G. (2015). Are we lost in orbit? Louisiana Libraries, 77(4), 19–35. Retrieved from https://proxy.library.kent.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=llf&AN=118762051&site=ehost-live
Nitecki, D. (2004). Program evaluation in libraries: Relating operations and clients. Archival Science, 4, 17–44.
Public libraries in the United States survey: Fiscal year 2011 (Rep.). (2014, March 12). Washington, DC: Institute of Museum and Library Services.
Saczkowski, I. (2000). The buck stops here. School Library Journal, 46(11), 29. Retrieved from https://proxy.library.kent.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=llf&AN=502857325&site=ehost-live