
PROGRAM REGISTRATION EFFECTS ON ATTENDANCE 1

Public Library Program Registration and Its Effect on Program Attendance

Introduction and Research Questions

Public libraries have long provided more for their users than just physical items,

historically books. They have also become gathering centers providing programming for the

community. These programs run the gamut from traditional library offerings like book clubs and

storytimes to cutting-edge programs on STEM topics and wellness activities like yoga. It can be

hard for public libraries to determine what metric is most appropriate to decide whether these

programs are worth the time and resources it takes to produce them. One of the more

consistently used metrics is program attendance.

This metric then raises the question of what measures can be taken to ensure that

programs are well attended. It’s a mystery that many libraries have tried to solve. The proposed

study will attempt to resolve one facet of that mystery. In this study, the following research

question and hypotheses will be tested:

1. Are public library patrons more likely to attend a program or event that requires

registration?

a. The proposed study hypothesizes that patrons are more likely to attend

programs that require registration than programs that don’t require registration.

Determining what factors contributed to well-attended programs is beneficial to all public

libraries because as stewards of the public’s tax dollars, public libraries have a responsibility to

utilize those dollars well. Underattended programs can be considered a waste of taxpayer funds

and of staff time; therefore, knowing how to maximize library program attendance would be

beneficial to libraries and their patrons.

Literature Review

As stated above, the struggle to determine the best way to consistently produce

well-attended programs is not a new challenge for public libraries. Prior to the early 2000s, there

was very little research done on library program evaluation. Nitecki discusses this in her 2004

paper on the subject.

“There is no single theoretical basis by which the purpose of libraries or librarianship

underscores evaluations of library programs, nor is there a common professional mandate or

cultural expectation for librarians to undertake program evaluations.” (Nitecki, 2004)

The 2011 Survey of Public Libraries conducted by the Institute of Museum and Library

Services (IMLS) noted that library programs, in general, were being better attended than in the

past. After the survey results’ release in 2014, Chant summarized the findings with regard to

program attendance. The IMLS survey notes the rising rate of library programming attendance,

and Chant (2014) attributes this to higher levels of funding. This means that libraries are able to

create more programs for patrons to attend, and since there are more programs to attend in

general, attendance has grown alongside the number of programs.

However, the number of programs offered and program attendance are not growing at the

same rate. In 2015, Morley examined the results of the same IMLS survey as Chant (2014) and

dug into the details of the actual programming numbers. There he discovered that “the 61 percent

surge in programming nationwide seems lopsided when placed in context with smaller gains

made in program attendance. Data reveals that attendance at library programs nationwide only

increase 38 percent from 2004 to 2012” (Morley, 2015, p. 23).

Morley (2015) also discusses an ongoing study that was initiated in 2014 by the

American Library Association (ALA) with a grant from IMLS. The National Impact of Library

Public Programs Assessment is currently in its first phase, with the white paper being released

sometime in 2019 (American Library Association, 2017).



Essentially, as libraries have become better and more consistently funded, there has

been a boom of programs offered, and while attendance has also risen, it hasn’t met the growth

of the number of programs.

Although the proposed study concerns registration as an aid to increase program

attendance, there have been a number of studies that explore alternate methods to ensure

program attendance. An early discussion of creating successful programs centers around

framing libraries and their programs as cultural centers to expand the horizons of the

community. Traveling exhibitions booked through the ALA helped increase cultural program

attendance over attendance at regular library programs (Brandehoff, 1997).

A librarian in Canada published a paper in 2000 about the unconventional choice of her

library to charge for library programs. The library began charging $2-6 for library programming in

1995 and saw such consistent support and attendance that most programs generated

waitlists (Saczkowski, 2000). The article also discusses how colleagues within libraries around

the continent of North America seemed alarmed by this concept. The article notes that charging a

minimal fee for programming is standard at libraries around Ontario, but is foreign and alarming

to librarians in the United States and elsewhere in Canada.

“Since we started to charge fees, the level of commitment to our programs has soared.

Our attendance rate is now 90 percent of those registered, compared to 50 to 60 percent before

we introduced fees.” (Saczkowski, 2000)

One additional alternative method to improve program attendance was collaborating with

community partners. In the example discussed by Laddusaw (2018), the Texas A&M library

shared items, promotional materials, and planned programs with a nearby public library and

they saw a 60% attendance increase at their adult programs. Partnerships like this have

proven effective at boosting attendance levels in library programming.



There is also the consideration that attendance isn’t necessarily the only measure of

success for a library program. As Kendzior (2015) states:

“At the end of the day, program attendance numbers don't always tell the whole story. A

program with a mediocre turnout can still be very worthwhile for those who attended. Listen and

watch the exchange between the audience and the presenters during the events. Are the

attendees enjoying themselves? Are they learning something? Are they engaged with the

speaker? Programs with great interaction and social connectivity can be considered a great

success even with an average program attendance.”

Methodology

The proposed study aims to be conclusive as to the effect of program registration on

program attendance at public libraries. Data will be collected via records taken at the program

by researchers, as well as collected via the repository of registrations for these programs. The

analysis will be conducted by comparing the results of program attendance of sets of identical

programs.

This method is appropriate for this study because it allows for organic collection of data

through natural behaviors at a public library. It also minimally interferes with research subjects,

by only adding the registration element. This will hopefully show a correlation between program

registration and program attendance, even if it’s not the correlation predicted in the hypothesis.

The study population would entirely depend on who shows up to programs. Since there

is no control over the actual people who would attend these programs, programs to be

evaluated would be selected as a stratified random sample. If the library being used during the

proposed study hosts 100 programs a quarter and the programs are broken down as follows: 30%

Adult, 15% Teen, 35% School-Aged Children, 20% Preschool/Baby, a selection of programs to

match that stratification will be chosen to duplicate and apply the registration requirement.

Although this will take time to identify, a stratified sample lends more reliability, minimizes bias,

and helps obtain as representative a sample population as possible (Dr. Nic’s Maths and Stats,

2012)
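To make the selection step concrete, the stratification described above can be sketched in a few lines of code. The roster below is purely hypothetical, using the example breakdown of 100 quarterly programs (30% Adult, 15% Teen, 35% School-Aged Children, 20% Preschool/Baby); the actual selection could just as easily be done by hand.

```python
import random

# Hypothetical quarterly roster, grouped by audience segment to match
# the example breakdown in the text (100 programs total).
programs = {
    "Adult": [f"Adult program {i}" for i in range(30)],
    "Teen": [f"Teen program {i}" for i in range(15)],
    "School-Aged": [f"School-aged program {i}" for i in range(35)],
    "Preschool/Baby": [f"Preschool program {i}" for i in range(20)],
}

def stratified_sample(programs, sample_size, seed=None):
    """Draw a random sample from each stratum, proportional to its share
    of the full roster, so the sample mirrors the overall breakdown."""
    rng = random.Random(seed)
    total = sum(len(items) for items in programs.values())
    sample = {}
    for segment, items in programs.items():
        k = round(sample_size * len(items) / total)
        sample[segment] = rng.sample(items, k)
    return sample

selected = stratified_sample(programs, sample_size=20, seed=42)
for segment, picks in selected.items():
    print(segment, len(picks))  # 6 Adult, 3 Teen, 7 School-Aged, 4 Preschool
```

A sample of 20 programs drawn this way preserves the 30/15/35/20 split of the full roster, which is exactly the property that minimizes selection bias here.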

Each program selected will take place twice, and either the first occurrence or the second

will be randomly assigned to require that patrons who want to attend pre-register for the

program. Programs that require registration will collect the registration via Google Forms and

will export it to a Google Sheet. Registration will open for programs six weeks before the

programs take place. The form will be available on the library’s web calendar and librarians will

have access to it to manually add patrons who would like to register in person or over the

phone. At both instances of each program, attendance will be taken via a sign-in sheet and a

headcount from the research team member in attendance. Data from the programs will be

transcribed into separate tabs on the Google Sheet which will allow for easy access by

researchers and will make analysis easier as well. Calculations will have to be tabulated by

hand, and results will need to be put into a table for analysis.

To analyze the data, the attendance rate of programs with registration requirements

would be compared to the attendance rates of programs without registration requirements. This

will be done in multiple comparisons. Overall attendance across all programs will be compared

as well as age group comparisons within the preidentified segments of adults, teens,

school-aged children and preschoolers. There may be age groups that resonate more with

registration than others. As stated above, the data will be reported in charts or bar graphs

that will illustrate the results clearly.
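Once attendance is transcribed, the planned comparisons amount to simple averages of attendance with and without registration, overall and per age segment. The sketch below uses entirely hypothetical placeholder figures standing in for the sign-in sheet and headcount data described in the methodology.

```python
# Hypothetical attendance records: (age_group, required_registration, attendees).
# These numbers are illustrative only.
records = [
    ("Adult", True, 18), ("Adult", False, 12),
    ("Teen", True, 9), ("Teen", False, 11),
    ("School-Aged", True, 25), ("School-Aged", False, 20),
    ("Preschool/Baby", True, 15), ("Preschool/Baby", False, 14),
]

def mean_attendance(records, group=None):
    """Average attendance with vs. without registration,
    overall (group=None) or within one age segment."""
    result = {}
    for registered in (True, False):
        counts = [n for g, r, n in records
                  if r == registered and (group is None or g == group)]
        result[registered] = sum(counts) / len(counts)
    return result

print(mean_attendance(records))           # overall comparison
print(mean_attendance(records, "Adult"))  # within one age segment
```

The same per-segment averages feed directly into the bar graphs described above, one pair of bars (registered vs. unregistered) per age group.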

Hopefully, this research will help public libraries begin to solve the mystery of program

attendance by enlightening library staff as to whether registration is a barrier to attendance or an

aid.

Timeline

Week 1:

● Select programs for upcoming programming cycles to include in the study

Weeks 2 - 8:

● Create Google Form sign-up sheets to embed on the library’s web calendar

● Add programs to the web calendar

● Allow patrons to register for programs

Weeks 9 - 16:

● Run programs

● Tabulate attendance records as the second run of each program ends.

Week 17:

● Analyze data collected from the programs

● Create graphs

● Write the report

Limitations and Research Quality

There are a few obvious limitations to the proposed study. First and foremost, there are

potential limitations based on duplicating programs. It is impossible to ensure that these

programs are truly identical. There may not be enough interest in a program to fill both sessions

of it. There are also outside factors like weather and other large community events. These could

impact how many people want or are able to attend the library's program, and could be a

deciding factor over registration for attending the library’s program. The proposed study will

attempt to mitigate this limitation by giving a month’s time between programs at a time of year

when weather is unlikely to radically change, the early summer for example, and at the same

time of the month and day of the week. Programs will be identical in content and schedule as

well, but there is a chance staff could be ill or leave the library’s employ for some reason within

the proposed study’s time limit. To account for these, researchers will note weather conditions

for each program, as well as noteworthy community events that may pull away

the audience for programs. There is also a built-in element of the proposed study which may

help mitigate these occurrences. The stratified sample will hopefully include a representative

number of regularly occurring programs like book clubs or tai chi classes. Because these types

of programs happen regularly, it is less likely that weather or another factor would affect

attendance.

The prior research on this topic is minimal and therefore poses a limitation to the

proposed study. This means, however, that there is a gap in the collective knowledge about

what truly increases program attendance at public libraries. The fact that the NILPPA research

(American Library Association, 2017) is being conducted further validates this.

Aside from the limitations mentioned above, the quality of the research will be ensured

through several measures. The proposed study will have practically no interaction with patrons

involved in the study and therefore there will be minimal to zero researcher interference in the

results. The data collected are objective, quantitative attendance counts, which will allow the

study to directly test the hypothesis.

Quality is also guarded by the sampling process of choosing which programs to include

in the proposed study. A stratified random sample of programs should result in a reliable sample

of patrons who attend library programs. This sampling should lead to highly reliable data to be

used in the analysis of registration’s effect on library program attendance.



References

American Library Association. (2017, October 10). NILPPA. Retrieved from https://nilppa.org/

Brandehoff, S. (1997). Turning libraries into cultural centers. American Libraries, 28(3), 41.
Retrieved from
https://proxy.library.kent.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=
a9h&AN=9703134548&site=ehost-live

Chant, I. (2014). Public: IMLS: Program attendance still rising. Library Journal, 139(13), 13.
Retrieved from
https://proxy.library.kent.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=l
lf&AN=97350084&site=ehost-live

Dr. Nic’s Maths and Stats. (2012, March 13). Sampling: Simple random, convenience,
systematic, cluster, stratified - Statistics help [Video file]. Retrieved from
https://www.youtube.com/watch?time_continue=293&v=be9e-Q-jC-0

Kendzior, N. (2015). Programmed for Success. Public Libraries, 54(6), 9–10. Retrieved from
http://search.ebscohost.com.proxy.library.kent.edu/login.aspx?direct=true&db=llf&AN=1119466
45&site=ehost-live

Laddusaw, S., & Wilhelm, J. (2018). Yours, mine, ours: A study of a successful academic &
public library collaboration. Collaborative Librarianship, 10(1), 30–46. Retrieved from
https://proxy.library.kent.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=l
lf&AN=130353130&site=ehost-live

Morley, G. (2015). Are we lost in orbit? Louisiana Libraries, 77(4), 19–35. Retrieved from
https://proxy.library.kent.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=l
lf&AN=118762051&site=ehost-live

Nitecki, D. (2004). Program evaluation in libraries: Relating operations and clients. Archival
Science, 4, 17–44.

Institute of Museum and Library Services. (2014, March 12). Public libraries in the United States survey: Fiscal year 2011 (Rep.).

Saczkowski, I. (2000). The buck stops here. School Library Journal, 46(11), 29. Retrieved from
https://proxy.library.kent.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=l
lf&AN=502857325&site=ehost-live
