Learning From Our Statistics
Public Libraries Online, a Publication of the Public Library Association, May 5, 2021

Twelve Simple Rules for Understanding and Using Our Numbers Better

To utilize statistics effectively, librarians need to understand the underlying principles. An oft-neglected area of study in librarianship, statistical fundamentals are approached here in a simple-rules format with examples. The purpose is to help librarians gather and use statistical information in new and better ways. This is of particular concern now, when traditional library statistics like circulation and visitation are dropping nationwide due, in part, to the proliferation of convenient digital information sources.

Libraries are great at counting things. We count visits, user registrations, circulation, collection size, number of programs, attendance at programs, computer and Wi-Fi uses, website visits, and questions asked. We survey our communities to learn more about their needs and the impact of our services, and we tally their responses. And under pandemic conditions, we are counting all sorts of new things: reserves placed, curbside attendance, Zoom attendance, recorded program views, craft kits supplied, and newsletters opened.

Collectively, these numbers may contribute to persuasive infographics and annual reports demonstrating library usage to our communities, Boards, and funding agencies. This is important, since we are the recipients and stewards of public funds to be used for the greater good. However, as statistics in traditional areas like circulation and visitation drop nationwide [1] in the wake of the proliferation of convenient digital formats and pandemic-enforced closures, this is a good time to look more closely at statistics to see what we can learn.

Statistics demonstrate how our communities are using our resources to help us make good decisions about what to do next. These numbers are an important part of our library’s story, telling us what’s being used, what’s changing, what’s working, and what could maybe work better. When we make changes, the numbers before and after can tell us if we are moving in the right direction.

But statistics are often misused. They are sometimes treated as a scorecard where higher means better, encouraging library staff to focus on increasing numbers instead of on meeting needs. We misread data and draw incorrect conclusions. We make bad comparisons, look for trends with insufficient data, and miss connections. We count what is easy to count, and we don’t necessarily count what is important to know. Some of the most important information is simply not countable, and it may get ignored. We take surveys and allow the opinion of a few to stand for the majority [2].

For people who don’t love math, or who were never drawn to study it, using statistics to evaluate library services can be a bit perplexing. To use statistics effectively, we need to understand their limitations and best uses. Twelve simple rules can help us think about statistics more holistically and strategically.

1. Understand What You Are Measuring When You Count

Counting only measures one thing, so we really need to understand what we are measuring when we count. Our door counter tells us how many people passed in front of the device during a particular time span, and that is all. It does not tell us why they came, how long they stayed, how often they visit, or whether they found what they were looking for during their visit. It includes people who came and went multiple times, staff members and working volunteers, visitors from outside our population base, and people just avoiding bad weather.

Our circulation statistics tell us how many items were checked out or renewed. They don’t tell us if books were useful or even read. They don’t tell us whether a renewal was due to continued use or the inconvenience of an earlier return. And with the advent of autorenewals, these numbers tell us even less as we preemptively renew items ourselves, lumping potential overdue items in with actual requested renewals.

Tip: The numbers we collect are points of information, which when combined with other information, can help us make informed decisions. On their own, they are not a measure of whether or not we are meeting our mission or creating an impact. On their own, they tell us very little. Collected over time, they can help us see trends and tell a story.

Tip: Whenever you create a statistical report, it is good practice to include a legend of exactly what was counted. Stating exactly what you are measuring will help you avoid drawing unwarranted conclusions. It may help you rethink how you are counting to see if there is a better way.

2. The More Accurate the Numbers You Collect, the Better the Conclusions You Can Draw

Math is a precise science. When you apply math to estimated numbers, your accuracy decreases with each operation. This affects sums and differences, averages, percentages, services per capita, and turnover rates. The more you estimate, the less useful your statistics are for the purposes of evaluation.

Tip: When at all possible, count instead of estimate. If you need to estimate, employ counting as much as possible in the process. Count the attendees in a quarter of the room and multiply by four or count reference questions for one week each quarter and extrapolate to fifty-two weeks.
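Both estimation strategies in this tip amount to a count plus a multiplier. A minimal sketch in Python; the counts here are hypothetical:

```python
# Estimate attendance: count one quarter of the room, multiply by four.
def estimate_room(quarter_count, sections=4):
    """Estimate total attendance from one counted section of the room."""
    return quarter_count * sections

# Extrapolate annual reference questions from a few sampled weeks.
def extrapolate_annual(weekly_counts, weeks_per_year=52):
    """Project a full year from the average of the sampled weeks."""
    avg_week = sum(weekly_counts) / len(weekly_counts)
    return round(avg_week * weeks_per_year)

print(estimate_room(23))                        # whole-room estimate
print(extrapolate_annual([110, 95, 120, 103]))  # four sampled weeks -> yearly figure
```

Each multiplication compounds whatever error is in the underlying count, which is why the rule urges counting wherever possible.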

Tip: Be consistent with your counting. If you are running a report in your library software to get circulation data, make sure you are using the exact same parameters every time. If you are counting reference questions several weeks each year, try to do it the same weeks each year when activity levels will be similar.

Tip: If you think Census Bureau estimates since the last official count are incorrect for your population, check to see how student enrollment is changing in the local school district, which counts students every year, to get a better idea of how your population has changed.

3. Beware of Drawing Conclusions from Small or Overlapping Data Sets

If you have three people at your technology program one week and four the following week, you could claim an increase of 33%. From a statistical standpoint, though, this data set is too small to support reliable conclusions. The chance that the increase would be replicated or even maintained week after week is statistically small, so using this data to make decisions about what to do next would be ill-advised.

If you had 30 people at Monday’s storytime and 45 at Tuesday’s, you could claim that 75 people came to storytime, when in reality some people may have come to both. The decisions you make about storytime schedules may affect far fewer people than your totals would have you believe.

Tip: Increase the size of your data set by aggregating it across multiple instances. Compare one year of attendance with another instead of one month to the next. Compare all of your technology program attendance collectively instead of by individual program from month to month.

Tip: When you have overlapping data sets, it is especially important to use accurate language. You are counting how many people attended storytime on a given date, not how many different people came to your storytimes.

4. Averages Can Be Misleading

If your legal population size is 25,000 and your annual book circulation is 500,000, it would be tempting to say that on average, your community members read twenty books per year. You might picture this whole community of people, each with a stack of twenty books. And you would likely be picturing no actual person in your community. Why? Because circulation includes renewals, because checkouts do not necessarily mean books were read, and because many of your community members checked out one or zero books, while a few voracious readers borrowed ten items a week.

That same circulation data set might have a median of four (when per-user checkout counts are ranked lowest to highest, the middle value is four). This median is closer to the typical behavior of your community, but it still misses the point that every member of your community is unique.

Tip: You are planning services for a diverse community, so thinking about the average user may narrow your thinking about their needs. [3]

Tip: Using the median instead of the average can help reduce the effect of outliers. For example, the average program attendance may be skewed by your summer reading kickoff attendance of 1,000 people when other programs are in the 5 to 25 attendance range.
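The outlier effect is easy to demonstrate with Python's statistics module; the attendance figures here are hypothetical, with one kickoff-sized outlier:

```python
from statistics import mean, median

# Hypothetical attendance for twelve programs, one of them a
# summer reading kickoff that dwarfs the rest.
attendance = [5, 8, 12, 15, 9, 25, 11, 7, 1000, 14, 10, 6]

print(f"mean:   {mean(attendance):.1f}")  # dragged far above typical turnout
print(f"median: {median(attendance)}")    # much closer to a typical program
```

The mean lands near 94 even though no ordinary program drew more than 25 people, while the median stays in the typical range.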

Tip: When tracking attendance numbers, the average number of people who come to a weekly program is less useful for space planning than knowing the minimum and maximum.

5. Statistics Are Interrelated

Libraries serve their communities in multifaceted and increasingly complex ways, and state agencies each year add more fields to the data they want us to collect. Where we once counted computer uses, we now also collect Wi-Fi uses. Where we once counted programs, we now count children’s, teen, and adult programs separately.

What we notice as we measure more aspects of service is that a policy change or a change in the local environment may increase one statistic while making another drop. For example, as your e-book collection use grows, people can check out from home, and your library visitation and physical collection circulation may drop. As readers’ advisory aids are integrated in your catalog, questions at the desk may decline. The more accessible your databases are through your website, the less demand there should be for reference help. The more intuitive your collection layout and signage are, the less help people should need finding things.

Tip: When you improve services, be prepared to find new ways to count, even ways that the state doesn’t recognize yet. Leaving an e-resources consortium to create a stand-alone collection may result in a smaller collection, but it may also reduce wait times and eventually increase usage.

Tip: When a statistic trends upward or downward, look for changes in other statistics that may be systemically related.

6. Stop Chasing Statistics (More is Not Necessarily Better)

When circulation drops, we may be tempted to rearrange the collection, add display shelves, create additional signage, weed, buy newer materials, and add shelving, since materials become crowded when fewer things are being checked out.

Any one of these strategies might make the collection more attractive and invite more usage, which would be a good thing. However, sometimes a decrease in statistics means there has been a reduction in community need for that service. If the demand for health books has dropped because information in databases and the internet is easier to navigate and more current, then trying to increase the circulation of the physical collection will be a waste of time and money and will not serve the community.

Tip: Whenever a statistical measure drops, think about your goals for the community first. Your community doesn’t care whether your collection circulates 100,000 times or 200,000 times. They care whether they can find the information they need in a format and timeframe convenient to them.

Tip: When circulation drops, your shelves get more crowded. The answer to these crowded shelves is not adding more shelves. The answer is weeding and curating better. You may also need to reduce shelving to create space for something the community needs more!

Tip: Digital circulation may never make up for losses in physical collection circulation. Many digital resources are easily accessible and reasonably inexpensive for patrons to access on their own, while the same resources are expensive or unavailable for libraries, subject to licensing expiration, and circulated sequentially with wait times. Where libraries could enthusiastically recommend that patrons check out physical materials, we are more cautious with digital resources that diminish in availability with each additional circulation. A digital collection operates differently than a physical collection, and we will need to think about new ways to measure the effectiveness of these collections.

7. Don’t Look Too Often

We count and record every day, and we tally every month. But if we look that often for meaning, we give disproportionate attention to detail because there is so much of it. In statistics, we are said to be experiencing a high ratio of noise (data we should not be paying attention to) to signal (the important pieces of information we should be paying attention to). [4]

The greatest strength of statistics gathering is the view it gives us of trends over time. If you look at too short an interval of time, you may be comparing a library in the throes of a digital revolution with a library that relied on now-declining formats like CDs and DVDs for a large portion of its circulation.

Tip: In monthly reports to the Board, putting comparative data in a table may suggest that it is more important than it is. Numbers have the aura of indisputable facts, after all. Skip the comparison and report the month’s total in narrative form, simply recounting this month’s activity. Add a table periodically to focus attention on particular shifts in usage that you want to explore with the Board.

Tip: Sometimes you need to take the longest possible view by comparing your trends to the trends of libraries in general and libraries of similar size and demography. If everyone’s circulation is dropping, it may not be anything you are doing or not doing that is causing the change.

8. Compare with Caution

It’s tempting when the state or national statistics are released each year to start comparing our libraries to others. In almost every case, we will be comparing apples to oranges. While public libraries may have similar missions, the communities we serve are unique. We have differences in building sizes, revenue levels, governance, population density, area served, diversity, education levels, local economies, public transportation, etc., all of which may affect our statistics.

We also collect our statistics differently from one another. A library’s legal service population may be very different from the actual population being served. Libraries count circulation and computer uses using different parameters within the limits of their library software and policies. What is recorded as a reference question varies from library to library and even employee to employee.

Tip: If you peruse the IMLS or your state’s annual library data, filter the data for libraries with a population within about 10% of yours and expenditures within about 10% of yours, aiming for a benchmark set of at least ten libraries and at most about 200. Then compare their usage trends over time with yours. If a library stands out as similar to yours or particularly high-performing, you might inspect their website to see if there is anything you can learn that might be useful.
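A sketch of that filtering step in Python. The library rows and figures below are hypothetical; real ones would come from the IMLS or state data files:

```python
# Hypothetical rows from an annual public library dataset.
libraries = [
    {"name": "A", "population": 24_000, "expenditure": 1_150_000},
    {"name": "B", "population": 26_500, "expenditure": 1_240_000},
    {"name": "C", "population": 90_000, "expenditure": 4_000_000},
    {"name": "D", "population": 25_800, "expenditure": 2_900_000},
]

def peers(data, my_pop, my_exp, tolerance=0.10):
    """Keep libraries within +/- tolerance of both population and expenditure."""
    return [
        row for row in data
        if abs(row["population"] - my_pop) <= tolerance * my_pop
        and abs(row["expenditure"] - my_exp) <= tolerance * my_exp
    ]

benchmark = peers(libraries, my_pop=25_000, my_exp=1_200_000)
print([row["name"] for row in benchmark])  # C is too large; D spends too much
```

Filtering on both dimensions at once is what keeps the comparison closer to apples-to-apples.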

9. Beware of Unwarranted Conclusions

If thirty people come to storytime on Mondays and half that number comes on Tuesdays, you might think that your Tuesday storytime leader needs improvement. In actuality, it may be that your parking lot is too full on Tuesdays, your Tuesday storytime is too close to lunch, or the church down the street has a competing program on Tuesdays. It could be that the kids who typically come on Tuesday are aging out of storytime, and it may be that the smaller group size is just perfect for this group of children.

Tip: When a statistic seems to point to an issue, look for all possible causes. Consider that there may be multiple causes, random interference, or no cause at all. Consider that it might not even be a problem.

Tip: When looking at differences in statistics, consider data from other sources that might lend an explanation or offer insight.

10. Survey for Stories

The surveys we do are almost always statistically insufficient. For a survey to be statistically useful, you need well-designed questions asked of a random sample of people large enough to be significant relative to the whole population. There are online calculators to help you determine your sample size; for a population of 25,000, you would need about 1,000 respondents to be 95% confident of a 3% margin of error.[5] That may sound like gobbledygook, but the point here is that you need a large sampling of respondents to reach any sort of reliable conclusions, statistically speaking.
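Those online calculators implement a standard formula, sketched below. Here 1.96 is the z-score for 95% confidence and p = 0.5 is the most conservative assumption about how responses split:

```python
import math

def sample_size(population, margin=0.03, z=1.96, p=0.5):
    """Respondents needed for a given margin of error at ~95% confidence,
    with a finite-population correction for smaller communities."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2  # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))

print(sample_size(25_000))  # about 1,000 respondents, as cited above
```

Loosening the margin of error shrinks the required sample quickly, which is why a 5% margin is a common compromise for community surveys.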

The surveys libraries typically do provide anecdotal information, which may still be helpful. If you send out a survey and get 160 responses from your population of 25,000 people, you don’t have the numbers for statistical accuracy. If fifty of those people tell you that your late fees are too high, this may be worth paying attention to. If one person tells you that the electrical outlet in your study area is causing a shock, that is really good to know, despite the lack of statistical significance.

Tip: Use surveys to gather anecdotal information. Look for common themes in the stories you are hearing. Don’t over-value tallied results.

Tip: Use surveys to share information as well. When you provide a list of services asking people to check which ones they use, you are simultaneously informing them of those services.

11. Periodically Take a Deep Dive on Some of Your Statistics

Go below the surface by counting more than the state asks you to in order to learn other things. When you are counting reference questions, create a table to tally research, technology, readers’ advisory, and other questions separately. Total your questions answered by hour and day of the week. Learning what percentage of your questions come from different categories and knowing when you get the most questions may help you with staffing and training decisions.

You can track first circulation separately from renewals. You can track circulation by area of collection or new items separately from existing items. On reference count week, you might also ask staff to track hours spent on other tasks to help determine the cost of an ILL, the processing fees to charge for lost materials, or the amount of time to allot for program preparation.

Tip: As part of your strategic planning process, look at the trends over three to five years in every count available to you to paint as complete a picture of operations as you can. A membership review, for example, might include the percentage of your community that has library cards, the percentage of your users that live within your service area, the percentage of your accounts that have checked out a physical item in the past year, the number of new and expiring cards each year, and, of course, the trends in those areas.

Tip: Whenever you are making a change, take measurements before and then measure again at six months and a year to see what effect the change had.

12. If You Are Making Operational Decisions, Ask Yourself How Numbers Might Help

Trying to decide whether to add autorenewal? A table that compares first circulation, renewals, and reserves over time will help you calculate how many autorenewals (renewals of un-reserved items that weren’t already being renewed) your library is likely to process as a percentage of your circulation. It’s an imperfect number, but at our library, we would expect about a 20-25% increase in overall circulation count, allowing for one autorenewal for each item. This is one piece of information to help us make this decision.
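A sketch of that back-of-the-envelope calculation, with hypothetical counts standing in for the ILS reports described above:

```python
# Hypothetical annual counts from ILS circulation reports.
first_checkouts = 160_000   # initial circulations
reserved        = 80_000    # items with holds: not eligible for autorenewal
manual_renewals = 35_000    # renewals patrons already request themselves

total_circulation = first_checkouts + manual_renewals

# One autorenewal per eligible item: un-reserved checkouts that
# weren't already being renewed manually (per the text's assumption).
likely_autorenewals = first_checkouts - reserved - manual_renewals

increase = likely_autorenewals / total_circulation
print(f"Estimated circulation increase: {increase:.1%}")
```

With these made-up figures the estimate lands in the 20–25% range the text mentions; the point is the structure of the calculation, not the specific numbers.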

Tip: Try calculating cost per circulation for different parts of your collection. Divide annual expense by annual circulation/use for your physical collection, e-books, e-magazines, streamed videos, and each database. This may help you with collection expenditure allocation decisions, keeping in mind that physical collections require much greater staff time than digital collections.
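A minimal sketch of that division, using hypothetical expense and use figures:

```python
# Hypothetical annual expense and use by collection segment.
segments = {
    "physical":  {"expense": 120_000, "uses": 200_000},
    "e-books":   {"expense": 45_000,  "uses": 30_000},
    "databases": {"expense": 18_000,  "uses": 4_500},
}

# Cost per circulation/use for each segment.
cost_per_use = {name: s["expense"] / s["uses"] for name, s in segments.items()}
for name, cost in cost_per_use.items():
    print(f"{name:10s} ${cost:.2f} per use")
```

Comparing the per-use figures side by side makes allocation trade-offs concrete, with the staff-time caveat from the tip kept in mind.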

Tip: When preparing for a renovation, try head count studies. Divide your current facility into functional regions and have staff count the users in each area every half hour for a week, then repeat a couple more times at different seasons. Look for patterns in how your spaces are being used at different times of the day and days of the week to help you decide what kinds of space you need.

Tip: Learn Excel and practice. Your library probably offers an online resource to help you learn it if you haven’t already. A few easy formulas can be so helpful for comparing data and spotting anomalies.

Conclusion

Libraries that underutilize statistics are missing out on a valuable tool for informed decision-making. Understanding the logic behind statistics gathering can lead to the intentional collection of specific data to aid in the evaluation of services and resources. This is critical as we focus on building digital collections, developing our spaces for multiple uses, creating programs to meet changing community information needs, and providing essential services during a pandemic.

Just as critical, though, is to realize that our statistics measure usage rather than performance. Twentieth-century management theory, born on the factory floor, taught us to count tasks completed per time period and to set related goals to improve efficiency, but libraries are not factories, and efficiency is rarely our goal. Our focus should be on services that meet community needs and build relationships, delivered in a sustainable way, and the statistics we gather should help us better understand how we are meeting those goals.

References and Further Reading

  1. “Public Libraries Survey (PLS) Data and Reports,” Institute of Museum and Library Services, November 19, 2018, www.imls.gov/research-evaluation/data-collection/public-libraries-survey/explore-pls-data/pla-data. Between 2010 and 2018, nationwide circulation dropped 11.7% from 2.47B to 2.18B and visitation dropped by 17.9% from 1.57B to 1.29B. Population served grew during the same period from 308M to 325M for an increase of 5.5%.
  2. Jerry Z. Muller, “The Tyranny of Metrics” (Princeton, NJ: Princeton University Press, 2018).
  3. Todd Rose, “The End of Average: How We Succeed in a World that Values Sameness” (New York: Harper Collins, 2016).
  4. Nassim Nicholas Taleb, “Antifragile: Things That Gain from Disorder” (New York: Random House, 2016), 125-27.
  5. “Sample Size Calculator,” Creative Research Systems, www.surveysystem.com/sscalc.htm.

The post Learning From Our Statistics first appeared on Public Libraries Online.

Statistics Season
Public Libraries Online, May 20, 2015

For those operating on a June year end fiscal year, the finish line is in sight. We are cleaning up our records, gathering our data, and readying our reports. It is Statistics Season. Every year I hear the same thing from someone: ‘statistics lie.’

For years I taught statistics. And, yes, it is true, you can lie with statistics. However, you can only lie with statistics to those who don’t know anything about statistics. On that front, one can lie about anything. I could tell you the world is flat. If you knew nothing, you might believe me, and I would have told a successful lie. For years, those businesses that saw libraries as competition have been saying libraries are dead. It’s a lie. But those who know nothing about libraries believe it. Lies with statistics are often intentional and come about when the presenter fails to include all of the information. This is why one can only lie with statistics to those who don’t know statistics. When looking at numerical information, there are two basic rules:

1. Always know the total real number. Most stats use percentages. This is a convenient tool that makes comparisons easy. It allows whatever one is looking at to be viewed in terms of 100. 50% is half. 33% is a third. But a half or a third of what amount? Think about this in terms of cash. If I offer you half of a dollar, you might yawn. If I offer you half of 10 million dollars, that would get your attention. Without some indication of how many items/cases were included, a percent is vague at best.

It is also an easy way to lie. For example, say I ask 4 people whether they like the library’s new pet snake. If three of them say yes and one says no, I can honestly report that 75% like our new pet snake. But if I do not tell you I’ve only asked 4 people, is my assertion that most people like our pet a lie? Many would say yes. This lie is easily uncovered by asking how many people were asked. When the actual total numbers are not offered, I’m skeptical.
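The remedy this rule suggests, always reporting the base alongside the percentage, can be made habitual with a one-line helper (a sketch):

```python
def report_share(yes, total):
    """Report a percentage together with the real counts behind it."""
    return f"{yes / total:.0%} ({yes} of {total} asked)"

print(report_share(3, 4))       # the pet-snake survey above
print(report_share(750, 1000))  # same percentage, very different evidence
```

The two lines print the same percentage, but the counts make plain how much weight each figure deserves.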

2. Always know who or what the numbers are coming from. In libraries, hard numbers can be difficult to come by and there will always be a level of acceptance (or not) of how data is gathered. Most library surveys are taken by people in the library—a slightly biased group, often with no guarantee that one devoted person has not stacked the deck. Circulation numbers are generally gleaned from our ILS and we are at the mercy of our programs and appropriate scanning. There will always be concerns, but generally ones we accept despite a margin of error.

Still, going back to our pet snake, consider that instead of asking 4 people, I asked 12 people. Again, 4 of them said they did not like our pet, but 8 said they did. Again, my report says 75% like our pet. But what if those 8 people were all from the local herpetology club? What if they were all personal snake owners, surveyed from the local pet store? How one gets their data is just as important as the numbers themselves. Who was answering the survey and where they were asked should always be known.

There are certainly other elements to be aware of, but these two can take one a long way. Armed with this information, it is very difficult to be lied to; presenting it makes it less likely the accusation can be made against you. If this information is not shared, it always raises a red flag. When presenting statistics, I am always certain to have the total ‘real’ numbers on hand.

The post Statistics Season first appeared on Public Libraries Online.

Library Journal Salary Survey 2014
Public Libraries Online, August 11, 2014

Library Journal Releases Results of 1st Salary Survey

Library Journal’s inaugural salary survey for U.S. librarians and paralibrarians may be the deepest look we’ve had at the range of salary potential. Many of the findings are old news: school librarians earn the highest salaries and public librarians earn the lowest. Other findings are perhaps more surprising, such as that women “make roughly 89% of what their male counterparts earn.”

The survey shows fairly large discrepancies between the major job functions and titles. Among all full-time public librarians surveyed, the median 2013 income was $47,446, with salaries of library directors ranging from $20,000 to $310,000. Library Journal noted that compensation generally increases with the size of the library system, as does the number of staff with an MLIS degree. Here’s a breakdown of the major public library job categories and their median income:

  • Assistant Library Director–$65,825
  • Library Director–$59,392 (lower because the Assistant Library Director position generally exists only at larger library systems)
  • Library/Branch Manager–$55,383
  • Electronic Resources/Digital Content Management–$52,000
  • Technical Services/Systems–$52,000
  • Collection Development/Acquisitions–$51,334
  • Adult/Public Services–$47,000
  • Children’s Services–$47,000
  • Reference/Information Services–$43,000
  • Youth Services–$40,947
  • Teen/YA Services–$40,000
  • Circulation/Access Services–$33,000

Although there has been much discussion on the value of the MLIS degree, the survey concluded that those holding the degree made nearly 50% more than those working in academic or public libraries without the degree. Also worth noting: although public libraries perennially deal with budget challenges, the average pay increase in 2013 was 2.9%. Those who find that statistic shocking are likely amongst the 27% of library staff who reported no pay raise at all.

Perhaps the best news is how satisfied public librarians are with their jobs. Seventy percent of public library workers are either “very satisfied” or “satisfied” with their jobs, with only 2% saying they’re “not at all satisfied” and 6% claiming to be “not too satisfied.” Of those who aren’t satisfied, the top causes are lack of advancement opportunities, low salary, and lack of recognition.

Library systems that simply cannot afford staff pay raises should note that Library Journal stressed that raises aren’t what librarians want most. The study revealed other things that would go a long way toward increasing job satisfaction: full-time jobs with benefits (many respondents cited their frustration with only finding part-time work); security in their positions and the chance to grow within the organization; non-monetary recognition for good work; stronger relationships with management; and an end to increased workloads.

So, that’s a summary of what 3,210 librarians and paralibrarians reported for their 2013 compensation. How does your salary align with the averages? Are any of the findings particularly surprising?

Source:

http://lj.libraryjournal.com/2014/07/careers/payday-lj-salary-survey-2014/

The post Library Journal Salary Survey 2014 first appeared on Public Libraries Online.

The 2013 Public Library Data Service Statistical Report: Characteristics and Trends
Public Libraries Online, May 9, 2014

The Public Library Data Service (PLDS) is an annual survey conducted by PLA. This 2013 survey of public libraries from the United States and Canada collected fiscal year (FY) 2012 information on finances, resources, service usage, and technology. Each year, PLDS includes a special survey highlighting one service area or public library topic. This year these supplemental questions focused on facilities.

PLA and Counting Opinions (SQUIRE) Ltd. continue to partner to provide the service for capturing the data and for the PLAmetrics online portal subscription service—offering access to the longitudinal PLDS data sets going back to FY2002, and data from the Institute of Museum and Library Services (IMLS) going back to FY2000. PLAmetrics provides public libraries real-time access to meaningful and relevant public library data for comparing and assessing their operations using a variety of custom report formats and customizable report templates.

This report presents selected metrics for FY2012 PLDS data and previous year results in tables and figures with related observations. The results in this report were compiled using PLAmetrics.

Research Method and Context

Participation in the PLDS is voluntary, and participants have the option of responding to any or all of the questions that comprise the survey. As in previous years, public libraries in the United States and Canada were invited to participate. Emails (3,430) were sent to launch the survey in January 2013, postcards were handed out at the 2013 ALA Midwinter Meeting in Seattle, follow-up letters and emails were sent throughout March and April 2013, and the deadline for submission was extended from March 15 to April 15. State data coordinators in the United States and provincial/association coordinators in Canada were contacted about promoting the survey to their libraries. Their involvement again led to increased awareness and participation, with 1,949 American and Canadian public libraries partially or fully responding to the request for data, a response rate of 21 percent (a 1.3 percent increase over the previous year). However, due to the voluntary nature of the survey, several libraries had to be contacted for additional data, and 1,897 libraries were included in the final data analysis. This is an increase over 1,579 in FY2011 and 1,461 in FY2010.

Please refer to the online PLDS Survey site for copies of the survey and definitions of questions.

Overall Service Summary

The PLDS Survey includes questions that effectively characterize the operations (input and output measures) of each responding library. Table 1 includes a selection of summary data representing all libraries that provided non-zero values for each selected measure.

Descriptive Statistics of Participating PLDS 2013 Public Libraries (FY2012 Results)

The FY2012 results include 1,897 responding libraries that reported their population of legal service area, a 20 percent increase compared to the FY2011 response count. Table 1 shows that the population served ranged from 143 to 3,819,702, with a mean and median population of 101,607 and 21,256 respectively. These statistics are characteristic of the overall composition of the PLDS FY2012 data set, in which more than 82 percent of the reporting libraries serve populations of less than 50,000. New this year is an increase in the number of participating libraries that serve populations of 50,000 or more (a 37 percent increase). For libraries serving populations of less than 50,000, the response rate is 14 percent higher than last year. As a result of this increase in smaller libraries reporting data in FY2012, the mean and median values listed in table 1 have decreased. The exceptions are mean electronic circulation, which shows an overall increase of 63 percent, and interlibrary loans (ILLs) to/from other libraries, which show modest increases in average values compared to last year.
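The wide gap between the mean (101,607) and median (21,256) population served signals a heavily right-skewed respondent pool: many small libraries and a few very large systems. As a minimal sketch with invented population figures (not PLDS data), a couple of large systems is enough to pull the mean far above the median:

```python
import statistics

# Invented populations of legal service area: many small libraries
# plus a few large urban systems (illustrative only, not PLDS data).
populations = [1_200, 3_500, 8_000, 15_000, 21_000, 48_000, 950_000, 3_800_000]

mean_pop = statistics.mean(populations)      # pulled up by the two large systems
median_pop = statistics.median(populations)  # still reflects the typical small library

print(f"mean:   {mean_pop:,.0f}")    # 605,838
print(f"median: {median_pop:,.0f}")  # 18,000
```

This skew is why the report repeatedly filters out smaller libraries or follows the continuous-responder group: a change in the respondent mix moves the mean even when individual libraries are unchanged.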

Population

Throughout this article, the population of legal service area is used as the basis for grouping results and for per capita ratios. It is important to note that the sample of responding libraries is variable year-over-year and within each population grouping. As such, we also include analysis of continuous responder data. This discussion includes trends and comparisons for the data segmented into either:

  1. nine population of legal service area groupings (shown in figure 1); and/or
  2. a group of libraries (N=352) that have consistently participated in each PLDS survey over the most recent three and/or five years (FY2008 to FY2012).

Distribution of FY2012 and FY2011 Public Libraries by Population of the Legal Service Area
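The grouping itself is simple bucketing on population of legal service area. A minimal sketch, assuming cut points matching the group boundaries named elsewhere in the report (the exact labels and boundaries here are illustrative assumptions):

```python
import bisect

# Assumed upper-exclusive cut points for the nine population groupings
# named in the report (under 5,000 up through 1,000,000 and more).
bounds = [5_000, 10_000, 25_000, 50_000, 100_000, 250_000, 500_000, 1_000_000]
labels = ["<5,000", "5,000-9,999", "10,000-24,999", "25,000-49,999",
          "50,000-99,999", "100,000-249,999", "250,000-499,999",
          "500,000-999,999", "1,000,000+"]

def band(pop_lsa: int) -> str:
    """Assign a library to its population of legal service area group."""
    return labels[bisect.bisect_right(bounds, pop_lsa)]

print(band(21_256))     # the median library from table 1 -> "10,000-24,999"
print(band(3_819_702))  # the largest reported service area -> "1,000,000+"
```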

Figure 2 shows population by legal service area (Pop LSA) reported over the past five years. The trend shows an increase in participating public libraries that serve smaller populations, as evidenced by the lower mean and median values depicted in the last two years. The second part of figure 2 displays the Pop LSA data for the continuous participants, which highlights that the population for this group has not changed much over the past five years and therefore yields more consistent and comparable per capita metrics.

Trend of PLDS Public Libraries by Mean and Median Population of the Legal Service Area

Registered Borrowers

For the continuously reporting libraries, table 2 shows a 2.2 percent increase in the average number of registered borrowers per capita in FY2012 for libraries serving populations fewer than 50,000 compared to a 4.1 percent average decrease in FY2011. Libraries serving populations of 50,000 or more reported an average 0.2 percent increase compared to a 1.5 percent decrease last year. Overall registered borrowers increased by just over 1 percent for this group of libraries in FY2012.

Three-Year Trend for the Percentage of Registered Borrowers per Capita by Population Group-Continuously Reporting Libraries (N=352)


Three-Year Trend by Population Group for the Percentage of Mean Registered Borrowers per Capita

For all libraries reporting both their population of legal service area and the number of registrations, figure 3 shows a three-year trend for mean registered borrowers per capita by population group. For FY2012:

  • Overall 1,897 libraries offer services to a total population of 192,748,171 including 102,759,178 registered borrowers (>71 percent of the population)
  • For those libraries with populations less than 25,000, these 1,010 libraries offer service to a population of 8,008,103 including 4,976,573 registered borrowers (>82 percent of the population)
  • For those libraries with populations more than 25,000, 887 libraries offer services to a population of 184,740,068 including 97,627,462 registered borrowers (>61 percent of the population)

Some libraries, particularly those serving fewer than 10,000, reported a higher number of registrations than the actual number of people in their population of legal service area. Differences in some instances are explained by:

  • 2010 census figures are often no longer accurate especially in communities with rapid expansion or contraction;
  • libraries may serve surrounding communities outside their LSA; or
  • influx of temporary and/or semi-permanent migrant workers.

While mean registered borrowers per capita showed a small overall contraction in FY2011, FY2012 results show an increase for libraries serving populations of less than 50,000 but a decrease for libraries serving populations of 50,000 or more. The most significant marginal change occurred in the population groups under 5,000, with a 36 percent increase. For continuously reporting libraries, the most significant increase is for populations between 10,000 and 49,999. Figure 4 shows the five-year trend for all libraries, for libraries serving populations of 50,000 or more (i.e., excluding those serving populations of less than 50,000), and for the continuously responding libraries. Filtering out the smaller libraries shows that registrations per capita have risen and fallen only slightly for the larger libraries over the last five years (varying between 56 and 60 percent). Registered borrowers per capita for continuously reporting libraries show a stable trend (varying between 57 and 62 percent).

Five-Year Trend for the Percentage of Registered Borrowers Per Capita

Holdings

The three-year trend chart for mean holdings per capita, for continuous respondents, is shown in figure 5 (note: reverse chronological order).

Three-Year Trend by Population Group for Mean Holdings per Capita-Continuously Reporting Libraries (N=350)

The three-year trend for mean expenditures on holdings and e-materials, for continuous respondents, is shown in figure 6 (note: reverse chronological order).

Three-Year Trend by Population Group for Percentage Materials Expenditure Spent on E-Materials (N=346)

Despite an average 20.98 percent increase in expenditures on e-materials as a percentage of total materials expenditure, holdings per capita for the continuous respondent group increased overall by only 2.9 percent.

When viewing the results for all respondents, average holdings per capita show a pattern very similar to previous years (see figure 7). The average overall FY2012 holdings per capita for all reporting libraries is 10.29 (N=1,592), 81 percent greater than last year. This is likely due to the increased number of respondents serving smaller populations (<25,000). As shown in figure 7, after filtering out these libraries, mean and median holdings per capita over the past five years are very stable, with slightly more than 2 percent growth, similar to the continuously responding libraries (2.9 percent).

Five-Year Trend for Holdings Per Capita by Mean and Median Values

Circulation

Continuous respondent libraries circulated about eleven items per capita on average in FY2012, 1.6 percent fewer than the previous year’s average, as shown in table 3 by population groupings.

Three-Year Trend and Percentage Difference in Mean Annual Circulation per Capita by Population Group-Continuously Reporting Libraries

Although 53 libraries within the continuous respondent group did not report electronic circulation figures, the 0.41 e-circulations per capita (an 86 percent increase from the previous year) were insufficient to offset the apparent decline in reported circulation per capita of physical materials. This reduced level of circulation activity likely coincides with the decrease in library visits (see Library Visits).

Figure 8 shows a similar pattern of lower circulation per capita for all libraries except for those serving populations of less than 25,000. Within this group, a 2.5 percent increase in circulation per capita was reported by continuous responders.

Three-Year Trend for Mean Annual Circulation per Capita with Summary Stats by Population Group

Table 4 summarizes the circulation per capita results for continuous respondents that reported circulation by item type, including electronic circulation (N=291). Print circulation accounted for more than 58 percent, CDs/DVDs for more than 34 percent, and “other” for more than 5.8 percent of circulation. These results are similar to the proportions found in the FY2011 survey.

FY2012 Circulation per Capita Summary for Libraries Reporting the Contribution of Circulation by Item Type-Continuously Reporting Libraries

Table 5 shows electronic circulation per capita for all libraries reporting each item type and circulation activity for their library. Table 6 shows electronic circulation for all libraries that reported this activity in FY2011 and/or FY2012. In FY2012, more than twice as many libraries reported electronic circulation, contributing to a 161 percent increase in total e-circulations (0.40 e-circulations per capita).

FY2012 Circulation per Capita Summary for Libraries Reporting the Contribution by Item Type-All Libraries

Electronic Circulation per Capita for All Libraries for FY2011 and FY2012

Table 7 includes circulation per capita results for the 242 continuously reporting libraries that reported both total annual circulation and renewals (renewals represent 27.3 percent of total annual circulation).

Annual Circulation and the Contribution of Renewals

Collection turnover rates (circulation/holdings) are depicted in figure 8 (FY2012 results for all libraries and the continuously reporting libraries).

The rates calculated for each library, summarized in figure 9, show the effect of a higher number of reporting libraries giving rise to lower mean and median collection turnover rates compared to previous years. The collection turnover rate for the continuous reporting libraries shows a continuing softening over the past three years. Collection turnover rates are likely also impacted by the current transition to new formats of holdings (e-materials) and new ways to consume information (circulation) and the ways in which these are counted.
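Collection turnover as defined here is simply annual circulation divided by holdings, computed per library before the rates are summarized. A minimal sketch with invented figures (not PLDS data):

```python
import statistics

# Collection turnover rate = annual circulation / holdings, computed per
# library and then summarized. All figures here are invented.
libraries = {
    "Library A": (210_000, 60_000),   # (circulation, holdings)
    "Library B": (95_000, 40_000),
    "Library C": (480_000, 150_000),
}

turnover = {name: circ / held for name, (circ, held) in libraries.items()}

for name, rate in turnover.items():
    print(f"{name}: {rate:.2f} turns per year")
print(f"mean:   {statistics.mean(turnover.values()):.2f}")
print(f"median: {statistics.median(turnover.values()):.2f}")
```

Summarizing per-library rates (rather than dividing summed circulation by summed holdings) matches the text’s statement that the rates were calculated for each library and then summarized.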

Five-Year Trend for Collection Turnover Rates for All Libraries and Continuously Reporting Libraries

Annual Visits

The continuous responder group shows 1.5 percent fewer library visits per capita. Table 8 shows results for continuous respondents.

Mean Library Visits per Capita for FY2011 and FY2012 for Each Population Group-Continuously Reporting Libraries

Table 9 shows results for all responding libraries. Libraries serving populations of less than 25,000 recorded more visits per capita (between 4.7 and 25.5 percent more), an average of 1.6 more visits per capita than libraries serving communities of 25,000 or more. Libraries serving populations below 100,000 saw an average of at least 7.03 visits per capita, very similar to the previous year.

Mean Library Visits per Capita for FY2011 and FY2012 for Each Population Group-All Libraries

Average library visits per capita for all reporting libraries was 7.05 (N=942), an increase of more than 11 percent over last year. Figure 9 shows that this increase can be accounted for among smaller libraries serving populations of less than 25,000, where more libraries in this segment contributed data this year (959 libraries reported 535,057 mean annual visits in FY2012, compared to 377 libraries with mean annual visits of 648,273 in FY2011). Libraries serving population groups of 25,000 or more reported a decrease in average library visits per capita, a trend continuing from the previous year.

Figure 10 shows the percentage change in library visits over the past two years for each population grouping, and figure 11 shows the three-year trend in mean annual visits per registered borrower for each population grouping. The pattern of declining registrations suggests a relationship with the decreasing library visits for libraries serving populations of 25,000 or more.

Percentage Change in Mean Library Visits per Capita by Population of Legal Service Area

Mean Visits per Registered Borrower Three-Year Trend for Each Population Group-Continuously Reporting Libraries

Changes in hours of operation (total hours open and convenient hours open) likely impact the number of library visits and other in-library service usage, including circulation, program attendance, and reference questions asked/answered (where staff involvement is required). Table 10 shows the three-year trend for hours open per week by population grouping. Consistent with other observations, in four of the nine population groups the mean public service hours per week have been reduced. This reduction in hours likely explains reductions in the numbers of library visits and other activity counts.

Three-Year Trend for Mean Public Service Hours per Week for Each Population Group

Tables 11 and 12 show the three-year mean activity counts for in-library visits and reference questions. While fewer hours of operation are not the only factor affecting visits and related service usage, the pattern is consistent for libraries in population groups showing reduced hours of operation. Mean in-library use of materials is 6.98 percent lower (248,766 in FY2011 to 231,396 in FY2012) and mean reference transactions are 17.89 percent lower (160,261 in FY2011 to 131,587 in FY2012).

Three-Year Trend for Mean In-Library Use of Materials by Population Group

Three-Year Trend for Mean Reference Transactions by Population Group

Given the availability of remote online library services (including reference services, downloadable materials, and online databases), it might be reasonable to assume that physical visits have been displaced by remote/online visits. However, as figure 12 shows, the expected increase in web visits per capita has not occurred. Instead, web visits have declined an average of 13 percent. It is difficult to ascertain the cause, but an updated definition for how to count website visits, combined with differences in the systems and tools used to count this activity, likely explains some of the difference from the previous year.

Two-Year Trend Mean Web Visits Per Capita by Population Group-Continuously Responding Libraries

Operating Finances

Income and expenditure measures continue to provide useful insights and therefore are a major section within the PLDS survey. For the continuous respondent group, the average overall annual library income was $14,001,457 or $53.20 per capita of the legal service area (N=351), a decrease of $0.18 from last year’s average per capita income of $53.38 (N=352).

Overall annual library expenditure per capita is $49.91 (N=351), an increase of $0.17 from the average of $49.74 (N=352) per capita in FY2011.

As shown in figures 13 and 14, the most notable patterns for the continuously responding libraries appear when the population groups serving fewer than 50,000 are compared with those serving 50,000 or more over the previous two years. The groups serving fewer than 50,000 reported increases in both per capita income and expenditures, while the 50,000-and-more groups reported mostly lower income and expenditures, largely unchanged from previous years, although the 500,000–999,999 population group did report higher income.

Three-Year Trend Mean Income ($) per Capita by Population of Legal Service Area-Continuously Responding LibrariesThree-Year Trend Mean Expenditures ($) per Capita by Population of Legal Service Area-Continuous Responding LibrariesOverall average income and expenditures per capita increased in FY2012. However libraries serving populations between 25,000 and 499,999 continue to experience reduced funding and thus continue to make cuts to expenditures per capita. Figures 15, 16, 17, and 18 depict the patterns of income and changes in the expenditures over the past five years for the medium-sized libraries. The graphics show a relationship between funding and expenditure per capita levels each year and the pattern of variability in the budget among competing categories of expenditures.

Changes in Mean Expenditures per Capita by Type and Five-Year Trend for Mean Total Income per Capita for Population Served 25,000-49,999

Changes in Mean Expenditure per Capita by Type and Five-Year Trend for Mean Total Income per Capita for Population Served 50,000-99,999

Changes in Mean Expenditure per Capita by Type and Five-Year Trend for Mean Total Income per Capita for Population Served 100,000-249,999

Changes in Mean Expenditure per Capita by Type and Five-Year Trend for Mean Total Income per Capita for Population Served 250,000-499,999

These patterns of income and expenditure per capita are similar for all libraries and appear to depend on the sources of funding. For FY2012, all libraries serving populations of fewer than 25,000 reported increases in income from state/provincial and other sources, including the federal government. These libraries show higher income levels per capita and correspondingly higher expenditures per capita. This is most significant in the fewer-than-5,000 population group.

One thing common to all libraries serving populations of fewer than 500,000 in FY2012 is that each experienced cuts in per capita income from local government, often the most significant funding source for such libraries. The result of these cuts is depicted in figures 15, 16, 17, and 18 for the population groups from 25,000 to 499,999.

These figures show the impact of cuts to expenditures, specifically reduced expenditure on staff. Interestingly, for the group of continuously responding libraries, the portion of total expenditures spent on staff has tended to grow relative to other areas of spending (1.67 percent from 2008 to 2011, and -0.14 percent in 2012). In addition, since 2007 the percentage of librarians on staff has been increasing while the percentage of non-librarians has been decreasing, a reversal of the trend in staff composition between 2002 and 2007.
Table 13 summarizes various library outputs as a function of expenditures per capita in each population grouping for the continuously responding libraries (N=349). Icons depict the change in value relative to the previous year. In FY2012, continuously reporting libraries realized on average, per $1,000 spent:

  • 1.63 percent fewer visits
  • 1.41 percent fewer circulations
  • 3.36 percent more program attendees
  • 6.25 percent fewer reference transactions
  • 22.63 percent fewer in-library uses
  • 4.36 percent more registered borrowers

Table 14 (also reported last year) represents the overall use of funds by the libraries (activity per expenditure). Compared with the results published last year, most figures have increased. Each of the population groups shows similar relative changes in activities and expenditures. For example, the population groups between 50,000 and 499,999 overall incurred lower expenditures per capita (-8.56 percent) between FY2011 and FY2012, had fewer registered borrowers (-0.29 percent) and library visits (-7.47 percent) per capita, and simultaneously recorded fewer activity counts. Overall, the measures show more up arrows (34) than down arrows (19). This suggests that the responding libraries in general were accomplishing more with fewer dollars in FY2012 (or more with more dollars), the implication being that activity levels are proportionately higher than the operational expenditures that support these activities.

FY2012 Average Library Output Characteristics per $1,000 of Expenditures by Population Group-Continuously Responding Libraries

FY2012 Average Library Output Characteristics per $1,000 of Expenditures by Population Group-All Libraries

Technology

The use and availability of technology in libraries is an important part of the PLDS survey. This set of questions was unchanged from the previous year and provides useful comparative results, listed in descending order according to the percentage of libraries that confirmed they provide the technology service.

Technology equipment available in libraries showed an increase in each category except automated systems. Tablets (127 percent), video game consoles (60 percent), e-book readers (55 percent), other equipment (e.g., wattage readers) (15 percent), and laptops (10 percent) posted the largest increases in the percentage of libraries confirming they offer these technologies compared to last year’s results.

Among the many website offerings, library apps for mobile devices (32 percent) showed the largest increase, and user-driven content (10 percent) and streaming live programs made modest increases in the percentage of libraries confirming they offer these services.

Meanwhile, a smaller percentage of responding libraries (12 percent fewer) indicated they offer Wi-Fi inside. Statistics concerning Wi-Fi outside, tracking of subscription databases, and access to local digitized content were unchanged in the proportion of libraries offering these services.

Special Section: Facilities Survey

A report summarizing results from the Facilities Survey questions included in the PLDS 2013 special section is posted online at www.plametrics.org. If you would like to be notified of additional information about these results and future surveys, please contact pla@countingopinions.com or fill out the notification form on the PLAmetrics website.

2014 PLDS Survey

Results of the 2014 PLDS survey (FY2013 results) will be available soon. For more information, please visit the PLAmetrics website or send an email inquiry to pla@countingopinions.com. The PLDS survey continues to capture timely and relevant data about public library trends. PLA encourages libraries to use this data to enhance their decision-making and advocacy efforts. We also encourage your comments and feedback. And once again, thank you to all of the responding libraries who took the time to participate.

A Look at Library Data https://publiclibrariesonline.org/2013/12/a-look-at-library-data/?utm_source=rss&utm_medium=rss&utm_campaign=a-look-at-library-data https://publiclibrariesonline.org/2013/12/a-look-at-library-data/#respond Thu, 12 Dec 2013 19:18:18 +0000 http://publiclibrariesonline.org/?p=3659 Following the German BIX, recently “Library Journal” and the International Federation of Library Associations (IFLA) Metropolitan Libraries Section each published their rankings of public library services. Libraries can see how they rated, nationally or globally.

Following the German BIX, “Library Journal” and the International Federation of Library Associations (IFLA) Metropolitan Libraries Section each recently published their rankings of public library services. Libraries can see how they rated, nationally or globally.

What do the German cities of Dresden, Erlangen, Jena, Regensburg, and Würzburg have in common? Their public libraries all received four stars for the year 2013 [1], the top rating of the BIX benchmarking system. Four stars mean a gold rating (the best) in each of the four groups of indicators, or, as they are called, Zieldimensionen (target dimensions). The participating libraries are mostly German, since German must be accepted as the project language; the only exceptions are from Switzerland and Austria. BIX was born in 1999 but looks very up-to-date if we consider its 18 indicators for public libraries, divided into services (6), usage (5), efficiency (4), and development (3). Services refers to the core assets of the library: collection, space, staff, computers, programs, and Internet services, whose indicator sums up the number of services provided online, such as homepage, OPAC, user account management, virtual reference, Web 2.0 tools, and electronic resources. Usage includes virtual visits per capita, including homepage and OPAC sessions. Efficiency considers the relationship between expenditures and loans or visits. Development focuses on expenditures on buildings or the training of staff (including conference visits).

The Library Journal Index, created in 2006, may now need more relevant indicators; its benchmarking scheme is based on data from the Institute of Museum and Library Services (IMLS). In it, U.S. public libraries, divided into nine groups according to annual expenditure, are assigned three to five stars based on four core output indicators: circulation, visits, programs, and Internet sessions. The 2013 LJ Index (for year 2011) did not count new services, such as Wi-Fi access, e-book and database usage, or new ways of interacting with patrons. For instance, some public libraries are starting to record “inreach,” services and collaborations with community agencies, in their statistics [2]. Electronic circulation per capita will be added starting with 2013 data. Simplicity is the main objective, at the expense of measuring efficiency or some classic service outputs like reference transactions.

At the international level, a global evaluation of public libraries has been conducted by the IFLA Metropolitan Libraries Section since 2000 [3]. The latest report (for year 2011), published in November on IFLANET, was compiled by Helsinki City Library; 56 libraries participated from Asia, Europe, North America, and Oceania. The survey adopts more than 20 indicators covering inputs, collections, expenditures, staff, and outputs. The 2007-11 trends show a stabilization in staff and acquisitions (after the 2008-09 drop, probably due to the U.S. economic crisis) but a decline in weekly opening hours. After a boom in 2010, e-book collections are moderately increasing. For the second year, data about “hot” topics were collected: electronic services and resources, social networking, and programming. All the libraries (except three) have a Facebook account, and the page of the National Library of Singapore generated more than 785,000 activities in a year! Fans of rankings will find something to sink their teeth into. In this edition, Cleveland (Ohio) Public Library collected seven top positions, particularly in input measures and financial/staff ratios. Columbus (Ohio) and Seattle (Washington) earned some high rankings in output measures, while, among the European libraries, Copenhagen (Denmark) and Helsinki (Finland) got to the podium (Helsinki for the highest number of visits per capita). This survey was used as a management tool by Auckland (New Zealand) Libraries staff when seven separate library systems merged into one. “We became a library system serving 1.5 million people,” wrote Allison Dobbie of the Auckland Council, “so used the statistics as a benchmark to check our resourcing levels relative to other libraries of a similar size. This was useful as we were then able to justify our levels of resourcing to our Council” [4].

If you are confused by the big national and international data projects, go back to the local level and look at what a single public library’s open data can reveal, such as the rise of e-book checkouts or the renewed interest in a novel after a movie release in the Chicago Public Library system [5].

———-

[1] In the group of cities with over 100,000 inhabitants. Libraries are divided into five peer groups according to the number of inhabitants of the served community.

[2] “Inreach” services are considered those “miniprograms that arise spontaneously between staff and patrons.” The last two reports present the profiles of some top-rated libraries, or of new star libraries, with their big strategies and small recipes for earning five stars.

[3] The Section is the network of libraries of cities with 400,000 or more inhabitants.

[4] Email to author (07/31/2013). Other library managers, such as Judith Hare (Halifax Public Library, Canada) and Siobhan Reardon (Free Library of Philadelphia), reported to me about the use of the so-called MetLib Statistics to evaluate a library’s progress in comparison with other institutions of the same population size (emails to author, 07/17/2013 and 07/21/2013). After three consecutive years of funding by IFLA, the survey is now looking for new funds to continue.

[5] Elliott Ramos, “Perusing Chicago Public Library Data: Rogers Park ranks high among bookworms, Great Gatsby flies off shelf and eBook checkouts on the rise”, accessed November 13, 2013,
http://www.wbez.org/blogs/bez/2013-06/perusing-chicago-public-library-data-rogers-park-ranks-high-among-bookworms-great
