This is the 13th year of the LJ Index of Public Library Service and Star Library ratings. The 2020 scores and ratings are based on FY18 data from the Institute of Museum and Library Services (IMLS) Public Library Survey (PLS). Because of that delay, the scores don’t reflect the impact of the coronavirus; that impact won’t show up in the data until 2022.
The big news in this year’s edition is that successful retrievals of electronic information (e-retrievals)—measuring usage of online content, such as databases, other than by title checkout—joins the six other measures that determine the LJ Index: physical circulation, circulation of electronic materials, library visits, program attendance, public internet computer use, and Wi-Fi sessions. The addition builds on last year’s new measure, Wi-Fi sessions, and is likely to be followed by library website visits next year, as PLS began collecting that data in 2018. Together with 2016’s addition of electronic circulation, these measures will build a much more complete picture of digital library use than was previously possible. Now that electronic circulation has been well established as a separate statistic, as of this year we have also replaced total circulation with physical circulation, so that e-circulation is not counted twice.
This year’s edition of the LJ Index also incorporates, for the first time, a new criterion for libraries to achieve Star Library status. To address the longstanding, growing problem of extremely high statistical outliers, we have introduced a limit on the extent to which a single extraordinary outlier on one per-capita statistic can earn a library star status. A library’s LJ Index score will not earn it star status if any of its seven per-capita figures is three or more standard deviations above its expenditure category average (the widely accepted statistical standard for an outlier) and if the standard score (a translation of the reported figure to its number of standard deviations from its group mean) for any such statistic accounts for half or more of the library’s total raw LJ Index score (which, currently, is the sum of the standard scores for the seven per-capita statistics).
While a figure that is more than three standard deviations from a spending peer group’s average is highly likely to be erroneous—or at least non-comparable to other figures for its group—that is not assumed by the application of this new criterion. Unusual circumstances often generate exceptional statistics. The greater concern in determining Star Library status is to avoid circumstances in which a single highly extreme outlier for a library, even if it is accurate, is allowed to overcompensate for six other relatively lackluster figures.
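To make the mechanics of the new criterion concrete, here is a minimal sketch, in Python, of how the raw score and the outlier bar described above could be checked. The data layout (a dict of per-capita figures per library), the field names, and the guard requiring a positive raw score are assumptions of the sketch, not details of the actual LJ Index calculation.

```python
from statistics import mean, pstdev

# The seven per-capita service outputs that determine the LJ Index.
MEASURES = [
    "physical_circulation", "e_circulation", "e_retrievals", "visits",
    "program_attendance", "pc_use", "wifi_sessions",
]

def standard_scores(library, peer_group):
    """Translate a library's per-capita figures into standard scores (z-scores):
    the number of standard deviations each figure sits from the mean of the
    library's expenditure peer group."""
    scores = {}
    for m in MEASURES:
        values = [lib[m] for lib in peer_group]
        mu, sd = mean(values), pstdev(values)
        scores[m] = (library[m] - mu) / sd if sd else 0.0
    return scores

def barred_from_star_status(library, peer_group):
    """Apply the outlier bar described above: the score cannot earn stars if any
    single measure is three or more standard deviations above the peer-group mean
    AND that measure's standard score accounts for half or more of the raw
    LJ Index score (here, simply the sum of the seven standard scores)."""
    z = standard_scores(library, peer_group)
    raw_score = sum(z.values())
    return any(
        z[m] >= 3 and raw_score > 0 and z[m] >= 0.5 * raw_score
        for m in MEASURES
    )
```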
Given the combined impact of these changes, it is little surprise that there are a record number of new Star Libraries this year.
One hundred and fifteen—more than two out of five—of 2020’s 262 Star Libraries were not Star Libraries in 2019. There was a dramatic drop in the number of libraries scored on the Index this year, as there was in 2019. The newest per-capita output measure, e-retrievals, was not reported by almost 1,400 libraries. Almost 1,200 additional libraries that reported e-retrievals did not report last year’s new per-capita output measure, Wi-Fi sessions. Combined, those more than 2,500 non-reports were the biggest factor lowering the number of libraries scored on the LJ Index to 5,608. Other disqualifying factors included, but were not limited to, these four: not meeting the IMLS definition of a public library, having a legal service area population of less than 1,000, spending less than $10,000 in total operating expenditures, or declining to report any of the other data required to calculate the LJ Index score.
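For readers who want the screening spelled out, here is a minimal sketch, in Python, of the basic eligibility screens named above. The field names (including the IMLS-definition flag) are hypothetical, and the actual screening involves checks beyond these.

```python
REQUIRED_MEASURES = [
    "physical_circulation", "e_circulation", "e_retrievals", "visits",
    "program_attendance", "pc_use", "wifi_sessions",
]

def eligible_for_scoring(lib):
    """Return True if a library clears the basic screens described above.
    `lib` is assumed to be a dict of PLS fields; the names are illustrative."""
    return (
        lib.get("meets_imls_definition", False)            # IMLS public library definition
        and lib.get("population", 0) >= 1_000              # legal service area population
        and lib.get("total_operating_expenditures", 0) >= 10_000
        and all(lib.get(m) is not None for m in REQUIRED_MEASURES)  # reported all seven measures
    )
```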
MEAN AND STANDARD DEVIATION (SD) OF LJ INDEX STATISTICS BY EXPENDITURE CATEGORY, 2020 (BASED ON FY18 DATA)

All figures are per capita service outputs.

| EXPENDITURE CATEGORY | PHYSICAL CIRCULATION MEAN | PHYSICAL CIRCULATION SD | CIRCULATION OF ELECTRONIC MATERIALS MEAN | CIRCULATION OF ELECTRONIC MATERIALS SD | ELECTRONIC INFORMATION RETRIEVALS MEAN | ELECTRONIC INFORMATION RETRIEVALS SD | LIBRARY VISITS MEAN | LIBRARY VISITS SD | TOTAL PROGRAM ATTENDANCE MEAN | TOTAL PROGRAM ATTENDANCE SD | PUBLIC INTERNET COMPUTER USE MEAN | PUBLIC INTERNET COMPUTER USE SD | WI-FI SESSIONS MEAN | WI-FI SESSIONS SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| $30M+ | 8.08 | 4.92 | 1.63 | 1.07 | 1.95 | 3.38 | 4.62 | 1.73 | 0.39 | 0.21 | 0.88 | 0.41 | 1.93 | 3.11 |
| $10M-$29.9M | 8.56 | 5.31 | 1.53 | 2.15 | 1.71 | 2.41 | 4.74 | 2.03 | 0.47 | 0.35 | 0.94 | 0.56 | 1.91 | 3.16 |
| $5M-$9.9M | 9.29 | 7.36 | 1.19 | 1.07 | 2.40 | 5.44 | 5.88 | 5.10 | 0.56 | 0.46 | 1.00 | 0.94 | 2.62 | 6.34 |
| $1M-$4.9M | 7.91 | 6.24 | 0.93 | 1.19 | 4.05 | 26.85 | 5.91 | 4.91 | 0.65 | 0.73 | 0.90 | 0.93 | 1.91 | 9.59 |
| $400K-$999.9K | 7.25 | 5.90 | 0.81 | 1.18 | 10.84 | 189.90 | 6.49 | 6.10 | 0.75 | 0.85 | 0.99 | 1.34 | 1.53 | 3.89 |
| $200K-$399.9K | 5.92 | 5.60 | 0.62 | 0.93 | 9.16 | 95.68 | 5.35 | 4.44 | 0.65 | 0.78 | 0.92 | 1.09 | 1.24 | 4.27 |
| $100K-$199.9K | 6.02 | 5.96 | 0.59 | 0.87 | 4.63 | 65.86 | 5.43 | 5.32 | 0.70 | 0.95 | 1.09 | 1.90 | 1.69 | 16.34 |
| $50K-$99.9K | 4.94 | 4.04 | 0.52 | 0.84 | 0.54 | 8.99 | 4.26 | 3.47 | 0.58 | 0.66 | 0.91 | 1.19 | 1.04 | 2.60 |
| $10K-$49.9K | 3.15 | 3.01 | 0.34 | 0.96 | 0.10 | 0.76 | 2.61 | 1.97 | 0.37 | 0.60 | 0.68 | 0.85 | 1.11 | 4.99 |
| TOTAL | 6.41 | 5.75 | 0.72 | 1.09 | 5.37 | 96.75 | 5.32 | 4.88 | 0.63 | 0.78 | 0.94 | 1.26 | 1.53 | 8.49 |

KEY: M = millions; K = thousands

Note: These figures are for the 5,860 public libraries originally scored on the LJ Index for 2020. In some cases, they are affected dramatically by extreme outliers that, for the first time this year, disqualified the LJ Index scores of 252 libraries from consideration as Star Libraries.
Nationwide, 85.2 percent of U.S. public libraries reported electronic information retrievals for FY18. In individual states, however, levels of reporting for this newest LJ Index statistic varied dramatically.
For 17 states, e-retrievals were reported for FY18 by all public libraries: Arizona, Delaware, Georgia, Idaho, Kentucky, Louisiana, Montana, North Carolina, New Hampshire, Nevada, New York, Ohio, Oklahoma, Oregon, Rhode Island, South Carolina, and Wyoming. E-retrievals were also reported by the District of Columbia and Hawaii.
Conversely, 13 states had double-digit percentages of libraries that did not report e-retrievals for FY18, despite it being the second year of collecting such data for all states and, for most, the third year. These are Maine (91.6 percent), Utah (77.0 percent), New Jersey (63.4 percent), Connecticut (60.9 percent), Minnesota (48.5 percent), Wisconsin (38.1 percent), Florida (37.0 percent), Tennessee (30.6 percent), California (29.8 percent), Illinois (24.7 percent), Washington (23.0 percent), Virginia (21.5 percent), and Vermont (17.9 percent). There were no states in which no libraries reported e-retrievals.
Several factors probably account for most of these differences in reporting. States in which databases are licensed at the state level, rather than by individual libraries, are more likely to report data for all or most libraries. Similarly, states in which the state library agency or one or more library consortia organize cooperative purchasing of e-resources or negotiate discounted licensing costs for individual libraries are likely to have more comprehensive reporting on e-retrievals. Where there is any higher-level organization of e-resource purchasing, it is more likely that local libraries have ready access to statistics on e-retrievals by their users.
An excellent example is the Ohio Public Library Information Network (OPLIN), which provides a database statistics portal for its public libraries (oplin.ohio.gov/statistics). Other states, like North Dakota and Pennsylvania, provide usage data to individual public libraries for databases provided by the state library. Local libraries must supplement these data with comparable data for e-resources to which they subscribe individually. (See library-nd.libguides.com/PLS and bit.ly/PAcounter5.) In still other states, like Texas, individual public libraries are not expected to report e-retrievals for databases licensed by the state; the state library agency parses those statistics for them. (See the 2019 Texas Public Libraries Annual Report: Issues and Explanations.)
Notably, a study conducted in North Carolina indicates that database usage tends to be higher for public libraries for which database usage statistics (available monthly from NC LIVE) are downloaded more frequently. The authors offer two possible explanations for this relationship: “Libraries are downloading their usage reports and then intentionally promoting resources to patrons,” and “Libraries that download statistics from the NC LIVE website are simply more familiar with the databases that are available and are more likely to use them and suggest them to patrons.” Other predictors of database use include two capacity indicators—the number of databases to which the library provides access and the number of public internet computers in the library—and one community indicator, the percentage of the legal service area population with a bachelor’s degree. (See Jill Morris & Emily Guhde, “Making Usage Data Meaningful.”)
PUBLIC LIBRARIES ELIGIBLE FOR THE LJ INDEX AND REPEAT AND NEW STAR LIBRARIES, 2009–20 (BASED ON FY06–18 IMLS DATA)

Cells show the number of libraries in each expenditure category or status, by edition (data year in parentheses).

| EXPENDITURE CATEGORY | DEC 2020 (2018 DATA) | DEC 2019 (2017 DATA) | NOV 2018 (2016 DATA) | DEC 2017 (2015 DATA) | NOV 2016 (2014 DATA) | NOV 2015 (2013 DATA) | NOV 2014 (2012 DATA) | NOV 2013 (2011 DATA) | NOV 2012 (2010 DATA) | NOV 2011 (2009 DATA) | OCT 2010 (2008 DATA) | NOV 2009 (2007 DATA) | FEB 2009 (2006 DATA) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| $30M+ | 58 | 55 | 54 | 49 | 49 | 51 | 47 | 46 | 44 | 48 | 45 | 36 | 31 |
| $10M-$29.9M | 104 | 112 | 127 | 116 | 107 | 112 | 113 | 112 | 114 | 107 | 106 | 98 | 88 |
| $5M-$9.9M | 173 | 192 | 220 | 219 | 222 | 209 | 209 | 198 | 191 | 211 | 186 | 176 | 159 |
| $1M-$4.9M | 1,132 | 1,251 | 1,445 | 1,436 | 1,401 | 1,397 | 1,381 | 1,367 | 1,349 | 1,307 | 1,282 | 1,209 | 1,125 |
| $400K-$999.9K | 1,113 | 1,221 | 1,451 | 1,443 | 1,414 | 1,446 | 1,394 | 1,395 | 1,373 | 1,377 | 1,333 | 1,278 | 1,247 |
| $200K-$399.9K | 915 | 1,030 | 1,169 | 1,186 | 1,171 | 1,209 | 1,208 | 1,174 | 1,170 | 1,129 | 1,087 | 1,113 | 1,089 |
| $100K-$199.9K | 920 | 1,044 | 1,204 | 1,212 | 1,180 | 1,257 | 1,237 | 1,251 | 1,258 | 1,236 | 1,204 | 1,191 | 1,173 |
| $50K-$99.9K | 756 | 888 | 1,011 | 1,002 | 1,055 | 1,088 | 1,122 | 1,111 | 1,126 | 1,145 | 1,128 | 1,152 | 1,115 |
| $10K-$49.9K | 437 | 540 | 680 | 746 | 750 | 894 | 875 | 919 | 945 | 953 | 1,036 | 1,015 | 1,088 |
| Total Libraries Rated | 5,608 | 6,333 | 7,361 | 7,409 | 7,349 | 7,663 | 7,586 | 7,573 | 7,570 | 7,513 | 7,407 | 7,268 | 7,115 |
| Repeat Stars (from prior year) | 147 | 172 | 197 | 205 | 199 | 207 | 198 | 196 | 203 | 195 | 195 | 208 | |
| New Stars (not starred prior year) | 115 | 89 | 60 | 54 | 61 | 54 | 60 | 67 | 59 | 67 | 63 | 50 | |
| TOTAL STARS | 262 | 261 | 257 | 259 | 260 | 261 | 258 | 263 | 262 | 262 | 258 | 258 | 256 |

KEY: M = millions; K = thousands
If the PLS is to represent fully the important role that e-resources play in today’s public libraries, there is a pressing need for IMLS to address the high levels of non-reporting on e-retrievals with the Chief Officers of State Library Agencies and State Data Coordinators in underreporting states. Those State Data Coordinators and continuing education providers should redouble their efforts to train and support staff of local libraries in collecting and reporting this statistic.
State Data Coordinators offer a uniquely well-informed perspective about the challenges public libraries face in collecting and reporting data on e-retrievals. Among those who responded to an inquiry about the biggest obstacles their libraries face in reporting e-retrievals, there was unanimity on two points: confusion and burden. According to Evelyn Lindberg and Kathleen Sullivan of the Washington State Library, “Public libraries in Washington State subscribe to dozens of electronic collections and must determine for each one what constitutes a user having ‘examined, downloaded, or otherwise [been] supplied…full content units or descriptive records.’ This is especially complicated for collections that use streaming models, or include content other than traditional documents, music or video (e.g., curricula). All but one of our public libraries host at least 22 collections (the average is 39), so generating a measure of e-retrievals means determining the correct metric and then extracting that figure from many sources. Non-response occurs disproportionately among our smallest libraries (serving fewer than 7,000 people), where the task likely overwhelms capacity.”
That collecting and reporting e-retrieval data is not even more challenging can be credited largely to the efforts of Project COUNTER, which originally stood for Counting Online Usage of NeTworked Electronic Resources. Building on previous efforts of the International Coalition of Library Consortia and the Association of Research Libraries, Project COUNTER created, and periodically updates, a Code of Practice that dozens of online content vendors subscribe to and comply with. As with many such large-scale, collaborative efforts, however, the “devils in the details” are the extent to which vendors actually comply with COUNTER’s Code of Practice and the extent to which library managers are sufficiently well-trained to understand and exploit it. Most state library agencies and library consortia do not offer training to library managers to take advantage of COUNTER’s data standards and to navigate the remaining challenges associated with noncompliance by some vendors. In the absence of such training, it is little wonder that many library managers—especially those from very small public libraries—feel confused and burdened by the PLS request for a count of e-retrievals.
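Once a library has decided which COUNTER metric it will treat as an e-retrieval, the roll-up across vendors can be mechanical. The sketch below, in Python, assumes each vendor report has been exported or trimmed to a flat CSV whose first row holds the column names, including COUNTER-style “Metric_Type” and “Reporting_Period_Total” columns; the folder name, the chosen metric, and that layout are assumptions for illustration, not a prescribed workflow.

```python
import csv
from pathlib import Path

# Metric types this library has decided to count as e-retrievals for the PLS.
# Which COUNTER metrics map to the PLS item is a local or state policy decision.
E_RETRIEVAL_METRICS = {"Total_Item_Requests"}

def e_retrievals_from_reports(report_dir):
    """Sum the chosen metric across a folder of vendor usage reports saved as
    flat CSVs with 'Metric_Type' and 'Reporting_Period_Total' columns
    (an assumed export layout; COUNTER tabular reports normally need their
    multi-row header trimmed off first)."""
    total = 0
    for path in Path(report_dir).glob("*.csv"):
        with open(path, newline="", encoding="utf-8-sig") as f:
            for row in csv.DictReader(f):
                if row.get("Metric_Type") in E_RETRIEVAL_METRICS:
                    raw = (row.get("Reporting_Period_Total") or "0").replace(",", "")
                    total += int(raw)
    return total

# Example (hypothetical folder name):
# print(e_retrievals_from_reports("fy18_vendor_reports"))
```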
At least seven state library agencies provide local data reporters with specific guidance about how to report e-retrievals. They crosswalk each electronic resource, with its particular unit of measurement (e.g., full-text retrievals, abstract requests, sessions, checkouts), to the appropriate PLS item: e-retrievals, or circulation of electronic materials—the other PLS item with which it is most likely to be confused. (A minimal illustration of such a crosswalk appears after the list below.)
CALIFORNIA Counting Successful Retrieval of Electronic Information for CAPLS
COLORADO What Goes Where in the Colorado Public Library Annual Report (PLAR) (2020)
INDIANA What Goes Where 2019
MINNESOTA Minnesota Public Library Report, Successful Retrieval of Information from Electronic Collections
NORTH CAROLINA Public Library Survey/Changes to the Survey
NORTH DAKOTA Electronic Library Resource Usage 2019
If you know of another state library that provides similar guidance, please contact the author so this list can be updated.
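As a minimal illustration of what such a crosswalk can look like, the sketch below maps a few invented e-resources, each with the unit its vendor reports, to the PLS data element that unit should feed. The resource names, units, and mappings are hypothetical and are not drawn from any state’s actual guidance.

```python
# Hypothetical crosswalk: e-resource -> (vendor's native unit, PLS data element).
CROSSWALK = {
    "Statewide periodical databases": ("full-text retrievals", "e-retrievals"),
    "Genealogy database": ("sessions", "e-retrievals"),
    "E-book/e-audio platform": ("checkouts", "circulation of electronic materials"),
    "Streaming video service": ("plays", "circulation of electronic materials"),
}

def totals_by_pls_item(usage):
    """Roll per-resource usage counts up into PLS data elements using the
    crosswalk above. `usage` maps resource name -> reported annual count."""
    totals = {}
    for resource, count in usage.items():
        _unit, pls_item = CROSSWALK[resource]
        totals[pls_item] = totals.get(pls_item, 0) + count
    return totals

# Example:
# totals_by_pls_item({"Genealogy database": 1200, "E-book/e-audio platform": 8400})
# -> {"e-retrievals": 1200, "circulation of electronic materials": 8400}
```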
Eligible libraries are grouped by total operating expenditures and rated based on their differences from the average for the seven per-capita statistics. For answers to basic questions about the LJ Index and its Star Library ratings, visit the Star Library/LJ Index FAQ.
If your library does not appear on the 2020 list of Star Libraries, don’t assume that there is nothing in this project for your library. See the 2015 Star Libraries/LJ Index article (bit.ly/indexscore2015) for several examples of ways you can reanalyze LJ Index scores for your library and a subset of public libraries and come up with your library’s own claim to distinction.
If your library is still struggling with collecting and using statistics on Wi-Fi sessions, it may be worthwhile to consider the variety of factors that can affect how high or low your library’s Wi-Fi session count is. These factors are discussed in the 2019 Star Libraries article.
Keith Curry Lance (keithlance@comcast.net) is an independent consultant based in Boulder, CO. He also consults with the Louisville, CO–based RSL Research Group. In both capacities, he conducts research on libraries of all types for federal and state library agencies, state library associations, and other library-related organizations. For more information, visit www.keithcurrylance.com.