Measuring outcomes can be a vital aid to justifying library work to voters, funders, and stakeholders—as well as determining strategic direction—but it can also be overwhelming.
“Libraries are very good at counting outputs…it’s more difficult to count outcomes,” says Stacey Wedlake, research and communication coordinator for Impact Survey, one of several national projects developed over the past ten years to help public libraries jump that hurdle. “It takes a different way of thinking and approach to understand and then count how people were changed due to the access and use of your services.”
Some statewide efforts have had success with outcomes. New York and Oregon state libraries, for instance, offer outcomes-based evaluation (OBE) training and support for their members. The Job and Career Services Department of the Cuyahoga County Public Library, OH, has measured outcomes for decades, using paper and online surveys for participant evaluations and an electronic records system designed for counseling centers for secure data storage. However, many libraries lack staff with the time to learn this approach, and, for the most part, only some of the most recent graduates bring these skills with them.
Fortunately, efforts to provide that capacity on a national scale are rising to meet the need, but there is still plenty of room for growth—in adoption, in use of the data, and in taking outcomes-based assessment to the next level.
The time has come for outcomes
What explains the current national focus on outcomes measurement? Wedlake is specific: “[We’ve] been talking about the importance of outcomes for many years, but [it’s] reached a tipping point. There seemed to be a shift post-2008. Local governments and taxpayers started taking closer looks at how their money was being spent and wanting institutions to ‘prove’ their value. Outcomes help tell that story [and] give valuable information to librarians who want to improve their services.”
Tim Cherubini, executive director of Chief Officers of State Library Agencies (COSLA) and a leader of the Measures That Matter project, concurs. “We’ve talked about [outcomes measurement and standardized data collection] for a long time—for whatever reason, there’s an energy now, and we need to take advantage of that,” he says.
The reason may very well relate to concrete support. “I think that technology accelerates change and, in this case, expectations as well,” says Gretchen Pruett, library director, New Braunfels Public Library, TX. “With technology, we should be able to have metrics faster and better and more and all the time. [M]any library staff, especially those in smaller libraries, do not have the expertise to develop and administer the tools needed to capture the data. So [tools provided by large initiatives], which are easily administered by small libraries and which give great information on the local level [plus] national comparability, are much more likely to be used by librarians at all levels.”
Many tools, one goal
Three initiatives—Project Outcome, from the Public Library Association (PLA); the Impact Survey, from the University of Washington (UW) iSchool, with original support from the Bill & Melinda Gates Foundation; and the Edge Initiative, from the Urban Libraries Council (ULC)—all offer libraries ways to assess their ability to provide programs and services (especially those focused on patrons needing access to and instruction about technology), analyze the data, and communicate the real-world results of library efforts to decision-makers.
Other projects support different aspects of the shift toward outcomes measurement.
Measures That Matter (MTM), a project from the Institute of Museum and Library Services (IMLS) and COSLA, is designed to examine, evaluate, and map public library data collection in the United States. Rather than providing tools directly, MTM hopes to build bridges among existing initiatives, facilitating collaboration and standardization. Linda Hofschire, director of the Library Research Service at the Colorado State Library and a 2017 LJ Mover & Shaker, explains, “[Measures That Matter will] develop an action plan that reenvisions how data can be collected, stored, and used more efficiently and productively…[and] consider what measures should be collected to tell the most meaningful story of the 21st-century library.”
Uptake of Project Outcome
Project Outcome resulted from work done by PLA’s Performance Measurement Task Force, founded by then-PLA president Carolyn Anthony. “The task force knew libraries were measuring outcomes, but none were doing so consistently across the field,” says Emily Plagman, Project Outcome program manager. “[They] wanted to provide a set of standardized, easy-to-use outcome measures for libraries and aggregate the results to talk about impact at a national level.... The goal of Project Outcome is to create a field-wide shift toward outcomes measurement.”
The free Project Outcome toolkit consists of standardized outcome surveys that are easily distributed on paper or online to patrons immediately after a program or service, or as a follow-up a few weeks later. Most of the questions use a rating scale, but the two concluding open-ended questions for comments have been a gold mine for both positive and change-inducing feedback.
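Project Outcome's own survey and data management tool handles aggregation automatically, but the basic idea is straightforward. The following is a minimal sketch, not Project Outcome's actual instrument or code: the question wording and response data are hypothetical, illustrating how rating-scale answers might be averaged and open-ended comments collected for review.

```python
from statistics import mean

# Hypothetical post-program survey responses: four rating-scale
# questions (1 = strongly disagree, 5 = strongly agree) plus an
# optional open-ended comment field. Question names are illustrative.
responses = [
    {"learned_something": 5, "more_confident": 4, "will_apply": 5,
     "aware_of_services": 3, "comment": "Offer an evening session."},
    {"learned_something": 4, "more_confident": 4, "will_apply": 3,
     "aware_of_services": 4, "comment": ""},
    {"learned_something": 5, "more_confident": 5, "will_apply": 4,
     "aware_of_services": 5, "comment": "The handouts were great."},
]

def summarize(responses):
    """Average each rating question and collect non-empty comments."""
    questions = [k for k in responses[0] if k != "comment"]
    averages = {q: round(mean(r[q] for r in responses), 2)
                for q in questions}
    comments = [r["comment"] for r in responses if r["comment"]]
    return averages, comments

averages, comments = summarize(responses)
print(averages)   # per-question mean ratings
print(comments)   # open-ended feedback for staff review
```

Even a toy summary like this shows why the open-ended comments matter: the averages flag which outcomes are weak, but the comments say what to change.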
The combination works. In its 2016 Executive Summary, Project Outcome reports that libraries are more likely to use outcome data and see benefits when they use Project Outcome tools. Some 51 percent of respondents said that it helped the library have a bigger impact, and among smaller libraries, 28 percent more were using outcomes measures as a result of Project Outcome.
Between 250 and 300 libraries are currently using Project Outcome. On May 1, Project Outcome launched a redesigned website and a PLA-owned survey and data management tool, featuring more versatility and a particularly useful worksheet for analyzing the open-ended question responses. Owning the survey and data management tool allows for future expansion and partnership opportunities and keeps the toolkit free to all U.S. and Canadian libraries.
Scaling the Impact Survey
In 2009, the UW iSchool conducted a national survey to gather data on the impact achieved by patrons using library technology services. “After the research project concluded, libraries began asking us if they could use our survey in their local communities,” Impact Survey’s Wedlake says. “We developed a survey tool that allows public libraries to run our technology surveys on their own schedule; at the conclusion of the survey, the library receives easy-to-share, automatically generated report narratives.”
In addition to offering the preset surveys, Impact Survey researchers are available to help libraries understand outcomes measures, customize data collection tools, and use the data they gather to tell better stories. Impact Survey results are added to customized presentation slides, handouts, and news article templates that articulate the impact of library technology on the community. Since October 2013, 1,751 libraries have registered for Impact Survey, and 85,374 library patrons have submitted the patron technology survey. In 2017, the Impact Survey team hopes to expand survey subjects and reach out to organizations beyond public libraries.
Widening the Edge Initiative
The Edge Initiative enables libraries to achieve better outcomes by assessing their technology, communicating its impact, and planning for the future. Using an online assessment tool, libraries identify strengths, gaps, and areas of improvement for technology infrastructure and services by measuring against 11 benchmarks in providing community value, engaging the community, building partnerships, and offering effective organizational management. With their Edge results, librarians can create an action plan to strengthen technology services and approach local leaders to strengthen alliances. “One of the goals of Edge is for library leaders to use [our] data to begin conversations and strategically work with community and government leaders to position the library as a pivotal partner,” says Edge program director Kristi Zappie-Ferradino.
Approximately 2,100 public libraries participated in Edge in 2016. In 2017, Edge added two peer cohorts—small/rural libraries and urban libraries—and is updating the benchmarks, working with experts from across technology, library, and digital inclusion fields.
Measuring the success of measures
Representatives from all three initiatives joined last fall for the webinar “Measurement Matters: Using Edge, Project Outcome, and the Impact Survey to Assess and Improve Community Outcomes.” They detailed how each project contributes to outcomes and their measurement, offered suggestions on tailoring an implementation plan to suit any library, and highlighted two success stories.
Pruett’s New Braunfels library was one. “My library started by using the Impact Survey, and that yielded all sorts of surprising data about how our library patrons were using our public computers. We used this data to argue for more IT support and more computers, which was then bolstered by the data from the Edge survey,” Pruett said in her presentation. “Project Outcome just continues the data-gathering process we started with the other tools. [With results from all three initiatives] I now have data about the importance of library services, especially technology, that helps me bust the myths in my city and county administration in order to argue for a more robust funding stream. I could not have come up with the tools or the data on my own. And in the age of the Internet—‘everything is on Google’ and ‘why do you still need libraries’—these tools are more important than ever.”
For other libraries, adding outcomes measurements augments what they’re doing with output data. Public Services Project Manager Christa Werle of Sno-Isle Public Libraries, WA, describes implementing Project Outcome as part of a strategic plan. “We are starting with a premise, the impact that we want to contribute to, and working backward through planning to the data collection (outputs and outcomes) that, through analysis, will help us understand if we’re on the right track, then adjust and plan accordingly,” she says. “Our goal is to make a difference that otherwise couldn’t be made and prove it.”
The results have occasionally been surprising, Werle notes. “We can test…assumptions that were based on outputs. Where we may have thought that customers would rather get new/popular ebooks faster, we learned that they would rather have a broader selection to choose from.” The next steps are to develop a measurable service plan for each element of the strategic plan.
Project Outcome’s Plagman also stresses the complementary nature of various measurements. “When you combine outcome and output data with anecdotal evidence, you are able to better demonstrate the library’s benefit to the community.” Brent Bloechle, Plano Public Library, TX, agrees. “We still measure outputs, and I don’t see us stopping in the future,” he says. “Outcomes measures are beneficial because they…ensure that we see library users as individuals and not just statistics.”
Project Outcome’s rating responses are useful, but the open-ended questions contribute great insights into possible improvements, according to Plagman. For example, Werle says, “We find that our customers are quite honest about what we can do to improve their experience, right down to the quality of our toilet paper.”
During a May 2017 webinar, Plagman shared that outcomes demonstrated to the Sacramento Public Library (SPL), CA, that its staff-intensive knitting program significantly improved the lives of participants. So rather than eliminate it, SPL retooled the program to be participant-run, freeing staff for other things.
Improving the measures
Beyond using the ready-made tools provided by the national initiatives, libraries can follow a few tips to improve the outcomes measurement process. One is to involve several staff members to increase investment and spread out the work. “It’s important to [put together] a team…that fits the size of your library,” advises Edge’s Zappie-Ferradino. She also recommends not letting the results stop at the library door. “Engaging local leaders, specifically mayors and city/county managers, creates greater value…than using the information for internal purposes only.”
Sno-Isle’s Werle acknowledges that change can be hard for staff. “Even with the best communication plan, [staff] need to go through their own individual change acceptance. This takes time, energy, and strong leadership.”
For Pruett of New Braunfels, it’s simple. “Just start,” she urges. “Whichever of the tools matches your most pressing need—start with it and work into the others.”