The evolution of Global Libraries’ performance measurement and impact assessment systems

Jeremy Paley (The Bill & Melinda Gates Foundation, Seattle, Washington, United States)
Julia Cottrill (GMMB, Washington, District of Columbia, United States)
Katherine Errecart (FSG, Seattle, Washington, United States)
Aimee White (Custom Evaluation Services, Mukilteo, Washington, United States)
Carrie Schaden (Community Attributes, Seattle, Washington, United States)
Tyler Schrag (Community Attributes, Seattle, Washington, United States)
Robert Douglas (Community Attributes, Seattle, Washington, United States)
Beeta Tahmassebi (EnCompass LLC, Rockville, Maryland, United States)
Rachel Crocker (IREX – Beyond Access, Washington, District of Columbia, United States)
David Streatfield (Information Management Associates, Twickenham, United Kingdom)

Performance Measurement and Metrics

ISSN: 1467-8047

Article publication date: 13 July 2015


Abstract

Purpose

The purpose of this paper is to describe the evolution of a common approach to impact assessment across the Global Libraries (GL) portfolio of grants. It presents an overview of two systems, the Performance Metrics (PMs) and the Common Impact Measurement System (CIMS). By providing a standard set of definitions and methods for use across countries, these systems enable grantees to collect data that can be compared and aggregated for the purpose of collective learning, improvement, accountability, and advocacy.

Design/methodology/approach

The PMs offer a standard methodology to collect library project performance management data, whereas the CIMS is a standard survey of public library users. The paper describes how the PM and CIMS data are being visualized and used, with examples of findings and lessons learned.

Findings

The paper cites examples of the type of PM and CIMS data available, with a focus on employment, gender, and case studies from Botswana and Indonesia. These highlights illustrate how libraries’ user demographics differ from other types of public internet access venues and how libraries can contribute to strong employment and growth.

Research limitations/implications

The measurement systems rely on different partners collecting data for the same metrics across different countries; while each grantee adheres to a standard methodology, small procedural and methodological differences are inevitable. Future research could focus on conducting similar studies elsewhere, outside the cohort of countries in the GL portfolio of grants.

Practical implications

The paper offers insights and lessons for library agencies or institutions interested in implementing a common measurement system. Recognizing that few library projects have the resources to track a comprehensive set of indicators, a case study is presented about how smaller initiatives can adapt these systems to their needs.

Social implications

The indicators described in this paper enable public libraries to shift their focus from services provided to the outcomes they help individuals and communities realize, potentially increasing the potency of their programming and advocacy.

Originality/value

Common measurement systems are not new, but their application in the public library field is novel, as is the Data Atlas, a platform grantees use to compare results across metrics, track progress, and conduct advocacy.

Citation

Paley, J., Cottrill, J., Errecart, K., White, A., Schaden, C., Schrag, T., Douglas, R., Tahmassebi, B., Crocker, R. and Streatfield, D. (2015), "The evolution of Global Libraries’ performance measurement and impact assessment systems", Performance Measurement and Metrics, Vol. 16 No. 2, pp. 132-158. https://doi.org/10.1108/PMM-04-2015-0010

Publisher

Emerald Group Publishing Limited

Copyright © 2015, Authors. Published by Emerald Group Publishing Limited. This work is published under the Creative Commons Attribution (CC BY 3.0) Licence. Anyone may reproduce, distribute, translate and create derivative works of the article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licenses/by/3.0/legalcode .


Introduction

The Global Libraries (GL) initiative of the Bill & Melinda Gates Foundation provides access to information through technology in public libraries across entire countries. GL’s multi-year grants (or “country grants”) fund efforts to understand local information needs, purchase equipment for libraries that can help meet those needs, train library staff, and help libraries build public support for long-term funding (for more information about the GL initiative and its goals, see “Guest Editorial: Impact Planning and Assessment of the Global Libraries initiative of the Bill & Melinda Gates Foundation,” which begins this special issue of Performance Measurement and Metrics).

Each Country Grant project includes staff and financial resources for a body of work known as “Impact Planning and Assessment (IPA).” The core principle of IPA is the importance of understanding local needs, designing services to meet those needs, and measuring progress for the purposes of learning, improvement, accountability, and advocacy. GL grantees design and implement programs tailored to local environments by targeting individuals’ and communities’ needs as well as local governments’ priorities. By addressing local funders’ particular priorities and promoting libraries’ impact in these domains, library development programs are demonstrating the important contributions that they make and are attaining increased, sustainable funding (see “Paper 1: Global Libraries Impact Planning and Assessment Progress” for more information about the GL approach to IPA).

The IPA approach theoretically means library programs should focus on measuring only what is locally relevant. For several years, this is what grantees did, but eventually the lack of standardization created problems for both advocacy and management of the portfolio of similar grants. Without standard measures across projects, GL could not communicate about aggregate achievements to an internal Foundation audience or to the external public library field. Advocacy and communication efforts were further limited by GL’s inability to situate any single grantee’s work within an international context. Several grantees measured similar concepts, such as public libraries’ contribution toward reducing unemployment, but did so in dissimilar ways, making comparison impossible.

This paper describes an evolution from a collection of sui generis measurement schemas toward a common approach to measurement across the GL portfolio of grants. It presents an overview of two such systems, the Performance Metrics (PMs) (introduced in 2009 and refined in 2013) and the Common Impact Measurement System (CIMS) (introduced in 2013). Each of these systems represents a distinct link in GL’s theory of change: GL and its grantees aim to increase access to information (measured by the PMs) to improve people’s lives (measured by the CIMS).

The paper then describes how the PM and CIMS data are being visualized and used, with a summary of some of the interesting and useful findings that have emerged so far. Recognizing that GL programs are unique, and that few library projects have the resources to track such a comprehensive set of indicators, a case study is presented about how smaller initiatives can take the theory behind CIMS, borrow the key principles and a subset of indicators, and adapt the approach to their own needs. The paper concludes with lessons learned from the experience of designing these common measurement systems.

It is important to note that despite the move toward standardization over the past three years, other data points remain important to grantees. GL still encourages grantees to collect information that is relevant to their local efforts and priorities. For example, grantees often collect data about the capacity of library staff, the strength of library partnerships, the efficacy of the policy environment, and people’s perceptions about libraries – none of which are covered by the standard measurement systems introduced by GL.

Aggregating achievements: the evolving GL PMs

Purpose and background

The GL PMs are a set of indicators that measure the achievements and monitor the progress of GL’s Country Grant programs (the PMs closely follow the international standards ISO 2789:2006 and ISO 11620:2008). Released in 2009, the initial version of the Global Libraries Guide to Impact Planning and Assessment contained a set of required and recommended PMs perceived as mainstream in the library field and desirable by managers, funders, and policy makers. In 2012-2013, after finding it difficult to aggregate and compare the data that had begun flowing in from several countries, GL consulted with grantee impact specialists to refine the metrics and to ensure that all grantees collect data in standard, comparable ways, using identical definitions and methods.

By providing a standard set of definitions and methods for use across countries, the PMs enable grantees to collect data that can be compared and aggregated for the purpose of collective learning, improvement, accountability, and advocacy. Today, all GL Country Grantees use these metrics.

PM components

The PMs are grouped according to the following key categories:

  • Public library service points metrics tell us how many libraries the GL program reaches:

    1. Public library service points include the total number of public library service points providing public access computing. A public library service point is any library facility, fixed or mobile, through which the public library provides a service to the general public. Central libraries, branch libraries, and mobile libraries each count as individual service points.

    2. Public access computing means providing at least one workstation available to the public regardless of whether access is free.

  • Computers and workstations metrics tell us the amount of new technology available as a result of the GL program. A workstation is a computer connected to the internet.

  • Use of workstations metrics tell us whether library visitors are using the new technology the GL program provided.

  • Visits to public libraries metrics tell us whether library use changes over time, particularly after the GL program provides new technology.

  • Spending metrics tell us whether public investment in the country’s libraries changes over time.

  • Training metrics tell us how many library staff and users receive training during the GL program.

  • Library activities metrics tell us what library visitors are doing.

The following components accompany each metric:

  • A specific definition for the metric.

  • How to count: the sources, frequencies, and methods for data collection. These guidelines encourage proper sampling and estimating to avoid excessive measurement costs and effort.

  • Required/Optional: if a metric is required, a grantee must collect the data. If the metric is optional, grantees may choose whether to collect this data.

  • Target population: for some metrics, data should be collected from GL-supported libraries only. For other metrics, GL requests that data be collected (or estimated) for all libraries in the country.

See Appendix 1 for a complete list of the PMs.

From shared outputs to shared impact: the GL initiative’s CIMS

As described above, all GL Country Grantees collect data to track and inform their work. For several years they have used the PMs to measure progress as they increase public access to information through technology and training. While the PMs have allowed these grantees to measure the growth of their technology and services, this is just one part of the picture. In order to continuously improve their services, evaluate the global scope of their impact, seek new types of partnerships, and advocate for more government support, Country Grantees need a standardized way to demonstrate how providing access to technology in libraries improves people’s lives. In turn, to advocate effectively for funding that sustains access to information through technology, individual libraries must also be able to communicate the benefits they provide to individuals and communities.

The purpose

Today, the GL initiative seeks to equip public libraries with evidence of their ability to drive development – not just through traditional performance indicators, but in measurable results like job skills developed, education attained, employment found, money saved, and livelihoods improved. GL Country Grantees now employ a CIMS to quantify their work’s impact on public library users.

Data collected through CIMS enables public libraries to shift their focus from the services they provide to the outcomes they help individuals and communities realize. By agreeing to report the same measures using standardized definitions and methods, the GL initiative and Country Grantees are able to:

  • aggregate data to determine the total impact of GL Country Grantees and enhance their ability to advocate for the importance of public libraries;

  • track data over time to identify and monitor trends in public library use and reach, and incorporate this information into Country Grant programs and library services;

  • compare data across countries to allow grantees to learn from one another’s successes and challenges;

  • refer to a central, definitive source in communications and advocacy activities, so there is no confusion about where the numbers come from or how they are calculated; and

  • leverage a new online reporting system, the Data Atlas, to visualize public library data, giving the GL initiative and Country Grantees insight into dynamic results as they are reported.

The CIMS framework is a shared set of indicators that GL Country Grantees measure to understand their individual and collective impact on the lives of library users. These indicators – and the positive outcomes libraries hope to realize – span seven common issue areas (Table I).

See Appendix 2 for the complete list of required CIMS indicators across these seven categories.

Guiding principles behind CIMS

CIMS helps demonstrate progress toward a set of desired outcomes. For each of the seven issue areas, the GL initiative has identified a set of outcomes, or positive changes in the lives of individuals and communities, that libraries can help achieve. CIMS provides the indicators that help grantees collect data about whether these outcomes are occurring. Before the indicators were developed, a set of guiding principles was established to govern how the new measurement system would operate:

  1. CIMS was designed with practical validity in mind, seeking a balance between what would be ideal to know and what is feasible to measure: each Country Grantee must feel comfortable that the data they collect paints an accurate picture of public library users in their country. To make the system manageable for Country Grantees with different levels of research experience and capacity, GL decided to work closely with Country Grantees and evaluation experts to develop the system.

  2. CIMS emphasizes contribution, not attribution: CIMS is designed to help GL Country Grantees understand whether the efforts of grantees are one of the causes of improvements in the lives of library users – not whether (or how much) these efforts are directly or solely responsible. In practice, this means GL does not require that grantees randomly assign their intervention to libraries and conduct experiments using control groups. In the contexts in which grantees work, choosing a random sample of libraries with which to work is often impossible (or extremely difficult). This sacrifice in the level of rigor is balanced by virtues such as practicality and grantee endorsement. Some grantees have chosen to compare data with findings from control groups, but this is not a requirement.

  3. The CIMS framework includes both required and optional indicators: CIMS includes 41 required indicators for which all Country Grantees gather data (see Appendix 2 for a full list), and 53 optional indicators that Country Grantees may use if relevant to their program’s focus area. The GL initiative also encourages Country Grantees to collect additional, custom data to demonstrate the impact of their programs in areas of interest to specific stakeholders.

  4. CIMS was designed to be manageable: the 41 required indicators translate into only 20 survey questions, a fraction of the size of surveys that Country Grantees already conduct. Grantees determine the format of the survey (paper or electronic), and the GL initiative recommends that grantees choose an external organization (like a polling firm or university) to collect the CIMS data (Figure 1).

The collaborative process used to develop CIMS

The CIMS outcomes, indicators, and methodology were designed in collaboration with the GL initiative’s staff, evaluation experts, and several members of each Country Grant team, including program directors and advocacy specialists – in all, over 50 people shaped the framework over a year-long process.

The GL initiative used this inclusive process for four reasons:

  1. to promote a sense of shared ownership and familiarity with CIMS among Country Grantees;

  2. to ensure that the indicators are truly relevant in the context of Country Grants;

  3. to ensure that the methodology is practical and not a burden on Country Grantees; and

  4. to leverage Country Grantee expertise in research, planning, evaluation, and indicator development.

Impact specialists – the Country Grant team members responsible for evaluation of projects and processes – were most deeply engaged in the design process. These grantees provided critical input on every step of the design of CIMS, from indicator selection to methodology guideline development.

To design the indicator system, facilitators used several types of processes to collect feedback. To generate long lists of potential indicators, the primary tools were:

  • direct engagement: brainstorming exercises with grantee Impact Assessment Specialists and Advocacy Specialists at in-person meetings;

  • individual phone calls with Impact Assessment Specialists; and

  • group Skype calls with a cross-country pool of Impact Assessment Specialists.

After this initial set of exercises to brainstorm potential indicators for inclusion, facilitators led the participants through a set of activities designed to winnow down the list of 265 potential indicators into a manageable set that would not over-burden grantees. Techniques included:

  • An online asynchronous voting exercise with all Impact Assessment Specialists, Advocacy Specialists, and Program Directors. Respondents were asked to vote on whether each indicator should be required, optional, or removed from the system entirely. During this phase Impact Assessment Specialists were encouraged to seek input from Country Grant team members, and to report back each group’s consensus votes. Each country provided consolidated results from multiple members of the country team.

  • An online asynchronous voting exercise with GL staff and leadership, including 15 external stakeholders/consultants (e.g. from the Information School at the University of Washington).

  • Facilitators color-coded the indicator votes and synthesized comments to help the GL team hone and prioritize the indicators.

Because 48 percent of the votes favored making a proposed indicator “required,” GL worked to narrow the list to a more manageable set of 41 required and 53 optional indicators.
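To make the winnowing step concrete, the following minimal sketch (in Python, with hypothetical indicator names, vote data, and a simple plurality rule of our own choosing, not GL’s actual decision procedure) shows how consensus votes might be tallied into required/optional/removed classifications:

```python
from collections import Counter

# Hypothetical consensus votes per candidate indicator, one vote per country team.
# Categories mirror the voting exercise described above.
votes = {
    "visitors_who_learn_basic_computer_skills": ["required", "required", "optional", "required"],
    "visitors_who_sell_products_online": ["optional", "optional", "removed", "optional"],
    "visitors_aware_of_civic_activities": ["removed", "optional", "removed", "removed"],
}

def classify(indicator_votes):
    """Return the plurality vote for one indicator (ties favour 'optional')."""
    counts = Counter(indicator_votes).most_common()
    if len(counts) > 1 and counts[0][1] == counts[1][1]:
        return "optional"  # a tie lowers the stakes, much as optional status did in practice
    return counts[0][0]

for name, v in votes.items():
    print(f"{name}: {classify(v)}")
```

In practice, as described above, the vote tallies were color-coded and discussed by facilitators and the GL team rather than applied mechanically.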

Following a process of indicator refinement, survey questions were developed. The 41 required indicators can be collected through a mere 20 survey questions (11 content questions and nine common demographics questions) (Figure 2).

Overview of CIMS methodology

All grantees and external partners use the GL-provided methodological guidelines to collect the CIMS data using a Survey of Library Visitors and a Pop-Up Survey. These guidelines are provided to ensure that data can be aggregated and compared across countries and that field-accepted processes are employed. The guidelines also encourage proper sampling (and estimating, when necessary) to avoid excessive measurement costs and effort.

Question wording

Grantees are required to adopt the survey questions provided exactly as worded and then translate them into the appropriate local languages.

Translation procedure

The accurate translation of survey questions into local languages is integral to the success of CIMS. Therefore, grantees are required to use the following translation procedure:

  • CIMS questions are translated by a professional translator;

  • this translation is reviewed by the impact specialist for accuracy in the library context; and

  • impact specialists adjust language where necessary to enhance accuracy.

In consultation with external data collection partners, grantees determine:

  • the number of languages into which the survey should be translated (based on grantees’ understanding of the demographics in the regions where the survey will be administered); and

  • whether and how much piloting of the translated questions to conduct.

Data collection instruments

Survey of library visitors

To collect the CIMS data, there are two data collection instrument options: an electronic survey and a paper survey. Grantees determine the data collection instrument for the Survey of Library Visitors in consultation with their data collection partners. Regardless of the survey instrument chosen, grantees are required to have an external partner identify survey participants, provide participants with the survey, and have participants complete the survey alone (a sketch of one possible selection procedure follows this list). The external partner should:

  • use random sampling to identify the survey participants;

  • provide the survey participants with the survey by handing it to them in paper format or providing them with a computer or tablet and orienting participants to the survey on the screen; and

  • be available to answer any clarifying questions that arise as participants complete the survey.
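As a hedged illustration of the random selection step above, the sketch below uses systematic sampling (approach every k-th visitor after a random start); the interval logic and function names are our assumptions, not GL’s prescribed procedure:

```python
import random

def select_visitors(expected_visitors: int, target_sample: int):
    """Systematic sampling: pick every k-th visitor after a random start.

    expected_visitors: estimated footfall during the survey window.
    target_sample: number of completed surveys sought at this library.
    Returns the 1-based visitor positions to approach.
    """
    k = max(1, expected_visitors // target_sample)  # sampling interval
    start = random.randint(1, k)                    # random start avoids selection bias
    return list(range(start, expected_visitors + 1, k))

# Example: a library expecting 400 visitors in the survey period, seeking 50 responses.
positions = select_visitors(400, 50)
print(positions[:5], "...", len(positions), "visitors approached")
```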

“Pop-up”-style survey of library technology users

Grantees are also given guidelines about which survey questions can be collected via a “pop-up” survey that appears on libraries’ public access computers during users’ internet sessions. These questions are based on indicators such as: the number/proportion of public library internet users by gender; the number/proportion of public library internet users by age; and the number/proportion of library visitors using technology at the public library to access information related to each of the seven CIMS domain areas. Grantees are told that they should collect data for these questions only if they have technology to administer a “pop-up” survey. Grantees can also choose to use these surveys to collect other custom (non-CIMS) data related to their programs.
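The guidelines do not prescribe a particular triggering mechanism; as one illustrative sketch, a pop-up could be shown to a fixed fraction of internet sessions by hashing the session identifier (the 10 percent rate and the hashing mechanism are our assumptions, not part of the GL guidelines):

```python
import hashlib

POPUP_PERCENT = 10  # illustrative: survey roughly one in ten internet sessions

def maybe_show_popup(session_id: str) -> bool:
    """Decide whether this session sees the pop-up survey.

    Hashing the session id makes the decision deterministic, so the same
    session is never asked twice; rate and mechanism are assumptions.
    """
    digest = int(hashlib.sha256(session_id.encode()).hexdigest(), 16)
    return digest % 100 < POPUP_PERCENT

print(maybe_show_popup("session-0001"))
```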

Frequency of data collection

Grantees collect the CIMS data annually. They may schedule data collection to meet their needs and their reporting periods. One good practice is to align/integrate the CIMS data collection with existing plans to conduct impact assessment studies.

When a grant ends, GL considers providing post-grant support to help grantees or other institutions, such as governments or other partners, continue to collect the CIMS data. For example, in 2014 GL commissioned the research firm TNS Global to conduct a CIMS study in four countries where grant programs had ended: Bulgaria, Lithuania, Botswana, and Mexico. Conducting CIMS studies after grants have ended can provide insight into sustainability, offering powerful evidence that libraries are continuing to play an important role in their communities even after a large project ends.

Sampling

A great deal of time and effort went into creating sampling guidelines that achieve practical validity across countries by ensuring that survey responses reflect the diversity of library visitors. Grantees and external partners use the following sampling guidelines to select libraries where the surveys are administered and to determine the sample size for the surveys (Table II).
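While Table II contains the actual guidelines, sample-size targets of this kind are commonly derived from the standard formula for estimating a proportion; the sketch below shows the textbook calculation (Cochran’s formula), not necessarily the exact computation behind GL’s guidelines:

```python
import math

def sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> int:
    """Cochran's sample size for a proportion: n = z^2 * p * (1 - p) / e^2.

    z: z-score for the confidence level (1.96 ~ 95 percent)
    p: expected proportion (0.5 is the most conservative choice)
    e: desired margin of error
    """
    return math.ceil(z**2 * p * (1 - p) / e**2)

print(sample_size())        # 385 respondents for a ±5% margin at 95% confidence
print(sample_size(e=0.03))  # 1068 respondents for a ±3% margin
```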

The Data Atlas: using technology to provide access to comparative PMs and CIMS analytics

After grantees agreed to collect and report the same measures, using common definitions and methods, in 2014 GL worked with a data management and software development firm, Community Attributes, to build an innovative, dynamic online results reporting and visualization system called the Data Atlas.

This new prototype web site visualizes public library data from across the portfolio of Country Grants, giving the GL initiative and its grantees insight into dynamic results as they are reported. The Data Atlas features customizable dashboards and data visualization tools, which allow users to track and compare quantitative measures of library performance and impact, with additional reporting features and functionality currently in development.

By the end of 2015, the features that will be available in the Data Atlas include:

  • executive and portfolio-level comparative reports;

  • in-depth reports on user outcomes at the country level;

  • a map book of spatial visualization tools, including sub-country level mapping for select countries;

  • embedded qualitative items: users will have the ability to upload content such as videos and photos, tagged to specific indicators or locations;

  • a “Statistics Generator” tool which will allow users to generate key statistics based on data available in the system, including the ability to calculate impact based on a set of customizable parameters;

  • a “Storyboard Creator” that will allow users to knit together compelling stories using text, pictures, and videos combined with data-driven elements such as charts, graphs, and maps;

  • data import, export, and management tools by which users can edit their data in the system; and

  • social media sharing.

We hope that over time, the Atlas can become a truly global “library impact data hub,” a platform that individual libraries, National Libraries, library associations, and other library sector organizations use to upload data, compare results across common metrics, tell stories, develop and track the progress of strategies, conduct advocacy, and collaborate.

The Data Atlas, with PM and CIMS data from the GL portfolio, can be viewed at www.glatlas.org (Figure 3).

Illustrative CIMS findings so far

The following CIMS analysis was made possible by the Data Atlas. It is intended to be illustrative, rather than a complete summary of the findings.

Comparing the use of technology at libraries across countries

The information in Figure 4 below focusses on several key digital inclusion CIMS metrics, displaying data by country, as well as a weighted cumulative average for each metric.
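A weighted cumulative average of this kind can be computed by weighting each country’s percentage by its respondent count; in the sketch below, both the country data and the choice of respondent counts as weights are illustrative assumptions rather than the Data Atlas’s actual weighting scheme:

```python
# Hypothetical country-level results for one CIMS metric:
# (percent of library visitors reporting increased technology use, respondents)
results = {
    "Country A": (81.0, 1200),
    "Country B": (49.0, 950),
    "Country C": (57.0, 600),
}

total_respondents = sum(n for _, n in results.values())
weighted_avg = sum(pct * n for pct, n in results.values()) / total_respondents

print(f"Weighted cumulative average: {weighted_avg:.1f}%")  # 64.7% for this sample data
```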

Many important data points are displayed in this figure, such as the success story that 57 percent of library visitors across all of the countries report that their use of technology increased as a result of public library services. Other country-specific insights are immediately available: for example, Ukraine has the second highest percentage of library visitors whose use of technology increased as a result of library services (81 percent) and the highest number of library internet users for whom the library is their only access to free internet (64 percent), emphasizing the importance of the role of libraries as a means to increase digital literacy of visitors.

These data also reveal that libraries are no longer simply a holding area for books, but are providing the tools to help patrons produce their own online content: 35 percent of library internet users across all of the countries created online content. Examples such as Latvia, where 77 percent of the library internet users generate online content, show that libraries are becoming places where local content is being created, not just consumed or checked out.

Gender case studies

Data Atlas users can also examine patterns across various demographic and socio-economic attributes of library visitors, including age, gender, race, urbanicity, education level, employment status, income, and disability. Of particular interest is that across this cohort of grantee countries, library users are predominantly female, often by a large margin. On average 58 percent of the library visitor population is female across the 12 countries that have reported CIMS data thus far (see Figure 5).

As many countries look to close the international digital divide, it is difficult to imagine much progress in a country’s information economy without the participation of women. However, according to Orbicom, the International Network of UNESCO Chairs in Communications (Huyer et al., 2005), women’s participation in the information society, particularly in poor countries, lags significantly behind that of men. And according to the Global Impact Study of Public Access to ICTs, conducted by researchers from the University of Washington (Sey et al., 2013), women accounted for just 29 percent of unique visitors to public internet access points such as libraries, cyber cafes, and telecenters in the countries they studied (Brazil, Chile, Ghana, Bangladesh, and the Philippines). (In that study, libraries had the highest proportion of female visitors, at 47 percent of users, compared to 28 percent for cybercafés and 23 percent for telecenters).

Findings from the CIMS data available in the GL Data Atlas are encouraging and reveal that libraries are emerging as an important public access venue for closing the gender digital divide.

A deeper look at Indonesia

According to the Center for Science and Technology Development Studies at the Indonesian Institute of Sciences, only 35 percent of the internet users in Indonesia were female in 2010 (Hermawati and Saari, 2011). Interestingly, the GL Atlas reveals that something different and special is happening at public libraries, with survey results from participating Indonesian libraries showing that 62 percent of library internet users in Indonesia are female (see Figure 6).

While further validation is needed to determine the effects of sample weighting and other factors contributing to these statistics, these initial findings tell a powerful advocacy story that libraries appear to be a positive force for increasing women’s online participation around the world.

The Data Atlas can also reveal findings that can be useful for program design. For example, in contrast to a female majority of internet users in other countries in the GL cohort, the Data Atlas reveals the opposite gender gap in Botswana, with women representing 44 percent of library visitors and only 33 percent of library internet users (Figure 7). Insights like these can help raise awareness and inform the outreach and programming of grantees as well as national, regional, and local libraries.

Employment case study

One critical opportunity offered by public internet access at public libraries is providing access to employment services. The dashboard below displays information about how library users search and apply for job listings at libraries in GL cohort countries (Figure 8). On average, 12 percent of library internet users search for job listings, of which 8 percent go on to apply for a job. On average, 23 percent of these applicants report receiving a job offer after using the library to apply for a job, a powerful example of how libraries can contribute to strong employment and economic growth.
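Reading these chained percentages as a funnel makes the scale concrete; the worked example below simply applies the rates quoted above to a hypothetical base of 10,000 library internet users:

```python
internet_users = 10_000                # illustrative base population
searchers = internet_users * 0.12      # 12% search for job listings  -> 1,200
applicants = searchers * 0.08          # 8% of searchers apply        -> 96
offers = applicants * 0.23             # 23% of applicants get offers -> ~22

print(f"{searchers:.0f} search, {applicants:.0f} apply, {offers:.0f} receive job offers")
```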

These examples show a small portion of the data made available through the GL Data Atlas, revealing a glimpse of a rich and robust repository of research findings and powerful statistics for advocacy. The prototype web site remains in development, with the objective that it exist for years to come as a powerful tool to help analysts derive meaning from large library- and ICT-related data sets in support of sustaining library impact around the world.

How public library partners and allies can use CIMS

The application and utility of CIMS need not be limited to the universe of GL grantees. Public library partners, agencies, governments, aid organizations, advocates, and coalitions can use CIMS data to demonstrate the value of public libraries and learn more about engaging with libraries to meet community development goals. Potential applications for CIMS data include:

  • Building a case for public libraries as key implementation partners in development projects.

  • Implementing outcomes-based measurement in other library systems or community programs.

  • Choosing impact measures to incorporate into future GL grant applications. Even grantees that are not operating at a national scale can apply the framework to their work.

Adapting PMs and CIMS in resource-constrained environments: a case study from IREX’s “Beyond Access” project

With significant resources and funding from the Foundation, GL’s Country Grant programs have demonstrated the range of benefits that are achieved when libraries are modernized and better aligned with community needs and interests. But large investments at scale, often with a strong partner such as a National Library or a Ministry of Culture, are not always feasible. IREX’s Beyond Access initiative was conceived in 2011 as a model that would consolidate lessons from the Country Grants and opportunistically integrate that approach into work that governments and international development organizations were already doing. Beyond Access set out to build momentum for library-based collaborations with existing development initiatives.

Through advocacy and engagement activities, Beyond Access recognized that there is an opportunity to tie libraries to development quickly and sustainably. Since that time, and without country-based staff, Beyond Access has designed and implemented projects in several countries, including Myanmar, the Philippines, Peru, and Georgia.

Beyond Access projects do not typically support comprehensive evaluation resources in country. Each project operates as a coalition of local organizations including libraries, government agencies, and NGOs. As a result, the program relies on a local partner (usually a local NGO) to provide in-country support for both project implementation and evaluation. While this approach can increase sustainability prospects, these partners are unlikely to have dedicated evaluation staff, and many have limited experience in monitoring and evaluation. The most common funding resources for these partners are sub-awards under international funding mechanisms or through local sources. These funding streams are unlikely to include sustainable funding for evaluation.

As a result, when the PMs and CIMS were established for the Country Grants, Beyond Access was faced with a challenge: how could the comprehensive list of 63 PM and CIMS measures be distilled into a list of essential indicators that would not be burdensome for implementing partners to track? Could an approach be developed that would be useful beyond the life of grant funding? In the end, Beyond Access settled on 16 indicators (shown in Table III) that are tracked across all projects. These indicators were selected because Beyond Access believed they represented reasonable overlap with what libraries already tracked, while pointing toward new ways to articulate data in a way that would resonate with development audiences.

These core indicators draw heavily from the GL PMs (indicators 1-5) with additional indicators that capture the integral role of partnerships at the local, national, and global levels in the Beyond Access model (indicators 8-16).

The Beyond Access performance monitoring plan (PMP) incorporates a single indicator (indicator 7) that addresses the seven CIMS impact domains. This indicator serves as a meta-indicator to capture impact in a variety of development domains across all of the Beyond Access projects.

At the outset of each project, Beyond Access staff work with the implementing partners to identify project-level indicators and to train them on monitoring and reporting requirements. The primary goal during this process is to prioritize the data that libraries can easily collect and leverage for advocacy, so that data collection and use are sustainable beyond the life of the project and partners appreciate the value of these data. This starts by identifying data that are already collected in project libraries (e.g. number of library visits) and streamlining how that is reported to the implementing partner (usually via an online survey).

CIMS can then be incorporated at an individual project level. Beyond Access works with each project’s implementing partner to identify targets that align with advocacy goals. These targets are typically output indicators that align with CIMS development domains, such as the number of people that receive basic ICT training (Digital inclusion) or the number of people that receive workforce development training (Economic development). These data are supplemented by a user survey that is structured so that it can easily be administered by library staff. The survey questions can be derived from CIMS survey questions that align with the project’s target domains and advocacy objectives.

Limited resources and partner capacity are real constraints that Beyond Access grapples with, yet by simplifying the indicators and providing training and support to implementing partners at the outset, Beyond Access has designed an M&E approach that allows essential lessons to be identified and shared.

For more information about the Beyond Access initiative, see www.beyondaccess.net

Lessons learned: eight pieces of advice for funders or agencies interested in developing common indicator systems

Over the past few years, developing common indicator systems to track progress across projects has become increasingly in vogue among government agencies, philanthropies, and the corporate world. In our three years of work on CIMS, GL learned several things which we now offer as potential insights for others interested in embarking on similar efforts in their fields:

  1. When developing a set of common indicators that grantees around the world will measure, build in plenty of time to the work plan. Grantees will naturally be concerned about accountability, so the stakes are high. While a funder or agency could probably develop a list of common indicators alone in a room in a single afternoon, true engagement takes time! The benefit is that the system will be embraced by grantees, who will be eager to use and sustain it.

  2. Use existing grantees or partners to provide energy and insight to new ones. A couple of brand new grantees were skeptical about the need for a common measurement system; rather than have GL reinforce the new requirement in a “top-down” manner, veteran grantees continued to advocate for the system, explaining to new grantees why it would be valuable, swaying their opinion and building support.

  3. Grantees found it helpful that GL offered a set of criteria to help them prioritize and winnow down the long lists of indicators (e.g. “Would data on this indicator be useful for advocacy?”).

  4. Offer “optional” indicators (and/or some way for participating organizations to add their own custom categories/indicators) from the outset to encourage buy-in. Optional indicators act as a “pressure release valve” to lower the stakes that any single indicator be included in the final framework.

  5. When soliciting and summarizing feedback on potential indicators, use a “decision-making matrix,” a table containing all changes, additions, and subtractions from the system, and include documentation on the reasons why each change was made. This record of edits will ensure that everyone feels heard and can see why certain decisions were taken. It also makes it easy to refer to previous changes and remember why certain edits were made – particularly months later!

  6. Determining an agreed-upon common methodology ended up being half the work and the most contentious part of the process. However, it is also one of the most important aspects of a common measurement system, as without some agreed-upon methodological guidelines, the data on the common indicators will not be comparable and the work will have been wasted.

  7. Expect to make small tweaks to the system when designing and implementing it. A funder’s ability to be 100 percent consistent is limited by certain unique community contexts. It is important to “pick your battles,” knowing what you must stand firm on and where it is possible to be more flexible. For example, GL was flexible with some of the demographic categories; in some contexts, asking about ethnicity or membership in a “minority group” was not politically acceptable, so we waived the requirement that each grantee ask the exact same demographics questions.

  8. Grantees should be given interactive access to all of the data collected by the system. Extracting data from grantees is not sufficient – they must be given the ability to make use of their own data, at a minimum, but ideally they should be able to benchmark and compare themselves with their colleagues and peers who are doing similar work. Success should be measured not only by whether the data prove useful to the funder, but also to the grantees who have collected and provided it.

Conclusion

In the two years since GL introduced the CIMS, innovative libraries around the world – and the institutions that support them – have also begun conducting performance measurement in new ways, frequently emphasizing their contributions to improvements in users’ lives. Several new, similar outcome frameworks have emerged, often with a focus on digital literacy, employment-related skills, and school readiness/early learning. As GL winds down its strategy over the coming years, it is our hope that an ever-greater number of libraries embrace this practice, measuring whatever user outcomes are locally relevant. We believe this to be one of the most potent tools libraries can use as they seek sustainable funding amidst tough competition for scarce public resources. While seeing CIMS adopted and adapted would be auspicious, it is not the framework itself that we see as important, but the practice of user outcome measurement itself.


Figure 1. Topical breakdown of the required and optional CIMS indicators

Figure 2. The inclusive CIMS development process, March 2012-April 2013

Figure 3. An example of the type of visualization of the CIMS data made possible by the Data Atlas

Figure 4. Global Libraries Data Atlas report showing several key digital inclusion impact metrics, with weighted averages

Figure 5. Gender distribution of library users by country

Figure 6. Demographics of internet users at libraries in Indonesia

Figure 7. Demographics of internet users at libraries in Botswana

Figure 8. Key employment CIMS metrics by reporting country, with weighted averages

Table I. The seven CIMS categories defined

Table II. Sampling guidelines for CIMS

Table III. Beyond Access project performance monitoring plan (PMP)


Appendix 1. List of performance metrics

The following is a complete list of the Performance Metrics. Country Grant grantees are required to report on most of the metrics. Two metrics are optional.

There are some metrics for which it may be challenging for GL to aggregate data; methods across countries are simply too disparate. The priority in these instances is for each grantee to establish a consistent data collection methodology that can be used each year to compare data over time. Metrics that fall into this category are marked with an asterisk (*) in the list.

Complete list of performance metrics

Public library service points:

  1. total number of public library service points;

  2. total number of public library service points providing public access computing;

  3. total number of public library service points providing public access computing that are supported by the GL grant; and

  4. total number of public library service points providing public access computing that are supported by other sources*.

Computers and workstations:

  5. total number of workstations in which the computer was paid for by GL and the (one-time cost of) internet connection or upgrade was paid for by other sources*;

  6. total number of workstations paid for by GL;

  7. total number of computers paid for by GL that are not connected to the internet;

  8. total number of workstations paid for by all other funding sources*; and

  9. total number of workstations in which the computer was paid for by other sources and the (one-time cost of) internet connection or upgrade was paid for by GL*.

Use of workstations:

  10. metrics related to workstation use rate. Metrics 10a and 10b will be used in the reporting tool to calculate use rate:

    10a. total hours all workstations in the GL system are in use; and

    10b. total possible hours all workstations in the GL system could be in use.

  11. number of unique users of workstations in public libraries.

Visits:

  12. metrics related to physical visits to public libraries:

    12a. number of physical visits to all public libraries; and

    12b. number of physical visits to GL-supported libraries.

  13. total number of repeat visitors to public libraries; and

  14. number of virtual visits to public libraries in the GL system – Optional.

Spending:

  15. total amount of GL funding spent by the grantee;

  16. total amount of funding from non-GL sources spent on general library services*;

  17. total amount of funding from non-GL sources spent on public access computing*;

  18. metrics related to in-kind donations:

    18a. number of libraries that receive technology donations (e.g. hardware, software)*;

    18b. number of libraries that receive staff capacity donations (e.g. a person provides assistance to the library willingly and without pay)*; and

    18c. number of libraries that receive capital donations (e.g. buildings, infrastructure)*.

Training:

  19. total number of library staff members;

  20. metrics related to library staff training:

    20a. total number of library staff who receive formal training;

    20b. total number of library staff who receive formal training in technology (such as basic computer skills, internet skills, e-commerce), whether once or multiple times;

    20c. total number of library staff who receive formal training in advocacy, whether once or multiple times; and

    20d. other, please specify.

  21. metrics related to visitor training:

    21a. total number of individuals trained through formal trainings;

    21b. total incidences of formal training; and

    21c. total incidences of informal assistance.

Library activities:

  22. total number of loans of library materials; and

  23. state of training of public library workers (grantees use their existing methods to track this metric; at this time, data will not be aggregated for this metric) – Optional.

These PMs are used to track the progress of the Country Grant programs individually and collectively. GL recommends grantees time data collection to match either 12-month program periods or the collection of national public library statistics by other sources (usually this will align with the calendar year).

Appendix 2. List of required CIMS indicators

Digital inclusion:

  • no. of library visitors who learn basic computer skills (e.g. turning a computer on/off, using a mouse) as a result of public library services (e.g. internet, computers, training or assistance from library staff or outside experts);

  • no. of library visitors who learn intermediate computer skills (e.g. using Office products, conducting advanced searches online, using online services like e-banking, paying bills or purchasing goods online) as a result of public library services (e.g. internet, computers, training or assistance from library staff or outside experts);

  • no. of library visitors who learn general internet skills (e.g. navigating web sites, e-mail, online searches, browsing, Skype) as a result of public library services (e.g. internet, computers, training or assistance from library staff or outside experts);

  • no. of library visitors who first used the internet at the public library;

  • no. of library visitors who use internet at the public library;

  • no. of library visitors whose use of technology (e.g. computer, internet, WiFi, e-books) has increased as a result of public library services (e.g. internet, computers, training or assistance from library staff);

  • no. of library visitors who create online content (e.g. posting on a wall or comment board, blogging, updating an online profile, uploading photos, designing web sites, or web content) using technology at the public library (e.g. WiFi, computer, internet);

  • no. of library visitors who save money as a result of technology provided by the public library (e.g. by using WiFi or Skype and saving on technology and communication costs, by purchasing goods or completing government forms online and saving on travel costs or because prices are cheaper online);

  • no. of library visitors from marginalized groups whose use of technology (e.g. computer, internet, WiFi, e-books) has increased as a result of public library services (e.g. internet, computers, training, or assistance from library staff) (Each grantee defines “marginalized group” in a way relevant in local context);

  • no. of public library internet users, by gender and by age; and

  • no. of library visitors using technology at the public library (e.g. WiFi, computer, internet) to access information related to each domain area (i.e. culture and leisure, education, communication, economic development, health, government and governance).

Education:

  • no. of library visitors who use public library services (e.g. technology, physical space for meetings or study sessions, informal training or assistance by library staff or external experts) to participate in informal learning opportunities (e.g. free courses online or in-person, training sessions, study groups, or learning circles);

  • no. of students who use public library services (e.g. WiFi, computer, internet, physical space, tutoring program provided at library) to complete their homework;

  • no. of library visitors who read as a result of the technology at the public library (e.g. WiFi, e-books, internet – including use of Facebook or other social media sites);

  • no. of library visitors who are qualified to get a job as a result of educational or job-related training opportunities they accessed using public library services (e.g. online education opportunities/programs, training and assistance, workshops, study groups, or learning circles);

  • no. of students whose academic performance has improved as a result of public library services (e.g. WiFi, computers, internet, assistance, or training); and

  • no. of library visitors whose earnings have increased as a result of educational opportunities (e.g. free courses, training, online courses, postsecondary programs) they accessed using public library services (e.g. WiFi, computer, internet, physical space, training, or assistance).

Health:

  • no. of library visitors who find health information that meets their needs (e.g. related to prevention, treatment, health providers) as a result of public library services (e.g. WiFi, computer, internet, training or assistance from library staff or outside expert);

  • no. of library visitors who seek health information for others using public library services (e.g. WiFi, computer, internet, training or assistance from library staff or outside expert);

  • no. of library visitors whose health decisions were informed by the health information they found using public library services (e.g. WiFi, computer, internet, training or assistance from library staff or outside expert); and

  • no. of library visitors whose health improved as a result of the health information they found using public library services (e.g. WiFi, computer, internet, training or assistance from library staff or outside expert).

Communication:

  • no. of library visitors who e-mail with family and friends using technology at the public library (e.g. WiFi, computer, internet, Facebook e-mail);

  • no. of library visitors who communicate with family and friends through Skype, instant messaging, Facebook, or other online tools (excluding e-mail) using technology at the public library (e.g. WiFi, computer, internet); and

  • no. of library visitors who communicate more with family and friends as a result of technology at the public library (e.g. WiFi, computer, internet).

Culture and leisure:

  • no. of library visitors who are aware of community or civic activities as a result of technology provided at the public library (e.g. WiFi, computer, internet, Facebook);

  • no. of library visitors who use technology at the public library (e.g. WiFi, computer, internet, Facebook) to learn about the news; and

  • no. of library visitors who are involved in their community as a result of the services provided at the public library (e.g. technology, workshops, or events held at the library).

Economic development:

  • no. of library visitors who use technology at the public library (e.g. WiFi, computer, internet, Skype, Facebook) for business communications;

  • no. of library visitors who search for agricultural information (e.g. farming equipment or techniques, crop prices, weather information) using technology at the public library (e.g. WiFi, computer, internet, Facebook);

  • no. of library visitors who buy a product or service using technology at the public library (e.g. WiFi, computer, internet);

  • no. of library visitors who sell a product or service using technology at the public library (e.g. WiFi, computer, internet);

  • no. of library visitors who use services at the public library (e.g. WiFi, computer, internet, training or assistance from library staff or outside experts) to write a resume or CV;

  • no. of library visitors who use services at the public library (e.g. WiFi, computer, internet, training or assistance from library staff or outside experts) to find job listings or employment opportunities;

  • no. of library visitors who use services at the public library (e.g. WiFi, computer, internet, training or assistance from library staff or outside experts) to apply for a job; and

  • no. of library visitors who receive a job offer after using public library services (e.g. WiFi, computer, internet, training or assistance from library staff or outside experts) to apply for a job.

Government and governance:

  • no. of library visitors who search for government information (e.g. laws or regulations, descriptions of government programs and services, forms, government jobs) using technology at the public library (e.g. WiFi, computer, internet, Facebook);

  • no. of library visitors who use a government service (e.g. download/fill out/submit forms, pay taxes, request documents/licenses) through technology at the public library (e.g. WiFi, computer, internet, Facebook);

  • no. of library visitors who participate in governance processes (e.g. research politicians or citizens’ rights, interact with public authorities or elected officials, learn how to volunteer for political events, participate in political movements) using technology at the public library (e.g. WiFi, computer, internet, Facebook);

  • no. of library visitors who save time by accessing a government service using technology at the public library (e.g. WiFi, computer, internet); and

  • no. of library visitors who receive money/subsidies/support owed to them by the government as a result of their ability to access government services using technology at the public library (e.g. WiFi, computer, internet).

Corresponding author

Jeremy Paley can be contacted at: jeremy.paley@gatesfoundation.org

References

Hermawati, W. and Saari, R. (2011), National Assessment on Gender Equality and the Knowledge Society, Centre for Science and Technology Development Studies, Indonesian Institute of Sciences.

Huyer, S., Hafkin, N., Ertl, H. and Dryburgh, H. (2005), “Women in the information society”, in Sciadas, G. (Ed.), From the Digital Divide to Digital Opportunities: Measuring Infostates for Development, Orbicom/NRC Press, Montreal, pp. 135-196.

ISO 11620 (2008), “Library performance indicators”, available at: www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=37853 (accessed March 21, 2015).

ISO 2789 (2006), “International library statistics”, available at: www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=39181 (accessed March 21, 2015).

Sey, A., Coward, C., Bar, F., Sciadas, G., Rothschild, C. and Koepke, L. (2013), Connecting People for Development: Why Public Access ICTs Matter, Technology & Social Change Group, University of Washington Information School, Seattle, WA.

Further reading

Al, U., Andrade Blanco, P., Chiranov, M., Cruz, L., Dewata, Y., Dryžaitė, I., Farquharson, F., Kochanowicz, M., Liubyva, T., Naranjo, A.L., Ralebipi-Simelane, R., Soydal, I., Streatfield, D., Taolo, R. and Tkachuk, Y. (2015), “Paper 1: global libraries impact planning and assessment progress”, Performance Measurement and Metrics, Vol. 16 No. 2.

Jacobs, D. (2015), “Guest editorial: impact planning and assessment of the global libraries initiative of the Bill & Melinda Gates Foundation”, Performance Measurement and Metrics, Vol. 16 No. 2, pp. 1-4.

Streatfield, D., Andrade Blanco, P., Chiranov, M., Dryžaitė, I., Kochanowicz, M., Liubyva, T. and Tkachuk, Y. (2015), “Paper 4: innovative impact planning and assessment through global libraries”, Performance Measurement and Metrics, Vol. 16 No. 2.

Acknowledgements


Funded by the Bill and Melinda Gates Foundation.
