The evolution of Global Libraries' performance measurement and impact assessment systems

Purpose – The purpose of this paper is to describe the evolution of a common approach to impact assessment across the Global Libraries (GL) portfolio of grants. It presents an overview of two systems, the Performance Metrics (PMs) and the Common Impact Measurement System (CIMS). By providing a standard set of definitions and methods for use across countries, these systems enable grantees to collect data that can be compared and aggregated for the purpose of collective learning, improvement, accountability, and advocacy.
Design/methodology/approach – The PMs offer a standard methodology to collect library project performance management data, whereas the CIMS is a standard survey of public library users. The paper describes how the PM and CIMS data are being visualized and used, with examples of findings and lessons learned.
Findings – The paper cites examples of the type of PM and CIMS data available, with a focus on employment, gender, and case studies from Botswana and Indonesia. These highlights illustrate how libraries' user demographics differ from other types of public internet access venues and how libraries can contribute to strong employment and growth.


Introduction
The Global Libraries (GL) initiative of the Bill & Melinda Gates Foundation provides access to information through technology in public libraries across entire countries. GL's multi-year grants (or "country grants") fund efforts to understand local information needs, purchase equipment for libraries that can help meet those needs, train library staff, and help libraries build public support for long-term funding. (For more information about the GL initiative and its goals, see "Guest Editorial: Impact Planning and Assessment of the Global Libraries initiative of the Bill & Melinda Gates Foundation," which begins this special issue of Performance Measurement and Metrics.) Each Country Grant project includes staff and financial resources for a body of work known as "Impact Planning and Assessment" (IPA). The core principle of IPA is the importance of understanding local needs, designing services to meet those needs, and measuring progress for the purposes of learning, improvement, accountability, and advocacy. GL grantees design and implement programs tailored to local environments by targeting individuals' and communities' needs as well as local governments' priorities. By addressing local funders' particular priorities and promoting libraries' impact in these domains, library development programs are demonstrating the important contributions that they make and are attaining increased, sustainable funding (see "Paper 1: Global Libraries Impact Planning and Assessment Progress" for more information about the GL approach to IPA).
In theory, the IPA approach means library programs should focus on measuring only what is locally relevant. For several years, this is what grantees did, but eventually the lack of standardization created problems for both advocacy and management of the portfolio of similar grants. Without standard measures across projects, GL could not communicate about aggregate achievements to an internal Foundation audience or to the external public library field. Advocacy and communication efforts were further limited by GL's inability to situate any single grantee's work within an international context. Several grantees measured similar concepts, such as public libraries' contribution toward reducing unemployment, but did so in dissimilar ways, making comparison impossible.
This paper describes an evolution from a collection of sui generis measurement schemas toward a common approach to measurement across the GL portfolio of grants. It presents an overview of two such systems, the Performance Metrics (PMs) (introduced in 2009 and refined in 2013) and the Common Impact Measurement System (CIMS) (introduced in 2013). Each of these systems represents a distinct link in GL's theory of change: GL and its grantees aim to increase access to information (measured by the PMs) to improve people's lives (measured by the CIMS).
The paper then describes how the PM and CIMS data are being visualized and used, with a summary of some of the type of interesting, useful findings that have emerged so far. Recognizing that GL programs are unique, and that few library projects have the resources to track such a comprehensive set of indicators, a case study is presented about how smaller initiatives can take the theory behind CIMS, borrow the key principles and a subset of indicators, and adapt the approach to their own needs. The paper concludes with lessons learned from the experience of designing these common measurement systems.
It is important to note that despite the move toward standardization over the past three years, other data points remain important to grantees. GL still encourages grantees to collect information that is relevant to their local efforts and priorities. For example, grantees often collect data about the capacity of library staff, the strength of library partnerships, the efficacy of the policy environment, and people's perceptions about libraries, none of which are covered by the standard measurement systems introduced by GL.
Aggregating achievements: the evolving GL PMs
Purpose and background
The GL PMs are a set of indicators that measure the achievements and monitor the progress of GL's Country Grant programs. (The PMs closely follow the international standards ISO 2789 and ISO 11620 (2008) [1].) Released in 2009, the initial version of the Global Libraries Guide to Impact Planning and Assessment contained a set of required and recommended PMs perceived as mainstream in the library field and desirable to managers, funders, and policy makers. In 2012-2013, after finding it difficult to aggregate and compare the data that had begun flowing in from several countries, GL consulted with grantee impact specialists to refine the metrics and to ensure that all grantees collect data in standard, comparable ways, using identical definitions and methods.
By providing a standard set of definitions and methods for use across countries, the PMs enable grantees to collect data that can be compared and aggregated for the purpose of collective learning, improvement, accountability, and advocacy. Today, all GL Country Grantees use these metrics.

PMs components
The PMs are grouped according to the following key categories:
• Public library service points metrics tell us how many libraries the GL program reaches:
- Public library service points include the total number of public library service points providing public access computing. A public library service point is any library facility, fixed or mobile, through which the public library provides a service to the general public. Central libraries, branch libraries, and mobile libraries are each counted as individual service points.
- Public access computing means making at least one workstation available to the public, regardless of whether access is free.

• Computers and workstations metrics tell us the amount of new technology available as a result of the GL program. A workstation is a computer connected to the internet.
• Use of workstations metrics tell us whether library visitors are using the new technology the GL program provided.
• Visits to public libraries metrics tell us whether library use changes over time, particularly after the GL program provides new technology.
• Spending metrics tell us whether public investment in the country's libraries changes over time.
• Training metrics tell us how many library staff and users receive training during the GL program.
• Library activities metrics tell us what library visitors are doing.
The following components accompany each metric:
• A specific definition for the metric.
• How to count: the sources, frequencies, and methods for data collection. These guidelines encourage proper sampling and estimating to avoid excessive measurement costs and effort.
• Required/Optional: if a metric is required, a grantee must collect the data. If the metric is optional, grantees may choose whether to collect this data.
• Target population: for some metrics, data should be collected from GL-supported libraries only. For other metrics, GL requests that data be collected (or estimated) for all libraries in the country.
See Appendix 1 for a complete list of the PMs.
From shared outputs to shared impact: the GL initiative's CIMS
As described above, all GL Country Grantees collect data to track and inform their work. For several years they have used the PMs to measure progress as they increase public access to information through technology and training. While the PMs have allowed these grantees to measure the growth of their technology and services, this is just one part of the picture. In order to continuously improve their services, evaluate the global scope of their impact, seek new types of partnerships, and advocate for more government support, Country Grantees need a standardized way to demonstrate how providing access to technology in libraries improves people's lives. In turn, to advocate effectively for funding that sustains access to information through technology, individual libraries must also be able to communicate the benefits they provide to individuals and communities.

The purpose
Today, the GL initiative seeks to equip public libraries with evidence of their ability to drive development, not just through traditional performance indicators but through measurable results like job skills developed, education attained, employment found, money saved, and livelihoods improved. GL Country Grantees now employ a CIMS to quantify their work's impact on public library users.

Data collected through CIMS enables public libraries to shift their focus from the services they provide to the outcomes they help individuals and communities realize. By agreeing to report the same measures using standardized definitions and methods, the GL initiative and Country Grantees are able to:
• aggregate data to determine the total impact of GL Country Grantees and enhance their ability to advocate for the importance of public libraries;
• track data over time to identify and monitor trends in public library use and reach, and incorporate this information into Country Grant programs and library services;
• compare data across countries to allow grantees to learn from one another's successes and challenges;
• refer to a central, definitive source in communications and advocacy activities, so there is no confusion about where the numbers come from or how they are calculated; and
• leverage a new online reporting system, the Data Atlas, to visualize public library data, giving the GL initiative and Country Grantees insight into dynamic results as they are reported.
The CIMS framework is a shared set of indicators that GL Country Grantees measure to understand their individual and collective impact on the lives of library users. These indicators, and the positive outcomes libraries hope to realize, span seven common issue areas (Table I).
See Appendix 2 for the complete list of required CIMS indicators across these seven categories.
Table I. The seven CIMS issue areas
Digital inclusion: People use public library services to access technology, build technology-related skills and confidence, and make beneficial use of digital content and services that meet their needs. Public libraries are a place where library staff, volunteers, and visitors can help individuals or groups become more digitally included.
Culture and leisure: People use public library services to enrich their lives, preserve or promote their cultural heritage, and enjoy recreational or leisure activities. Public libraries are social hubs and catalysts for community engagement.
Education: People use public library services to gain and impart knowledge and skills, improve their academic performance, acquire job-related skills and qualifications, and engage in lifelong learning.
Communication: People use public library services to communicate and connect with others, and enhance their sense of inclusion and community.
Economic development: People use public library services to identify employment opportunities, increase their income and productivity, and improve their livelihoods.
Health: People use public library services to inform health-related decisions and improve their own or others' mental and physical health.
Government and governance: People use public library services to access government information and services, engage in civic activities, and interact with government officials.

Guiding principles behind CIMS
CIMS helps demonstrate progress toward a set of desired outcomes. For each of the seven issue areas, the GL initiative has identified a set of outcomes, or positive changes in the lives of individuals and communities, that libraries can help achieve. CIMS provides the indicators that help grantees collect data about whether these outcomes are occurring. Before the indicators were developed, a set of guiding principles was established to govern how the new measurement system would operate:
(1) CIMS was designed with practical validity in mind, seeking a balance between what would be ideal to know and what is feasible to measure: each Country Grantee must feel comfortable that the data they collect paint an accurate picture of public library users in their country. To make the system manageable for Country Grantees with different levels of research experience and capacity, GL decided to work closely with Country Grantees and evaluation experts to develop the system.
(2) CIMS emphasizes contribution, not attribution: CIMS is designed to help GL Country Grantees understand whether the efforts of grantees are one of the causes of improvements in the lives of library users, not whether (or how much) these efforts are directly or solely responsible. In practice, this means GL does not require that grantees randomly assign their intervention to libraries and conduct experiments using control groups. In the contexts in which grantees work, choosing a random sample of libraries with which to work is often impossible (or extremely difficult). This sacrifice in the level of rigor is balanced by virtues such as practicality and grantee endorsement. Some grantees have chosen to compare data with findings from control groups, but this is not a requirement.
(3) The CIMS framework includes both required and optional indicators: CIMS includes 41 required indicators for which all Country Grantees gather data (see Appendix 2 for a full list), and 53 optional indicators that Country Grantees may use if relevant to their program's focus area. The GL initiative also encourages Country Grantees to collect additional, custom data to demonstrate the impact of their programs in areas of interest to specific stakeholders.
(4) CIMS was designed to be manageable: the 41 required indicators translate into only 20 survey questions, a fraction of the size of surveys that Country Grantees already conduct. Grantees determine the format of the survey (paper or electronic), and the GL initiative recommends that grantees choose an external organization (like a polling firm or university) to collect the CIMS data (Figure 1).

The collaborative process used to develop CIMS
The CIMS outcomes, indicators, and methodology were designed in collaboration with the GL initiative's staff, evaluation experts, and several members of each Country Grant team, including program directors and advocacy specialists; in all, over 50 people shaped the framework over a year-long process. The GL initiative used this inclusive process for four reasons:
(1) to promote a sense of shared ownership and familiarity with CIMS among Country Grantees;
(2) to ensure that the indicators are truly relevant in the context of Country Grants;
(3) to ensure that the methodology is practical and not a burden on Country Grantees; and
(4) to leverage Country Grantee expertise in research, planning, evaluation, and indicator development.
Impact specialists, the Country Grant team members responsible for evaluation of projects and processes, were most deeply engaged in the design process. These grantees provided critical input on every step of the design of CIMS, from indicator selection to methodology guideline development.
To design the indicator system, facilitators used several types of processes to collect feedback. To generate long lists of potential indicators, the primary tools were:
• direct engagement: brainstorming exercises with grantee Impact Assessment Specialists and Advocacy Specialists at in-person meetings;
• individual phone calls with Impact Assessment Specialists; and
• group Skype calls with a cross-country pool of Impact Assessment Specialists.
After this initial set of exercises to brainstorm potential indicators for inclusion, facilitators led the participants through a set of activities designed to winnow down the list of 265 potential indicators into a manageable set that would not over-burden grantees. Techniques included:
• An online asynchronous voting exercise with all Impact Assessment Specialists, Advocacy Specialists, and Program Directors. Respondents were asked to vote on whether each indicator should be required, optional, or removed from the system entirely. During this phase Impact Assessment Specialists were encouraged to seek input from Country Grant team members, and to report back each group's consensus votes. Each country provided consolidated results from multiple members of the country team.
• An online asynchronous voting exercise with GL staff and leadership, including 15 external stakeholders/consultants (e.g. from the Information School at the University of Washington).
• Facilitators color-coded the indicator votes and synthesized comments to help the GL team hone and prioritize the indicators.
Because 48 percent of the votes were to make a proposed indicator "required," GL worked to narrow the list down to a more manageable set of 41 required and 53 optional indicators.
Following a process of indicator refinement, survey questions were developed. The 41 required indicators can be collected through a mere 20 survey questions (11 content questions and nine common demographics questions) (Figure 2).

Overview of CIMS methodology
All grantees and external partners use the GL-provided methodological guidelines to collect the CIMS data using a Survey of Library Visitors and a Pop-Up Survey. These guidelines are provided to ensure that data can be aggregated and compared across countries and that field-accepted processes are employed. The guidelines also encourage proper sampling (and estimating, when necessary) to avoid excessive measurement costs and effort.

Question wording
Grantees are required to adopt the survey questions exactly as worded and then translate them into the appropriate local languages.

Translation procedure
The accurate translation of survey questions into local languages is integral to the success of CIMS. Therefore, grantees are required to use the following translation procedure:
• CIMS questions are translated by a professional translator;
• this translation is reviewed by the impact specialist for accuracy in the library context; and
• impact specialists adjust language where necessary to enhance accuracy.
In consultation with external data collection partners, grantees determine:
• the number of languages into which the survey should be translated (based on grantees' understanding of the demographics in the regions where the survey will be administered); and
• whether and how much piloting of the translated questions to conduct.
Data collection instruments
Survey of library visitors. To collect the CIMS data, there are two data collection instrument options: an electronic survey and a paper survey. Grantees determine the data collection instrument for the Survey of Library Visitors in consultation with their data collection partners. Regardless of the instrument chosen, grantees are required to have an external partner identify survey participants, provide participants with the survey, and have participants complete the survey alone. The external partner should:
• use random sampling to identify the survey participants;
• provide the survey participants with the survey by handing it to them in paper format or providing them with a computer or tablet and orienting them to the survey on the screen; and
• be available to answer any clarifying questions that arise as participants complete the survey.
"Pop-up"-style survey of library technology users. Grantees are also given guidelines about which survey questions can be collected via a "pop-up" survey that appears on libraries' public access computers during users' internet sessions. These questions are based on indicators such as: the number/proportion of public library internet users by gender; the number/proportion of public library internet users by age; and the number/proportion of library visitors using technology at the public library to access information related to each of the seven CIMS domain areas. Grantees are told that they should collect data for these questions only if they have technology to administer a "pop-up" survey. Grantees can also choose to use these surveys to collect other custom (non-CIMS) data related to their programs.

Frequency of data collection
Grantees collect the CIMS data annually. They may schedule data collection to meet their needs and their reporting periods. One good practice is to align/integrate the CIMS data collection with existing plans to conduct impact assessment studies.

When a grant ends, GL considers providing post-grant support to help grantees or other institutions, such as governments or other partners, continue to collect the CIMS data. For example, in 2014 GL commissioned the research firm TNS Global to conduct a CIMS study in four countries where grant programs had ended: Bulgaria, Lithuania, Botswana, and Mexico. Conducting CIMS studies after grants have ended can provide insight into sustainability, offering powerful evidence that libraries are continuing to play an important role in their communities even after a large project ends.

Sampling
A great deal of time and effort went into creating sampling guidelines that achieve practical validity across countries by ensuring that survey responses reflect the diversity of library visitors. Grantees and external partners use the following sampling guidelines to select libraries where the surveys are administered and to determine the sample size for the surveys (Table II).
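As context for the minimum of 400 Survey of Library Visitors responses specified in Table II, a standard margin-of-error calculation (our illustration, not part of the GL guidelines, and assuming simple random sampling of a large population with the most conservative proportion, p = 0.5) shows why a sample of roughly this size is a common survey target:

\[
\mathrm{MOE} = z\sqrt{\frac{p(1-p)}{n}} \approx 1.96\sqrt{\frac{0.5 \times 0.5}{400}} \approx 0.049
\]

That is, proportions estimated from 400 responses carry roughly a ±5 percentage-point margin of error at the 95 percent confidence level under these assumptions.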
The Data Atlas: using technology to provide access to comparative PMs and CIMS analytics
After grantees agreed to collect and report the same measures, using common definitions and methods, in 2014 GL worked with a data management and software development firm, Community Attributes, to build an innovative, dynamic online results reporting and visualization system called the Data Atlas.
This new prototype web site visualizes public library data from across the portfolio of Country Grants, giving the GL initiative and its grantees insight into dynamic results as they are reported. The Data Atlas features customizable dashboards and data visualization tools, which allow users to track and compare quantitative measures of library performance and impact, with additional reporting features and functionality currently in development.
By the end of 2015, the features available in the Data Atlas will include:
• executive and portfolio-level comparative reports;
• in-depth reports on user outcomes at the country level;
• a map book of spatial visualization tools, including sub-country level mapping for select countries;
• embedded qualitative items: users will have the ability to upload content such as videos and photos, tagged to specific indicators or locations;
• a "Statistics Generator" tool, which will allow users to generate key statistics based on data available in the system, including the ability to calculate impact based on a set of customizable parameters;
• a "Storyboard Creator" that will allow users to knit together compelling stories using text, pictures, and videos combined with data-driven elements such as charts, graphs, and maps;
• data import, export, and management tools by which users can edit their data in the system; and
• social media sharing.
We hope that over time, the Atlas can become a truly global "library impact data hub," a platform that individual libraries, National Libraries, library associations, and other library sector organizations use to upload data, compare results across common metrics, tell stories, develop and track the progress of strategies, conduct advocacy, and collaborate. The Data Atlas, with PM and CIMS data from the GL portfolio, can be viewed at www.glatlas.org (Figure 3).

Illustrative CIMS findings so far
The following CIMS analysis was made possible by the Data Atlas. It is intended to be illustrative, rather than a complete summary of the findings.
Table II. CIMS sampling guidelines

Guidelines related to establishing a sample of libraries in which to administer the survey
Eligible libraries: GL requires that the Survey of Library Visitors and the Pop-Up Survey be administered in libraries that receive any type of GL support, such as funding, hardware, software, internet connectivity, training, or other types of support. Grantees may also administer these surveys in non-GL-supported libraries, perhaps to understand impact across all libraries in the country or to make comparisons between the impacts achieved in GL-supported vs non-GL-supported libraries.
Sample size, geography, and library size: There is no minimum number of libraries in which to administer the Survey of Library Visitors and Pop-Up Survey. Grantees choose a sample of libraries that represents the diversity of GL-supported libraries in their country. For example, if libraries differ by geographic location, urbanicity, and size, grantees include a mix of libraries that is representative of different geographic locations and different sizes.

Guidelines related to establishing a sample of survey responses
Sample size: GL requires a minimum sample size of 400 responses in the Survey of Library Visitors for questions related to library visitors' use of technology. Since technology users are a subset of the overall population of library visitor respondents, achieving the minimum sample size for these questions also achieves a sufficient sample size for the other survey questions.
Randomization: Researchers are required to use a random sampling approach. They may use this approach until they achieve the minimum sample size for the Survey of Library Visitors questions related to technology. If technology users are underrepresented after an initial round of surveys administered using random sampling, a booster sample may be used to increase their numbers.
Seasonality: Grantees gather CIMS data annually. When possible, grantees administer the survey during seasons likely to yield a representative sample of library visitors. For example, if library use changes during the summer months compared to the rest of the year, then GL recommends avoiding collecting the CIMS data during the summer months.
Days of the week and time of day: GL requires that grantees consider the different types of visitors likely to use the library on different days of the week or at different times of day. For example, unemployed visitors might be more likely than employed visitors to use the library on weekdays. Or younger visitors might use the library more frequently after school, while older visitors might use it in the morning. Grantees ensure that the researchers administer the survey on a variety of days of the week and at various times so that data reflect a diversity of perspectives.
Library visitor age: GL recommends that grantees and external partners survey library visitors who are approximately 13 years and older.

Comparing the use of technology at libraries across countries
The information in Figure 4 focusses on several key digital inclusion CIMS metrics, displaying data by country, as well as a weighted cumulative average for each metric. Many important data points are displayed in this figure, such as the success story that 57 percent of library visitors across all of the countries report that their use of technology increased as a result of public library services.
Other country-specific insights are immediately available: for example, Ukraine has the second highest percentage of library visitors whose use of technology increased as a result of library services (81 percent) and the highest proportion of library internet users for whom the library is their only access to free internet (64 percent), emphasizing the role of libraries as a means to increase visitors' digital literacy.
These data also reveal that libraries are no longer simply a holding area for books, but are providing the tools to help patrons produce their own online content: 35 percent of library internet users across all of the countries created online content. Examples such as Latvia, where 77 percent of the library internet users generate online content, show that libraries are becoming places where local content is being created, not just consumed or checked out.
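For readers who want to reproduce figures like the weighted cumulative averages discussed above, the following sketch shows one plausible calculation in Python. It assumes that each country's percentage is weighted by its number of survey respondents; the actual weighting scheme used in the Data Atlas may differ, and the country values and sample sizes shown are hypothetical.

```python
# Sketch of a respondent-weighted cumulative average across countries.
# The weighting scheme and the sample figures below are illustrative
# assumptions, not the Data Atlas's actual methodology or data.
def weighted_average(percentages, respondents):
    """Return the average of country percentages, weighted by sample size."""
    total = sum(respondents)
    return sum(p * n for p, n in zip(percentages, respondents)) / total

# Hypothetical example: three countries reporting 57%, 81%, and 40%
# with samples of 400, 1,200, and 600 respondents respectively.
print(round(weighted_average([57, 81, 40], [400, 1200, 600]), 1))  # 65.5
```

Weighting by respondents means that larger surveys pull the cumulative average toward their values, which is why the portfolio-wide figure need not equal the simple mean of the country percentages.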
Gender case studies
Data Atlas users can also examine patterns across various demographic and socio-economic attributes of library visitors, including age, gender, race, urbanicity, education level, employment status, income, and disability. Of particular interest is that across this cohort of grantee countries, library users are predominantly female, often by a large margin. On average, 58 percent of the library visitor population is female across the 12 countries that have reported CIMS data thus far (see Figure 5).
As many countries look to close the international digital divide, it is difficult to imagine much progress in a country's information economy without the participation of women. However, according to Orbicom, the International Network of UNESCO Chairs in Communications (Huyer et al., 2005), women's participation in the information society, particularly in poor countries, lags significantly behind that of men. And according to the Global Impact Study of Public Access to ICTs, conducted by researchers from the University of Washington (Sey et al., 2013), women accounted for just 29 percent of unique visitors to public internet access points such as libraries, cyber cafes, and telecenters in the countries they studied (Brazil, Chile, Ghana, Bangladesh, and the Philippines). (In that study, libraries had the highest proportion of female visitors, at 47 percent of users, compared to 28 percent for cybercafés and 23 percent for telecenters).
Findings from the CIMS data available in the GL Data Atlas are encouraging and reveal that libraries are emerging as an important public access venue for closing the gender digital divide.

A deeper look at Indonesia
According to the Center for Science and Technology Development Studies at the Indonesian Institute of Sciences, only 35 percent of the internet users in Indonesia were female in 2010 (Hermawati and Saari, 2011). Interestingly, the GL Atlas reveals that something different and special is happening at public libraries, with survey results from participating Indonesian libraries showing that 62 percent of library internet users in Indonesia are female (see Figure 6).

While further validation is needed to determine the effects of sample weighting and other factors contributing to these statistics, these initial findings tell a powerful advocacy story: libraries appear to be a positive force for increasing women's online participation around the world.
The Data Atlas can also reveal findings that can be useful for program design. For example, in contrast to a female majority of internet users in other countries in the GL cohort, the Data Atlas reveals the opposite gender gap in Botswana, with women representing 44 percent of library visitors and only 33 percent of library internet users (Figure 7). Insights like these can help raise awareness and inform the outreach and programming of grantees, as well as national, regional, and local libraries.
Employment case study
One critical opportunity offered by public internet access at public libraries is access to employment services. The dashboard in Figure 8 displays information about how library users search and apply for job listings at libraries in GL cohort countries. On average, 12 percent of library internet users search for job listings, of which 8 percent go on to apply for a job. On average, 23 percent of these applicants report receiving a job offer after using the library to apply for a job, a powerful example of how libraries can contribute to strong employment and economic growth.
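To make this funnel concrete, the short sketch below works through the averages reported above for a hypothetical base of 10,000 library internet users. It reads "of which 8 percent" as 8 percent of the searchers; both that reading and the base figure are our illustrative assumptions rather than figures stated by GL.

```python
# Hypothetical employment funnel using the averages reported above.
# The base of 10,000 users and the reading of "of which 8 percent" as
# 8 percent of searchers are illustrative assumptions.
internet_users = 10_000
searchers = internet_users * 0.12   # 12% search for job listings -> 1,200
applicants = searchers * 0.08       # 8% of searchers apply        -> 96
offers = applicants * 0.23          # 23% of applicants get offers -> ~22
print(int(searchers), int(applicants), round(offers))  # 1200 96 22
```

Under these assumptions, roughly 22 of every 10,000 library internet users would report a job offer traceable to an application made at the library.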
These examples show a small portion of the data made available through the GL Data Atlas, revealing a glimpse of a rich and robust repository of research findings and powerful statistics for advocacy. The prototype web site remains in development, with the objective that it will exist for years to come as a powerful tool to help analysts derive meaning from large library- and ICT-related data sets in support of sustaining library impact around the world.

How public library partners and allies can use CIMS
The application and utility of CIMS need not be limited to the universe of GL grantees. Public library partners, agencies, governments, aid organizations, and advocates can adapt the approach as well. One example is the Beyond Access initiative, conceived in 2011 as a model that would consolidate lessons from the Country Grants and opportunistically integrate that approach into work that governments and international development organizations were already doing. Beyond Access set out to build momentum for library-based collaborations with existing development initiatives.
Through advocacy and engagement activities, Beyond Access recognized that there is an opportunity to tie libraries to development quickly and sustainably. Since that time, and without country-based staff, Beyond Access has designed and implemented projects in several countries, including Myanmar, the Philippines, Peru, and Georgia.
Beyond Access projects do not typically support comprehensive evaluation resources in country. Each project operates as a coalition of local organizations including libraries, government agencies, and NGOs. As a result, the program relies on a local partner (usually a local NGO) to provide in-country support for both project implementation and evaluation. While this approach can increase sustainability prospects, these partners are unlikely to have dedicated evaluation staff, and many have limited experience in monitoring and evaluation. The most common funding sources for these partners are sub-awards under international funding mechanisms or local sources, and these funding streams are unlikely to include sustainable funding for evaluation.
As a result, when the PMs and CIMS were established for the Country Grants, Beyond Access was faced with a challenge: how could the comprehensive list of 63 PM and CIMS measures be distilled into a list of essential indicators that would not be burdensome for implementing partners to track? Could an approach be developed that would be useful beyond the life of grant funding? In the end, Beyond Access settled on 16 indicators (shown in Table III) that are tracked across all projects. These indicators were selected because Beyond Access believed they represented reasonable overlap with what libraries already tracked, while pointing toward new ways to articulate data in a way that would resonate with development audiences.
These core indicators draw heavily from the GL PMs (indicators 1-5), with additional indicators that capture the integral role of partnerships at the local, national, and global levels in the Beyond Access model (indicators 8-16).
The PMP incorporates a single indicator (indicator 7) that addresses the seven CIMS impact domains. This indicator serves as a meta-indicator to capture impact in a variety of development domains across all of the Beyond Access projects.
At the outset of each project, Beyond Access staff work with the implementing partners to identify project level indicators and to train them on monitoring and reporting requirements. The primary goal during this process is to prioritize the data that libraries can easily collect in their libraries and leverage for advocacy, so that the data collection and use is sustainable beyond the life of the project and partners appreciate the value of these data. This starts by identifying data that are already collected in project libraries (e.g. number of library visits) and streamlining how that is reported to the implementing partner (usually via an online survey).
CIMS can then be incorporated at an individual project level. Beyond Access works with each project's implementing partner to identify targets that align with advocacy goals. These targets are typically output indicators that align with CIMS development domains, such as the number of people that receive basic ICT training (Digital inclusion) or the number of people that receive workforce development training (Economic development). These data are supplemented by a user survey that is structured so that it can easily be administered by library staff. The survey questions can be derived from CIMS survey questions that align with the project's target domains and advocacy objectives.

Limited resources and partner capacity are real constraints that Beyond Access grapples with, yet by simplifying the indicators and providing training and support to implementing partners at the outset, Beyond Access has designed an M&E approach that allows essential lessons to be identified and shared.
For more information about the Beyond Access initiative, see www.beyondaccess.net

Lessons learned: eight pieces of advice for funders or agencies interested in developing common indicator systems
Over the past few years, developing common indicator systems to track progress across projects has become increasingly in vogue among government agencies, philanthropies, and the corporate world. In our three years of work on CIMS, GL learned several things, which we now offer as potential insights for others interested in embarking on similar efforts in their fields:
(1) When developing a set of common indicators that grantees around the world will measure, build plenty of time into the work plan. Grantees will naturally be concerned about accountability, so the stakes are high. While a funder or agency could probably develop a list of common indicators alone in a room in a single afternoon, true engagement takes time! The benefit is that the system will be embraced by grantees, who will be eager to use and sustain it.
(2) Use existing grantees or partners to provide energy and insight to new ones. A couple of brand new grantees were skeptical about the need for a common measurement system; rather than have GL reinforce the new requirement in a "top-down" manner, veteran grantees continued to advocate for the system, explaining to new grantees why it would be valuable, swaying their opinion and building support.
(3) Grantees found it helpful that GL offered a set of criteria to help them prioritize and winnow down the long lists of indicators (e.g. "Would data on this indicator be useful for advocacy?").
(4) Offer "optional" indicators (and/or some way for participating organizations to add their own custom categories/indicators) from the outset to encourage buy-in. Optional indicators act as a "pressure release valve" to lower the stakes that any single indicator be included in the final framework. (5) When soliciting and summarizing feedback on potential indicators, use a "decision-making matrix," a table containing all changes, additions, and subtractions from the system, and include documentation on the reasons why each change was made. This record of edits will ensure that everyone feels heard and can see why certain decisions were taken. It also makes it easy to refer to previous changes and remember why certain edits were madeparticularly months later! (6) Determining an agreed-upon common methodology ended up being half the work and the most contentious part of the process. However, it is also one of the most important aspects of a common measurement system, as without some agreed-upon methodological guidelines, the data on the common indicators will not be comparable and the work will have been wasted.
(7) Expect to make small tweaks to the system when designing and implementing it. A funder's ability to be 100 percent consistent is limited by certain unique community contexts. It is important to "pick your battles," knowing what you must stand firm on and where it is possible to be more flexible. For example, GL was flexible with some of the demographic categories; in some contexts, asking about ethnicity or membership in a "minority group" was not politically acceptable, so we waived the requirement that each grantee ask the exact same demographics questions.

(8) Grantees should be given interactive access to all of the data collected by the system. Extracting data from grantees is not sufficient; at a minimum, they must be able to make use of their own data, but ideally they should also be able to benchmark and compare themselves with colleagues and peers doing similar work. Success should be measured not only by whether the data prove useful to the funder, but also by whether they prove useful to the grantees who collected and provided them.

Conclusion
In the two years since GL introduced the CIMS, innovative libraries around the world, and the institutions that support them, have also begun conducting performance measurement in new ways, frequently emphasizing their contributions to improvements in users' lives. Several new, similar outcome frameworks have emerged, often with a focus on digital literacy, employment-related skills, and school readiness/early learning. As GL winds down its strategy over the coming years, it is our hope that an ever-greater number of libraries embrace this practice, measuring whatever user outcomes are locally relevant. We believe this to be one of the most potent tools libraries can use as they seek sustainable funding amidst tough competition for scarce public resources. While seeing CIMS adopted and adapted would be auspicious, it is not the framework itself that we see as important, but the practice of measuring user outcomes.