The Finna service: meeting the new measurement challenges in libraries

Timo Laine (National Library of Finland, University of Helsinki, Helsinki, Finland)
Markku Antero Laitinen (National Library of Finland, University of Helsinki, Helsinki, Finland)

Library Management

ISSN: 0143-5124

Article publication date: 5 September 2018

Issue publication date: 7 January 2019


Abstract

Purpose

In the transformed information environment, traditional library metrics do not adequately show the impact and value of services. They need to be supplemented with user-centered ways of measurement. The paper aims to discuss these issues.

Design/methodology/approach

The paper is a case study of the new Finna service and the measurement challenges it presents.

Findings

The standards guiding the measurement and evaluation of libraries cannot offer a “cook-book” for the organizations to follow. The paper suggests, as one possible response to this, that the Net Promoter Score (NPS) can be used as one indicator in measuring the impact of new services.

Research limitations/implications

The findings of the paper are preliminary, because there is as yet little experience of the use of NPS in libraries. This calls for further study. The results are encouraging, but more testing is needed with different services.

Originality/value

NPS has not been widely used in libraries before.

Citation

Laine, T. and Laitinen, M.A. (2019), "The Finna service: meeting the new measurement challenges in libraries", Library Management, Vol. 40 No. 1/2, pp. 2-11. https://doi.org/10.1108/LM-02-2018-0007

Publisher

Emerald Publishing Limited

Copyright © 2019, Timo Laine and Markku Antero Laitinen

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


1. Introduction

In the pre-digital library environment, we could not even dream of finding out what kinds of information searches the clientele made using the traditional card files of the library – we could only see that the cards were worn out. Now, information systems give us that kind of information, and we only have to agree on common rules about what information should be collected. Yet although libraries have been digital for more than 20 years, there is still uncertainty about what should be measured and how. At the same time, the demands for cost-effectiveness and for showing the positive impact of the library are growing.

The digital information environment of today offers an immense number of possibilities to analyze and use information. Digitalization entails a paradigm shift in the ways cultural heritage organizations operate, toward new, networked models. This change is naturally reflected as a shift from traditional and simple catalogue-centric services toward a multidimensional and many-faceted information environment.

In Finland, the National Digital Library (NDL) initiative aimed to modernize end-user service delivery. The Finna service as the end-user interface for the NDL represents a change compared to traditional online service design patterns used in cultural heritage organizations. Finna is developed as an open-source effort, and it combines the collections of Finnish libraries with collections from Finnish museums and archives. Its users can use just one service to access the materials of Finnish libraries, archives and museums, without having to know or care about the differences between services, collections and organizations.

The first online library services featured tight links between the user and the service, and between the service and the collection. Typically, the online public access catalogue (OPAC) was the only place where the collection was publicly visible online. To be able to use the service, the user had to know about it, and most users were patrons of the library. The information systems architecture was catalogue oriented: the catalogue was the starting point, and the task was to make it openly accessible. The resulting value chains were linear and simple to understand.

Services such as Finna break these links. They combine the collections of the library with collections from other organizations, including organizations from other sectors: a search is no longer a library search but just a search. The new services are service platforms, powering different tailor-made interfaces to the same big index: the context for the user is no longer the library. They let third-party applications and services access the collections through open application programming interfaces (APIs) and use them in various ways: the users might not even know that the information on their phone screens comes from a library collection. They combine the metadata from many libraries or other cultural heritage organizations, making everyone’s records easier to find: the record no longer belongs to any single organization.

Because these links are broken, it is no longer possible to distinguish the use of services or collections of individual organizations from that of other organizations. The user of modern services does not necessarily know or care about the organizations providing them. By using these services, he is not a library user, a museum user or an archive user, but in a different sense, he is all of them at the same time. What matters is not the single organization eventually fulfilling the user’s request for a certain book, but how the service as a whole satisfies the users’ needs. On the architectural level, the orientation is reversed: while we traditionally started from the catalogue, now we start from the user-facing services. The new architecture is thus user oriented.

2. Evolution from control to user friendliness

The eighteenth–nineteenth century philosopher Jeremy Bentham developed a design for a prison, the Panopticon (Figure 1). It was not a user-centered design. Instead, it was meant to provide maximum control: if prisoners could not be placed under constant inspection, they would at least be under the belief they were. The challenge was “to achieve a multitude of objectives – safe custody, confinement, solitude, forced labor and education.” The design was a combination of architectural choices and an incentive scheme, but the architecture is our main concern here (Schofield, 2009, pp. 72-74).

Architecturally the Panopticon is a circular building, “with the cells of the prisoners, divided by partitions, around the circumference, and the inspector’s lodge occupying the center.” A fundamental idea is to “enable the inspector to see into all the cells, but none of the prisoners would be able to see into, far less through, the inspector’s lodge.” The prisoners do not know if the inspector is in his lodge or not, and therefore they can never be certain if they are being currently inspected. This would strongly discourage escape attempts (Schofield, 2009, pp. 72-73).

The traditional OPAC works much the same way. It is a conceptually simple design. It only allows very specific operations on a specific catalogue, and all the operations are detected. It is a closed system, not allowing anything to get out, which helps keep usage information in one place. The traditional librarian has many of the advantages of the panoptic inspector.

There are quite a few elements in the OPAC that give us good usage data for the same reason that they fail the user. What these elements have in common is that they give us control. Now we are giving up that control, including the knowledge it gave us.

First, if the patron uses our OPAC (which is part of our integrated library system (ILS)), we know that the patron is accessing our catalogue, because the ILS contains no other catalogue. However, a patron who wants to access the catalogues of multiple libraries is not happy to have to use many OPACs to do so. The patron prefers a service that gives him access to other catalogues in the same place, provided that it is easy to use.

Second, with the traditional OPAC we know that no user goes undetected, so the usage data give us the full picture. People cannot access the catalogue by other means, so there are no users of whom we are unaware. However, openness is essential in the networked society in which we now live, and openness means opening our data through APIs. When we open our data, we no longer know where it ends up and how it is being used.

Libraries have to give up the idea of having the “full picture” – they have to choose more precisely what they want to know and choose new metrics for it. Libraries need to develop an understanding of what the impact of their open data means in terms of their own impact. For example, an independent developer may write an application that relies on library data to provide a service to the end user. That service might be something completely different from traditional library services, for example a game. In such a case, the library is one important part of the value chain, but not the whole chain. The library data make the service possible, but they are not the only important part, or even the most important one. Obviously, libraries need to know about such services and applications and the API requests they make.

The changes discussed here and throughout this paper take place against a backdrop of digitalization – of larger social, organizational and technological changes. It will take some time for libraries to reposition themselves in the value creation system. Traditionally, libraries controlled all parts of a value chain. They managed the collections and facilities, employed staff, and maintained catalogues and support services such as the OPAC. Libraries were to some extent independent of other players in the field. In the data economy of the networked world (Figure 2), value is increasingly created not through simple value chains but through value networks in multi-agent environments or ecosystems. It remains to be seen what the role of the library will be in this changing world. By opening their data, libraries enable other organizations to use it. However, it will take time for stable structures to form around this.

Finally, because the traditional OPAC does not really support means of discovery other than search, we have known that an increase in searches meant an increase in interest in our service and our catalogue. Searches are of course a proxy variable, but in a traditional setting this caused no problems. Now, when we develop other means for users to find what they want, searches no longer mean what they used to mean.

A library may have recommendation engines that are able to use the information the library has about its users to suggest materials that might be of interest to them. It may also have intelligent, dynamic browse facilities that offer curated views to its materials. The better all these features work, the fewer searches the library users have to perform. And if a major general purpose search engine such as Google is able to index the library catalogue through the regular library online service, people might use that search engine to search the catalogue and the whole discovery layer might not be so important after all (cf. Schonfeld, 2014; Askey, 2013).

The changes in the information environment mean that libraries are relinquishing control in many areas. One major driver of this change is the growth of user expectations: users expect simpler, better, more functional and easier services.

Earlier, user expectations were lower. This gave libraries more control. They expected the user to learn about their many services and organizations, to use those services and often to spend a lot of time doing so. If the users needed information, libraries made them search for it. If they needed the same kind of information from multiple systems or catalogues, libraries made them search them all.

Today, of course, the user comes first. Libraries cannot expect the user to make the effort to understand the differences between their multiple systems, or to be satisfied with bad user interfaces. When libraries respond to the changing expectations, they lose some of the control they used to have.

3. Finna – the one-stop shop

Library discovery layers have lost a popularity contest: people prefer to use web search engines such as Google for practically all information search purposes. Run by major multinational corporations, they are impossible for libraries to beat or even to match. Finna (Figure 3) is not such a search engine, but its design embodies lessons learned from them.

Finna cannot compete with the big web search engines. Instead, it complements them. A priority is to expose Finna content to search engines to guarantee its discoverability. In addition to this, there are things that web search engines cannot do, at least not yet. Finna Street, a function that uses the user’s location data to show the user photographs taken nearby, is one such thing.

A portal is a web browser interface. For an open digital library, a web browser interface is a necessary but not a sufficient condition. Finna is more than a portal in two ways. First, it has an open API. If you want, you can write your own discovery interface or a game for it. Second, Finna is a continuously evolving service. It is extended and improved based on changing user needs.
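To make the idea of the open API concrete, the following is a minimal sketch of what a third-party use of Finna’s data might look like. It assumes the public Finna search endpoint at https://api.finna.fi/v1/search and Python’s requests library; the parameter and field names shown are our assumptions and should be checked against the Finna developer documentation.

    import requests  # third-party HTTP library

    # A minimal sketch: search the shared Finna index from outside any
    # library context. Endpoint, parameters and response fields are
    # assumptions based on the public Finna API documentation.
    response = requests.get(
        "https://api.finna.fi/v1/search",
        params={"lookfor": "Jeremy Bentham", "limit": 5},
        timeout=10,
    )
    response.raise_for_status()
    for record in response.json().get("records", []):
        # Each record carries descriptive metadata; here we print only titles.
        print(record.get("title"))

An application built this way – a game, say, or a mobile guide – need not reveal to its users that the records come from libraries, museums or archives, which is precisely the measurement challenge described above.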

Usability design has always been at the forefront of Finna development, and the initial results from Finna user surveys are promising. The user does not want to search but to find. If we can find ways to give users what they want without making them search for it, we use them. If we can improve the quality of search results and help minimize the number of searches the user has to perform, we do. The number of searches is no longer something we want to maximize, and as a proxy metric it is no longer very useful to us.

4. Modern services challenge the traditional evaluation methods

In the increasingly complex operational environment of cultural heritage organizations, the culture of measuring their operations is changing from reactive to proactive. Increasingly, measurement and evaluation should be done on the organization’s own initiative, not only as a reaction to what has happened.

Though there are many international guidelines, such as the international standards, new ways of showing the impact and value of the library organization are needed. Though the standards are necessary for collecting uniform and comparable data, they can only give cultural heritage organizations the keys with which to search for applicable ways of measuring. By bringing qualitative data alongside quantitative data, in addition to traditional measurement, new ways of showing the impact and value of the library may be found.

One of the challenges in supporting cultural heritage organizations in the measurement and evaluation of their operations is that creating standards and keeping them up to date is time consuming and laborious, while the digital environment of the organizations changes and develops rapidly. In addition, the operational environments in different countries and different organizations vary considerably. Therefore, it seems neither feasible nor meaningful to give very detailed guidance on how to collect data.

Hence, even the standards cannot offer a “cook-book” for the organizations to follow. Still, in order to ensure the comparability of their evaluations for benchmarking and development purposes, it is of vital importance for organizations to follow internationally accepted methods of evaluation. It is equally important for the standards bodies to remain sensitive to changes in the operating environment and to update the standards accordingly. The international standards for collecting statistical data and for evaluation methods in cultural heritage organizations are listed in Table I. We will return to this topic in the concluding thoughts of this paper.

The standard ISO 16439 (methods and procedures for assessing the impact of libraries) encourages libraries to convert the data from qualitative surveys, such as user surveys, into numerical form that is easy to measure. The idea is to find new types of indicators alongside the traditional ones: indicators that not only measure quantities but also make possible the numerical analysis of qualitative data.

The idea of the Net Promoter Score (NPS), presented by Reichheld (2003), seems promising in this respect. NPS is a simple way of measuring customer loyalty. It is based on asking the question “How likely is it that you would recommend [brand or company X] to a friend or colleague?”, rated on a scale from 0 to 10. The customers are grouped into “promoters” (rating 9–10), “passively satisfied” (rating 7–8) and “detractors” (rating 0–6), and the NPS is the percentage of promoters minus the percentage of detractors.

According to Reichheld (2003), the promoters are considered loyal customers who will most likely continue buying the product or using the service even if a competing product or service is available. The promoters may also tell others about their positive experience, whereas the detractors are the most susceptible to competing products or services, if available. The passively satisfied, on the other hand, are more likely to be turned into promoters.
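As a minimal sketch of the calculation, using Reichheld’s grouping described above (the function and the sample data are ours, for illustration only):

    def net_promoter_score(ratings):
        # Ratings are integers on the 0-10 scale. Promoters rate 9-10,
        # the passively satisfied 7-8, and detractors 0-6; the NPS is
        # the percentage of promoters minus the percentage of detractors.
        if not ratings:
            raise ValueError("no ratings given")
        promoters = sum(1 for r in ratings if r >= 9)
        detractors = sum(1 for r in ratings if r <= 6)
        return 100.0 * (promoters - detractors) / len(ratings)

    # Illustrative data: 50 promoters, 30 passively satisfied, 20 detractors.
    # The result, 30.0, happens to be close to Finna's 2016 score of 29.9.
    print(net_promoter_score([9] * 50 + [7] * 30 + [5] * 20))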

5. NPS as indicator of library customers’ perception

Laitinen (2018) analyzed three annual Finna user surveys performed in 2014–2016 (Table II). The average rating, on a scale of 0–10, was 7.9 in 2014 and 8.0 in both 2015 and 2016. In 2016, the NPS was calculated for the first time for a Finnish library service; the result was 29.9 percent.

The diagrams describing the distributions of answers to the questions “On a scale of 0–10, how would you grade Finna?” and “How likely is it that you would recommend Finna to a friend or colleague?” in 2016 seemed visually almost congruent (Figure 4), but there was a small difference between the grading and the willingness to recommend in the group of the “promoters” (rating 9–10, p=0.0154*).
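The passage above does not specify which statistical test produced this p-value. Purely as an illustration of how such a difference between two answer distributions can be tested, one could compare the shares of 9–10 answers to the two questions with a chi-squared test on a 2×2 contingency table; the counts below are hypothetical, not the survey’s actual figures.

    from scipy.stats import chi2_contingency

    # Hypothetical counts, for illustration only: respondents answering
    # 9-10 versus 0-8, for the grading and the recommendation questions.
    grading = (5800, 8720)       # (rated 9-10, rated 0-8)
    recommending = (5600, 8878)  # (rated 9-10, rated 0-8)
    chi2, p, dof, expected = chi2_contingency([grading, recommending])
    print(f"p = {p:.4f}")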

The survey was conducted in three languages: the two domestic languages of Finland (Finnish and Swedish) and English. In theory, the quality of the translations might affect the results (how the respondents perceived the questions), but this topic was not investigated in this survey. In any case, the share of Finnish-language responses was 95.5 percent in 2014–2015 and 92.2 percent in 2016.

6. Discussion

When using proxy variables, we need to understand them well, and understand our environment and its evolution. When the environment changes, a variable might no longer be a good proxy. In the Panopticon OPAC world, searches were a good proxy variable. As the environment changed, they lost much of their value as a metric. To understand such changes in the environment, we need to understand all the aspects of what our organizations do, and we need to understand more general social trends. We need to understand management, technology, user needs, economic trends and so on.

Libraries exist to provide services for the user, or at least to enable such services. This means that they need to think about what the user does and wants. In most cases, if they are able to measure user satisfaction in concrete ways, that measurement trumps all other metrics. NPS is a proxy metric: in strict terms, what it measures is not precisely user satisfaction, but we believe it is a very good proxy for it. Whereas the number of searches does not tell us much about the user’s attitude toward our service, NPS is able to measure at least one dimension of that attitude. If there is a strong correlation between user satisfaction and NPS, there is a good reason for using NPS.

Using NPS requires us to address some questions, even philosophical ones that arise from the nature of the metric: what does the willingness to recommend actually mean? One major hurdle in using NPS to measure services run with public funding is that recommendations mostly make sense when alternatives exist, in a competitive environment. It is clear what we mean when we recommend Burger King over McDonald’s: they provide essentially the same service, and compete in offering better value to the customer. There is of course no such direct competitor to library services. Thus, when someone recommends Finna, it is not clear what the recommendation means.

In Reichheld’s (2006) material from the business context, the NPS of a typical company was between 5 and 10 percent, remarkably lower than that of Finna (29.9 percent). K. Välbe (personal communication, February 3, 2017) stated that in the National Library of Estonia, the NPS goal for new customers in 2016 was set at 65.5 percent (64 percent was achieved). Both of these examples, from public sector organizations in two different countries, may indicate that in the public sector the scores tend to be higher. The reason for this may lie in the specificity of the service products: such services do not seem to have a “natural” competitor.

The similarity of the diagrams shown in Figure 4, describing the distributions of answers to the questions about grading Finna and about the willingness to recommend it to colleagues or friends, led to the thought that the two questions might have been perceived in the same way, and that a retrospective calculation of the NPS for previous years might therefore be justified. Yet there are indications that users may not analyze these questions thoroughly enough for this to hold, but instead respond intuitively.

The statistically significant difference in the highest grades shows that the willingness to recommend Finna does not simply follow from appreciation of it. There appears to be some threshold between giving a high grade and being willing to recommend the service. The hesitation to recommend the service to others in spite of giving it a high grade may be due to the specificity of the user’s needs – he may not assume the service to be suitable for others. Other scientific disciplines may offer tools to determine whether this problem is real or not.

It is also to be noted that when recommending a commercial service, the user puts his reputation on the line in a different way than when recommending a cultural service like Finna. If the other person is not happy with the quality of the recommended service, he may doubt the judgment of the recommender. By contrast, by recommending a cultural service the user may try to improve his reputation and status by showing that he is familiar with the service. Even if the cultural service turns out not to be perfect, by recommending it a person may want to signal to others his own familiarity with “higher” forms of culture.

Overall, there are several possible ways to understand the recommendation. One can understand it as a recommendation over indirect competition, such as online and brick-and-mortar bookstores or commercial entertainment services. One can also understand it as an approval of the quality of the service and its fitness for purpose. When a Finna user recommends the service to someone, he may be saying, for example:

  • “I did not need the help of an expert to use it.”

  • “It did what it promised and what I expected.”

  • “It felt designed for a person like you and me.”

  • “It gave me goosebumps!”[1]

7. Concluding thoughts

In addition to user needs, libraries need to understand technology because it has become such an important asset to them. However, at the same time, technology ultimately does not matter. It is not the answer but it needs to be understood to arrive at the right answer. For example, small technical changes to interfaces might cause great changes in how the user behaves. When this happens, libraries need to update their whole picture of what they are measuring. Are their proxy variables still valid? Are there new metrics they could consider? Technology does not give them all the answers but it gives them a part of the information they need to get to the answer.

The big change in the operating environment of cultural heritage organizations also challenges the traditional standards. The role of domain-specific standards such as ISO 2789 for library statistics will inevitably change as libraries integrate more tightly into their different and changing environments. If the standards remain domain specific, as they are today, then instead of relying on them alone as a complete solution, libraries may find it more fruitful to supplement them with general-purpose metrics such as user satisfaction, standard web analytics and social metrics. How to use these is no longer merely a matter of standards but a matter of best practices of application. Establishing those practices requires new capabilities, adaptability, continuous improvement and a broad common understanding of the digital library.

It would be desirable for the international standards to keep pace with service development, so that libraries and other cultural heritage organizations would still have internationally accepted common procedures at their disposal, supporting them in producing comparable data for the purposes of benchmarking and developing their services.

Figures

Figure 1. Jeremy Bentham (public domain), via Wikimedia Commons

Figure 2. The creation of value in the data economy of the networked world

Figure 3. The Finna service of the National Library of Finland is a pathway to all collections of the cultural heritage organizations of Finland

Figure 4. The distribution of estimates on a scale from 9 to 10 for the questions “How would you grade Finna?” and “How likely is it that you would recommend Finna?”, according to Laitinen (2018)

Table I. International standards for collecting statistical data and for evaluation methods in cultural heritage organizations

ISO 11620:2014 – Information and documentation – Library performance indicators
ISO 16439:2014 – Information and documentation – Methods and procedures for assessing the impact of libraries
ISO 2789:2013 – Information and documentation – International library statistics
ISO/CD 21248 (2009) – Information and documentation – Quality assessment for national libraries
ISO 18461:2016 – International museum statistics
ISO/CD 19580 (2017) – Information and documentation – International archives statistics

Note: CD, committee draft – an unpublished document circulated for comments

Table II. The ratings and NPS of Finna and the response rates to the Finna user surveys 2014–2016, according to Laitinen (2018)

                     The survey as a whole     “On a scale of 0–10, how    “How likely is it that you would
                                               would you grade Finna?”     recommend Finna to a friend or colleague?”
                     2014    2015    2016      2014    2015    2016        2016
Rating/NPS (%)         –       –       –        7.9     8.0     8.0         29.9
Seen                 5,367  18,656  22,562     3,239  12,159  14,520       14,520
Answered             3,239  12,159  14,520     3,186  11,919  14,520       14,478
Response rate (%)     60.4    65.2    64.4      98.4    98.0   100.0         99.7

Note: The ratings were calculated each year, the NPS in 2016 only

Note

1. An actual quote (Kingsley, 2017).

References

Askey, D. (2013), “Giving up on discovery”, Taiga Forum, available at: http://taiga-forum.org/giving-up-on-discovery/ (accessed February 7, 2018).

ISO 11620:2014 (2014), Information and Documentation – Library Performance Indicators, International Standard, 3rd ed., International Organization for Standardization (ISO), Geneva, 100 pp.

ISO 16439:2014 (2014), Information and Documentation – Methods and Procedures for Assessing the Impact of Libraries, International Standard, 1st ed., International Organization for Standardization (ISO), Geneva, 82 pp.

ISO 18461:2016 (2016), International Museum Statistics, International Standard, 1st ed., International Organization for Standardization (ISO), Geneva, 37 pp.

ISO 2789:2013 (2013), Information and Documentation – International Library Statistics, International Standard, 5th ed., International Organization for Standardization (ISO), Geneva, 71 pp.

ISO/CD 19580 (2017), “Information and documentation – international archives statistics”, committee draft, under development, International Organization for Standardization (ISO), Geneva, available at: www.iso.org/standard/65306.html (accessed February 7, 2018).

ISO/CD 21248 (2009), “Information and documentation – quality assessment for national libraries”, committee draft, unpublished document, International Organization for Standardization (ISO), Geneva.

Kingsley, S. (2017), “Historian opetuksen Pokémon” [“The Pokémon of history teaching”], Kansalliskirjasto, No. 1/2017, p. 8, available at: https://issuu.com/kansalliskirjasto/docs/30477896_kk_1_2017_print_web/8 (accessed February 7, 2018).

Laitinen, M.A. (2018), “Net Promoter Score (NPS) as indicator of library customers’ perception”, Journal of Library Administration, Vol. 58 No. 4, pp. 394-406.

Reichheld, F. (2006), “Questions about NPS – and some answers”, Blog Entry, July, available at: http://netpromoter.typepad.com/fred_reichheld/2006/07/ (accessed August 27, 2018).

Reichheld, F.F. (2003), “The one number you need to grow”, Harvard Business Review, Vol. 81 No. 12, pp. 46-54, available at: https://hbr.org/2003/12/the-one-number-you-need-to-grow (accessed February 7, 2018).

Schofield, P. (2009), Bentham: A Guide for the Perplexed, Continuum, London, p. 192.

Schonfeld, R.C. (2014), “Does discovery still happen in the library? Roles and strategies for a shifting reality”, report, available at: www.sr.ithaka.org/wp-content/uploads/2014/09/SR_Briefing_Discovery_20140924_0.pdf (accessed February 7, 2018).

Corresponding author

Timo Laine can be contacted at: timo.mz.laine@helsinki.fi
