New & noteworthy

Library Hi Tech News

ISSN: 0741-9058

Article publication date: 1 October 2006


Citation

(2006), "New & noteworthy", Library Hi Tech News, Vol. 23 No. 9. https://doi.org/10.1108/lhtn.2006.23923iab.001

Publisher: Emerald Group Publishing Limited

Copyright © 2006, Emerald Group Publishing Limited



Talis: Winners of Mashing Up the Library Competition 2006 Announced

On September 11, Talis announced the winners of "Mashing Up the Library 2006", the first global competition intended to openly encourage innovation in the display, use and reuse of data from and about libraries. For all those users of libraries who have ever wished they could bring information from their library to life outside the virtual walls of its web site, this competition presented an ideal opportunity to see some of what the future might hold.

Announcement of the Mashing Up The Library Competition in June generated great enthusiasm amongst diverse groups across the world. The competition unveiled fascinating examples of pushing library information out to existing audiences in new ways, or reaching totally new audiences with compelling and captivating applications.

The first prize of $2,000 was awarded to John Blyberg of Ann Arbor District Library in Ann Arbor, MI. His entry, Go-Go-Google-Gadget, shows how simply library information can be integrated into the personalised home page offered by Google, and is described by competition sponsor and member of the judging panel, Talis' Paul Miller, as "an excellent example of taking information previously locked inside the library catalogue and making it available to patrons in other contexts where they may spend more time than they do in their catalogue."

Available information includes new and the most popular material in the library, and patron-specific information on checked-out and requested items. "Superpatron" Ed Vielmetti applauded the simplicity of this entry, remarking in a clear invitation for others to follow John's lead that "the visible source code is very tiny and easily hackable." Vanderbilt University's Marshall Breeding concluded, "I like this entry's spirit of opening up information in the library system and putting it under the control of the user."

A close second prize of $1,000 was awarded to the Alliance Library System in East Peoria, IL, and their global partners in the Second Life Library. Their entry, the Alliance Second Life Library 2.0, was described by Talis' Miller as "both a testament to international co-operation amongst libraries and a compelling demonstration of the ways in which traditional library functions can be extended into cyberspace, reaching new audiences in ways exciting and relevant to them as they live their lives." Alliance Library System intends to use the funds to extend their work within Second Life.

The Mashing Up the Library Competition marks an important step forward in encouraging open and inclusive innovation from libraries around the world, regardless of their consortial memberships or vendor allegiance. Improved tools, improved access to data from and about libraries, and increased awareness mean that libraries are in for an exciting and challenging journey. Talis is committed to helping libraries to reach out to existing and new markets for their capabilities, and the ongoing support of this competition is one aspect of that strategy.

What's next? Rather than re-run the same competition again next year, Talis wishes to encourage innovative work on an ongoing basis. As such, the competition web site has reopened, and will accept new entries. Talis welcomes approaches from library systems vendors and other interested parties willing and able to join in celebrating and rewarding all that is best in innovating around the display, delivery and use of library information, today and into the future.

To find out more: www.talis.com/tdn/competition

OCLC Acquires DiMeMa; Zick Named Vice President of OCLC Digital Services

OCLC Online Computer Library Center has acquired Digital Media Management (DiMeMa), the organization that developed and supports CONTENTdm, the leading digital management software for libraries distributed by OCLC.

CONTENTdm software offers a complete set of tools to store, manage and deliver digital collections such as historical documents, photos, newspapers, audio and video on the web. OCLC has been the exclusive distributor of CONTENTdm software to libraries, cultural heritage organizations and other nonprofit organizations since 2002.

"CONTENTdm is the industry leader in digital management software for libraries," said Jay Jordan, OCLC President and CEO. "CONTENTdm makes it possible for libraries to easily manage their own, unique digital collections. As part of OCLC, the DiMeMa team will be better positioned to explore new ways to help libraries and other cultural heritage organizations manage their digital collections and make them accessible worldwide."

Greg Zick, founder of DiMeMa and former Professor at the University of Washington, will be Vice President of OCLC Digital Services. CONTENTdm was developed while Dr Zick and a team of programmers were conducting research into optimal digital image database technologies in the Center for Information Systems Optimization (CISO) at the University of Washington. At the time, special collections of the University of Washington Libraries were stored in a variety of forms and formats, and demand was building to provide flexible online access to these resources. The libraries began to use the CISO Lab software for fast, full-featured access and management of the collections. After extensive field testing, the products resulting from these research and development activities were made available to organizations outside the University. DiMeMa Inc. was formed in 2001 to support the growing CONTENTdm user community and to focus on accelerated research and product development.

The addition of DiMeMa staff will also help the RLG-Programs division and OCLC Research in their efforts to explore the applications of digitization in the library and museum communities. The newly-organized Digital Services Division will integrate both OCLC and RLG digital services into the OCLC portfolio. Digital Services staff, including the DiMeMa staff, will collaborate closely with RLG-Programs staff on shared issues of curation, preservation and presentation of digital resources.

CONTENTdm has evolved into a powerful digital collection management solution that offers scalable tools for archiving collections of any size. Today more than 300 libraries and other cultural heritage organizations license CONTENTdm software to manage more than 2,500 digital collections.

Metadata for these digital collections can be added to WorldCat, the world's largest database of items held in libraries. Once in WorldCat, these collection items can be found by searching the database, or searching the Web. Items in WorldCat can now be discovered through WorldCat.org, a new search site that also offers a downloadable search box, and through popular search engines like Google and Yahoo! as part of the OCLC Open WorldCat program.

To see some digital collections managed with CONTENTdm software, visit: www.contentdm.com/customers/

OCLC Designated Maintenance Agency for OpenURL Standard

Online Computer Library Center (OCLC) and the National Information Standards Organization (NISO) have announced that OCLC will assume responsibilities as Maintenance Agency for The OpenURL Framework for Context-Sensitive Services (ANSI/NISO Z39.88-2004) for a period of five years. The standard defines an architecture for creating a context-sensitive networked service environment.

As the World Wide Web began its explosive growth in the early 1990s, the scholarly-information community made available digital scholarly materials consisting of metadata and full-text content. As this body of materials grew, it became increasingly difficult to provide adequate links between related information assets, distributed across many collections and controlled by different custodians. In 1999, NISO initiated an effort to improve reference linking. Herbert Van de Sompel, now with the Los Alamos National Laboratory, developed a system of context-sensitive linking, based upon a new type of URL, the OpenURL, and it provided the foundation for what has become ANSI/NISO Z39.88.
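To give a flavour of the kind of link the standard describes, the sketch below assembles a hypothetical OpenURL 1.0 (Z39.88-2004) query string in key/encoded-value (KEV) format for a journal-article citation. The resolver address is invented for the example; the citation values are simply those of this issue, and a production resolver would expose its own base URL and services.

```python
from urllib.parse import urlencode

# Hypothetical link-resolver base URL; a real library would substitute its own resolver.
RESOLVER = "http://resolver.example.edu/openurl"

# A KEV ContextObject describing a journal-article referent (Z39.88-2004).
context_object = {
    "url_ver": "Z39.88-2004",                       # version of the OpenURL framework
    "url_ctx_fmt": "info:ofi/fmt:kev:mtx:ctx",      # ContextObject format
    "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",  # the referent is a journal item
    "rft.genre": "article",
    "rft.jtitle": "Library Hi Tech News",
    "rft.volume": "23",
    "rft.issue": "9",
    "rft.date": "2006",
    "rft.issn": "0741-9058",
}

openurl = RESOLVER + "?" + urlencode(context_object)
print(openurl)  # the resolver decides which context-appropriate services to offer
```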

"OpenURL has significantly improved the world's access to electronic journal content," said Mike Teets, Vice President, OCLC Global Product Architecture. "It is now progressing to bringing the same access to a much broader set of services for electronic resources. As the use of OpenURL expands and more services are automated using this critical infrastructure, there is a growing need for a registry supporting the communication and extension of the current standard as well as the development of community profiles. OCLC has committed our reliable architectures to supporting the OpenURL community and its continued success."

NISO (www.niso.org) is a not-for-profit association accredited by the American National Standards Institute (ANSI). NISO fosters the development and maintenance of standards that facilitate the creation, persistent management, and effective interchange of information so that it can be trusted for use in research and learning. To fulfill this mission, NISO works with intersecting communities of interest and across the entire life cycle of an information standard.

OCLC: www.oclc.org/

NISO: www.niso.org/

PALINET, Amigos, SOLINET Unite in Offering ScholarlyStats

Amigos, PALINET, and SOLINET have signed an agreement with MPS Technologies to offer ScholarlyStats to their combined membership. The agreement is a cost- and time-saving move by three of the largest library consortia in the USA, allowing subscribing members to receive maximum discounts on ScholarlyStats, a provider of consolidated vendor usage statistics. Amigos Library Services is one of the largest library service networks in the nation, consisting of over 650 libraries and cultural institutions, located primarily in the southwestern United States. PALINET, a member-owned and governed regional Library Network, serves 600+ members throughout the Mid-Atlantic region and beyond. SOLINET is a non-profit membership organization serving more than 2,600 libraries of all types and sizes in ten Southeastern states and the Caribbean. SOLINET, Amigos and PALINET represent nearly two-thirds of libraries in the USA.

MPS Technologies is part of the Macmillan group of companies and provides a range of technology-driven services specifically designed to support libraries and publishers, including web analytics, fulfillment services and content delivery. The ScholarlyStats process collects, standardizes and consolidates journal and database usage reports, providing libraries with a single point of access to their valuable usage data. MPS collects and consolidates the journal and database usage statistics that are provided each month to libraries by their vendors. The information is processed to standardize formats, and a suite of reports is delivered to the library's ScholarlyStats portal.

ScholarlyStats reports include a set of consolidated reports which show use across vendors, based on the standards outlined by the COUNTER code of practice. On request, MPS can transfer ScholarlyStats reports into other library systems, including ERM systems. This data transfer is automated following the industry-standard SUSHI protocol.
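The consolidation step is easy to picture in miniature. The sketch below is an illustration only – not ScholarlyStats' actual process, file layout or column names – but it shows the general idea of merging per-vendor, COUNTER-style monthly journal usage counts into a single cross-vendor report.

```python
import csv
from collections import defaultdict

# Hypothetical per-vendor exports with invented COUNTER JR1-style columns:
# journal title, ISSN and full-text requests for the month.
vendor_files = ["vendor_a_jr1.csv", "vendor_b_jr1.csv"]

# (journal, issn) -> vendor file -> requests
usage = defaultdict(lambda: defaultdict(int))

for path in vendor_files:
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            usage[(row["Journal"], row["ISSN"])][path] += int(row["FullTextRequests"])

# One consolidated report showing use across vendors for each journal.
with open("consolidated_usage.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["Journal", "ISSN"] + vendor_files + ["Total"])
    for (journal, issn), by_vendor in sorted(usage.items()):
        counts = [by_vendor[v] for v in vendor_files]
        writer.writerow([journal, issn] + counts + [sum(counts)])
```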

www.scholarlystats.com

www.mpstechnologies.com

Deep Web Technologies Releases New Version of Federated Search Software

Deep Web Technologies (DWT) announced in August 2006 the release of the next generation of its proprietary federated search software, Explorit™ 4.0. DWT's software has enabled private industry and the US government to create breakthrough web sites such as the "one stop" Science.gov, which empowers users to access 50 million pages of federal research and development projects. DWT software is designed to access the deep web – the part of the internet that encompasses vast and diverse content including commercial subscription databases, content buried within publicly available websites, and internally-generated documents scattered throughout an organization – and claims to reach the 94 percent of electronically-stored data and documents missed by consumer search engines such as Google.

The new version now offers an enhanced interface to allow for the simple creation of custom applications for different groups of users. Explorit 4.0 also introduces new monitoring and management tools. Graphical-based tools monitor usage and the health of individual data sources. Further, tools have been added that allow for the management of the overall system, including reconfiguring the system for growth, better performance and dealing with failed hardware. Performance has been improved to better handle large numbers of information sources and user queries.

According to DWT, commonly used search engines search only about 6 percent of the research and data that is stored electronically today. They cannot conduct a federated search: a search that collects and combines data from subscriber-based portals, intranets, and other private web sites with information on public sites. DWT's Explorit 4.0 launches a federated search of all relevant portals with one search query in one user interface, and then delivers the query's results in relevance-ranked order.
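In outline, a federated search engine broadcasts a single query to many sources at once and merges the responses into one ranked list. The following sketch shows only that general pattern; the source connectors, result scores and ranking are invented for the example and make no claim to represent DWT's Explorit implementation.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical connectors: each accepts a query and returns (title, relevance) pairs.
def search_subscription_db(query):
    return [("Article in a licensed abstracts database", 0.92)]

def search_agency_site(query):
    return [("Report buried deep within a public agency site", 0.85)]

def search_intranet(query):
    return [("Internal technical memorandum", 0.77)]

SOURCES = [search_subscription_db, search_agency_site, search_intranet]

def federated_search(query):
    """Send one query to every source in parallel, then merge and rank the results."""
    with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
        result_lists = list(pool.map(lambda source: source(query), SOURCES))
    merged = [hit for hits in result_lists for hit in hits]
    return sorted(merged, key=lambda hit: hit[1], reverse=True)  # most relevant first

for title, score in federated_search("hydrogen storage"):
    print(f"{score:.2f}  {title}")
```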

Samples of DWT's technology at work can be viewed on sites such as the DOE's Energy, Science and Technology Virtual Library (http://energyfiles.osti.gov), the DoD's scientific site (http://multisearch.dtic.mil), the GrayLIT Network (www.osti.gov/graylit/), and www.Science.gov.

Deep Web Technologies web site: www.deepwebtech.com

More News from the Google Books Library Project: MBooks from University of Michigan

The first digital works resulting from the University of Michigan/Google Digitization Partnership are now being used to enhance the University Library's online catalog.

The online catalog points to a new U-M Library system called MBooks that was developed specifically for the materials digitized by Google. The system, intended to support scholarly research, was designed to meet the specialized needs of researchers by providing more information about works in the collection and – where allowed – actually making the text of works available through the catalog. In addition to a page-turning function, the online material includes updated bibliographic information, persistent URLs – essential for proper citation – and the ability to change resolution (i.e. zoom in or out), and to change format (such as converting to PDF). The ability to magnify or rotate the image is particularly important for researchers who must study detailed images such as formulas for chemical compounds or intricate historical cartography, and for persons with some disabilities.

For uncopyrightable works (such as works created by the US Government), works in the public domain, and works authorized for public display by the copyright holder, the text will be fully viewable. For all material, the user may search within a volume and retrieve the number of times a search term appears per page. This feature is useful, not only for determining relevancy, but also for scholarship requiring precise and exhaustive citation.
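The per-page occurrence count is simple to illustrate. The sketch below merely counts a search term on each page of a volume represented as a list of page texts; it is a toy example, not the MBooks implementation.

```python
def term_counts_per_page(pages, term):
    """Return {page_number: occurrences} for every page on which the term appears."""
    term = term.lower()
    counts = {}
    for number, text in enumerate(pages, start=1):
        hits = text.lower().count(term)
        if hits:
            counts[number] = hits
    return counts

# Toy volume: each string stands in for the OCR text of one scanned page.
volume = [
    "Correspondence of Benjamin Franklin, minister to France.",
    "A further letter from Franklin concerning the treaty.",
    "Index and notes.",
]
print(term_counts_per_page(volume, "franklin"))  # {1: 1, 2: 1}
```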

Included in the material will be the university's extensive federal government document collection. A small sampling of documents available today includes the diplomatic correspondence of Benjamin Franklin and John Adams, and approximately 2,200 Congressional hearings from the 1970s and 1980s.

For information on the U-M online catalog: http://mirlyn.lib.umich.edu

For information on MBooks: http://mdp.lib.umich.edu/m/mdp/mdp-faq.htm

See other documents related to the Michigan Digitization Project at: www.lib.umich.edu/mdp

More on the University of California and Google's Agreement

In related news, the University of California has provided more information on their agreement with Google to participate in the Google Books Library Project, including information on what UC intends to do with the digital copies:

"The digital copies will be used in different ways depending upon their copyright status. The UC libraries will encourage the free and unfettered full-text access to scanned books that are in the public domain. The libraries may also take advantage of online, full-text public-domain holdings as an opportunity to explore how best to support scholarly exploitation of evolving, vast, full-text digital libraries. This could include linking to the full-text public domain works in the UC's online library catalog, the Melvyl Catalog. Books that are protected by copyright will only be accessible to the extent allowed by copyright law.

As part of their historic missions, libraries have been charged with preserving the cultural memory and scholarly record in the public trust. UC will retain its digital copies of books protected by copyright in a dark archive – that is, in a digital preservation repository that is intended to ensure the longevity of its contents, but not to make its holdings accessible to end users."

University of California Google Information: www.cdlib.org/news/google.html

UC-Google contract (PDF): www.cdlib.org/news/ucgoogle_cooperative_agreement.pdf

A comparison of the UC and UM contracts with Google by Karen Coyle: http://kcoyle.blogspot.com/2006/08/dotted-line.html

Google Books Offers Downloads of Public Domain Books

In August 2006 Google announced that readers can now find new, free, downloadable versions of some of the world's greatest books on Google Book Search. With the goal of expanding access to books that are out of copyright and have passed into the public domain, Google has added the ability to download some books. Users can search and read these books on Google Book Search as before, but now they can also download and print them to enjoy at their own pace.

To easily find books to download, readers can select the "Full view" button when searching on Google Book Search, and then click on the "Download" button shown on public domain books. They can then download a PDF file to their computers to read offline, save for later, or print a paper version. This feature is only available for books in the public domain; for books still under copyright, Google continues to display only basic bibliographic information and, in many cases, small snippets of text – at most, a few lines surrounding a search term – unless it has the publisher's permission.

Examples to try:

  • Newton's "Principia".

  • Dante's "Inferno".

  • Hugo's "Marion de Lorme".

  • Bolívar's "Proclamas".

  • Goethe's "Die Leiden des jungen Werthers".

Google Books: http://books.google.com

Google News Adds an Archive Search

Available beginning September 2006, Google News now has an archive search to help users quickly and easily search for events, people and ideas over different periods of time. History buffs and curious users alike can explore more than 200 years of historical information to get a glimpse of the emotions and attitudes of the past.

When users search for an historical event or person, they will see the most relevant articles related to their query, and they will be able to browse an historical timeline to get a broader overview of the results. For example, searching for information about the 1969 moon landing will showcase original news articles written in 1969, as well as coverage and analysis from the four decades since. Articles related to a story or theme within a given time period are grouped together to allow users to see more perspectives on the events. Users can also narrow their searches to specific time periods or publications of interest.

Users can search archives from Google News by clicking on the "News Archive Search" link. For selected queries, users of Google web search may also see links to the top three related articles from the news archives integrated at the bottom of the result page.

When searching news archives, results are ranked by relevance. News archive search aims to rank results so that the articles and events most likely to interest users exploring history appear first. Google takes into account the full text of each article, the publication in which the article appears, how often the underlying event has been referred to or described, and in what manner and by whom.

Google is working with prominent information providers to help users discover relevant historical information. This includes freely available articles from sources such as TIME.com, The Guardian, and many others, as well as snippets of articles available for a fee or via a subscription, such as those from news organizations like The New York Times, The Wall Street Journal and Washington Post, Newsweek Interactive, and from news aggregators like AccessMyLibrary.com from Thomson Gale, Factiva, HighBeam™ Research, LexisNexis and others.

Search results available for a fee are labeled "pay-per-view" or listed with a price. Clicking on a link for fee-based content takes users to the content provider's website to complete the transaction. At present there is no option to refer users to libraries' OpenURL resolvers to guide them to licensed content.

Google News Archive Search: http://news.google.com/archivesearch

Microformats Define and Carry Actionable Data

What are microformats? According to the microformats.org blog, "designed for humans first and machines second, microformats are a set of simple, open data formats built on existing and widely adopted standards." A discussion on the web4lib list boiled it down to "microformats carry/define actionable data": http://lists.webjunction.org/web4lib/search/index.cgi

They are:

  • a way of thinking about data;

  • design principles for formats;

  • adapted to current behaviors and usage patterns;

  • highly correlated with semantic XHTML, AKA the real world semantics, AKA lowercase semantic web, AKA lossless XHTML; and

  • a set of simple open data format standards that many are actively developing and implementing for more/better structured blogging and web microcontent publishing in general.

The principles of microformats include:

  • solving a specific problem;

  • start as simple as possible;

  • design for humans first, machines second;

  • reuse building blocks from widely adopted standards;

  • modularity/embeddability; and

  • enabling and encouraging decentralized development, content, services.
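To make "actionable data" concrete, the sketch below embeds a minimal hCard – one of the published microformats – in ordinary XHTML and pulls it back out with Python's standard HTML parser. The class names follow the hCard convention, but the page itself and the extractor are invented for the example.

```python
from html.parser import HTMLParser

# Ordinary markup carrying an hCard microformat: the class attributes hold the structure.
PAGE = """
<div class="vcard">
  <a class="url fn" href="http://www.talis.com">Talis</a>
  <span class="role">Library systems vendor</span>
</div>
"""

class HCardExtractor(HTMLParser):
    """Collects text or href values from elements whose classes name hCard properties."""
    WANTED = {"fn", "org", "role", "url"}

    def __init__(self):
        super().__init__()
        self.current = []  # hCard properties declared on the currently open element
        self.card = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        classes = (attrs.get("class") or "").split()
        self.current = [c for c in classes if c in self.WANTED]
        if "url" in self.current and "href" in attrs:
            self.card["url"] = attrs["href"]  # the url property lives in the link target
            self.current.remove("url")

    def handle_data(self, data):
        for prop in self.current:
            self.card[prop] = data.strip()

    def handle_endtag(self, tag):
        self.current = []

parser = HCardExtractor()
parser.feed(PAGE)
print(parser.card)  # {'url': 'http://www.talis.com', 'fn': 'Talis', 'role': 'Library systems vendor'}
```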

See the microformats wiki for a discussion which attempts to create a definition in more practical terms: http://microformats.org/wiki/what-are-microformats

The microformats wiki also has information on specific microformats implementations and open source specifications: http://microformats.org/wiki/Main_Page

Microformats Blog: http://microformats.org

Pew Internet and American Life Project: Usage over Time Spreadsheet

The Pew Internet and American Life Project has released an updated and reorganized version of their "Usage over time" spreadsheet. It can be found in the latest trends section of the project web site. The spreadsheet is meant to serve as a quick reference guide to some of the core data on internet use and online activities that have been gathered by the project since 2000. The spreadsheet can be used to examine changes over time among online Americans in key internet activities such as using email, getting news online, doing internet banking, using search engines, accessing weather information, buying products, pursuing hobby information, making travel reservations, getting sports information, downloading music and other digital files, sending instant messages, and participating in online auctions. The spreadsheet also contains some demographic data for all of those activities, so users can compare trends among online men and women, different age cohorts, and different racial and ethnic groups.

While the data included in the spreadsheet reflects just a small sampling of all the research done by the Project over the past seven years, those interested in digging deeper into its data library can access all of the raw data sets and questionnaire files at: www.pewinternet.org/data.asp. The raw data is available to scholars and analysts to use in their own research.

Pew Internet Project Trends: www.pewinternet.org/trends.asp

Pew Internet Usage over Time Spreadsheet (Excel): www.pewinternet.org/trends/UsageOverTime.xls

Two New Reports from CSHE at UC Berkeley Available

Use and Users of Digital Resources

The Center for the Study of Higher Education (CSHE) at the University of California, Berkeley has announced that the final report of the Digital Resource Study is available: "Use and users of digital resources: a focus on undergraduate education in the humanities and social sciences". Authors: Diane Harley (Principal Investigator), Jonathan Henke, Shannon Lawrence, Ian Miller, Irene Perciali, and David Nasatir.

The purpose of this research was to map the universe of digital resources available to a subset of undergraduate educators in the humanities and social sciences, and to investigate how, and whether, available digital resources are actually being used in undergraduate teaching environments. The study employed multiple methods, including surveys and focus groups. The definition of digital resources is intentionally broad and includes rich media objects (e.g. maps, video, images, etc.) as well as text.

Contents include:

  • Understanding the humanities/social science digital resource landscape and where users fit into it.

  • How are digital resources being used among diverse communities?

  • Faculty discussion groups and faculty surveys.

  • Transaction log analysis and web site surveys.

  • Why study users?

  • Interviews with digital resource providers.

  • Site owners and user researchers meeting.

  • Conclusions.

  • Bibliography.

  • Appendices.

Report description: http://cshe.berkeley.edu/publications/publications.php?id=234

Report (PDF): http://cshe.berkeley.edu/publications/docs/ROP.Harley.DigitalUsers.15.06.pdf

The Influence of Academic Values on Scholarly Publication

Another interesting study report from CSHE is available: "The influence of academic values on scholarly publication and communication practices". Authors: Diane Harley, Sarah Earl-Novell, Jennifer Arter, Shannon Lawrence and C. Judson King.

This study reports on five disciplinary case studies that explore academic value systems as they influence publishing behavior and attitudes of University of California, Berkeley faculty. The case studies are based on direct interviews with relevant stakeholders – faculty, advancement reviewers, librarians, and editors – in five fields: chemical engineering, anthropology, law and economics, English-language literature, and biostatistics.

Report description: http://cshe.berkeley.edu/publications/publications.php?id=232

Report (PDF): http://cshe.berkeley.edu/publications/docs/ROP.Harley.AcademicValues.13.06.pdf

E-Book Platforms and Aggregators: Research Report from ALPSP

The Association of Learned and Professional Society Publishers (ALPSP) has published a new research report which is a comprehensive evaluation of e-book platforms. The report entitled "E-book platforms and aggregators: an evaluation of available options for publishers" was authored by Linda Bennett, who conducted a comprehensive survey of e-book platform providers. The report reveals that the e-book platform has now carved its place as a mainstream vehicle for the delivery of publisher content.

The report explores in depth the attributes and features of the main aggregator platforms. The commercial and pedagogical advantages and drawbacks of e-books are also described from the different perspectives of publishers, librarians and academics. Five distinct groups of e-book aggregators are examined in detail: "general" aggregators; "specialist" aggregators; online journal aggregators who also host e-book collections; digital warehouses; and library suppliers. The options available to publishers considering the development of their own e-book platforms "from scratch" are considered. Three case studies explore the experiences of publishers who have already developed e-book strategies.

Key features include:

  • The comprehensive evaluation of e-book platforms and aggregators.

  • The technologies on which the platforms are based and their precise attributes and functional features.

  • Precise purchase/licensing models that each operates.

  • How DRM and encryption are managed, and how securely.

  • How many titles the aggregator holds.

  • How many customers the aggregator deals with, and in which geographical regions.

  • Pricing models, and how publishers are paid.

  • Financial stability.

  • The types of marketing support offered to publishers.

The report is available for a fee in either print or electronic format and can be ordered from the ALPSP web site.

ALPSP Publication Order Form: www.alpsp.org/publications/pub15.htm

Digital Curation Centre Releases Reflective Self-Evaluation Survey Report

The Digital Curation Centre (DCC) recently undertook an externally moderated reflective self-evaluation to gather evidence of the extent to which the DCC is meeting its objectives. The evaluation used a number of different methods to gather data from a variety of users, potential users and others with an interest in digital curation. Evaluation activity included a detailed DCC staff survey, focus groups, interviews and a public survey.

The report describes the evaluation of the first phase of activity of the DCC, which was established in early 2004 with funding initially from the Higher Education Funding Councils' Joint Information Systems Committee (JISC), coupled from September 2004 with funding from the Engineering and Physical Sciences Research Council (EPSRC).

The report is structured around the three key themes identified in the aims agreed for the evaluation by the Steering Group, namely impact and effectiveness; take-up; and quality of services and resources. A further section reports on suggestions made by survey respondents concerning changes that may be desirable in the DCC's operations, and opportunities that should be considered for exploitation.

The report presents a largely positive view of the achievements of the DCC to date; confirms expert opinion that it is needed in the longer term; and highlights a number of ways in which it can become more effective and have greater impact.

Full text of the report: www.dcc.ac.uk/docs/DCC_Evaluation_Report_Final.pdf

European Commission Calls on Member States to Contribute to the European Digital Library

The European Commission has urged EU Member States to set up large-scale digitisation facilities, so as to accelerate the process of getting Europe's cultural heritage on line via the European digital library. In a recommendation on digitisation and digital preservation released in September, it calls on Member States to act in various areas, ranging from copyright questions to the systematic preservation of digital content in order to ensure long term access to the material.

"Our aim is to arrive at a real European digital library, a multilingual access point to Europe's digital cultural resources", commented Information Society and Media Commissioner Reding. "It will allow, for example, Finnish citizens to easily find and use digital books and images from libraries, archives and museums in Spain, or a Dutchman to find historical film material from Hungary online".

At present only a fraction of the cultural collections in the Member States is digitised. A common effort is necessary to speed up the digitisation and online accessibility of the material in order to arrive at the necessary critical mass. With the recommendation just adopted, the Commission invites the Member States to take concrete steps in this direction.

By 2008, two million books, films, photographs, manuscripts, and other cultural works will be accessible through the European Digital Library. This figure will grow to at least six million by 2010, but is expected to be much higher as, by then, potentially every library, archive and museum in Europe will be able to link its digital content to the European Digital Library. The online availability of Europe's rich and diverse cultural heritage will make it usable for all citizens for their studies, work or leisure and will give innovators, artists and entrepreneurs the raw material that they need for new creative efforts.

The measures put forward in the Recommendation come on top of the financial contribution that the Commission already has set aside for the digital libraries initiative in the EU's Research and Development programmes and in the eContentplus programme. The Commission will co-finance amongst other things a network of centres of competence on digitisation and digital preservation. Europe's libraries, museums and archives are taking the lead in a range of projects starting this year which will add to the building blocks for the European digital library.

The European Digital Library is a flagship project of the Commission's overall strategy to boost the digital economy, the i2010 initiative. The text of the Recommendation on digitisation and digital preservation can be found on the i2010 Digital Libraries Initiative web site at: http://ec.europa.eu/information_society/activities/digital_libraries/index_en.htm
