Article summaries

Library Hi Tech News

ISSN: 0741-9058

Article publication date: 4 July 2008


Citation

Vassie, R. (2008), "Article summaries", Library Hi Tech News, Vol. 25 No. 6. https://doi.org/10.1108/lhtn.2008.23925fae.001

Publisher: Emerald Group Publishing Limited

Copyright © 2008, Emerald Group Publishing Limited


Article summaries

Article Type: Professional Literature From: Library Hi Tech News, Volume 25, Issue 6.

Electronic theses and dissertation (ETD) repositories: What are they? Where do they come from? How do they work?

Kristin Yiotis, in OCLC Systems and Services, v. 24 (2008) issue 2, pp. 101-15

In higher education, theses and dissertations are a cherished academic genre, the quality of students' intellectual products being a key indicator of the quality of the university itself. In particular, a university's standing is based in large measure on its ability to support original research, all of which is dutifully preserved in the library's special collections. Although the introduction of microfilm, typewriters and word processing during the last 50 years has helped increase circulation, in general theses have remained a treasured but underused resource.

“The capture and storage of ETDs”, as the first such project was called, sprang from discussions begun at UMI in America in 1987 in cooperation with the Coalition for Networked Information (CNI) et al. Ten years later the first examples in electronic format were accepted, while others were scanned from paper and microform versions. Then, in late 2006, the documents were migrated to ProQuest's subscription-based ETD database.

Meanwhile CNI engaged with other university partners to “develop and disseminate a standard method of using SGML to make dissertations available online”, submitting a proposal to the federal US education department. In addition to the obvious argument that “opportunities to unlock valuable university resources” were being missed, another important premise supporting the proposal was that, of the students receiving the 400,000 postgraduate degrees in the USA each year, few had the ICT literacy skills necessary to contribute fully to a future in which electronic publishing and networked information systems would be the norm. As a result of a successful bid, winning both government and corporate funding, the Networked Digital Library of Theses and Dissertations (NDLTD) was established.

In 1999 the NDLTD's further objective of “developing and testing models to arrive at standards for document formats and interoperability” attracted the attention of UNESCO, whose international mandate includes facilitating the “free exchange of ideas and knowledge”. This new initiative in turn led to the production of The guide for electronic theses and dissertations (http://etdguide.org).

With ETDs becoming an institutional requirement across the USA, benefits are already being felt in the following areas:

  • student knowledge of how to contribute to and use digital libraries;

  • development of university digital libraries;

  • global sharing of university research; and

  • higher quality of postgraduate dissertations.

And, of these, the prime beneficiaries seem to be the students themselves, with one university reporting 1,565,151 downloads in one year from its initial collection of 3,393 ETDs. From UNESCO's perspective, the benefits will eventually spill over to society at large. Yet others argue the case for institutional ETD repositories as marking the move of libraries from mere custodians of knowledge to the highly visible vanguard, projecting their institutions' knowledge creation capabilities.

In most academic libraries, ETDs form a subset of each university's wider digital repository, and different institutions have set about populating theirs in different ways. Some, like Johns Hopkins University, after evaluating current ETD publishing systems such as DSpace, ePrints, DPubS and DiVA, allow students from all faculties to participate, while continuing to require a hard copy as well. Elsewhere, as at Vanderbilt University, departments have volunteered to make it a local requirement. At the University of Kentucky, the choice of electronic or hard-copy format is left to the student's discretion. At Virginia Tech, however, paper copies have been dispensed with altogether. In all this, there are serious implications in terms of student training. At the California Institute of Technology, students are “walked through” the submission process online, from thesis regulations to faculty approval, whereas Virginia Tech offers its students workshops.

Developed at the University of Southampton, ePrints is the original open-source repository software, which assumes that faculty and students upload their own research directly. MIT's DSpace package, which is used not only in North America but also in Japan, Africa, India and the UK, is considered by some to be more flexible and robust because it makes fewer assumptions about the type of digital object being uploaded. Whichever software platform is chosen, and whether it is implemented by a university or, like ProQuest's, by a commercial organisation, all adhere to the 1999 interoperability and metadata harvesting standards known as the Open Archives Initiative. These provide for standardised metadata tags and unique identifiers stored in SGML/XML.
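The Open Archives Initiative's harvesting protocol exposes each repository item as a Dublin Core metadata record under a unique identifier. As a minimal sketch of what a harvester does with one such record, the following Python fragment parses an invented GetRecord response; the repository name, identifier and field values are hypothetical illustrations, not drawn from the article:

```python
import xml.etree.ElementTree as ET

# Hypothetical OAI-PMH GetRecord response fragment for an ETD.
# The identifier and Dublin Core values below are invented for illustration.
SAMPLE_RESPONSE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <GetRecord>
    <record>
      <header>
        <identifier>oai:example.edu:etd-1001</identifier>
        <datestamp>2008-01-15</datestamp>
      </header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>A Sample Electronic Thesis</dc:title>
          <dc:creator>Doe, Jane</dc:creator>
          <dc:type>Electronic Thesis or Dissertation</dc:type>
        </oai_dc:dc>
      </metadata>
    </record>
  </GetRecord>
</OAI-PMH>"""

NS = {"oai": "http://www.openarchives.org/OAI/2.0/"}
OAI_DC = "{http://www.openarchives.org/OAI/2.0/oai_dc/}dc"

def extract_record(xml_text):
    """Return the record's unique identifier and its Dublin Core fields."""
    root = ET.fromstring(xml_text)
    # The header carries the repository-wide unique identifier.
    identifier = root.find(".//oai:header/oai:identifier", NS).text
    # The metadata section carries the standardised Dublin Core tags.
    dc = root.find(".//" + OAI_DC)
    fields = {child.tag.split("}")[1]: child.text for child in dc}
    return identifier, fields

identifier, fields = extract_record(SAMPLE_RESPONSE)
```

A real harvester would fetch such XML over HTTP from a repository's OAI-PMH endpoint and page through ListRecords responses, but the parsing step shown here is the same.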

With all this development in ETDs, however, five key concerns have been identified, which each institution must address:

  • ownership of property rights;

  • the amount of access permitted;

  • the relationship with publishers;

  • plagiarism; and

  • costs.

Essentially, property rights do not change merely because a thesis or dissertation is submitted or stored electronically rather than on paper. However, care is required to ensure that the desired improvement in access does not deprive the owner, be it the author, the sponsor or the university, of her/his/its rights. The relationship with publishers is likewise problematic, since publishers may avoid works which are freely accessible elsewhere in a similar, prepublished form with more or less identical content. To secure this aspect of ETDs, various levels of access must be applied, ranging from worldwide access to the entire document to no access to any part. On plagiarism, while this is made easier by ETDs, so too is its detection. Regarding costs, institutions must recognise that, while electronic submission on its own may be cost-neutral for students, requiring both electronic and paper copies is likely to impose an extra financial burden. Equally, consideration must be given to the impact on university personnel and infrastructure.

Finally, in embracing this change, the unresolved issue of long-term electronic archival standards cannot be ignored. While PDF has been accepted as a preferred document format by the Government Printing Office (GPO) in the USA, and as a low-cost solution for ETDs, requiring minimal training and start-up costs, reliance on proprietary software poses serious challenges for future compatibility. While the GPO suggests transferring all publications to HTML for text and TIFF for images, and while others are exploring SGML or XML as alternatives to PDF, there are inherent difficulties, particularly with SGML, which is seen as too complex. As regards XML, attempts thus far have uncovered the problem of devising a single, universal tagging schema (including musical, mathematical and scientific notation) capable of capturing all possible elements found in ETDs across all disciplines. Yet despite all these issues, a broad consensus favours the further implementation of ETDs as adding value both to the postgraduate experience and to the potential of such research to impact on society.

Google Scholar and academic libraries: an update

Karen A. Hartman, Laura Bowering Mullen, in New Library World, v. 109 (2008) issue 5/6, pp. 211-22

The appearance of Google Scholar in November 2004 gave rise to considerable soul-searching on the part of academic librarians. Keen to evaluate its potential value to users, they were nevertheless perplexed as to how to integrate this broad-based resource into their carefully crafted categories of subject-specific subscription services. Or might a place be found for it in the general alphabetical listing of all citation resources, catalogues and subject guides? Undoubtedly attractive to users, not least on account of its ability to connect them to local subscribed content, its free status gave librarians qualms, because they feared they would consequently be unable to steer its development.

At Rutgers University in New Jersey, after the decision had been taken to add the website to the libraries' approved collections and services, and to share with Google Scholar details of journal subscriptions, a study was conducted during the summer of 2005 to determine the extent of integration among all 113 members of the Association of Research Libraries (ARL). This study found that only six ARL members had linked to the resource from their homepages, with a mere 27 including a link from their alphabetical list of indexes and databases. In addition, 14 had incorporated it among their subject guides, while the website appeared in the OPACs of only six.

Two years later, in 2007, with the onset of what Siva Vaidhyanathan termed “The Googlization of everything”, and university libraries more ready to recognise the value of free resources, a further study was launched to gauge how much had changed. To begin with, as a product, Google Scholar remains a beta version; it still does not divulge its sources or partner publishers; it includes a fair amount of non-scholarly content; it appears to be updated only inconsistently; it does not allow Boolean searching; it lacks a controlled vocabulary or any authority files for authors' names or journal titles; there is no choice of how to sort results, and only limited means of exporting citations. Furthermore, the continued exclusion of the academic library community from the development of Google Scholar leads to inevitable questions on just how “scholarly” the search results really are.

However, on the positive side, librarians have been able to demonstrate increasingly high levels of usage through the numbers of authenticated links to subscribed resources. Another factor is the evident comfort that users have with Google products generally, coupled with the high-yield discoverability of materials, especially in interdisciplinary topics, owing to the sheer quantity of full-text sources indexed. Two key desirables have been identified, both of which Google Scholar meets:

  • lowering the real or perceived barriers to access; and

  • making information available when the user needs it.

In addition, the concern over the academic level of content is being addressed, for example through the participation of major publishers such as Elsevier. And the likelihood is that more and more publishers will come on board as the direct correlation between discoverability and usage prompts authors in turn to insist on having their research listed.

Turning to the published literature, writing on Google Scholar continues to concentrate on usability rather than on decisions about whether or not to include it on library websites. Those who do mention it point out the issue of its “fuzzy” cross-category nature. At the same time, there is a realisation that ARL websites should move away from the traditional “about the library” emphasis and instead focus on library users' information-seeking activities, ensuring resources are never too many clicks from the homepage. The results of the authors' own follow-up analysis are shown in Table I.

Table I

Other recent research compares Google Scholar with commercial federated search products, and also with tools that provide citation analyses. On citation analysis, one such tool is Harzing's “Publish or Perish”. With results based on Google Scholar, it is proving a popular alternative to subscription-based tools. In the fields of science and technology, despite the observed inconsistencies in Google Scholar's “cited by” listings, it has a strong following. Among its positive features are its cross-disciplinary search capabilities (e.g. “later-life migration”) when compared to the likes of Web of Science and Scopus, and also its deeper penetration of books and open-access grey literature, such as preprints and conference proceedings.

On the issue of federated searching, Google Scholar's branded status as an effective enough, one-box “place to start” has been recognised by librarians for their users, who can be confused by the plethora of options available. Indeed, about half of the ARL institutions surveyed appeared not to be using such commercial products as Ex Libris' Metalib or Serials Solutions' “360 Search”. A small-scale study of undergraduate use of Google Scholar compared with Metalib at Uppsala University may provide an underlying reason for this. It found that the free product “performed better in almost all measures”, with complexity of use detracting from perceptions of the commercial alternative.

To conclude, Google Scholar is considerably more pervasive on ARL-member websites now than two years ago. Even compared with a comparable free academic search tool, the authors' own findings show that the advantage of less than ten percentage points that it held over Scirus in 2005, in terms of mentions in the alphabetical list of databases and indexes, had leapt to 40 per cent by 2007. Librarians' earlier reticence over free products has largely disappeared. However, a clear need for effective library instruction remains, as does the challenge to reduce the number of complicated lists which users face, and the amount of library jargon.

Internet abuse and possible addiction among undergraduates: a developing concern for library and university administrators

James Castiglione, in Library Review, v. 57 (2008) no. 5, pp. 358-71

The internet provides the infrastructure for the deployment of a significant proportion of the virtual learning environments, learning materials and information on which contemporary higher education depends. At the same time, it is also the means by which students access online role-playing games (ORPG) and the like. In 2006 it was estimated that those young people who engaged in ORPGs spent on average 20 hours per week on them, not including email, chatting and other internet-related activities. And this excessive use, which may lead to “academic impairment”, is now a source of concern to university administrators, faculty, librarians and medical staff.

Internet use is defined as inappropriate if “it results in impaired functioning such as compromised grades or the failure to fulfill responsibilities”. In the absence of a conclusive body of evidence on the effects of internet use, concerns are based partly on analogy with 1980s studies of leisure-time television, where viewing of up to 10 hours was found to be optimal; anything above that, it appeared, tended to displace constructive educational activities. While not denying the positive benefits of internet use, the evidence for potential negative outcomes of inappropriate internet use includes: repetitive strain injuries; social isolation; interference with appropriate eating patterns; obesity due to decreased physical activity; and, of course, reduced academic achievement.

Already there is research correlating late evening internet use and sleep disturbance with a decline in academic performance. However, “video game addiction is not considered a mental disorder at this time”, according to the American Psychiatric Association, which has nevertheless flagged up the need for hard evidence of craving, compulsion, loss of control and “persistence in the behaviour despite accruing adverse consequences” for further analysis by 2012. In the meantime there is growing awareness that a problem exists, particularly among undergraduates.

Routine self-assessment tests of first-year students' academic skills and ability to cope with their studies have shown that those with positive perceptions of personal efficacy are less likely to drop out of education. Those with negative perceptions are more likely to become depressed or to experience a reduction in productive thinking due to anxiety about possible failure. Among the recognised avoidance strategies in this latter group are alcohol and drug abuse. However, while these forms of abuse tend to attract opprobrium and are expensive, excessive non-educational internet use is not currently perceived as socially unacceptable and can be carried out openly at little or no cost to the student because of the ready availability of free campus-wide wireless connectivity in libraries, in teaching areas, as well as in residential accommodation.

Among the early signs that the decades-old government policies behind the uncritical computerisation of school and university education in Europe and America constituted a problem was poor attendance at major cultural events. On one campus, an investigation found that 43 per cent of failed students showed “excessive patterns of late-evening logins to the university computer system”. At the same time, lecturers have begun to notice a growing incidence of distracting and intrusive internet use by students during lectures.

Clearly, something in the higher education system, which has traditionally presupposed and increasingly demands an aptitude on the part of students for “self-regulated and mindful learning”, needs changing. However, a review of library literature provides no guidance at all on the development of training modules in ICT literacy, etc. aimed at raising students' capacity for self-regulation or offering advice on appropriate levels for recreational internet use. To tackle this deficiency, it is proposed that librarians take the initiative by establishing project teams involving faculty members, administrators and student representatives.

Although excessive inappropriate internet use has yet to be formally designated a mental disorder or addiction, the author proposes drawing on what has worked for substance abuse in effecting positive behavioural change, namely targeted messages at frequent intervals “without being intrusive”. Among the methods proposed are: advertisements in student newspapers; announcements on university radio; posters placed around the library and campus; and, most controversially, the use of messages scrolling across the bottom of network-connected computer screens.

Ultimately, the goal of further research and action by librarians and university administrators on the issue of non-educational internet use should not be to thwart technological progress but rather, through raising student awareness, to direct students towards appropriate use, thereby reducing the incidence of negative educational outcomes.

Books as inventory: suggested lessons from business

James W. Marcum, in The Bottom Line: Managing Library Finances, v. 21 (2008) issue 1, pp. 14-16

Heretical though it may seem to the traditional academic librarian, one who comes into library administration with prior experience in a car dealership can but admire the usage statistics of small public branch libraries, whose collections of 50,000-100,000 volumes circulate on average two to three times per year. During the difficult economic climate of the 1980s, selling new cars was not the easiest way to make money. The two main problems lay in (a) tying up too much cash in spare parts and (b) deciding whether to hang on to an over-priced trade-in for six months or to sell immediately below cost, take the loss and use the cash to finance a hopefully better deal next time.

So, how do the problems of car salesmen relate to libraries, especially within universities? To begin with, the just-in-case concept still pervades book selection. The more that is published, the more the pressure grows to acquire at increasingly unsustainable rates. Second, most libraries still live in a big-is-beautiful world, where volume count remains a core measure of collection “goodness”. Third, to dispel the perception that library directors lack any business sense, high demand should take precedence over the “long tail”, focusing on what current users want and looking to collaboration, print-on-demand and digitisation to plug gaps as and when required. Fourth, shelves stuffed with un[der]used books and print journals, in other words excess inventory, limit choices and proscribe change, not least in the direction of ICT. And finally, one has to remember that the use, and not the number, of books provides the best evidence of meeting stakeholders' needs: inventory turn signals high value.

One key difference between information and physical objects, including spare parts for cars, is that the latter can only ever be in one place at any one time. In many, if not most, cases, information comparable to that found in an item out on loan can be located in another book, an article, or a website. Keeping specific volumes on open access, unused perhaps for decades, just in case is a throwback to a bygone age, when books were rare, and their very existence seemed to inspire reverence. But that is the past. Now information is abundant. Like the car salesman who paid too much for a trade-in vehicle that no one wants to buy, accept that a mistake has been made: do not keep it, get rid of it. Make space for something that attracts more custom: in the library context, that might even mean dismantling shelves to make room for a café.

To every rule, there is an exception. Freud's Interpretation of dreams, for example, apparently only sold 351 copies in the first six years following publication. But generally, unused books are “dead inventory”; and having 90 per cent of holdings sitting idle may bring a glow to the hearts of a few faculty and members of the friends of the library group, but it creates a fog preventing today's “net generation” from making any sense of the library's purpose.

Roderic Vassie
