Issues in Science and Technology Librarianship
Summer 2006
DOI:10.5062/F42V2D1H


Wells Reserve Research Library: A Use and User Analysis

Bevan Angier
MLIS candidate
Syracuse University
Syracuse, New York
beangier@syr.edu

Anne Piergrossi
MLIS candidate
Syracuse University
Syracuse, New York
akpiergr@syr.edu

Kathleen Spahn
MLIS candidate
Syracuse University
Syracuse, New York
kmspahn@syr.edu

Abstract

This study focuses on the collection of a small science library within the National Estuarine Research Reserve System. It reviews the history, context, and existing constraints on the collection and its development and suggests ways in which the library might overcome barriers in its efforts to obtain grants for expanding its print holdings.

The library plays a three-fold role in the local community: it hosts both staff and visiting scientists conducting estuarine research; it educates the local community -- schoolchildren, adults, and policymakers; and it advocates for sensitive and sustainable stewardship of the land in partnership with local organizations and policymakers.

The Education Director has been repeatedly turned down in her attempts to obtain grant money because the library is too new (to the public) to demonstrate any user base or long-term viability. Granting organizations suggested she come back for reconsideration when rising usage statistics could be demonstrated.

This study proposes a method by which the Wells Research Library can measure the use of its collection and more accurately identify its user base, bringing together a variety of tools and methods into a single cohesive measurement process. Once established as a routine, the regular collection of data can demonstrate patterns and changes in the library's use and user base over time, giving the library concrete figures to present to granting organizations. This system can be adapted by any small specialized library interested in measuring its users and usage on a regular basis -- particularly one connected to a nonprofit or parent organization.

Introduction

The National Estuarine Research Reserve System (NERRS), established in 1972, consists of 26 protected coastal regions across the United States. A partnership between the coastal states and the National Oceanic and Atmospheric Administration (NOAA), NERRS "...protects more than one million acres of estuarine land and water, which provides essential habitat for wildlife; offers educational opportunities for students, teachers and the public; and serves as living laboratories for scientists" (National Estuarine Research Reserve System 2006). The various sites are managed by local state organizations and receive some funding and assistance from NOAA. Five of the NERRS institutions have on-site libraries.

The Wells Reserve, in Wells, Maine, joined NERRS in 1986. Located on the 1,600-acre site of Laudholm Farm and containing fields, forests, wetlands, sand beach, dunes, and rivers, the Wells Reserve is overseen by the nonprofit organization Laudholm Trust.

The Wells Reserve plays a three-fold role in the local community: it hosts both staff and visiting scientists conducting estuarine research, particularly in relation to the restoration and maintenance of these coastal environments in the face of development, sprawl, erosion, and pollution; it educates the local community -- schoolchildren, adults, and policymakers alike -- about coastal ecology; and it advocates for sensitive and sustainable stewardship of the land in partnership with land trusts, conservation commissions, and local policymakers.

In any given year, as many as 100 scientists may visit Wells to conduct research, and the Reserve will typically direct or participate in around 20 studies involving scientists, students, and staff from the Reserve and from other academic and environmental institutions. Wells staff also create and lead more than 20 different interpretive tours of the habitat for all age groups and interests, with a consistent focus on environmental stewardship and issue awareness. One of the most ambitious outreach programs promises at least one day of coastal ecology education for all local elementary school students.

About six years ago, the Director of Education began to consolidate the miscellaneous materials used by researchers, volunteers, local educators, and staff into one location. Over the years the Reserve has accumulated approximately 3,000 items, mostly through donations. A selection policy has kept the collection focused, but a lack of funding prevents any planned expansion of the collection.

The collection, whose primary users are researchers, docents, and local educators, is cataloged online through a regional OPAC known as Minerva. This consortium of more than 85 Maine libraries provides electronic catalog access to more than six million library items through Innovative Interfaces' Millennium Cataloging software. On April 13, 2005, the Wells Research Library formally announced its participation in Minerva, making its collection electronically visible for the first time.

The Education Director has been repeatedly turned down in her attempts to obtain grant money because the library is too new (to the public) to demonstrate any user base or long-term viability. Granting organizations suggested she come back for reconsideration when rising usage statistics could be demonstrated.

This study proposes a method by which the Wells Research Library can measure the use of its collection and more accurately identify its user base, bringing together a variety of tools and methods into a single cohesive measurement process. Once established as a routine, the regular collection of data can demonstrate patterns and changes in the library's use and user base over time. This system can be adapted by any small specialized library interested in measuring its users and usage on a regular basis -- particularly one connected to a nonprofit or parent organization.

Previous Research

We approached previous studies in three ways: by measuring the recorded usage of existing library materials and resources; by investigating the user base through a consideration of user needs and satisfaction; and by placing the Wells Reserve Library into a larger context of similar libraries to help in evaluating both the quality and quantity of its resources.

Use Studies

"Use as an activity is still the most valid measure of any item's worth to a library or information system..." (Burns 1978). Much of the literature on the subject of measuring library usage has to do with issues of collection maintenance -- identifying the most- or least-used areas of the library. This review is limited to those studies that might be useful as tools in measuring the evolving use patterns of a small emerging library. It does not include studies that focus on use of electronic resources, as these are extremely limited in the current configuration of the Wells Reserve Library.

In-Library Use Data

Biggs' (1990) framework for categorizing methodologies includes "touch techniques," "reshelving techniques," and questionnaires or interviews. Each has advantages and disadvantages.

Touch techniques count anything moved for any purpose in a given period of time. Though time-consuming and prone to both under- and over-counting, the technique's primary advantage lies in not requiring patron participation.

Reshelving methods record items as they are put away and rely in part on patron participation. Numbers can be tallied either as "all or nothing" or in greater detail. Problems include an inherent underestimate of use, as it relies on patrons leaving books on tables, and count deflation if more than one person uses a book before it is reshelved. Because of the potential for distorted figures, it is recommended that this method be used in conjunction with other methods.

More difficult, but with the potential for richer, more accurate information, is the use of questionnaires or interviews. Biggs notes several potential pitfalls: the difficulty of selecting a representative time period and pool of users; the challenge of creating a valid questionnaire that will elicit easily interpreted answers; and self-conscious use of materials by patrons aware of the survey.

Circulation Data

In addition to simple counting of numbers of items circulated, Van House et al. (1987) include turnover rate as one statistical measure of a library's output.

Christiansen, Davis, and Reed-Scott (1983) note that the advantages of circulation studies are their flexibility of study/sample size and of data analysis, their easily countable units, and the objective nature of the information. There are disadvantages as well: the data reflect successes, not user failures; they exclude in-house use and under-represent actual use; and they fail to identify low use caused by a poor-quality collection.

As a counterpoint, Mason (1998) cautions against an over-reliance on the use of circulation data as a gauge of library use and suggests that electronic catalogs, databases, electronic encyclopedias, and the web itself have become core services. Her point is a valid one for Wells Reserve Library to consider, as data gathered solely about Wells Reserve Library's "traditional" resources might serve to reinforce the limitations under which it currently operates.

Interlibrary Loan Data

This aspect of the Wells Reserve Library usage statistics might reveal some specific inadequacies that could be corrected through careful acquisitions of new materials. It also would add weight to any requests for funding if patterns of weak areas were to emerge. Van House et al. provide a tally sheet to be filled out with a variety of codes to identify areas. Ochola (2002) demonstrates both use and non-use of areas of a collection by combining use factor with the percentage of expected use and ratio of borrowings to holdings.
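To make these measures concrete, the calculations Ochola combines can be sketched in a few lines. The sketch below uses the standard collection-evaluation definitions (use factor as a class's share of circulation divided by its share of holdings, percentage of expected use as that ratio expressed as a percentage); the subject classes and counts are invented for illustration.

```python
def collection_use_metrics(circulation, holdings, ill_borrowings):
    """Per-class use metrics in the spirit of Ochola (2002).

    circulation:    {subject class: items circulated}
    holdings:       {subject class: items held}
    ill_borrowings: {subject class: items borrowed via ILL}
    """
    total_circ = sum(circulation.values())
    total_held = sum(holdings.values())
    metrics = {}
    for cls in holdings:
        circ_share = circulation.get(cls, 0) / total_circ
        hold_share = holdings[cls] / total_held
        use_factor = circ_share / hold_share
        metrics[cls] = {
            # 1.0 means the class circulates in proportion to its size
            "use_factor": use_factor,
            # 100 means observed use matches expected use exactly
            "pct_expected_use": 100 * use_factor,
            # Heavy borrowing relative to holdings flags a weak area
            "borrow_to_hold": ill_borrowings.get(cls, 0) / holdings[cls],
        }
    return metrics
```

A class with a percentage of expected use well below 100 but a high borrowings-to-holdings ratio is exactly the kind of documented gap that would strengthen an acquisitions-focused grant request.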

Combination Methods

Combined methodologies are a mix of questionnaire, interview, observation, and circulation records. Examples include the Pittsburgh Study (Galvin and Kent 1977) and a smaller study by Kairis (2000) comparing use of donated materials to purchased materials.

Drawing on the literature of measurement and evaluation, Nicholson (2004) has developed a theoretical model enabling libraries to obtain a more holistic self-understanding. Under its current budget constraints, the Wells Research Library does not have the resources to build such an extensive system. However, as Nicholson suggests, his framework "should be seen as a guide to the selection and implementation of measurement and evaluation procedures rather than a detailed process that must be followed without deviation." The establishment of routine procedures to collect and analyze use and user data represents a first step in that direction.

User Studies

A review of the current literature indicates that user surveys are at the crux of modern library assessment, particularly when they focus on outcomes as opposed to outputs. Oltmanns (2004) suggests that the use of assessment in regular library activities assures an understanding of customer needs and eventual satisfaction of those needs. According to Oltmanns, using assessment regularly helps libraries to predict and anticipate future needs while remaining flexible enough to manage the library's response to change easily and effectively.

Community and Organization Analysis

Much of the user survey literature discusses confusion about the terms "use" and "users" when referring to surveys or studies. Powell (1988) states that "...user studies should focus not on what libraries do, but on what people do, or wish they could do if they could obtain the necessary information." According to Powell, user surveys fall into two categories: in-house surveys of users and community analysis. He considers both users and non-users, concludes that surveying non-users results in "non innovative" suggestions, and recommends that libraries concentrate on users, collecting data on "characteristics of the users, their purposes for using the library, services used and level of satisfaction, reasons for dissatisfaction, materials used and availability, users' search patterns, additional library services needed as perceived by users and priority of services."

Pungitore (2001) defines community information needs analysis as "a structured, planned, formal study that identifies the information requirements of the people within a library's jurisdiction," noting that "...needs analysis consists of 10 steps: laying groundwork, preparing to conduct the study, framing questions and choosing tools, designing data-gathering instruments, using in house data, launching a study, analyzing results, sharing results, and acting on results."

Kaarst-Brown et al. (2004) discuss the relevance of identifying the unique characteristics of an organization's culture to help identify which are relevant to future success and growth, a concept that is certainly applicable to a library trying to discover who its users are and why they use this particular library.

User Perceptions and Needs

Cullen (2001) analyzes a number of surveys using an evidence-based approach that examines differences in perceptions between user and manager expectations. Cullen draws a connection between service quality and user satisfaction and discusses how user surveys can determine this. Hernon and Altman (1998) maintain that assessing the quality of customer service is an absolute necessity in today's competitive environment. Assessment must identify the gap between a library's own internal standards and customer satisfaction. They highlight three important questions to be answered: "What kind of reputation does your library have? How well does that reputation match the one that the library staff want? What is being done to improve the reputation?" Service quality is defined as "...the relationship between the library and its users, in terms of customer expectations and how the organization meets or surpasses them; as involving long-term attention to customers' expectations; and as a reputation that is known by the users and existing and potential funding sources." Hernon and Altman suggest using a five-point customer satisfaction survey as well as focus groups, feedback, and studies of "customer use and information seeking behavior."

Methodologies/Focus

A recent movement towards accountability in public libraries has made the application of cost-benefit analysis a reality, according to Holt and Elliott (2003). As libraries continue to be held accountable they need to measure not just inputs and outputs but outcomes as well. These authors designed a project called CBA I in the mid-1990s to measure large public library services and CBA II in 2002-03 to measure medium and small libraries. Holt and Elliott found that using the economic measurement tool of cost-benefit analysis and applying it in a library setting successfully measured service outcomes. The Library's Continuous Improvement Fieldbook (Laughlin et al. 2003) also discusses survey tools, such as the affinity diagram, check sheet, criteria rating scale, operational definition, and plus/delta, that can be used to measure outcomes.

Determining the bottom line of the total operational cost is crucial for a library seeking outside funding. Forrest (2003) discusses this trend towards outcomes assessment and recommends "a commitment to systematic and ongoing collection, compilation, analysis and reporting of data about inputs, outputs, and outcomes" to develop a culture of assessment in a library.

The University of Washington libraries have been assessing user needs, satisfaction, and library performance since 1992, and Hiller (2001) discusses the many approaches employed over the years, including large-scale user surveys, informal methods such as suggestion boxes and service desk comments, meetings between faculty and librarians, focus groups, and finally LibQUAL+ (an outcomes measurement tool adapted specifically for libraries from SERVQUAL and frequently used to determine customer satisfaction based on quality of service). In an era of dwindling budgets, Eng and Gardner (2005) suggest reviewing other surveys and adapting them to meet individual library needs by focusing on performance and satisfaction.

Morris and Barron (1998) suggest a constant dialogue with library patrons as a means to uncover needs and work towards achieving user satisfaction. They discuss the need to put customers at the center of public services and emphasize the four themes of "quality, choice, standards, and value." They also point out the positive aspects of using specific communication methods such as comments and complaint procedures, surveys of users and non-users, user consultation, staff feedback, and the development of library charters in which customers are central.

Comparative Institutional Studies

To provide comparative context, other estuarine research libraries, science-technology libraries, and special libraries were examined in order to determine their administrative approaches and how they conduct evaluations and assessments of their collections and users.

Other Estuarine Libraries

Two enlightening studies evaluate, respectively, the teacher development programs in NERRS institutions in general, and the training needs and a market inventory for the Jacques Cousteau NERR in New Jersey. The studies were performed by independent consulting firms under contract to NERRS and the Jacques Cousteau NERR, and are posted as PDF documents on each library's web site.

These studies show that education (both K-12 and adult continuing education) forms the primary focus for library activities at these institutions, with seemingly less emphasis placed on serving researchers or scientists.

The estuarine institutions gain funding opportunities from partnerships, particularly with local universities (which provide access to resources and funding) and also with local schools, nonprofits, and community decisionmakers. The libraries are also used as a community hub, with auditoriums or public spaces available for distance learning or video conferencing (seminars, presentations, etc.).

Sci-Tech Academic Libraries

The theme of partnership continued in the sci-tech library literature as well. Broome's (2004) study highlights a particularly creative application of resource sharing. Georgia Southern University, which does not have the resources to support a new library or full-time librarian, instead provides sci-tech resources to their students through partnering agreements with other state universities and a shared librarian "circuit rider" consultant.

Special Libraries

Because of their very nature as parts of a larger organization, special libraries have long been forced to quantify and measure their performance. The literature reflects this focus on assessment. Langley and Martinez (1999), librarians at Duke University's Science Libraries, found that performing cost-benefit analysis was one way to prove value to the organization and plan for the future. Gohlke (1997) discusses competitive benchmarking as a means to evaluate and improve a library's performance, and Weiner (2000) advocates return on investment (ROI) analysis as another evaluation method. Paris (1990) raises the important issue of buy-in from management, which can be developed in part through the gathering of quantifiable, measurable statistics supporting the value of an in-house library. Keyes (1995) exhorts librarians to run their library like a business and consistently demonstrate value to the larger organization. Powers (1995) encourages the creation of an annual report and integrated marketing plan to help a library define its place within an organization and assist in fundraising activities such as grant writing.

A recent overview of zoo and aquarium libraries by Barr (2005) provides useful parallels for Wells: these libraries share its narrow subject focus, generally small size, and attachment to a larger nonprofit institution. Most zoos and aquariums have missions emphasizing education, research, and conservation, and they cannot gain accreditation unless they have a library supporting those missions. Like NERRS institutions, these libraries serve staff, outside researchers (students, scientists from other institutions), and the public. Barr ties the quality of the library to the reputation of its institution, which, if it does not "...keep up with advances in knowledge will lose credibility and ultimately suffer, very likely, with effects on attendance and funding." Zoo/aquarium libraries also participate in creative partnerships with related academic or community institutions in order to provide resources in the face of low budgets and rising costs. Finally, Barr's survey revealed that these libraries are moving towards professionalization, with nearly 20% of responding libraries reporting a librarian with an MLS. This can be compared to an earlier study by Gibbs (1993) regarding the qualifications of special science librarians, in which all 12 special librarians interviewed in the North Carolina Research Triangle held the MLS degree.

Sources of the Data

Data analysis could be drawn from three places: use of the collection, user and nonuser information, and benchmark studies of other estuarine libraries.

The two main measures of usage data are to be derived from circulation and interlibrary loan requests as tabulated in the regional OPAC system and through tallies. Circulation data are to be examined in terms of resources checked out and resources used in-house. Analysis of interlibrary loan data should be performed to shed light on which materials are most used and where the collection has gaps or weaknesses.

Surveys of reserve staff; local educators and students, both K-12 and college-level; scientists and researchers; other environmental nonprofits in the region; local government and community decisionmakers; and a random sample of the community in general will result in a comprehensive understanding of who is and is not using the collection.

By contacting staff and/or volunteers of other estuarine libraries, the Wells Reserve Library should develop benchmark data to compare the usage and users of other libraries with their own.

How to Elicit the Data: Library Use

To begin, the Wells Reserve Library will need to collect and analyze data to represent the actual usage made of library materials and resources. This does not attempt to place a value on the collection itself, or to examine the patterns of use as they relate to individual items in the collection. Since no measure of unsuccessful research or non-use of resources is included here, no conclusions can be drawn about how well the collection is serving its intended community. Through analysis of interlibrary loan data such as Ochola proposes, it is possible that some conclusions could be drawn about strengths and weaknesses of the collection. This is not within the scope of this proposal, but is something the Wells Reserve Library may want to pursue at a later time.

Mindful of the disadvantages of circulation data described by Christiansen, Davis, and Reed-Scott, a variety of in-house use measures have been included. We hope that the combination of the more subjective measures of the user data (satisfaction, etc.) with the quantitative data here will also shed light on areas where improvements might be made.

Every effort has been made to minimize staff involvement in data collection because of the absence of professional staff and the limited hours that the library is open to the public.

The Wells Reserve Library has joined a statewide online public access catalog known as Minerva. Although the patron interface to the holdings is quite basic, the catalog is built on software from Innovative Interfaces, Inc. Known as the Millennium Web Interface, this library system software permits sophisticated collection of both usage and collection data. The methodology used in this study focuses on the usage components, but it is worth noting that extensive analysis can be done of a library's holdings and the circulation of those holdings at the item level.

Currently the Wells Reserve Library collects only a minimum of information from its registered users (name, address, phone, e-mail), and identifies only two types of user -- staff and public. The Millennium system permits the addition of multiple "Patron Statistical Codes" (Pcodes) that are used to identify subsets of users for collecting statistics based on patron characteristics. Those Pcodes are library-defined, so they can reflect any characteristics that would be useful in the gathering of data. The Millennium system can generate reports on registered patrons ranging from a simple count to an analysis of circulation and collection use based on Pcode.

The Wells community is difficult to characterize with any accuracy since it reflects an interest group rather than a geographic area. In addition, while the focus of interest is narrow, the patron perspectives are varied. Maintaining a registered user population without any identifying characteristics limits the library's ability to assess its collection as it reflects the community's needs. We suggest that the Wells Reserve Library begin to include more detailed information in its patron registrations in order to be able to better understand both the user population and the usage patterns that will appear over time. For example, simply identifying age and education level might be a quick indicator of the level of materials most likely to be used by patrons. The collection of usage statistics will be accomplished through the library's participation in the Innovative Interfaces system. Preformatted reports are available through a simple user interface, and more customized reports can be generated by exporting data to an Excel spreadsheet.
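Once Pcodes are in place and circulation data can be exported, even a short script can cross-tabulate use by patron type and shelving location rather than relying on manual spreadsheet work. A minimal sketch follows; the column headers ('PCODE', 'LOCATION') are placeholders, not Millennium's actual export field names, and would need to be matched to the real export.

```python
import csv
from collections import Counter

def checkouts_by_pcode(export_path):
    """Tally exported circulation transactions by patron statistical
    code and shelving location.

    Assumes a CSV export with one row per checkout and hypothetical
    column headers 'PCODE' and 'LOCATION'; adjust both names to
    match the library system's actual export format.
    """
    tally = Counter()
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            tally[(row["PCODE"], row["LOCATION"])] += 1
    return tally
```

Run over successive sampling periods, a tabulation like this would show, for example, whether docent materials are circulating mainly to staff or to the public, the kind of pattern Pcodes are meant to surface.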

Two main areas will be considered during data collection: circulation of materials (both owned, and borrowed or loaned through ILL) and in-house use of resources (materials, computer use, and reference). In all cases, due to the small size of Wells Reserve Library, complete counts will be made rather than sampling a subset of transactions.

Circulation

Circulation reports can be generated through Millennium Web Management Reports, which show raw numbers and percentages for checkouts, check-ins, renewals, holds, recalls, and totals. Time periods range from a single day up to the prior 36 months. Items can be sorted by terminal (library) and by shelving location; in the case of Wells, location would indicate whether an item was from the general collection, docent materials, AV, or reference.

Drawing from the measures and calculations described by Van House et al., we plan to generate the following reports:

  1. Circulation Activity by Location (WRL) breaks out by patron type, shelving location, item type (medium), and call number. Again, this can be collected for varying time periods.
  2. Interlibrary Loan Report shows both the number of items borrowed from other libraries, and the number of items loaned to other libraries. Totals indicate the ratio of items borrowed/loaned.
  3. Monthly reports providing an ongoing picture of the library's circulation activity.
  4. Annual reports showing per capita measures (count divided by service population) for circulation and interlibrary loans, along with the collection's turnover rate.
Other than the per capita measures, all of these reports can be presented as tables and graphs using tools within the Millennium web interface and can be exported directly to a Microsoft Excel spreadsheet.
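The annual measures in report 4 are simple ratios, and could be computed from the exported totals in a few lines. The sketch below follows the standard Van House et al. definitions (turnover rate as annual circulation divided by holdings); the figures in the usage note are invented, and the choice of "population" is an assumption the library would have to make.

```python
def annual_output_measures(annual_circulation, annual_ill, holdings, population):
    """Annual output measures in the spirit of Van House et al. (1987).

    population: the size of the library's service community. For a
    special library like Wells, the registered user base may be the
    most defensible stand-in for a geographic service population.
    """
    return {
        # Average number of loans per member of the service population
        "circulation_per_capita": annual_circulation / population,
        "ill_per_capita": annual_ill / population,
        # Average number of times each held item circulated this year
        "turnover_rate": annual_circulation / holdings,
    }
```

For instance, a hypothetical year with 900 circulations, 30 ILL transactions, 3,000 holdings, and 300 registered users yields a circulation per capita of 3.0 and a turnover rate of 0.3, figures that can be tracked year over year.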

In-house Use

Data collection in this area will include a variety of methods. Posters alerting patrons to the reasons and importance of the survey should be prominently displayed. Because the library has only very limited hours when it is open to the public, it is suggested that the sampling period be limited to three two-week periods (spring, summer, fall). The limitation is imposed in an effort not to tax the capacity of volunteers to handle the scanning and reshelving of books, or the patience of regular patrons asked to fill out tallies on a daily basis. It is hoped this will be a sufficiently representative sample to provide an accurate picture of usage. However, the library should review the results critically and adjust the time periods if necessary. The following data will be collected:

  1. Barcode scanning of books as they are reshelved using automated reporting through Millennium software
  2. Self-reported tally of users as they leave building
  3. Computer use through self-reported tally of hours of use and purpose
  4. Tally of appointments made to use library during non-regular hours
  5. Tally taken from log of reference materials borrowed
  6. Tally of queries made to any staff/volunteers present in library
The collection of this information makes no distinction among the types of use: each instance is reported as a single use and serves to establish baselines for subsequent surveys.
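Because every tally entry counts equally as one use, combining the six streams and comparing sampling periods reduces to simple arithmetic. A sketch, with invented measure names and counts:

```python
def total_in_house_use(tallies):
    """Collapse the per-measure tallies for one sampling period into a
    single baseline figure; every entry counts as one use."""
    return sum(tallies.values())

def percent_change(baseline, current):
    """Percent change in each measure between two sampling periods --
    the kind of trend figure a granting organization asks to see."""
    return {measure: 100.0 * (current[measure] - baseline[measure]) / baseline[measure]
            for measure in baseline}
```

A spring period with, say, 40 reshelvings, 120 exit-tally entries, and 10 reference queries gives a baseline of 170 uses; running the same tallies the following summer and passing both dictionaries to percent_change yields the per-measure growth or decline.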

It should be noted that there are no viable means in place with which to track queries to the OPAC system that do not result in formal loan requests. Simple queries to the catalog are not logged in a way that identifies the originating computer. Wells Reserve Library should monitor the OPAC's ability to do this tracking, as it might be an important means for measuring unmet needs.

Survey for Benchmark Institutions

The final piece in the use puzzle will come from a survey of benchmark institutions. It is recommended that the Education Coordinator at Wells, in coordination with the volunteer librarian, begin this process by making a phone call to the librarians at the NERRS libraries to lay the groundwork. The initial conversation can then be followed up with this survey, e-mailed or mailed depending on the preferences of the individual librarian. See Appendix 1 for a sample questionnaire.

How to Elicit the Data: Users

Long Form Survey: Print/Online

A questionnaire designed to gather information from users will be distributed in a variety of formats to ensure that it reaches all audiences served by the Wells Reserve Library.

An online version of the survey will also be created using a reasonably priced tool such as Survey Monkey to design the survey, collect responses, and analyze the results. The Wells Reserve web site will feature a link to the online survey. People receiving the long survey will have the option of responding online, enabling them to choose the format that is most convenient.

For respondents, both print and online, an incentive to complete the survey should be offered, as discussed by Eng and Gardner in an American Libraries article (2005). One possibility: a chance to win a private tour of the Wells National Estuarine Research Reserve and a one-year membership in the Laudholm Trust. According to the Wells Reserve web site, membership in the Laudholm Trust provides "free admission to the Reserve from Memorial Day through Columbus Day and program discounts" (www.wellsreserve.org/).

It is recommended that the surveys be available both online and in the library for three two-week periods (spring, summer, fall) -- the same period during which the use surveys are conducted -- and that the survey be publicized for the month prior to the survey period to build community awareness and interest. Awareness can be built using press releases to local/regional news sources, Reserve web site announcements, public service announcements on local/regional cable TV and radio stations, and flyer and poster distribution by Reserve members and volunteers. The Reserve's community partners (school district officials and teachers, local environmental nonprofits, local/regional government bodies, Chambers of Commerce) can also help spread the word about the survey.

Because of the Reserve's limited budget, the survey (see Appendix 2) is modeled after LibQUAL+ and other existing survey tools, with adaptations that take into account the specific context of the Wells library, as suggested by Chivers and Thebridge (2000).

Since the library is open only two half-days each week from January 16 to December 15, we recommend that a head count be taken for at least two two-month periods: one in winter (while school is in session but the Reserve is less busy) and one in summer (while school is out but the Reserve is at its peak season). During these periods, library volunteers should count the persons entering the library each day during regular hours of operation, using a simple tick sheet. It is recommended that this be done annually until the library has enough gate statistics to chart patterns of use.
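Once the tick sheets are collected, the daily counts can be rolled up into the gate statistics used to chart seasonal patterns. A minimal sketch in Python, with hypothetical dates and counts standing in for real tick-sheet data:

```python
from collections import defaultdict

# Hypothetical daily tick-sheet totals: (date, visitors counted)
daily_counts = [
    ("2006-02-01", 4), ("2006-02-04", 7),   # winter period
    ("2006-07-05", 15), ("2006-07-08", 12),  # summer period
]

# Aggregate by month so seasonal patterns of use stand out
monthly_totals = defaultdict(int)
for date, visitors in daily_counts:
    month = date[:7]  # "YYYY-MM"
    monthly_totals[month] += visitors
```

Running the same aggregation each year yields comparable month-by-month figures that can be charted directly for grant applications.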

Checklist Survey: Print

For student users enrolled in middle/high schools, an abbreviated questionnaire in the form of a checklist (see Appendix 3) was devised for distribution in the public school systems served by the Reserve. The checklist survey and a collection box can also be placed in the youth sections of the area's public libraries. It is recommended that the Reserve volunteer or employee who handles public relations use all existing school contacts to build a base of support, and take advantage of school staff meetings to announce the survey and garner additional cooperation. This same person should also meet with public library directors and board members to manage the distribution of the surveys in area libraries.

A high-profile thematic tie-in for the survey could be Earth Day; the survey could be publicized alongside Earth Day activities on signs and posters, in public service announcements on radio and television, and in the schools themselves. As with the general survey, some form of incentive will encourage participation; one possibility is entry into a drawing for free enrollment in a Wells Reserve program of the winner's choosing.

It is assumed that if the library is to enjoy continued growth new users must be developed early on and added to the existing base of users. Another benefit of attracting school-aged users is the likely involvement of their parents at some level of library/Reserve use.

Non-Users

Anyone in the library's geographic area, whether or not they use the library, will be encouraged to complete the survey, since they are likely at least aware of the Wells Reserve and could become library users. Even if people have not used the library themselves, their children may have attended field trips there, they may have attended the yearly community craft fair or another event at the Reserve, or their local councilors may have used the library to research environmental issues important to community decision making. In addition, a random sample of residents from the phone directory will receive the survey after their names have been checked against other mailing lists to prevent double mailings (Cullen 2001).

Evaluation of the Data

Our evaluation of the data will focus on answering the following questions:

By what percentage is usage of the collection changing?

This will be gleaned from circulation records, ILL data, in-house circulation measures (including a tally of reference materials used by staff and barcode scanning of items awaiting reshelving), and the librarian's observation of patrons who request research appointments or ask reference questions by telephone or in person. Usage of the public research computer will also be included, tracked with a simple tally sheet on which users record their hours of use and the nature of their research. The percentage change will be calculated both per resource type and in total. Because staff resources are limited, this data will be collected only during predetermined periods of the year.
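The percentage-change calculation itself is straightforward once counts from two measurement periods are in hand. A sketch, using hypothetical counts rather than real Wells data:

```python
# Hypothetical usage counts per resource type for two measurement periods
period_1 = {"periodicals": 40, "reference": 25, "AV": 10, "general": 60}
period_2 = {"periodicals": 52, "reference": 20, "AV": 14, "general": 75}

def pct_change(old, new):
    """Percentage change from an earlier count to a later one."""
    return round((new - old) / old * 100, 1)

# Change per resource type
by_type = {k: pct_change(period_1[k], period_2[k]) for k in period_1}

# Change in total usage across all resource types
total_change = pct_change(sum(period_1.values()), sum(period_2.values()))
```

With these hypothetical figures, periodicals use rises 30% while reference use falls 20%, and total use rises about 19%; tracked over several cycles, such figures demonstrate the usage trends granting organizations asked for.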

Which resources are the most popular?

The usage of each resource type (periodicals, reference, AV, docent materials, and general collection -- the indicators used in the catalog) will be ranked by popularity on a monthly basis, using the same data as above.
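The monthly ranking amounts to sorting the resource-type tallies from most to least used. A minimal sketch with hypothetical tallies:

```python
# Hypothetical monthly usage tallies by catalog resource type
usage = {"periodicals": 52, "reference": 20, "AV": 14,
         "docent materials": 8, "general collection": 75}

# Rank resource types from most to least used
ranked = sorted(usage.items(), key=lambda item: item[1], reverse=True)
```

The resulting ordered list can be pasted straight into the monthly spreadsheet described under "Presentation of the Results."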

Which resources do users themselves indicate using?

We will measure the usage of resources as indicated by the in-house surveys, surveys mailed/e-mailed out, and online surveys, and compare these results to the actual use statistics above in order to determine any gap between stated and actual use.
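The gap between stated and actual use is simply the difference between the two sets of counts, computed per resource type. A sketch under hypothetical numbers:

```python
# Hypothetical counts: what users said they used (survey responses)
# versus what circulation/reshelving data actually recorded
stated = {"periodicals": 30, "reference": 45, "AV": 10}
actual = {"periodicals": 52, "reference": 20, "AV": 14}

# Positive gap: users under-report use; negative gap: users over-report
gap = {k: actual[k] - stated[k] for k in stated}
```

Large gaps in either direction flag resource types where survey responses alone would mislead collection decisions.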

When do users visit the library?

Usage will be ranked by day of the week and block of time (9-12, 12-3, 3-6), as determined by observation, the surveys, and a sign-in sheet that records time in, time out, affiliation, and date.
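Tallying sign-in sheet entries into the three time blocks can be sketched as follows, with hypothetical sign-in records in place of real sheets:

```python
from collections import Counter

# Hypothetical sign-in records: (day of week, arrival hour on a 24-hour clock)
sign_ins = [("Tue", 10), ("Tue", 14), ("Sat", 9), ("Sat", 16), ("Sat", 11)]

def time_block(hour):
    """Map an arrival hour to one of the survey's three blocks."""
    if 9 <= hour < 12:
        return "9-12"
    if 12 <= hour < 15:
        return "12-3"
    return "3-6"

# Tally visits by (day, block) so peak periods stand out
by_slot = Counter((day, time_block(hour)) for day, hour in sign_ins)
```

Ranking the resulting counts shows which open hours draw the most visitors, which can in turn inform the library's limited schedule.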

How many people use the library?

Overall attendance will be measured twice a year, for two months each time, using observation and the sign-in sheet. We will also measure the percentage change in registered users against any change in overall borrowing or in-house use rates.

How do users interact with the library?

We will measure the relative popularity of telephone reference, e-mail reference, pre-arranged reference appointments, and walk-in usage.

What is the breakdown of users?

Users will be broken down into percentages by region. (Note: Wells has specifically chosen not to collect data beyond name, address, phone number, and e-mail for cardholders, even though more is possible through the regional OPAC system.) The user surveys and sign-in sheets contain an affiliation field; depending on the statistical reliability of responses to it, affiliation may also be calculated. Wherever possible, the overall response rate, the response rate within each identified grouping, and the percentage of respondents reached by each measurement tool will be determined.
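Response rates, overall and per grouping, are ratios of surveys returned to surveys distributed. A sketch with hypothetical regional groupings and counts:

```python
# Hypothetical counts of surveys distributed and returned per region
distributed = {"Wells": 120, "York County": 200, "out of area": 80}
returned = {"Wells": 48, "York County": 50, "out of area": 10}

# Overall response rate across all groupings, as a percentage
overall_rate = round(sum(returned.values()) / sum(distributed.values()) * 100, 1)

# Response rate within each identified grouping
rate_by_group = {k: round(returned[k] / distributed[k] * 100, 1)
                 for k in distributed}
```

Low response rates in a particular grouping signal where publicity or incentives should be concentrated in the next survey cycle.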

This survey will be run on a yearly basis to measure growth or shrinkage in user base and groupings within that base.

How do we compare to benchmark NERRS institutions?

The percentage difference between our results and the responses of the benchmark NERRS libraries will be calculated. The accuracy and usefulness of this comparison will depend on the NERRS libraries' willingness to participate and whether they hold similar data to share, but it is hoped that at least some baseline comparisons and qualitative analysis can be made.
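For whatever measures the benchmark libraries can share, the comparison is a percentage difference relative to the benchmark figure. A sketch with hypothetical measures and values:

```python
# Hypothetical annual figures for Wells and one NERRS benchmark library
wells = {"circulation": 350, "reference questions": 90}
benchmark = {"circulation": 500, "reference questions": 120}

# Wells' results relative to the benchmark, as a percentage difference
# (negative values mean Wells trails the benchmark on that measure)
pct_diff = {k: round((wells[k] - benchmark[k]) / benchmark[k] * 100, 1)
            for k in wells}
```

Repeating the calculation against each responding NERRS library gives the baseline comparisons the questionnaire in Appendix 1 is designed to support.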

Presentation of the Results

The results will initially be tallied in a spreadsheet for maximum flexibility and analysis. From this base document, the results will be presented in a number of different formats depending on the context.

Evaluation of the Results

Once the use/user survey is completed and the results known, the entire process must be evaluated to ensure that any problems are addressed for the next assessment cycle (Zarnosky and Evans 2000). A group of Wells Reserve employees -- those who administered the study and a few who did not, to provide an outside perspective -- should be called together to evaluate the overall assessment process and examine the study, its methodologies, and its results. Did the amount of time dedicated to the assessment provide any return on investment? If the answer is yes, and the assessment process will be continued in subsequent years, should any changes in methodology, assumptions, conclusions, variables, sampling, data evaluation, or data presentation be made? Can the study be replicated in the library or elsewhere in the organization? Finally, if the study did indeed support the hypothesis, has it helped the Wells Reserve Library write any successful grant applications?

Appendix 1

Coastal Resource Library Questionnaire

Appendix 2

User Questionnaire

Appendix 3

In-School Checklist Survey

References

Barr, Dorothy. 2005. Zoo and aquarium libraries: an overview and update. Science & Technology Libraries 25(3): 71-87.

Biggs, M. 1990. Discovering how information seekers seek: methods of measuring reference collection use. Reference Librarian 29: 103-117.

Broome, Joellen. 2004. Science and technology library innovations without a science and technology library. Science & Technology Libraries 24(3/4): 375-388.

Burns, R. 1978. Library use as a performance measure: its background and rationale. Journal of Academic Librarianship 4(4): 5-8.

Chivers, B., & Thebridge, S. 2000. Best value in public libraries: the role of research. Library Management 21(9): 456-465.

Christiansen, D. E., Davis, C. R., & Reed Scott, J. 1983. Guide to collection evaluation through use and user studies. Library Resources & Technical Services 27: 432-440.

Cullen, R. 2001. Perspectives on user satisfaction surveys. Library Trends 49(4): 662-686.

Eng, S., & Gardner, S. 2005. Conducting surveys on a shoestring budget. American Libraries 36(2): 38-39.

Forrest, C., & Williamson, A. J. 2003. From inputs to outcomes: measuring library service effectiveness through user surveys. Georgia Library Quarterly 40(2): 12-18.

Galvin, T. J., & Kent, A. 1977. Use of a university library collection: a progress report on a Pittsburgh study. Library Journal 102(20): 2317-2320.

Gibbs, Beth Liebman. 1993. Subject specialization in the scientific special library. Special Libraries 84(1): 1-8.

Gohlke, Dorothy Annette. 1997. Benchmark for strategic performance improvement. Information Outlook 1(8): 22-24.

Hernon, P., & Altman, E. 1998. Service quality and customer satisfaction do matter. American Libraries 29(7): 53-54.

Hiller, S. 2001. Assessing user needs, satisfaction, and library performance at the University of Washington libraries. Library Trends 49(4): 605-625.

Holt, G. E., & Elliott, D. 2003. Measuring outcomes: Applying cost benefit analysis to middle sized and smaller public libraries. Library Trends 51(3): 424-440.

Kaarst-Brown, M.L. 2004. Organizational cultures of libraries as strategic resource. Library Trends 53(1): 33-53.

Kairis, R. 2000. Comparing gifts to purchased materials: a usage study. Library Collections, Acquisitions and Technical Services 24(3): 351-359.

Keyes, A. M. 1995. The value of the special library: review and analysis. Special Libraries 86(3): 172-187.

Langley, Anne, & Martinez, Linda. 1999. Learning our limits: the science libraries at Duke University retreat to respond to our changing environment. Issues in Science and Technology Librarianship 24 [Online]. Available: http://www.istl.org/99-fall/article1.html [Accessed July 25 2006].

Laughlin, Sara, Shockley, Denise Sisco, and Wilson, Ray. 2003. The Library's Continuous Improvement Fieldbook. Chicago: American Library Association.

Mason, M. G., St. Lifer, E., & Rogers, M. 1998. Cleveland Public redefines patron usage in electronic age. Library Journal 123(6).

Morris, A., & Barron, E. 1998. User consultation in public library services. Library Management 19(7): 404-415.

National Estuarine Research Reserve System. 2006. [Online]. Available: http://nerrs.noaa.gov/ [Accessed August 7, 2006].

Nicholson, S. 2004. A conceptual framework for the holistic measurement and cumulative evaluation of library services. Journal of Documentation 60(2): 164-182.

Ochola, J. N. 2002. Use of circulation statistics and interlibrary loan data in collection management. Collection Management 27(1): 1-13.

Oltmanns, G.V. 2004. Organization and staff renewal using assessment. Library Trends 53(1): 156-171.

Pandion Systems, Inc. 2003. Inventory and Assessment of K-12 and Professional Teacher Development Programs in the National Estuarine Research Reserve System. [Online]. Available: {http://nerrs.noaa.gov/Education/k-12.html} [Accessed: March 15, 2005].

Paris, Marion. 1990. A management survey as the critical imperative for a new special library. Special Libraries 81(4): 280.

Powell, R. R. 1988. The relationship of library user studies to performance measures -- a review of the literature (Occasional Paper, no 181). Urbana, Ill: University of Illinois GSLIS.

Powers, Janet E. 1995. Marketing in the special library environment. Library Trends 43(3): 478-493.

Pungitore, V. 2001. Identifying and analyzing user needs. Library & Information Science Research 23(4): 373.

Responsive Management. 2003. Coastal Training Needs Assessment and Market Inventory for the Jacques Cousteau National Estuarine Research Reserve, Volume 1. [Online]. Available: {http://web.archive.org/web/20130418034658/http://www.responsivemanagement.com/download/reports/NJCoastalReportFinaldist.pdf} [Accessed March 26, 2005].

Van House, N. A., et al. 1987. Output Measures for Public Libraries: A Manual of Standardized Procedures (2nd ed.). Chicago: American Library Association.

Weiner, Barbara. 2000. A bottom-line adventure: time and cost study at Hazelden Library & Information Resources; presented at the Substance Abuse Librarians and Information Specialists conference, April 1999. Behavioral & Social Sciences Librarian 18(2): 27-31.

Zarnosky, Margaret R. and Evans, Edward G. 2000. Developing Library and Information Center Collections (4th ed.). Englewood, Colo.: Libraries Unlimited.
