Issues in Science and Technology Librarianship Summer 1998

This document was distributed as a handout at the STS Publisher/Vendor Relations Discussion Group, June 27, 1998


(April 1998)

1. MEASUREMENT ELEMENTS FOR ABSTRACTING & INDEXING SERVICES (e.g., EconLit and the A&I portion of a mixed database such as ABI/Inform) AND FULL TEXT DATABASES (e.g., reference works like Britannica Online and journal providers like Academic Press/IDEAL and JSTOR)

Priority measurement elements are bold. Statistics should reflect usage from a resource provider's main and mirror sites.

A.  Number of queries (searches), categorized as appropriate

(Note: the number of sessions (logins) may be substituted if the number of queries is not available)

  1. By database
  2. By IP address / locator to subnet level
  3. By special data element passed by subscriber to vendor (e.g., account number)

B.  Number of turnaways due to contract limits (e.g., requests exceed simultaneous user limit)

C.  Number of items examined (i.e., marked or selected, downloaded, emailed, printed):

  1. Citations (for A&I databases)
  2. Journals (for full-text databases), broken down by title, ISSN, or other title identifier as appropriate
    a) Tables of Contents
    b) Abstracts
    c) Articles (or essays, poems, chapters, etc., as appropriate)
    d) Other (e.g., image / AV files, ads, reviews, etc., as appropriate)

D.  Usage levels

  1. Per time period
    a) Queries or Sessions, Turnaways
        (1) By day, month, year
        (2) By time of day
    b) Peak simultaneous use as appropriate
  2. Per interface used
    a) By Web, Telnet, or Z39.50, as appropriate

E.  Total hours of server downtime by month as appropriate
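The elements above amount to a tally pass over a provider's transaction log. As a minimal sketch, the fragment below counts queries by database (1.A.1), by subnet (1.A.2), by day (1.D.1), and turnaways (1.B). The log format, field layout, and sample lines are invented for illustration; no provider's actual log looks like this.

```python
from collections import Counter
from datetime import datetime

# Hypothetical log line: "<ISO timestamp> <database> <IP> <event>"
# where <event> is "query" or "turnaway".
LOG = """\
1998-04-01T09:15:00 econlit 192.0.2.17 query
1998-04-01T09:16:30 econlit 192.0.2.17 query
1998-04-01T10:02:11 jstor 198.51.100.4 query
1998-04-02T14:45:09 jstor 198.51.100.4 turnaway
"""

queries_by_db = Counter()      # element 1.A.1: queries per database
queries_by_subnet = Counter()  # element 1.A.2: queries to the subnet level
queries_by_day = Counter()     # element 1.D.1.a: queries per day
turnaways = Counter()          # element 1.B: turnaways per database

for line in LOG.splitlines():
    stamp, db, ip, event = line.split()
    when = datetime.fromisoformat(stamp)
    if event == "query":
        queries_by_db[db] += 1
        # Truncate the address to its first three octets (the subnet).
        queries_by_subnet[".".join(ip.split(".")[:3])] += 1
        queries_by_day[when.date().isoformat()] += 1
    elif event == "turnaway":
        turnaways[db] += 1

print(dict(queries_by_db))  # {'econlit': 2, 'jstor': 1}
print(dict(turnaways))      # {'jstor': 1}
```

The same pass extends naturally to the other breakdowns (time of day, interface) by keying additional counters on the relevant field.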

2.  PRIVACY AND USER CONFIDENTIALITY: Statistical reports or data that reveal confidential information about users must not be released by resource providers without permission. 

Providers do not have the right to release statistical usage information about institutions without permission.

3.  COMPARATIVE STATISTICS:  Resource providers should provide comparative statistics that give participants a context in which to analyze statistics for their institutions.  A grouping for purposes of comparison might be compiled by the resource provider (e.g., statistics from an anonymous selection of similar institutions), or it might be a grouping composed on demand (e.g., statistics from all campuses in a consortium, presented either anonymously or not, as desired by the participating institutions).

4.  ACCESS / DELIVERY MECHANISMS / REPORT FORMATS:  Access to statistical reports should be restricted by IP address or another form of security, such as passwords.  Institutions should be able to allow other institutions access to their usage data if they desire.  Resource providers should maintain access to tabular statistical data through their website (updated monthly) that a participant can access, aggregate, and manipulate on demand.  When appropriate, these data should also be available as flat files containing specified data elements that can be downloaded and manipulated locally.  Resource providers are also encouraged to present data as graphs and charts.
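As an illustration of the flat-file delivery described above, a provider might expose a monthly comma-delimited file that a participating library downloads and aggregates locally. The column names and figures below are invented for the example; actual files would carry whatever data elements the contract specifies.

```python
import csv
import io

# Hypothetical monthly flat file: one row per database per month.
FLAT_FILE = """\
month,database,queries,sessions,turnaways
1998-03,econlit,1204,310,5
1998-03,jstor,877,295,12
1998-04,econlit,1420,342,2
"""

rows = list(csv.DictReader(io.StringIO(FLAT_FILE)))

# Local, on-demand aggregation: total queries per database across months.
totals = {}
for row in rows:
    totals[row["database"]] = totals.get(row["database"], 0) + int(row["queries"])

print(totals)  # {'econlit': 2624, 'jstor': 877}
```

Because the file is plain delimited text, the same data can be loaded into a spreadsheet or statistics package without any provider-specific tooling, which is the point of requiring flat files alongside web-based reports.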

Developed by JSTOR Web Statistics Task Force:  David Farrell, Berkeley, Chair; Jim Mullins, Villanova; Kimberly Parker, Yale; Dave Perkins, CSU-Northridge;  Sue Phillips, Texas;  Camille Wanat, Berkeley;  Kristen Garlock, JSTOR.   
