Collections Review Tool

A tool to engage the campus community in decisions about cuts to the journal and database collection.

Overview

The Collections Review project brought together staff from the Collection Management Department (now Collections & Research Strategy) and Digital Library Initiatives to engage the NCSU campus community in the decisions necessary to achieve a substantial cut to the journal and database collection.

How We Did It

In fiscal year 2009/2010, the NC State University Libraries faced substantial cuts to its collections budget, and journal subscriptions had to be canceled as part of those cuts. To make the best decisions about which journals to cancel, the Libraries needed to gather as much campus feedback as possible on its list of 1,112 journals proposed for cancellation. We created the Collections Review Tool to share information about the candidate journals so that the campus community could weigh in on each title.

We used the tool again in FY 2014/2015 to manage cuts to the collections budget of approximately $750,000 – about 7.5% of the 2013/2014 allocation. Projected budget reductions from the university, combined with expected inflation for journals and databases of $550,000 (at a 7% annual inflation rate), required preparing for steep reductions to the collection. A comprehensive review process that included input from faculty, staff, and students identified 628 journals for cancellation (effective January 2015) and 34 databases (termination dates varying by renewal date).

An article we published in Against the Grain describes the process in detail. If you have further questions about our Collections Review process, please contact Hilary Davis.

Collections Review Tool

To gather this feedback, the Libraries designed and built the Collections Review Tool, a web form where users could easily record and submit their responses to the proposed cancellation list. The form provided tools to help users filter and manage the list, and it presented key data points so that campus users could decide whether each title should be kept or canceled.

The web form required authentication with campus credentials, which allowed the form to pre-populate data such as name, email, affiliation, and status. The list could be sorted by any of the available data points, filtered by broad subject groupings, or searched for specific titles.

To rank the titles, users were offered three possible ranking options:

Must keep
Keep if possible
Can cancel

Users were asked to skip titles that were not relevant to their fields or interests. The form also provided a comment box so that users could share additional thoughts on the titles and the review process.

The form could be saved so that users could return and complete their rankings over a period of time before submitting them. Submitted rankings were captured in a large comma-delimited file for review and manipulation by collection management librarians.
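A minimal sketch of loading such a comma-delimited export for analysis. The column names here are hypothetical illustrations, not the actual export schema:

```python
import csv
import io

# Hypothetical sample of the comma-delimited feedback export;
# real column names and fields may differ.
sample = io.StringIO(
    "name,affiliation,title,ranking\n"
    "A. Researcher,Physics,Astronomy Letters,Must keep\n"
    "B. Student,History,Astronomy Letters,Can cancel\n"
)

# Group submitted rankings by journal title for later scoring.
rankings_by_title = {}
for row in csv.DictReader(sample):
    rankings_by_title.setdefault(row["title"], []).append(row["ranking"])

print(rankings_by_title)  # {'Astronomy Letters': ['Must keep', 'Can cancel']}
```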

Methodology For Processing the Feedback

A total of 1,365 users logged into the web form, and 700 of them submitted feedback, producing 12,710 journal title rankings. In most instances users ranked only the titles relevant to their subject interests, but some respondents ranked every title or ranked a broad spectrum of titles clearly outside their scope. To process this feedback and use it in collections decision-making, we created a weighted ranking, which was then used in concert with the subject knowledge of collection management librarians to make the best cancellation decisions.

Method 1: Weighted Ranking - Broad approach using college affiliation and journal subject from web form
This method weighted the rankings according to how closely each user's research and teaching subject areas matched the journal's subject areas. This approach helped offset the tendency of users to favor canceling journals outside their own research and teaching (e.g., a biology researcher might suggest canceling all history journals). The following is an example of a weighting scheme describing a user's association with a journal subject ("association weight factors"):

Weight of 1.0 for direct associations between users and journal subject areas
Weight of 0.8 for close associations between users and journal subject areas
Weight of 0.5 for partial associations between users and journal subject areas
Weight of 0.1 for tangential or unrelated associations between users and journal subject areas

Points were assigned to each ranking of "must keep," "keep if possible," and "can cancel." This point system allowed us to prioritize the journals that our users regarded as titles to retain. Points were assigned as follows:

Must keep rank = 10 points
Keep if possible = 5 points
Can cancel = 1 point

The ranking points were then multiplied by the association weight factors and the total number of rankings, then summed for each journal title. Higher scores surfaced the titles that the most closely associated users deemed necessary to keep in the collection. Below is an example of how the metric was calculated for feedback received on the title Astronomy Letters.
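The calculation can also be sketched in code. This is an illustrative reading of the weighting scheme, not the production implementation, and the feedback values below are hypothetical rather than the actual Astronomy Letters data; summing points × weight over individual rankings folds in the count of rankings per group:

```python
# Point values for each ranking option, per the scheme above.
RANK_POINTS = {"must keep": 10, "keep if possible": 5, "can cancel": 1}

def weighted_score(rankings):
    """Sum points x association weight over all of a title's rankings.

    rankings: iterable of (rank_label, association_weight) tuples,
    where the weight reflects how closely the user's subject areas
    match the journal's subject areas (1.0 direct ... 0.1 tangential).
    """
    return sum(RANK_POINTS[rank] * weight for rank, weight in rankings)

# Hypothetical feedback for one title: two direct-association users
# chose "must keep"; one tangentially associated user chose "can cancel".
feedback = [("must keep", 1.0), ("must keep", 1.0), ("can cancel", 0.1)]
print(weighted_score(feedback))  # 20.1
```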

Method 2: Data Metric - weighted ranking using key data points
To supplement the campus feedback, other data points such as journal impact factors and citation and publication patterns were used to help in the decision-making process. Journal impact factors and citation and publication data were derived from Thomson ISI databases (Journal Citation Reports for the impact factor data and Local Journal Utilization Reports for the publication and citation data). The formula below was used to combine these data points and was designed to give more weight to data points that were valued highly by NCSU Libraries and reflected a journal's relevance to the NCSU community.
Sum of the following:

Average of 2 most recent years of use data
Number of citations by NCSU researchers to the journal
(2 × number of publications by NCSU researchers in the journal) × (impact factor + 1)
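
The formula can be sketched as follows; the function and parameter names are illustrative, and the sample values are hypothetical rather than data for any real title:

```python
def data_metric(use_recent_years, ncsu_citations, ncsu_publications,
                impact_factor):
    """Combine use, local citation/publication counts, and impact factor.

    use_recent_years: use counts for the 2 most recent years.
    The publication term is doubled and scaled by (impact factor + 1)
    to give more weight to journals the NCSU community publishes in.
    """
    avg_use = sum(use_recent_years) / len(use_recent_years)
    return (avg_use
            + ncsu_citations
            + (2 * ncsu_publications) * (impact_factor + 1))

# Hypothetical title: 120 and 80 uses in the two most recent years,
# 15 citations and 4 publications by NCSU researchers, impact factor 2.5.
print(data_metric([120, 80], 15, 4, 2.5))  # 143.0
```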

Below is an example of the two methods applied to a cross-section of titles in the collections review.

Team