Discovery Environment Usability Test (2011)
In the spring and summer of 2010, the ITAC (IT Advisory Committee) Discovery Environment Team designed the tabbed search interface for the homepage of the Libraries' redesigned website. We also designed the search box in the header of all the website's pages beyond the homepage. An earlier usability test (see http://www.lib.ncsu.edu/userstudies/studies/2010_navigation_testing) informed those design decisions. That was our pre-test. After the redesigned website went live in August 2010, the Discovery Environment Team had one remaining task: a post-test, a usability test of the new site. We conducted the test in February 2011.
Scope: The tasks for our test focus on the site as a research portal. Out of scope: the site's marketing aspects and "about the library" information. The Libraries have been investigating ways to improve discovery and retrieval of library resources, particularly for undergraduate students, who can find the complex array of databases and e-journals difficult to navigate. The "Summon" application, developed by Serials Solutions, attempts to address these issues by providing a unified search interface.
We had 10 participants, all students: 2 graduate students and 8 undergraduates (2 sophomores, 3 juniors, and 3 seniors); 6 were women and 4 were men. They represented nearly every college at NCSU: Engineering, Management, Natural Resources, Physical and Mathematical Sciences, Humanities and Social Sciences, Education, Textiles, and Agriculture and Life Sciences.
Recommendations and outcomes
Add the catalog's search field pull-down menu (anywhere, in title, etc.) to the Books & Media tab
Task #11 "Are there any copies of the book 'The Sun Also Rises' by Ernest Hemingway..." had only a 50% success rate. Note that all 10 participants found their way into the catalog. This suggests the website got participants where they needed to go, and then half of them had problems once in the catalog.
All 5 participants who failed mistook a book about "The Sun Also Rises" for the book itself. One participant who succeeded did so only with difficulty, at one point saying "What am I doing wrong?"
Participant #2 succeeded by using the catalog's "Advanced" search tab and searching for "sun also rises" in title and "hemingway" in author. She then said "Easier access to an advanced search would be nice."
The 5 successful participants used the catalog's search field pull-down menu (to specify author or title searching), author facets, or the catalog's "Advanced" search tab.
Outcome: We proposed adding the catalog's search field pull-down menu to the Books & Media tab, and that happened in October 2011.
Improve Journal Titles tool's interface
Results lists in the Journal Titles tool proved difficult for some participants to interpret. In 5 of the 20 searches, participants navigated to the correct tool, used good keywords, but could not see the item in the results set. Note that two of those five ended up finding the journal Science, but they had difficulties nevertheless. Possible contributing factors to the problems included inconsistent display of search results (e.g., Science had a 6-line holdings display while others had 1 line, and Science's title was hyperlinked while others were not). A 60% success rate for finding Science meant we had work to do.
We recommended putting more visual emphasis on titles (e.g., Science) so that titles are not lost amid the other information (e.g., coverage dates).
Outcome: The Journal Titles tool was changed in September 2011.
Consider different design treatments for "Best Bets" in the QuickSearch interface
We watched two participants overlook "Best Bets" in QuickSearch (the "All" tab search results).
Outcome: The ITAC QuickSearch Product Team did an A-B test in August 2011 to test a new design of Best Bets. Both old and new designs had very similar click / serve ratios. The team chose the new design. The team is not sure that the two problems observed in this test represent significant problems with the design of Best Bets; Best Bets click / serve ratios are already high.
No recommendation to resolve the problem that there is no obvious starting place to look for audio
One task asked participants to find an audio recording of a poet reading his or her work in Spanish. Some struggled to figure out where to start. We do not recommend solving this problem. A good solution would require further research to understand the breadth of the problem and assess possible solutions. We suspect the problem also applies to other specialized collections in our catalog; we just happened to have a task about audio. The labeling of "Books & Media" is intended to help with this issue ("media" covering a lot of specialized collections) but it is, of course, an imperfect solution.
No recommendation to resolve the confusion between articles vs. journals
This is a classic area of confusion for students. We are pleased to see it only affected two of our 10 participants (it affected those two in multiple tasks). Finding solutions in this area would require research beyond this usability test.
No major problems found
Overall we think this usability test shows that our website is remarkably effective and flexible. The usability issues we found were minor to medium-sized. Participants took a variety of paths and succeeded at a high rate (81%).
One example of participants' success is our citation look-up task, Task #2: "Find the article 'How to Get Our Democracy Back' by Lawrence Lessig." 10 out of 10 participants succeeded. A 2006 usability study conducted here at the Libraries had a similar task, in which participants were asked to find:
Webster, P. J., Holland, G. J., Curry, J. A., & Chang, H. R. (2005). Changes in tropical cyclone number, duration, and intensity in a warming environment. Science, 309(5742), 1844-1846.
In 2006, 5 of 7 participants succeeded, a 71% success rate, vs. our 100% success rate.
Tabs effective in usability test
The search tabs on the Libraries' homepage seem very effective. We saw strikingly little confusion about which tabs to use for which purposes. The only "completely wrong" use of the tabs we saw was when one participant tried to find the journal Foreign Policy with the Articles tab.
Use of tabs in usability test environment differs greatly from web logs
Participants in the usability study used the default "All" tab 29 times and the other tabs 67 times. That is 2.3 times as much use of the non-default tabs as the default. It surprised us to see so much non-default use of any user interface. Contrast that with web logs from September 2011 -- users not in a testing environment:
February 2011 usability test: total searches = 96
September 2011 web logs: total searches = 59,314
(Note: one tab changed names between the test and the logs.)
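As a quick check, the tab-usage ratio reported above can be reproduced from the raw counts in the February 2011 test:

```python
# Tab-usage arithmetic from the February 2011 test: 29 searches on the
# default "All" tab, 67 searches on the other (non-default) tabs.
default_all = 29   # searches on the default "All" tab
non_default = 67   # searches on the other tabs
total = default_all + non_default
ratio = non_default / default_all
print(total)            # 96 searches in the test
print(round(ratio, 1))  # 2.3 times as much non-default use
```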
"Find Articles" tasks easier than expected
The success rate for the "find articles" tasks was 95%. There were four tasks: three "find articles on a topic" tasks and one "find a known article from a citation" task. 4 tasks × 10 participants = 40 task attempts, and participants succeeded on 38 of the 40. Both failures came on Task #4, "find a peer-reviewed article about salt mining that is available online." One participant could not figure out how to limit to peer-reviewed articles, saying about the article she settled on, "I guess that's peer-reviewed?" The other participant found an article but did not try to determine whether it was peer-reviewed.
The participants did not just succeed at a high rate; they succeeded more easily than anticipated. We rated each task on a three-point difficulty scale (1 = easy, 2 = medium, 3 = hard).
When we crafted the tasks, we assigned "projected difficulty" levels. The four "find articles" tasks had an average projected difficulty rating of 1.75, close to the "medium" range. The average actual difficulty rating turned out to be 1.15, closer to the "easy" range.
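To make the rating arithmetic concrete, here is a sketch of how a mean difficulty rating is computed on the 1-3 scale. The 34/6 split of ratings below is invented purely for illustration; it is just one distribution over 40 task attempts that is consistent with the reported 1.15 mean.

```python
# Hypothetical illustration of a mean difficulty rating on the 1-3 scale.
# The 34 "easy" / 6 "medium" split is invented for illustration only.
ratings = [1] * 34 + [2] * 6          # 40 hypothetical task attempts
mean_actual = sum(ratings) / len(ratings)
print(mean_actual)  # 1.15
```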
Participants' easy success on the "find articles" tasks reflects well on our website's implementation of Summon.
Links on the homepage to Databases and Journal Titles effective
Participants had little trouble finding these links underneath the search box. They found and used them at appropriate times.
Usability sessions surprisingly short
The average session lasted 12.4 minutes, much shorter than we expected. In a 2006 usability study the Libraries conducted, tasks took an average of 2.5 minutes each. That experience and others led us to expect our 12-task test to last at least 30 minutes. We speculate that the brief sessions:
- could have been caused by our lack of follow-up questions.
- could be a sign that the website has become easier to use.
- could show that many college-aged web users in 2011 are not inclined to slow explorations of a website.
Regardless of why the sessions were short, we think their brevity should inform the development of future usability tests. In the future, we should consider tests with more than 12 tasks.
Discovery Environment Team and Partners
Josh Boyer, Research & Information Services
Annette Day, Collection Management
Brian Dietz, Special Collections
Emily Lynema, Information Technology
Joe Ryan, Digital Library Initiatives
Rob Rucker, Research & Information Services
Tito Sierra, Digital Library Initiatives
Susan Teague Rector, External Relations
Josh Wilson, Research & Information Services
Recordings were made in the NCSU Libraries Usability Research Lab using Morae usability testing software.
Note that all users in these recordings signed a release form that included the following language:
Recordings made during this study will be used for research and development. Therefore, I understand that my work during the test will be recorded and viewed by the staff of the NCSU Libraries. I further understand that the NCSU Libraries may wish to use segments of these recordings to illustrate presentations offered to professional audiences. I give my consent to the NCSU Libraries to use my recorded image and voice for these purposes, with the provision that my last name will not be associated with the recordings and that these recordings will not be released to any broadcast or publication media.