Yesterday Melina and I ran our first formal usability test in nearly 2 years. We had three students in the Mary I who worked on a few tasks using the Omeka-based Digital Collections site. We had a great crowd of observers who took meticulously detailed notes and helped us whittle down a very long list of issues to a few high-priority action items for the immediate future.

The 5 task-based scenarios we had each student work through were:

  1. You’ve been tasked by a professor to use GVSU library’s digital collections for an assignment on one topic of your choice. Show us how you would go about finding the collection you will focus on. [blank browser window open]
  2. You are writing an essay covering American history up to 1877. Find some resources about correspondence between soldiers and family. [browser window starts at the digital collections homepage]
  3. You are to write a report on the history of Grand Rapids. Which particular collections would you focus on? [browser window starts at the digital collections homepage]
  4. You are researching the Reserve Officers’ Training Corps (ROTC) using the library’s digital collections. Find some resources featuring people who went through the program. [browser window starts at the digital collections homepage]
  5. Using the Young Lords in Lincoln Park Interviews, look for interviews specifically talking about fair housing issues. [browser window starts at that collection’s homepage]

The first question was designed both to see how our users understood the term “Digital Collections” and to see whether they could find it. None of our students had any idea what Digital Collections meant (most thought it meant anything online, put into groups). And because they had an incorrect understanding of what it was, they certainly couldn’t find it.

We discussed how Digital Collections is the kind of resource that needs some facilitation, something that explains to users what the collections are and why they would want to use them. Melina noted that our on-campus users who encounter digital collections through an assignment will either have Leigh or a liaison introduce them, or will at least have some context provided by the professor. But for everyone else, we discussed a few ways to clarify not just what Digital Collections are, but what each of our separate collections is. I think there are other opportunities for sharing this information, from social media to Web ads. Navigating to our collections is also something I want to explore in future tests. We also want to explore how easy these collections are to find in the Library Search - sample searches from the users in the test showed that a lot of other results came up before anything from our digital collections.

(As a related note: Weave Journal of Library User Experience recently published an article on this very topic, and found that while Digital Collections is a terrible term, it was the least terrible of all the options, and most libraries use it. ¯\_(ツ)_/¯ Read the whole article: What We Talk About When We Talk About Digital Libraries: UX Approaches to Labeling Online Special Collections.)

Questions 2-5 all focused on specific, yet common, tasks in the Digital Collections system, which is a customized version of the open source tool Omeka. Let’s just summarize by saying that Omeka didn’t do so well on this test.

One quick win we will look into is making the failed search page, where no results are returned, more user-friendly. Right now no hints, tips, or offers of mediated help appear, even though this is exactly the point where we want that mediation! Kyle and I suspect we’ll have to use JavaScript to make this change, but we’ve already got a JavaScript application running on top of Omeka that makes interface changes to improve usability, so this shouldn’t be a big deal. (Curious about customizing vendor tools with JavaScript? I wrote a whole book about it, and it’s now Open Access in ScholarWorks!)
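To make that concrete, here’s the kind of change we have in mind: a small script that adds a few tips and an offer of help when a search comes up empty. This is only a sketch, not the final implementation - the `#no-results` selector is a placeholder, since the real markup depends on our Omeka theme.

```typescript
// Sketch: add search tips to the failed-search page.
// "#no-results" is a placeholder selector for the theme's empty-results
// message; ours may differ.
function addFailedSearchHelp(): void {
  const message = document.querySelector<HTMLElement>("#no-results");
  if (!message) {
    return; // not a failed search, nothing to do
  }

  const help = document.createElement("div");
  help.className = "search-help";
  help.innerHTML = `
    <p>No results? A few things to try:</p>
    <ul>
      <li>Double-check your spelling (the search won't autocorrect).</li>
      <li>Try fewer, broader keywords.</li>
      <li><a href="/contact">Ask a librarian</a> for help finding material.</li>
    </ul>`;
  message.insertAdjacentElement("afterend", help);
}

document.addEventListener("DOMContentLoaded", addFailedSearchHelp);
```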

Many other issues centered on Omeka’s search function. Our users mostly assumed that Omeka’s search would work a lot like Google’s or Summon’s: autosuggest, autocorrect, and so on. But it doesn’t. In fact, Omeka’s advanced search requires you to explicitly use Boolean operators between keywords, and you have to use symbols: “+” for AND and “-” for NOT. Super intuitive!
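So where a user would just type a few words into Google, in Omeka’s advanced search they would have to write something like this (a made-up example of the symbol syntax):

```
soldiers +letters     (soldiers AND letters)
soldiers -poetry      (soldiers NOT poetry)
```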

To top it off, a few years ago Kyle switched Omeka’s main search functionality over to a Solr index, which dramatically outperforms the built-in search. Unfortunately, the advanced search doesn’t run on the Solr index. We found that the advanced search and the basic search would return totally different results for essentially the same query! And some buttons, like the “New Search” button, take you to the advanced search rather than to the basic search. Ugh.

Our plan right now is to do the following (although we need to do a little more research to make sure these will work and are the best options):

  1. Hide the advanced search link. (We need to make sure there aren’t any use cases for it first.)
  2. Redirect the “New Search” button to the basic, Solr-backed search (see the sketch after this list).
  3. Redesign the facet sidebar on the search results page to be more prominent. Most of the users didn’t see it, and the one who did seemed to notice it only as a last resort. It kind of blends in with the rest of the page.
  4. See if there is a way to limit the search box on a collection page to search only that collection - that was the behavior all of our users expected.
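Since most of these are the same kind of JavaScript interface changes we’re already making, here’s a rough sketch of what items 1, 2, and 4 might look like. Every selector below, along with the “collection” parameter, is an assumption we still need to verify against our Omeka theme; item 3 will likely be a CSS change in the theme itself.

```typescript
// Rough sketch of plan items 1, 2, and 4. Every selector below is a
// placeholder; the real markup depends on our Omeka theme.
document.addEventListener("DOMContentLoaded", () => {
  // 1. Hide the advanced search link (pending a check for real use cases).
  const advanced = document.querySelector<HTMLElement>("a[href*='advanced-search']");
  if (advanced) {
    advanced.style.display = "none";
  }

  // 2. Point the "New Search" button at the basic, Solr-backed search
  //    instead of the advanced search.
  const newSearch = document.querySelector<HTMLAnchorElement>("a.new-search");
  if (newSearch) {
    newSearch.href = "/search";
  }

  // 4. On a collection page, scope the search box to that collection by
  //    adding a hidden input. This assumes the backend accepts a
  //    "collection" parameter, which is exactly what we need to research.
  const collectionId = document.body.dataset.collectionId; // hypothetical attribute
  const searchForm = document.querySelector<HTMLFormElement>("form.search-form");
  if (searchForm && collectionId) {
    const scope = document.createElement("input");
    scope.type = "hidden";
    scope.name = "collection";
    scope.value = collectionId;
    searchForm.appendChild(scope);
  }
});
```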

Kyle and I will get together over the next few weeks to look into making these changes happen. Then in the next few months, we’ll run another test for digital collections and see how the changes are received!

I’m planning on running another, more generalized test in November. Running a usability test on our website every month is a lot of work, but it has helped us really hammer away at some of the big issues facing our patrons. Thanks for participating, and I look forward to seeing everyone next month!