ASIS&T 2007: Live Usability Labs: Open Access Archives and Digital Repositories

A series of live usability tests, with volunteer repositories and testers from the audience.

1: dLIST (University of Arizona)

Not a single-institution repository; it's a cross-disciplinary, cross-library repository. The typical dLIST user is looking for information. We'll test: 1) an author search; 2) a browse/search for "neural networks"; 3) whether the tester can find usage stats for a specific article.

  1. Author search — no problem; the tester found works by the specified author. The tester did have a hard time finding the "search" button, which sat several screens down on the advanced search page.
  2. Neural networks. The tester had a hard time figuring out how to do a phrase search; the search fields don't offer a phrase option, just all keywords in any order or any of the keywords.
  3. Found the article and abstract/download stats handily.

Users tend to be "browsers" or "searchers" in Paul's experience. The search box says "search titles, abstract, keywords" — but it doesn't search authors; authors aren't indexed as keywords (other searches showed that indexing sometimes includes authors, but not consistently). Also, on the advanced search page, Paul Marty says, "if you need a cancel search button, don't make it bigger than the search button." (A sketch of the phrase-versus-keyword distinction from task 2 follows the Q&A below.)
Home page is very detailed.
Q: How much prompting of the user in a session?
A: It depends; in a purely exploratory session, you give none. In other cases, if you're less concerned with how a task is completed than with whether it's completed, you can give more.
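
The distinction that tripped up the tester, matching all keywords in any order versus a true phrase match, is easy to illustrate. Here is a minimal toy sketch in Python, not dLIST's actual search code:

```python
# Toy illustration (not dLIST's search implementation) of why "neural networks"
# as separate keywords matches more records than "neural networks" as a phrase.

def keyword_match(record: str, terms: list[str]) -> bool:
    """Any-order keyword match: every term appears somewhere in the record."""
    words = record.lower().split()
    return all(term.lower() in words for term in terms)

def phrase_match(record: str, phrase: str) -> bool:
    """Phrase match: the terms appear adjacent and in order."""
    return phrase.lower() in record.lower()

records = [
    "Neural networks for document classification",
    "Networks of neural pathways in the visual cortex",
]

for r in records:
    print(r)
    print("  keywords:", keyword_match(r, ["neural", "networks"]))  # True for both
    print("  phrase:  ", phrase_match(r, "neural networks"))        # True only for the first
```

The second record matches the keyword query but not the phrase query, which is exactly the kind of result set the tester was trying to narrow.
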

2: Illinois Digital Environment for Access to Learning and Scholarship (IDEALS) (University of Illinois at Urbana-Champaign)

IR at UIUC. The concentration is on scholarly research output at the university: mostly 'gray literature' and content from departments that publish technical reports. Most people find IDEALS content through Google, etc.; full-text materials are accessed roughly 10 times more often that way than through the IDEALS search interface.
Task: 1) Upload an article to the IR.
Then a page of legalese. Two volunteers: a "faculty member" and a "graduate student." The menu item is "Submit an Item," not "upload." Then it asks them to "choose collection"; what's that? The collections don't match their expectations.
With two people, the give and take was very rich; since people don't naturally think aloud, having a situation in which conversation is natural helps elicit it. The technique is called "constructive interaction."

3: Minds at UW (University of Wisconsin)

It's a consortial collection — all 26 libraries in the Wisconsin system. This implementation is almost purely "out of the box"; DSpace is moving to a new platform soonish. Most users come from Google directly to an item page.
Tasks:

  1. You Googled your way to a particular work. You want to find other items by this author.
  2. Look for other examples of Urdu poetry.
  3. Look for other contributions from the same school (UW-Whitewater)

1) An author search across the full DSpace instance pulls up many false hits. The tester browsed by author instead and found his works.
2) Looked at the record for a subject heading, but there are no clickable links (a sketch of linked subject headings follows below).
3) Went to communities list, then found UW-Whitewater, then searched.
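
The missing piece in task 2, subject headings that aren't clickable, is the kind of thing a small template change could address. A hypothetical sketch with an assumed browse URL pattern, not Minds at UW's actual DSpace templates:

```python
# Hypothetical sketch (not MINDS@UW's DSpace code): render an item's subject
# headings as links to a subject-browse query, so "Urdu poetry" is clickable.
from urllib.parse import quote_plus

def subject_links(subjects: list[str], base_url: str) -> str:
    """Render each subject heading as an anchor pointing at a subject browse."""
    anchors = [
        # The /browse?type=subject&value=... pattern is an assumption here.
        f'<a href="{base_url}/browse?type=subject&value={quote_plus(s)}">{s}</a>'
        for s in subjects
    ]
    return ", ".join(anchors)

print(subject_links(["Urdu poetry", "Ghazals"], "https://example.org/repository"))
```

With links like these, the tester's second task would have been one click from the item record instead of a trip back to the search box.
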
