Start With an Hour a Week: Enhancing Usability at Wayne State University Libraries


Wayne State University, DigitalCommons@WayneState
Library Scholarly Publications, Wayne State University Libraries, 4-1-2018

Maria T. Nuccilli, Wayne State University, mnuccilli@wayne.edu
Elliot J. Polak, Wayne State University
Alex Binno, Wayne State University

Recommended Citation: Nuccilli, M., Polak, E., and Binno, A. (2018). Start with an Hour a Week: Enhancing Usability at Wayne State University Libraries. Weave: Journal of Library User Experience, 1(8). DOI: http://dx.doi.org/10.3998/weave.12535642.0001.803. Available at: https://digitalcommons.wayne.edu/libsp/135

This article is brought to you for free and open access by the Wayne State University Libraries at DigitalCommons@WayneState. It has been accepted for inclusion in Library Scholarly Publications by an authorized administrator of DigitalCommons@WayneState.

Start with an Hour a Week: Enhancing Usability at Wayne State University Libraries
Maria Nuccilli, Elliot Polak, and Alex Binno, Wayne State University
Weave: Journal of Library User Experience, Volume 1, Issue 8, 2018
DOI: http://dx.doi.org/10.3998/weave.12535642.0001.803
Licensed under a Creative Commons Attribution license [http://creativecommons.org/licenses/by/3.0/]

Abstract

Instead of pursuing traditional testing methods, Discovery and Innovation at Wayne State University Libraries settled on an alternate path to user-centered design when redesigning our library website: running hour-long guerrilla usability tests each week for two semesters. The team found immediate successes with this simple, cost-effective method of usability testing, leading to similar redesign projects for other online resources. Emphasizing the importance of iterative design and continuous improvement, this article will detail the authors' experience conducting short weekly tests, offer suggestions for institutions looking to begin similar testing programs, and present low-stakes testing as a pathway to improved design for the library as a whole.

In spring 2016, the Discovery and Innovation web team at Wayne State University Libraries began a redesign of our main library homepage [https://library.wayne.edu/], making the first major updates to the site since 2013. After launching the initial design, we wanted to improve the new website with feedback from real users, following Jakob Nielsen's decades-old assertion that redesigning websites based on user feedback can substantially improve usability (Nielsen, 1993). Instead of using remote and large-scale testing options, we chose a lean, iterative approach and committed to two semesters of brief, weekly guerrilla usability sessions with student users. Not only did this method produce powerful results, but its attainable, iterative approach ultimately led to an increased user focus beyond the library homepage. If you have an hour a week, a laptop, and at least one other colleague willing to help, you can start usability testing, guerrilla-style.

Though we found success with this method, using such a stripped-down approach wasn't an immediately obvious choice. At the beginning of this project, we didn't know a whole lot about conducting user testing, having relied mostly on survey responses, search logs, and Piwik website usage data to inform our initial redesign. Although we ran informal tests during the initial design process, participation was limited to library staff and student workers. This feedback had been helpful early on, but we suspected that testing only employees may have skewed our findings: compared to seasoned staff, many users would likely not be as familiar with library resources and website structure.

The web team considered a host of research methods before embracing lean usability. We considered running focus groups, but discovered that our budget rules prohibited us from offering even small gift cards as compensation, generally a prerequisite for participation in such activities. As we explored focus groups further, we also realized that the amount of preparation needed to run a session would make it hard to get regular feedback on any changes we did end up making. Hiring an expert to run the kind of large-scale user test commonly outlined in the literature, or outsourcing testing to a service like UserTesting.com, were other methods we eliminated. We wanted a way to test users that would let us spend less time planning and more time testing, so we could get feedback and deliver usability improvements as fast as possible. The remainder of this piece details our experience conducting weekly tests and offers suggestions for institutions looking to begin similar testing programs.

Go Guerrilla

We first discovered the concept of lean usability via Jeff Gothelf's 2013 book Lean UX: Applying Lean Principles to Improve User Experience. According to Gothelf, lean UX relies on two concepts. First, research should be continuous, meaning built into every step of the design process. Second, research should be collaborative, meaning done by the design team, not outsourced to specialists (p. 74). Further exploring research methods used by tech startups, we found guerrilla-style testing, an incredibly lean, do-it-yourself process. (A similar method is also explored in Steve Krug's 2010 testing bible Rocket Surgery Made Easy, though he never uses the term guerrilla.)

Whatever you call it, this type of research aims to identify usability issues and fix them as quickly as possible. It can be successful with minimal planning, minimal staffing, minimal equipment, and minimal time. Just set up a table with devices for testing in a central location where your users are likely to be (instead of recruiting representative users ahead of time for off-site testing, guerrilla testers do it on the fly). Once you find a willing participant, you have them spend five to ten minutes completing a short series of scripted tasks while you observe and record the results. Then you repeat the same test with four more participants, enough to reveal most of your site's usability problems. The results aren't quantitative, but that's okay: informal observation like this is useful for design teams. When we began our guerrilla testing process, we decided to commit to an hour of testing every Thursday, and soon found it to be the most productive hour of our week. Krug puts it best: "It works because watching users makes you a better designer" (p. 17).

Start Testing

Identify User Needs

We identified broad goals by first considering our users' needs. What should they be able to accomplish? What type of experience should the interface offer? We dug into our usage data to find the most heavily used portions of our site. This led to our initial goal: to provide users intuitive access to information, resources, and services with minimal effort, using both navigation and search. Then we expanded this broad goal by exploring the details. Specifically, we wanted to support users in locating information about the libraries, like hours and student printing policies. We wanted them to be able to find and use resources like books, article databases, and research guides. We also wanted users to be able to find help if needed. These translated into measurable tasks for our initial tests, such as "have user locate the database Web of Science" or "have user contact a reference librarian."

Though we first turned to our collected usage data to understand user needs, we also collaborated with other library staff members during the task creation process. By receiving input from circulation, reference, and liaison staff who regularly interact with students, we were able to test user tasks that might otherwise have been overlooked. Don't worry about figuring everything out at once, and keep in mind that tasks are flexible: the more you test, the more tasks will change. We generally make an informal list of areas we want to test the week before, and consult it as we write the tasks.
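Digging into usage data does not need to be elaborate. As a rough illustration, the short Python sketch below asks a Piwik/Matomo instance for its most-viewed pages through the Reporting API, which is one way to see which parts of a site deserve a task; the URL, site ID, and token are placeholders, and the report worth querying will depend on your own analytics setup.

    # Hypothetical sketch: list the most-viewed pages from a Piwik/Matomo
    # Reporting API so heavily used areas can be turned into test tasks.
    # The URL, site ID, and token are placeholders for your own instance.
    import requests

    MATOMO_URL = "https://analytics.example.edu/index.php"  # placeholder
    PARAMS = {
        "module": "API",
        "method": "Actions.getPageUrls",  # page-view report
        "idSite": 1,                      # your site's numeric ID
        "period": "month",
        "date": "yesterday",
        "flat": 1,                        # individual pages, not folders
        "filter_limit": 20,               # top 20 pages
        "format": "JSON",
        "token_auth": "YOUR_TOKEN",       # placeholder
    }

    for page in requests.get(MATOMO_URL, params=PARAMS, timeout=30).json():
        print(f"{page.get('nb_hits', 0):>6}  {page.get('label')}")

A spreadsheet export of the same report works just as well; the point is simply to ground the task list in what users actually visit.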

Figure 1. Turning your goals into a usable script.

Devise the Script

Once you've identified user tasks, it's time to turn them into testable questions for your script. When constructing these questions, make sure to word them in a way that avoids leading the user to your desired action. Rather than asking participants to locate the 24/7 chat service, we propose a scenario instead, for example: "Suppose you had trouble accessing an article through a database. Show us how you would get in contact with someone who could help you." Because we want the test to take no more than five to ten minutes, we generally stick to six to eight questions. We also prepare a handful of scripted answers in case participants ask specific questions. This has been helpful because, when we began testing, we often found ourselves thrown off by participant questions after following a script for so long. For instance, in response to a question about what an interface is supposed to do, we'll reply, "What do you think?" or explain that we can answer any questions once testing is complete. If a participant seems stuck on a question, we will also give them the option to move on to the next one.

In our case, the person who runs the test is the same person who writes the script. However, collaboration and feedback are still essential during this stage. A day or two before our Thursday session, we review our questions as a team to double-check content, phrasing, and tone. Once we approve the script, one of us types the questions into a new Google Forms document [https://goo.gl/forms/txk6japjdxovlcm13], which we'll use for note taking during the sessions.
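Before the questions reach Google Forms, it can help to keep the whole script in a single reviewable file. Below is a minimal sketch of that idea, in Python purely for convenience; the wording follows the scenario style described above, and the field names and success criteria are our own suggestions rather than anything prescribed.

    # Hypothetical sketch: a weekly test script kept as plain data so the
    # team can review wording before it is copied into Google Forms.
    INTRODUCTION = (
        "Thanks for helping us test the library website. We're testing the "
        "site, not you, so there are no wrong answers. Please think out "
        "loud as you go, and feel free to skip anything you get stuck on."
    )

    TASKS = [
        {
            "scenario": ("Suppose you had trouble accessing an article "
                         "through a database. Show us how you would get in "
                         "contact with someone who could help you."),
            "success": "Reaches the Ask A Librarian chat or contact page.",
        },
        {
            "scenario": ("You need the database Web of Science for an "
                         "assignment. Show us how you would find it."),
            "success": "Locates Web of Science from the homepage.",
        },
        # ...four to six more scenarios, keeping the test under ten minutes
    ]

However you store it, the goal is the same: the whole team signs off on the exact wording before a participant ever hears it.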

At this point, it's also good to script an introduction that you'll read to each user who participates in your session. We like to explain who we are, the project we're working on, and the purpose of the test, so the user has an understanding of the end goal. You should also stress to the participant that there are no wrong answers: you are testing the site, not the skill of its users. It's also important to encourage the participant to think out loud and to express any confusion or notable impressions. We keep this introduction in our Google Forms document as well, so it's easy to reference during the test and each participant hears the same thing.

Run the Session

When it's time to pick a location for the first test, choose a place you know your users will be, like near the entrance to your most visited library at a busy time. Discovery and Innovation finds willing participants by setting up a table with two laptops or devices (one for testing, one for recording results) near the entrance of our busy Undergraduate Library. We've tried other places on campus, like the Student Center and the Purdy-Kresge graduate library, without as much success. The key for us was to find a place with a lot of traffic in and out of the library, during a time of day when students may be free. You also want a time and place that you can repeat regularly.

We run our sessions as a group of three: one person recruits participants, one person runs the test, and one person observes and records the results. Once we're set up for the session, our recruiter approaches students with a friendly greeting, asking for five to ten minutes of their time to help test a new website we're working on. If they seem interested, our moderator explains the test further and, with their permission, starts the session. We always offer a dish of fun-size candy as an incentive, but find that most volunteers are willing to help regardless. Our designated observer takes notes on participant behavior using Google Forms, where we have each script saved for easy access. When we're done with the test, we make sure to thank users for their time and remind them to help themselves to a piece of candy.

After a session, we have a quick web team debrief, where we review the results of the test, identify areas for improvement, and propose solutions for bugs or design issues our users uncovered. Over the next few days, we'll make any necessary changes to the website that we'd like to test, reviewing them at our weekly web team meeting before our next testing session. If possible, don't go live with untested improvements. We keep a separate test site for implementing changes so that we're not changing the live site without running more tests. Once our edits are live on the test site, we restart the testing process to make sure our improvements are effective.
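For the debrief, the observer's notes can be exported from the Google Forms response spreadsheet as a CSV and tallied in a few lines. The sketch below is hypothetical; the column names (Task, Outcome) are placeholders for whatever fields your own form records.

    # Hypothetical sketch: summarize observer notes exported from the
    # Google Forms response spreadsheet ahead of the weekly debrief.
    import csv
    from collections import Counter, defaultdict

    outcomes = defaultdict(Counter)

    with open("session-responses.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Column names are placeholders for your own form fields.
            outcomes[row["Task"]][row["Outcome"]] += 1

    for task, counts in outcomes.items():
        completed = counts.get("Completed", 0)
        total = sum(counts.values())
        print(f"{task}: {completed}/{total} completed  {dict(counts)}")

With only five or so participants a week the numbers stay small, but even a rough tally makes it obvious which task everyone stumbled on.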

The concept of iteration has become the most important factor in our research and redesign. Regular testing lets us make small changes incrementally instead of in huge redesigns, eliminating the risk of disorienting website users with abrupt, unexpected changes. One early success concerned the library's 24-hour chat reference, branded as Ask A Librarian and consistently cited by staff members as an important service. Though this service was accessible from the site's main navigation and a secondary Ask A Librarian page, initial tests revealed that many students were unaware of its existence and had difficulty finding it with the existing navigation. Even when students were able to locate it, they weren't sure how to use it once they were there. After testing several iterations of our navigation, chat interface, and the Ask A Librarian page, we identified what worked. Within a month of implementing the final changes, we got word from our reference coordinator that use of the chat service was up by 70 percent. This was an early, measurable achievement that pushed us to continue with our testing plan.

The most significant improvement we've made since redesigning the homepage involved our Millennium-based library catalog, which had gone untouched since 2006. In the spring of 2017 we began running user tests, revealing several issues with a design we'd had for over a decade, since long before any of the current web team members were hired. In addition to upgrading our integrated library system to Sierra, we redesigned the catalog to modernize its look, added user-friendly features like citation and map tools, and tweaked language to be more intuitive for users. These changes wouldn't have been easy to approach as a single redesign project; the small, iterative changes added up over time.

What makes this type of research easy to repeat is its informal approach. Krug (2010) explains that, provided you're still getting the answers you need, there's no problem in testing fewer users, altering tasks, or even skipping a task if a participant is struggling and it's obvious why. The beauty of guerrilla usability testing is that it's flexible; the most important thing is that you do it, early and often. Some of the improvements we were able to make revealed further weaknesses within our current workflows, which brings us to our final point: this commitment to usability, starting with a mere hour of informal testing per week, has had an important effect on our journey toward a library-wide culture of usability.

References

Gothelf, J. (2013). Lean UX: Applying lean principles to improve user experience. Sebastopol, CA: O'Reilly.

Krug, S. (2010). Rocket surgery made easy: The do-it-yourself guide to finding and fixing usability problems. Berkeley, CA: New Riders.

Nielsen, J. (1993). Iterative user interface design. IEEE Computer, 26(11), 32-41.