Usability Tests and Heuristic Reviews: Planning and Estimation Worksheets


For STC DC Usability SIG
Planning and Estimation Worksheets
Scott McDaniel, Senior Interaction Designer
26 February, 2003
Eval_Design_Tool_Handout.doc

Cognetics Corporation
E-mail: info@cognetics.com | Web: www.cognetics.com
51 Everett Drive #103B, Princeton Junction, NJ 08550 | 609.799.5005
1320 Fenwick Lane, Suite 209, Silver Spring, MD 20910 | 301.587.7549

Usability Test Planning Tool

Designing and planning usability tests presents something of a chicken-and-egg problem. It can be difficult to design a test unless you know the time and resources available to you, yet you need to know the best test design before you can provide an estimate or justification of time and resources. This tool lets you attack the problem from both directions. If you need to work out the optimal test design first, complete the Usability Test Design section and then the Project Planning and Estimation section. If, on the other hand, you are working under schedule or budget constraints, work through the Project Planning and Estimation section first to get an idea of how to spend the available resources, and then complete the Usability Test Design section.

Usability Test Design

Every usability test requires you to address certain issues that shape the test's design. Answer these questions before creating your testing materials (scripts, prototypes, etc.) to make certain that you have covered all of the issues and discussed their impact on the test design.

Goals

The stage of LUCID in which you conduct a usability test affects the test's goals. Note the stage of LUCID in which this test will take place, and then state your specific goals and the design implications for those goals.

Stage: Envision
- Typical goals: Benchmark the previous release; benchmark the competition.
- Design implications: Summative test. Measure task times, error rates, and subjective impression.

Stage: Discovery
- Typical goals: Discover users' pain points; evaluate concept sketches; discover users' mental model.
- Design implications: Summative or formative test. Ask users to explain how they think sketches or paper prototypes work, and what they think each element means. Ask users to perform a typical task with a prototype or existing version, letting them choose the task.

Stage: Design Foundation
- Typical goals: Evaluate the UI concept, UI navigation, screen layout, terminology, key workflows, and key screens.
- Design implications: Formative test. Ask users to navigate to screens within the prototype, to explain the elements of a prototype screen, and to perform key tasks.

Stage: Design Detail
- Typical goals: Evaluate specific workflows, specific screens, and the user assistance strategy.
- Design implications: Formative test. Ask users to perform specific tasks of interest.

Stage: Build
- Typical goals: Fine-tune specific workflows and specific screens; compare the usability of the new version with benchmarks.
- Design implications: Formative or summative test. Ask users to perform tasks that require live screens and/or a database. Measure task times, error rates, and subjective impression.

Stage: Release
- Typical goals: Compare the usability of the new version with benchmarks; benchmark the current release.
- Design implications: Summative test. Measure task times, error rates, and subjective impression.

For formative tests, list the design decisions you intend to make based on the results of the test (there is space for 5, but list more or fewer as needed). Also list potential tasks that could provide results allowing you to make those decisions. If possible, avoid general design questions like "Is the navigation structure working?" Instead, focus on specific questions that can be clearly answered. For summative tests, list the screens or tasks that you wish to evaluate and any usability goals.

Design decision (formative example): Should we keep secondary navigation on the screen all the time, or have the user mouse over the first-level choices to see a menu of secondary navigation?
- Potential tasks and implications: Find the Account History screen. Provide some users with one strategy and some users with the other. After the test, ask users how the navigation works.

Design decision (summative example): Did we decrease the average time required to complete a sale by at least 20% compared to the previous version?
- Potential tasks and implications: Complete a sale. Test identical tasks for both versions. Have users complete several sales to see if there are learning effects.

Design decisions and potential tasks for this test (fill in):
1)
2)
3)
4)
5)

Participants

The number of participants to test depends on what you want to get out of the test, the budget, and the time available. Redish and Dumas recommend no fewer than 3 users per sub-group of the population you want to test, and 2 to 4 sub-groups. If you are testing a particularly crucial product, or if you want to use statistical analysis, you may want more. You may also want more if you are conducting a comprehensive test designed to measure general usability levels, or if you allow users to choose the specifics of their tasks, as Spool does. Each test has its own constraints and needs, but you can use this grid as a starting point in choosing the number of users to test:

Formative test (design tool):
- Formal: 3 users per sub-group
- Informal: 2 users per sub-group
- Quick 'n' Dirty: 2-3 users

Summative test (usability measurement):
- Formal: 5 users per sub-group
- Informal: 3 users per sub-group
- Quick 'n' Dirty: N/A

Prototype & Test Equipment

A significant part of any usability test effort is the physical production of the prototype to be tested. You can use several kinds of prototype for the usability test, depending on what you need to test. The more functional the prototype, the better the results of the usability test. The extra cost and time required to build functional prototypes, however, often makes paper prototypes or non-functional prototypes the better choice. The table below shows the strengths of the various options (yes means that kind of prototype supports that kind of testing).

For testing: Paper prototype / Online, non-functional prototype / Online, functional prototype or released product
- Page structure: yes / yes / yes
- Terminology: yes / yes / yes
- Navigation among pages: yes / yes / yes
- Basic workflow: yes / yes / yes
- Navigation within pages: no / yes / yes
- Screens or tasks that provide immediate feedback: no / no / yes
- Screens or tasks that require computation or a database: no / no / yes
- Ability to manipulate objects in the UI: no / no / yes
- Detailed workflow: no / no / yes
- Error handling: no / yes / yes

Analysis

Decide how you will conduct the analysis. Your needs may differ, but you can use the table below as a starting point to guide your analysis strategy.

Formative test (design tool):
- Formal: Descriptive statistics and observation detail
- Informal: Observation details
- Quick 'n' Dirty: Observation details

Summative test (usability measurement):
- Formal: Inferential statistics (showing statistical significance)
- Informal: Descriptive statistics
- Quick 'n' Dirty: N/A

The test's goals influence the data you collect. If you need to show statistics, you will need to design the test to produce objective measurements such as time to completion, error rate, indicators of confusion, number of swear words, and so on. If you are not concerned with statistics, you may still want a way to categorize and rate the severity of observation details.

Reporting & Presentation

It is critical to plan not only the presentation of results to the larger team, but also the process for acting on those results. Here are typical reporting and presentation strategies, along with their strengths and weaknesses.

Report
- Strengths: Allows thorough documentation of process and results. Allows others to replicate the test or compare results of later tests. Is a clear, solid deliverable.
- Weaknesses: May not get read. Results are often not acted upon.

Presentation
- Strengths: Ensures that attendees know of the results. Allows for questions and discussion.
- Weaknesses: May be difficult to get everyone together.

Working Meeting
- Strengths: Allows for questions and discussion. Has the best chance of incorporating results into new design iterations.
- Weaknesses: May have nothing specific to show for the test except design changes. May be difficult to get everyone together.

Project Planning and Estimation

These planning methods and estimation rules of thumb provide answers to a Project Manager's questions of "How much time?" and "How much money?" for usability tests. By following the steps below, you will have solid reasoning behind your figures. As you conduct more usability tests, you will be able to supplement these numbers with your own experience. There are two estimation methods available. You can:

1. Begin with the project plan, sometimes called a Work Breakdown Structure by managers, and estimate the number of hours you think are needed for each activity for each role. The total hours required for the project is the sum of these individual estimates. This is a bottom-up approach.

2. Begin with the estimation rule of thumb described below, which gives a quick starting estimate from the number of participants and the length of each test session. You can then distribute these hours among the activities in the project plan. This is a top-down approach.

Project Plan

Determine the roles and activities required for the usability test and distribute the hours accordingly.

Roles

Most usability tests require at least two people, and a third or fourth person is often helpful. A single person can conduct quick, informal tests if need be. The key roles are:
- Facilitator: Interacts with users during the test, as appropriate.
- Observer: Observes the user and takes notes during the test.

Additional roles that are often helpful include:
- Recruiter: Contacts and screens potential users.
- A/V Technician: Handles video and audio equipment before and after the test. May assist in producing highlights tapes.

Work Breakdown Structure

A work breakdown structure is a tool that managers use to estimate, organize, and track work on projects. Estimate the time needed for each activity to determine the total person-hours needed for a test. If, on the other hand, you have x hours in which to complete a project, distribute those hours among the activities below to determine how much time to spend on each task. For each activity, record the hours for each role: Facilitator, Observer, Recruiter, and A/V Tech.

Create Evaluation Plan:
- Determine product availability
- Review and learn product
- Establish goals for the usability evaluation
- Coordinate test platform
- Design evaluation tasks
- Create evaluation plan

Recruit Participants:
- Identify participants to recruit
- Create recruitment screeners
- Obtain contact info for participants
- Recruit participants

Prepare Evaluation Materials:
- Set up product to be tested (screens, wireframes, etc.)
- Create/assemble scripts, forms, and materials
- Handle facilities logistics (find lab, etc.)
- Prepare lab equipment
- Conduct dry run

Conduct Evaluation Sessions: x participants in y days

Analyze evaluation data

Present Findings:
- Prepare draft report/presentation
- Review draft report/presentation
- Prepare final report/presentation
- Present final report/presentation
- Facilitate decision-making
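If it helps to see the bottom-up arithmetic in one place, here is a minimal Python sketch that totals a filled-in worksheet. The activity groups and role names follow the worksheet above; the hour values are illustrative placeholders, not recommendations.

```python
# A minimal bottom-up total for a filled-in work breakdown structure.
# Activity groups and role names follow the worksheet above; the hour
# values are placeholders you would replace with your own estimates.

wbs = {
    "Create Evaluation Plan":       {"Facilitator": 8,  "Observer": 2,  "Recruiter": 0,  "A/V Tech": 0},
    "Recruit Participants":         {"Facilitator": 1,  "Observer": 0,  "Recruiter": 12, "A/V Tech": 0},
    "Prepare Evaluation Materials": {"Facilitator": 10, "Observer": 4,  "Recruiter": 0,  "A/V Tech": 3},
    "Conduct Evaluation Sessions":  {"Facilitator": 12, "Observer": 12, "Recruiter": 0,  "A/V Tech": 6},
    "Analyze evaluation data":      {"Facilitator": 8,  "Observer": 8,  "Recruiter": 0,  "A/V Tech": 0},
    "Present Findings":             {"Facilitator": 10, "Observer": 4,  "Recruiter": 0,  "A/V Tech": 2},
}

# Total person-hours per role, then the grand total for the test.
role_totals = {}
for hours_by_role in wbs.values():
    for role, hours in hours_by_role.items():
        role_totals[role] = role_totals.get(role, 0) + hours

total_hours = sum(role_totals.values())
print(role_totals)
print("Total person-hours:", total_hours)
```

The per-role totals also feed directly into the labor-cost calculation described under Estimate Cost and Schedule for the Test.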

Estimation Rule of Thumb

Based on our experience conducting usability tests, we have arrived at a way to compute a quick starting estimate of the hours needed to conduct a usability test. Use this number as a starting point for your planning, not as the last word on the effort required for the test.

The estimate for a formal usability test is 10 hours of effort per hour of testing, covering preparation and analysis as well as the hour spent testing. Thus, if p is the number of participants and you want each test session to be h hours:

Initial hours estimate = 10 x p x h

Informal tests can use a similar equation, but typically need only 7 or 8 hours per hour of testing. For guidance on selecting the right number of participants, see the Usability Test Design section.

Factors that may prompt you to adjust these figures include:
- Availability of a Testable Prototype. Is a prototype ready to go, or does one need to be built? Paper prototypes take less time to create, but creating one still takes longer than testing an existing prototype.
- Difficulty of Recruiting Users. Is there a standard procedure to contact and recruit users, or must you start from scratch? In particular, is there internal resistance to contacting users?
- Specialization of the Application. Will the Facilitator and Observer need to spend time learning the system themselves?
- Detail of Analysis. Do you plan to record and statistically analyze in-depth usability metrics, or make simple observations and summarize them in a table?
- Format of the Report. Do you plan to produce a complete written report with a full review cycle, conduct only a quick results/decisions meeting, or something in between?

Estimate Cost and Schedule for the Test

Schedule

Using the activities above, and assuming 6 useful hours per workday, determine the calendar schedule for the usability test. (Note: 6 hours per day is not to suggest that we're all goofing off, but rather to acknowledge that we get called into staff meetings, attend training, answer e-mail, and perform other useful but not project-related activities.)

Cost

Estimate the labor cost of the usability test by multiplying the hourly rates of the people involved by the number of hours they are to spend on the test. Estimate additional costs using available information: price equipment and lab rental in your area, and estimate incentives based on your area and participant income. Typical costs include usability lab rental, A/V equipment rental, and incentives for participants.
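To make the arithmetic concrete, here is a small Python sketch that strings together the rule of thumb, the 6-useful-hours-per-workday schedule assumption, and the labor-cost calculation. The participant count, session length, hourly rates, and hour split are made-up values for illustration only.

```python
import math

# Top-down starting estimate for a formal usability test: 10 hours of effort
# per hour of testing (informal tests: roughly 7-8). The values for p, h,
# rates, and hour splits below are placeholders, not recommendations.
p = 6      # number of participants
h = 1.5    # hours per test session

initial_hours = 10 * p * h            # initial hours estimate = 10 x p x h

# Calendar schedule, assuming 6 useful project hours per workday.
workdays = math.ceil(initial_hours / 6)

# Labor cost: hourly rate x hours for each person working on the test.
staff = {"Facilitator": (75, 50), "Observer": (60, 40)}   # role: (rate, hours)
labor_cost = sum(rate * hours for rate, hours in staff.values())

print(f"{initial_hours:.0f} hours, about {workdays} workdays, ${labor_cost} labor")
```

Swap in your own figures from the Participants section and the work breakdown structure; the rule of thumb only gives you a defensible starting point to distribute.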

Heuristic Review Planning Tool

Heuristic Review Process

Every heuristic review requires you to address certain issues that shape the review's design. Answer these questions before beginning the review to make certain that you have covered all of the issues and discussed their impact on the review and its presentation.

Goals

The stage of LUCID in which you conduct a heuristic review affects the review's goals. Note the stage of LUCID in which this review will take place, and then state your specific goals.

Stage: Envision
- Typical goals: Inventory usability issues in a previous release; prioritize usability issues.

Stage: Discovery
- Typical goals: Provide feedback on an emerging design.

Stage: Design Foundation
- Typical goals: Evaluate navigation, architecture, and high-level concepts in the UI; identify areas to investigate in a usability test; substitute for an early usability test.

Stage: Design Detail
- Typical goals: Evaluate specific workflows, specific screens, and the user assistance strategy.

Stage: Build
- Typical goals: Fine-tune specific workflows and specific screens.

Stage: Release
- Typical goals: Prepare for upcoming releases.

List the review's specific objectives below (there is space for 3, but list more or fewer as needed). What will be done with the report or presentation?

Purpose of the review (example): Discover and prioritize usability issues in a recent software version so management can better allocate resources for an upcoming version.
- Process and reporting implications: Concentrate on high-level recommendations. Provide a high-level slide show outlining issues, examples, and recommendations. Provide complete observations in a separate, detailed report.

Purpose of the review (example): Guide the design of a still-in-progress user interface.
- Process and reporting implications: Hold a meeting with the UI design team to discuss each observation and decide upon a solution.

Purposes and implications for this review (fill in):
1)
2)
3)

Heuristics

By definition, a heuristic review is a comparison of a user interface's compliance with a given set of heuristics. The following list of heuristic sets will get you started. They are necessarily general; you should look for more specific heuristics in your particular area.

- Cognetics Corporation: http://www.cognetics.com/services/heuristic_guidelines.html
- Jakob Nielsen: http://www.useit.com/papers/heuristic/heuristic_list.html
- National Cancer Institute: http://www.usability.gov/guidelines/index.html
- STC Usability SIG: http://www.stcsig.org/usability/resources/toolkit/toolkit.html
- Scott McDaniel (for search user interfaces): McDaniel, S. and McDaniel, M. (2002). The Big Dig: Mining Nuggets of Value. User Experience, 1:2, 20-29.

Prototype & Test Equipment

Most heuristic reviews are performed on complete, functional user interfaces, but they can also be performed on prototypes. The more functional the prototype, the more complete the review can be. The table below shows the strengths of the various options.

For reviewing: Paper prototype / Online, non-functional prototype / Online, functional prototype or released product
- Page structure: yes / yes / yes
- Terminology: yes / yes / yes
- Navigation among pages: yes / yes / yes
- Basic workflow: yes / yes / yes
- Navigation within pages: no / yes / yes
- Screens or tasks that provide immediate feedback: no / no / yes
- Screens or tasks that require computation or a database: no / no / yes
- Ability to manipulate objects in the UI: no / no / yes
- Detailed workflow: no / no / yes
- Error handling: no / yes / yes

Reporting & Presentation

It is critical to plan not only the presentation of results to the larger team, but also the process for acting on those results. Here are typical reporting and presentation strategies, along with their strengths and weaknesses.

Report
- Strengths: Allows thorough documentation of process and results. Is a clear, solid deliverable.
- Weaknesses: May not get read. Results are often not acted upon.

Presentation
- Strengths: Ensures that attendees know of the results. Allows for questions and discussion.
- Weaknesses: May be difficult to get everyone together.

Working Meeting
- Strengths: Allows for questions and discussion. Has the best chance of incorporating results into new design iterations.
- Weaknesses: May have nothing specific to show for the review except design changes. May be difficult to get everyone together.

Project Planning and Estimation

These planning methods and estimation rules of thumb provide answers to a Project Manager's questions of "How much time?" and "How much money?" for a heuristic review. By following the steps below, you will have solid reasoning behind your figures. As you conduct more reviews, you will be able to supplement these numbers with your own experience. There are two estimation methods available. You can:

1. Begin with the project plan, sometimes called a Work Breakdown Structure by managers, and estimate the number of hours you think are needed for each activity for each role. The total hours required for the project is the sum of these individual estimates. This is a bottom-up approach.

2. Begin with the estimation rule of thumb, which predicts that a heuristic review takes 6 hours per screen reviewed. You can then distribute these hours among the activities in the project plan. This is a top-down approach.

Project Plan

Determine the roles and activities required for the heuristic review and distribute the hours accordingly.

Roles

You can complete a heuristic review alone, but having two or more reviewers generally produces better results. The key roles are:
- Primary Reviewer: Reviews the UI, leads the effort, and writes the report/presentation.
- Secondary Reviewers: Review the UI and coordinate with the primary reviewer.

Work Breakdown Structure

A work breakdown structure is a tool that managers use to estimate, organize, and track work on projects. Estimate the time needed for each activity to determine the total person-hours needed for a review. If, on the other hand, you have x hours in which to complete a project, distribute those hours among the activities below to determine how much time to spend on each task. For each activity, record the hours for each role: Primary Reviewer and Secondary Reviewers 1 through 3.

Review Kickoff:
- Establish goals of the review
- Establish usability goals of the product
- Establish the product schedule
- Introduce reviewer(s) to the product
- Q&A about the product

Initial Review:
- Conduct independent reviews of the product
- Merge observations and questions
- Submit questions about the product

Second Review:
- Meet with the project team to answer questions
- Complete the product review

Heuristic Report:
- Write draft report
- Review/revise report with secondary reviewers
- Review/revise report with project team
- Finish final report

Results Presentation:
- Present results and recommendations to the project team
- Facilitate decision-making
- Consult with the project team during implementation of recommendations

Estimation Rule of Thumb

Based on our experience conducting heuristic reviews, we have arrived at a way to compute a quick starting estimate of the hours needed to conduct a review. Use this number as a starting point for your planning, not as the last word on the effort required for the review.

The estimate for a heuristic review is 6 hours per screen reviewed. Thus, if s is the number of screens to review:

Initial hours estimate = 6 x s

This estimate provides a starting point that you will have to adjust based on your specific situation. Factors that may prompt you to adjust these figures include:
- Availability of a Reviewable Prototype. Is a prototype ready to go, or does one need to be built? Paper prototypes take less time to create, but reviews of functional prototypes provide better results.
- Specialization of the Field. Will the reviewers need to spend time understanding basic concepts of the system's domain in order to understand it?
- Specialization of the Application. Will the reviewers need to spend time learning the system before they can review it effectively?
- Format of the Report. Do you plan to produce a complete written report with a full review cycle, conduct only a quick results/decisions meeting, or something in between?

Estimate Cost and Schedule for the Review

Schedule

Using the activities above, and assuming 6 useful hours per workday, determine the calendar schedule for the heuristic review. (Note: 6 hours per day is not to suggest that we're all goofing off, but rather to acknowledge that we get called into staff meetings, attend training, answer e-mail, and perform other useful but not project-related activities.)

Cost

Estimate the labor cost of the heuristic review by multiplying the hourly rates of the people involved by the number of hours they are to spend on the review. Heuristic reviews do not typically incur costs beyond labor unless travel is involved.
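As with the usability test estimate, a short Python sketch shows how the rule of thumb, the schedule assumption, and the labor cost fit together; the screen count, reviewer split, and hourly rates are illustrative assumptions only.

```python
import math

# Top-down starting estimate for a heuristic review: 6 hours per screen
# reviewed. The screen count, reviewer split, and rates are placeholders.
s = 12                                # number of screens to review
initial_hours = 6 * s                 # initial hours estimate = 6 x s

# Calendar schedule, again assuming 6 useful project hours per workday.
workdays = math.ceil(initial_hours / 6)

# Labor is usually the only cost: hourly rate x hours for each reviewer.
reviewers = {"Primary Reviewer": (80, 48), "Secondary Reviewer": (65, 24)}
labor_cost = sum(rate * hours for rate, hours in reviewers.values())

print(f"{initial_hours} hours, about {workdays} workdays, ${labor_cost} labor")
```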