Overview of Today's Lecture: Analytical Evaluation / Usability Testing


November 17, 2017 (COMP 3020, Fall 2017)

Analytical Evaluation
- Inspections
  - recapping cognitive walkthrough
  - heuristic evaluation
- Performance modelling

Reminder: Cognitive Walkthrough (Process)
One or more experts use profiles, scenarios, and tasks to walk step by step through a task with the interface (a sketch, paper prototype, working system, etc.). At each step, they ask three questions:
1. Will the user know the correct action? That is, will the user know what to do in the interface to achieve the task?
2. Will the user notice how to do the correct action? Can users see the button or menu item, or be expected to know the hotkey, etc., for the next action? Is it apparent when it is needed?
3. Will the user associate and interpret the response from the action correctly? Will users know from the feedback that they have made a correct or incorrect choice of action?

Example: find a book at Amazon.ca via search
Task: buy a copy of a textbook from amazon.ca using search.
Typical users: students who use the web regularly.

Step 1: find the book using search.
Q: Will users know what to do?
A1: Yes, they know that they must find search.
A2: They may know that they can filter the search by department.
Q: Will users see how to do it?
A1: Yes, they have seen website searches before, and this one uses standard icons.
A2: The search box has a standard dropdown menu that defaults to "All". If users notice that it is a menu, they will understand it as a standard menu; they may not notice that it is a menu.
Q: Will users understand from the feedback whether the action was correct or not?

Step 2: purchase the book.
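A walkthrough produces one record per step. As a minimal sketch of how a team might capture those records as structured data (the field names are hypothetical; only the three questions come from the process above), the findings for step 1 could look like this:

    # Hypothetical record structure for cognitive walkthrough findings.
    # One record per step; one field per walkthrough question.
    from dataclasses import dataclass, field

    @dataclass
    class WalkthroughStep:
        action: str
        knows_what_to_do: str       # Q1: will the user know the correct action?
        sees_how_to_do_it: str      # Q2: will the user notice how to do it?
        understands_feedback: str   # Q3: will the user interpret the response correctly?
        problems: list[str] = field(default_factory=list)

    step1 = WalkthroughStep(
        action="Find the book using search",
        knows_what_to_do="Yes: users know they must find search",
        sees_how_to_do_it="Yes: standard icons, though the dropdown may go unnoticed",
        understands_feedback="To assess: do the results clearly confirm the query?",
        problems=["Department filter menu may not be noticed"],
    )
    print(step1.action, "->", step1.problems)

Keeping answers and problems per step makes it easy to aggregate findings across several walkthroughs of the same task.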

Cognitive Walkthroughs
- Can be integrated throughout your whole design process
- A simple way to keep yourself grounded in the practicalities of the interface

Heuristic Evaluation
Define: heuristic
- a rule of thumb; a principle that is a shortcut for solving a problem or making a decision
- not always right/true, but a cognitive shortcut

Design Heuristics
- Broad usability guidelines that can guide a developer's design efforts
- Derived from common design problems across many systems
- Several researchers and practitioners have developed different sets of heuristics (e.g., domain-specific ones)

Heuristic Evaluation
A systematic inspection of an interface design to see if the interface complies with a set of usability heuristics, or usability guidelines. General process:
» 3-5 inspectors (usability engineers, experts)
» inspect the interface in isolation (~1-2 hours for simple interfaces)
» results are aggregated afterwards
A single evaluator catches ~35% of usability problems; 5 evaluators catch ~75%.

Heuristic Evaluation: Background
- Developed by Jakob Nielsen in the early 1990s
- Based on heuristics distilled from an empirical analysis of 249 usability problems
- The heuristics have been revised for current technology
- Heuristics are still needed for some emerging technologies (e.g., VR, AR, wearables)
- Design guidelines form a basis for developing heuristics
[Figure: proportion of usability problems found (25%-100%) vs. number of evaluators (5, 10, 15)]

Nielsen's Heuristics
- Visibility of system status: are users kept informed at all times?
- Match between system and the real world: is the UI language simple?
- User control and freedom: are there easy escapes from unexpected locations?
- Consistency and standards: is performing similar actions consistent?
- Help users recognize, diagnose, and recover from errors: are error messages helpful?
- Error prevention: is it easy to make errors?
- Recognition rather than recall: are objects, actions, and options always visible?
- Flexibility and efficiency of use: are there accelerators?
- Aesthetic and minimalist design: is any unnecessary or irrelevant information provided?
- Help and documentation: is help provided that can be easily searched?

These heuristics could be too general and might need to be tailored to the environment. E.g., HOMERUN has been suggested for corporate web site evaluation:
- High-quality content
- Often updated
- Minimal download time
- Ease of use
- Relevant to users' needs
- Unique to the online medium
- Netcentric corporate culture
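The curve in the figure reflects Nielsen and Landauer's finding that the proportion of problems found grows as 1 - (1 - L)^n, where L is the probability that a single evaluator finds a given problem. A minimal sketch, assuming Nielsen's published average of L ≈ 0.31 (the actual value varies by project):

    # Expected proportion of usability problems found by n evaluators,
    # per the Nielsen-Landauer model. L = 0.31 is an assumed average;
    # fit it to your own data where possible.
    def problems_found(n_evaluators: int, l: float = 0.31) -> float:
        return 1 - (1 - l) ** n_evaluators

    for n in (1, 3, 5, 10):
        print(f"{n:>2} evaluator(s): {problems_found(n):.0%}")
    #  1 evaluator(s): 31%
    #  3 evaluator(s): 67%
    #  5 evaluator(s): 84%
    # 10 evaluator(s): 98%

The diminishing returns are why 3-5 inspectors is the usual recommendation.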

Heuristic Evaluation: Advantages
- A few guidelines identify many common usability problems
- Fewer practical and ethical issues to deal with, since there are no participants
- Cheap and fast
- Provides a common evaluation template (to compare approaches and systems)

Heuristic Evaluation: Problems
- Principles may be too general
- There are subtleties involved in their use
- The designer may not be able to overcome being defensive / experts may disagree
- You may actually have the wrong design altogether
- It can be hard to find experts
- False positives: does the rule always apply?
- Not complete: will miss problems
- Not a replacement for user testing

Performance Modelling
Using mathematical models to generate quantitative predictions of certain interface actions or sequences of actions.

Fitts' Law
One of the most tested, lasting models in HCI. It models target acquisition performance:

    T = a + b log2(A/W + 1)

where
- T = time to acquire the target
- A = amplitude (distance to the target)
- W = width (size) of the target
- a, b = empirically derived constants
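A minimal sketch of how the model is used for prediction (the constants a and b below are illustrative placeholders; real values are fit from pointing data for a given device and user population):

    # Predicted pointing time under Fitts' Law (Shannon formulation).
    import math

    def fitts_time(distance: float, width: float,
                   a: float = 0.1, b: float = 0.15) -> float:
        """T = a + b * log2(A/W + 1), in seconds."""
        return a + b * math.log2(distance / width + 1)

    # Doubling a target's width lowers the predicted acquisition time:
    print(round(fitts_time(distance=400, width=20), 2))  # 0.76
    print(round(fitts_time(distance=400, width=40), 2))  # 0.62

Larger and nearer targets are faster to acquire, which is one reason frequently used controls are made big or placed at screen edges.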

Fitts' Law
- What does the law tell us?
- How can you use the law?

Hick-Hyman Law
Fitts' Law predicts how long it will take users to acquire targets once they know which target to select. What about decision time? The Hick-Hyman Law models the time it takes users to decide between n alternatives.

When the items are equi-probable:

    T = a + b log2(n + 1)

When certain items are more likely to be chosen than others:

    T = a + b * sum over i of [ p_i * log2(1/p_i + 1) ]

Applications? E.g., deciding on menu depth vs. breadth.
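A minimal sketch applying the equi-probable form to the depth-vs-breadth question (the coefficients a and b are illustrative placeholders; they must be fit empirically):

    # Hick-Hyman decision-time estimates for menu depth vs. breadth.
    import math

    def hick_time(n: int, a: float = 0.2, b: float = 0.15) -> float:
        """Decision time (s) among n equi-probable alternatives:
        T = a + b * log2(n + 1)."""
        return a + b * math.log2(n + 1)

    # 64 commands: one broad menu of 64 vs. two nested levels of 8.
    broad = hick_time(64)      # one decision among 64 items
    deep = 2 * hick_time(8)    # two successive decisions among 8
    print(f"broad: {broad:.2f} s, deep: {deep:.2f} s")  # broad wins

Because decision time grows only logarithmically in n, one broad menu beats nested narrow ones under this model; the caveat below (search time is linear for unfamiliar items) pushes the other way.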

Hick-Hyman Law
- Models decision time, not searching time
- If the user is not familiar with the interface elements, the time to search through n items is linear, not logarithmic

Keystroke Level Model (KLM)
- Given a task consisting of a sequence of steps, how long will it take the user to perform those steps with a specific interface?
- Models performance, given a sequence of steps, for an expert user

KLM: How is performance calculated?
Individual steps are described using operators:
- K = keystroking = 0.35 s
- P = pointing = 1.10 s (use Fitts' Law for greater precision)
- B = button press or release (mouse) = 0.10 s (BB for a mouse click = 0.20 s)
- H = homing = 0.40 s
- D = drawing = variable with the length of the line
- M = mental operator = 1.35 s
- R = response operator by the system = 1.20 s
Sum up the times for the individual steps.

Activity: how long would it take the user to replace all occurrences of a 4-letter word with a new 4-letter word?

KLM Activity

    Description                      Operation   Time (s)
    Reach for mouse                  H           0.40
    Move pointer to Replace button   P           1.10
    Click on Replace button          BB          0.20
    Home on keyboard                 H           0.40
    Type old word                    M, 4K       2.75
    Reach for mouse                  H           0.40
    Move pointer to correct field    P           1.10
    Click on field                   BB          0.20
    Home on keyboard                 H           0.40
    Type new word                    M, 4K       2.75
    Reach for mouse                  H           0.40
    Move pointer to Replace All      P           1.10
    Click Replace All                BB          0.20
    Total                                        11.40

According to this model, the task takes 11.4 seconds.

Performance Modelling: Advantages
- Can evaluate components of an interface prior to building it
- Good for comparing different interface possibilities
- Evaluations with human subjects are expensive, and experts can be hard to find
- Can get the kinks out of an interface prior to full user testing/experimentation
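The same calculation is easy to script. A minimal sketch using the operator times from the slide (the operator-string encoding is just an illustrative convention):

    # KLM calculator using the operator times above; reproduces the
    # 11.4 s estimate for the replace-a-4-letter-word task.
    OPERATOR_TIMES = {
        "K": 0.35,  # keystroke
        "P": 1.10,  # point with the mouse
        "B": 0.10,  # mouse button press or release (BB = full click)
        "H": 0.40,  # home hands between mouse and keyboard
        "M": 1.35,  # mental preparation
        "R": 1.20,  # system response
    }

    def klm_time(ops: str) -> float:
        """Total predicted time for a sequence of KLM operators."""
        return sum(OPERATOR_TIMES[op] for op in ops)

    # Reach, point, click, home, think, type 4 letters -- twice
    # (old word, new word) -- then reach, point, click Replace All.
    task = "HPBBHMKKKK" * 2 + "HPBB"
    print(f"{klm_time(task):.1f} s")  # 11.4 s

Encoding alternative designs as operator strings and comparing the totals is the typical way KLM is used to choose between interfaces before building them.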

Usability test Usability Testing A usability test is a formal method for evaluating whether a design is learnable, efficient, memorable, can reduce errors, meets users expectations, etc. users are not being evaluated the design is being evaluated 29 30 Usability Test Rough Outline Bring in real users Have them complete tasks with your design, while you watch (ideally with your entire team) Measure and record things task completion, task time, error rates satisfaction, problem points, etc. Use the data to Usability Test Rough Outline Identify problems (major ones minor ones) Provide design suggestions to design/engineering team Iterate on the design, repeat use a think-aloud protocol, so you can hear what they are thinking 31 32 8

Important Considerations Testing Environments Usually takes place in a usability lab or other controlled space Major emphasis is on selecting representative users developing representative tasks 5-10 users typically selected Tasks usually last no more than 30 minutes The test conditions should be the same for every participant Informed consent form explains ethical issues 33 34 Testing Environments Testing Environments Best environment depends on pragmatic considerations, as well as what you re looking for Do you want your whole team to be able to view? Do you want to be able to review a test? How important are interruptions? What are you resources? 35 36 9

Pilot studies Practice runs Especially important for usability testing Make sure your plan is viable All the corners are checked (your script, questionnaires, tasks, etc., all work) It is worth doing several to iron out problems before doing the main study Ask colleagues if you can t spare real users 37 10