Chapter 15: Analytical evaluation

Aims: Describe inspection methods. Show how heuristic evaluation can be adapted to evaluate different products. Explain how to do heuristic evaluation and walkthroughs. Describe how to perform GOMS and Fitts' Law analyses, and when to use them. Discuss the advantages and disadvantages of analytical evaluation.

Inspections Several kinds. Experts use their knowledge of users & technology to review software usability. Expert critiques (crits) can be formal or informal reports. Heuristic evaluation is a review guided by a set of heuristics. Walkthroughs involve stepping through a pre-planned scenario noting potential problems.

Heuristic evaluation Developed by Jakob Nielsen in the early 1990s. Based on heuristics distilled from an empirical analysis of 249 usability problems. These heuristics have been revised for current technology. Heuristics are being developed for mobile devices, wearables, virtual worlds, etc. Design guidelines form a basis for developing heuristics.

Nielsen's heuristics Visibility of system status. Match between system and the real world. User control and freedom. Consistency and standards. Error prevention. Recognition rather than recall. Flexibility and efficiency of use. Aesthetic and minimalist design. Help users recognize, diagnose, and recover from errors. Help and documentation.

Discount evaluation Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used. Empirical evidence suggests that on average 5 evaluators identify 75-80% of usability problems.

[Figure: proportion of usability problems found plotted against the number of evaluators]
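
The shape of that curve is often modelled (Nielsen and Landauer, 1993) by assuming each evaluator independently finds a given problem with probability lambda, so i evaluators find a proportion 1 - (1 - lambda)^i of the problems. A minimal sketch of the calculation, assuming the illustrative value lambda = 0.31 (close to the average Nielsen reported; real values vary from project to project):

    # Expected proportion of usability problems found by i independent
    # evaluators, following Nielsen & Landauer's model: 1 - (1 - lam)**i.
    # lam (the chance that one evaluator spots a given problem) is set to an
    # illustrative 0.31 here.
    def proportion_found(i, lam=0.31):
        return 1 - (1 - lam) ** i

    for i in (1, 3, 5, 10):
        print(f"{i:2d} evaluators -> {proportion_found(i):.0%}")
    # 1 -> 31%, 3 -> 67%, 5 -> 84%, 10 -> 98%: five evaluators find roughly
    # 84%, broadly in line with the 75-80% figure quoted above, and extra
    # evaluators add comparatively little after that.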

3 stages for doing heuristic evaluation Briefing session to tell experts what to do. Evaluation period of 1-2 hours in which: Each expert works separately; Take one pass to get a feel for the product; Take a second pass to focus on specific features. Debriefing session in which experts work together to prioritize problems.

Advantages and problems Few ethical & practical issues to consider because users are not involved. Can be difficult & expensive to find experts. The best experts have knowledge of the application domain & its users. Biggest problems: Important problems may get missed; Many trivial problems are often identified; Experts have biases.

Cognitive walkthroughs Focus on ease of learning. The designer presents an aspect of the design & usage scenarios. The expert is told the assumptions about the user population, context of use, and task details. One or more experts walk through the design prototype with the scenario. The experts are guided by 3 questions.

The 3 questions Will the correct action be sufficiently evident to the user? Will the user notice that the correct action is available? Will the user associate and interpret the response from the action correctly? As the experts work through the scenario they note problems.

Pluralistic walkthrough Variation on the cognitive walkthrough theme. Performed by a carefully managed team. The panel of experts begins by working separately. Then there is managed discussion that leads to agreed decisions. The approach lends itself well to participatory design.

A project for you http://www.id-book.com/catherb/ provides heuristics and a template so that you can evaluate different kinds of systems. More information about this is provided in the interactivities section of the id-book.com website.

Predictive models Provide a way of evaluating products or designs without directly involving users. Less expensive than user testing. Usefulness limited to systems with predictable tasks - e.g., telephone answering systems, mobiles, cell phones, etc. Based on expert error-free behavior.

GOMS Goals - the state the user wants to achieve, e.g., find a website. Operators - the cognitive processes & physical actions needed to attain the goals, e.g., decide which search engine to use. Methods - the procedures for accomplishing the goals, e.g., drag the mouse over a field, type in keywords, press the 'go' button. Selection rules - decide which method to select when there is more than one.
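
To illustrate the four components, the sketch below shows one possible GOMS-style decomposition of the goal "find a book on a shopping website", written as a small Python structure. The goal, method, and operator names are invented for this example and are not an official GOMS notation.

    # Hypothetical GOMS-style decomposition of a simple goal. The structure
    # and names are illustrative only, not a standard GOMS notation.
    goms_example = {
        "goal": "find a book on a shopping website",
        "methods": {
            "search": [  # operators: cognitive and physical steps
                "decide which search facility to use",
                "point at the search field and click",
                "type in the title keywords",
                "press the 'go' button",
                "scan the results",
            ],
            "browse": [
                "click the 'Books' category",
                "scan the category list",
                "repeat for sub-categories until the book is found",
            ],
        },
        # Selection rule: if the exact title is known, use the "search"
        # method; otherwise fall back on "browse".
    }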

Keystroke level model GOMS has also been developed to provide a quantitative model: the keystroke level model (KLM). The KLM allows predictions to be made about how long it takes an expert user to perform a task.

Response times for keystroke-level operators (Card et al., 1983)

    Operator  Description                                                     Time (sec)
    K         Pressing a single key or button:
                average skilled typist (55 wpm)                               0.22
                average non-skilled typist (40 wpm)                           0.28
                pressing shift or control key                                 0.08
                typist unfamiliar with the keyboard                           1.20
    P         Pointing with a mouse or other device on a display to select
              an object (derived from Fitts' Law, discussed below)            0.40
    P1        Clicking the mouse or similar device                            0.20
    H         Bringing the hands home onto the keyboard or other device       0.40
    M         Mentally preparing/responding                                   1.35
    R(t)      System response time, counted only if it causes the user
              to wait                                                         t
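
A minimal worked example of a keystroke-level prediction, using the operator times from the table above; the task breakdown (pointing at a search box, clicking it, then typing a three-letter query and Enter) is invented purely for illustration:

    # Keystroke-level model (KLM) estimate for a hypothetical task:
    # move hand to mouse, point at a search box, click it, move hand back to
    # the keyboard, mentally prepare, then type "cat" and press Enter.
    # Operator times (seconds) are the Card et al. (1983) values listed above.
    K = 0.22    # press a key (average skilled typist)
    P = 0.40    # point with a mouse (average value, derived from Fitts' Law)
    P1 = 0.20   # click the mouse button
    H = 0.40    # "home" the hands onto the mouse or keyboard
    M = 1.35    # mentally prepare/respond

    operators = [H, P, P1, H, M, K, K, K, K]   # "cat" = 3 keys, Enter = 1 key
    predicted_time = sum(operators)
    print(f"Predicted expert, error-free time: {predicted_time:.2f} s")  # 3.63 s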

Fitts' Law (Fitts, 1954) Fitts' Law predicts that the time to point at an object using a device is a function of the distance from the target object & the object's size. The further away & the smaller the object, the longer the time to locate it and point to it. Fitts' Law is useful for evaluating systems for which the time to locate an object is important, e.g., a cell phone or a handheld device.
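
The slide states the relationship qualitatively; a widely used quantitative form (the Shannon formulation popularised by MacKenzie) is T = a + b log2(D/W + 1), where D is the distance to the target, W is the target's width along the axis of motion, and a and b are constants fitted empirically for a particular device. A minimal sketch, with purely illustrative values for a and b:

    import math

    # Fitts' Law, Shannon formulation: movement time grows with the index of
    # difficulty ID = log2(D/W + 1), measured in bits. The constants a and b
    # below are illustrative placeholders; in practice they are fitted to
    # measured data for a specific pointing device.
    def fitts_time(distance, width, a=0.1, b=0.15):
        index_of_difficulty = math.log2(distance / width + 1)
        return a + b * index_of_difficulty   # seconds

    # A small, distant target takes longer to acquire than a large, nearby one.
    print(f"{fitts_time(distance=600, width=20):.2f} s")   # about 0.84 s
    print(f"{fitts_time(distance=100, width=80):.2f} s")   # about 0.28 s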

A project for you Use the web and other resources to research claims that heuristic evaluation often identifies problems that are not serious and may not even be problems. Decide whether you agree or disagree. Write a brief statement arguing your position. Provide practical evidence & evidence from the literature to support your position.

A project for you: Fitts' Law. Visit Tog's website and do Tog's quiz, designed to give you fitts! http://www.asktog.com/columns/022DesignedToGiveFitts.html

Key points Expert evaluation: heuristic evaluation & walkthroughs. Relatively inexpensive because no users are involved. Heuristic evaluation is relatively easy to learn. May miss key problems & identify false ones. Predictive models are used to evaluate systems with predictable tasks, such as telephones. GOMS, the keystroke level model, & Fitts' Law predict expert, error-free performance.