SFU CMPT-363 2004-2, Week 11
Manuel Zahariev (manuelz@cs.sfu.ca)
Based on course material from Arthur Kirkpatrick, Alissa Antle and Paul Hibbits
July 21, 2004

Analytic Methods

Advantages:
- Can be used early, before users and prototypes are ready for empirical testing
- A cost-effective way to evaluate
- Can inform the design of an empirical study: what aspects to study, what to measure, what hypotheses to test

Disadvantages:
- Can feel as though the designers themselves are being evaluated

Inspections

Inspections are one type of analytic method. They are common in industry and provide rapid feedback, but often identify problems rather than solutions.

Types of inspections:
- Cognitive Walkthroughs
- Formal Usability Inspections
- Heuristic Evaluations

Cognitive Walkthroughs

Focus: identify errors that might occur in crossing the gulf of execution.

A cognitive walkthrough is an expert critique of a user interface that involves simulating the use of the system and analyzing possible problems in goal selection, planning or action execution. It analyzes in detail users' goals, expectations and reactions during tasks: a formalized way of imagining people's thoughts and actions as they work through tasks.

Who and when?

- For a small piece of an interface, you can do your own informal "in your head" walkthrough to monitor the design as you work.
- For larger parts of an interface, get together with a group of people, including other designers and users, and do a walkthrough for a complete task.

A walkthrough can be performed at any time throughout the design process, but is especially useful in the earlier stages.

Observation 1: A cognitive walkthrough is a tool for developing the interface, not validating it. The expected result of the walkthrough is to find things that can be improved.

What you need

1. A description or prototype of the interface
2. A task description
3. A complete, written list of the actions needed to complete the task with the interface (e.g. a scenario)
4. An idea of who the users will be and what kind of experience they will bring to the job

Steps in a Walkthrough

1. Select one of the tasks that the design is intended to support.
2. Try to tell a believable story about each action a user has to take to do the task.
3. To make the story believable, you have to motivate each of the user's actions, relying on the user's general knowledge and on the prompts and feedback provided by the interface.
4. If you cannot tell a believable story, then you have located a problem with the interface.

Problems to look out for

1. Question assumptions about what the users will be thinking
2. Identify controls that are obvious to the design engineer but may be hidden from the user's point of view
3. Suggest difficulties with labels and prompts
4. Note inadequate feedback
5. Uncover shortcomings in the current specification (i.e. not in the interface itself but in the way that it is described)
6. Help ensure that the specs are complete
7. May require the designer to go back and talk to the user

Conducting the Walkthrough

Keep the following key questions in mind as you critique the story:
1. Will users be thinking what the designer assumes they will be thinking?
2. Will users see the control (button, menu, switch, etc.) for the action?
3. Once users find the control, will they recognize that it produces the effect they want?
4. After the action is taken, will users understand the feedback they get, so they can go on to the next action with confidence?

Documenting a Walkthrough

It is important to document the cognitive walkthrough to keep a good record of what needs improvement. Use standard evaluation forms.

Documenting a Walkthrough

Cover form:
- description of the prototype
- description of the task
- written list of the actions
- description of the users
- date and time of the walkthrough
- names of the evaluators

Documenting a Walkthrough

Individual problem forms: for each action, fill out a separate form, documenting each negative answer (one answer per form). Each form should record:
- the system being built (and the version, if any)
- the date
- the evaluators' names
- a detailed description of the usability problem
- the severity of the problem (how often it might occur and its impact on the user)
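As a minimal sketch of what such a problem form might look like as a record in code (the field names and the example values are our own illustration, not part of any standard form):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class WalkthroughProblem:
    """One individual problem form: a single negative answer for one action."""
    system: str            # system being built
    version: str           # version, if any
    found_on: date         # date of the walkthrough
    evaluators: list[str]  # evaluators' names
    action: str            # the step in the action list where the problem surfaced
    description: str       # detailed description of the usability problem
    frequency: str         # how often the problem might occur
    impact: str            # impact on the user

# Hypothetical example record:
problem = WalkthroughProblem(
    system="Library catalogue kiosk",
    version="0.3",
    found_on=date(2004, 7, 21),
    evaluators=["First Evaluator", "Second Evaluator"],
    action="Step 3: recall that the command cut is in the edit menu",
    description="No visible cue that deletion is done via the cut command.",
    frequency="Likely on every first-time use",
    impact="User stalls and may abandon the task.",
)
```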

Common Mistakes

- The written list of actions is merged into the walkthrough: the evaluator makes up the actions as the walkthrough proceeds and often ends up stumbling through the interface. Start with a correct list of the individual actions.
- The walkthrough is tested on real users: the walkthrough is an evaluation tool that helps you and your coworkers apply your design expertise to the evaluation of the interface. It is NOT tested with real users.

Because you have considered the behaviour of a whole class of users, you should identify more problems than you would expect any single, unique user to find.

What to do with results?

Fix the interface! Many of the fixes will be obvious. A harder problem exists when the user does not have any reason to think that an action needs to be performed at all.

Variation: Formal Usability Inspection

A Formal Usability Inspection takes the software inspection methodology (code inspections) and adapts it to usability evaluation. Inspectors walk through tasks with the user's goals and purpose in mind, similar to cognitive walkthroughs, although the emphasis is less on cognitive theory and more on encountering defects. The technique reduces the time required to discover defects in a tight product design cycle and can be applied early, on specs or prototypes.

Method

The Formal Usability Inspection formalizes the review of a specification or early prototype. Basic steps:
- Assemble a team of inspectors (4-8).
- Assign each a special role in the context of the inspection.
- Distribute the design documents to be inspected, along with instructions.
- Have the inspectors go off on their own to do their inspection.
- Convene later in a formal inspection meeting.

Defects found are assigned to responsible parties to be fixed, and the cycle continues.

Predictive Models

Goal: use established theories to build a predictive model of user behavior. Predictive models provide various measures of user performance without actually testing users. They are used to:
- evaluate alternative systems before buying one
- evaluate rival designs at the specification stage (before a prototype is built)
- predict the execution time of tasks that skilled users are likely to perform

Predictive Models

Observation 2: Predictive models are most useful for applications where alternate solutions are being compared and saved seconds can result in saved $$$. Examples: a telephone operator workstation; a registration system upgrade.

GOMS

Developed by Stu Card et al. in the early 1980s. GOMS is a family of models, which vary in their level of detail about what aspects of human performance they make predictions about. Examples:
- time to execute tasks
- best strategy to execute a task
- consistency across tasks

GOMS

GOMS attempts to model the knowledge and cognitive processes involved when users interact with a system. The model is composed of Goals, Operators, Methods and Selection rules. More recent models allow for parallel task execution.

G. O.

Goals = the state that the user wants to achieve (e.g. find a web site on interface design).

Operators = the cognitive processes and physical actions that need to be performed in order to attain the goal(s) (e.g. decide which search engine to use; think up and then enter words into the search engine).

M. S.

Methods = learned procedures for accomplishing the goal(s); the exact sequence of steps required (e.g. drag the mouse over the entry field, type in keywords, press the "go" button).

Selection Rules = used to determine which method to select when more than one is available for a given stage of a task (e.g. once keywords have been entered, a selection rule would determine whether to press the enter key on the keyboard or use the mouse to click the "go" button to start the search).

Example: delete a word in a sentence

Goal: delete word.
Method: use menu options.
- Step 1: Recall that a word to be deleted has to be highlighted.
- Step 2: Recall that the command is cut.
- Step 3: Recall that the command cut is in the edit menu.
- Step 4: Execute the select and cut commands.
- Step 5: Return with goal accomplished.

Second Method

Method: use delete key.
- Step 1: Recall where to position the cursor in relation to the word to be deleted.
- Step 2: Position the cursor to delete the word.
- Step 3: Recall which is the delete key.
- Step 4: Press the delete key to delete each letter.
- Step 5: Return with goal accomplished.

Operators and Selection Rules

Operators to use in the above methods:
- Click mouse.
- Drag cursor over text.
- Select menu.
- Move cursor to command.
- Press keyboard key.

Selection rules to decide which method to use (see the sketch below):
1. Delete text using mouse and menu if a large amount of text is to be deleted.
2. Delete text using the delete key if a small number of letters is to be deleted.
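Selection rules of this kind are simple conditionals on properties of the task, so they translate naturally into code. The sketch below is purely illustrative; the threshold separating a "small number of letters" from a "large amount of text" is our own assumption, not a GOMS constant.

```python
def select_deletion_method(num_chars: int, threshold: int = 10) -> str:
    """Selection rule for the goal 'delete word': pick one of the two methods.

    The threshold value is a made-up illustration, not part of GOMS.
    """
    if num_chars > threshold:
        return "menu method"    # highlight the word, then Edit > Cut
    return "delete-key method"  # position the cursor, press delete per letter

print(select_deletion_method(3))   # delete-key method
print(select_deletion_method(40))  # menu method
```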

When to use GOMS?

- To evaluate alternative systems before buying one.
- To evaluate alternate designs at the specification stage, before a prototype is built.

GOMS Variation: Keystroke-Level Model

Provides actual numerical predictions of user performance (i.e. the execution time of tasks that skilled users are likely to perform). Usage: compare tasks in terms of the time they take using different strategies.
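A keystroke-level prediction is simply the sum of standard operator times over the sequence of physical and mental operators a task requires. The sketch below uses the commonly cited Card, Moran and Newell operator estimates; the operator sequences encoding the two word-deletion methods are our own rough illustration, not taken from the original model.

```python
# Keystroke-Level Model: predicted execution time = sum of operator times.
# Times are rough published averages for skilled users, not exact constants.
KLM_TIMES = {
    "K": 0.2,   # keystroke or button press (average skilled typist)
    "P": 1.1,   # point with the mouse to a target on screen
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(ops: str) -> float:
    """Sum operator times for an operator string such as 'MHPKK'."""
    return sum(KLM_TIMES[op] for op in ops)

# Compare the two word-deletion methods from the GOMS example
# (operator encodings are illustrative assumptions):
menu_method = "MHPKKMPK"    # think, grab mouse, point at word, double-click to
                            # select, think, point to Edit > Cut, click
delete_key  = "MHPKHMKKKK"  # think, grab mouse, point after word, click, home
                            # to keyboard, recall delete key, four deletes
print(f"menu method: {predict_time(menu_method):.2f} s")
print(f"delete key:  {predict_time(delete_key):.2f} s")
```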

Analytic Methods: Heuristic Evaluation

Definition 1 (Heuristic Evaluation): A Heuristic Evaluation is an evaluation of a user interface against a checklist of design rules or heuristics.

Observation 3: A heuristic evaluation can quickly and effectively reveal key user interface issues.

Nielsen article: http://www.useit.com/papers/heuristic/heuristic_evaluation.html

Advantages/Disadvantages

(+) Reasonably cheap and fast
(+) Can be used at any time in the design process
(+) Identifies many usability problems
(-) More effective with fully defined designs
(-) Can over-emphasize minor problems
(-) Does not identify task-based problems
(-) Does not identify domain-related problems

Heuristic Evaluations: History

In 1990, Jakob Nielsen (Sun Microsystems, now the Nielsen Norman Group) and Rolf Molich developed the first list of usability heuristics. Nielsen later revised it (1994) into somewhat more abstract items, based on an analysis of 249 usability problems.

Number of evaluators?

Heuristic evaluation is most effective when multiple evaluators are involved; the optimal number is 3 to 5. Possible evaluators:
- user interface designers
- project team members
- representative/actual end-users
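The 3-to-5 figure comes from a cost-benefit argument. Nielsen and Landauer model the proportion of known problems found by i independent evaluators as 1 - (1 - L)^i, where L is the probability that a single evaluator finds a given problem (about 0.31 averaged over their case studies). A quick sketch of the curve:

```python
# Nielsen and Landauer's problems-found model: found(i) = N * (1 - (1 - L)**i),
# where N is the total number of problems and L ~ 0.31 is the chance that one
# evaluator finds a given problem (an average across their case studies).
L = 0.31

for i in (1, 2, 3, 5, 10, 15):
    print(f"{i:2d} evaluators -> {1 - (1 - L)**i:.0%} of problems found")
# The curve climbs steeply to roughly 85% at 5 evaluators, then flattens:
# each additional evaluator mostly rediscovers already-known problems.
```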

Nielsen's Heuristics

1. Simple and natural dialog
2. Speak the users' language
3. Minimize user memory load
4. Be consistent
5. Provide feedback
6. Provide clearly marked exits
7. Provide shortcuts
8. Good error messages
9. Prevent errors
10. Help and documentation

Other Heuristics

Other heuristic lists:
- Gary Perlman: Practical Usability Heuristics (1997).
- Jill Gerhardt-Powals: Cognitive Engineering Principles for Enhancing Human-Computer Performance (1996).

Donald Norman's principles of good design can also serve as an effective heuristic list.

Heuristic Evaluations: Overview

1. Select a heuristics list.
2. Have 3-5 evaluators (each alone) examine the interface (twice), judging its compliance with the set of heuristics.
3. Combine the findings into one list.
4. Have the evaluators independently rate severity.
5. Present the findings to the design team, highlighting positive findings as well.

Recording Results (possible scenarios)

Evaluators write up reports:
- Written record provides a formal report.
- Extra effort to aggregate.

Observer records what the evaluator says during the examination:
- Overhead of an observer.
- Decreases the workload of evaluators.
- Faster and easier to write up a summary.
- Observer can help evaluators in case of problems.

Assessing Severity

The severity of a usability problem can be assessed by considering:
- frequency
- impact
- persistence
- cost

Assessing Severity

0 - Not a problem: I don't agree that this is a usability problem at all.
1 - Cosmetic only: need not be fixed unless extra time is available on the project.
2 - Minor usability problem: fixing this should be given low priority.
3 - Major usability problem: important to fix, so it should be given high priority.
4 - Usability catastrophe (critical usability problem): imperative to fix before the product can be released.
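Because evaluators rate severity independently and a single rating is unreliable, ratings from several evaluators are usually combined (Nielsen suggests taking the mean). A minimal sketch, with the 0-4 scale above encoded as an enum:

```python
from enum import IntEnum
from statistics import mean

class Severity(IntEnum):
    NOT_A_PROBLEM = 0  # not a usability problem at all
    COSMETIC = 1       # fix only if extra time is available
    MINOR = 2          # fixing should be given low priority
    MAJOR = 3          # important to fix; high priority
    CATASTROPHE = 4    # imperative to fix before release

def combined_severity(ratings: list[Severity]) -> float:
    """Average the independent ratings from several evaluators."""
    return mean(ratings)

# Three evaluators rate the same problem:
print(combined_severity([Severity.MINOR, Severity.MAJOR, Severity.MAJOR]))  # ~2.67
```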

Simple and natural dialogue

In dialogues with the user:
- Simple: no irrelevant or rarely used information.
- Natural: the order of presentation matches the task order.

Speak the users' language

Use words and concepts from the users' world. Do not use system-specific engineering terms.

Minimize user memory load

Do not make the user remember things from one action to the next. Leave information on the screen until it is no longer needed.

Be consistent!

Users should be able to learn an action sequence in one part of the system and apply it again to get similar results in other places.

Provide feedback

Let users know what effect their actions have on the system. The system should continuously inform the user about what it is doing and how it is interpreting the user's input.

Provide clearly marked exits

If users get into a part of the system that does not interest them, they should always be able to get out quickly without damaging anything.

Provide shortcuts

Shortcuts can help experienced users avoid lengthy dialogs and informational messages that they do not need. Examples: keyboard shortcuts, mouse shortcuts.

Good error messages

Observation 4: Good error messages let the user know what the problem is and how to correct it.

Prevent errors

Suggestion 1: Whenever you write an error message, you should also ask: can this error be avoided?

Types of errors

Errors of the user:
- Remove all unnecessary ones.
- Provide clear explanations, using the language of the user.
- Provide adequate solutions.
- Make sure the solutions offered represent the vast majority of applicable cases.

Types of errors

Errors of the programmer:
- Remove all.
- Convert the remainder into Technical Support Incidents to be submitted.

Errors in third-party components:
- Trap all.
- Convert them into Technical Support Incidents to be submitted by the user.

Help and documentation

Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation.