
CMSC434 Introduction to Human-Computer Interaction, Week 12, Lecture 24, Nov 21, 2013. Intro to Evaluation. Human-Computer Interaction Laboratory, @jonfroehlich, Assistant Professor, Computer Science

Hall of Fame Hall of Shame

Mozilla Thunderbird

I have to do an extra "save as" because just opening it puts it into a weird place. Especially problematic if I don't have a default application for it.

Once I hit save, it will close and I'll have to go to that location to open it.

Today 1. Schedule 2. In-Class Activity 3. Evaluation

Work on your projects!

In-Class Activity


Evaluation

User Research Methods: Formative, Build, Evaluative

How long does it take the average user to make a flight reservation on this website?

How many users are successful when trying to record all episodes of their favorite TV show?

What percentage of users have Google Instant Search turned off?

How do users feel when playing this video game?

These questions require user evaluation.

Why do we evaluate our designs?

Although Democrats are listed 2nd in the left column, they are the 3rd hole on the ballot. Punching the 2nd hole casts a vote for the Reform Party!

In order to evaluate, we need to define metrics.

Usability Metrics

Usability metrics reveal something about the user experience: the personal experience of the human being using the thing.

A usability metric reveals something about the interaction between the user and the thing. [Tullis & Albert, Measuring the User Experience, 2008, p. 8]


A usability metric reveals something about the interaction between the user and the thing:
Effectiveness: being able to complete a task.
Efficiency: the amount of effort required to complete the task.
Satisfaction: the degree to which the user was happy with their experience while completing the task.
These metrics can help answer critical questions: Will users like the product? Is this new product more efficient than past products? How does the usability of this product/version compare to others? What are the most significant usability problems with this product? Are improvements being made from one design iteration to another? [Tullis & Albert, Measuring the User Experience, 2008, p. 8]
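The three metrics above are straightforward to compute from raw test data. A minimal sketch in Python, using made-up session records (the field layout and all numbers are hypothetical):

```python
from statistics import mean

# Hypothetical task-session records: (completed?, seconds on task, rating 1-5)
sessions = [
    (True, 42.0, 4),
    (True, 58.5, 5),
    (False, 120.0, 2),
    (True, 37.2, 4),
]

# Effectiveness: share of sessions in which the task was completed
effectiveness = sum(done for done, _, _ in sessions) / len(sessions)

# Efficiency: mean time on task, over completed sessions only
efficiency = mean(t for done, t, _ in sessions if done)

# Satisfaction: mean self-reported rating
satisfaction = mean(r for *_, r in sessions)

print(f"effectiveness {effectiveness:.0%}, "
      f"efficiency {efficiency:.1f}s, satisfaction {satisfaction:.2f}/5")
# effectiveness 75%, efficiency 45.9s, satisfaction 3.75/5
```

In practice these would come from logged test sessions rather than a literal list, but the aggregation step is the same.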

Types of Evaluation

Two Types of Evaluation: Formative Summative

Formative evaluation: like a chef who periodically checks a dish while it's being prepared and makes adjustments to positively impact the end result.

The chef evaluates, adjusts, and re-evaluates. The goal is to make improvements in the design, so formative evaluation occurs before the design is finalized.


Summative evaluation: evaluating the dish after it comes out of the oven. Is it as good as we expected? Why not?

Two Types of Evaluation
Formative. Key questions: What are the top usability issues? What aspects work well? What don't users understand? What are common errors? Is the design getting better? What issues will likely remain in the final design?
Summative. Key questions: How does our product compare against the competition? Are we doing better than before? What are the final usability metrics of this design? Which areas have the most potential for improvement in the 2.0 version?
[See Chapter 3 in Tullis & Albert, Measuring the User Experience, 2008]

Regardless of whether we are conducting formative or summative evaluations, important questions need to be answered about our evaluation approach: the why, what, where, and when of evaluation.

Evaluation Planning Questions: What is the focus of our evaluation? Who are our users/participants? How many participants do we need? What is our budget/timeline? What evaluation method should we employ? What kind of data should we collect?

Genres of Assessment


Genres of Assessment
Inspection-Based Methods: based on the skills and experience of evaluators (sometimes also called Expert Reviews), e.g., 1. Heuristic Evaluation, 2. Walkthroughs
Automated Methods: usability measures computed by software
Formal Methods: models and formulas to calculate and predict measures semi-automatically
Empirical Methods: evaluation assessed by testing with real users

Discount Usability Techniques
Heuristic Evaluation: assess the interface based on a predetermined list of criteria.
Cognitive Walkthroughs: put yourself in the shoes of a user.

Heuristic Evaluation
"Heuristic evaluation is a method for finding the usability problems in a user interface design so that they can be attended to as part of an iterative design process." (Jakob Nielsen, Ph.D., "The Guru of Web Page Usability" (NYT), inventor of heuristic evaluation) [Nielsen, J., and Molich, R. (1990). Heuristic evaluation of user interfaces, CHI'90; Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., and Mack, R.L. (Eds.), Usability Inspection Methods.]

"Heuristic evaluation involves having a small set of evaluators examine the interface and judge its compliance with recognized usability principles (the 'heuristics')." (Jakob Nielsen)

Performing Heuristic Evaluation
A small set of evaluators (experts) examines the UI and checks its compliance with usability heuristics. Different evaluators will find different problems, so the evaluators communicate afterwards to aggregate their findings. Designers then use the violations to redesign and fix problems. Heuristic evaluation can be performed at various stages of the iterative design cycle (e.g., on sketches, on interactive prototypes, or on the final design).
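The aggregation step described above can be sketched as a small script: each evaluator's independent findings are merged by problem, so the team can see how many evaluators caught each issue and its average severity. All evaluator names, problem descriptions, and ratings here are hypothetical:

```python
from collections import defaultdict

# Hypothetical findings: each evaluator independently reports
# (problem description, violated heuristic, severity 0-4).
reports = {
    "evaluator_a": [("'Save' vs 'Write file' wording", "H4 Consistency", 3)],
    "evaluator_b": [("'Save' vs 'Write file' wording", "H4 Consistency", 2),
                    ("No undo after delete", "H3 User Control", 4)],
    "evaluator_c": [("No undo after delete", "H3 User Control", 4)],
}

# Aggregation: merge duplicate findings, collecting every severity rating.
merged = defaultdict(list)
for findings in reports.values():
    for desc, heuristic, severity in findings:
        merged[(desc, heuristic)].append(severity)

# Report problems ordered by mean severity, highest first.
for (desc, heuristic), sevs in sorted(
        merged.items(), key=lambda kv: -sum(kv[1]) / len(kv[1])):
    print(f"[{heuristic}] mean severity {sum(sevs) / len(sevs):.1f}, "
          f"found by {len(sevs)} of {len(reports)} evaluators: {desc}")
```

The "found by N of M evaluators" count is useful in the debriefing: problems spotted by several evaluators independently are rarely false alarms.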

Nielsen's 10 Heuristics [Nielsen, J., and Molich, R. (1990). Heuristic evaluation of user interfaces, CHI'90; Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., and Mack, R.L. (Eds.), Usability Inspection Methods; http://www.useit.com/papers/heuristic/]

1. Visibility of System Status: The system should always keep users informed through appropriate feedback at reasonable times.


2. Match Between System & Real World: The system should speak the user's language, with familiar words. Information should appear in a natural and logical order.


3. User Control & Freedom: Users often choose functions by mistake and need a clearly marked emergency exit. Support undo and redo.


4. Consistency & Standards: Users should not have to wonder whether different words or actions mean the same thing. Follow platform conventions.


Example: similar menus and primary options on the Ribbon.

5. Error Prevention: Even better than good error messages is a careful design that prevents the problem in the first place.

Example: visually distinguishing the primary action from the secondary action; auto-complete cuts down on misspellings. [http://www.slideshare.net/sacsprasath/ten-usability-heuristics-with-example]

6. Recognition Over Recall: Minimize the user's memory load by making objects, actions, and options visible. The user shouldn't have to remember information from one dialog to the next.

Example: Komodo Edit auto-complete. [http://www.slideshare.net/sacsprasath/ten-usability-heuristics-with-example]

7. Flexibility & Efficiency: Accelerators (unseen by novice users) often speed up interaction for expert users. Allow users to tailor frequent actions.

8. Aesthetic & Minimalist Design: Interfaces shouldn't contain irrelevant information. Every unit of information competes for attention and diminishes the relative visibility of the rest.


9. Help Users Recognize, Diagnose, & Recover from Errors: Error messages should be expressed in plain language, precisely indicate the problem, and suggest a solution.

10. Help & Documentation: It is best if the system can be used without documentation, but when necessary, documentation should be easy to search, focused on the user's tasks, and list concrete steps.

Phases of Heuristic Evaluation
1. Pre-evaluation training: give evaluators the needed domain knowledge and information on the scenario.
2. Evaluation: for ~1-2 hours, independently inspect the product, using the heuristics for guidance. Each expert should take more than one pass through the interface.
3. Severity rating: determine how severe each problem is.
4. Aggregation: the group meets and aggregates the problems (with ratings).
5. Debriefing: discuss the outcome with the design team.

Severity Ratings
0 = don't agree that this is a usability problem
1 = cosmetic problem
2 = minor usability problem
3 = major usability problem; important to fix
4 = usability catastrophe; imperative to fix

Example finding: [H4 Consistency] [Severity 3] [Fix 0] (fairly severe, but easy to fix). The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.
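The 0-4 scale lends itself to a simple triage pass over a findings list: anything rated 3 or above blocks release. A sketch, with hypothetical findings:

```python
# Nielsen's 0-4 severity scale as a lookup table; the findings are hypothetical.
SEVERITY = {
    0: "not a usability problem",
    1: "cosmetic problem",
    2: "minor usability problem",
    3: "major usability problem; important to fix",
    4: "usability catastrophe; imperative to fix",
}

findings = [
    ("'Save' vs 'Write file' wording", 3),
    ("Logo slightly off-center", 1),
    ("Form data lost on back navigation", 4),
]

# Triage: keep only must-fix items (severity >= 3), most severe first.
must_fix = sorted((f for f in findings if f[1] >= 3), key=lambda f: -f[1])
for desc, sev in must_fix:
    print(f"severity {sev} ({SEVERITY[sev]}): {desc}")
```

In a real evaluation the fix-difficulty rating (the "[Fix 0]" above) would also feed into the ordering, since severe-but-easy fixes are often tackled first.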

How Many Evaluators?
In principle, individual evaluators can perform a heuristic evaluation of a user interface on their own, but different evaluators find different problems, and different numbers of them. [Figure: a grid in which each row is a usability problem, ordered from easiest to hardest to find, and each column is an individual evaluator, ordered from least to most successful. The worst evaluator found only 3 usability problems, and they were the easiest to find; the best evaluator found 10, but not the two hardest.] [Nielsen, J., and Molich, R. (1990). Heuristic evaluation of user interfaces, CHI'90; Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., and Mack, R.L. (Eds.), Usability Inspection Methods; http://www.useit.com/papers/heuristic/]


How Many Evaluators? Well, then, how many evaluators should we use? Nielsen recommends ~5 evaluators (and at least 3), which balances cost and benefit: single evaluators found, on average, only ~35% of usability problems.
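One way to see why ~5 evaluators is a reasonable balance: if each evaluator independently finds a fraction p of the problems, n evaluators together find about 1 - (1 - p)^n of them (the model Nielsen and Landauer fit to their data). A quick sketch, treating the ~35% figure above as p (an assumption; the actual rate varies by study and interface):

```python
# Problems-found curve: fraction of all usability problems found by
# n evaluators, assuming each independently finds a fraction p of them.
def fraction_found(n_evaluators: int, p: float = 0.35) -> float:
    return 1 - (1 - p) ** n_evaluators

for n in (1, 3, 5, 10):
    print(f"{n:2d} evaluator(s) -> ~{fraction_found(n):.0%} of problems found")
```

With p = 0.35, three evaluators cover roughly 73% of the problems and five roughly 88%, after which each additional evaluator adds relatively little.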