Heuristic Evaluation
CS 147: HCI+D User Interface Design, Prototyping, and Evaluation
Computer Science Department, Autumn 2014
October 30, 2014

Hall of Fame or Shame? Pocket by Read It Later
- Items can be saved from iPhone, iPad, Android devices, the web app, the Chrome/Firefox extension, or email.

Hall of Fame: Pocket by Read It Later
- Good: beautiful visual design; support across a large number of devices means you can always use it; optimized for quickly and easily saving and reading articles.
- Bad: significantly reformats content; saves the link from tweets, but not the message.

Hall of Fame or Shame? forecast.io (courtesy of William D.)

Hall of Fame: forecast.io (courtesy of William D.)
- Good: uncluttered visual design; key info large (current weather); simple, understandable icons; easy to scan the week's weather; optional details & animations.
- Bad: the Precip Map takes a lot of space; advertising seems out of place.

Outline
- Visual Design Review
- HE Process Overview
- The Heuristics
- Team Break
- How to Perform Heuristic Evaluation
- Heuristic Evaluation vs. Usability Testing

Visual Design Review
- Start with context: what is the nature of the information? What is most important?
- Avoid clutter; focus on the essence of your tasks: CUT.
- Design first in grayscale to focus on hierarchy.
- Small changes help us see key differences (e.g., small multiples).
- Use color properly, not for ordering; only use one or two colors at a time, unless absolutely necessary.

Evaluation
- About figuring out how to improve the design.
- Issues with lo-fi tests? Not realistic visuals & performance; not on the actual interface; can't test alone, need participants.

Heuristic Evaluation
- Developed by Jakob Nielsen.
- Helps find usability problems in a UI design.
- A small set (3-5) of evaluators examine the UI independently and check for compliance with usability principles ("heuristics").
- Why 3-5 evaluators? Different evaluators will find different problems. Evaluators only communicate afterwards; findings are then aggregated.
- Can be performed on a working UI or on sketches.

Why Multiple Evaluators?
- Every evaluator doesn't find every problem.
- Good evaluators find both easy & hard ones.

Heuristic Evaluation Process
- Evaluators go through the UI several times: inspect the various dialogue elements; compare them with a list of usability principles; consider other principles/results that come to mind.
- Usability principles: Nielsen's heuristics, plus a supplementary list of category-specific heuristics from competitive analysis & user testing of existing products.
- Use violations to redesign/fix problems.

Heuristics (original)
- H1-1: Simple & natural dialog
- H1-2: Speak the users' language
- H1-3: Minimize users' memory load
- H1-4: Consistency
- H1-5: Feedback
- H1-6: Clearly marked exits
- H1-7: Shortcuts
- H1-8: Precise & constructive error messages
- H1-9: Prevent errors
- H1-10: Help and documentation

Heuristics (revised set)

H2-1: Visibility of system status
- Keep users informed about what is going on.
- Example: pay attention to response time (a small sketch below turns these thresholds into code):
  - 0.1 sec: no special indicators needed (why?)
  - 1.0 sec: user tends to lose track of data
  - 10 sec: max duration if the user is to stay focused on the action
  - For longer delays, use percent-done progress bars.
- Bad example: Mac desktop. Dragging a disk to the trash should delete it, not eject it.

H2-2: Match between system & real world
- Follow real-world conventions.
- Speak the users' language.
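The response-time thresholds in H2-1 map directly onto a feedback policy. Here is a minimal sketch, in Python for illustration; the threshold constants come from the slide, while the function and indicator names are invented for this example:

```python
# Response-time thresholds from the H2-1 guidance (in seconds).
INSTANT = 0.1      # feels instantaneous: no indicator needed
FLOW_LIMIT = 1.0   # user notices the delay: show a brief busy indication
FOCUS_LIMIT = 10.0 # user will lose focus past this: show percent-done progress

def feedback_for(expected_seconds: float) -> str:
    """Pick a feedback indicator based on the expected delay."""
    if expected_seconds <= INSTANT:
        return "none"          # result appears immediately
    if expected_seconds <= FLOW_LIMIT:
        return "busy-cursor"   # brief, non-modal indication
    if expected_seconds <= FOCUS_LIMIT:
        return "spinner"       # indeterminate indicator is acceptable
    return "progress-bar"      # long task: show percent-done progress

if __name__ == "__main__":
    for t in (0.05, 0.5, 5, 60):
        print(f"{t:>6}s -> {feedback_for(t)}")
```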

Wizards
- User must respond to a question before going to the next.
- Good for beginners and infrequent tasks (e.g., router config.).
- Not for common tasks; have 2 versions (WinZip).

H2-3: User control & freedom
- Exits for mistaken choices; undo, redo.
- Don't force users down fixed paths, like that BART ticket machine.

H2-4: Consistency & standards
- Use the same language, placement, etc. everywhere.
- Follow platform conventions.

H2-5: Error prevention
- Bad example: % rm -rf * (a confirmation sketch follows below).

H2-6: Recognition rather than recall
- Make objects, actions, options, & directions visible/easily retrievable.

H2-7: Flexibility and efficiency of use
- Accelerators for experts (e.g., gestures, keyboard shortcuts).
- Allow users to tailor frequent actions (e.g., macros).

H2-8: Aesthetic & minimalist design
- No irrelevant information in dialogues.
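To make the rm -rf * contrast concrete, here is a hedged error-prevention sketch: a destructive operation that demands explicit confirmation before acting. The delete_all helper and its confirmation convention are assumptions for illustration, not a prescribed design:

```python
from pathlib import Path

def delete_all(directory: Path) -> None:
    """Delete every file in `directory`, but only after explicit confirmation.

    Unlike `rm -rf *`, the user is told exactly what will be lost
    before anything irreversible happens (H2-5: error prevention).
    """
    files = [f for f in directory.glob("*") if f.is_file()]
    answer = input(f"Delete {len(files)} files in {directory}? "
                   f"Type the directory name to confirm: ")
    if answer != directory.name:
        print("Aborted: nothing was deleted.")  # clearly marked exit (H1-6)
        return
    for f in files:
        f.unlink()  # only reached after an informed, explicit confirmation
```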

H2-9: Help users recognize, diagnose, & recover from errors
- Error messages in plain language.
- Precisely indicate the problem.
- Constructively suggest a solution.

Good Error Messages (an example contrasting bad and good messages follows below)
- Clearly indicate what has gone wrong.
- Human readable.
- Polite.
- Describe the problem.
- Explain how to fix it.
- Highly noticeable.

H2-10: Help and documentation
- Easy to search.
- Focused on the user's task.
- List concrete steps to carry out.
- Not too large.

Mobile Heuristics
- Enrico Bertini, Silvia Gabrielli, and Stephen Kimani. 2006. Appropriating and assessing heuristics for mobile computing. In Proceedings of the Working Conference on Advanced Visual Interfaces (AVI '06). ACM, New York, NY, USA, 119-126. DOI: 10.1145/1133265.1133291, http://doi.acm.org/10.1145/1133265.1133291
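As an illustration of these criteria, here is a small sketch contrasting a bare error code with messages that name the problem and suggest a fix. The file format, the wording, and the profile --init command are invented for the example:

```python
import json

def load_profile(path: str) -> dict:
    """Read a user profile, reporting failures per H2-9."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        # Bad:  "Error 2". Good: plain language, precise, constructive.
        raise SystemExit(
            f"Could not find the profile file '{path}'.\n"
            f"Check that the file exists, or create one with: profile --init"
        )
    except json.JSONDecodeError as e:
        raise SystemExit(
            f"'{path}' is not valid JSON (line {e.lineno}): {e.msg}.\n"
            f"Fix the syntax, or regenerate the file with: profile --init"
        )
```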

QUIZ
- URL only available in class.

TEAM BREAK (20 MINUTES)

Phases of Heuristic Evaluation
1) Pre-evaluation training: give evaluators the needed domain knowledge & information on the scenario.
2) Evaluation: individuals evaluate the UI & make a list of problems.
3) Severity rating: determine how severe each problem is.
4) Aggregation: group meets & aggregates the problems (with ratings).
5) Debriefing: discuss the outcome with the design team.

How to Perform Evaluation
- At least two passes for each evaluator: first to get a feel for the flow and scope of the system; second to focus on specific elements.
- If the system is walk-up-and-use or the evaluators are domain experts, no assistance is needed; otherwise you might supply evaluators with scenarios.
- Each evaluator produces a list of problems: explain why, with reference to a heuristic or other information; be specific & list each problem separately.

Examples (recorded in a structured form; see the sketch after this list)
- Can't copy info from one window to another. Violates "Minimize the users' memory load" (H1-3). Fix: allow copying.
- Typography uses different fonts in 3 dialog boxes. Violates "Consistency and standards" (H2-4); slows users down; probably wouldn't be found by user testing. Fix: pick a single format for the entire interface.
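The example reports above share a fixed structure: what was observed, which heuristic it violates, and a proposed fix. Here is a minimal sketch of such a record; the field names and sample entries are assumptions for illustration, not part of Nielsen's method:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """One usability problem, listed separately per the guidance above."""
    description: str   # what the evaluator observed, stated specifically
    heuristic: str     # which heuristic it violates, e.g. "H1-3"
    fix: str           # constructive suggestion for the redesign

findings = [
    Finding("Can't copy info from one window to another",
            "H1-3 Minimize the users' memory load",
            "Allow copying"),
    Finding("Different fonts in 3 dialog boxes",
            "H2-4 Consistency and standards",
            "Pick a single format for the entire interface"),
]
```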

How to Perform Heuristic Evaluation
- Why separate listings for each violation? There is a risk of repeating a problematic aspect, and it may not be possible to fix all problems.
- Where problems may be found: a single location in the UI; two or more locations that need to be compared; a problem with the overall structure of the UI; something that is missing (a common problem with paper prototypes).
- Note: sometimes features are implied by design docs and just haven't been implemented; relax on those.

Severity Rating
- Used to allocate resources to fix problems.
- Estimates the need for more usability efforts.
- A combination of frequency, impact, and persistence (one time or repeating).
- Should be calculated after all evaluations are in.
- Should be done independently by all judges (see the aggregation sketch below).

Severity Ratings (cont.)
- 0: don't agree that this is a usability problem
- 1: cosmetic problem
- 2: minor usability problem
- 3: major usability problem; important to fix
- 4: usability catastrophe; imperative to fix

Debriefing
- Conduct with evaluators, observers, and development team members.
- Discuss general characteristics of the UI.
- Suggest potential improvements to address major usability problems.
- Dev. team rates how hard things are to fix.
- Make it a brainstorming session; little criticism until the end of the session.

Severity Ratings Example
1. [H1-4 Consistency] [Severity 3] [Fix 0] The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function.

HE vs. User Testing
- HE is much faster: 1-2 hours per evaluator vs. days to weeks.
- HE doesn't require interpreting users' actions.
- User testing is far more accurate (by definition): it takes into account actual users and tasks.
- HE may miss problems & find false positives.
- Good to alternate between HE & user testing: they find different problems, and you don't waste participants.
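Because each judge rates severity independently and ratings are combined only after all evaluations are in, aggregation can be as simple as averaging on the 0-4 scale. A minimal sketch; the "fix now" cutoff of 3 follows the scale above (major problem; important to fix), while the sample data and function names are invented:

```python
from statistics import mean

# Independent 0-4 severity ratings from each judge, keyed by problem.
ratings = {
    "Save vs. Write file wording (H1-4)": [3, 3, 4],
    "Different fonts in dialogs (H2-4)":  [2, 1, 2],
}

def aggregate(ratings: dict[str, list[int]]) -> list[tuple[str, float]]:
    """Average the judges' ratings and sort worst-first."""
    return sorted(((p, mean(rs)) for p, rs in ratings.items()),
                  key=lambda pr: pr[1], reverse=True)

for problem, severity in aggregate(ratings):
    flag = "fix now" if severity >= 3 else "queue"
    print(f"{severity:.1f}  {flag:7}  {problem}")
```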

Results of Using HE
- Discount: a benefit-cost ratio of 48 [Nielsen94]; the cost was $10,500 for a benefit of $500,000.
- The value of each problem found is roughly $15K (Nielsen & Landauer). How might we calculate this value? In-house: productivity; open market: sales.
- There is a correlation between severity & finding the problem with HE.

Decreasing Returns
- A single evaluator achieves poor results: only finds 35% of usability problems.
- 5 evaluators find ~75% of usability problems (the model behind this curve is sketched at the end of this section).
- Why not more evaluators? 10? 20? Adding evaluators costs more & won't find many more problems.
- Caveat: the problems-found and benefit/cost graphs are for a specific example.

Summary
- Have evaluators go through the UI twice.
- Ask them to see if it complies with the heuristics; note where it doesn't & say why.
- Combine the findings from 3 to 5 evaluators.
- Have evaluators independently rate severity.
- Alternate with user testing.

Further Reading
- Evaluation books: Usability Engineering, by Nielsen, 1994.
- Web sites & mailing lists: http://www.nngroup.com/articles/

Next Time
- User Testing. No reading.
- Individual assignment: Simple Heuristic Evaluation (individual), due Tue, Nov. 4 at 11:59 PM (via CourseWork).
- Looking ahead: a week off from major team project work; you will be doing Heuristic Evaluation of other teams' web sites next week, so get basic web sites together (for HE).
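The diminishing-returns curve behind the numbers in Decreasing Returns is commonly modeled with Nielsen and Landauer's formula, found(i) = N(1 - (1 - λ)^i), where N is the total number of problems and λ is the share a single evaluator finds. λ is fit per project, which is why the slide's 35% and ~75% figures do not come from a single λ value; the sketch below computes the curve for two plausible values:

```python
def proportion_found(evaluators: int, lam: float) -> float:
    """Nielsen & Landauer: share of problems found by i independent evaluators.

    found(i) = 1 - (1 - lam)**i, where lam is the share one evaluator finds.
    """
    return 1 - (1 - lam) ** evaluators

# lam varies by project: 0.35 matches the single-evaluator figure above,
# while ~0.24 reproduces the slide's ~75% for five evaluators.
for lam in (0.24, 0.35):
    curve = [proportion_found(i, lam) for i in (1, 5, 10, 20)]
    print(f"lam={lam}: " + "  ".join(f"{p:.0%}" for p in curve))

# The curve flattens quickly: going from 5 to 20 evaluators adds little,
# which is why adding evaluators costs more but finds few new problems.
```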