
Mensch-Maschine-Interaktion (Lecture): Evaluation
Ludwig-Maximilians-Universität München, LFE Medieninformatik
Heinrich Hußmann & Albrecht Schmidt, WS 2003/2004
http://www.medien.informatik.uni-muenchen.de/

Addition about Prototypes
http://www.useit.com/papers/guerrilla_hci.html

1984 Olympic Message System - A Human-Centered Approach
A public system that allowed athletes at the Olympic Games to send and receive recorded voice messages (between athletes, to coaches, and to people around the world).
Challenges:
- New technology
- It had to work; delays were not acceptable (the Olympic Games last only 4 weeks)
- Short development time
Design principles:
- Early focus on users and tasks
- Empirical measurement
- Iterative design
This looks obvious, but it is not! And it worked - but why?

1984 Olympic Message System - Methods
- Scenarios instead of a list of functions
- Early prototypes and simulation (manual transcription and reading)
- Early demonstrations to potential users (all groups)
- Iterative design (about 200 iterations on the user guide)
- An insider in the design team (an ex-Olympian from Ghana)
- On-site inspections (where the system was going to be deployed)
- Interviews and tests with potential users
- A full-size kiosk prototype (initially non-functional) in a public space in the company to collect comments
- Prototype tests within the company (with 100 and with 2,800 people) - free coffee and doughnuts for lucky test users
- A try-to-destroy-it test with computer science students
- A pre-Olympic field trial
Reference: John D. Gould, Stephen J. Boies, Stephen Levy, John T. Richards, Jim Schoonard: The 1984 Olympic Message System: a test of behavioral principles of system design. Communications of the ACM, Vol. 30, Issue 9, September 1987. http://www.research.ibm.com/compsci/spotlight/hci/p758-gould.pdf

Table of Contents
- An example of user-centred design
- What to evaluate? Why evaluate?
- Approaches to evaluation: inspection and expert review, model extraction, observations, experiments
- Ethical issues

What to Evaluate?
The usability of a system!
It depends on the stage of the project:
- Ideas and concepts
- Designs
- Prototypes
- Implementations
- Products in use
It also depends on the goals.
Approaches:
- Formative evaluation: carried out throughout the design process; helps to shape the product.
- Summative evaluation: quality assurance of the finished product.

Why Evaluate? Goals of User Interface Evaluation
- Ensure functionality (effectiveness): show that a certain task can be performed.
- Ensure performance (efficiency): show that a certain task can be performed within given limits (e.g. time, resources).
- Customer / user acceptance: What is the effect on the user? Are the expectations met?
- Identify problems, for specific tasks and for specific users.
- Improve the development life cycle.
- Secure the investment (don't develop a product that can be used only by a fraction of the target group, or not at all!).

There Is Not a Single Way
Different approaches:
- Inspections
- Model extraction
- Controlled studies and experiments
- Observations
- Field trials (usage context)
Different results:
- Qualitative assessment
- Quantitative assessment

Usability Methods Are Often Not Used - Why?
- Developers are not aware of them.
- The expertise to do an evaluation is not available.
- People don't know about the range of methods available.
- Certain methods are too expensive for a project (or people think they are).
- Developers see no need because "the product works".
- Teams think their informal methods are good enough.
Starting points:
- Discount Usability Engineering: http://www.useit.com/papers/guerrilla_hci.html
- Heuristic Evaluation: http://www.useit.com/papers/heuristic/

Inspections & Expert Reviews
- Useful throughout the development process.
- Performed by developers and experts (external or internal).
- A tool for finding problems.
- May take between an hour and a week.
- A structured approach is advisable:
  - Reviewers should be able to communicate all their issues (without hurting the team).
  - Reviews must not be offensive to developers / designers.
  - The main purpose is finding problems; solutions may be suggested, but decisions are up to the team.

Inspection and Expert Review Methods
- Guideline review: check that the UI conforms to a given set of guidelines.
- Consistency inspection: check that the UI is consistent (in itself, within a set of related applications, with the OS). A bird's-eye view can help (e.g. print out a web site and pin it to the wall). Consistency can also be enforced by design (e.g. CSS on the web).
- Walkthrough: perform specific tasks the way a user would do them.
- Heuristic evaluation: check whether the UI violates any of a small set of rules (usually fewer than 10 points).

Informal Evaluation
Expert reviews and inspections are often done informally:
- UIs and interactions are discussed with colleagues.
- People are asked to comment, report problems, and suggest additions.
- Experts (often from within the team) assess the UI for conformance with guidelines and for consistency.
Results of informal reviews and inspections are often used directly to change the product - still the state of the art in many companies!
Informal evaluation is important, but in most cases not enough. Making evaluation more explicit and documenting the findings can increase quality significantly. Expert reviews and inspections are a starting point for change.

Discount Usability Engineering
A low-cost approach:
- Small number of subjects
- Approximate: get indications and hints
- Find the major problems and discover many of the minor issues
Qualitative approach: observe user interactions, collect user explanations and opinions (anecdotes, transcripts, problem areas, ...).
Quantitative approach: count, log, or measure something of interest in the user's actions, e.g. speed, error rate, or counts of activities (a small sketch follows after the next slide).

Heuristic Evaluation
http://www.useit.com/papers/heuristic/
Heuristic evaluation is a usability inspection method: a systematic inspection of a user interface design for usability. The goal of heuristic evaluation is to find the usability problems in the design, as part of an iterative design process.
Basic idea: a small set of evaluators examines the interface and judges its compliance with recognized usability principles (the "heuristics").
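
To make the quantitative approach mentioned above concrete: a minimal sketch in Python, where the logged trials (users, task times, error counts) are entirely hypothetical illustration data.

```python
from statistics import mean

# Hypothetical logged trials: (user, task completion time in s, error count).
trials = [("u1", 34.2, 1), ("u2", 51.7, 3), ("u3", 28.9, 0), ("u4", 40.1, 2)]

times = [t for _, t, _ in trials]
errors = [e for _, _, e in trials]

print(f"mean task time: {mean(times):.1f} s")            # speed
print(f"mean errors per task: {mean(errors):.2f}")       # error rate
print(f"error-free completions: {sum(e == 0 for e in errors)}/{len(trials)}")
```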

Heuristic Evaluation - How Many Evaluators?
http://www.useit.com/papers/heuristic/
A single evaluator finds only a fraction of the usability problems, and additional evaluators find fewer and fewer new ones; Nielsen therefore recommends 3-5 evaluators (see the sketch below).
Example: a total cost estimate with 11 evaluators comes to about 105 hours, see http://www.useit.com/papers/guerrilla_hci.html

Heuristic Evaluation - Heuristics
Heuristics suggested by Nielsen:
- Visibility of system status
- Match between system and the real world
- User control and freedom
- Consistency and standards
- Error prevention
- Recognition rather than recall
- Flexibility and efficiency of use
- Aesthetic and minimalist design
- Help users recognize, diagnose, and recover from errors
- Help and documentation
Depending on the product and its goals, a different set may be appropriate.
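
The 3-5 evaluator recommendation reflects a diminishing-returns curve. A minimal sketch of the Nielsen-Landauer prediction formula, found(i) = N * (1 - (1 - lambda)^i); the total problem count N = 100 is arbitrary, and lambda = 0.31 is roughly the average single-evaluator detection rate Nielsen reports, so both values are illustrative.

```python
def problems_found(i: int, n_total: float = 100.0, lam: float = 0.31) -> float:
    """Expected number of usability problems found by i evaluators
    (Nielsen & Landauer): N * (1 - (1 - lam)**i), where lam is the
    share of all problems a single evaluator finds on average."""
    return n_total * (1 - (1 - lam) ** i)

for i in (1, 2, 3, 5, 11):
    print(f"{i:2d} evaluators -> {problems_found(i):5.1f} of 100 problems")
```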

Heuristic Evaluation - Steps
Preparation:
- Assess whether and how heuristic evaluation is appropriate.
- Define the heuristics.
- Have the outside evaluation expert learn about the domain and the scenario.
- Find and schedule the evaluators.
- Prepare the briefing and the scenario for the evaluators.
- Hold the briefing (system expert, evaluation expert, evaluators).
- Prepare the prototype (software/hardware platform) for the evaluation.
Evaluation:
- Each evaluator evaluates the system.
- Observe the evaluation sessions.
Analysis:
- Debriefing (evaluators, developers, evaluation expert).
- Compile the list of usability problems (using the notes from the evaluation sessions).
- Write problem descriptions for use in a severity-rating questionnaire.
- Severity rating.

Heuristic Evaluation - Severity Rating
Severity ratings are used to prioritize problems and to decide whether to release a system or to do further iterations.
The severity of a usability problem is a combination of three factors:
- Frequency: Is the problem common or rare?
- Impact: If the problem occurs, will it be easy or difficult for users to overcome?
- Persistence: Is it a one-time problem that users can overcome once they know about it, or will users be bothered by it repeatedly?
A 0-4 scale is used to rate the severity of usability problems:
0 = I don't agree that this is a usability problem at all
1 = Cosmetic problem only: need not be fixed unless extra time is available on the project
2 = Minor usability problem: fixing this should be given low priority
3 = Major usability problem: important to fix, so it should be given high priority
4 = Usability catastrophe: imperative to fix before the product can be released
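
Because individual severity judgments are fairly unreliable, Nielsen suggests averaging the ratings of several evaluators. A minimal sketch, with hypothetical problem descriptions and ratings:

```python
from statistics import mean

SEVERITY_LABELS = {
    0: "not a problem",
    1: "cosmetic",
    2: "minor",
    3: "major",
    4: "catastrophe",
}

# Hypothetical findings: problem description -> one 0-4 rating per evaluator.
ratings = {
    "error message gives no recovery hint": [3, 4, 3],
    "inconsistent button placement": [2, 1, 2],
    "undo not available after delete": [4, 3, 4],
}

# Average across evaluators and list the most severe problems first.
for problem, scores in sorted(ratings.items(), key=lambda kv: -mean(kv[1])):
    s = mean(scores)
    print(f"{s:.1f} ({SEVERITY_LABELS[round(s)]}): {problem}")
```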

Observations & Protocols
- Paper and pencil: cheap and easy, but unreliable; use structured observation sheets or tools.
- Audio/video recording: cheap and easy; creates lots of data, which is potentially expensive to analyze; good for review and discussion with the user.
- Computer logging: reliable and accurate, but limited to actions on the computer; include the logging functionality in the prototype / product.
- User notebook: ask the user to keep a diary-style protocol.

Structured Observations
Observation sheet (also available as an electronic version, see the sketch below):

time  | typing | reading screen | consulting manual | phoning
14:00 |        |                |                   |
14:01 |        |                |                   |
14:02 |        |                |                   |
14:03 |        |                |                   |
14:04 |        |                |                   |
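
A minimal sketch of such an electronic observation sheet, assuming the observer presses a digit for the activity they currently see; the file name and activity list are illustrative:

```python
import csv
import time

ACTIVITIES = ("typing", "reading screen", "consulting manual", "phoning")

def record_observations(path: str = "observations.csv") -> None:
    """Electronic observation sheet: each entry the observer makes is
    timestamped automatically, avoiding clock-reading errors."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time", "activity"])
        print("Keys:", ", ".join(f"{i}={a}" for i, a in enumerate(ACTIVITIES)))
        while True:
            key = input("activity number (q to quit): ").strip()
            if key == "q":
                break
            if key.isdigit() and int(key) < len(ACTIVITIES):
                writer.writerow([time.strftime("%H:%M:%S"), ACTIVITIES[int(key)]])

if __name__ == "__main__":
    record_observations()
```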

Observations and Protocols
What are observations and protocols good for?
- Demonstrating that a product improves productivity.
- A basis for qualitative and quantitative findings.
Hint: minimize the chance of human error in observations and protocols. Most people are pretty bad at keeping manual protocols, so combine them with computer logging: log what you can get from the system, and let the observer keep a protocol of external events (see the sketch below).

Video Protocol
Integrate multiple views:
- Capture of the screen, including the pointer
- View of the person interacting with the system
- View of the environment
Poor man's usability lab:
- Computer for the test user: runs the application under test and exports the screen (e.g. via VNC).
- Computer for the observer: shows the subject's screen, displays two attached webcams, and provides an editor for observer notes; capture this screen (e.g. with Camtasia).
(Diagram: the test system exports the subject's screen to the observer system, which combines the subject's screen, Cam1, Cam2, and the notes editor along a common timeline.)
Discuss with the user afterwards: Why did you do this? What did you try here?
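
A minimal sketch of the combination suggested above: merge the system's automatic log with the observer's manual protocol into one timeline. The file names and the two-column CSV format (HH:MM:SS, event; no header) are assumptions for illustration.

```python
import csv
from datetime import datetime

def load(path: str) -> list[tuple[datetime, str, str]]:
    """Read a protocol CSV with rows of (HH:MM:SS, event), no header."""
    with open(path, newline="") as f:
        return [(datetime.strptime(t, "%H:%M:%S"), event, path)
                for t, event in csv.reader(f)]

# One ordered timeline relates each observed external event to the
# system state logged at that moment.
timeline = sorted(load("system_log.csv") + load("observer_notes.csv"))
for when, event, source in timeline:
    print(f"{when:%H:%M:%S} [{source}] {event}")
```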

Screen Video
(Slide contained a screen-video example only.)

References
- Alan Dix, Janet Finlay, Gregory Abowd, Russell Beale (1998): Human-Computer Interaction, 2nd edition. Prentice Hall, ISBN 0132398648 (a new edition was announced for October 2003).
- Ben Shneiderman (1998): Designing the User Interface, 3rd edition. Addison Wesley, ISBN 0201694972.
- Discount Usability Engineering: http://www.useit.com/papers/guerrilla_hci.html
- Heuristic Evaluation: http://www.useit.com/papers/heuristic/