Evaluation Continued. Lecture 7 June 20, 2006
1 Evaluation Continued Lecture 7 June 20, 2006
2 This week and next: Midterm chat; Anatomy of a Meltdown review; more on evaluation; case study from radiology. Remember: June 27th is midterm and project day
3 Discuss Anatomy of a Meltdown Scott Adams's story and discussion (ltdown.html). What happened? He had a collection of 500 blog comments in a temporary holding database, clicked publish, then deleted the temporary database, only it wasn't temporary
4 Anatomy of a Meltdown: Whose fault was it? Scott's or the designers'?
5 Anatomy of a Meltdown ToG says: Not Scott, and here's why: a chain of 5 errors, none of them Scott's; in effect, an accident waiting to happen. Also, there is no point in blaming users
6 The Scott Adams Investigation Error One: Mental (User) Model did not reflect Conceptual (Design) Model Scott: 2 databases Designers: 1 database Recall that the conceptual model is communicated via the look & feel of the interface The goal is to have the user recreate that model as his mental model through interacting with the program
7 The Scott Adams Investigation Why didn't Scott catch on? A user can fail in two ways: 1. The user ends up with a fragmented model 2. The user has a complete, wrong model #2 is dangerous: users may, for minutes, days, or even months, systematically misinterpret what the designer says
8 Example Consider the following instructions for entering a swimming pool: 1. Go to the edge 2. Confirm that the bottom is more than six feet down. 3. Hold your hands over your head 4. Begin leaning forward while bending at your knees 5. Push off the edge for maximum lift and ensure that your head points straight down for proper entry
9 The Scott Adams Investigation Users intermingle past experiences with their current experience of the interface. Where there is discrepancy or ambiguity, users will misinterpret. Instead: do your research! Understand users' expected past experiences, then stay in line with that experience, or clearly indicate deviations
10 The Scott Adams Investigation Error Two: Misleading Metaphor Designers rely on users' previous experience when invoking a particular metaphor, aiming for instant deep understanding, e.g. the trashcan. The metaphor must be true to the original: features must be consistent and expectations met. E.g. ejecting a disk by using the trashcan on the Mac
11 The Scott Adams Investigation Back to poor Scott Adams. Publish has very specific connotations: mass replication and distribution of a document, with the original document, the draft, having little or no further usefulness after publication has been accomplished. With respect to databases, publish has been drastically redefined: mass replicate --> set a little flag. Disaster looms
12 The Scott Adams Investigation The moral: Choose only metaphors that can be implemented in a manner sufficiently analogous to the real world, and explicitly and repeatedly tell users of any necessary violations
13 The Scott Adams Investigation Error Three: Ambiguous Confirmation Dialogues Warnings must not be misinterpreted. Scott's warning messages told him he was destroying the temporary database; instead, they should have asked not "remove from publication?" but "destroy the only copy of this information in existence?" Compare Microsoft's "save changes?" prompt: it evokes a particular conception in new users, that of changing something that already exists.
14 The Scott Adams Investigation Moral: Work with a writer Ask people other than designers and engineers When you discover an ambiguity, rework and retest the dialog until ambiguity ceases to exist.
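To make the moral concrete, here is a hypothetical before/after rewrite of such a confirmation message (the wording is invented, not Tog's or the blog tool's): the reworked dialog states the actual consequence instead of the system's internal operation.

```python
# Hypothetical confirmation text, before and after reworking for ambiguity.
ambiguous = "Delete temporary database?"  # reads as harmless housekeeping
explicit = (
    "This will permanently destroy the ONLY copy of these 500 comments. "
    "They cannot be recovered. Destroy them anyway?"
)
```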
15 The Scott Adams Investigation Error Four: Confirmation Substituted for Undo Confirmations that regularly pop up are habit-forming. The only effect of such dialogs is to make the developers feel good: "the users may be screwing up, but we warned them, so it is their own fault." Any time your user loses any work, consider it your fault, then figure out how to prevent it from happening to anyone else.
16 The Scott Adams Investigation Bizarre: you can undo a single-character deletion, but you can't recover an entire document. In the case of a deletion: 1. We throw up a confirmation dialog 2. The user confirms 3. We delete the file 4. We tell the user we deleted it.
17 The Scott Adams Investigation Solution: Creating the Illusion of Deletion Leave out step 3 (delete file). Tell the user the file is gone; they can still undo. Delete the file later if the deletion has not been undone, waiting long enough for the user to realize what he has done. For the blog entries: flag them as deleted so they do not show up, but do not actually delete them until, say, the end of the session. Be conscious of privacy
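A minimal sketch of the illusion of deletion in Python (all names invented): delete only flags an entry, undo restores it, and the real deletion happens later, e.g. at the end of the session.

```python
import time

class BlogStore:
    """Soft deletion: flagged entries vanish from view but remain recoverable."""

    def __init__(self):
        self.entries = {}   # entry_id -> text, the visible entries
        self.trash = {}     # entry_id -> (text, time flagged as deleted)

    def delete(self, entry_id):
        # Step 3 (really deleting) is left out: just flag and hide the entry.
        self.trash[entry_id] = (self.entries.pop(entry_id), time.time())

    def undo_delete(self, entry_id):
        # Universal undo: restore the flagged entry.
        text, _ = self.trash.pop(entry_id)
        self.entries[entry_id] = text

    def purge(self, grace_seconds=3600):
        # Really delete, but only after a grace period long enough for the
        # user to realize what happened (privacy demands purging does occur).
        now = time.time()
        for entry_id, (_, flagged_at) in list(self.trash.items()):
            if now - flagged_at > grace_seconds:
                del self.trash[entry_id]
```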
18 The Scott Adams Investigation Moral: Offer a universal undo, and then only confirm in unusual cases
19 The Scott Adams Investigation Error Five: No Usability Evaluation Although the last four errors were inexcusable, errors like them can be unavoidable. The safety net == user testing!!!
20 The Scott Adams Investigation This disaster was preventable!!
21 Part 2: An Evaluation Framework
22 Objectives By the end of this section, you should be able to Explain key evaluation concepts & terms. Describe the evaluation paradigms & techniques used in interaction design. Discuss the conceptual and practical issues that must be considered when planning evaluations. Describe and use the DECIDE framework.
23 Evaluation Paradigms & Techniques All evaluation is affected by beliefs and bias. The beliefs and practices associated with any underlying theory are called an evaluation paradigm. This is different from an interaction paradigm
24 4 Core Evaluation Paradigms Quick and dirty evaluation Usability testing Field studies Predictive evaluation
25 Quick and dirty evaluation Designers informally get feedback/confirmation from users or consultants. Done at any stage. Emphasizes fast input, not careful documentation
26 Usability Testing Originated in the 1980s. Measures users' performance on carefully prepared, system-typical tasks. Users and their software sessions are recorded; questionnaires and interviews follow. Generally measures: number of user errors, time to complete task. Strongly controlled by the evaluator, under lab-like conditions
27 Usability Testing Unlike research experiments: no variable manipulation, and the number of participants is too small for statistical analysis. Changes in the design are then engineered; recall usability engineering
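Given so few participants, the measurements above are typically reported descriptively rather than tested statistically. A tiny sketch with invented session data:

```python
# Invented usability-test log: one record per participant for one task.
sessions = [
    {"task": "publish entry", "seconds": 241, "errors": 3},
    {"task": "publish entry", "seconds": 187, "errors": 1},
    {"task": "publish entry", "seconds": 305, "errors": 4},
]

# The two classic measures from the slide: errors and time on task.
mean_time = sum(s["seconds"] for s in sessions) / len(sessions)
mean_errors = sum(s["errors"] for s in sessions) / len(sessions)
print(f"mean time on task: {mean_time:.0f} s, mean errors: {mean_errors:.1f}")
```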
28 Field Studies Done in natural settings Goal: increase understanding about what users do naturally and what impacts them Can be used to: Identify opportunities for new technology Determine requirements for design Facilitate introduction of technology Evaluate technology
29 Field Studies Recall: Interviews Observation Ethnography Two general approaches: Outsider Observe and record while looking on Insider (or participant) E.g. ethnography
30 Predictive Evaluation Experts apply knowledge of typical users Predict usability problems No users involved Quick Inexpensive Limited E.g. Cognitive Walkthrough Heuristic Evaluation
31 Heuristic Evaluation Review by experts guided by a set of heuristics, e.g. always provide clearly marked exits, speak the user's language. Called discount evaluation. Report all places where a heuristic is violated. Design guidelines form a basis for developing heuristics. Heuristics are needed that reflect the state of technology: evaluating web-based products, mobile devices, computerized toys, etc
32 Heuristic Evaluation In a study, single evaluators found only 35% of the usability problems Empirical evidence suggests that on average 5 evaluators identify 75-80% of usability problems. Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used.
33 Nielsen s heuristics Visibility of system status Match between system and real world (Speak the user s language) User control and freedom (Clearly marked exits) Consistency and standards Help users recognize, diagnose, recover from errors Error prevention Recognition rather than recall Flexibility and efficiency of use Aesthetic and minimalist design Help and documentation
34 Heuristic Evaluation E.g. HOMERUN guide for web evaluation: (H)igh Quality Content (O)ften Updated (M)inimal Download Time (E)ase of Use (R)elevant to user's needs (U)nique to the online medium (N)etcentric corporate culture
35 How many evaluators? Law of diminishing returns. [Figure: proportion of usability problems found (%) against number of evaluators]
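The diminishing-returns curve is often modeled with Nielsen and Landauer's formula: the proportion of problems found by i evaluators is 1 - (1 - λ)^i, where λ is the probability that one evaluator spots a given problem (about 0.31 in their data; exact percentages vary by study, which is why the slide's figures differ slightly). A quick sketch:

```python
# Nielsen & Landauer: proportion of problems found by i evaluators.
lam = 0.31  # single-evaluator detection rate reported in their studies

for i in (1, 2, 3, 5, 10):
    found = 1 - (1 - lam) ** i
    print(f"{i:2d} evaluator(s) -> {found:.0%} of problems")
```

With these values, one evaluator finds about 31% and five find about 84%, which is why adding evaluators beyond five buys little.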
36 Stages for doing heuristic evaluation Briefing session to tell experts what to do Evaluation period of 1-2 hours in which: - Each expert works separately - Take one pass to get a feel for the product - Take a second pass to focus on specific features Report results Debriefing session in which experts work together to prioritize problems
37 Heuristic Evaluation of Domain-Dependent Systems Evaluators will be naïve with respect to the domain of the system: they must be assisted to use the interface. Supply evaluators with typical usage scenarios listing the steps a user would take to perform realistic tasks, constructed on the basis of a task analysis of the actual users and their work in order to be as representative as possible
38 Heuristic Evaluation: Output A list of usability problems in the interface, each referencing the violated usability principles. Does not provide a systematic way to fix usability problems, and does not assess the quality of any redesign. Problems are often easy to correct based on the guidelines for each usability principle violated
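One way to structure each item in that output list; the record fields are a sketch, with a 0-4 severity rating (in the style of Nielsen's scale) added to support the prioritization done later in the debriefing session.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    location: str    # where in the interface the problem occurs
    problem: str     # what goes wrong for the user
    heuristic: str   # the violated usability principle
    severity: int    # 0 = not a problem ... 4 = usability catastrophe

findings = [
    Finding("publish dialog",
            "warning does not say the only copy will be destroyed",
            "error prevention", 4),
    Finding("delete flow",
            "confirmation is substituted for undo",
            "user control and freedom", 3),
]
```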
39 Heuristic Evaluation: Debriefing Session Extends heuristic evaluation to provide some design advice Participants include Evaluators Any observers of the evaluation sessions Representatives of the design team Basically a brainstorming session Focus on possible redesigns to address major usability problems A good opportunity to discuss positive aspects of the design
40 Advantages and Disadvantages Advantages Few ethical & practical issues Best experts have knowledge of application domain & users, as well as usability Disadvantages Can be difficult & expensive to find experts Biggest problems Important problems may get missed Many trivial problems are often identified
41 For more information The interactivities section of the idbook.com website provides heuristics and a template so that you can evaluate different kinds of systems.
42 Key Points Expert evaluation: heuristic & walkthroughs Relatively inexpensive because no users Heuristic evaluation relatively easy to learn May miss key problems & identify false ones
43 Evaluation Paradigms Summary Key aspects: Role of the user Who controls Relationship between evaluator and user Location When used Type of data collected/how analyzed How evaluation findings are fed back Philosophy and theory underlying the paradigm
44 DECIDE: A framework to guide evaluation Determine the goals the evaluation addresses. Explore the specific questions to be answered. Choose the evaluation paradigm and techniques to answer the questions. Identify the practical issues. Decide how to deal with the ethical issues. Evaluate, interpret and present the data.
45 Determine the goals Questions What are the high-level goals of the evaluation? Who wants it and why? The goals influence the paradigm for the study Some examples of goals: Identify user needs Identify the best metaphor on which to base the design. Check to ensure that the final interface is consistent. Investigate how technology affects working practices. Improve the usability of an existing product.
46 Explore the questions All evaluations need goals & questions to guide them so time is not wasted on ill-defined studies. E.g., the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions: What are customers' attitudes toward these new tickets? Are they concerned about security? Is the interface for obtaining them poor? What questions might you ask about the design of a cell phone?
47 Choose the evaluation paradigm & techniques Evaluation paradigm influences the techniques used & how data is analyzed and presented. E.g. field studies do not involve modeling
48 Identify practical issues For example, how to: select users stay on budget stay on schedule find evaluators select equipment
49 Decide on ethical issues Develop an informed consent form Participants have a right to: Know the goals of the study Know what will happen to the findings Privacy of personal information Not be quoted without their agreement Leave when they wish Be treated politely
50 Evaluate, interpret & present data Data analysis and presentation depends on paradigm and techniques The following also need to be considered: Reliability Can the study be replicated? (same results on different occasions under the same circumstances) Validity Is it measuring what you thought? Biases Is the process creating biases? Scope Can the findings be generalized? Ecological validity Is the environment of the study influencing it?
51 Pilot studies A small trial run of a study Ensures your plan is viable. Pilot studies check: That you can conduct the procedure That interview scripts, questionnaires, experiments, etc. work appropriately Also give you some initial data to help you form a better hypothesis Ask colleagues if you can't spare real users.
52 Key points An evaluation paradigm is an approach that is influenced by particular theories and philosophies. Five categories of techniques were identified: observing users, asking users, asking experts, user testing, modeling users. The DECIDE framework has six parts: Determine the overall goals Explore the questions that satisfy the goals Choose the paradigm and techniques Identify the practical issues Decide on the ethical issues Evaluate ways to analyze & present data Doing a pilot study is usually worthwhile
53 Evaluation Techniques 1. Observing users 2. Asking users their opinions 3. Asking experts their opinions 4. Testing users' performance Usually in a controlled lab setting 5. Modeling users' task performance Predict speed & efficiency based on #keystrokes, distance to travel, etc. Good for simple, repetitive tasks
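Technique 5 is exemplified by the Keystroke-Level Model of Card, Moran, and Newell, which predicts expert, error-free task time by summing standard operator estimates; the task breakdown below is invented for illustration.

```python
# Keystroke-Level Model operator times in seconds (Card, Moran & Newell):
# K = keystroke, B = mouse button press, P = point with mouse,
# H = home hands between keyboard and mouse, M = mental preparation.
KLM = {"K": 0.2, "B": 0.1, "P": 1.1, "H": 0.4, "M": 1.35}

def predict(ops):
    """Predicted completion time for a sequence of KLM operators."""
    return sum(KLM[op] for op in ops)

# Hypothetical "delete a file" task: think, point at file, click,
# home to keyboard, press Delete, think, point at Confirm, click.
task = ["M", "P", "B", "H", "K", "M", "P", "B"]
print(f"predicted time: {predict(task):.1f} s")
```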
54 Observing Users Identify needs Techniques: notes, audio, video, interaction logs Challenge: observe but do not disturb How to analyse the data: large quantities, different sources
55 Observing Users Some overlap with requirements observation techniques Different goals, though How to observe Controlled environments In the field Participant observation & ethnography
56 Observing Users Data Collection Notes plus still camera Audio recording plus still camera Video Diaries Interaction Logging
57 Observing Users Analyzing Data Qualitative analysis to tell a story or to categorize the data Quantitative data analysis Feeding findings back into design
58 Asking Users Asking users what they think of a product Does it do what they want? Do they like it? Aesthetic appeal? Problems using it? Would they use it again? E.g. interviews, questionnaires
59 Asking Experts Software inspections/reviews Good for code and structure analysis Similar techniques for usability Heuristic evaluation Cognitive walkthrough Inexpensive Experts often have solutions to offer
60 Next Week Project due; exam; more on Phase 2 of the project and presentations; user testing and modeling