An Introduction to Information Visualization Techniques for Exploring Large Databases
Jing Yang, Fall 2005

Evaluation in Information Visualization (Class 3)

Motivation
What are the advantages and limitations of this technique (system)? Does this new technique improve on existing ones? Is this technique (system) just cool, or does it actually help people?

Measures
Different possible ways to measure:
- Impact on the community as a whole; influential ideas
- Assistance to people in the tasks they care about
(Dr. John Stasko, slides of CS7500 at Gatech)

Strong View
"Unless a new technique or tool helps people with some kind of problem or task, it doesn't have any value." (Dr. John Stasko, slides of CS7500 at Gatech)

Broaden Thinking
Sometimes the chain of influence can be long and drawn out: System X influences System Y, which influences System Z, which is incorporated into a practical tool that is of true value to people. This is what research is all about (typically). (Dr. John Stasko, slides of CS7500 at Gatech)

Survey of Evaluation
InfoVis is the top conference in the information visualization area; it started in 1995. I read through the InfoVis 1995 to InfoVis 2001 proceedings for papers that included some form of evaluation.

Evaluation in InfoVis
Evaluation has only recently been regarded as important, and many evaluations are ad hoc: ad hoc questionnaires, ad hoc processes, insufficient subjects. In the HCI field, by contrast, there is a long history of evaluation and of systematic theory and approaches.

Evaluating UI vs. InfoVis
The difference: usability vs. utility. Imagine a visualization that is very usable but not useful. Evaluating InfoVis requires more domain knowledge and situated use.

Evaluating InfoVis in General
It is very difficult in InfoVis to compare apples to apples: comparing System A to System B is hard because different tools were built to address different user tasks, and the UI can heavily influence the utility and value of a visualization technique. The practical path is to borrow UI evaluation methods and adapt them to InfoVis.

Usability Inspection in UI
Usability inspection is the generic name for methods based on having evaluators inspect or examine usability-related aspects of a user interface [NL94]. General approaches: user testing, heuristic evaluation, feature inspection, heuristic estimation, consistency inspection, standards inspection, pluralistic walkthrough, and cognitive walkthrough [Nie95].

What to Measure: Usability Metrics
Three categories:
- Preference metrics
- Performance metrics
- Predictive metrics (design metrics)

Preference Metrics
Subjective evaluations and preferences of users. Questionnaires and rating scales are used; standard forms and questionnaires are more reliable than ad hoc surveys. Approaches: user testing, heuristic evaluation.

Performance Metrics
Actual use of working software. Examples: How fast? How many? Approach: user testing.

Design Metrics
The quality of designs or prototypes; these can be measured without working systems! Examples (a sketch of one follows this list):
- Essential efficiency (brief interactions)
- Task concordance (frequent tasks take fewer steps)
- Task visibility (show what users need)
- Layout uniformity (moderately uniform and orderly layouts)
- Visual coherence (related things put together)
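To make the idea concrete, here is a minimal sketch of a layout-uniformity score. It is an illustrative simplification, not Constantine and Lockwood's published formula: it simply penalizes a design for every extra distinct size or alignment among its widgets.

```python
# Illustrative proxy only, not Constantine & Lockwood's exact metric.
# A layout whose widgets share few distinct sizes/alignments scores near 1.0.
from collections import namedtuple

Widget = namedtuple("Widget", "x y width height")

def layout_uniformity(widgets):
    """Return a 0..1 score: 1.0 means all widgets share one size/alignment."""
    n = len(widgets)
    if n <= 1:
        return 1.0
    # Count distinct values of each layout attribute across widgets.
    attrs = [
        {w.x for w in widgets},       # left-edge alignments
        {w.y for w in widgets},       # top-edge alignments
        {w.width for w in widgets},   # distinct widths
        {w.height for w in widgets},  # distinct heights
    ]
    distinct = sum(len(a) for a in attrs)
    # Best case: 1 distinct value per attribute; worst case: n per attribute.
    return 1.0 - (distinct - len(attrs)) / (len(attrs) * (n - 1))

form = [Widget(10, 10, 100, 24), Widget(10, 40, 100, 24), Widget(10, 70, 100, 24)]
print(layout_uniformity(form))  # 0.75: only the y positions vary
```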

Possible Design Metrics in Information Visualization
- Space usage efficiency
- Overlap extent
- Emphasized regions vs. other regions
- Color usage
- Connection between related views
(A sketch of the first two appears after the next slide.)

Usability Inspection in UI
[Figure 6.1: Survey of how frequently the inspection methods are used and how useful they are rated [Nie95]]
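Returning to the InfoVis design metrics above: a minimal sketch of the first two, assuming marks are axis-aligned rectangles rasterized onto a grid canvas (real glyphs would need proper rasterization). Space usage is the fraction of canvas cells covered by at least one mark; overlap extent is the fraction of used cells covered more than once.

```python
def covered_cells(rects, canvas_w, canvas_h):
    """Rasterize (x, y, w, h) rectangles onto a unit grid, counting coverage."""
    counts = [[0] * canvas_w for _ in range(canvas_h)]
    for (x, y, w, h) in rects:
        for row in range(y, min(y + h, canvas_h)):
            for col in range(x, min(x + w, canvas_w)):
                counts[row][col] += 1
    return counts

def space_usage_and_overlap(rects, canvas_w, canvas_h):
    counts = [c for row in covered_cells(rects, canvas_w, canvas_h) for c in row]
    used = sum(1 for c in counts if c > 0)
    overlapped = sum(1 for c in counts if c > 1)
    usage = used / len(counts)           # fraction of canvas showing any mark
    overlap = overlapped / max(used, 1)  # fraction of used space that is occluded
    return usage, overlap

marks = [(0, 0, 40, 40), (30, 30, 40, 40)]  # two partially overlapping squares
print(space_usage_and_overlap(marks, 100, 100))  # (0.31, ~0.032)
```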

We Will Look At
Approach 1: heuristic evaluation. Approach 2: user testing.

Heuristic Evaluation in UI
Definition: having a small set of evaluators examine the interface and judge its compliance with recognized usability principles (the "heuristics").

Nielsen's 10 Heuristics for UI
1. Visibility of system status
2. Match between the system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation

How to Conduct a Heuristic Evaluation
- The heuristics
- A running prototype or simple screen dumps
- Three to five evaluators
- Evaluators work independently
- Output: a list of usability problems; accumulate the results (see the sketch below)

Heuristic Evaluation Advantages
- Cheap
- Intuitive, and easy to motivate people to do it
- Does not require advance planning
- Can be used early in the development process
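Because the evaluators work independently, their findings must be merged at the end. A minimal sketch of that accumulation step, with a hypothetical report format (problem description, violated heuristic, severity 0-4):

```python
# Hypothetical data format: each evaluator reports
# (problem description, violated heuristic, severity 0-4).
from collections import defaultdict

reports = {
    "evaluator_1": [("No undo after delete", "User control and freedom", 3)],
    "evaluator_2": [("No undo after delete", "User control and freedom", 4),
                    ("Jargon in error dialog", "Match between system and real world", 2)],
    "evaluator_3": [("Jargon in error dialog", "Match between system and real world", 3)],
}

def accumulate(reports):
    """Merge duplicate findings; track who found each and all severity votes."""
    merged = defaultdict(lambda: {"heuristic": None, "severities": [], "found_by": []})
    for evaluator, problems in reports.items():
        for description, heuristic, severity in problems:
            entry = merged[description]
            entry["heuristic"] = heuristic
            entry["severities"].append(severity)
            entry["found_by"].append(evaluator)
    return merged

for description, entry in accumulate(reports).items():
    mean = sum(entry["severities"]) / len(entry["severities"])
    print(f"{description} [{entry['heuristic']}] "
          f"severity={mean:.1f}, found by {len(entry['found_by'])}/3 evaluators")
```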

Formal User Testing vs. Heuristic Evaluation

  Formal user testing                  Heuristic evaluation
  Expensive                            Cheap
  Discovers local problems             Discovers global problems
  Usually requires a running system    Works with or without a running system
  Formal                               Informal
  Real users, real tasks
  Performance metrics                  Can't measure performance metrics

Formal User Testing (User Study)
Definition: a large set of real users performing real tasks in a working system.

Essential Components of User Testing
- Hard components: working systems, environment, subjects, inspectors, datasets, tasks
- Motivation: "there are ten possible approaches and I can only choose one"; compare System A with System B
- Hypothesis: A is better than B in aspect C
- The testing methodology
- Information collection: e.g., the average time users take to find D using A or B; questionnaires and rating scales, video, screen capture
- Result analysis: statistical methods, visualization methods, multimedia analysis

Two Types of User Testing
Task-oriented user testing: users are asked to accomplish a number of tasks, each with a time-out.
- Tasks: Find Charlotte; Count the 3s; Is A larger than B?
- Results: speed, correctness
- Features: objective; easy to measure and compare; rigid; task design is critical

Two Types of User Testing
Open-ended, think-aloud user testing: users are asked to find insights in the data. An insight is a unit of discovery: a fact (an actual finding) or a hypothesis, directed or unexpected.
- Results: distinct facts (actual findings), speed, domain value of insights, correctness
- Features: results are harder to analyze, but the method is broader

User Testing Step by Step
Step 1: Decision
- Motivation, hypothesis
- Figure out the most important factors. A visualization system is complex, so try to focus!
- Create a testing plan

User Testing Step by Step
Step 2: Preparation (it is time-consuming)
- A running prototype (minimize unrelated factors)
- Subjects (typical users)
- Hardware (room, projectors, camcorders, computers)
- Task sets or open-ended tasks (balanced)
- A subject information survey
- Training materials (presentation, training tasks)
- Questionnaires (rating scales)
- Open-ended questions (for feedback and clarification)

User Testing Step by Step
Step 2: Preparation, result recording
How do you record results such as speed and correctness? Automatically (accurate, less distracting for subjects) or manually. The more automatic, the better (see the sketch below):
- Don't count on your own memory; you can't imagine how busy you will be on the day of the experiment!
- Don't count on the users' memories; they are as overwhelmed as you during the experiment!
- You won't need to copy many numbers from paper into the computer if they are already there.
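A minimal sketch of the automatic recording argued for above, assuming tasks are presented one at a time; the file name, column order, and answer-collection callback are illustrative choices, not a prescribed format:

```python
# Timestamps and correctness go straight to a CSV file,
# so nothing has to be copied from paper later.
import csv
import time

class TaskRecorder:
    def __init__(self, path, subject_id, system_name):
        self.path, self.subject, self.system = path, subject_id, system_name

    def run_task(self, task_id, correct_answer, ask):
        """Time one task and append the result row immediately."""
        start = time.time()
        answer = ask()                      # however the UI collects the answer
        elapsed = time.time() - start
        with open(self.path, "a", newline="") as f:
            csv.writer(f).writerow(
                [self.subject, self.system, task_id,
                 f"{elapsed:.2f}", int(answer == correct_answer)])

recorder = TaskRecorder("results.csv", subject_id="S01", system_name="HPC")
recorder.run_task("find_charlotte", "Charlotte",
                  ask=lambda: input("Type the city you found: "))
```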

User Testing Step by Step
Step 2: Preparation, pilot experiments
- Goal: calibrate the formal testing (formal testing is expensive!)
- Subjects: one or two users
- Process: iterative

User Testing Step by Step
Step 3: Experiment, general processes
- Approach 1: presentation (as speaker), training (as instructor), task performance (as observer), questionnaire and open questions (as listener)
- Approach 2: the conductor sends the experiment materials to the subjects; the subjects learn by themselves, run the testing by themselves, and send the results back to the conductor

User Testing Step by Step
Step 3: Experiment, useful hints
- Alleviate subjects' distress (a subject once cried!)
- Weigh one-on-one vs. group testing
- Don't show individual subjects' names in the report
- Thank the subjects after the testing; point out that their participation leads to improvements to your system

User Testing Step by Step
Step 4: Result analysis and report
- Statistical analysis (see the sketch below)
- Visual exploration and representation
- Multimedia analysis
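For the statistical-analysis step, a common choice with two independent groups is an independent-samples t-test on completion times. A minimal sketch, reusing the hypothetical CSV layout from the recording sketch above (one reasonable option, not the only valid test; a within-subjects design would call for a paired test instead):

```python
import csv
from scipy import stats  # third-party dependency: SciPy

def load_times(path, system_name):
    """Pull completion times (column 3) for one system (column 1)."""
    with open(path, newline="") as f:
        return [float(row[3]) for row in csv.reader(f) if row[1] == system_name]

times_a = load_times("results.csv", "HPC")
times_b = load_times("results.csv", "PC")
t, p = stats.ttest_ind(times_a, times_b, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")
if p < 0.05:
    print("The mean completion times differ significantly at the 5% level.")
```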

Example User Testing
Motivation: Is HPC better than the traditional PC?

Example User Testing
Methodology: use two groups of users and ask them to find as many insights as possible using HPC and PC respectively.

Example User Testing
Preparing prototypes: HPC [screenshot of the HPC prototype]

Example User Testing
Preparing prototypes: PC [screenshot of the PC prototype]

Example User Testing
Invite the users, learn their backgrounds, divide them into two balanced groups (see the sketch below), and make a time schedule (one day: PC in the morning, HPC in the afternoon). Reserve a large classroom. Reserve pizza!

Example User Testing
Pilot experiments: they were very important!
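A minimal sketch of one way to form the two balanced groups, assuming each subject can be reduced to a single numeric background score from the subject survey (names and scores are made up); sorting by score and alternating assignments keeps the group means close:

```python
# Each subject: (name, self-rated experience score), hypothetical survey data.
subjects = [("Ann", 5), ("Bob", 2), ("Cho", 4), ("Dee", 1),
            ("Eli", 3), ("Fay", 5), ("Gus", 2), ("Hal", 4)]

def balanced_split(subjects):
    """Rank by score, then alternate ranks so skill spreads across groups."""
    ranked = sorted(subjects, key=lambda s: s[1], reverse=True)
    return ranked[0::2], ranked[1::2]

group_a, group_b = balanced_split(subjects)
for name, group in (("PC group", group_a), ("HPC group", group_b)):
    mean = sum(score for _, score in group) / len(group)
    print(f"{name}: {[n for n, _ in group]}, mean experience {mean:.1f}")
```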

Example User Testing: Feedback Form
Name: ________

OVERALL REACTIONS TO THE SOFTWARE
  terrible ... wonderful                0 1 2 3 4 5 6 7 8 9
  difficult ... easy                    0 1 2 3 4 5 6 7 8 9
  frustrating ... satisfying            0 1 2 3 4 5 6 7 8 9
  inadequate power ... adequate power   0 1 2 3 4 5 6 7 8 9
  dull ... stimulating                  0 1 2 3 4 5 6 7 8 9
  rigid ... flexible                    0 1 2 3 4 5 6 7 8 9

SCREEN
  Characters on the computer screen:               hard to read ... easy to read   0 1 2 3 4 5 6 7 8 9
  Highlighting on the screen simplifies the task:  not at all ... very much        0 1 2 3 4 5 6 7 8 9
  Organization of information on screen:           confusing ... very clear        0 1 2 3 4 5 6 7 8 9

TERMINOLOGY AND SYSTEM INFORMATION
  Use of terms throughout the system:              inconsistent ... consistent     0 1 2 3 4 5 6 7 8 9

Example User Testing: Open Questions
- If you had more time, do you think you could still find more features?
- Was it hard or easy for you to find features using this system? Why?
- Do you think there is anything we could do to improve this system?
- Are there any good points of this system?
- Any words you want to say to us...
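A minimal sketch of summarizing such rating-scale answers (the item names and responses are made up): reporting the mean and standard deviation per item makes two systems easy to compare at a glance.

```python
from statistics import mean, stdev

responses = {  # item -> one 0-9 rating per subject (hypothetical data)
    "terrible..wonderful":      [7, 8, 6, 7, 9],
    "difficult..easy":          [5, 6, 4, 6, 5],
    "frustrating..satisfying":  [6, 7, 7, 5, 8],
}

for item, ratings in responses.items():
    print(f"{item:28s} mean={mean(ratings):.1f}  sd={stdev(ratings):.1f}")
```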

Example User Testing
Result analysis (see the tallying sketch below):
- Images saved with time stamps
- Classify the insights
- Collect the number and time stamps of insights
- Answers to the open questions
- Answers to the questionnaire
[An example insight was shown here.]

Example User Testing
[Result charts were shown here.]
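A minimal sketch of the insight bookkeeping from the result-analysis list above, with a hypothetical log format of (subject, system, time stamp, insight description): counting each subject's insights and keeping their time stamps supports the quantity and speed comparisons.

```python
from collections import defaultdict

# (subject, system, seconds since start, insight description) -- made-up entries
log = [
    ("S01", "HPC",  95, "cluster of small cars with high MPG"),
    ("S01", "HPC", 240, "outlier: heavy car, low horsepower"),
    ("S02", "PC",  310, "cluster of small cars with high MPG"),
]

tally = defaultdict(list)
for subject, system, t, insight in log:
    tally[(subject, system)].append((t, insight))

for (subject, system), insights in tally.items():
    print(f"{subject} on {system}: {len(insights)} insight(s), "
          f"first at {min(t for t, _ in insights)} s")
```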

Example User Testing
[Further result charts were shown here.]

References
- Nielsen, J. Technology Transfer of Heuristic Evaluation and Usability Inspection. http://www.useit.com/papers/heuristic/learning_inspection.html
- Nielsen, J., and Mack, R. L. (Eds.) (1994). Usability Inspection Methods. John Wiley & Sons, New York.
- Weisband, S. MIS 441: User Interface Design, Prototyping, and Evaluation, class notes. University of Arizona.
- Constantine, L. L., and Lockwood, L. A. D. (1999). Software for Use: A Practical Guide to the Models and Methods of Usage-Centered Design. ACM Press.