IMS5302 Human-Computer Interaction
Lecture 6: Other Evaluation Techniques

Overview
- Other evaluation methods
- Expert reviews
- Field studies
- Developing scenarios
- Selecting an evaluation method

Scenarios
A scenario is a story (Dumas and Redish). Types of scenarios:
- Brief scenario: a short story giving just the facts of a real situation for a user, with no detail on how the user goes about the task.
- Vignette: a brief narrative, possibly with pictures, providing a broad, high-level view of user environments and the current way of doing things.
- Elaborated scenario: provides more detail.
- Complete task scenario: takes the story from beginning to end.
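These four types can be read as the same record filled in to different depths. A minimal sketch in Python, with hypothetical field names, using the airline check-in example from the next slide:

```python
from dataclasses import dataclass, field

@dataclass
class Scenario:
    """One evaluation scenario; richer fields are filled in as it moves
    from a brief scenario towards a complete task scenario."""
    user: str                   # who the story is about
    goal: str                   # what the user is trying to achieve
    context: str = ""           # environment, interruptions, constraints
    steps: list[str] = field(default_factory=list)  # beginning-to-end detail

# Brief scenario: just the facts, no detail on how the task is done.
brief = Scenario(user="business traveller", goal="check in for a flight")

# Complete task scenario: the same story taken from beginning to end.
complete = Scenario(
    user="business traveller",
    goal="check in for a flight",
    context="airport kiosk, noisy hall, frequent interruptions",
    steps=["scan passport", "select seat", "print boarding pass"],
)
```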

Developing scenarios and planning evaluations
Be mindful of the real user environment when setting tasks and planning evaluations. It is often appropriate to conduct evaluations in the environment in which the system will be used, for example an airline check-in system used amid constant interruptions. S&P suggest that a useful approach is to write a scenario and then have it acted out as a form of theatre (p. 128); this is particularly useful for emergency situations.

Scenario writing
Scenarios can be quite complex or quite simple, depending on the purpose of the exercise. S&P describe situations where numerous scenarios were written depending on the audience. Some scenarios, they argue, are quite complex and involve videos to help set the scene. Scenarios do not need to be this complex: often simple scenarios will suffice, provided you are careful and understand the audience.

Expert (usability expert) reviews
Feedback is gathered from users, customers and colleagues. Reviews can occur early or late in the design process, and may occur at different times during development. They take from half a day to a week. The outcome can be a formal report with identified problems and recommendations, and may lead to further discussion with the design team.

Expert reviews
A long-established technique, guided by heuristics. Experts step through tasks, role-playing typical users, and try to identify problems. Expert reviewers should be placed in a situation similar to that of the real users.

The danger with expert reviews is that the experts may not have an adequate understanding of the task domain or user communities. "Experts come in many flavours, and conflicting advice can confuse the situation" (S&P 144). It is therefore important to choose:
- knowledgeable experts;
- experts who are familiar with the project;
- experts with a longer-term association with the organisation, i.e. who know it well.

Expert review techniques
- Heuristic evaluation: expert reviewers critique an interface, checking whether it conforms to standard interface design guidelines.
- Guidelines review: the interface is checked to see that it meets the original directions/guidelines set out at the beginning of the development process.
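In a heuristic evaluation, findings are typically logged per problem against the guideline they violate. A minimal sketch, assuming Nielsen's ten heuristics and his 0-4 severity scale (both from Nielsen's published work, not from these slides), with illustrative field names:

```python
from dataclasses import dataclass

# Nielsen's ten usability heuristics (Nielsen, 1993/1994).
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

@dataclass
class Finding:
    screen: str      # where the problem was observed
    heuristic: str   # which heuristic it violates
    severity: int    # Nielsen's scale: 0 = not a problem ... 4 = catastrophe
    note: str        # the evaluator's description

findings = [
    Finding("payment page", "Visibility of system status", 3,
            "no feedback for several seconds after pressing 'Pay'"),
]
```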

Expert review techniques: cognitive walkthrough
Experts use the system as if they were real users and try to carry out the tasks that users would perform with the system, beginning with tasks that are performed frequently. The goal is to evaluate the strengths and weaknesses of the interface. Walkthroughs consist of answering a set of questions about each of the decisions users make as they use the interface (a sketch of such questions follows the summary below).

Expert review techniques (continued)
- Consistency inspection: check consistency across the interface for things such as colour, language, layout, workflow, etc.
- Formal usability inspection: experts discuss the merits and weaknesses of the interface with the designers, and the designers answer questions. (See S&P pp. 142-144 for more detail.)

Summary: techniques and example uses
- Heuristic evaluation: the usability of the website for the target audience.
- Guidelines review: ensuring that the original goals and design decisions were implemented.
- Consistency inspection: checking that the position of icons and the workflow through the system are the same throughout.
- Cognitive walkthrough: checking that menus work and are logical.
- Formal usability inspection: checking early usability.
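The slides do not list the walkthrough questions themselves; the four below are the standard ones from Wharton et al.'s cognitive walkthrough method, used here as an assumption. The task and its steps are hypothetical:

```python
# The four questions follow Wharton et al.'s cognitive walkthrough method
# (an assumption; the slides only say a set of questions is answered at
# each decision point). The task steps are hypothetical.
WALKTHROUGH_QUESTIONS = [
    "Will the user try to achieve the right effect?",
    "Will the user notice that the correct action is available?",
    "Will the user associate the correct action with the desired effect?",
    "If the correct action is performed, will the user see progress?",
]

def walk_through(steps: list[str]) -> list[tuple[str, str]]:
    """Pair each step of a frequently performed task with each question,
    producing the checklist the expert answers while role-playing a user."""
    return [(step, q) for step in steps for q in WALKTHROUGH_QUESTIONS]

# Example: a frequent task on an airline check-in kiosk.
for step, question in walk_through(
        ["scan passport", "select seat", "print boarding pass"]):
    print(f"{step}: {question}")
```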

Field studies
The aim is to observe users in their natural setting; this increases understanding of what users do and of the impact of technology. Field studies are an opportunity to collect a variety of data, such as notes, pictures, audio, video and artefacts. The most important part of being there is to observe, ask questions and record what users are doing and saying. Field studies can also involve interviews.

Other considerations
- The experience of the development team in implementing different evaluation techniques.
- The type of system and the experience of, and access to, its users.
- The time and money available.
- The level of involvement of users throughout the development process.

Becker and Mottay (2001), "A global perspective on website usability" [the slide reproduces their evaluation matrix].

Fitzpatrick and Dix's response
Fitzpatrick and Dix (1999) argue that there are problems with these simplified matrices:
1. Observational methods are used only for real users with real computers.
2. User reports are available only when real users use representational computers.
3. Reports from specialists are obtained only with representational users.

Alternative matrix
Fitzpatrick and Dix (1999) propose a matrix in which the chosen strategy reflects the timing within the development life cycle: the strategy depends on the resources available at the time, for example user availability. [The slides show their alternative and suggested matrices of evaluation methods.]

Dumas and Redish (1994): conclusions on evaluation methods
They conclude from the various studies that:
- Usability testing uncovers more usability problems, and finds more global, unique and local problems, than other evaluation methods.
- Usability testing takes more time but is more cost-effective overall.
- Heuristic evaluation done by usability specialists is better at finding usability problems than walkthroughs.
- Heuristic evaluation is more valuable when used with several usability experts working independently (a model quantifying this appears below).
- Cognitive walkthroughs are the least effective.
- Software engineers are very poor at uncovering usability problems.

Other considerations and issues
- Tests are often conducted with people who are not real users.
- They can fail to take into account user goals and motives.
- Scenarios are often based on system functions rather than on actual user needs and activities.
- The language used in the test is often not language that typical users are familiar with.
- Lack of enthusiasm by the users.
- Tests do not take into account users' long-term use of the system.
- They focus on the system, not the users.

Test/evaluation facilities
- Multi-purpose rooms (labs, portable computers, an empty room).
- Full usability labs.
- Opportunistic usability testing (mini usability tests whenever and wherever possible).
- Quick and dirty (convenience rather than ideal users).
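The benefit of several independent experts (Dumas and Redish's point above) is often quantified with the problem-discovery model of Nielsen and Landauer (1993), which is not on the slides but is the standard reference for this point:

```python
# Nielsen and Landauer's (1993) problem-discovery model (not from the
# slides): the expected proportion of usability problems found by n
# independent evaluators is 1 - (1 - L)**n, where L is the probability
# that a single evaluator finds a given problem (about 0.31 on average
# in their data).
def proportion_found(n: int, L: float = 0.31) -> float:
    return 1 - (1 - L) ** n

for n in (1, 3, 5):
    print(f"{n} evaluator(s): {proportion_found(n):.0%}")
# 1 evaluator(s): 31%
# 3 evaluator(s): 67%
# 5 evaluator(s): 84%
```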

Time and money
Usability tests cost money and take time; a test can take up to half a day per user (a rough calculation appears below). To make the most of a limited number of participants:
- design carefully: decide which characteristics matter most, so that the defined subgroups will be useful;
- collect other relevant information to help account for other differences that show up in the results;
- choose people representative of the full subgroup, with the full range of qualifications.

Some final thoughts
Common principles often mentioned as contributing to improved system design:
- Create multidisciplinary teams and involve them in all stages.
- Understand the product life cycle.
- Incorporate lessons learned from predecessor systems into new designs.
- Include all team members in the planning phase.
- Understand stakeholder needs and operations early.
- Involve users in the design process.
- Improve the product design continuously throughout its life cycle.
- Obtain management commitment and support.
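A back-of-the-envelope reading of the half-day-per-user figure; the session length comes from the slide, while the participant count and working-day length are hypothetical:

```python
# Rough test budgeting. "Up to half a day per user" is from the slide;
# the participant count and working-day length are hypothetical.
users = 8                    # participants (hypothetical)
hours_per_user = 4           # up to half a day per user
working_day_hours = 8        # length of a working day (hypothetical)

total_hours = users * hours_per_user                 # 32 contact hours
facilitator_days = total_hours / working_day_hours   # 4 working days
print(f"{total_hours} contact hours = {facilitator_days:.0f} facilitator days")
```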

Further reading and references
- Apple Computer (1987) Human Interface Guidelines.
- Becker & Mottay (2001) "A global perspective on website usability", IEEE Software, Jan/Feb.
- Dumas, J. & Redish, J. (1994) A Practical Guide to Usability Testing, Ablex Publishing.
- Fitzpatrick & Dix (1999) "A process for appraising commercial usability evaluation methods", Human-Computer Interaction: Ergonomics and User Interfaces, Proceedings of HCI International.
- Hackos & Redish (1998) User and Task Analysis for Interface Design, Wiley.
- Mandel, T. (1997) The Elements of User Interface Design, Wiley: New York.
- Mayhew, D. (1992) Principles and Guidelines in Software User Interface Design, Prentice Hall.
- Nielsen, J. (1993) Usability Heuristics.
- Oz, E. & Sosik, J. (2000) "Why information systems projects are abandoned", Journal of Computer Information Systems, Fall.
- Preece, J. et al. (1994) Human-Computer Interaction, Addison-Wesley, Essex, England.
- Rudisill, Lewis, Polson & McKay (1996) Human-Computer Interface Design.