CS3724 Human-computer Interaction. Usability Specifications. Copyright 2005 H. Rex Hartson and Deborah Hix.

Topics
- What are usability specifications?
- Usability specification tables
- Benchmark task descriptions
- User errors in usability specifications
- Usability specifications and managing the UE process
- Team exercises

Usability Specifications
- Quantitative usability goals against which the user interaction design is measured
- Target levels for usability attributes
- Operationally defined metric for a usable interaction design
- Management control for the usability engineering life cycle
- Indication that the development process is converging toward a successful design
- Establish as early in the process as feasible

Usability Specifications
- Tie usability specifications to early usability goals
  - E.g., for an early goal of walk-up-and-use usability, base the usability specification on initial task performance time
- All project members should agree on usability specification attributes and values

Usability Specification Data
- Usability specifications are based on
  - Objective, observable user performance
  - Subjective user opinion and satisfaction
- Subjective preferences may reflect users' desire to return to your Web site, but flash and trash soon bores and irritates
- Objective and subjective usability specifications can both be quantitative

Usability Specification Table
Credit to: [Whiteside, Bennett, & Holtzblatt, 1988]

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level

- Usability attribute: what general usability characteristic is to be measured (the columns are sketched as a data structure below)
- May need separate usability attributes for each user class
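To make the five columns concrete, here is a minimal sketch of one row of the table as a Python record. The field names simply mirror the column headers; the example values anticipate the Calendar example developed on the following slides and are illustrative, not prescribed by the original.

```python
# Sketch only: one usability specification (one table row) as a data structure.
from dataclasses import dataclass

@dataclass
class UsabilitySpec:
    usability_attribute: str    # general usability characteristic to be measured
    measuring_instrument: str   # e.g., a benchmark task or a questionnaire
    value_to_be_measured: str   # the metric, e.g., time on task
    current_level: float        # present/baseline value, when available
    target_level: float         # value indicating usability success

# Example row, filled in as the later Calendar slides do.
spec = UsabilitySpec(
    usability_attribute="Initial performance",
    measuring_instrument="BT1: Add appt",
    value_to_be_measured="Time on task (seconds)",
    current_level=20.0,   # competitor system
    target_level=15.0,
)
```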

Usability Specifications
Some quantitative usability attributes:
- Objective
  - Initial performance (on benchmark tasks)
  - Longitudinal (experienced, steady-state) performance
  - Learnability
  - Retainability
- Subjective
  - Initial impression (questionnaire score)
  - Longitudinal satisfaction

Usability Specification Table
Usability attribute for Calendar: initial performance, since we want good 'walk-up-and-use' performance without training or manuals.

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | | | |

Usability Specification Table
- Measuring instrument: vehicle by which values are measured for the usability attribute
  - The thing that generates the data
  - A benchmark task generates objective timing data
  - A questionnaire generates subjective preference data
- Sample questions from the QUIS satisfaction questionnaire

Benchmark Tasks
What tasks should be included?
- Representative, frequently performed tasks
  - Common tasks: the 20% that account for 80% of usage
- Critical business tasks: not frequent, but if you get it wrong, heads can roll
- Example: Schedule a meeting with Dr. Ehrich for four weeks from today at 10 am in 133 McBryde, about the HCI research project

Benchmark Task Descriptions
- Clear, precise, repeatable instructions
- IMPORTANT: what task to do, not how to do it
- Clear start and end points for timing
  - Not: "Display next week's appointments" (end with a user action confirming the end of the task)
- Adapt scenarios already developed for design
  - Clearly an important task to evaluate
  - Remove information about how to do it

Benchmark Task Descriptions
- Start with fairly simple tasks, then progressively increase difficulty
  - Add an appointment, then add an appointment 60 days from now, then move an appointment from one month to another, then add recurring appointments
- Avoid large amounts of typing if typing skill is not being evaluated
- Tasks should include navigation
  - Not: "look at today's appointments"

Benchmark Task Descriptions
- Task wording should be unambiguous
  - Why is this ambiguous? "Schedule a meeting with Mr. Jones for one month from today, at 8 AM."
- Important: don't use words in benchmark tasks that appear specifically in the interaction design
  - Not: "Find first appointment" when there is a button labeled "Find"
  - Instead: use "search for", "locate"

Benchmark Task Descriptions
- Use work-context wording, not system-oriented wording
  - "Access information about xyz" is better than "submit query"
- To evaluate error recovery, a benchmark task can begin in an error state

Benchmark Task Descriptions
- Put each benchmark task on a separate sheet of paper
- Typical number of benchmark tasks: enough for reasonable, representative coverage
- Example for Calendar: Add an appointment with Dr. Kevorkian for 4 weeks from today at 9 AM concerning your flu shot (yeah, right)
(One way a team might record such task descriptions is sketched below.)
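The following is a minimal, hypothetical sketch (not from the original slides) of recording a benchmark task description so that it has the properties called for above: what to do rather than how, clear start and end points for timing, and wording that avoids labels taken from the interaction design. The field names and the "forbidden label" list are assumptions for illustration.

```python
# Sketch only: a benchmark task description with explicit timing boundaries.
from dataclasses import dataclass

@dataclass
class BenchmarkTask:
    task_id: str
    description: str   # work-context wording; no UI labels such as "Find"
    start_point: str   # moment at which timing starts
    end_point: str     # observable user action confirming the end of the task

bt1 = BenchmarkTask(
    task_id="BT1",
    description=("Add an appointment with Dr. Kevorkian for 4 weeks from today "
                 "at 9 AM concerning your flu shot."),
    start_point="Participant turns over the task sheet.",
    end_point="Participant says 'done' after the appointment is saved.",
)

# Crude check that the description avoids words taken from the interface design
# (the forbidden list here is purely illustrative).
ui_labels = {"find", "submit", "query"}
assert not ui_labels & set(bt1.description.lower().split()), "Reword the task"
```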

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | | |

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | | |

Value to be measured: the metric for which usability data values are collected.

Usability Specification Table
Value to be measured: the metric for which usability data values are collected. Examples (two of these are computed in the sketch below):
- Time to complete task
- Number of errors
- Frequency of help and documentation use
- Time spent in errors and recovery
- Number of repetitions of failed commands
- Number of times user expresses frustration or satisfaction
- Number of commands, mouse clicks, or other user actions to perform task(s)
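As a minimal sketch, assuming a benchmark-task session is logged as timestamped events (the event names and timings below are invented for illustration), two of the values above can be derived directly from such a log:

```python
# Sketch only: deriving time on task and error count from a simple session log.
session_log = [
    (0.0,  "task_start"),
    (4.2,  "action"),
    (9.8,  "error"),      # e.g., selected the wrong month
    (13.1, "action"),
    (18.4, "task_end"),
]

start = next(t for t, e in session_log if e == "task_start")
end = next(t for t, e in session_log if e == "task_end")
time_on_task = end - start                            # seconds
error_count = sum(1 for _, e in session_log if e == "error")

print(f"Time on task: {time_on_task:.1f} s, errors: {error_count}")
```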

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task | |

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task | |

Current level: present value of the usability attribute to be measured.

Usability Specification Table
Current level:
- Level of performance for the current version of the system, for the measuring instrument (when available)
- Baseline to help set the target level, from:
  - Automated system (existing or prior version)
  - Competitor system
  - Developer performance (for expert, longitudinal use)
  - Trying out some users on your early prototype

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task | 20 secs (competitor system) |

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task | 20 secs (competitor system) |

Target level: value indicating unquestioned usability success for the present version.

Usability Specification Table
Target level:
- Minimum acceptable level of user performance
- Determining target level values: usually an acceptable improvement over the current level

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task | 20 secs (competitor system) | 15 secs

Usability Specification Table
More example usability specifications (the subjective score in the last row is computed in the sketch below):

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task | 20 secs (competitor system) | 15 secs
Initial performance | BT1: Add appt | Nbr of errors | 2 | 1
Initial satisfaction | Q 1, 2, 7 from questionnaire | Avg score over questions, users / 10 | 7 | 8.5
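For the subjective row, the measured value is the score on the selected questionnaire items (Q1, Q2, Q7) averaged over questions and users, on a 0-10 scale. A minimal sketch of that arithmetic, with participant scores invented purely for illustration:

```python
# Sketch only: average satisfaction score over selected items and participants.
scores = {
    "P1": {"Q1": 8, "Q2": 7, "Q7": 9},
    "P2": {"Q1": 6, "Q2": 7, "Q7": 8},
    "P3": {"Q1": 9, "Q2": 8, "Q7": 7},
}

all_item_scores = [v for answers in scores.values() for v in answers.values()]
avg_score = sum(all_item_scores) / len(all_item_scores)
print(f"Average satisfaction score: {avg_score:.1f} / 10")  # compare to the 8.5 target
```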

User Errors in Usability Specs.
What constitutes a user error?
- Deviation from any correct path to accomplish the task (except, for example, going to Help)
- Only situations that imply usability problems
  - Do not count "oops" errors: doing the wrong thing when the user knew it was wrong

User Errors in Usability Specs.
Examples of errors:
- Selecting the wrong menu, button, icon, etc., when the user thought it was the right one
  - E.g., working on the wrong month of the calendar because the user couldn't readily see the month's name
- Double-clicking when a single click is needed, and vice versa
- Operating on the wrong interaction object (when the user thought it was the right one)
- Usually not typing errors

Creating Usability Specifications
- Usability evaluation design is driven by usability goals
- First determine usability goals
  - In terms of user class, task context, special tasks, marketing needs
  - Example: Reduce the amount of time for a novice user to perform task X in Version 2.0
- Be as specific as possible
  - Example: currently 35 seconds to perform task X ("current level"); reduce to 25 seconds ("target level")

Creating Usability Specifications
- What are the constraints in the user or work context?
- Design for ecological validity
  - How can the setting be more realistic? A usability lab can be a sterile work environment
  - Does the task require a telephone or other physical props?

Creating Usability Specifications
Design for ecological validity:
- Does the task involve more than one person or role?
- Does the task involve background noise?
- Does the task involve interference or interruption?

Creating Usability Specifications
- Experimental design must take into account trade-offs among user groups
- Watch out for the potential trade-off between learnability for new users and performance power for experienced users

Usability Specifications: Connecting Back to the UE Process
- Usability specifications help manage the usability engineering process
  - This is the control of the usability engineering life cycle
- Quantifiable end to the process
- Accountability
- Stop iterating when target-level usability specifications are met (see the sketch below)
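A minimal sketch (a hypothetical helper, not from the slides) of that control role: keep running design-and-evaluate iterations until every observed value meets its target level. The example numbers are invented.

```python
# Sketch only: stop-iterating check against target levels.
def targets_met(results):
    """results: list of (observed_value, target_level, lower_is_better) tuples."""
    return all(
        (observed <= target) if lower_is_better else (observed >= target)
        for observed, target, lower_is_better in results
    )

iteration_results = [
    (17.0, 15.0, True),   # time on task: 17 s observed vs. 15 s target
    (1,    1,    True),   # errors: 1 observed vs. 1 allowed
    (8.7,  8.5,  False),  # satisfaction: 8.7 observed vs. 8.5 target
]

if targets_met(iteration_results):
    print("All target levels met: stop iterating.")
else:
    print("Some target levels not met: run another design/evaluate iteration.")
```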

Usability Specifications
- It's expected that you will not meet all usability target levels on the first iteration
  - If usability target levels are met on the first iteration, they may have been too lenient
- The point is to uncover usability problems
- DO NOT design usability specifications with the goal of meeting them with your initial design!

Usability Specifications
Bottom line: this is not an exact science.
- Good engineering judgment is important
  - For setting levels (especially the target level)
  - For knowing if specifications are reasonable
- You get better at it with experience

Team Exercise: Usability Specifications
Goal: to gain experience in writing precise, measurable usability specifications using benchmark tasks.
Activities:
- Produce three usability specifications: two based on objective measures, one based on a subjective measure
- For the objective measures, write brief but specific benchmark task descriptions (at least two different benchmark task descriptions), each on a separate sheet of paper. Make them a little complicated and include some navigation.

Team Exercise: Usability Specifications
- Specifications with objective measures should be evaluatable, via benchmark tasks, in a later class exercise on formative evaluation. Develop tasks that you can implement in your next exercise, in which you build a rapid prototype.
- The specification for the subjective measure should be based on the questionnaire supplied. Select 3 or 4 items from the questionnaire.

Team Exercise: Usability Specifications
Cautions and hints:
- Don't spend any time on design in this exercise; there will be time for detailed design in the next exercise.
- Don't plan to give users any training.
To hand in:
- 3 usability specifications, in the table form, on a transparency
- Questionnaire question numbers included in the subjective specification
- Benchmark task descriptions, each on a separate sheet of paper
Complete in about 30-40 minutes max.