GOMS. Adapted from Berkeley GUIR & Caitlin Kelleher


GOMS
- Goals: what the user wants to do
- Operators: actions performed to reach the goal
- Methods: sequences of operators that accomplish a goal
- Selection Rules: describe when to choose one method over another
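To make the four components concrete, here is a minimal sketch of a GOMS description as plain data. The task, method names, operators, and selection rule are all hypothetical, chosen only for illustration:

```python
# A hypothetical GOMS description of the goal "delete a file".
# Methods are alternative operator sequences for the same goal;
# the selection rule picks between them based on context.
goms_model = {
    "goal": "delete a file",
    "methods": {
        "drag-to-trash": ["point-to-file", "press-mouse-button",
                          "drag-to-trash", "release-mouse-button"],
        "keyboard": ["point-to-file", "click", "press-delete-key"],
    },
}

def select_method(model, hands_on_keyboard):
    """Selection rule (hypothetical): prefer the keyboard method
    when the hands are already on the keyboard."""
    return "keyboard" if hands_on_keyboard else "drag-to-trash"

method = select_method(goms_model, hands_on_keyboard=True)
print(method)                          # the method the rule selects
print(goms_model["methods"][method])   # its operator sequence
```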

What is GOMS?
- "Crash-test dummy for HCI"
- A family of techniques for modeling users' interactions with a system
- Like cognitive walkthrough, begins with a set of tasks and the steps to accomplish them

Cog-Sci-Based Modeling
- Use what we know about human information processing in the brain to predict how people will perform
- Problem: the brain is still poorly understood
- Workaround: model experts only; this lets us avoid trying to predict how someone might misinterpret a UI element, etc.

The Model Human Processor
Card, Moran & Newell ('83), based on empirical data.
[Diagram: the eyes and ears feed the Visual and Auditory Image Stores (sensory buffers); a Perceptual Processor moves this into Working Memory, which connects to Long-Term Memory; a Cognitive Processor decides, and a Motor Processor drives the fingers, etc.]

Where are we?
- We can't accurately predict planning time
- We can predict execution time, given an expert who makes no errors

Which leaves...
- Learning
- Errors
- Long-term recall
- Fatigue
- Etc.

Novice vs. Expert Use
- User testing focuses almost exclusively on first experiences: learning issues, comprehension issues
- The GOMS family allows you to look at the expert experience:
  - Where will users' time go? Is that where we expect?
  - Are there repetitive aspects we can eliminate?

So today...
GOMS techniques are most useful for systems where:
- There will be experts
- Users repeatedly perform a (relatively) small number of tasks
GOMS is good for streamlining the efficiency of a process.

How to do a GOMS Analysis
1. Generate a task description
   - Pick a high-level user Goal
   - Write a Method for accomplishing the Goal; it may invoke subgoals
   - Write Methods for the subgoals (this is recursive; it stops when Operators are reached)
2. Evaluate the description of the task
3. Apply the results to the UI
4. Iterate
Adapted from Chris Long

Keystroke-Level Model
The model was developed to predict the time to accomplish a task on a computer. It predicts expert, error-free task-completion time from the following inputs:
- a task or series of subtasks
- the method used
- the command language of the system
- the motor-skill parameters of the user
- the response-time parameters of the system
The prediction is the sum of the subtask times plus overhead.

KLM Accuracy
- Widely validated in academia
- KLM predictions are generally within 10-20% of actual expert performance
- A simplified cognitive model

Keystroke-Level Model
1. Predict: the task time is the sum of the individual action times (Action 1: x sec, Action 2: y sec, Action 3: z sec; total t sec)
2. Evaluate: compare the predicted time using interface 1 against the predicted time using interface 2

Symbols and Values
  Operator  Remarks                       Time (s)
  K         Press key                     0.2
  B         Mouse button press / click    0.10 / 0.20
  P         Point with mouse              1.1
  H         Home hand to/from keyboard    0.4
  D         Drawing                       domain dependent
  M         Mentally prepare              1.35
  R         Response from system          measure
Raskin excludes some of these operators. Assumption: expert user.

Raskin's Rules
Rule 0 (initial insertion of candidate M's): Place an M before every K, and an M before every P iff that P selects a command (not when P points to arguments).
Rule 1 (deletion of anticipated M's): If an operator following an M is fully anticipated, delete that M (e.g., when you point and click).

Raskin's Rules
Rule 2 (deletion of M's within cognitive units): If a string of MK's belongs to a cognitive unit, or chunk (e.g., typing "4564.23"), delete all M's but the first.
Rule 3 (deletion of M's before consecutive terminators): If a K is a redundant delimiter (e.g., a closing parenthesis), delete the M before it.


Raskin's Rules
Rule 4 (deletion of M's that are terminators of commands): If a K is a delimiter that follows a constant string, delete the M in front of it.
Rule 5 (deletion of overlapped M's): Do not count any M that overlaps an R.

Example 1: Temperature Converter
"Choose which conversion is desired, then type the temperature and press Enter."
- Convert F to C
- Convert C to F

Example 1: Temperature Converter (worked)
  Raw encoding:         HPB (select "F to C")   PB (click in text box)   HKKKKK (type temperature, press Enter)
  Apply Rule 0:         HMPMB   PMB   HMKMKMKMKMK
  Apply Rules 1 and 2:  HMPB   PB   HMKKKKMK
  Convert to numbers:   0.4 + 1.35 + 1.1 + 0.2  +  1.1 + 0.2  +  0.4 + 1.35 + 4(0.2) + 1.35 + 0.2  =  8.45 s
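The arithmetic above is easy to automate. A minimal sketch in Python, using the operator times from the table (B is charged here as a full 0.2 s click):

```python
# Sum KLM operator times for an encoded action sequence.
# Times (in seconds) follow the operator table; B is taken
# as a 0.2 s mouse click.
KLM_TIMES = {
    "K": 0.2,    # press key
    "B": 0.2,    # mouse button click
    "P": 1.1,    # point with mouse
    "H": 0.4,    # home hand to/from keyboard
    "M": 1.35,   # mentally prepare
}

def klm_time(encoding: str) -> float:
    """Predicted expert, error-free time for a KLM encoding.
    Spaces (used to group sub-sequences) are ignored."""
    return sum(KLM_TIMES[op] for op in encoding if op != " ")

# The temperature-converter example from the slide:
print(round(klm_time("HMPB PB HMKKKKMK"), 2))  # → 8.45
```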

Which is more efficient?
  Operator  Description                   Time (s)
  K         Key press                     0.2
  B         Mouse button press            0.1
  P         Pointing with mouse           1.1
  H         Home hand to/from keyboard    0.4
  M         Mentally prepare              1.35
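As a hypothetical comparison (the two encodings below are illustrative, not from the slides): invoking the same command with a toolbar button versus a two-key keyboard shortcut, for a user whose hands start on the keyboard.

```python
# Compare two hypothetical ways to invoke the same command.
# Times follow the table above (B = 0.1 s button press here).
KLM_TIMES = {"K": 0.2, "B": 0.1, "P": 1.1, "H": 0.4, "M": 1.35}

def klm_time(encoding):
    return sum(KLM_TIMES[op] for op in encoding if op != " ")

toolbar  = "HMPB H"   # home to mouse, prepare, point, click, home back
shortcut = "MKK"      # prepare, then a two-key chord

print(round(klm_time(toolbar), 2))   # → 3.35
print(round(klm_time(shortcut), 2))  # → 1.75
```

For this user the shortcut wins mainly because it avoids the two 0.4 s homing operators and the 1.1 s pointing operator.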

How to do a GOMS Analysis
1. Generate a task description
2. Evaluate the description of the task
3. Apply the results to the UI
   - Look for ways to remove steps (learning + execution)
   - Look for ways to reuse sub-methods (learning)
   - Make sure that the end state is the goal (error prevention)
4. Iterate
Adapted from Chris Long

Real-World Example
- Nynex estimated that for every second saved in operator support, the company could save $3 million a year
- The phone company was considering replacing old workstations with new ones: 1,000 new stations at $10K each
- The new stations promised to reduce operator support time

Toll and Assistance Operators (TAOs)
- The people you get when you dial 0 (person-to-person, collect, and calling-card calls)
- Their task: answer three questions
  - Who pays?
  - At what rate?
  - Is the connection complete?

Field Trial to Finalize the Purchase
- $70 million purchase for the new system
- Wanted to gain in-house experience with training, using, and maintaining the new system
- Hoped to be able to show the benefits of the new system

Prediction: New System Faster
Based on:
- Reduced keystrokes on most call categories
- The new system being 880 msec faster at displaying a screenful of info
Estimated savings: $12.3 million a year.

Alongside: GOMS Modeling
- Models were built during the field trial
- Based on the system specs, not field observation

Field Trial Data
- New system ~1 second slower
- Learning does not appear to be a factor: differences in average times didn't converge over time, as you would expect if there were a training effect

Field Trial Results
- 78,240 phone calls over four months
- 24 TAOs using the new system
- 24 TAOs using the old system

What's the problem?
- The original model captured a serial sequence of operations
- In fact, TAOs do several things in parallel: listening to the customer, viewing the display, entering info via the keyboard
- The model needs CPM-GOMS (Critical Path Method / Cognitive-Perceptual-Motor GOMS)

With CPM-GOMS
- Predicted: 0.6 sec slower
- Empirical data: 0.8 sec slower
- Correctly predicted the direction of the difference in 13 of 15 call categories

[Figures: "GOMS old vs. new" (two slides) and "Model as Explanation." From Gray et al., "GOMS Meets the Phone Company," Interact, 1990.]

Where do I use this?
Situations where task performance is critical:
- Airline/auto displays
- Emergency management systems
- Process control systems
- Customer service systems
- Etc.

Caveats
- *Very* idealized: assumes an expert user and perfect performance
- Depending on the scenario, you may need to choose your version of GOMS carefully

CogTool: Automating KLM
- Free software from Bonnie John's group at CMU
- Take a collection of images, define the transitions, and out comes a KLM model

Health Kiosk Screen Shots
[Screenshots from: http://www.andrew.cmu.edu/user/sanchitg/healthkiosk.html]

Example CogTool Comparison
[Results from Bellamy, John, and Kogan, "Deploying CogTool: Integrating Quantitative Usability into Real-World Software Development."]

A Constant Time for Pointing: Can We Do Better?
The KLM charges the same 1.1 s for every P (point with mouse), no matter how far away or how small the target is. Assumption: expert user.

Fitts' Law
Models movement time for selection tasks. The movement time for a well-rehearsed selection task:
- increases as the distance to the target increases
- decreases as the size of the target increases

Fitts' Law
  Time (in msec) = a + b log2(D/S + 1)
where:
- a, b are constants (empirically derived)
- D is the distance to the target
- S is the size of the target
- ID, the Index of Difficulty, is log2(D/S + 1)
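A minimal sketch of the formula in Python. The a and b defaults below are placeholders; real values must be fit empirically, as described later:

```python
import math

def index_of_difficulty(d, s):
    """Fitts' index of difficulty, ID = log2(D/S + 1), in bits."""
    return math.log2(d / s + 1)

def movement_time(d, s, a=0.1, b=0.1):
    """Predicted movement time, MT = a + b * ID.
    a and b are hypothetical device constants; fit them from
    measured data for a real device."""
    return a + b * index_of_difficulty(d, s)

# Doubling both distance and size leaves the difficulty unchanged:
print(index_of_difficulty(7, 1))    # → 3.0
print(index_of_difficulty(14, 2))   # → 3.0
```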

Fitts' Law: Time = a + b log2(D/S + 1)
[Three figures compare pairs of targets: the same ID means the same difficulty; a smaller ID is easier; a larger ID is harder.]

Determining the Constants for Fitts' Law
To determine a and b:
- Design a set of tasks with varying values of D and S (conditions)
- For each task condition, conduct multiple trials; the time to execute each is recorded and stored electronically for statistical analysis
- Accuracy is also recorded, either through the x-y coordinates of selection or through the error rate: the percentage of trials selected with the cursor outside the target
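The fit itself is ordinary linear regression of measured movement time against ID. A sketch using only the standard library (the data points below are synthetic, for illustration):

```python
import math

def fit_fitts(distances, sizes, times):
    """Least-squares fit of MT = a + b * ID over measured trials.
    Returns (a, b)."""
    ids = [math.log2(d / s + 1) for d, s in zip(distances, sizes)]
    n = len(ids)
    mean_x = sum(ids) / n
    mean_y = sum(times) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ids, times))
         / sum((x - mean_x) ** 2 for x in ids))
    a = mean_y - b * mean_x
    return a, b

# Synthetic, noise-free trials generated with a = 0.3 s, b = 0.15 s/bit:
ds = [2, 6, 14, 30]   # distances
ss = [2, 2, 2, 2]     # target sizes
ts = [0.3 + 0.15 * math.log2(d / s + 1) for d, s in zip(ds, ss)]
a, b = fit_fitts(ds, ss, ts)
print(round(a, 3), round(b, 3))  # → 0.3 0.15
```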

A Quiz Designed to Give You Fitts
http://www.asktog.com/columns/022designedtogivefitts.html
Microsoft toolbars offer the user the option of displaying a label below each tool. Name at least one reason why labeled tools can be accessed faster. (Assume, for this, that the user knows the tool and does not need the label simply to identify it.)

A Quiz Designed to Give You Fitts
1. The label becomes part of the target, so the target is bigger. Bigger targets, all else being equal, can always be accessed faster (Fitts' Law).
2. When labels are not used, the tool icons crowd together.

A Quiz Designed to Give You Fitts
You have a palette of tools in a graphics application consisting of 16x16-pixel icons laid out as a 2x8 array along the left-hand edge of the screen. Without moving the array from the left-hand side of the screen or changing the size of the icons, what steps can you take to decrease the time necessary to access the average tool?

A Quiz Designed to Give You Fitts
1. Change the array to 1x16, so all the tools lie along the edge of the screen.
2. Ensure that the user can click on the very first row of pixels along the edge of the screen to select a tool. There should be no buffer zone.

Fitts' Law Example
Pop-up linear menu vs. pop-up pie menu (Today, Sunday, Monday, Tuesday, Wednesday, Thursday, Friday, Saturday).
Which will be faster on average? The pie menu: bigger targets and less distance.
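The intuition can be checked with the formula. The geometry below is entirely hypothetical (20-pixel-tall linear menu items vs. pie slices at a fixed 30-pixel radius with an effective 50-pixel target width), as are the a and b constants; only the direction of the comparison matters:

```python
import math

def movement_time(d, s, a=0.1, b=0.1):
    # MT = a + b * log2(D/S + 1); a, b are hypothetical constants.
    return a + b * math.log2(d / s + 1)

ITEMS = 8  # Today + the seven days

# Linear menu: item i sits (i + 0.5) * 20 px below the cursor;
# the target is the 20 px item height.
linear = [movement_time((i + 0.5) * 20, 20) for i in range(ITEMS)]

# Pie menu: every slice is ~30 px away; the effective target
# (slice width at that radius) is taken as 50 px.
pie = [movement_time(30, 50) for _ in range(ITEMS)]

avg = lambda xs: sum(xs) / len(xs)
print(avg(pie) < avg(linear))  # → True: pie menu faster on average
```

Under these assumptions the nearest linear items are about as fast as a pie slice, but the distant ones dominate the average.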

Principles of Operation (cont.): Fitts' Law
- Moving the hand to a target is a series of micro-corrections
- Each correction takes Tp + Tc + Tm = 240 msec (one perceptual + cognitive + motor cycle)
- The time Tpos to move the hand to a target of size S that is distance D away is given by:
    Tpos = a + b log2(D/S + 1)
- Summary: the time to move the hand depends only on the relative precision required
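The slide's derivation can be sketched numerically: if each 240 msec cycle closes a fixed fraction of the remaining distance, the number of cycles needed to land inside the target grows with log(D/S), which is exactly the Fitts form. The 90% per-cycle accuracy below is a hypothetical value for illustration:

```python
CYCLE_MS = 240      # Tp + Tc + Tm: one perceive-decide-move cycle
ACCURACY = 0.9      # hypothetical: each correction covers 90% of
                    # the remaining distance

def corrections_needed(d, s):
    """Count micro-correction cycles until the remaining distance
    fits inside a target of size s."""
    remaining, cycles = d, 0
    while remaining > s / 2:
        remaining *= 1 - ACCURACY   # the over/undershoot left over
        cycles += 1
    return cycles

def simulated_time_ms(d, s):
    return CYCLE_MS * corrections_needed(d, s)

# Time depends on D/S, not on D alone: scaling both leaves it fixed.
print(simulated_time_ms(100, 10) == simulated_time_ms(200, 20))  # → True
print(simulated_time_ms(1000, 10) > simulated_time_ms(100, 10))  # → True
```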