
USABILITY EVALUATIONS OF Health Information Technology

Outline
- Usability, general overview
- Usability evaluation methods
- Human-computer usability heuristics
- Considerations when developing usability evaluations
- Hands-on usability evaluation experience this afternoon
- Summarize usability evaluations tomorrow & plan for next training session

Overall objectives
- Expand knowledge and skills for conducting formal usability evaluations & apply results to continuously improve IT applications at GHS
- Create a network of usability coordinators to provide feedback & expertise on usability evaluations, gained from continuous learning & experience associated with software design & implementation

We all have mental models...

Software usability
"The extent to which the product can be used by specified users to achieve specified goals... [including] effectiveness, efficiency & satisfaction." (Karat, 1997)
Also refers to methods for improving ease-of-use through user-centered design.

Usability through the eyes of Jakob Nielsen
Usability is the degree to which the system is easy to use, or "user friendly." Nielsen has defined 5 facets of usability:
- learnability
- efficiency
- memorability
- errors
- satisfaction

Learnability
A system should be easy to learn so the user can rapidly catch on. This is more likely if the system:
- is intuitive/logical to the end user
- accounts for the end user transferring &/or unlearning skills from previous experience

How to measure learnability?
- Amount of time it takes to achieve proficiency on system use (see the sketch below)
- Complexity of navigating through the software (captured through mapping)
- User feedback
- Other
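
"Time to proficiency" can be made concrete with a simple criterion. A minimal sketch (not from the slides), assuming each user's task-completion times are logged per practice trial; the proficiency criterion of coming within 25% of an expert baseline is a hypothetical choice for illustration:

```python
# Illustrative sketch: estimate learnability as "trials to proficiency".
# Assumes per-trial task times (in seconds) are logged for one user;
# the 25%-of-expert-baseline criterion is an assumption, not a standard.

def trials_to_proficiency(trial_times, expert_baseline, tolerance=0.25):
    """Return the first trial at which the user's time falls within
    `tolerance` of the expert baseline, or None if never reached."""
    threshold = expert_baseline * (1 + tolerance)
    for trial, seconds in enumerate(trial_times, start=1):
        if seconds <= threshold:
            return trial
    return None

# Example: a user practicing a CPOE order-entry task (times in seconds)
times = [310, 220, 150, 95, 80, 72]
print(trials_to_proficiency(times, expert_baseline=70))  # -> 5
```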

Efficiency of use
The system should be efficient to use, so once the user has learned the system, a high level of productivity is possible. A new system's efficiency should also motivate the user to want to use it.

Efficiency of use
Efficiency has been shown to be the most significant measure of physician satisfaction with CPOE: response time & ease of use (Bates group, VA groups, Patel group).

How to measure efficiency of use?
- Amount of time for users to perform typical task(s)
- Number of keystrokes to accomplish a task
** Keep in mind levels of users: super-users through casual, less frequent users
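
Both measures fall out of a simple event log. A minimal sketch, assuming a hypothetical log of (timestamp, event-type) tuples rather than any particular system's format:

```python
# Illustrative sketch: derive task time and keystroke count from an
# event log. The log format below is an assumption for the example.

from datetime import datetime

def task_metrics(events):
    """events: time-ordered list of (iso_timestamp, event_type) tuples
    for one task. Returns (duration_seconds, keystroke_count)."""
    start = datetime.fromisoformat(events[0][0])
    end = datetime.fromisoformat(events[-1][0])
    keystrokes = sum(1 for _, etype in events if etype == "keypress")
    return (end - start).total_seconds(), keystrokes

log = [
    ("2024-05-01T09:00:00", "task_start"),
    ("2024-05-01T09:00:04", "keypress"),
    ("2024-05-01T09:00:05", "keypress"),
    ("2024-05-01T09:00:21", "task_end"),
]
print(task_metrics(log))  # -> (21.0, 2)
```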

Memorability
A system should be easy to remember, so even the intermittent or casual user is able to return to the system after some period of not having used it without having to learn everything all over again. This is supported by a system that:
- is logical
- has some degree of repetitiveness
- considers what else users interact with

How to measure memorability?
- Recall of casual users who haven't used/observed the system much
- Memory test with users after an intro session
- Others

Errors
A system should have a low error rate, so users make few errors during use of the system and, if they do make errors, they can easily recover from them. CATASTROPHIC ERRORS MUST NOT OCCUR!

If an error does occur...
- What is the effect? (What risk/consequence? How great/severe?)
- Can the system recover? (How long? What's entailed?)
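
If test sessions are instrumented, the error rate and recoverability described above can be summarized directly. A minimal sketch with hypothetical session-record fields:

```python
# Illustrative sketch: summarize error rate and recovery across test
# sessions. The session-record fields are assumptions for the example.

def error_summary(sessions):
    """sessions: list of dicts with 'errors' (count of errors made),
    'recovered' (errors the user recovered from), 'tasks' (attempted)."""
    total_errors = sum(s["errors"] for s in sessions)
    total_tasks = sum(s["tasks"] for s in sessions)
    recovered = sum(s["recovered"] for s in sessions)
    return {
        "errors_per_task": total_errors / total_tasks,
        # If no errors occurred, treat recovery as perfect.
        "recovery_rate": recovered / total_errors if total_errors else 1.0,
    }

sessions = [
    {"errors": 3, "recovered": 2, "tasks": 10},
    {"errors": 1, "recovered": 1, "tasks": 10},
]
print(error_summary(sessions))  # errors_per_task=0.2, recovery_rate=0.75
```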

Satisfaction
A system should be pleasant to use, so users are subjectively satisfied with using it; they like it. Gets at issues of look & feel.

How to measure satisfaction?
Most commonly measured by capturing user opinion through:
- focus groups
- interviews
- questionnaires
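
As a concrete questionnaire example: the references at the end point to QUIS, but the widely published System Usability Scale (SUS), offered here instead because its scoring formula is simple, is scored as follows: odd items contribute (rating - 1), even items contribute (5 - rating), and the sum is scaled by 2.5 to a 0-100 score. A minimal sketch:

```python
# Illustrative sketch: score one respondent on the System Usability
# Scale (SUS). Responses are 1-5 ratings for the ten standard items.

def sus_score(responses):
    """responses: list of ten 1-5 ratings, item 1 first."""
    assert len(responses) == 10
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i=0 is item 1 (an odd item)
        for i, r in enumerate(responses)
    )
    return total * 2.5  # scale 0-40 raw sum to 0-100

print(sus_score([4, 2, 4, 1, 5, 2, 5, 1, 4, 2]))  # -> 85.0
```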

Why conduct usability evaluations?
- Ensure user-centered design
- Increase user satisfaction
- Increase user productivity
- Decrease costs of redesign later, after the system is implemented
- Others

When to conduct usability evaluations?
- Prior to system procurement
- During in-house software design
- When developing training programs, policies & procedures
- In conjunction with FMEA, RCA, or other type of error analysis
- When designing work environments
- To replicate the manufacturer's testing
- Others

Common usability evaluation methods
- Heuristic evaluation: using common heuristic criteria
- Simulation, role playing: generally in a lab
- Field study/observation: actual use
- Rapid reflection: evaluators convene and, through group process, develop a consensus of reactions

Heuristic evaluation...
An inspection method requiring 3 to 5 evaluators, who apply widely recognized heuristics to provide an overview and identify system difficulties. Finish with an assessment of the risk, severity, and priority of each violation.

Heuristic evaluations -- common scoring definitions: RISK
Potential risk for patient care:
- none
- low = moderate event (rare instances of increased length of stay or level of care)
- medium = major event (permanent loss of bodily functioning: sensory, motor, physiologic, intellectual)
- high = catastrophic event: death or permanent loss of function (transfusion reaction, wrong-side surgery)

Heuristic evaluations -- common scoring definitions: SEVERITY
Severity of problems to users:
- low = user delayed/annoyed
- medium = user accomplishes task with difficulty; frustrated
- high = user unable to accomplish task

Heuristic evaluations -- common scoring definitions: PRIORITY
Priority for redesign:
- low = problem should be fixed when resources are available
- medium = problem should be fixed
- high = problem must be fixed
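
These three scales lend themselves to a simple finding record and redesign worklist. A minimal sketch using the risk, severity & priority levels defined above; the field names and sort order are assumptions, not a prescribed format:

```python
# Illustrative sketch: record heuristic-evaluation findings with the
# risk / severity / priority levels defined on the preceding slides
# and sort them into a redesign worklist. Field names are assumptions.

from dataclasses import dataclass

LEVELS = {"none": 0, "low": 1, "medium": 2, "high": 3}

@dataclass
class Finding:
    heuristic: str    # e.g. "Feedback", "Minimalist"
    description: str
    risk: str         # none/low/medium/high (potential risk to patient care)
    severity: str     # low/medium/high (impact on the user's task)
    priority: str     # low/medium/high (urgency of redesign)

def worklist(findings):
    """Highest priority first; ties broken by severity, then risk."""
    return sorted(
        findings,
        key=lambda f: (LEVELS[f.priority], LEVELS[f.severity], LEVELS[f.risk]),
        reverse=True,
    )

findings = [
    Finding("Feedback", "No confirmation after order is signed",
            "medium", "medium", "high"),
    Finding("Minimalist", "Cluttered allergy banner", "low", "low", "low"),
]
for f in worklist(findings):
    print(f.priority, f.heuristic, "-", f.description)
```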

Fourteen human-computer usability heuristics ("Nielsen-Shneiderman Heuristics") per Zhang et al., 2003
1. Consistency -- users should not have to wonder whether different words, situations, or actions mean the same thing. Standards & conventions should be followed.

Nielsen-Shneiderman Heuristics, cont'd.
2. Visibility -- users should be informed about what's going on with the system through appropriate feedback & display of information.

Nielsen-Shneiderman Heuristics, cont'd.
3. Match -- the image of the system perceived by the users should match the mental model the users have of the system.

Nielsen-Shneiderman Heuristics, cont'd.
4. Minimalist -- any extraneous information is a distraction & a slow-down.

Nielsen-Shneiderman Heuristics, cont'd.
5. Memory -- users should not be required to memorize a lot of information to carry out tasks.

Nielsen-Shneiderman Heuristics, cont'd.
6. Feedback -- users should be given prompt & informative feedback.

Nielsen-Shneiderman Heuristics, cont'd.
7. Flexibility -- users always learn & users are different. Give users the flexibility to create customizations & shortcuts to accelerate their performance.

Nielsen-Shneiderman Heuristics, cont'd.
8. Message -- messages should be informative enough that users can understand the nature of errors, learn from them & recover from them.

Nielsen-Shneiderman Heuristics, cont'd.
9. Error -- it is always better to design interfaces that prevent errors from happening in the first place.

Nielsen-Shneiderman Heuristics, cont'd.
10. Closure -- every task has a beginning and an end. Users should be clearly notified about the completion of a task.

Nielsen-Shneiderman Heuristics, cont'd.
11. Undo -- users should be allowed to recover from errors. Reversible actions also encourage exploratory learning.

Nielsen-Shneiderman Heuristics, cont'd.
12. Language -- the language used should always be presented in a form understandable by the intended users.

Nielsen-Shneiderman Heuristics, cont'd.
13. Control -- do not give users the impression that they are controlled by the system.

Nielsen-Shneiderman Heuristics, cont'd.
14. Document -- always provide help when needed.

Heuristic evaluation
Issues:
- need to identify evaluators who understand heuristics
- are all or only certain heuristics to be applied?
- clearly define risk, severity & priority scores

Heuristic evaluation, cont'd.
Pros:
- low cost
- efficient
- not difficult to implement
Cons:
- resources necessary
- take caution not to oversimplify

{Go to heuristic evaluation of a CPOE system: IEA PowerPoint}

Simulation...
A means of evaluating user interaction with a product/system/etc. in other than the actual setting &/or workplace.

Simulation
Issues:
- clearly identify the objective (direct impact on design of the simulation)
- identify whom to evaluate: type of provider and/or level of user; recruitment, or assign role playing?
- generally need to develop vignettes/scenarios
- data collection important: video? audio? notes?
- who acts as evaluator(s)?
- need to pilot test prior to beginning

Simulation, cont'd.
Pros:
- provides thorough evaluation opportunity (get-to-the-point evaluation)
- because of repetition, provides an objective means of assessment
- in some instances can also provide JIT training
Cons:
- time- & resource-intense to design
- if using the think-aloud technique, it's far from perfect

Field study/direct observation...
A means of evaluating users' interaction with and use of a system in their work environment while performing the task of interest.

Field study/direct observation
Issues:
- clearly define the objective: it drives data collection & analysis
- ability to observe what you are most interested in
- find observers willing/interested in watching people perform their work
- find users willing to be observed (Hawthorne effect?)
- how/who will collect what is being observed: audio? video? notes?

Field study/direct observation, cont'd.
Pros:
- REAL use of the system is observed
- no need to design scenarios, role playing, etc.
Cons:
- may prove time-consuming (depends on what you want to observe)
- data collection challenging

Rapid reflection...
A means of capturing evaluators' observations & gut reactions to system use.

Rapid reflection
Issues:
- no structure to data collection
- dependent on recall
- groupthink when evaluators convene?
- Hawthorne effect on users

So... what method is best?
- Consider the objectives, constraints, and circumstances of the method and project
- Triangulation (combination) of methods is becoming more common, frequently mixing more quantitative methods with qualitative methods

Results from one study (Kjeldskov et al., 2005)
[Table: distribution of usability problems by degree (critical, serious, cosmetic) across four methods: field (direct observation), lab (simulation), heuristics (criteria-based), and rapid reflection (recall); the counts were not preserved in the transcription.]
Reference: Kjeldskov, J., Graham, C., Pedell, S., Vetere, F., Howard, S., Balbo, S. & Davies, J. (2005). Evaluating the usability of a mobile guide: the influence of location, participants and resources. Behaviour & Information Technology, 24(1), 51-65.

Who to involve as users in usability evaluation (Laxmisan, Malhotra, Keselman, Johnson, Patel, 2005)
- Sharp-end practitioners: focus on clinical & human aspects
  - providers (considerations: type; previous dis/similar experience; level of use: rare, frequent, super users)
- Blunt-end practitioners: focus on device/system, documentation & training
  - engineers
  - administrators

Who to involve as evaluators in usability evaluation
- Individuals with an understanding of usability evaluation techniques
- Individuals who can communicate both ways and serve as an interface between users and designers (i.e., analysts)

Measures / usability benchmark
- Standard against which the testing is analyzed
- Provides a threshold for objective evaluation

Collect measures of...
Performance:
- amount of time
- number of key presses
- measurements
- errors
- post-task need for assistance
- comments
- subjective rating/preference
Process:
- mapping
- use of/reference to manual
- comments
- mode error
- device alarms
- screen displays
- subjective rating/preference

Collect measures of... but what do you already have??
- system data
- incident reports
- other retrospective performance data
- performance data from the manufacturer

User feedback methods
- Video
- Audio (think aloud)
- Focus group
- Questionnaire
- Interview
- Measures...

Capturing user satisfaction or perceptions
Issues:
- design of the data collection instrument is important: users must understand the issue being asked about; the instrument must be unbiased
- information collected may be difficult to prove
- synthesizing & summarizing data: best if it entails both qualitative & quantitative information

User/evaluator feedback
Pros:
- generally easy to capture (once users are willing to provide it)
Cons:
- can be difficult to summarize
- depends on recall and perceptions

Quick review
- Determine objective(s)
- Determine method(s) & design/include necessary materials; this generally drives the setting
- Select whom to involve as users; recruit
- Select whom to involve as evaluators
- Identify what to collect & how to collect it
- Determine means of analyzing data
- Identify time constraints/timeline

CAUTION!!
- Don't try to address everyone's issues; stick to those of importance
- Keep the design clear and simple

Questions??

Usability evaluations to be conducted this afternoon
Conduct a usability evaluation of the eMAR using Nielsen-Shneiderman criteria. {eMAR presentation excerpts from Linda Musser}

eMAR Usability Evaluation
One person ("mock end user") will navigate through the system under the direction of the other person ("usability coordinator"), who will give instructions, provide direction on what the user should do, and record issues identified. Jointly, the two will identify system design issues and assign evaluation heuristics and risk, severity & priority scores. Findings will be shared Friday morning in a feedback session.

References
Berg, M. Patient care information systems & health care work: a sociotechnical approach. International Journal of Medical Informatics, 1999; 55:87-101.
Nielsen, J. Usability Engineering. New York: AP Professional, 1993.
Norman, D.A. The Design of Everyday Things. New York: Doubleday, 1998.
Shneiderman, B. Designing the User Interface: Strategies for Effective Human-Computer Interaction. Reading, MA: Addison Wesley Longman, Inc., 1998.
http://usability.gov/basics/index.html
http://www.stcsig.org/usability/resources/toolkit/toolkit.html
Questionnaire for User Interaction Satisfaction: http://lap.umd.edu/quis/