THE USE OF PARTNERED USABILITY TESTING TO HELP TO IDENTIFY GAPS IN ONLINE WORK FLOW


Dianne Davis, Fishbone Interactive
Gordon Tait, Department of Surgery, University of Toronto
Cindy Bruce-Barrett, Strategic Projects, The Hospital for Sick Children

The Hospital for Sick Children developed a web-based referral process to replace its paper-based system for the acceptance and management of referrals from community pediatricians and other health care professionals. Partnered usability testing, involving an external usability consultant and an internal project manager (a nurse at the hospital) familiar with both the referral process and the prototype functionality, was used to assess the prototype before the final development stage. The partnered usability sessions were a unique way to approach the usability testing of a very complex application that must accommodate multiple user groups as well as multiple, interlinked workflows. This approach was very successful in identifying ways in which the application did not capture some of the more fine-grained details of workflow. The detection of these subtleties would not have been possible without the participation of the internal project manager, who was familiar with the minutiae of the paper referral process in addition to the detailed functions of the prototype and its missing functionality. Details of the roles and steps involved in partnered usability testing are discussed, as well as keys to successful implementation.

INTRODUCTION

The Hospital for Sick Children (Sick Kids) in Toronto, Canada, currently uses a paper-based referral system for the acceptance and management of referrals from community pediatricians and other health care professionals.
A review of ambulatory services at the Hospital found that the paper-based referral system produced a significant number of inappropriate and incomplete referrals, was often inefficient, and occasionally resulted in long delays in processing referrals or in referrals being misplaced. The Hospital decided to implement a web-based referral process to help solve some of these problems, to standardize the acceptance and management of referrals from community pediatricians and other health care professionals, and to provide a measure of wait times for patient appointments by referral priority. A prototype of the Ambulatory Referral Management System (ARMS) was developed by an internal development team after an extensive requirements-gathering phase. To assess the prototype before completing the final development phase, the team employed usability testing both to capture usability issues related to navigation and interface, and to identify gaps where the online system failed to mirror the capabilities and intricacies of the current paper-based referral process. To provide an objective assessment of the prototype, a decision was made to use a partnered approach involving an external usability consultant and an internal project manager familiar with the referral process as well as the prototype functionality. In this paper we describe how partnered usability testing was used to identify gaps in the online workflow. In addition, the benefits and pitfalls of the process, as well as keys to successful implementation, are discussed.

Challenges

The creation of this online referral system was complex and challenging for developers. The main challenge when converting any paper process to an

electronic one is that it must retain as many of the inherent benefits of paper (i.e., flexibility, mobility, browseability) as possible. Paper is "handlable, manipulable, portable, and dismantlable" (Luff & Heath, 1998, p. 307), and the electronic referral system must be able to support these capabilities in the context of the user's workflow. A second challenge, specific to this system, was that the processing workflow varied from department to department. A third challenge was that the system had to accommodate the goals and tasks of very different user groups (e.g., physicians, nurses, clerks). Software applications (e.g., word processing programs) often have a variety of user groups; however, they all typically use the application for similar purposes. In this case, the same application was to be used by a variety of different user groups, both internally and externally, for a variety of different purposes. For example, internal physicians needed to use the system to review new cases and approve their acceptance. In contrast, referring physicians (both internal and external) would use the system to review current guidelines, submit referrals, and check on referral status. The final challenge was that not only did the system have to accommodate different user goals and intentions, it also had to allow interlinking and handing off between the various user groups as they worked through the steps of the referral management process on the same referral. For example, a clerk could interact with a single referral at three or more different points during the referral management process. They might first process the referral as an incoming fax and assign it to a reviewer. They would then need to see it a second time, after the internal physician reviewed it, to track down information from the referring physician.
Finally, the clerk would see the referral a third time, after the physician accepted it, to book the patient. During each stage in the process, the various user groups act on the same referral, communicating their actions to the other user groups so that they can perform their respective roles. These challenges are relevant not only to developers but also to those evaluating the usability of the system. Usability assessment must be able to determine to what extent the system has captured the complexities of the current paper-based system. For example, it must be able to assess whether the system accommodates the inherent benefits of paper, as well as the goals, tasks, and interlinking of different user groups. Assessment in this case would be particularly challenging, as the paper workflow varies from department to department.

PRACTICE INNOVATION: PARTNERED USABILITY TESTING

Traditional usability testing, as described in essential texts by authors such as Rubin (1994) and Nielsen (1993), usually involves an objective test facilitator from outside the team to conduct test sessions. Rubin notes that it is very difficult for individuals to remain objective when conducting a usability test of their own product (Rubin, 1994). As a result, human factors or interface design specialists, who have grounding in the basics of usability engineering, are often hired to conduct the testing. Partnered usability testing is somewhat different from traditional user testing methods. In partnered usability testing, an individual from the project team is brought into the testing process, not as an observer but as an additional test monitor. While the usability specialist takes the role of the traditional test monitor, leading the sessions, the partner acts as a sidekick, probing subjects when necessary and interjecting details of missing functionality.
The usability specialist has the responsibility of interpreting subjects' actions and how they relate to the usability issues of the design, just as in traditional tests (Nielsen & Mack, 1994). Together, however, the two testers are responsible for assessing how and where the application fails to meet the requirements of users' workflow. Due to the complexities involved in the multiple paper workflows (i.e., variety of user types; variety of clinic settings), and the resulting complexity of the online prototype, an external usability

consultant was employed to partner with an internal employee to carry out usability testing with representative stakeholders. The role of the usability consultant was to establish basic test scenarios and capture high-level issues related to workflow. However, detecting some of the subtleties of the complex workflow would be beyond the reach of the usability consultant, as it would have required spending extensive time becoming familiar with the variations in the referral process for each user group (i.e., clinic) and user type. As the application being tested was a prototype, it was also crucial that an individual very familiar with the application be available to talk users through specific aspects of the prototype's missing functionality. An internal employee, the ARMS project manager, a nurse at the hospital, was chosen to partner with the usability specialist in the user tests. The ARMS project manager was a particularly good partner because she was familiar with the minutiae of the paper referral process for a variety of different roles (e.g., clerk) and with the detailed functionality of the prototype.

Process of Partnership Testing

The partnering process began long before the first test session. The first interaction involved the usability consultant providing the ARMS project manager with a training session on usability testing. During this session the usability consultant focused on the goals and logistics involved in user testing. In addition, she provided the project manager with tips on running a test session, to ensure the project manager understood how to enable rather than lead subjects. Testing was then conducted with eight subjects from the three main audiences of the application: (1) clerks, (2) nurses, and (3) physicians from the Neurology, Hematology, and Orthopaedic clinics at the hospital. Traditional talk-aloud techniques were used in the sessions.
Study participants worked through a series of task-oriented usage scenarios (e.g., "Process a new fax that has arrived" or "Review a new referral"), one-on-one or with a colleague, talking aloud during the process. During the first two test sessions, the project manager played the role of observer, occasionally jumping in to clarify the functionality of the prototype for subjects. The usability consultant and the project manager then collaborated on the remaining sessions. In general, the usability specialist led the sessions, with the project manager intermittently interjecting to probe the specifics of the users' workflow and to offer clarification on the functionality of the prototype.

FINDINGS

Partnered usability testing was helpful in uncovering a number of key findings. The pair found that, overall, the system was successful in providing the various clinics with flexibility, accommodating a great deal of variability in their referral management workflow by allowing them to mirror many of the idiosyncratic steps in their current processing of new referrals. The system was also successful in accommodating three entirely different user groups who use the application in very different but interlinked ways. While users were able to follow the basic flow of their current work practice and to access the things they needed to do their job, partnered usability testing was integral in uncovering the specific ways in which the application did not capture some of the fine-grained details of use, particularly those relevant to the clerks' workflow. While some usability issues were also found, such as the placement of buttons or the clarity of text, the key findings related to these more fine-grained details of workflow. For example, the testers discovered that the clerks and nurses needed the application to mirror what they can currently accomplish with paper with respect to urgent cases.
Currently, when an urgent review is needed, the clerks walk the paper referral over to the physician for review. In addition, the testers discovered that the clerks needed something that would mirror the "Problem binder" they use to hold all the active referrals they are working on while waiting for missing information. The Problem binder allows them to quickly view a list of the files they need to act on. Finally, the testers determined that clinic staff need a place in the system to record notes on

communications with family members (e.g., "Left a voicemail message regarding suggested appointment time") and the family physician (e.g., expected date for test results). They noted that the booking process is often not just one event but can be an extended process, with a number of steps or interactions between booking staff, physicians, parents, and interpreters, all of which need to be tracked.

DISCUSSION

The partnered usability sessions were a unique way to approach the usability testing of a very complex project with multiple user groups. The advantage of this approach is that it combines the objectivity and skills of an independent usability consultant with the detailed content and process knowledge of a member of the project team. This partnership was integral in uncovering gaps in the more fine-grained aspects of the users' workflow that would not have been found without the probes of someone very familiar with the intricacies of the current workflow. The ARMS project manager was able to push subjects to reflect on some of the subtleties of their work process and to probe whether they were able to perform all aspects of their workflow with the current prototype. For example, her probes were crucial in uncovering the need for an online equivalent to the Problem binder, as well as the need for the online system to allow clerks to record the ongoing details related to booking patients. In addition, partnership with a member of the project team was integral in dealing with the limited functionality of the prototype. Only someone intimately familiar with the prototype could supply relevant information about expected functionality to aid users in realistically completing their tasks. The ARMS project manager was able to supply enough detail about missing functionality that subjects could respond knowledgeably as to whether their workflow was being adequately mirrored.
Comparison to Other Methods of Assessment

The partnered approach offers advantages over other traditional usability assessment methods described in the literature, such as heuristic evaluation and collaborative usability inspection. Traditional heuristic evaluation involves multiple usability specialists assessing an interface based on their knowledge of usability principles. The disadvantage of this approach is that it is difficult to use it to assess how well a system captures a complex, domain-specific workflow. While heuristic evaluation has been used by Nielsen (Nielsen & Mack, 1994) to assess a domain-specific interface, the assessment was limited to a specific scenario, in which evaluators had to first undergo training to better understand the domain. With the need to assess multiple scenarios for multiple user groups, this approach could become quite unwieldy, as it would require multiple and extensive training sessions for each evaluator. While some of the benefits of the shared perspective of a usability expert and a domain expert may be attained with collaborative usability inspection, organizations often do not have the staff or time to run these sessions. A collaborative usability inspection involves the systematic examination of an interface by a team of four or more participants, including a lead reviewer, a recorder, developers, a usability specialist, and a user or domain expert. Typically the inspections are done in two passes: a usage inspection covering representative tasks, followed by an interface inspection. Similar to user test sessions, the aim of a usability inspection is to identify usability problems so that the final product better supports the work of end users. However, the challenge in using the collaborative usability inspection approach is in freeing up the time of so many internal team members to participate in the sessions.
Inspections of large or particularly complex systems may take several days to complete (Constantine & Lockwood, 2000). The time required increases when there are multiple user groups, each using the application to perform different tasks. Despite their interest and willingness to participate, developers and internal staff are often able to attend only one or two sessions, as it is rare that organizations can free them for an extensive period of time. Partnered user test sessions provide a lean alternative, as only one internal team member is required to partner with the usability

specialist. Similarly, brainstorming of results is limited to these two stakeholders rather than an extensive team.

When to Use Partnered Usability Testing

Partnered usability testing is particularly appropriate when the application or prototype under review has a complex workflow that cannot be easily learned. Often these workflow processes involve domain knowledge and/or the involvement of multiple user groups using the application in different ways. As a result, it is not realistic to bring an objective tester, unfamiliar with the domain, up to speed on the application. Partnered usability testing can also be particularly helpful when the prototype being tested does not yet contain extensive functionality.

Keys to Successful Partnered Usability Testing

While the success of partnered usability testing relies in part on knowing when to employ it, it also relies on a successful partnership. The keys to a successful partnership are twofold. First, an appropriate internal partner must be selected who is familiar with the internal workflow but somewhat objective with respect to the application's interface. Ideally, the person should not have been involved in the screen design. The internal partner could be someone from the development team responsible for elements of the project such as requirements analysis; ideally, though, it should be someone intimately familiar with the work process who is also familiar with the functionality of the prototype and the intended functions of the future system. In this case, the nurse, acting as the project manager, was ideal: she was familiar with the current paper system from a number of different perspectives, and she was familiar with the online prototype from having conducted numerous demonstrations of the application. As her role within the development team was to track comments and critiques of the system, she was sufficiently independent of the design process to be objective during the tests.
The second key to success is that the partners work together to establish a comfortable rhythm in interleaving their questions. It is important that one partner, most often the usability expert, take the lead, but that there is openness in conceding that lead when appropriate. Ideally, the partners should work through two or three trial sessions to establish an appropriate working rhythm prior to beginning the actual tests. This also gives the internal partner some experience in running user tests and provides an opportunity for the usability specialist to give the internal partner feedback, to ensure that she enables but does not lead users. Provided these conditions are met, the partnered usability testing process can lead to very rich and detailed data collection when testing applications involving extensive and complex workflows.

REFERENCES

Constantine, L., & Lockwood, L. (2000). Usability inspection techniques. Course materials.

Luff, P., & Heath, C. (1998). Mobility in collaboration. In Proceedings of CSCW '98 (pp. 305-314). Seattle, WA.

Nielsen, J. (1993). Usability engineering. Boston, MA: AP Professional.

Nielsen, J. (1994). Heuristic evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability inspection methods (pp. 25-76). Toronto: John Wiley & Sons.

Rubin, J. (1994). Handbook of usability testing. Toronto: John Wiley & Sons.