EHR Usability Test Report of SRS EHR v10
Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports

SRS EHR v10
Date of Usability Test: 5/18/2017
Date of Report: 5/26/2017
Report Prepared By: Harika Prabhakar, Product Owner, SRS Health, Chestnut Ridge Road, Montvale, NJ

Table of Contents

EXECUTIVE SUMMARY
INTRODUCTION
METHOD
    SRS User Centered Design Process
    PARTICIPANTS
    STUDY DESIGN
    TASKS
    PROCEDURES
    TEST LOCATION
    TEST ENVIRONMENT
    TEST FORMS AND TOOLS
    PARTICIPANT INSTRUCTIONS
    USABILITY METRICS
    DATA SCORING
RESULTS
    DATA ANALYSIS AND REPORTING
    DISCUSSION OF THE FINDINGS
        EFFECTIVENESS
        Risk Prone Errors
        EFFICIENCY
        SATISFACTION
    MAJOR FINDINGS
    AREAS FOR IMPROVEMENT
APPENDICES
    Appendix 1: SAMPLE RECRUITING SCREENER
    Appendix 2: PARTICIPANT DEMOGRAPHICS
    Appendix 3: NON-DISCLOSURE AGREEMENT AND INFORMED CONSENT FORM
    Appendix 4: EXAMPLE MODERATOR'S GUIDE
    Appendix 5: SYSTEM USABILITY SCALE QUESTIONNAIRE
EXECUTIVE SUMMARY

A usability test of the SRS EHR v10 Ambulatory EHR was conducted on May 18, 2017 in the offices of Princeton Eye Group, 419 North Harrison Street, Princeton, NJ, and the Orthopedic Institute of Central Jersey, 2315 Route 34 S, Manasquan, NJ, by SRSHealth. The purpose of this test was to test and validate the usability of the current user interface, and provide evidence of usability in the SRS EHR, the EHR Under Test (EHRUT). During the usability test, 10 clinical staff matching the target demographic criteria served as participants and used the EHRUT in simulated but representative tasks.

This study collected performance data on one task typically conducted on an EHR:

(a)(14) Implantable Device List

During the 10-minute one-on-one usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix 3); they were instructed that they could withdraw at any time. Participants had prior experience with the EHR. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger, recorded user performance data on paper and electronically. The administrator did not give the participants assistance in how to complete the tasks. Participant screens were recorded for subsequent analysis.

The following types of data were collected for each participant:

- Number of tasks successfully completed within the allotted time without assistance
- Time to complete the tasks
- Number and types of errors
- Path deviations
- Participant's verbalizations
- Participant's satisfaction ratings of the system

All participant data was de-identified; no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire.
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the SRS EHR.
Task | N | Task Success (%) | Path Deviation (Observed/Optimal) | Task Time (seconds) | Task Rating (1 = Easy)
Searching for Implantable Device of UDI format HIBCC | 10 | 80% | | |
Adding Implantable Device through Save&Close (UDI format HIBCC) | 10 | 80% | | |
Viewing Implantable Device Activity | 10 | | | |
Editing Implantable Device | 10 | | | |
Reset Implantable Device | 10 | | | |
Adding activity to Existing Implantable Device | 10 | 70% | | |

The results from the System Usability Scale scored the subjective satisfaction with the system based on performance with these tasks to be:

In addition to the performance data, the following qualitative observations were made:

- Major findings

Most participants found it easy to view and update the Implantable Device List information. There were minor issues with the initial searching for the device, but as users progressed through the task they found the feature easy to use and completed tasks with ease. From a usability perspective, the screen would benefit from trimming extra spaces before and after the device UDI to avoid running into errors. In addition, the screen would benefit from having the necessary interfaces to automatically pre-populate the screen with the device data. Lastly, the screen would benefit from visual cues alerting the user as to which data elements are missing and why the user is not able to continue with the device entry/update.

- Areas for improvement

As stated above, the order entry UI contains some design elements which hindered usability. For example, when saving a device the user is required to enter the state and date. From a usability standpoint, these fields should have a visual indicator following a standard UI paradigm to indicate required fields. Making the Implantable Device List interoperable, to reduce data entry, would improve usability.

INTRODUCTION

The EHRUT tested for this study was SRS EHR v10.
It is designed to present medical information to healthcare providers in an outpatient ambulatory environment. The SRS EHR consists of various modules
and interfaces designed to capture, display, and modify patient clinical data. The usability testing attempted to represent realistic exercises and conditions. The purpose of this study was to test and validate the usability of the current user interface, and provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction, such as time on task, user satisfaction, and deviation from optimal paths, were captured during the usability testing.

METHOD

SRS User Centered Design Process

SRSHealth designed and implemented its own UCD process based upon the ISO industry standard. The SRS UCD process follows the same principles of human-centered design and fits into our overall quality management system (QMS). The system involves users in the overall design and software development process. The design team includes individuals with skills spanning multiple disciplines and with different perspectives. The team works to address the entire user experience through a deep understanding of the tasks and workflows required by end users. The process is iterative, and requirements are continually updated to better address the context of use, increase usability, and reduce user error. The main steps of the process are:

- Understanding/stating the context of use for the function or module: Business requirements are gathered from internal and client stakeholders. These stakeholders are kept up to date with product developments through Feature team meetings.
- Creating design solutions: Business Analysts work with Feature teams to design several possible solutions, and through further analysis with stakeholders the solution is refined.
- Creating user requirements: Business Analysts create full requirements documents based on feedback from Feature teams.
- Evaluating the solution: Solution prototypes are created and vetted against the original understanding of the context of use.
- Performing user-centered evaluations: Formal summative user testing is performed and the analysis is sent back to the Feature teams. The results are then used to drive future iterations of the product.

The following tasks and modules were developed based on the SRS UCD design process:

(a)(14) Implantable Device List
PARTICIPANTS

The testing methods were in accordance with the SRS User Centered Design Process. A total of 10 participants were tested on the EHRUT. Participants in the study included medical assistants and technicians from the orthopedic and ophthalmology specialties. In addition, participants had no direct connection to the development of, or the organization producing, the EHRUT. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received.

For the test purposes, end-user characteristics were identified and translated into a recruitment screener used to solicit potential participants; an example of a screener is provided in Appendix 1. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience, and user needs for assistive technology. Participant names were replaced with participant IDs randomly generated from the demographics collection tool so that an individual's data cannot be tied back to individual identities.

ID | Gender | Age | Occupation/Role | EHR Experience (Years) | IDL Module Experience
 | Female | | Technician | 10+ | None
 | Female | | Manager | 10+ | None
 | Female | | Technician | 3-5 | None
 | Female | | Technician | 6-9 | None
 | Female | | Medical Assistant | 10+ | None
 | Female | | Medical Assistant | 3-5 | None
 | Female | | Medical Assistant | 6-9 | None
 | Female | | Medical Assistant | 3-5 | None
 | Female | | X-Ray Technician | 3-5 | None
 | Male | | Medical Assistant | 6-9 | None

Ten participants (matching the demographics in the section on Participants) were recruited and 10 participated in the usability test. Participants were scheduled for 10-minute sessions, with a 5-minute session for debrief by the administrator and data logger, and to reset systems to proper test conditions.
A spreadsheet was used to keep track of the participants, and included each participant's demographic characteristics.

STUDY DESIGN
Overall, the objective of this test was to uncover areas where the application performed well (that is, effectively, efficiently, and with satisfaction) and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a means to identify areas where improvements must be made.

During the usability test, participants interacted with one EHR. Each participant used the system in the same room at the two testing locations, and was provided with the same instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

- Number of tasks successfully completed within the allotted time without assistance
- Time to complete the tasks
- Number and types of errors
- Path deviations
- Participant's verbalizations (comments)
- Participant's satisfaction ratings of the system

Additional information about the various measures can be found in Section 3.9 on Usability Metrics.

TASKS

A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

1. Implantable Device List:
   a. Searching for Implantable Device of UDI format HIBCC
   b. Adding Implantable Device through Save&Close (UDI format HIBCC)
   c. Viewing Implantable Device Activity
   d. Editing Implantable Device
   e. Reset Implantable Device
   f. Adding activity to existing Implantable Device

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.

PROCEDURES

Upon arrival, participants were greeted; their identity was verified and matched with a name on the participant schedule. Participants were then assigned a participant ID.
Each participant reviewed and signed an informed consent and release form (see Appendix 3). A representative from the test team witnessed the participant's signature. To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments. A second person served as the data logger and took notes on task success, path deviations, number and type of errors, and comments.

Participants were instructed to perform the tasks (see specific instructions below):

- As quickly as possible, making as few errors as possible
- Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use
- Without using a think-aloud technique

For each task, the participants were given a written copy of the task. Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Scoring is discussed below in Section 3.9. Following the session, the administrator gave the participant the post-test questionnaire (the System Usability Scale, see Appendix 5) and thanked each individual for their participation.

Participants' demographic information, task success rate, time on task, errors, deviations, verbal responses, and post-test questionnaire responses were recorded into a spreadsheet.

TEST LOCATION

The tests were performed in the exam room where the EHRUT would typically be deployed and used in production. Only the participant, the administrator, and the data logger were in the test room. To ensure that the environment was comfortable for users, noise levels were kept to a minimum, with the ambient temperature within a normal range. All of the safety instructions and evacuation procedures were valid, in place, and visible to the participants.

TEST ENVIRONMENT

The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in an examination room on a computer where interactions with the EHRUT would typically take place in real-world office scenarios. For testing, the computer was a Windows PC running Windows 10 with a standard mouse and keyboard. The SRS EHR v10 system was viewed on a 22- or 14-inch monitor. The application was set up by SRSHealth according to the standard operating procedure for a client/server installation. The application itself was running on a Windows Server 2008 machine using a Demo/Training SQL database over a LAN connection.
Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size).

TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used, including:

1. Informed Consent
2. Moderator's Guide
3. Post-test Questionnaire

Examples of these documents can be found in Appendices 3-5, respectively. The Moderator's Guide was devised so as to be able to capture the required data. The participant's interaction with the EHRUT was captured and recorded digitally with GoToMeeting and GoToTraining software running on the test machine.
PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each participant (also see the full Moderator's Guide in Appendix 4):

"Thank you for participating in this study. Our session today will last about 10 minutes. During that time you will take a look at an early prototype of SRS EHR v10. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Please do not do anything more than asked. If you get lost or have difficulty, I cannot help you with anything to do with the system. Please save your detailed comments until the end of a task, or the end of the session as a whole, when we can discuss freely. Since this is a test system, some of the data may not make sense, as it is placeholder data. We are recording the screen of our session today for our internal use only. Do you have any questions or concerns?"

Following the procedural instructions, participants were shown the EHR. Once this task was complete, the administrator gave the following instructions:

"For each task, I will read the description to you and say 'Begin.' At that point, please perform the task and say 'Done' once you believe you have successfully completed the task. I would like to request that you not talk aloud or verbalize while you are doing the tasks. I will ask you your impressions about the task once you are done."

Participants were then given six tasks to complete. Tasks are listed in the Moderator's Guide in Appendix 4.

USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users.
The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of SRS EHR v10, by measuring participant success rates and errors
2. Efficiency of SRS EHR v10, by measuring the average task time and path deviations
3. Satisfaction with SRS EHR v10, by measuring ease-of-use ratings

DATA SCORING

The following table (Table 1) details how tasks were scored, errors evaluated, and the time data analyzed.

Effectiveness: Task Success
A task was counted as a Success if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis.
The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times were recorded for successes. Observed task time divided by the optimal time for each task is a measure of optimal efficiency and was recorded. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator's Guide were defined by taking multiple measures of optimal performance and multiplying each time by a factor of 1.5. This factor allows some time buffer, as participants are presumably not trained to expert performance. Thus, if expert performance on a task was [x] seconds, then the allotted task time was [x * 1.5] seconds. The ratio of reported time to optimal time was aggregated across tasks and reported with the mean.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a Failure. No task times were taken for errors. On a qualitative level, an enumeration of errors and error types was collected and is described in the narrative section below.

Efficiency: Task Deviations
The participant's path through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, opened an incorrect module, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. Optimal paths were recorded in the Moderator's Guide. Task deviations are discussed further in the qualitative sections.
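The success-rate, time-buffer, and path-deviation rules described above amount to simple arithmetic. The following Python sketch illustrates them; the `Attempt` record and all sample values are hypothetical illustrations, not data from this study.

```python
from dataclasses import dataclass

@dataclass
class Attempt:
    success: bool        # correct outcome, unassisted, within the allotted time
    time_seconds: float  # elapsed time (only meaningful for successes)
    observed_steps: int  # steps the participant actually took
    optimal_steps: int   # steps in the optimal path from the moderator's guide

def success_rate(attempts):
    """Successes divided by total attempts, expressed as a percentage."""
    return 100.0 * sum(a.success for a in attempts) / len(attempts)

def mean_path_deviation(attempts):
    """Mean ratio of observed steps to optimal steps (1.0 means no deviation)."""
    return sum(a.observed_steps / a.optimal_steps for a in attempts) / len(attempts)

def allotted_time(expert_time_seconds):
    """Target task time: the expert benchmark multiplied by the 1.5 buffer."""
    return expert_time_seconds * 1.5

# Hypothetical data for one task attempted by three participants.
attempts = [
    Attempt(success=True, time_seconds=42.0, observed_steps=6, optimal_steps=5),
    Attempt(success=True, time_seconds=38.0, observed_steps=5, optimal_steps=5),
    Attempt(success=False, time_seconds=0.0, observed_steps=9, optimal_steps=5),  # timed out
]

print(round(success_rate(attempts), 1))        # 66.7
print(round(mean_path_deviation(attempts), 2)) # 1.33
print(allotted_time(30.0))                     # 45.0
```

Note that, per the scoring rules, task times for failed attempts would be excluded from any average-time calculation; only the deviation ratio is still observable for a timed-out attempt.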
Efficiency: Task Time
Each task was timed from when the administrator said "Begin" until the participant said "Done." If he or she failed to say "Done," the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task.

Satisfaction: Task Rating
Participants' subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Easy) to 5 (Very Difficult). These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 3.3 or below. To measure participants' confidence in and likeability of SRS EHR v10 overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included "I thought the system was easy to use" and "I would imagine that most people would learn to use this system very quickly." See the full System Usability Scale questionnaire in Appendix 5.
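The report does not spell out how the SUS questionnaire is converted into a single score, so as background, the standard SUS scoring convention (which this kind of CIF report typically follows) is: each of the ten 1-5 responses contributes 0-4 points (response minus 1 for the positively worded odd items, 5 minus response for the negatively worded even items), and the summed contributions are scaled by 2.5 to yield a 0-100 score. A minimal sketch, with hypothetical responses:

```python
def sus_score(responses):
    """Standard System Usability Scale score (0-100) from ten 1-5 responses."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses on a 1-5 scale")
    contributions = [
        (r - 1) if i % 2 == 1 else (5 - r)  # odd items positive, even items negative
        for i, r in enumerate(responses, start=1)
    ]
    return sum(contributions) * 2.5

# Hypothetical participant who agrees with every positively worded item
# and disagrees with every negatively worded one: the maximum score.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
# Neutral responses across the board land at the midpoint.
print(sus_score([3] * 10))  # 50.0
```

This 0-100 scale is what the interpretation bands quoted later in the report (under 60 poor, over 80 above average) refer to; a SUS score is not a percentage of correct answers.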
Table 1. Details of how observed data were scored.

RESULTS

DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses.

There were a few minor issues which may have affected data collection. The task descriptions and the terminology used could have contributed to some hesitation on the part of the user when completing tasks. Efforts were made to use descriptions and terminology which would be familiar to users; however, some users still may have experienced some confusion. As part of testing, users were presented with entirely new data entry screens. Test participants had never used these screens to create and update implantable device list information, so it is understandable that users would have some initial difficulty navigating unfamiliar screens.

The usability testing results for the EHRUT are detailed below (see Table 2).

Task | N | Task Success (%) | Path Deviation (Observed/Optimal) | Task Time (seconds) | Task Rating (1 = Easy)
Searching for Implantable Device of UDI format HIBCC | 10 | 80% | | |
Adding Implantable Device through Save&Close (UDI format HIBCC) | 10 | 80% | | |
Viewing Implantable Device Activity | 10 | | | |
Editing Implantable Device | 10 | | | |
Reset Implantable Device | 10 | | | |
Adding activity to Existing Implantable Device | 10 | 70% | | |

Table 2. Usability Testing Results
The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system based on performance with these tasks to be:

Broadly interpreted, scores under 60 represent systems with poor usability; scores over 80 would be considered above average.[1]

DISCUSSION OF THE FINDINGS

EFFECTIVENESS

Based on the findings, most users were quite effective at using the EHRUT. The average number of failures per task was 2, but the failures were primarily due to the user timing out before the task could be completed. Some users spent additional time making sure that they had typed information correctly, which contributed to timing out on the task; other users verified that they had completed all intended steps before completing the last step of the task, ultimately resulting in failure on the task. However, making certain that data is entered correctly contributes to an overall goal of this project: patient safety. Most test participants were effective and showed very few, if any, path deviations. The duration taken to complete the tasks can be attributed to the fact that this is a new feature of the application on which no one has been trained. It would be interesting to measure effectiveness once again after training has been completed, once the EHRUT has been deployed to the customers, and to observe whether users continue to follow an optimal path.

Risk Prone Errors

When discussing errors, it is important to note which tasks are more likely to result in errors and what types of errors are most likely to cause patient safety issues. Tasks which do not alter the patient's record but simply display information to the user are less likely to lead to errors. Viewing device details and reviewing the patient's device history does not involve the user entering or altering data, so the chance of error is very low. The system will only allow ancillary information associated with the device to be updated.
The actual device information itself cannot be altered and is saved as retrieved from the FDA database, so there is no chance of an incorrect device being attached to a patient record. Tasks which require the user to enter data are more prone to error. To that end, every effort should be made to ensure that the user can clearly discern what has been selected on the screen and that they are given an opportunity to double-check and cancel/reset actions before they are committed. Below is a prioritized list of tasks in order of the associated risk, with a rating of the risk (1-3, where 1 is a low risk and 3 is a high risk).

Task | Risk Level
Searching for Implantable Device of UDI format HIBCC | 3
Adding Implantable Device through Save&Close (UDI format HIBCC) | 1
Viewing Implantable Device Activity | 1
Editing Implantable Device | 1
Reset Implantable Device | 1
Adding activity to Existing Implantable Device | 1

[1] See Tullis, T. & Albert, W. (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufmann (p. 149).
EFFICIENCY

A few users were unable to complete the tasks in what was deemed a reasonable amount of time. Users who failed to complete a task within the maximum amount of time, as determined for each task prior to testing, had their data excluded from the efficiency measures. The average deviation ratios (observed path / optimal path) in the group tested were close to 1.0 for users who could complete the task. Even users who were unable to complete a task in time were generally on the correct path and rarely deviated into a different area of the software. Thus, we conclude that users were relatively efficient when completing the tasks set before them.

SATISFACTION

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system based on performance with these tasks to be:

Verbal feedback as well as task ratings indicate that there is a high level of comfort and overall satisfaction with the system. Specifically, users stated that the system is simple and intuitive, user friendly, and organized logically. These statements, along with other participant verbalizations, suggest a high level of usability within the system. There are, of course, some areas for improvement, which are discussed below, but even so the average task ease-of-use rating was between 1.8 and 2.3 for each task. As set forth in the Data Scoring section, average ratings for systems judged easy to use should be 3.3 or below. Hence, a major finding of this testing was that the modules tested are very easy to use.

MAJOR FINDINGS

Most of the participants were very familiar with the EHRUT, so very few users had difficulty with the tasks. Most users agreed that they rarely had trouble using the module. Most participants found it easy to view and update the Implantable Device List information. There were minor issues with the initial searching for the device.
But as users progressed through the task they found the feature easy to use and completed tasks with ease. From a usability perspective, the screen would benefit from visual cues alerting the user as to which data elements are missing and why the user is not able to continue with the device entry/update.
AREAS FOR IMPROVEMENT

Generally, the feedback was very positive, but there are some areas where usability could be improved. One major takeaway from this process is that gathering user feedback earlier in the development process, as well as working with less advanced/beginner users rather than super users, could improve usability and overall safety. As stated above, the order entry UI contains some design elements which hindered usability. For example, when saving a device the user is required to enter the state and date. From a usability standpoint, these fields should have a visual indicator following a standard UI paradigm to indicate required fields. Making the Implantable Device List interoperable, to reduce data entry, would improve usability.
APPENDICES

The following appendices include supplemental data for this usability test report. Following is a list of the appendices provided:

1: Sample Recruiting Screener
2: Participant Demographics
3: Non-Disclosure Agreement (NDA) and Informed Consent Form
4: Moderator's Guide
5: System Usability Scale Questionnaire
Appendix 1: SAMPLE RECRUITING SCREENER

The purpose of a screener is to ensure that the participants selected represent the target user population as closely as possible. Rather than reaching out to individual participants, SRS opted to target the Practice Administrator and rely upon them to select qualified participants based on the functionality being tested.

Recruiting Script:

"Hello, my name is ____, and I am calling from SRSHealth. We are recruiting participants for a usability study on the SRS EHR. Would you be willing to participate, along with staff members? This is strictly for research purposes. We will primarily be conducting tests on the usability of the new implantable device list. Could you identify users in the practice, such as physicians, medical assistants, and technicians, who use this module? Once you have identified those users, please send them a demographics questionnaire. Would you be able to participate between May 15th and May 20th? The test will take place at your office. Thank you, and we appreciate your time."

Participants then filled out an online demographics survey at
Appendix 2: PARTICIPANT DEMOGRAPHICS

Following is a high-level overview of the participants in this study.

Gender
Men: 1
Women: 9
Total (participants): 10

Age
Total (participants): 10

Race/Ethnicity
Caucasian: 8
Asian: 0
Black/African-American: 1
Latino/a or Hispanic: 1
Other: 0
Total (participants): 10

Occupation/Role
Physician: 0
Medical Assistant: 5
Technician: 4
Manager: 1
Other: 0
Total (participants): 10

Years of Experience
Total (participants): 10
Following is a full participant breakdown:

ID | Gender | Age | Occupation/Role | EHR Experience (Years) | IDL Module Experience | Assistive Tech Needs | Race/Ethnicity
 | Female | | Technician | 10+ | None | N/A | Caucasian
 | Female | | Manager | 10+ | None | N/A | Caucasian
 | Female | | Technician | 3-5 | None | N/A | Caucasian
 | Female | | Technician | 6-9 | None | N/A | Caucasian
 | Female | | Medical Assistant | 10+ | None | N/A | Black/African-American
 | Female | | Medical Assistant | 3-5 | None | N/A | Caucasian
 | Female | | Medical Assistant | 6-9 | None | N/A | Caucasian
 | Female | | Medical Assistant | 3-5 | None | N/A | Latino/a or Hispanic
 | Female | | X-Ray Technician | 3-5 | None | N/A | Caucasian
 | Male | | Medical Assistant | 6-9 | None | N/A | Caucasian
Appendix 3: NON-DISCLOSURE AGREEMENT AND INFORMED CONSENT FORM

Non-Disclosure Agreement

THIS AGREEMENT is entered into as of May 18th, 2017, between ____ ("the Participant") and the testing organization, SRSHealth, located at 155 Chestnut Ridge Road, Montvale, New Jersey.

The Participant acknowledges that his or her voluntary participation in today's usability study may bring the Participant into possession of Confidential Information. The term "Confidential Information" means all technical and commercial information of a proprietary or confidential nature which is disclosed by SRSHealth, or otherwise acquired by the Participant, in the course of today's study. By way of illustration, but not limitation, Confidential Information includes trade secrets, processes, formulae, data, know-how, products, designs, drawings, computer-aided design files and other computer files, computer software, ideas, improvements, inventions, training methods and materials, marketing techniques, plans, strategies, budgets, financial information, or forecasts.

Any information the Participant acquires relating to this product during this study is confidential and proprietary to SRSHealth and is being disclosed solely for the purposes of the Participant's participation in today's usability study. By signing this form the Participant acknowledges that s/he will not disclose this confidential information obtained today to anyone else or any other organizations.

Participant's printed name:

Signature:                    Date:
Informed Consent

SRSHealth would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last about 30 minutes.

Agreement

I understand and agree that as a voluntary participant in the present study conducted by SRSHealth I am free to withdraw consent or discontinue participation at any time.
I understand and agree to participate in the study conducted and recorded by SRSHealth.
I understand and consent to the use and release of the recording by SRSHealth. I understand that the information and recording are for research purposes only and that my name and image will not be used for any purpose other than research. I relinquish any rights to the recording and understand the recording may be copied and used by SRSHealth without further permission.
I understand and agree that the purpose of this study is to make software applications more useful and usable in the future.
I understand and agree that the data collected from this study may be shared outside of SRSHealth and with SRSHealth's client.
I understand and agree that data confidentiality is assured, because only de-identified data (i.e., identification numbers, not names) will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator. I understand that I can leave at any time.

Please check one of the following:
[ ] I agree to participate in this study.
[ ] I do not wish to participate in this study.

Signature:
Date:
Appendix 4: EXAMPLE MODERATOR'S GUIDE

SRS EHR Usability Test Moderator's Guide

Administrator:
Data Logger:
Date:                Time:
Participant #:       Location:

Prior to testing:
Prior to each participant:
Prior to each task:
- GoToMeeting
After each participant:
- GoToMeeting
After all testing:

Orientation

Thank you for participating in this study. Our session today will last about 10 minutes. During that time you will take a look at an early prototype of SRS EHR v10. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Please do not do anything more than asked. If you get lost or have difficulty, I cannot help you with anything to do with the system itself. Please save your detailed comments until the end of a task, or the end of the session as a whole, when we can discuss them freely. Since this is a test system, some of the data may not make sense, as it is placeholder data. We are recording the screen during our session today for our internal use only. Do you have any questions or concerns?
Participant Name:

Task 1: Searching for Implantable Device of UDI format HIBCC (Optimal Time: Secs)

Searching for Implantable Device of UDI format HIBCC
Success:
Task Time: Seconds

Optimal Path
1) User is logged into SRS.
2) User selects a patient with an appointment.
3) User opens the Implantable Device List app on the Interoperability Dashboard by clicking on the green + icon.
4) User enters a valid HIBCC-format UDI in the Search box, such as +B NS1/$$ LOT /SXYZ /16D C1, then presses the Enter key.
5) The following Implantable Device information is displayed for the search:
   GMDN PT: "Cardiopulmonary bypass system filter, arterial blood line"
   SNOMED:
   Brand Name: "CAP, NON-VENTED FEMALE: BLUE NON-STERILE"
   Version/Model: NS
   Company: "NOVOSCI CORP"
   Lot/Batch No: LOT
   Serial No: XYZ
   MRI: "Labeling does not contain MRI Safety Information"
   Contains Natural Rubber: false
   Expiration Date: 02/02/2020
   Manufactured Date: 02/02/2013
   HCT/P: false
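The HIBCC-format UDI entered in step 4 packs the device identifier together with its production data (lot, serial, manufacture date) into a single delimited string. As a rough illustration of how such a string decomposes into the fields shown above, here is a minimal Python sketch. The `$$`, `S`, and `16D` prefixes follow HIBCC secondary-data labeling conventions, but this parser is a simplified illustration (no check-character validation), not the EHRUT's implementation, and the sample identifier below is invented because the report's own example is partially elided.

```python
def parse_hibcc_udi(udi: str) -> dict:
    """Split a HIBCC-style UDI into primary and secondary fields.

    Simplified sketch: real HIBC barcodes carry check characters and
    stricter field encodings than are handled here.
    """
    if not udi.startswith("+"):
        raise ValueError("HIBCC UDIs begin with '+'")
    fields = {"primary": None, "lot": None, "serial": None, "mfg_date": None}
    parts = udi.lstrip("+").split("/")
    fields["primary"] = parts[0]          # labeler + product code segment
    for part in parts[1:]:
        if part.startswith("$$"):
            fields["lot"] = part[2:]      # lot/batch segment
        elif part.startswith("16D"):
            fields["mfg_date"] = part[3:]  # manufacture date segment
        elif part.startswith("S"):
            fields["serial"] = part[1:]   # serial number segment
    return fields
```

A lookup service (such as the one behind the Search box) would then use the primary segment to fetch the GMDN, brand, and safety attributes shown in step 5.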
Observed Errors and Verbalizations:

Rating: Overall, this task was:
Show participant written scale: Very Easy (1) to Very Difficult (5)

Administrator / Notetaker

Task 2: Adding Implantable Device through Save & Close (UDI format HIBCC) (Optimal Time: Secs)

Adding Implantable Device through Save & Close (UDI format HIBCC)
Success:
Task Time: Seconds

Optimal Path
1) User is logged into SRS.
2) User selects a patient with an appointment.
3) User opens the Implantable Device List app on the Interoperability Dashboard by clicking on the green + icon.
4) User enters a valid HIBCC-format UDI in the Search box, such as +B NS1/$$ LOT /SXYZ /16D C1, then presses the Enter key.
5) The following Implantable Device information is displayed for the search:
   GMDN PT: "Cardiopulmonary bypass system filter, arterial blood line"
   SNOMED:
   Brand Name: "CAP, NON-VENTED FEMALE: BLUE NON-STERILE"
   Version/Model: NS
   Company: "NOVOSCI CORP"
   Lot/Batch No: LOT
   Serial No: XYZ
   MRI: "Labeling does not contain MRI Safety Information"
   Contains Natural Rubber: false
   Expiration Date: 02/02/2020
   Manufactured Date: 02/02/2013
   HCT/P: false
6) User sees the following labeled fields under the Expanded layout grid: Friendly Name (text box), Status (dropdown), Date (text box with calendar), and Notes (text box), plus Reset and Save & Close buttons.
7) User enters free-text data in the fields above, for example: Friendly Name: HIBCC Device; Status: Active; Date: 01/01/2017; Notes: Saving the Device.
8) User clicks the Save & Close button.
9) The Implantable Device is saved successfully and the screen closes.
10) The added Implantable Device is visible on the Interoperability Dashboard.
11) User again opens the Implantable Device List app on the Interoperability Dashboard by clicking on the green + icon.
12) The saved Implantable Device appears in the bottom grid, with its information displayed under the columns (i.e., GMDN PT Name, Status, Date, Brand, and Notes).

Observed Errors and Verbalizations:

Rating: Overall, this task was:
Show participant written scale: Very Easy (1) to Very Difficult (5)

Administrator / Notetaker

Task 3: Viewing Implantable Device Activity (Optimal Time: Secs)

Viewing Implantable Device Activity
Success:
Task Time: Seconds

Optimal Path
1) User is logged into SRS.
2) User selects a patient with an appointment.
3) User opens the Implantable Device List app on the Interoperability Dashboard by clicking on the green + icon.
4) Implantable device data has already been added to the bottom grid.
5) User clicks the + icon to the left of the saved Implantable device in the bottom grid.
6) The user should be able to see the device's activity information.

Observed Errors and Verbalizations:

Rating: Overall, this task was:
Show participant written scale: Very Easy (1) to Very Difficult (5)

Administrator / Notetaker
Task 4: Editing Implantable Device (Optimal Time: 55.5 Secs)

Editing Implantable Device
Success:
Task Time: Seconds

Optimal Path
1) User is logged into SRS.
2) User selects a patient with an appointment.
3) User opens the Implantable Device List app on the Interoperability Dashboard by clicking on the green + icon.
4) Implantable device data has already been added to the bottom grid.
5) User clicks the Edit icon to the right of the saved Implantable device in the bottom grid.
6) User enters updated information in the Friendly Name, Date, and Notes columns and clicks Save & Close.
7) The columns (Friendly Name, Status, Date, and Notes) should update successfully.
8) User again clicks the Edit icon to the right of the saved Implantable device in the bottom grid (as in step 5).
9) The user should now see the updated information.

Observed Errors and Verbalizations:

Rating: Overall, this task was:
Show participant written scale: Very Easy (1) to Very Difficult (5)
Administrator / Notetaker

Task 5: Reset Implantable Device (Optimal Time: Secs)

Reset Implantable Device
Success:
Task Time: Seconds

Optimal Path
1) User is logged into SRS.
2) User selects a patient with an appointment.
3) User opens the Implantable Device List app on the Interoperability Dashboard by clicking on the green + icon.
4) Implantable device data has already been added to the bottom grid.
5) User clicks the Edit icon to the right of the saved Implantable device in the bottom grid.
6) User enters new information in the Friendly Name, Status, Date, and Notes columns but does not click Save & Close.
7) User clicks the Reset button.
8) The fields should reset to the previous information.

Observed Errors and Verbalizations:

Rating: Overall, this task was:
Show participant written scale: Very Easy (1) to Very Difficult (5)

Administrator / Notetaker

Task 6: Adding Activity to Existing Implantable Device (Optimal Time: Secs)

Adding Activity to Existing Implantable Device
Success:
Task Time: Seconds

Optimal Path
1) User is logged into SRS.
2) User selects a patient with an appointment.
3) User opens the Implantable Device List app on the Interoperability Dashboard by clicking on the green + icon.
4) Active Implantable device data has already been added to the bottom grid.
5) User clicks the Edit icon to the right of the saved Implantable device in the bottom grid.
6) User changes the status to Inactive and clicks the Save & Close button.
7) User again opens the Implantable Device List app on the Interoperability Dashboard by clicking on the green + icon.
8) User clicks the + icon to the left of the saved Implantable device in the bottom grid; the user should be able to see the activity (describe below).

Observed Errors and Verbalizations:
Rating: Overall, this task was:
Show participant written scale: Very Easy (1) to Very Difficult (5)

Administrator / Notetaker
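Tasks 2 through 6 all revolve around the same per-device record: the device data returned by the UDI lookup, plus the user-entered Friendly Name, Status, Date, and Notes, displayed as the bottom-grid columns. A minimal Python model of that record is sketched below; it is an illustration of the data the tasks manipulate, not the EHRUT's actual schema, and all class and field names are assumptions.

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class ImplantableDeviceEntry:
    """One row of the Implantable Device List bottom grid (illustrative)."""
    gmdn_pt_name: str                 # from the UDI lookup
    brand: str                        # from the UDI lookup
    friendly_name: str = ""           # user-entered text box
    status: str = "Active"            # dropdown: Active / Inactive
    entered_on: date = field(default_factory=date.today)
    notes: str = ""                   # user-entered text box

    def grid_columns(self) -> tuple:
        # Column order shown in the bottom grid per Task 2, step 12:
        # GMDN PT Name, Status, Date, Brand, Notes
        return (self.gmdn_pt_name, self.status,
                self.entered_on.isoformat(), self.brand, self.notes)
```

Editing (Task 4) mutates the user-entered fields and re-saves; Reset (Task 5) discards staged changes before save; deactivating a device (Task 6) is just a status change that leaves an activity trail.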
Appendix 5: SYSTEM USABILITY SCALE QUESTIONNAIRE

This questionnaire was adapted from John Brooke's "SUS: a quick and dirty usability scale," developed at Digital Equipment Corporation in the UK (Brooke, 1996). Participants completed the questionnaire on paper.

System Usability Scale
(Rate each statement from Strongly Disagree (1) to Strongly Agree (5))

1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.

Brooke, J. (1996). "SUS: a 'quick and dirty' usability scale." In P. W. Jordan, B. Thomas, B. A. Weerdmeester, & I. L. McClelland (Eds.), Usability Evaluation in Industry. London: Taylor and Francis.
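The report's satisfaction metric is derived from this questionnaire using the standard SUS scoring rule (Brooke, 1996): odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score. A short Python sketch of that calculation:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 responses.

    Standard SUS scoring (Brooke, 1996): odd items score (response - 1),
    even items score (5 - response); the sum is scaled by 2.5 to 0-100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum(r - 1 if i % 2 == 0 else 5 - r   # index 0 is item 1 (odd)
                for i, r in enumerate(responses))
    return total * 2.5
```

Note that a SUS score is not a percentage: 68 is commonly cited as the average across systems, so scores should be compared against that benchmark rather than a 50% midpoint.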
SURVEY: PROJECT TECH Case managers: Thank you for working with us to administer the survey. Please read each question as it is written. Instructions for you are written in italics. Information that you
More informationA Paper Prototype Usability Study of a Chronic Disease Self-management System for Older Adults
A Paper Prototype Usability Study of a Chronic Disease Self-management System for Older Adults D. Tao, C. K. L. Or Department of Industrial and Manufacturing Systems Engineering, The University of Hong
More informationEyetracking Study: Kent State University Library. Janie Ralston
Eyetracking Study: Kent State University Library Janie Ralston Ralston 2 Executive Summary Eyetracking web usability studies allow designers to see where the hot spots are and what information or graphics
More informationProgrammiersprache C++ Winter 2005 Operator overloading (48)
Evaluation Methods Different methods When the evaluation is done How the evaluation is done By whom the evaluation is done Programmiersprache C++ Winter 2005 Operator overloading (48) When the evaluation
More informationPersonal Information. New Profile Icon
What is New in MyChart? On December 8th, we will be upgrading our MyChart patient portal site. We would like to make you aware of a few differences that you will see, when you sign into your MyChart account.
More informationPUTTING THE CUSTOMER FIRST: USER CENTERED DESIGN
PUTTING THE CUSTOMER FIRST: USER CENTERED DESIGN icidigital.com 1 Case Study DEFINE icidigital was chosen as a trusted creative partner to design a forward-thinking suite of sites for AICPA, one of the
More informationMaine ASO Provider Portal Atrezzo End User Guide
Maine ASO Provider Portal Atrezzo End User Guide October 2018 CONTENTS INTRODUCTION... 4 The KEPRO/Maine s Atrezzo Portal Guide... 4 SETUP AND ACCESS ATREZZO... 5 A. New Provider Registration/ Register
More informationA Comparative Usability Test. Orbitz.com vs. Hipmunk.com
A Comparative Usability Test Orbitz.com vs. Hipmunk.com 1 Table of Contents Introduction... 3 Participants... 5 Procedure... 6 Results... 8 Implications... 12 Nuisance variables... 14 Future studies...
More informationFritztile is a brand of The Stonhard Group THE STONHARD GROUP Privacy Notice The Stonhard Group" Notice Whose Personal Data do we collect?
Fritztile is a brand of The Stonhard Group THE STONHARD GROUP Privacy Notice For the purposes of applicable data protection and privacy laws, The Stonhard Group, a division of Stoncor Group, Inc. ( The
More informationQUICK TIPS FOR FULL-ACCESS ACCOUNTS. Florida SHOTS. Contact Information.
Florida SHOTS FOR FULL-ACCESS ACCOUNTS Contact Information www.flshots.com Free help desk: 877-888-SHOT (7468) Monday Friday, 8 A.M. to 5 P.M. Eastern Quick Content Finder LOGGING IN 1 FORGOTTEN PASSWORD
More informationComparative Biometric Testing Official Test Plan
Comparative Biometric Testing Official Test Plan Version 2.1 Copyright 2002 International Biometric Group One Battery Park Plaza New York, NY 10004 212-809-9491 part of the Test Plan may be copied or reproduced
More informationUser Guide Data Entry User
User Guide Data Entry User Version 1.6 Page 1 of 22 Document control Version Description Release date Reason for change Reviewer 1.1 Data Entry User Guide 1.2 Data Entry User Guide 1.3 Data Entry User
More informationICADV LEGAL SERVICES REFERRAL FORM
ICADV LEGAL SERVICES REFERRAL FORM REV. 10/01/16 Referred by: Organization: Phone Number: Email: Circle appropriate title: IMPD Coordinator / Family Advocate / Extended Support Advocate / Other DATE: Updated
More information2/18/2009. Introducing Interactive Systems Design and Evaluation: Usability and Users First. Outlines. What is an interactive system
Introducing Interactive Systems Design and Evaluation: Usability and Users First Ahmed Seffah Human-Centered Software Engineering Group Department of Computer Science and Software Engineering Concordia
More informationExperimental Evaluation of Effectiveness of E-Government Websites
Experimental Evaluation of Effectiveness of E-Government Websites A. Basit Darem 1, Dr. Suresha 2 1 Research Scholar, DoS in Computer Science, University of Mysore 2 Associate Professor, DoS in Computer
More informationCS Human 2.0 Studio Lo-fi Prototyping & Pilot Usability Test
CS 147 - Human 2.0 Studio Lo-fi Prototyping & Pilot Usability Test Jack G., Amin O., Esteban R. Introduction: Value Proposition: seamless shopping recommendations. Mission Statement: We strive to make
More informationUsability Testing Review
Usability Testing Summary Usability Testing Review Alexis Anand, Katrina Ezis, Ma Shixuan, Cynthia Zhang CSE 440 Section AD All of our usability tests were conducted with students from Computer Science
More informationCriteria for SQF Training Courses
Criteria for SQF Training Courses Licensed Course Criteria and Guidelines 9th Edition M A R C H 2 0 1 8 2018 Food Marketing Institute SQF Institute 2345 Crystal Drive, Suite 800 Arlington, VA 22202 USA
More informationEngagement Portal. Physician Engagement User Guide Press Ganey Associates, Inc.
Engagement Portal Physician Engagement User Guide 2015 Press Ganey Associates, Inc. Contents Logging In... 3 Summary Dashboard... 4 Results For... 5 Filters... 6 Summary Page Engagement Tile... 7 Summary
More informationChapter 4. Evaluating Interface Designs
Chapter 4 Evaluating Interface Designs 1 Introduction Designers may fail to evaluate their designs adequately. Experienced designers know that extensive testing is a necessity. Few years ago, evaluation
More information