ONC HIT Certification Program


Test Results Summary for 2014 Edition EHR Certification
R-0092-PRA V1.2
May 4, 2017

Part 1: Product and Developer Information

1.1 Certified Product Information
Product Name: PowerSoftMD Certified
Product Version: MU2B
Domain: Ambulatory
Test Type: Complete EHR

1.2 Developer/Vendor Information
Developer/Vendor Name: Data Tec, Inc.
Address: PO Box 31576, Des Peres, MO
Website:
Phone: (636)
Developer/Vendor Contact: Hal Goodall

Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information
ONC-ACB Name: InfoGard Laboratories, Inc.
Address: 709 Fiero Lane, Suite 25, San Luis Obispo, CA
Website:
Phone: (805)
ONC-ACB Contact: Adam Hardcastle

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:

Adam Hardcastle, ONC-ACB Authorized Representative
Function/Title: EHR Certification Body Manager
Signature and Date: 5/4/2017

2017 InfoGard. May be reproduced only in its original entirety, without revision.

2.2 Gap Certification
The following identifies criterion or criteria certified via gap certification:
(a)(1), (a)(6), (a)(7), (a)(17), (b)(5)*, (d)(1), (d)(5), (d)(6), (d)(8), (d)(9), (f)(1)
*Gap certification allowed for Inpatient setting only
No gap certification

2.3 Inherited Certification
The following identifies criterion or criteria certified via inherited certification:
(a)(1), (a)(2), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(8), (a)(9), (a)(10), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16) (Inpt. only), (a)(17) (Inpt. only)
(b)(1), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6) (Inpt. only), (b)(7)
(c)(1), (c)(2), (c)(3)
(d)(1), (d)(2), (d)(3), (d)(4), (d)(5), (d)(6), (d)(7), (d)(8), (d)(9) (Optional)
(e)(1), (e)(2) (Amb. only), (e)(3) (Amb. only)
(f)(1), (f)(2), (f)(3), (f)(4) (Inpt. only), (f)(5) (Optional & Amb. only), (f)(6) (Optional & Amb. only)
(g)(1), (g)(2), (g)(3), (g)(4)
No inherited certification

Part 3: NVLAP-Accredited Testing Laboratory Information
Report Number: R-0092 V1.3
Test Date(s): July 27, 2014 - February 27, 2015
Location of Testing: InfoGard and Vendor Site

3.1 NVLAP-Accredited Testing Laboratory Information
ATL Name: InfoGard Laboratories, Inc.
Accreditation Number: NVLAP Lab Code
Address: 709 Fiero Lane, Suite 25, San Luis Obispo, CA
Website:
Phone: (805)
ATL Contact: Milton Padilla
For more information on scope of accreditation, please reference

Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:

Milton Padilla, ATL Authorized Representative
Function/Title: EHR Test Body Manager
Signature and Date: 5/4/2017

3.2 Test Information

Additional Software Relied Upon for Certification
Additional Software: NewCrop
Applicable Criteria: (a)(2), (a)(8), (a)(10), (b)(1), (b)(2), (b)(3), (b)(4), (b)(7), (e)(1), (e)(3)
Functionality provided by Additional Software: Ordering Medication and Patient Portal
No additional software required

Test Tools
Test Tool / Version:
Cypress
ePrescribing Validation Tool
HL7 CDA Cancer Registry Reporting Validation Tool
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool
HL7 v2 Immunization Information System (IIS) Reporting Validation Tool
HL7 v2 Laboratory Results Interface (LRI) Validation Tool
HL7 v2 Syndromic Surveillance Reporting Validation Tool
Transport Testing Tool 179
Direct Certificate Discovery Tool
No test tools required

Test Data
Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter]
No alteration (customization) to the test data was necessary

Standards

Multiple Standards Permitted
The following identifies the standard(s) that has been successfully tested where more than one standard is permitted.

Criterion # / Standard Successfully Tested:
(a)(8)(ii)(a)(2): (b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain; (b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide
(a)(13): (a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release; (j) HL7 Version 3 Standard: Clinical Genomics; Pedigree

Criterion # / Standard Successfully Tested:
(a)(15)(i): (b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain; (b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide
(a)(16)(ii): (g) Network Time Protocol Version 3 (RFC 1305); (g) Network Time Protocol Version 4 (RFC 5905)
(b)(2)(i)(a): (i) The code set specified at 45 CFR (c)(2) (ICD-10-CM) for the indicated conditions; (a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release
(b)(7)(i): (i) The code set specified at 45 CFR (c)(2) (ICD-10-CM) for the indicated conditions; (a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release
(e)(1)(i): Annex A of the FIPS Publication: SHA256, RSA 2048-bit
(e)(1)(ii)(a)(2): (g) Network Time Protocol Version 3 (RFC 1305); (g) Network Time Protocol Version 4 (RFC 5905)
(e)(3)(ii): Annex A of the FIPS Publication: Sha256RSA (2048 bits)
Common MU Data Set (15): (a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release; (b)(2) The code set specified at 45 CFR (a)(5) (HCPCS and CPT-4)

None of the criteria and corresponding standards listed above are applicable

Newer Versions of Standards
The following identifies the newer version of a minimum standard(s) that has been successfully tested:
Newer Version / Applicable Criteria
No newer version of a minimum standard was tested

Optional Functionality

Criterion # / Optional Functionality Successfully Tested:
(a)(4)(iii): Plot and display growth charts
(b)(1)(i)(b): Receive summary care record using the standards specified at (a) and (b) (Direct and XDM Validation)
(b)(1)(i)(c): Receive summary care record using the standards specified at (b) and (c) (SOAP Protocols)
(b)(2)(ii)(b): Transmit health information to a Third Party using the standards specified at (a) and (b) (Direct and XDM Validation)
(b)(2)(ii)(c): Transmit health information to a Third Party using the standards specified at (b) and (c) (SOAP Protocols)
(f)(3) (Ambulatory setting only): Create syndrome-based public health surveillance information for transmission using the standard specified at (d)(3) (urgent care visit scenario)
Common MU Data Set (15): Express Procedures according to the standard specified at (b)(3) (45 CFR (a)(4): Code on Dental Procedures and Nomenclature)
Common MU Data Set (15): Express Procedures according to the standard specified at (b)(4) (45 CFR (c)(3): ICD-10-PCS)
No optional functionality tested

2014 Edition Certification Criteria* Successfully Tested

Criteria # (with Test Procedure (TP)** and Test Data (TD)*** versions where recorded):
(a)(1), (a)(2), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(8) 1.2, (a)(9), (a)(10), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15) 1.5, (a)(16) (Inpt. only), (a)(17) (Inpt. only)
(b)(1), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6) (Inpt. only), (b)(7)
(c)(1), (c)(2), (c)(3)
(d)(1), (d)(2) 1.4, (d)(3) 1.3, (d)(4), (d)(5), (d)(6), (d)(7), (d)(8), (d)(9) (Optional)
(e)(1) 1.7, (e)(2) (Amb. only) 1.2, (e)(3) (Amb. only) 1.3
(f)(1), (f)(2), (f)(3), (f)(4) (Inpt. only), (f)(5) (Optional & Amb. only), (f)(6) (Optional & Amb. only)
(g)(1), (g)(2), (g)(3) 1.3, (g)(4) 1.2

*For a list of the 2014 Edition Certification Criteria, please reference (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)

Clinical Quality Measures*
Type of Clinical Quality Measures Successfully Tested: Ambulatory / Inpatient / No CQMs tested
*For a list of the 2014 Clinical Quality Measures, please reference (navigation: 2014 Clinical Quality Measures)

Ambulatory CQMs (CMS ID / Version): 50, 182 (versions include V3)

Inpatient CQMs (CMS ID / Version): (none listed)

Automated Numerator Recording and Measure Calculation

Automated Numerator Recording
Automated Numerator Recording Successfully Tested:
(a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3)
Automated Numerator Recording was not tested

Automated Measure Calculation
Automated Measure Calculation Successfully Tested:
(a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3)
Automated Measure Calculation was not tested

Attestation
Attestation Forms (as applicable) / Appendix:
Safety-Enhanced Design*: Appendix A
Quality Management System**: Appendix B
Privacy and Security: Appendix C
*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product

Appendix A: Safety-Enhanced Design
An inaccurate description of the summative usability testing measures used for Effectiveness, Efficiency, and Satisfaction was provided in the "Results" section of the report. The information provided in the table of results data did not match the results as described for the measures of Effectiveness, Efficiency, and Satisfaction.

PO Box 31576, Des Peres, MO
Voice (636) / Fax (636)

October 27, 2015

To InfoGard:

PowerSoftMD utilizes the NISTIR 7741 standard to design the following measures:
(a)(1) Computerized provider order entry
(a)(2) Drug-drug, drug-allergy interaction checks
(a)(6) Medication list
(a)(7) Medication allergy list
(a)(8) Clinical decision support
(b)(3) Electronic prescribing
(b)(4) Clinical information reconciliation

Sincerely,
Peter Goodall
Data Tec, Inc.

Page 1 of 18

EHR Usability Test Report of PowerSoftMD Certified Version MU2
Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports

PowerSoftMD Certified Version MU2 by Data Tec, Inc.
Date of Usability Test: March 31st through April 8th
Date of Report: April 9th
Report Prepared By: Data Tec, Inc.
Peter Goodall, Manager, Data Tec, Inc.
(636)
info@powersoftmd.com
PO Box 31576, Des Peres, MO

Table of Contents
1 EXECUTIVE SUMMARY 2
2 INTRODUCTION 6
3 METHOD
  PARTICIPANTS
  STUDY DESIGN
  TASKS
  PROCEDURE
  TEST LOCATION
  TEST ENVIRONMENT
  TEST FORMS AND TOOLS
  PARTICIPANT INSTRUCTIONS
  USABILITY METRICS 13
4 RESULTS
  DATA ANALYSIS AND REPORTING
  DISCUSSION OF THE FINDINGS
  PARTICIPANT DEMOGRAPHICS 18

Mar 2014

EXECUTIVE SUMMARY

A usability test of PowerSoftMD Certified, MU2, Ambulatory Complete EHR was conducted between the dates of March 31st through November 16th in Des Peres, MO, by Data Tec, Inc. The purpose of this test was to test and validate the usability of the current user interface, and provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 3 healthcare providers, 1 medical assistant and 1 office worker matching the target demographic criteria served as participants and used the EHRUT in simulated, but representative, tasks.

This study collected performance data on tasks typically conducted on an EHR:
- First Impressions
- Review Computerized Orders
- Create New Medication Orders
- Create New Laboratory Orders
- Create New Radiology Orders
- Mark Orders Complete
- Electronic Prescriptions
- Build Medication List
- Build Allergy List
- Clinical Information Reconciliation
- Clinical Decision Support Rule
- Drug-Drug, Drug-Allergy Interactions

During the 120-minute one-on-one usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix 3); they were instructed that they could withdraw at any time. Participants ranged from no experience to several years of prior experience with the EHR. The administrator introduced the test, and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data on paper and electronically. The administrator did not give the participant assistance in how to complete the task.

Participant screens, head shots and audio were recorded for subsequent analysis. The following types of data were collected for each participant:
- Number of tasks successfully completed within the allotted time without assistance
- Time to complete the tasks
- Number and types of errors
- Path deviations
- Participant's verbalizations
- Participant's satisfaction ratings of the system

All participant data was de-identified; no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire and were compensated $50 per hour for their time. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.

Chart of Task Attempts, Successes and Failures

Tasks:
1. First Impressions
2. Review Computerized Orders
3. Create New Medication Orders
4. Create New Laboratory Orders
5. Create New Radiology Orders
6. Mark Orders Complete
7. Electronic Prescriptions
8. Build Medication List
9. Build Allergy List
10. Clinical Information Reconciliation
11. Clinical Decision Support Rule
12. Drug-Drug, Drug-Allergy Interaction

Measures recorded per task: # of Attempts; Task Success (Mean); Path Deviation (n); Task Time Mean (in seconds); Task Time, Mean of Deviations (in seconds); Errors (n); Task Rating (mean; 1 = Very Easy)

The results from the System Usability Scale scored the subjective satisfaction with the system, based on performance with these tasks, to be a mean of:

In addition to the performance data, the following qualitative observations were made:

Major findings
Overall, the subjects found the software easy to use once they acclimated to it. Most responded that once they knew or were shown how to do a task, they did not have to be shown again. They found the flashing arrow and prompts very helpful. Some subjects were confused by the difference between medications and prescriptions in e-scripts, but once they understood the meaning of the terms, they were able to find the correct information.

Areas for improvement
Several subjects had difficulty remembering to click the Select to Move to Current Meds button once they had selected the correct med, suggesting removal of this step in a future update, if possible. In general, some subjects were uncomfortable with the slight interface change between PowerSoft and the e-scripts window, although they quickly adapted. Subjects took a long time with the Clinical Decision Support Rule, and appeared to get hung up on the many options available.

5 See Tullis, T. & Albert, W. (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufmann (p. 149). Broadly interpreted, scores under 60 represent systems with poor usability; scores over 80 would be considered above average.

INTRODUCTION

The EHRUT tested for this study was PowerSoftMD Certified, Version MU2. This report follows the Software engineering - Software product Quality Requirements and Evaluation (SQuaRE) - Common Industry Format (CIF) for usability test reports (ISO/IEC 25062:2006(E)). Designed to present medical information to healthcare providers in Ambulatory settings, the EHRUT was evaluated in timed, task-oriented trials. The usability testing attempted to represent realistic exercises and conditions.

METHOD

The purpose of this study was to test and validate the usability of the current user interface, and provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency and user satisfaction, such as the time taken for each task and the ease of completion, were captured during the usability testing.

PARTICIPANTS
A total of 5 participants were tested on the EHRUT. Participants in the test included a medical assistant, a former office manager, 2 current medical office managers and a medical doctor. Participants were recruited by Data Tec, Inc. and were compensated $50 per hour (typically 1 hour) for their time. In addition, participants had no direct connection to the development of, or the organization producing, the EHRUT. Participants were not from the testing or supplier organization. Participants were not given the opportunity to have the same orientation and level of training as the actual end users would have received; they were seeing several of these features for the first time. For the test purposes, end-user characteristics were identified and translated into a recruitment screener used to solicit potential participants; an example of a screener is provided in Appendix 1.

Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual's data cannot be tied back to individual identities.

5 participants (matching the demographics in the section on Participants) were recruited and 5 participated in the usability test. Zero participants failed to show for the study.

Participants were scheduled for 120-minute sessions with 1 minute in between each session for debrief by the administrator(s) and data logger(s), and to reset systems to proper test conditions. A spreadsheet was used to keep track of the participant schedule, and included each participant's demographic characteristics as provided by the recruiting firm.

STUDY DESIGN
Overall, the objective of this test was to uncover areas where the application performed well (that is, effectively, efficiently, and with satisfaction) and areas where the application failed to meet the needs of

the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or comparison with other EHRs provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made.

During the usability test, participants interacted with 1 EHR. Each participant used the system in the same location, and was provided with the same instructions. The system was evaluated for effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for each participant:
- Number of tasks successfully completed within the allotted time without assistance
- Time to complete the tasks
- Number and types of errors
- Path deviations
- Participant's verbalizations (comments)
- Participant's satisfaction ratings of the system
Additional information about the various measures can be found in Section 3.9 on Usability Metrics.

TASKS
A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:
- First Impressions
- Review Computerized Orders
- Create New Medication Orders
- Create New Laboratory Orders
- Create New Radiology Orders
- Mark Orders Complete
- Electronic Prescriptions
- Build Medication List

- Build Allergy List
- Clinical Information Reconciliation
- Clinical Decision Support Rule
- Drug-Drug, Drug-Allergy Interactions

Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users. 6 Tasks should always be constructed in light of the study objectives.

PROCEDURES
Upon arrival, participants were greeted; their identity was verified and matched with a name on the participant schedule. Participants were then assigned a participant ID. 7 Each participant reviewed and signed an informed consent and release form (See Appendix 3). A representative from the test team witnessed the participant's signature.

To ensure that the test ran smoothly, two staff members participated in this test: the usability administrator and the data logger. The usability testing staff conducting the test were experienced usability practitioners. Peter Goodall, who holds a bachelor of political science from the University of Missouri, has 7 years of experience in the medical software field.

The administrator moderated the session, including administering instructions and tasks. The administrator also monitored task times, obtained post-task rating data, and took notes on participant comments. A second person served as the data logger and took notes on task success, path deviations, number and type of errors, and comments.

Participants were instructed to perform the tasks (see specific instructions below):
- As quickly as possible making as few errors and deviations as possible.

6 Constructing appropriate tasks is of critical importance to the validity of a usability test. These are the actual functions, but most tasks contain larger and more fleshed-out context that aligns with the sample data sets available in the tested EHR. Please consult usability references for guidance on how to construct appropriate tasks.
7 All participant data must be de-identified and kept confidential.

- Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.
- Without using a think-aloud technique.

For each task, the participants were given a written copy of the task. Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Scoring is discussed below in Section 3.9.

Following the session, the administrator gave the participant the post-test questionnaire (e.g., the System Usability Scale, see Appendix 5), compensated them for their time, and thanked each individual for their participation.

Participants' demographic information, task success rate, time on task, errors, deviations, verbal responses, and post-test questionnaire were recorded into a spreadsheet. Participants were thanked for their time and compensated. Participants signed a receipt and acknowledgement form (See Appendix 6) indicating that they had received the compensation.

TEST LOCATION
The test facility included a waiting area and a quiet testing room with a table, a computer for the participant, and a recording computer for the administrator. Only the participant and administrator were in the test room. All observers and the data logger worked from a separate room where they could see the participant's screen and face shot, and listen to

the audio of the session. To ensure that the environment was comfortable for users, noise levels were kept to a minimum, with the ambient temperature within a normal range. All of the safety instruction and evacuation procedures were valid, in place, and visible to the participants.

TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted in an office building. Testing was performed on an Intel-based computer running Windows 8. The participants used a mouse and keyboard when interacting with the EHRUT.

The test subject used a 17" monitor at 1366x768 resolution with 16 million colors. The application was set up by Data Tec, Inc. according to the vendor's documentation describing the system set-up and preparation. The application itself was running on a Windows 8 computer using a test database on a LAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size).

TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:
1. Informed Consent
2. Moderator's Guide

3. Post-test Questionnaire
4. Incentive Receipt and Acknowledgment Form

Examples of these documents can be found in Appendices 3-6 respectively. The Moderator's Guide was devised so as to be able to capture required data.

The participant's interaction with the EHRUT was captured and recorded digitally with screen capture software running on the test machine. A web camera recorded each participant's facial expressions synced with the screen capture, and verbal comments were recorded with a microphone. 8 The test sessions were electronically transmitted to a nearby observation room where the data logger observed the test session.

PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant (also see the full moderator's guide in Appendix [B4]):

Thank you for participating in this study. Your input is very important. Our session today will last about 120 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible making as few errors as possible. Please try to complete the tasks on your own, following the instructions very closely. Please note that we are not testing you, we are testing the system; therefore if you have difficulty, all this means is that something needs to be improved in the system. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application. Overall, we are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. Please be honest with your opinions. All of the information

8 There are a variety of tools that record screens and transmit those recordings across a local area network for remote observations.

that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary, you are able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR and, as their first task, were given time (10 minutes) to explore the system and make comments. Once this task was complete, the administrator gave the following instructions:

For each task, I will read the description to you and say Begin. At that point, please perform the task and say Done once you believe you have successfully completed the task. I would like to request that you not talk aloud or verbalize while you are doing the tasks. 9 I will ask you your impressions about the task once you are done.

Participants were then given 9 tasks to complete. Tasks are listed in the moderator's guide in Appendix [B4].

USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency and user satisfaction were captured during the usability testing. The goals of the test were to assess:
1. Effectiveness of the test by measuring participant success rates and errors
2. Efficiency of the test by measuring the average task time and path deviations

9 Participants should not use a think-aloud protocol during the testing. Excessive verbalization or attempts to converse with the moderator during task performance should be strongly discouraged. Participants will naturally provide commentary, but they should do so, ideally, after the testing. Some verbal commentary may be acceptable between tasks, but again should be minimized by the moderator.

3. Satisfaction with the test by measuring ease of use ratings

DATA SCORING
The following table (Table [x]) details how tasks were scored, errors evaluated, and the time data analyzed. 10

Effectiveness: Task Success
A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times were recorded for successes. Observed task times divided by the optimal time for each task is a measure of optimal efficiency. Optimal task performance time, as benchmarked by expert performance under realistic conditions, is recorded when constructing tasks. Target task times used in the Moderator's Guide must be operationally defined by taking multiple measures of optimal performance.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Failure". No task times were taken for errors. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. Not all deviations would be counted as errors. 11 This should also be expressed as the mean number of failed tasks per participant. On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation.

10 An excellent resource is Tullis, T. & Albert, W. (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufmann.
11 Errors have to be operationally defined by the test team prior to testing.
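The success-rate and path-deviation computations described in the scoring table are simple ratios. As a minimal sketch (using hypothetical task observations, not data from this study), they can be expressed as:

```python
# Hypothetical task observations; not data from the PowerSoftMD study.
# Each record: (succeeded, steps_taken). The optimal step count is the
# benchmarked path length recorded when the task was constructed.

def success_rate(observations):
    """Successes divided by attempts, expressed as a percentage."""
    attempts = len(observations)
    successes = sum(1 for ok, _steps in observations if ok)
    return 100.0 * successes / attempts

def path_deviation_ratio(steps_taken, optimal_steps):
    """Observed steps divided by optimal steps (1.0 means no deviation)."""
    return steps_taken / optimal_steps

obs = [(True, 6), (True, 8), (False, 11), (True, 6), (True, 7)]
print(success_rate(obs))           # 80.0 (4 of 5 attempts succeeded)
print(path_deviation_ratio(8, 6))  # observed path one third longer than optimal
```

The function names and sample values here are illustrative only; the report defines the ratios but does not publish the underlying per-task data.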

It is strongly recommended that task deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing tasks.

Efficiency: Task Time
Each task was timed from when the administrator said "Begin" until the participant said "Done." If he or she failed to say "Done," the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task. Variance measures (standard deviation and standard error) were also calculated.

Satisfaction: Task Rating
Participants' subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants.12 Common convention is that average ratings for systems judged easy to use should be 3.3 or above. To measure participants' confidence in and likeability of the system overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included "I think I would like to use this system frequently," "I thought the system was easy to use," and "I would imagine that most people would learn to use this system very quickly." See the full System Usability Scale questionnaire in Appendix [x].

Table [x]. Details of how observed data were scored.
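The task-time summary statistics described above (mean, standard deviation, standard error) can be sketched as follows; the task times are hypothetical, not measurements from this study:

```python
# Sketch of the task-time statistics described above (hypothetical times).
import statistics

# Seconds for one task, successful completions only.
task_times = [48.2, 55.1, 50.7, 61.3, 47.9]

mean_time = statistics.mean(task_times)
sd = statistics.stdev(task_times)     # sample standard deviation
se = sd / len(task_times) ** 0.5      # standard error of the mean

print(round(mean_time, 1), round(sd, 1), round(se, 1))
```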

RESULTS

DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses.

The usability testing results for the EHRUT are detailed below.14 The results should be seen in light of the objectives and goals outlined in Section 3.2, Study Design. The data should yield actionable results that, if corrected, yield material, positive impact on user performance.

The results from the SUS (System Usability Scale) scored the subjective satisfaction with the system, based on performance with these tasks, to be a mean of:15

DISCUSSION OF THE FINDINGS

12 See Tedesco and Tullis (2006) for a comparison of post-task ratings for usability tests. Tedesco, D. & Tullis, T. (2006). A comparison of methods for eliciting post-task subjective ratings in usability testing. Usability Professionals Association Conference, June 12-16, Broomfield, CO.
13 The SUS survey yields a single number that represents a composite measure of the overall perceived usability of the system. SUS scores have a range of 0 to 100, and the score is a relative benchmark that is used against other iterations of the system.
14 Note that this table is an example. You will need to adapt it to report the actual data collected.
15 See Tullis, T. & Albert, W. (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufmann (p. 149).
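The SUS composite referenced above is computed with the standard scoring rule: odd-numbered (positively worded) items contribute (response - 1), even-numbered (negatively worded) items contribute (5 - response), and the raw sum is multiplied by 2.5 to land on the 0-100 range. A minimal sketch with hypothetical responses:

```python
# Standard SUS scoring (Brooke, 1996); responses are 1-5 for ten items.

def sus_score(responses):
    """responses: ten ratings, 1 (strongly disagree) to 5 (strongly agree)."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scales the 0-40 raw sum onto 0-100

# Hypothetical participant: "strongly agree" (5) on every odd item and
# "strongly disagree" (1) on every even item yields the maximum score.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```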

EFFECTIVENESS

It appeared that the subjects only pressed the buttons they needed and mostly ignored anything they didn't use, although it is reasonable to think that fewer buttons would lead to less hesitation.

EFFICIENCY

Most users found they had several options from each of the screens and could easily jump to any screen they wanted from any other screen. Again, if they did not need a particular button or option, most would simply ignore it. If they did need it, they were happy it was there. Overall, they found it very efficient.

SATISFACTION

The subjects were very impressed with the software and reported finding it very easy to use. Most would recommend it to others, and those that don't use this EMR in an office setting would be happy to give it a try. Moreover, they would be happy to receive additional training and learn more about the software, if they knew nothing about it in the first place.

MAJOR FINDINGS

Overall, the subjects found the software easy to use once they acclimated to it. Most responded that once they knew or were shown how to do a task, they did not have to be shown again. They found the flashing arrow and prompts very helpful. Some subjects were confused by the difference between medications and prescriptions in e-scripts, but once they understood the meaning of the terms, they were able to find the correct information.

AREAS FOR IMPROVEMENT

Several subjects had difficulty remembering to click the "Select to Move to Current Meds" button once they had selected the correct med, suggesting removal of this step in a future update, if possible. In general, some subjects were uncomfortable with the slight interface change between PowerSoft and the e-scripts window, although they quickly adapted.

PARTICIPANT DEMOGRAPHICS

The report should contain a breakdown of the key participant demographics. A representative list is shown below. Following is a high-level overview of the participants in this study.

Gender
  Men [2]
  Women [3]
  Total (participants) [5]

Occupation/Role
  RN/BSN [0]
  Physician [1]
  Admin Staff [4]
  Total (participants) [5]

Years of Experience
  Years experience [0]

Facility Use of EHR
  All paper [0]
  Some paper, some electronic [5]
  All electronic [0]
  Total (participants) [5]


Usability Test Report of NewCrop, LLC, 170.314(g)(3)
NewCrop Core Version:
Date of Usability Test: 07/25/2013
Date of Report: 07/25/2013
Report Prepared by: Jennifer Harvey
NewCrop, LLC
1800 Bering Dr. #600
Houston, TX

Table of Contents

1. EXECUTIVE SUMMARY
2. METHOD
  2.1 PARTICIPANTS
  2.2 TASKS
  2.3 USABILITY METRICS
3. RESULTS
  3.1 DATA ANALYSIS AND REPORTING
  3.2 DISCUSSION OF FINDINGS
4. APPENDICES
  Appendix 1: PARTICIPANT DEMOGRAPHICS
  Appendix 2: INFORMED CONSENT FORM
  Appendix 3: FINAL QUESTIONS
  Appendix 4: SYSTEM USABILITY SCALE QUESTIONNAIRE

1. EXECUTIVE SUMMARY

A usability test of NewCrop Core was conducted on 07/25/2013 by Jennifer Harvey via an online GoToMeeting session. The purpose of this test was to test and validate the usability of the current user interface and provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, one healthcare provider and one nurse matching the target demographic criteria served as participants and used the EHRUT in simulated, but representative, tasks.

This study collected performance data on 18 tasks typically conducted on NewCrop Core 13.05:

CPOE (314.a.1)
- Record Medication Order
- Change Medication Order
- Access Medication Order
- Record Laboratory Order
- Change Laboratory Order
- Access Laboratory Order
- Record Radiology/Imaging Order
- Change Radiology/Imaging Order
- Access Radiology/Imaging Order

Drug-drug, drug-allergy interaction checks (314.a.2)
- Create drug-drug and drug-allergy interventions prior to CPOE completion
- Adjustment of severity level of drug-drug interventions

Medication list (314.a.6)
- Record Medication List
- Change Medication List
- Access Medication List

Medication allergy list (314.a.7)
- Record Medication Allergy List
- Change Medication Allergy List
- Access Medication Allergy List

Formulary Checking (314.a.10)

Electronic prescribing (314.b.3)
- Create prescriptions

Clinical Messaging
- Compose Clinical Message
- View Unread Clinical Message
- View Sent Clinical Message

2. METHOD

During the 60-minute usability test, participants were greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix 2); they were instructed that they could withdraw at any time. Participants had prior experience with NewCrop Core through their EHR application. Testing was completed in a pre-production environment, using a direct log-in to NewCrop Core. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time). During the testing, the administrator recorded user performance data on paper and electronically. The administrator did not give the participants assistance in how to complete the tasks.

The following types of data were collected for each participant:
- Number of tasks successfully completed
- Number and types of errors
- Path deviations
- Participant's verbalizations
- Participant's satisfaction ratings of the system

Following the conclusion of the testing, the practice was asked to complete a post-test questionnaire. Following is a summary of the performance and rating data collected on the EHRUT.

2.1 PARTICIPANTS

Participants for the test consisted of the following subjects:
- 1 Doctor of Internal Medicine with 5 years of experience with EHR software
- 1 Licensed Practical Nurse with 5 years of experience with EHR software

2.2 TASKS

Task Ratings: 1 = Very Easy; 2 = Easy; 3 = Neither Easy, Nor Difficult; 4 = Difficult; 5 = Very Difficult

Physician Results:

Task | Path Deviation / Type of Error | Task Rating
Record Medication Order | None | Very Easy
Change Medication Order | None | Very Easy
Access Medication Order | None | Very Easy
Record Laboratory Order | None | Very Easy
Change Laboratory Order | Clicked "Validate order" prior to modifying the order | Very Easy
Access Laboratory Order | None | Very Easy
Record Radiology/Imaging Order | None | Very Easy
Change Radiology/Imaging Order | None | Very Easy
Access Radiology/Imaging Order | None | Very Easy
Access Formulary Checking | None | Very Easy
Create Drug-Drug Intervention | None | Very Easy
Create Drug-Allergy Intervention | None | Very Easy

Adjust Severity of Drug-Drug Intervention | None | Very Easy
Adjust Severity of Drug-Allergy Intervention | None | Very Easy
Record Medication Allergy List | None | Very Easy
Change Medication Allergy List | None | Very Easy
Access Medication Allergy List | None | Very Easy
Create Electronic Prescription | None | Very Easy
Compose Clinical Message | None | Very Easy
View Unread Clinical Message | No deviations, but the test subject stated that the layout was cluttered and appeared to be one solid link rather than individual links for Compose, Inbox, and Sent, prompting him to give a rating of 2 (Easy) | Easy
View Sent Message | No deviations, but the test subject stated that the layout was cluttered and appeared to be one solid link rather than individual links for Compose, Inbox, and Sent, prompting him to give a rating of 2 (Easy) | Easy

Nurse Results:

Task | Path Deviation / Type of Error | Task Rating
Record Medication Order | None | Very Easy
Change Medication Order | None | Very Easy
Access Medication Order | None | Very Easy
Record Laboratory Order | None | Very Easy
Change Laboratory Order | None | Very Easy
Access Laboratory Order | None | Very Easy
Record Radiology/Imaging Order | None | Very Easy
Change Radiology/Imaging Order | None | Very Easy
Access Radiology/Imaging Order | None | Very Easy
Access Formulary Checking | None | Very Easy
Create Drug-Drug Intervention | None | Very Easy
Create Drug-Allergy Intervention | None | Very Easy
Adjust Severity of Drug-Drug Intervention | None | Very Easy

Adjust Severity of Drug-Allergy Intervention | None | Very Easy
Record Medication Allergy List | None | Very Easy
Change Medication Allergy List | None | Very Easy
Access Medication Allergy List | None | Very Easy
Create Electronic Prescription | None | Very Easy
Compose Clinical Message | None | Very Easy
View Unread Clinical Message | No deviations, but the test subject stated that the layout was cluttered and appeared to be one solid link rather than individual links for Compose, Inbox, and Sent, prompting him to give a rating of 2 (Easy) | Easy
View Sent Message | No deviations, but the test subject stated that the layout was cluttered and appeared to be one solid link rather than individual links for Compose, Inbox, and Sent, prompting him to give a rating of 2 (Easy) | Easy

2.3 USABILITY METRICS

Task Ratings: 1 = Very Easy; 2 = Easy; 3 = Neither Easy, Nor Difficult; 4 = Difficult; 5 = Very Difficult

3. RESULTS

3.1 DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the Task Ratings specified above.
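A minimal sketch of how the worded ratings above map onto the 1-5 scale and average out; the sample ratings and the resulting mean are illustrative, not the study's actual data:

```python
# Hypothetical sketch: mapping worded task ratings onto the 1-5 scale
# defined in section 2.3 (1 = Very Easy ... 5 = Very Difficult) and averaging.
RATING_SCALE = {
    "Very Easy": 1,
    "Easy": 2,
    "Neither Easy, Nor Difficult": 3,
    "Difficult": 4,
    "Very Difficult": 5,
}

# Illustrative sample, not the actual study data.
ratings = ["Very Easy", "Very Easy", "Easy"]

numeric = [RATING_SCALE[r] for r in ratings]
mean_rating = sum(numeric) / len(numeric)
print(round(mean_rating, 2))  # 1.33
```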

3.2 DISCUSSION OF FINDINGS

SATISFACTION

The test subjects were satisfied with the workflow of commonly used features such as writing and transmitting new medications, discontinuing medications, formulary checking, drug-drug interaction checking, and drug-allergy interaction checking. They were also satisfied with new features such as clinical messaging and lab ordering, though feedback was given regarding areas for improvement in clinical messaging.

MAJOR FINDINGS

Overall, the test subjects navigated the system with ease. There was a slight deviation in changing the lab order the first time, where the test doctor clicked Validate before adding a new lab and validating the order. This was the only actual error in the 18 tasks tested.

AREAS FOR IMPROVEMENT

In the course of testing, the subjects indicated that they would like to see a confirmation for drug-drug interaction checking that involves actually clicking an acknowledgement of the interaction, rather than simply viewing it on the screen. The subjects also indicated that the separate links for Compose, Inbox, and Sent messages in the clinical messaging portion of the system are too close together and give the impression of being one link. This led to some confusion in the messaging portion of the test, but no errors or deviations were recorded due to this confusion. In the imaging and laboratory portion of the test, the physician recommended that the drop-down box for lab location selection be renamed "lab/imaging facility" because it could cause confusion if taken literally, as imaging orders aren't sent to a lab. It did not cause errors or deviations with either subject.

4. APPENDICES

The following appendices include supplemental data for this usability test report:
1. Participant Demographics
2. Informed Consent Form
3. Final Questions
4. System Usability Scale Questionnaire

4.1 Appendix 1: PARTICIPANT DEMOGRAPHICS

Following is a high-level overview of the participants in this study.

Gender
  Men [1]
  Women [1]
  Total (participants) [2]

Occupation/Role
  LPN [1]
  Physician [1]
  Admin Staff [0]
  Total (participants) [2]

Years of Experience

Facility Use of EHR
  [5] All paper
  [8] Some paper, some electronic
  [3] All electronic
  [2] Total (participants)

Detailed Participant Overview

The intended users of NewCrop Core 13.05 include Physicians, Mid-level Prescribers, Nurses, and Clinical Staff. For this case study, one Physician and one Nurse performed all tests.

4.2 Appendix 2: INFORMED CONSENT FORM


4.3 Appendix 3: FINAL QUESTIONS

What was your overall impression of this system?
Overall impression is very good; nice interface, easy to understand and intuitive.

What aspects of the system did you like most?
eRx we have done and works great; appreciate adding an interface for lab ordering.

What aspects of the system did you like least?
Messaging: once a message is sent, the message should leave the screen.

Were there any features that you were surprised to see?
Yes, surprised to see the lab feature with a medication company.

What features did you expect to encounter but did not see? That is, is there anything that is missing in this application?
Not sure, as I have not seen everything, but the question is if this is solely lab ordering; what I'd like to see is also an interface for receiving lab results as structured data.

Compare this system to other systems you have used. Would you recommend this system to your colleagues?
Only using MacPractice, which does not really have this feature yet.

4.4 Appendix 4: SYSTEM USABILITY SCALE QUESTIONNAIRE

Please answer the following questions with a response of strongly agree, agree, neither agree nor disagree, disagree, or strongly disagree.
1. I think that I would like to use this system frequently. strongly agree
2. I found the system unnecessarily complex. strongly disagree
3. I thought the system was easy to use. strongly agree
4. I think that I would need the support of a technical person to be able to use this system. strongly disagree
5. I found the various functions in this system were well integrated. strongly agree
6. I thought there was too much inconsistency in this system. strongly disagree
7. I would imagine that most people would learn to use this system very quickly. strongly agree
8. I found the system very cumbersome to use. strongly disagree
9. I felt very confident using the system. strongly agree
10. I needed to learn a lot of things before I could get going with this system. strongly disagree

Page 7 Supplement: Test Subjects Demographics Chart

Part ID | Gender | Age | Education | Professional Experience | Computer Experience | Product Experience | Assistive Technology Needs
Subject 1 | female | Not collected | Not collected | 20 years medical office manager | Not collected | 7 years | Not collected
Subject 2 | female | Not collected | Not collected | 7 years medical assistant | Not collected | none | Not collected
Subject 3 | female | Not collected | Not collected | 6 years medical office manager | Not collected | 6 years | Not collected
Subject 4 | male | Not collected | Not collected | 39 years as a physician | Not collected | 5 years | Not collected
Subject 5 | male | Not collected | Not collected | 12 years as a medical office manager | Not collected | Not collected | Not collected

Page 14 Supplement: Qualitative Evaluation of Errors (Addendum to DATA SCORING)

Task 1) First Impressions: errors not applicable
Task 2) Review Computerized Orders: zero errors total
Task 3) Create New Medication Orders: 3 errors total
  Error 1) Subject 2 tried to click on existing orders instead of new orders, as the create-new-order button is below the list of existing orders
  Error 2) Subject 3 did not know to press the complete meds button once they had selected meds
  Error 3) Subject 4 added to the meds list, but also transmitted the medication to the pharmacy as if it were a prescription
Task 4) Create New Lab Orders: 1 error total
  Error 1) Subject 3 did not hit the save & exit button after composing the order
Task 5) Create New Radiology Orders: zero errors total
Task 6) Mark Orders Complete: 1 error total
  Error 1) Subject 4 marked multiple orders complete instead of just one
Task 7) Electronic Prescriptions: 5 errors total
  Error 1) Subject 1 ignored the warnings for contraindication
  Error 2) Subject 2 made a spelling error, could not find the drug, and could not figure out how to search again
  Error 3) Subject 3 got to the review page and thought she was finished, but she still needed to send the prescriptions
  Error 4) Subject 4 decided to send the drug, even with the contraindication warning
  Error 5) Subject 5 had trouble viewing the screen and had to wait for each screen
Task 8) Build Medication List: 1 error total
  Error 1) Subject 2 added an order instead of a medication-list entry, then started over and left that order pending
Task 9) Build Allergy List: zero errors total

Appendix B: Quality Management System

Quality Management System Attestation (Form-EHR-37-V03)
For reporting information related to testing of 170.314(g)(4).

Vendor and Product Information
Vendor Name: Data Tec, Inc.
Product Name: PowerSoftMD Certified

Quality Management System
Type of Quality Management System (QMS) used in the development, testing, implementation, and maintenance of the EHR product:
- Based on an industry standard (for example, ISO 9001, IEC 62304, ISO 13485, etc.). Standard:
- A modified or home-grown QMS.
- No QMS was used.

Was one QMS used for all certification criteria, or were multiple QMS applied?
- One QMS used.
- Multiple QMS used.
Description or documentation of the QMS applied to each criterion: Not Applicable.

Statement of Compliance
I, the undersigned, attest that the statements in this document are complete and accurate.

Vendor Signature by an Authorized Representative
Date: Dec. 5, 2014

InfoGard Laboratories, Inc. Page 1

Appendix C: Privacy and Security

Privacy and Security Attestation (Form-EHR-36-V04)

Vendor and Product Information
Vendor Name: Data Tec, Inc.
Product Name: PowerSoftMD Certified

Privacy and Security

170.314(d)(2) Auditable events and tamper-resistance
- Not Applicable (did not test to this criterion)

Audit Log:
- Cannot be disabled by any user.
- Audit Log can be disabled. The EHR enforces that the audit log is enabled by default when initially configured.

Audit Log Status Indicator:
- Cannot be disabled by any user.
- Audit Log Status can be disabled. The EHR enforces a default audit log status. Identify the default setting (enabled or disabled): There is no Audit Log Status Indicator because the Audit Log cannot be disabled.

Encryption Status Indicator (encryption of health information locally on end-user devices):
- Cannot be disabled by any user.
- Encryption Status Indicator can be disabled. The EHR enforces a default encryption status. Identify the default setting (enabled or disabled): There is no Encryption Status Indicator because the EHR does not allow health information to be stored locally on end-user devices.

Identify the submitted documentation that describes the inability of the EHR to allow users to disable the audit logs, the audit log status, and/or the encryption status:
PowerSoftMD does not allow the Audit Logs to be disabled; therefore, there is no Audit Log Status indication because the log is always active. PowerSoftMD never gives users access to Audit Log files or databases; authorized users can only view reports. PowerSoftMD Certified is designed not to store any electronic health information on workstations or end-user devices. Since, even when in use, no electronic health information is stored on the device, there is no electronic health information on the workstations or end-user devices to encrypt. See also AuditLogsAndEncryptionAffidavit.pdf.

Identify the submitted documentation that describes the method(s) by which the EHR protects 1) recording of actions related to electronic health information, 2) recording of audit log status, and 3) recording of encryption status from being changed, overwritten, or deleted by the EHR technology:
Actual logs are never presented to users; when requested, only copies of logs are presented. Logs are not stored in a user-editable format. No options to view, change, overwrite, or delete log files exist in the PowerSoftMD Certified software. Also refer to our AuditLogsAndEncryptionAffidavit.pdf.

Identify the submitted documentation that describes the method(s) by which the EHR technology detects whether the audit log has been altered:
The Audit Log is not accessible by users. Refer to our AuditLogsAndEncryptionAffidavit.pdf.

170.314(d)(7) End-user device encryption
Storing electronic health information locally on end-user devices (i.e., temp files, cookies, or other types of cache approaches):
- Not Applicable (did not test to this criterion)
- The EHR does not allow health information to be stored locally on end-user devices. Identify the submitted documentation that describes the functionality used to prevent health information from being stored locally: There are no functions in the PowerSoftMD software for storing patient health information on end-user devices. Also refer to our AuditLogsAndEncryptionAffidavit.pdf.
- The EHR does allow health information to be stored locally on end-user devices. Identify the FIPS-approved algorithm used for encryption: Identify the submitted documentation that describes how health information is encrypted when stored locally on end-user devices:
- The EHR enforces default configuration settings that either enforce the encryption of locally stored health information or prevent health information from being stored locally. Identify the default setting: The default, and only, setting is for health information not to be stored on any end-user devices. Also refer to our AuditLogsAndEncryptionAffidavit.pdf.

170.314(d)(8) Integrity
- Not Applicable (did not test to this criterion)
Identify the hashing algorithm used for integrity (SHA-1 or higher): SHA-1

170.314(e)(1) View, Download, and Transmit to 3rd Party
- Not Applicable (did not test to this criterion)
Identify the FIPS-approved algorithm used for encryption: AES-256
Identify the FIPS-approved algorithm used for hashing: SHA-1

170.314(e)(3) Secure Messaging
- Not Applicable (did not test to this criterion)
Identify the FIPS-approved algorithm used for encryption: AES-256
Identify the FIPS-approved algorithm used for hashing: SHA-1

Statement of Compliance
I, the undersigned, attest that the statements in this document are accurate.

Vendor Signature by an Authorized Representative
Date: Dec. 5, 2014


More information

Test Results Summary for 2014 Edition EHR Certification R 0050 PRI V1.0, August 28, 2014

Test Results Summary for 2014 Edition EHR Certification R 0050 PRI V1.0, August 28, 2014 2.2 Gap Certification The following identifies criterion or criteria certified via gap certification 170.314 (a)(1) (a)(17) (d)(5) (d)(9) (a)(6) (b)(5)* (d)(6) (f)(1) (a)(7) (d)(1) (d)(8) *Gap certification

More information

ONC HIT Certification Program

ONC HIT Certification Program ONC HIT Certification Program Test Results Summary for 2014 Edition EHR Certification Part 1: Product and Developer Information 1.1 Certified Product Information Product Name: Care Compass Product Version:

More information

ONC HIT Certification Program

ONC HIT Certification Program Test Results Summary for 2014 Edition EHR Certification Version EHR-Test-144 Rev 01-Jan-2014 ONC HIT Certification Program Test Results Summary for 2014 Edition EHR Certification Part 1: Product and Developer

More information

Usability Test Report of ClinNext 10 EHR version 1.0

Usability Test Report of ClinNext 10 EHR version 1.0 Usability Test Report of ClinNext 10 EHR version 1.0 Report based on ISO/IEC 25062: 2006 Common Industry Format for Usability Test Reports ClinNext 10 version 1.0 Date of Usability Test: August 29, 2018

More information

Medical Transcription Billing Corporation (MTBC) Address: 7 Clyde Road, Somerset, NJ (732) x243

Medical Transcription Billing Corporation (MTBC) Address: 7 Clyde Road, Somerset, NJ (732) x243 2015 Edition Health IT Module Test Report Part 1: Product and Developer Information 1.1 Certified Product Information Product Name: TalkEHR Product /Release: 1.0 1.2 Developer Information Developer Name:

More information

EHR Usability Test Report of IntelleChartPro, Version 7.0 Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports

EHR Usability Test Report of IntelleChartPro, Version 7.0 Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports EHR Usability Test Report of IntelleChartPro, Version 7.0 Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports IntelleChartPro, Version 7.0 Date of Usability Test: 08/03/2017

More information

EHR Usability Test Report of SRS EHR v10

EHR Usability Test Report of SRS EHR v10 EHR Usability Test Report of SRS EHR v10 Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports SRS EHR v10 Date of Usability Test: 5/18/2017 Date of Report: 5/26/2017 Report

More information

EHR Usability Test Report of Office Ally s EHR 24/7 System Version 3.9.2

EHR Usability Test Report of Office Ally s EHR 24/7 System Version 3.9.2 Page 1 EHR Usability Test Report of Office Ally s EHR 24/7 System Version 3.9.2 Report based on NISTIR 7741 NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records,

More information

Product Development for Medical, Life Sciences, and Consumer Health

Product Development for Medical, Life Sciences, and Consumer Health Product Development for Medical, Life Sciences, and Consumer Health Fundamentals of Usability Testing for FDA Validation MassMEDIC April 8, 2010 Beth Loring Director of Research and Usability 2 Topics

More information

Standards: Implementation, Certification and Testing Work group Friday, May 8, :00 Pm-1:30 Pm ET.

Standards: Implementation, Certification and Testing Work group Friday, May 8, :00 Pm-1:30 Pm ET. Standards: Implementation, Certification and Testing Work group Friday, May 8, 2015. 12:00 Pm-1:30 Pm ET. Agenda Complete Work group Comments- Group 1 Review Group 2 Comments. 2015 Edition Certification

More information

Usability Testing Methodology for the 2017 Economic Census Web Instrument

Usability Testing Methodology for the 2017 Economic Census Web Instrument Usability Testing Methodology for the 2017 Economic Census Web Instrument Rebecca Keegan Economic Statistical Methods Division March 8th, 2017 FCSM Disclaimer: Any views expressed are those of the author

More information

Usability Veracity Attestation

Usability Veracity Attestation DocuSign Envelope ID: 8484BF5C-7BB8-4D22-9280-D92F4667628D Usability Veracity Attestation Medstreaming, LLC All in One Medstreaming EMR v5.3 Nora Selim, VP of Operations Medstreaming, LLC 9840 Willows

More information

Folsom Library & RensSearch Usability Test Plan

Folsom Library & RensSearch Usability Test Plan Folsom Library & RensSearch Usability Test Plan Eric Hansen & Billy Halibut 1 Table of Contents Document Overview!... 3 Methodology!... 3 Participants!... 3 Training!... 4 Procedure!... 4 Roles!... 4 Ethics!5

More information

Thank you, and enjoy the webinar.

Thank you, and enjoy the webinar. Disclaimer This webinar may be recorded. This webinar presents a sampling of best practices and overviews, generalities, and some laws. This should not be used as legal advice. Itentive recognizes that

More information

Meaningful Use Setup Guide

Meaningful Use Setup Guide Meaningful Use Setup Guide Table of Contents ChiroWrite Certified Settings... 3 Exports... 3 Dr. First... 4 Imports... 4 Microsoft HealthVault... 5 Additional Settings... 7 CPOE... 7 Orders... 8 Order

More information

e-mds Patient Portal TM

e-mds Patient Portal TM e-mds Patient Portal TM Version 6.3.0 The Patient s Guide to Using the Portal e-mds 9900 Spectrum Drive. Austin, TX 78717 Phone 512.257.5200 Fax 512.335.4375 e-mds.com 2009 e-mds, Inc. All rights reserved.

More information

Web Evaluation Report Guidelines

Web Evaluation Report Guidelines Web Evaluation Report Guidelines Graduate Students: You are required to conduct a usability test for your final project in this course. Please review the project description and the evaluation rubric on

More information

April 25, Dear Secretary Sebelius,

April 25, Dear Secretary Sebelius, April 25, 2014 Department of Health and Human Services Office of the National Coordinator for Health Information Technology Attention: 2015 Edition EHR Standards and Certification Criteria Proposed Rule

More information

Usability Report for Online Writing Portfolio

Usability Report for Online Writing Portfolio Usability Report for Online Writing Portfolio October 30, 2012 WR 305.01 Written By: Kelsey Carper I pledge on my honor that I have not given or received any unauthorized assistance in the completion of

More information

Patient Portal Users Guide

Patient Portal Users Guide e-mds Solution Series Patient Portal Users Guide Version 7.2 How to Use the Patient Portal CHARTING THE FUTURE OF HEALTHCARE e-mds 9900 Spectrum Drive. Austin, TX 78717 Phone 512.257.5200 Fax 512.335.4375

More information

Usability Veracity Attestation ( g.3)

Usability Veracity Attestation ( g.3) Usability Veracity Attestation (170.315.g.3) Debbie McKay, RN, BHS, JD Senior Solutions Manager Regulatory Sunrise Business Unit Allscripts Healthcare Solutions 222 Merchandise Mart Plaza, Suite 2024 Chicago,

More information

NIST Normative Test Process Document: e-prescribing (erx) Test Tool

NIST Normative Test Process Document: e-prescribing (erx) Test Tool NIST Normative Test Process Document: e-prescribing (erx) Test Tool Test Tool and Test Descriptions to Conduct ONC 2015 Edition Certification Version 1.7 Date: December 3, 2015 Developed by the National

More information

Usability and Evaluation of BCKOnline PORTAL Prototype Early breast cancer Recurrent Advanced

Usability and Evaluation of BCKOnline PORTAL Prototype Early breast cancer Recurrent Advanced Usability and Evaluation of BCKOnline PORTAL Prototype Web Site URL: http://130.194.38.233:3400/index.html Participant No: Demographics 1. Please indicate your gender - Male Female 2. Please indicate your

More information

Who we tested [Eight] participants, having the following profile characteristics, evaluated [product name].

Who we tested [Eight] participants, having the following profile characteristics, evaluated [product name]. Executive Summary NOTE: Include high-level summary of findings here, including: Overall task performance User difficulties and frustrations with the site Significant usability findings (may include positive

More information

MAQ DASHBOARD USERS GUIDE

MAQ DASHBOARD USERS GUIDE USERS GUIDE V10 - July 2014 eclinicalworks, 2014. All rights reserved CONTENTS ABOUT THIS GUIDE 4 Product Documentation 4 Webinars 4 eclinicalworks Newsletter 4 Getting Support 5 Conventions 5 MAQ DASHBOARD

More information

IBM MANY EYES USABILITY STUDY

IBM MANY EYES USABILITY STUDY IBM MANY EYES USABILITY STUDY Team Six: Rod Myers Dane Petersen Jay Steele Joe Wilkerson December 2, 2008 I543 Interaction Design Methods Fall 2008 Dr. Shaowen Bardzell EXECUTIVE SUMMARY We asked three

More information

Robert Snelick, NIST Sheryl Taylor, BAH. October 11th, 2012

Robert Snelick, NIST Sheryl Taylor, BAH. October 11th, 2012 Test Tool Orientation for International Society for Disease Surveillance (ISDS): 2014 Edition 170.314(f)(3) Transmission to Public Health Agencies - Syndromic Surveillance Robert Snelick, NIST Sheryl Taylor,

More information

Usability Report. Author: Stephen Varnado Version: 1.0 Date: November 24, 2014

Usability Report. Author: Stephen Varnado Version: 1.0 Date: November 24, 2014 Usability Report Author: Stephen Varnado Version: 1.0 Date: November 24, 2014 2 Table of Contents Executive summary... 3 Introduction... 3 Methodology... 3 Usability test results... 4 Effectiveness ratings

More information

MAPIR User Guide for Eligible Hospitals. Medical Assistance Provider Incentive Repository (MAPIR): User Guide for Eligible Hospitals

MAPIR User Guide for Eligible Hospitals. Medical Assistance Provider Incentive Repository (MAPIR): User Guide for Eligible Hospitals Medical Assistance Provider Incentive Repository (MAPIR): User Guide for Eligible Hospitals Version: 1.0 Original Version Date: 02/23/2018 Last Revision Date: 02/23/2018 Table of Contents Table of Contents

More information

Foundation Level Syllabus Usability Tester Sample Exam

Foundation Level Syllabus Usability Tester Sample Exam Foundation Level Syllabus Usability Tester Sample Exam Version 2017 Provided by German Testing Board Copyright Notice This document may be copied in its entirety, or extracts made, if the source is acknowledged.

More information

e-mds Patient Portal Version User Guide e-mds 9900 Spectrum Drive. Austin, TX Phone Fax e-mds.

e-mds Patient Portal Version User Guide e-mds 9900 Spectrum Drive. Austin, TX Phone Fax e-mds. e-mds Patient Portal Version 6.2.0 TM User Guide e-mds 9900 Spectrum Drive. Austin, TX 78717 Phone 512.257.5200 Fax 512.335.4375 e-mds.com 2008 e-mds, Inc. All rights reserved. Product and company names

More information

Memorandum Participants Method

Memorandum Participants Method Memorandum To: Elizabeth Pass, Associate Professor, School of Writing, Rhetoric and Technical Communication From: Andrew Carnes, WRTC 456 Section 1[ADC] Date: February 2, 2016 Re: Project 1 Competitor

More information

User Testing Study: Collaborizm.com. Jessica Espejel, Megan Koontz, Lauren Restivo. LIS 644: Usability Theory and Practice

User Testing Study: Collaborizm.com. Jessica Espejel, Megan Koontz, Lauren Restivo. LIS 644: Usability Theory and Practice User Testing Study: Collaborizm.com Jessica Espejel, Megan Koontz, Lauren Restivo LIS 644: Usability Theory and Practice TABLE OF CONTENTS EXECUTIVE SUMMARY... 3 INTRODUCTION... 4 METHODOLOGY... 5 FINDINGS

More information

2016 e-mds, Inc.

2016 e-mds, Inc. Capability and Certification Criteria Electronic Prescribing of Medications Certification Criteria: 170.314(a)(1) Computerized provider order entry (CPOE) of Medications Description of capability Electronic

More information

Patient Portal User s Guide

Patient Portal User s Guide 650 Peter Jefferson Parkway, Suite 100 Charlottesville, VA 22911 Office: (434) 293 4072 Fax: (434) 293 4265 www.cvilleheart.com Patient Portal User s Guide Table of Contents What is the Patient Portal?

More information

USABILITY IN HEALTHCARE IT: DATA COLLECTION AND ANALYSIS APPROACHES

USABILITY IN HEALTHCARE IT: DATA COLLECTION AND ANALYSIS APPROACHES USABILITY IN HEALTHCARE IT: DATA COLLECTION AND ANALYSIS APPROACHES Andre Kushniruk, PhD, School of Health Information Science, University of Victoria Usability in Healthcare IT Usability -Measures of

More information

PRODUCT UNDER TEST TEST EVENT RESULT. Quality Manual ISO Test Lab Test Report

PRODUCT UNDER TEST TEST EVENT RESULT. Quality Manual ISO Test Lab Test Report PRODUCT UNDER TEST Organization Name: SRS-Health Address of Vendor: 155 Chestnut Ridge Road Montvale NJ 07645 Test Product Name: SRS EHR Test Product Version-with-Release: v10 TEST EVENT RESULT Criteria

More information

Certification for Meaningful Use Experiences and Observations from the Field June 2011

Certification for Meaningful Use Experiences and Observations from the Field June 2011 Certification for Meaningful Use Experiences and Observations from the Field June 2011 Principles for Certification to Support Meaningful Use Certification should promote EHR adoption by giving providers

More information

SLI Compliance ONC-ATL Testing Program Guide

SLI Compliance ONC-ATL Testing Program Guide SLI Compliance A Division of Gaming Laboratories International, LLC 4720 Independence St. Wheat Ridge, CO 80033 303-422-1566 www.slicompliance.com SLI Compliance ONC-ATL Testing Program Guide Document

More information

PRODUCT UNDER TEST TEST EVENT RESULT. Quality Manual ISO Test Lab Test Report

PRODUCT UNDER TEST TEST EVENT RESULT. Quality Manual ISO Test Lab Test Report PRODUCT UNDER TEST Organization Name: Varian Medical Systems Address of Vendor: 3100 Hansen Way Palo Alto CA 94304 Test Product Name: 360 Oncology Patient Portal Test Product Version-with-Release: 1.0

More information

Consumers Energy Usability Testing Report

Consumers Energy Usability Testing Report Consumers Energy Usability Testing Report SI 622 Section 2 Group 2: Katrina Lanahan, Michael Grisafe, Robert Wang, Wei Wang, Yu-Ting Lu April 18th, 2014 1 Executive Summary This report addresses a usability

More information

MEDITECH. ARRA Meaningful Use Stage 3 Usability Study. Acute CS 5.67

MEDITECH. ARRA Meaningful Use Stage 3 Usability Study. Acute CS 5.67 MEDITECH ARRA Meaningful Use Stage 3 Usability Study Usability Issues and Recommendations Acute CS 5.67 Medication Allergy List Medication List Drug-Drug, Drug-Allergy Interaction Checks Electronic Prescribing

More information

Usability: An Introduction

Usability: An Introduction Usability: An Introduction Joseph Kannry, MD Lead Technical Informaticist Mount Sinai Health System Professor Medicine Icahn School of Medicine at Mount Sinai Defining Usability Usability has been studied

More information

ICSA Labs ONC Health IT Certification Program Certification Manual

ICSA Labs ONC Health IT Certification Program Certification Manual Document Version 3.7 August 24, 2018 www.icsalabs.com Table of Contents Background... 1 About ICSA Labs... 1 About the ONC Health IT Certification Program... 1 Doing Business with ICSA Labs... 2 Pre-Application...

More information

Office of the Director. Contract No:

Office of the Director. Contract No: Centers for Disease Control and Prevention and Prevention (CDC) Office of Infectious Diseases (OID) National Center for Immunization and Respiratory Diseases (NCIRD) Office of the Director Contract No:

More information

What is Usability? What is the Current State? Role and Activities of NIST in Usability Reactions from Stakeholders What s Next?

What is Usability? What is the Current State? Role and Activities of NIST in Usability Reactions from Stakeholders What s Next? What is Usability? What is the Current State? Role and Activities of NIST in Usability Reactions from Stakeholders What s Next? Usability is "the extent to which a product can be used by specified users

More information

ONC Health IT Certification Program

ONC Health IT Certification Program ONC Health IT Certification Program Certification Requirements Update March 17, 2016 ICSA Labs Health IT Program Agenda Introduction Mandatory Product Disclosures and Transparency Requirements Certified

More information

MEDICITY NETWORK ONC CERTIFICATION COST AND LIMITATIONS

MEDICITY NETWORK ONC CERTIFICATION COST AND LIMITATIONS MEDICITY NETWORK ONC CERTIFICATION COST AND LIMITATIONS Medicity is proud to offer health IT solutions that are certified under the Office of the National Coordinator for Health Information Technology.

More information

PrescribeIT 2.0. User Guide

PrescribeIT 2.0. User Guide PrescribeIT 2.0 User Guide Revisions and Approvals Document Version Date Approved By Description Page# 1.0 Aug 21 2017 Diana Cius First Draft 2.0 March 9 2018 Diana Cius Updated Sections: - Retrieving

More information

Main challenges for a SAS programmer stepping in SAS developer s shoes

Main challenges for a SAS programmer stepping in SAS developer s shoes Paper AD15 Main challenges for a SAS programmer stepping in SAS developer s shoes Sebastien Jolivet, Novartis Pharma AG, Basel, Switzerland ABSTRACT Whether you work for a large pharma or a local CRO,

More information

2018 HIPAA One All Rights Reserved. Beyond HIPAA Compliance to Certification

2018 HIPAA One All Rights Reserved. Beyond HIPAA Compliance to Certification 2018 HIPAA One All Rights Reserved. Beyond HIPAA Compliance to Certification Presenters Jared Hamilton CISSP CCSK, CCSFP, MCSE:S Healthcare Cybersecurity Leader, Crowe Horwath Erika Del Giudice CISA, CRISC,

More information

Level 1 Certificate in Reception Services ( )

Level 1 Certificate in Reception Services ( ) Level 1 Certificate in Reception Services (8067-01) Assessment pack www.cityandguilds.com January 2012 Version 1.01 About City & Guilds City & Guilds is the UK s leading provider of vocational qualifications,

More information

How to Add Usability Testing to Your Evaluation Toolbox

How to Add Usability Testing to Your Evaluation Toolbox How to Add Usability Testing to Your Evaluation Toolbox Christine Andrews Paulsen, Ph.D. Concord Evaluation Group cpaulsen@ Presented at AEA, 11/5/11, Anaheim, CA 1 Goals Develop an understanding of usability

More information

Working with Health IT Systems is available under a Creative Commons Attribution-NonCommercial- ShareAlike 3.0 Unported license.

Working with Health IT Systems is available under a Creative Commons Attribution-NonCommercial- ShareAlike 3.0 Unported license. Working with Health IT Systems is available under a Creative Commons Attribution-NonCommercial- ShareAlike 3.0 Unported license. Johns Hopkins University. Welcome to Quality Improvement: Data Quality Improvement.

More information

USABILITY REPORT A REPORT OF USABILITY FINDINGS FOR OFF THE BEATEN PATH WEBSITE

USABILITY REPORT A REPORT OF USABILITY FINDINGS FOR OFF THE BEATEN PATH WEBSITE USABILITY REPORT A REPORT OF USABILITY FINDINGS FOR OFF THE BEATEN PATH WEBSITE Prepared by: Cori Vandygriff, Joseph Kmetz, Cammy Herman, and Kate DeBusk To: Off the Beaten Path Team From: Cammy Herman

More information

WASHINGTON UNIVERSITY HIPAA Privacy Policy # 7. Appropriate Methods of Communicating Protected Health Information

WASHINGTON UNIVERSITY HIPAA Privacy Policy # 7. Appropriate Methods of Communicating Protected Health Information WASHINGTON UNIVERSITY HIPAA Privacy Policy # 7 Appropriate Methods of Communicating Protected Health Information Statement of Policy Washington University and its member organizations (collectively, Washington

More information

NextGen Share Direct Messaging. End User Guide

NextGen Share Direct Messaging. End User Guide NextGen Share Direct Messaging End User Guide 1 Introduction This guide provides step-by-step instructions on how to send and receive referrals and summary of care records through NextGen Share. Structured

More information

Stream Features Application Usability Test Report

Stream Features Application Usability Test Report Stream Features Application Usability Test Report Erin Norton and Katelyn Waara HU 4628: Usability and Instruction Writing Michigan Technological University April 24, 2013 Table of Contents Executive Summary

More information

Measuring the User Experience

Measuring the User Experience Measuring the User Experience Collecting, Analyzing, and Presenting Usability Metrics Chapter 6 Self-Reported Metrics Tom Tullis and Bill Albert Morgan Kaufmann, 2008 ISBN 978-0123735584 Introduction Learn

More information

Reviewers Guide on Clinical Trials

Reviewers Guide on Clinical Trials Reviewers Guide on Clinical Trials Office of Research Integrity & Compliance Version 2 Updated: June 26, 2017 This document is meant to help board members conduct reviews for Full Board: Clinical Trial

More information

Inpatient Quality Reporting (IQR) Program

Inpatient Quality Reporting (IQR) Program PSVA Demonstration and ecqm Q&A Session Questions & Answers Moderator: Debra Price, PhD Manager, Continuing Education Hospital Inpatient Value, Incentives, and Quality Reporting (VIQR) Outreach and Education

More information

Concepts of Usability. Usability Testing. Usability concept ISO/IS What is context? What is context? What is usability? How to measure it?

Concepts of Usability. Usability Testing. Usability concept ISO/IS What is context? What is context? What is usability? How to measure it? Concepts of Usability Usability Testing What is usability? How to measure it? Fang Chen ISO/IS 9241 Usability concept The extent to which a product can be used by specified users to achieve specified goals

More information

March 20, Division of Dockets Management (HFA-305) Food and Drug Administration 5630 Fishers Lane, Room 1061 Rockville, MD 20852

March 20, Division of Dockets Management (HFA-305) Food and Drug Administration 5630 Fishers Lane, Room 1061 Rockville, MD 20852 701 Pennsylvania Avenue, NW Suite 800 Washington, D.C. 20004 2654 Tel: 202 783 8700 Fax: 202 783 8750 www.advamed.org March 20, 2017 Division of Dockets Management (HFA-305) Food and Drug Administration

More information

What is New in MyChart? My Medical Record Health Preferences Settings Appointments and Visits Visits Schedule an Appointment Update Information

What is New in MyChart? My Medical Record Health Preferences Settings Appointments and Visits Visits Schedule an Appointment Update Information What is New in MyChart? On August 26th, we will be upgrading and changing the look and feel to our MyChart patient portal site. We would like to make you aware of a few differences that you will see, when

More information

NHS Education for Scotland Portal https://www.portal.scot.nhs.uk Dental Audit: A user guide from application to completion

NHS Education for Scotland Portal https://www.portal.scot.nhs.uk Dental Audit: A user guide from application to completion Dental Audit: A user guide from application to completion 1. Audit Guidance 2. New Application: Getting Started 3. New Application: The Audit Application Form 4. New Application: Submitting Your Application

More information

Test Procedure for (a) Computerized Provider Order Entry

Test Procedure for (a) Computerized Provider Order Entry Test Procedure for 170.304 (a) Computerized Provider Order Entry This document describes the draft test procedure for evaluating conformance of complete EHRs or EHR modules 1 to the certification criteria

More information

Provider File Management Guide

Provider File Management Guide Provider File Management Guide March 2018 Independence Blue Cross offers products through its subsidiaries Independence Hospital Indemnity Plan, Keystone Health Plan East, and QCC Insurance Company, and

More information

User Manual/Guide for Direct Using encompass 3.0. Prepared By: Arête Healthcare Services, LLC

User Manual/Guide for Direct Using encompass 3.0. Prepared By: Arête Healthcare Services, LLC User Manual/Guide for Direct Using encompass 3.0 Prepared By: Arête Healthcare Services, LLC Document Version: V1.0 10/02/2015 Contents Direct Overview... 3 What is Direct?... 3 Who uses Direct?... 3 Why

More information

Presenter(s): Dave Venier. Topic Rosetta Interoperability (C-CDA and IHE) Level 200

Presenter(s): Dave Venier. Topic Rosetta Interoperability (C-CDA and IHE) Level 200 Presenter(s): Dave Venier Topic Rosetta Interoperability (C-CDA and IHE) Level 200 Safe Harbor Provisions/Legal Disclaimer This presentation may contain forward-looking statements within the meaning of

More information

CPOE (Computerized Provider Order Entry) Basics

CPOE (Computerized Provider Order Entry) Basics CPOE lets you create, revise, and track orders for different categories, including Medications, Laboratory, and Radiology/Imaging. Please note when you use the PowerSoftMD escripts interface or use the

More information

s, Texts and Social Media: What Physicians Need to Know

s, Texts and Social Media: What Physicians Need to Know Emails, Texts and Social Media: What Physicians Need to Know 1 Today s Learning Objectives By the end of today s program, you will be able to : Identify the risks to patients privacy which email, text

More information

Foundation Level Syllabus Usability Tester Sample Exam Answers

Foundation Level Syllabus Usability Tester Sample Exam Answers Foundation Level Syllabus Usability Tester Sample Exam s Version 2017 Provided by German Testing Board Copyright Notice This document may be copied in its entirety, or extracts made, if the source is acknowledged.

More information

Covisint DocSite Enterprise

Covisint DocSite Enterprise Covisint DocSite Enterprise June 2013 Site Administrator User Guide Covisint DocSite Enterprise Site Administrator Guide Compuware-Covisint All rights reserved 2013 SiteAdminGuideDocSite-2013.5-061113

More information

Running Head: TREE TAP USABILITY TEST 1

Running Head: TREE TAP USABILITY TEST 1 Running Head: TREE TAP USABILITY TEST 1 Gogglefox Tree Tap Usability Test Report Brandon S. Perelman April 23, 2014 Final Design Documents Final Design Prototype White Paper Team Gogglefox Website Author's

More information

Usability Testing CS 4501 / 6501 Software Testing

Usability Testing CS 4501 / 6501 Software Testing Usability Testing CS 4501 / 6501 Software Testing [Nielsen Normal Group, https://www.nngroup.com/articles/usability-101-introduction-to-usability/] [TechSmith, Usability Basics: An Overview] [Ginny Redish,

More information

Usability Testing. November 14, 2016

Usability Testing. November 14, 2016 Usability Testing November 14, 2016 Announcements Wednesday: HCI in industry VW: December 1 (no matter what) 2 Questions? 3 Today Usability testing Data collection and analysis 4 Usability test A usability

More information