ONC HIT Certification Program
Test Results Summary for 2014 Edition EHR Certification
Version EHR-Test-144, Rev 01-Jun-2015

Part 1: Product and Developer Information

1.1 Certified Product Information
Product Name: Panacea
Product Version:
Domain: Ambulatory
Test Type: Complete EHR

1.2 Developer/Vendor Information
Developer/Vendor Name: OA Systems, Inc.
Address: Edison Court, Rancho Cucamonga, CA
Website:
Phone:
Developer/Vendor Contact: Rifat Ali Syed

Page 1 of 12
Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information
ONC-ACB Name: Drummond Group
Address: North Hwy 183, Ste B, Austin, TX
Website:
Phone:
ONC-ACB Contact: Bill Smith

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:
ONC-ACB Authorized Representative: Bill Smith
Function/Title: Certification Body Manager
Signature and Date: 6/23/

Gap Certification
The following identifies criterion or criteria certified via gap certification:
(a)(1)   (a)(19)  (d)(6)  (h)(1)
(a)(6)   (a)(20)  (d)(8)  (h)(2)
(a)(7)   (b)(5)*  (d)(9)  (h)(3)
(a)(17)  (d)(1)   (f)(1)
(a)(18)  (d)(5)   (f)(7)**
*Gap certification allowed for Inpatient setting only
**Gap certification allowed for Ambulatory setting only
x No gap certification
Inherited Certification
The following identifies criterion or criteria certified via inherited certification (x = inherited):
(a)(1)      (a)(16) Inpt. only   x (c)(2)             x (f)(2)
x (a)(2)    (a)(17) Inpt. only   x (c)(3)             x (f)(3)
x (a)(3)    (a)(18)              x (d)(1)             (f)(4) Inpt. only
x (a)(4)    (a)(19)              x (d)(2)
x (a)(5)    (a)(20)              x (d)(3)
x (a)(6)    x (b)(1)             x (d)(4)
x (a)(7)    x (b)(2)             x (d)(5)
x (a)(8)    x (b)(3)             x (d)(6)             (f)(7)
x (a)(9)    x (b)(4)             x (d)(7)             (g)(1)
x (a)(10)   (b)(5)               x (d)(8)             (g)(2)
x (a)(11)   (b)(6) Inpt. only    (d)(9) Optional      x (g)(3)
(a)(12)     x (b)(7)             x (e)(1)             x (g)(4)
x (a)(13)   (b)(8)               x (e)(2) Amb. only   (h)(1)
x (a)(14)   (b)(9)               x (e)(3) Amb. only   (h)(2)
x (a)(15)   x (c)(1)             x (f)(1)             (h)(3)
                                 (f)(5) Amb. only
                                 (f)(6) Amb. only
No inherited certification
Part 3: NVLAP-Accredited Testing Laboratory Information
Report Number: KAM
Test Date(s): 6/2/

3.1 NVLAP-Accredited Testing Laboratory Information
ATL Name: Drummond Group EHR Test Lab
Accreditation Number: NVLAP Lab Code
Address: North Hwy 183, Ste B, Austin, TX
Website:
Phone:
ATL Contact: Beth Morrow
For more information on scope of accreditation, please reference NVLAP Lab Code

Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:
ATL Authorized Representative: Kyle Meadors
Function/Title: Test Proctor
Signature and Date: 6/22/2015
Location Where Test Conducted: Nashville, TN (Remote)

3.2 Test Information
Additional Software Relied Upon for Certification:
Additional Software: Surescripts Network for Clinical Interoperability
Applicable Criteria: b.1, b.2, e.1
Functionality provided by Additional Software: Direct HISP
No additional software required
Test Tools (x = used; version as recorded)
x Cypress
x ePrescribing Validation Tool
HL7 CDA Cancer Registry Reporting Validation Tool
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool
x HL7 v2 Immunization Information System (IIS) Reporting Validation Tool
x HL7 v2 Laboratory Results Interface (LRI) Validation Tool
x HL7 v2 Syndromic Surveillance Reporting Validation Tool, version 1.7
x Transport Testing Tool, version 173
x Direct Certificate Discovery Tool, version 2.1
Edge Testing Tool
No test tools required

Test Data
Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter]
x No alteration (customization) to the test data was necessary

Standards: Multiple Standards Permitted
The following identifies the standard(s) that have been successfully tested where more than one standard is permitted:

Criterion (a)(8)(ii)(a)(2), (a)(13):
(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
x (a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release
(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide
(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree
Criterion (a)(15)(i):
x (b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

Criteria (a)(16)(ii), (b)(2)(i)(a), (b)(7)(i):
(g) Network Time Protocol Version 3 (RFC 1305)
(i) The code set specified at 45 CFR (c)(2) (ICD-10-CM) for the indicated conditions
(i) The code set specified at 45 CFR (c)(2) (ICD-10-CM) for the indicated conditions
(g) Network Time Protocol Version 4 (RFC 5905)
x (a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release
x (a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release

Criteria (b)(8)(i), (e)(1)(i), (e)(1)(ii)(a)(2), (e)(3)(ii):
(i) The code set specified at 45 CFR (c)(2) (ICD-10-CM) for the indicated conditions
Annex A of the FIPS Publication [list encryption and hashing algorithms] AES SHA
(g) Network Time Protocol Version 3 (RFC 1305)
Annex A of the FIPS Publication [list encryption and hashing algorithms] AES SHA
(a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release
x (g) Network Time Protocol Version 4 (RFC 5905)

Common MU Data Set (15):
x (a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release
(b)(2) The code set specified at 45 CFR (a)(5) (HCPCS and CPT-4)
None of the criteria and corresponding standards listed above are applicable

Newer Versions of Standards
The following identifies the newer version of a minimum standard(s) that has been successfully tested.
No newer version of a minimum standard was tested

Optional Functionality (x = successfully tested)
x (a)(4)(iii): Plot and display growth charts
(b)(1)(i)(b): Receive summary care record using the standards specified at (a) and (b) (Direct and XDM Validation)
(b)(1)(i)(c): Receive summary care record using the standards specified at (b) and (c) (SOAP Protocols)
(b)(2)(ii)(b): Transmit health information to a Third Party using the standards specified at (a) and (b) (Direct and XDM Validation)
(b)(2)(ii)(c): Transmit health information to a Third Party using the standards specified at (b) and (c) (SOAP Protocols)
(e)(1): View, download and transmit data to a third party utilizing the Edge Protocol IG version 1.1
(f)(3): Ambulatory setting only. Create syndrome-based public health surveillance information for transmission using the standard specified at (d)(3) (urgent care visit scenario)
(f)(7): Ambulatory setting only. Transmission to public health agencies, syndromic surveillance: Create Data Elements
Common MU Data Set (15): Express Procedures according to the standard specified at (b)(3) (45 CFR (a)(4): Code on Dental Procedures and Nomenclature)
Common MU Data Set (15): Express Procedures according to the standard specified at (b)(4) (45 CFR (c)(3): ICD-10-PCS)
No optional functionality tested
2014 Edition Certification Criteria* Successfully Tested
(Criteria # with TP**/TD*** versions where recorded)
x (a)(1)                 (c)(3)
(a)(2) 1.2               (d)(1) 1.2
(a)(3)                   (d)(2) 1.6
(a)(4)                   (d)(3) 1.3
(a)(5)                   (d)(4) 1.3
(a)(6)                   (d)(5) 1.2
(a)(7)                   (d)(6) 1.2
(a)(8) 1.3               (d)(7) 1.2
(a)(9)                   (d)(8) 1.2
(a)(10)                  (d)(9) Optional 1.2
(a)(11) 1.3              (e)(1)
x (a)(12) 1.3            (e)(2) Amb. only
(a)(13) 1.2              (e)(3) Amb. only 1.3
(a)(14) 1.2              (f)(1)
(a)(15) 1.5              (f)(2)
(a)(16) Inpt. only       (f)(3)
(a)(17) Inpt. only 1.2   (f)(4) Inpt. only
(a)(18)                  (f)(5) Amb. only
(a)(19)                  (f)(6) Amb. only
(a)(20)                  (f)(7) Amb. only 1.1
(b)(1)                   (g)(1)
(b)(2)                   x (g)(2)
(b)(3)                   (g)(3) 1.4
(b)(4)                   (g)(4) 1.2
x (b)(5)                 (h)(1) 1.1
(b)(6) Inpt. only        (h)(2) 1.1
(b)(7)                   (h)(3) 1.1
(b)(8)
(b)(9)
(c)(1)
(c)(2)
No criteria tested
*For a list of the 2014 Edition Certification Criteria, please reference (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)
Clinical Quality Measures*
Type of Clinical Quality Measures Successfully Tested:
x Ambulatory
Inpatient
No CQMs tested
*For a list of the 2014 Clinical Quality Measures, please reference the CMS eCQM Library (Navigation: June 2014 and April 2014 Updates)

Ambulatory CQMs (x = tested; CMS ID and version as recorded):
x 2 v3, x 90 v3, x 136 v3, x 155 v2, 22, x 117 v2, 137, x 156 v2, x 50 v2, 122, x 138 v, x 126 v, x 165 v2, x 68 v3, 130, x 146 v2, x 166 v3, x 69 v, x 75 v, x 153 v, x 154 v2, 182

Inpatient CQMs
Automated Numerator Recording and Measure Calculation

Automated Numerator Recording
Automated Numerator Recording Successfully Tested:
(a)(1)   (a)(11)  (a)(18)  (b)(6)
(a)(3)   (a)(12)  (a)(19)  (b)(8)
(a)(4)   (a)(13)  (a)(20)  (b)(9)
(a)(5)   (a)(14)  (b)(2)   (e)(1)
(a)(6)   (a)(15)  (b)(3)   (e)(2)
(a)(7)   (a)(16)  (b)(4)   (e)(3)
(a)(9)   (a)(17)  (b)(5)
x Automated Numerator Recording was not tested

Automated Measure Calculation
Automated Measure Calculation Successfully Tested (x = tested):
x (a)(1)   x (a)(11)   (a)(18)    (b)(6)
x (a)(3)   x (a)(12)   (a)(19)    (b)(8)
x (a)(4)   x (a)(13)   (a)(20)    (b)(9)
x (a)(5)   x (a)(14)   x (b)(2)   x (e)(1)
x (a)(6)   x (a)(15)   x (b)(3)   x (e)(2)
x (a)(7)   (a)(16)     x (b)(4)   x (e)(3)
x (a)(9)   (a)(17)     x (b)(5)
Automated Measure Calculation was not tested

Attestation
Attestation Forms (as applicable):
x Safety-Enhanced Design*: Appendix A
x Quality Management System**: Appendix B
x Privacy and Security: Appendix C
*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (a)(18), (a)(19), (a)(20), (b)(3), (b)(4), (b)(9).
**Required for every EHR product

3.3 Appendices
Attached below.
Test Results Summary Change History
Test Report ID / Description of Change / Date
2014 Edition Test Report Summary
Usability Test Report for Panacea EHR version 2.0
Report based on NISTIR 7742: Customized Common Industry Format Template for Electronic Health Record Usability Testing

Dates of Usability Test: August 1, 2013 to November 27, 2013
Date of Report: November 27, 2013
Prepared by:
Hanif Wardak, Project Manager
Asad Iqbal, Project Coordinator
Hassaan Latif, Project Coordinator

Page 1 of 33
Table of Contents

1 EXECUTIVE SUMMARY ... 4
2 INTRODUCTION ... 5
3 METHOD
  3.1 Participants
  3.2 Study Design
  3.3 Tasks
  3.4 Testing Procedure
  3.5 Test Location
  3.6 Test Environment
  3.7 Test Forms and Tools
  3.8 Participant Instructions
  3.9 Usability Metrics ... 9
4 COMPUTERIZED PROVIDER ORDER ENTRY RESULTS
  4.1 Data Analysis and Reporting
  4.2 Discussion of Findings
5 CLINICAL DECISION SUPPORT RESULTS
  5.1 Data Analysis and Reporting
  5.2 Discussion of Findings
6 CLINICAL INFORMATION RECONCILIATION RESULTS
  6.1 Data Analysis and Reporting
  6.2 Discussion of Findings
7 MEDICATION LIST RESULTS
  7.1 Data Analysis and Reporting
  7.2 Discussion of Findings
8 MEDICATION ALLERGY LIST RESULTS ... 17
  8.1 Data Analysis and Reporting
  8.2 Discussion of Findings
9 ELECTRONIC PRESCRIBING RESULTS
  9.1 Data Analysis and Reporting
  9.2 Discussion of Findings
10 DRUG-DRUG, DRUG-ALLERGY INTERVENTION RESULTS
  10.1 Data Analysis and Reporting
  10.2 Discussion of Findings
11 ADJUST SEVERITY AND CDS SUPPORT RULES RESULTS
  11.1 Data Analysis and Reporting
  11.2 Discussion of Findings ... 21
OVERALL RESULTS ... 22
Appendix 1: Participant Recruiting Questionnaire
Appendix 2: Example Test Case
Appendix 3: Post Test Questionnaire
Appendix 4: Moderator's Guide ... 29
Appendix 5: System Usability Scale Questionnaire ... 31
Appendix 6: Incentive Receipt and Acknowledgement Form
EXECUTIVE SUMMARY

A usability test of Panacea EHR version 2.0 was conducted between August 1, 2013 and November 27, 2013 by OA Systems. The purpose of this test was to evaluate the usability of the EHR Under Test (Panacea) by a group of selected participants. Seven participants were engaged in the usability test: three healthcare providers, two medical assistants, and two office administrators. These participants matched the target demographic criteria and used Panacea to perform simulated but representative tasks.

This study collected performance data on twenty-seven tasks typically conducted on an EHR, involving the following features:

Medication Allergy List
Medication List
Drug-Drug, Drug-Allergy Intervention
Computerized Provider Order Entry
Electronic Prescribing
Clinical Information Reconciliation
Clinical Decision Support
Adjust Severity and CDS Support Rules

The usability tests lasted between 60 and 120 minutes in total. Each test involved one or more participants at a time who fit the profile of target end users, one moderator who introduced the test and its series of tasks, and two data loggers who recorded user performance data on paper. The moderator did not give participants assistance on how to complete the tasks. Participants had varying levels of prior experience with the EHR, and each received a brief orientation to the system before beginning the tasks.

The following types of quantitative data were collected for each participant:

Number of tasks completed without assistance
Time to complete the tasks
Number of errors/deviations from path
Participant satisfaction ratings
System Usability Scale (SUS) ratings

In addition to the quantitative performance data, the following qualitative observations were made:
Major findings
Areas for improvement

All participant data was de-identified. Following the conclusion of testing, participants were asked to complete a post-test questionnaire; each was compensated with a $100 gift card for his or her time, except the two administrator participants, who were not compensated. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of Panacea. These metrics may be reviewed in the Results section of this report.

INTRODUCTION

The EHR Under Test (EHRUT) for this study was Panacea EHR v2.0 by OA Systems Healthcare Technologies, Inc. Panacea was designed to present medical information to healthcare providers in an ambulatory setting. The usability test attempted to represent realistic exercises and scenarios that may be performed and encountered during everyday use of the system.

The purpose of this study was to test and evaluate the usability of the features identified above in Panacea. To this end, measures of effectiveness, efficiency, and user satisfaction, such as task success rates, average time per task, number and type of deviations, and task ratings, were captured during the usability testing.

METHOD

3.1 Participants
A total of seven participants took part in this study. Participants fit into three main categories:

Providers (physicians)
Nurses/Medical Assistants
Office Administrators

Providers and clinicians were compensated for their time and had no direct connection to the development of Panacea or to the organization producing it. Participants were given the opportunity to have the same orientation and level of training as actual end users would have received.
For test purposes, end-user characteristics were identified and translated into a participant recruitment questionnaire, which was used to solicit and screen potential participants; an example of a screener is provided in Appendix 1. Recruited participants had a mix of backgrounds, including professional and computing experience. Participants' names were replaced with participant IDs so that an individual's data cannot be tied back to his or her identity.
The following is a table of participants by characteristics, including demographics, professional experience, and computing experience:

Part. ID  Gender  Age Range  Occupation            Professional Experience  Daily Computer Experience  Product Experience (Years w/ EHR)
Par1      Female             Medical Assistant     2 Years                  High                       2 Years
Par2      Female             Medical Assistant     8 Years                  High                       2 Years
Par3      Male               Physician             2 Years                  High                       2 Years
Par4      Male               Office Administrator  11 Years                 High                       5 Years
Par5      Female             Office Administrator  15 Years                 High                       1 Year
Par6      Female             Physician             12 Years                 Low                        2 Years
Par7      Male               Physician             20 Years                 Low                        6 Years

3.2 Study Design
Overall, the objective of this test was to uncover areas where the application performed well (that is, effectively, efficiently, and with satisfaction) and areas where the application did not meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR. In short, this testing serves both as a means to record or benchmark current usability and as a means to identify areas where improvements could be made.

During the usability test, participants interacted with one EHR: Panacea EHR. Each participant was provided with the same instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

Time to complete the tasks
Path deviations
Participant comments
Participant satisfaction ratings
System Usability Scale (SUS) ratings
3.3 Tasks
A total of eight task groups were designed to be representative of the kinds of duties a clinician or office administrator might routinely perform. Tasks were selected based on frequency of use, criticality of function, and potential difficulty for users. The table below lists the task groups along with the level of risk (High, Medium, or Low) associated with each:

Computerized Provider Order Entry
Clinical Decision Support
Clinical Information Reconciliation
Medication List
Medication Allergy List
Electronic Prescribing
Drug-Drug, Drug-Allergy Intervention
Adjust Severity and CDS Support Rules

3.4 Testing Procedure
Six participants tested at physical locations, while one participant tested remotely via GoToMeeting, an online screen-sharing tool.

Before Testing
For all test sessions, a moderator was assigned to administer the tests, and two data loggers were assigned to monitor the participants' screens during execution of the tasks. The team consisted of certified project professionals, staff with graduate degrees in health information technology, and staff trained in usability theory.

Execution of Testing
The assigned moderator began each session by welcoming the participants and thanking them for attending. After introducing himself, the moderator read the session instructions out loud (see Appendix 4 for the moderator's guide). Participants were instructed to perform the tasks within each test as quickly as possible and without assistance. The moderator was instructed by the Project Manager to give clarification on tasks only, not instructions on system functionality. The moderator then began the testing portion of the session by reading off the name of each test. A stopwatch was used to track the start and end times of each task. GoToMeeting enabled the data loggers to view each participant's screen in order to review task times and mouse movement.
Each data logger monitored path adherence/deviation from his or her own screen.
The task time was stopped once the participant indicated that he or she had successfully completed the task by verbally declaring "done."

End of Testing
Following each session, the moderator gave each participant the post-test questionnaire (see Appendix 3), the System Usability Scale form (see Appendix 5), and the acknowledgement of receipt of incentive form (see Appendix 6), and finally thanked each participant for his or her time.

3.5 Test Location
As mentioned above, the tests were conducted either at a physical location or remotely, depending on the location preference of each participant. In the test environment, the moderator and data logger were present, and only the moderator interacted with the participants verbally.

3.6 Test Environment
Panacea would typically be used in a healthcare office or facility. In this instance, testing was conducted while participants were in a clinical office setting. The moderator was in the office with the participants. The data loggers were in separate rooms or areas from the participants. One of the tests was performed with the participant at a different location than the moderator and data logger; as a result, GoToMeeting was utilized to administer the test.

For testing, PC computers running Windows operating systems were used. The participants sat directly in front of their computer monitors and used mice when interacting with Panacea. The application itself was a test version using a training/test database on a WAN connection. Additionally, participants were instructed not to change any of the default system settings (such as font size).

3.7 Test Forms and Tools
During the usability test, various documents and instruments were used, including:

Post-test questionnaire
System Usability Scale
GoToMeeting
Stopwatch
3.8 Participant Instructions
The moderator read instructions aloud to each participant. See the full moderator's guide in Appendix 4.

3.9 Usability Metrics
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of Panacea EHR by measuring participant success rates and failures
2. Efficiency of Panacea EHR by measuring the average task time and path deviations
3. Satisfaction with Panacea EHR by measuring ease-of-use ratings

Data Scoring

Effectiveness: Task Success
Success was measured as the percentage complete across the number of participants per task executed. A task was counted as a success if the participant was able to achieve the correct outcome without assistance. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are reported as a percentage, and an interpretation is made indicating whether the module was ineffective, effective, or highly effective.

Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer, or performed it incorrectly, the task was counted as a failure. A qualitative account of the observed failures was made.

Efficiency: Task Deviations
The participant's navigation path through the application was recorded. For example, if a participant went to a wrong screen, clicked on an incorrect menu item, clicked on an incorrect link, or clicked on the wrong button, that was considered a deviation. Deviations are evaluated qualitatively.
See discussion of deviations under the Major Findings sections of each task group under the Results section.
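The task-success scoring described above is simple proportion arithmetic. A minimal sketch of the calculation follows; the function name and example values are illustrative, not taken from the report:

```python
def task_success_rate(outcomes):
    """Percent of attempts that ended in unassisted success.

    `outcomes` holds one boolean per participant attempt of a task:
    True  = correct outcome reached without assistance,
    False = failure (abandoned, wrong outcome, or performed incorrectly).
    """
    if not outcomes:
        raise ValueError("task was never attempted")
    return 100.0 * sum(outcomes) / len(outcomes)

# Example: 4 of 5 participants completed a task unassisted.
print(task_success_rate([True, True, True, True, False]))  # 80.0
```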
Efficiency: Task Time
Each task was timed from when the administrator said "begin" until the participant said "done." The average time per task was calculated for each task, and an interpretation is made on a scale of inefficient, efficient, or very efficient.

Satisfaction: Task Rating
Participants' subjective impression of the ease of use of the application was measured by asking each participant, after each task group, to score the tasks on a scale of 1 (Very Easy) to 5 (Very Difficult). This data was averaged across participants. Scores from the System Usability Scale are also used to measure satisfaction with the system; see the discussion of these results under the Overall section that follows.
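The time and satisfaction metrics above are likewise simple averages. The sketch below shows one way to compute a mean task time from "m:ss" strings like those in the results tables, together with the standard System Usability Scale score (odd-numbered items contribute the score minus 1, even-numbered items contribute 5 minus the score, and the sum is multiplied by 2.5). The function names are illustrative, not part of the report's tooling:

```python
def mean_task_time(times):
    """Mean of per-participant task times given as 'm:ss' strings."""
    def to_seconds(t):
        m, s = t.split(":")
        return int(m or 0) * 60 + int(s)  # tolerates ':37'-style values
    avg = sum(to_seconds(t) for t in times) / len(times)
    return f"{int(avg // 60)}:{int(avg % 60):02d}"

def sus_score(responses):
    """Standard SUS score (0-100) from ten 1-5 questionnaire responses."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))  # index 0 is item 1 (odd)
    return total * 2.5

print(mean_task_time(["1:48", "0:37", "2:05"]))  # 1:30
```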
RESULTS

4 COMPUTERIZED PROVIDER ORDER ENTRY

4.1 Data Analysis and Reporting
Task ratings: 1 = Very Easy; 2 = Somewhat Easy; 3 = Normal; 4 = Somewhat Difficult; 5 = Very Difficult

Task                               N   Task Success        Mean Task Time   Mean Task Rating
1. Record Medication Order         5   100% (5 out of 5)   1:48             1.4/5
2. Change Medication Order         5   100% (5 out of 5)   0:37             1.2/5
3. Access Medication Order         5   80% (4 out of 5)    0:05             1/5
4. Record Radiology/Imaging Order  5   100% (5 out of 5)   1:39             1.6/5
5. Change Radiology/Imaging Order  5   100% (5 out of 5)   0:10             1.2/5
6. Access Radiology/Imaging Order  5   100% (5 out of 5)   0:10             1/5
7. Record Laboratory Order         5   100% (5 out of 5)   1:21             1.2/5
8. Change Laboratory Order         5   100% (5 out of 5)   0:06             1/5
9. Access Laboratory Order         5   100% (5 out of 5)   0:09             1/5
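Applying the scoring rules from the Usability Metrics section to the nine CPOE rows above gives an overall picture of the module. This aggregation is illustrative only (the report evaluates each task individually); the success counts and mean ratings are transcribed from the table:

```python
# (successes, attempts, mean rating) per CPOE task, from the table above.
cpoe_tasks = {
    "Record Medication Order":         (5, 5, 1.4),
    "Change Medication Order":         (5, 5, 1.2),
    "Access Medication Order":         (4, 5, 1.0),
    "Record Radiology/Imaging Order":  (5, 5, 1.6),
    "Change Radiology/Imaging Order":  (5, 5, 1.2),
    "Access Radiology/Imaging Order":  (5, 5, 1.0),
    "Record Laboratory Order":         (5, 5, 1.2),
    "Change Laboratory Order":         (5, 5, 1.0),
    "Access Laboratory Order":         (5, 5, 1.0),
}
successes = sum(s for s, _, _ in cpoe_tasks.values())
attempts = sum(n for _, n, _ in cpoe_tasks.values())
mean_rating = sum(r for _, _, r in cpoe_tasks.values()) / len(cpoe_tasks)
print(f"overall: {100 * successes / attempts:.1f}% success, "
      f"mean rating {mean_rating:.2f}/5")  # 97.8% success, 1.18/5
```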
4.2 Discussion of Findings

Effectiveness
The CPOE module can be considered highly effective: of the nine tasks, only one had a completion rate below 100%. Despite that single failure, the module was highly effective on the majority of tasks.

Efficiency
Based on the average (mean) time per task, the CPOE module can be considered very efficient, with most tasks taking between 5 and 40 seconds to complete. Tasks that involved recording items took less than 2 minutes to complete. Deviations, as discussed in the Major Findings section, did not meaningfully impact efficiency.

Satisfaction
Satisfaction with the CPOE feature was excellent: the average (mean) score for the majority of the tasks was below 1.5, indicating that most participants found the module very easy to use.

Major Findings
P6: 1 deviation; chose the Lab box instead of the blue Labs tab (Task: Record Radiology/Imaging Order)
P2: 3 deviations; did not change the duration as stated in the directions; clicked Save instead of clicking Select next to MRI Brain Study; clicked the Select button twice (Tasks: Change Medication Order; Record Radiology/Imaging Order; Record Laboratory Order)
P1: 1 deviation; put 30 in the Dispense field instead of Duration (Task: Change Medication Order)
P3: 1 deviation; clicked on the appointment and then went into the medication instead of clicking Create Rx (Task: Record Medication Order)
P7: 1 deviation; changed Dispensed instead of Duration (Task: Change Medication Order)
Areas for Improvement
The New Order button should be placed closer to the center of the module; some users found the button too far removed from the rest of the fields.
Lab results and radiology results should be placed under two separate tabs: a Radiology orders tab should be added alongside the Labs order tab.

5 CLINICAL DECISION SUPPORT

5.1 Data Analysis and Reporting
Task ratings: 1 = Very Easy; 2 = Somewhat Easy; 3 = Normal; 4 = Somewhat Difficult; 5 = Very Difficult

Task                                  N   Task Success   Mean Task Time   Mean Task Rating
1. Trigger Problem List Intervention  5   100% (5/5)     1:

5.2 Discussion of Findings

Effectiveness
The Clinical Decision Support module can be considered highly effective, given that the completion rate was 100%.

Efficiency
Based on the average (mean) time of the task, the module can be considered efficient: the task took almost 2 minutes to complete, somewhat longer than expected. Deviations, as discussed in the Major Findings section, did not meaningfully impact efficiency.

Satisfaction
Satisfaction with this module was excellent: the average (mean) score for the task was below 1.5, indicating that most participants found the module very easy to use. In addition, users found that consolidating all the Clinical Decision Alerts in one window was very convenient. For example, in the patient's banner under the Clinical Alerts section, users can find all of the following interventions:

Problem List
Medication List
Medication Allergy List
Demographics
Lab Tests and Results
Vital Signs
User Diagnostic and Therapeutic Reference Information

Major Findings
P1: 1 deviation; clicked on Message first instead of clicking on the Resource Link (Task: Trigger Problem List Intervention)

Areas for Improvement
The display box for clinical alerts should be larger.
Resource links within the box should be more prominently displayed to indicate that they are selectable or clickable.

6 CLINICAL INFORMATION RECONCILIATION

6.1 Data Analysis and Reporting
Task ratings: 1 = Very Easy; 2 = Somewhat Easy; 3 = Normal; 4 = Somewhat Difficult; 5 = Very Difficult

Task                                  N   Task Success   Mean Task Time   Mean Task Rating
1. Reconcile Active Problem List      5   80% (4/5)      1:
2. Reconcile Active Medication List   5   80% (4/5)      :
3. Reconcile Medication Allergy List  5   60% (3/5)      :
6.2 Discussion of Findings

Effectiveness
Of the three tasks, one participant failed all three, and one additional participant failed Reconcile Medication Allergy List. Despite the failures of these two participants, the module is still effective.

Efficiency
Based on the average (mean) time per task, the module can be considered very efficient, with the majority of tasks taking less than 50 seconds to complete. Deviations, as discussed in the Major Findings section, did not have a significant impact on efficiency.

Satisfaction
Satisfaction with this module was excellent: the average (mean) score for the majority of the tasks was below 1.5, indicating that most participants found the module very easy to use.

Major Findings
P6: 1 deviation; took some time and clicked on the Inbound Referral tab twice (Task: Reconcile Active Problem List)
P2: 2 deviations; clicked on the Reconcile Medications tab instead of the Reconcile Medication Allergy tab, and did not perform the arrow clicking in the middle of the screen (Task: Reconcile Medication Allergy List for both deviations)

Areas for Improvement
A better method of merging items from the C-CDA into the problem list, medication list, and medication allergy list needs to be developed.
The Preview and Save functionalities need to be merged into one screen.
7 MEDICATION LIST

7.1 Data Analysis and Reporting
Task ratings: 1 = Very Easy; 2 = Somewhat Easy; 3 = Normal; 4 = Somewhat Difficult; 5 = Very Difficult

Task                       N   Task Success   Mean Task Time   Mean Task Rating
1. Record Medication       5   100% (5/5)     1:
2. Change Medication List  5   100% (5/5)     :
3. Access Medication List  5   100% (5/5)     :

7.2 Discussion of Findings

Effectiveness
The module can be considered highly effective, given that the completion rate for each task was 100%.

Efficiency
Based on the average (mean) time per task, the module can be considered very efficient, with the majority of tasks taking between 5 and 40 seconds to complete. Deviations, as discussed in the Major Findings section, did not have a significant impact on efficiency.

Satisfaction
Satisfaction with this module was excellent: the average (mean) score for the majority of the tasks was below 1.5, indicating that most participants found the module very easy to use.

Major Findings
P3: 1 deviation; clicked on the appointment instead of Patient Care (Task: Record Medication)
Areas for Improvement
A Search button should replace the current magnifying glass icon, and it should be implemented throughout the entire software.

MEDICATION ALLERGY LIST

8.1 Data Analysis and Reporting
Task ratings: 1=Very Easy; 2=Somewhat Easy; 3=Normal; 4=Somewhat Difficult; 5=Very Difficult

Task                                N    Task Success    Task Time, Mean (min:sec)    Task Rating, Mean
1. Record Medication Allergy        5    100% (5/5)      2:20                         --
2. Change Medication Allergy        5    100% (5/5)      --                           --
3. Access Medication Allergy List   5    100% (5/5)      --                           --

8.2 Discussion of Findings

Effectiveness
The module can be considered highly effective, given that the completion rate for each task was 100%.

Efficiency
Based on the average (mean) time per task, the module can be considered efficient. Record Medication Allergy took somewhat longer than expected on average (2:20); the other two tasks took between 8 and 30 seconds on average, which is relatively fast. The deviations discussed under Major Findings did not have a significant impact on efficiency.

Satisfaction
Satisfaction with this module was excellent: the average (mean) score for the majority of the tasks was below 1.5, indicating that most participants found the module very easy to use.
Major Findings
P6: 1 deviation; clicked on the Allergies tab twice (Task: Record Medication Allergy)
P2: 1 deviation; clicked Print and Add Allergy instead of just Add Allergy (Task: Record Medication Allergy)
P7: 1 deviation; clicked OK without first clicking on Lipitor (Task: Record Medication Allergy)
P3: 1 deviation; clicked the drop-down next to Add Allergy instead of just Add Allergy (Task: Record Medication Allergy)

Areas for Improvement
The Add Allergy button and its drop-down menu should be separated; currently, both are combined within one control.
In the Environmental Allergies field, users should be able to type directly within the field instead of having to click the magnifying glass icon first.
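The per-task metrics reported in each Data Analysis and Reporting table (N, completion percentage, mean task time in minutes and seconds, and mean ease rating) can be reproduced with a short script. The sketch below is illustrative: the helper names (`task_metrics`, `mmss`) and the sample observations are hypothetical stand-ins, not the study's raw data or tooling.

```python
def mmss(seconds):
    """Format a duration in seconds as m:ss, matching the report's tables."""
    return f"{int(seconds // 60)}:{int(seconds % 60):02d}"

def task_metrics(observations):
    """Summarize one task.

    observations: list of (succeeded, seconds, rating) tuples, one per
    participant, where rating uses the report's 1-5 ease scale.
    """
    n = len(observations)
    passed = sum(1 for ok, _, _ in observations if ok)
    mean_time = sum(t for _, t, _ in observations) / n
    mean_rating = sum(r for _, _, r in observations) / n
    return {
        "N": n,
        "success": f"{100 * passed // n}% ({passed}/{n})",
        "mean_time": mmss(mean_time),
        "mean_rating": round(mean_rating, 1),
    }

# Hypothetical observations for a Record Medication Allergy-style task:
obs = [(True, 140, 1), (True, 150, 2), (True, 130, 1),
       (True, 145, 1), (True, 135, 2)]
print(task_metrics(obs))
# -> {'N': 5, 'success': '100% (5/5)', 'mean_time': '2:20', 'mean_rating': 1.4}
```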
ELECTRONIC PRESCRIBING

9.1 Data Analysis and Reporting
Task ratings: 1=Very Easy; 2=Somewhat Easy; 3=Normal; 4=Somewhat Difficult; 5=Very Difficult

Task                                N    Task Success    Task Time, Mean (min:sec)    Task Rating, Mean
1. Create Electronic Prescription   5    100% (5/5)      1:--                         --

9.2 Discussion of Findings

Effectiveness
The module can be considered highly effective, given that the completion rate for the task was 100%.

Efficiency
Based on the average (mean) time per task, the module can be considered inefficient: it takes over a minute and a half on average to send a prescription, which is longer than expected. As noted under Areas for Improvement, the path to completion can be improved by reducing the number of steps it takes to write a prescription. The deviations discussed under Major Findings did not significantly impact efficiency.

Satisfaction
Satisfaction with this module was excellent: the average (mean) score for the task was below 1.5, indicating that most participants found the module very easy to use.

Major Findings
P6: 1 deviation; clicked on an appointment, then went back and clicked Create Rx (Task: Create Electronic Prescription)
P2: 1 deviation; clicked on an appointment, then went back and clicked Create Rx (Task: Create Electronic Prescription)

Areas for Improvement
Screens within the module should be consolidated into one or two screens in order to reduce the number of steps and increase efficiency.
DRUG-DRUG, DRUG-ALLERGY INTERVENTION

10.1 Data Analysis and Reporting
Task ratings: 1=Very Easy; 2=Somewhat Easy; 3=Normal; 4=Somewhat Difficult; 5=Very Difficult

Task                    N    Task Success    Task Time, Mean (min:sec)    Task Rating, Mean
1. Drug-Drug Alert      5    100% (5/5)      1:--                         1.6
2. Drug-Allergy Alert   5    100% (5/5)      --                           1.6

10.2 Discussion of Findings

Effectiveness
The module can be considered highly effective, given that the completion rate for each task was 100%.

Efficiency
Based on the average (mean) time per task, the module can be considered efficient. However, since each task in this module takes around a minute to complete, which is longer than expected, the module could benefit from some improvement. The deviations discussed under Major Findings did seem to slightly impact efficiency; for example, the Next button seemed to be an impediment.

Satisfaction
Satisfaction with this module was excellent: the average (mean) score for each task was 1.6, indicating that most participants found the module easy to use.

Major Findings
P6: 2 deviations; clicked on the Medications box instead of the Medications tab (Task: Drug-Drug Alert); clicked the Home icon instead of staying on the Patient Chart and clicking Browse Favorite (Task: Drug-Allergy Alert)
P7: 1 deviation; went to the Next screen, which was one screen too far (Task: Drug-Allergy Alert)
P3: 1 deviation; clicked on an appointment instead of Patient Care (Task: Drug-Drug Alert)

Areas for Improvement
The Alert icon should be displayed more prominently. In addition, on the alert page, the vertical tabs that describe the types of alerts are currently difficult to distinguish.
The path to completion could be improved, which could make the module very efficient instead of just efficient.

ADJUST SEVERITY AND CDS SUPPORT RULES

11.1 Data Analysis and Reporting
Task ratings: 1=Very Easy; 2=Somewhat Easy; 3=Normal; 4=Somewhat Difficult; 5=Very Difficult

Task                                       N    Task Success    Task Time, Mean (min:sec)    Task Rating, Mean
1. Adjust Severity and CDS Support Rules   2    100% (2/2)      2:--                         1.5

11.2 Discussion of Findings

Effectiveness
The module can be considered highly effective, given that the completion rate for the task was 100%.

Efficiency
Based on the average (mean) time per task, the module can be considered very efficient: it takes around 2 minutes on average to complete the task, which is a reasonable amount of time for this task. The deviations discussed under Major Findings did seem to slightly impact efficiency.

Satisfaction
Satisfaction with this module was excellent: the average (mean) score for the task was 1.5, indicating that most participants found the module very easy to use.
Major Findings
P5: 2 deviations; checked both the Minor and Moderate document levels instead of just the Minor document level, and clicked on CDS Security Test twice instead of just once (Task: Administrator)

Areas for Improvement
The Create CDS Intervention button should be displayed more prominently within the module; currently, it is too far to the right of the screen.

OVERALL RESULTS

The System Usability Scale (SUS) scored participants' subjective satisfaction with Panacea EHR based on their performance on the tasks tested in this report. Questions answered "No Answer" were assigned a zero during calculation of the score. Broadly interpreted, a score under 68 indicates a system with poor usability, while a score over 68 indicates a system with above-average usability. The mean SUS score for Panacea EHR is:
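The report states only that "No Answer" responses were assigned a zero; it does not spell out the scoring formula. The sketch below combines that rule with the standard SUS computation (odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, and the sum of contributions is scaled by 2.5 to yield a 0-100 score). Treating "No Answer" as a zero contribution is an assumption about how the study computed the score.

```python
def sus_score(responses):
    """Standard SUS scoring for ten 1-5 responses (None = "No Answer").

    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response); the sum of the ten contributions is
    scaled by 2.5, yielding a 0-100 score.  "No Answer" items are
    assigned a zero contribution -- an interpretation of the report's
    stated rule, since it does not give the exact formula.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for item, r in enumerate(responses, start=1):
        if r is None:               # "No Answer" -> zero contribution
            continue
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

def mean_sus(all_participants):
    """Mean SUS score across a list of per-participant response lists."""
    scores = [sus_score(r) for r in all_participants]
    return sum(scores) / len(scores)

# Illustrative responses (not the study's data): best possible answers.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))   # -> 100.0
```

A mean over 68 would place the system above average usability under the interpretation the report describes.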
Appendix 1 Participant Recruiting Questionnaire

Please complete the following information to register as a participant in the Panacea EHR Usability Study.

Participant Information
Date:
Name of Participant:
Clinic Name:
Phone Number:
Address:
Specialty:
Age Group:
Title:  MD, DO   Admin   MA   Nurse, Nurse Practitioner   Other (please specify):

Describe your work environment [e.g., private practice, hospital]:
How long have you been at this position? ____ Years

Software and Computer Experience

How many hours per day do you spend on the computer? ____ or more
What computer hardware do you use?  Windows PC   Mac   Other (please specify):
Are you using any mobile devices?  Yes   No
If yes, are you using work-related applications?  Yes   No
Do you currently use Panacea EHR? If not, do you have a vested interest, commercial or research related, with an EHR vendor?  Yes   No
Are you using an EHR?  Yes   No    If yes, how many hours per week do you use the EHR? ____
How long have you been using an EHR (months)? ____
Is it Web-based or Install?  Web-Based   Install
Have you used other EHRs?  Yes   No
Appendix 2

Change Medication Order: Computerized Provider Order Entry
In the Current Medication List, click Re-Prescribe on the first Lipitor 10MG.
Click Next.
Change the Duration from 30 Days to 10 Days and click Next.
Review the Alert and click Next.
Review the Pharmacy and click Next.
Review and click Send.
Score:  Very Easy   Somewhat Easy   Neutral   Somewhat Difficult   Very Difficult

Access Medication Order:
To access the Medication Order, review the Current Medication List.
Score:  Very Easy   Somewhat Easy   Neutral   Somewhat Difficult   Very Difficult
Task Time: ____ Minutes ____ Seconds
Optimal Path:  Followed   Deviated
Record Deviations or Other Comments:
Appendix 3 Post-Test Questionnaire

What was your overall impression of this system?
What aspects of the system did you like most?
What aspects of the system did you like least?
Were there any features that you were surprised to see?
What features did you expect to encounter but did not see? That is, is there anything missing in this application?
Compare this system to other systems you have used.
Would you recommend this system to your colleagues?
Appendix 4 Moderator's Guide

Step 1: Read the Statement of Instructions to Participants

Orientation
Thank you for participating in this study. Our session today will last approximately 60 minutes. During that time you will take a look at Panacea EHR, our electronic health record system. We greatly appreciate your honesty and assistance during this session. I will ask you to complete a few tasks using Panacea and answer some questions. We are interested in how easy (or how difficult) Panacea is for you to use and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible mistakes. Do not do anything more than asked. If you get lost or have difficulty, I cannot help you with anything to do with the system itself. Please save your detailed comments until the end of a task. Please be honest with your opinions, as they will help us improve the software. All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time. Do you have any questions or concerns?

Step 2: Administer the tests by stating: "We will now begin testing. I will read off the name of each task; before each task commences, I will say 'Begin.' Upon completion of your task, please say 'Done.'"

Administer the tests in the following order:
Medication Allergy List
Medication List
Drug-Drug, Drug-Allergy Intervention
Computerized Provider Order Entry
Electronic Prescribing
Clinical Information Reconciliation
Clinical Decision Support
Adjust Severity and CDS Support Rules
Step 3: Distribute the Post-Test Questionnaire, the System Usability Scale (SUS), and the Acknowledgement of Receipt of Incentive form.
Appendix 5 System Usability Scale
(1=Strongly Disagree; 5=Strongly Agree)

1) I think I would like to use this system frequently. 1 2 3 4 5
2) I found the system unnecessarily complex. 1 2 3 4 5
3) I thought the system was easy to use. 1 2 3 4 5
4) I think that I would need the support of a technical person to be able to use the system. 1 2 3 4 5
5) I found the various functions in this system were well integrated. 1 2 3 4 5
6) I thought there was too much inconsistency in the system. 1 2 3 4 5
7) I would imagine that most people would learn to use the system very quickly. 1 2 3 4 5
8) I found the system very cumbersome to use. 1 2 3 4 5
9) I felt very confident using the system. 1 2 3 4 5
10) I needed to learn a lot of things before I could get going with this system. 1 2 3 4 5
Appendix 6 Acknowledgement of Receipt of Incentive

I hereby acknowledge receipt of $____ for my participation in a research study run by OA Systems.

Printed Name:
Address:
Signature:
Date:

Usability Researcher:
Signature of Usability Researcher:
Date:
More informationAPPENDIX A: OBJECTIVES AND MEASURES FOR 2015 THROUGH 2017 (MODIFIED STAGE 2) EP Objectives and Measures
APPENDIX A: OBJECTIVES AND MEASURES FOR 2015 THROUGH (MODIFIED STAGE 2) Objectives for Measures for Providers in EP Objectives and Measures Objective 1: Protect Patient Health Information Objective 2:
More informationE Z BIS ELECTRONIC HEALTH RECORDS
E Z BIS ELECTRONIC HEALTH RECORDS CERTIFICATION AND THE HITECH INCENTIVE PROGRAM The Incentives On July 13, 2010, the U.S. Department of Health and Human Services finalized the Electronic Health Record
More informationHow To Qualify For EHR Stimulus Funds Under
BEST PRACTICES: How To Qualify For EHR Stimulus Funds Under Meaningful Use & Certified EHR Technology The American Recovery and Reinvestment Act (ARRA) set aside early $20 billion in incentive payments
More information2015 Modified Stage 2 Requirements
2015 Modified Stage 2 Requirements Your Guide To Being A Meaningful CEHRT User In 2015 Property of Advanced Provider Solutions, LLC. All rights reserved. Executive Summary The Medicare and Medicaid EHR
More informationCore Set of Objectives and Measures Must Meet All 15 Measures Stage 1 Objectives Stage 1 Measures Reporting Method
Core Set of Objectives and Measures Must Meet All 15 Measures Stage 1 Objectives Stage 1 Measures Reporting Method Use Computerized Provider Order Entry (CPOE) for medication orders directly entered by
More informationModified Stage 2 Meaningful Use Measures 2015-2017
Modified Stage 2 Meaningful Use s 2015-2017 Objective 1: Protect Electronic Health Information NONE Conduct or review a security risk analysis in accordance with the requirements in 45 CFR 164.308(a)(1)
More informationEHR Incentive Program Focus on Stage One Meaningful Use. Kim Davis-Allen, Outreach Coordinator Kim.davis@ahca.myflorida.com October 16, 2014
EHR Incentive Program Focus on Stage One Meaningful Use Kim Davis-Allen, Outreach Coordinator Kim.davis@ahca.myflorida.com October 16, 2014 Checklist Participation Explanation Program Updates Stage One
More informationMedicare & Medicaid EHR Incentive Programs- Past, Present, & Future. Travis Broome, Centers for Medicare & Medicaid Services 12/18/2012
Medicare & Medicaid EHR Incentive Programs- Past, Present, & Future Travis Broome, Centers for Medicare & Medicaid Services 12/18/2012 Medicare-only Eligible Professionals Medicaid-only Eligible Professionals
More informationAgenda. What is Meaningful Use? Stage 2 - Meaningful Use Core Set. Stage 2 - Menu Set. Clinical Quality Measures (CQM) Clinical Considerations
AQAF Health Information Technology Forum Meaningful Use Stage 2 Clinical Considerations Marla Clinkscales & Mike Bice Alabama Regional Extension Center (ALREC) August 13, 2013 0 Agenda What is Meaningful
More informationREQUIREMENTS GUIDE: How to Qualify for EHR Stimulus Funds under ARRA
REQUIREMENTS GUIDE: How to Qualify for EHR Stimulus Funds under ARRA Meaningful Use & Certified EHR Technology The American Recovery and Reinvestment Act (ARRA) set aside nearly $20 billion in incentive
More informationIncentives to Accelerate EHR Adoption
Incentives to Accelerate EHR Adoption The passage of the American Recovery and Reinvestment Act (ARRA) of 2009 provides incentives for eligible professionals (EPs) to adopt and use electronic health records
More informationMeaningful Use. Goals and Principles
Meaningful Use Goals and Principles 1 HISTORY OF MEANINGFUL USE American Recovery and Reinvestment Act, 2009 Two Programs Medicare Medicaid 3 Stages 2 ULTIMATE GOAL Enhance the quality of patient care
More informationMeaningful Use. Medicare and Medicaid EHR Incentive Programs
Meaningful Use Medicare and Medicaid Table of Contents What is Meaningful Use?... 1 Table 1: Patient Benefits... 2 What is an EP?... 4 How are Registration and Attestation Being Handled?... 5 What are
More informationWebinar #1 Meaningful Use: Stage 1 & 2 Comparison CPS 12 & UDS 2013
New York State-Health Centered Controlled Network (NYS HCCN) Webinar #1 Meaningful Use: Stage 1 & 2 Comparison CPS 12 & UDS 2013 January 31, 2014 Ekem Merchant-Bleiberg, Director of Implementation Services
More informationStage 2 Meaningful Use What the Future Holds. Lindsey Wiley, MHA HIT Manager Oklahoma Foundation for Medical Quality
Stage 2 Meaningful Use What the Future Holds Lindsey Wiley, MHA HIT Manager Oklahoma Foundation for Medical Quality An Important Reminder For audio, you must use your phone: Step 1: Call (866) 906-0123.
More informationStage 2 Final Rule Overview: Updates to Stage 1 and New Stage 2 Requirements
Stage 2 Final Rule Overview: Updates to Stage 1 and New Stage 2 Requirements The Centers for Medicare and Medicaid Services (CMS) issued the Stage 2 Final Rule on September 4, 2012. The Stage 2 Final Rule
More informationMeaningful Use Stage 2. Presenter: Linda Wise, EMR Training Specialist
Meaningful Use Stage 2 Presenter: Linda Wise, EMR Training Specialist 1 AGENDA 2 Agenda Meaningful Use in Review Moving Into Stage 2 Meaningful Use Learning the Requirements Understanding the Measures
More informationSummary of the Final Rule for Meaningful Use for 2015 and 2016. Meaningful Use Objectives for 2015 and 2016
Image Research, LLC Christopher B. Sullivan, Ph.D. 2901 Quail Rise Court, Tallahassee, FL 32309 Summary of the Final Rule for Meaningful Use for 2015 and 2016 DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers
More informationEligible Professionals (EPs) Purdue Research Foundation
Understanding STAGE 2 Meaningful Use and the Incentive Program Eligible Professionals (EPs) About Incentives Eligible Professionals report during a calendar year Eligible Professionals can only attest
More informationInteroperability Testing and Certification. Lisa Carnahan Computer Scientist Standards Coordination Office
Interoperability Testing and Certification Lisa Carnahan Computer Scientist Standards Coordination Office Discussion Topics US National Institute of Standards & Technology American Recovery & Reinvestment
More informationAgenda. Overview of Stage 2 Final Rule Impact to Program
Electronic Health Record (EHR) Incentive Payment Program Review of Meaningful Use Stage 2 Regulation Changes and Other Impacts to the Medicaid EHR Incentive Program for 2014 that combines the effective
More information