ONC HIT Certification Program
Test Results Summary for 2014 Edition EHR Certification
Version EHR-Test-144, Rev 01-Apr-2015

Part 1: Product and Developer Information

1.1 Certified Product Information
Product Name: Uprise
Product Version: 1.3
Domain: Ambulatory
Test Type: Complete EHR

1.2 Developer/Vendor Information
Developer/Vendor Name: VisionWeb
Address: 6500 River Place Blvd., Bldg 3, Suite 100, Austin, TX 78730
Website: www.visionweb.com
Email: ksolorzano@visionweb.com
Phone: 512-241-8500
Developer/Vendor Contact: Keyea Solorzano

Page 1 of 11
Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information
ONC-ACB Name: Drummond Group
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: ehr@drummondgroup.com
Phone: 817-294-7339
ONC-ACB Contact: Bill Smith

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:
ONC-ACB Authorized Representative: Bill Smith
Function/Title: Certification Body Manager
Signature and Date: 4/8/2015

2.2 Gap Certification
The following identifies criterion or criteria certified via gap certification (170.314):
(a)(1), (a)(6), (a)(7), (a)(17), (b)(5)*, (d)(1), (d)(5), (d)(6), (d)(8), (d)(9), (f)(1)
*Gap certification allowed for Inpatient setting only
x No gap certification
2.3 Inherited Certification
The following identifies criterion or criteria certified via inherited certification (170.314). An "x" marks each criterion so certified:
x (a)(1)   x (a)(2)   x (a)(3)   x (a)(4)   x (a)(5)   x (a)(6)   x (a)(7)   x (a)(8)
x (a)(9)   x (a)(10)  x (a)(11)  x (a)(12)  x (a)(13)  x (a)(14)  x (a)(15)
(a)(16) Inpt. only   (a)(17) Inpt. only
x (b)(1)   x (b)(2)   x (b)(3)   x (b)(4)   x (b)(5)   (b)(6) Inpt. only   x (b)(7)
x (c)(1)   x (c)(2)   x (c)(3)
x (d)(1)   x (d)(2)   x (d)(3)   x (d)(4)   x (d)(5)   x (d)(6)   x (d)(7)   x (d)(8)
(d)(9) Optional
x (e)(1)   x (e)(2) Amb. only   x (e)(3) Amb. only
x (f)(1)   x (f)(2)   x (f)(3)   (f)(4) Inpt. only   (f)(5) Optional & Amb. only   (f)(6) Optional & Amb. only
(g)(1)   x (g)(2)   x (g)(3)   x (g)(4)
No inherited certification
Part 3: NVLAP-Accredited Testing Laboratory Information

Report Number: TEB-04062015-2243
Test Date(s): 9/18/2013, 9/19/2013, 9/26/2013, 9/27/2013, 7/15/2014, 8/7/2014, 11/6/2014, 11/18/2014

3.1 NVLAP-Accredited Testing Laboratory Information
ATL Name: Drummond Group EHR Test Lab
Accreditation Number: NVLAP Lab Code 200979-0
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: ehr@drummondgroup.com
Phone: 512-335-5606
ATL Contact: Beth Morrow

For more information on the scope of accreditation, please reference NVLAP Lab Code 200979-0.

Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:
ATL Authorized Representative: Timothy Bennett
Function/Title: Test Proctor
Signature and Date: 4/8/2015
Location Where Test Conducted: Nashville, TN

3.2 Test Information

3.2.1 Additional Software Relied Upon for Certification
Additional Software | Applicable Criteria | Functionality Provided by Additional Software
NewCropRx | a.1, a.2, a.6, a.8, a.10, a.15, b.3 | medication-related functions
Eyemaginations | a.15 | patient education resources
Secure Exchange Solutions | b.1, b.2, e.1 | direct messaging functions
No additional software required
3.2.2 Test Tools
An "x" marks each test tool used:
x Cypress, version 2.4.1
x eprescribing Validation Tool, version 1.0.3
HL7 CDA Cancer Registry Reporting Validation Tool, version 1.0.3
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool, version 1.8
x HL7 v2 Immunization Information System (IIS) Reporting Validation Tool, version 1.6
x HL7 v2 Laboratory Results Interface (LRI) Validation Tool, version 1.6
x HL7 v2 Syndromic Surveillance Reporting Validation Tool, version 1.6
x Transport Testing Tool, version 179
x Direct Certificate Discovery Tool, version 3.0.2
No test tools required

3.2.3 Test Data
Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter]
No alteration (customization) to the test data was necessary

3.2.4 Standards

3.2.4.1 Multiple Standards Permitted
The following identifies the standard(s) that have been successfully tested where more than one standard is permitted. An "x" marks the standard successfully tested:

Criterion (a)(8)(ii)(a)(2):
  170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
  170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

Criterion (a)(13):
  x 170.207(a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release
  170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree
Criterion (a)(15)(i):
  x 170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
  170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

Criterion (a)(16)(ii):
  170.210(g) Network Time Protocol Version 3 (RFC 1305)
  170.210(g) Network Time Protocol Version 4 (RFC 5905)

Criterion (b)(2)(i)(a):
  170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
  x 170.207(a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release

Criterion (b)(7)(i):
  170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
  x 170.207(a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release

Criterion (e)(1)(i):
  Annex A of the FIPS Publication 140-2 [encryption and hashing algorithms]: AES-128, SHA-1

Criterion (e)(1)(ii)(a)(2):
  x 170.210(g) Network Time Protocol Version 3 (RFC 1305)
  170.210(g) Network Time Protocol Version 4 (RFC 5905)

Criterion (e)(3)(ii):
  Annex A of the FIPS Publication 140-2 [encryption and hashing algorithms]: AES-128, SHA-1

Common MU Data Set (15):
  170.207(a)(3) IHTSDO SNOMED CT International Release July 2012 and US Extension to SNOMED CT March 2012 Release
  x 170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)

None of the criteria and corresponding standards listed above are applicable

3.2.4.2 Newer Versions of Standards
The following identifies the newer version of a minimum standard(s) that has been successfully tested:
Newer Version | Applicable Criteria
No newer version of a minimum standard was tested

3.2.5 Optional Functionality
Criterion # | Optional Functionality Successfully Tested
(a)(4)(iii) | Plot and display growth charts
(b)(1)(i)(b) | Receive summary care record using the standards specified at 170.202(a) and (b) (Direct and XDM Validation)
(b)(1)(i)(c) | Receive summary care record using the standards specified at 170.202(b) and (c) (SOAP Protocols)
(b)(2)(ii)(b) | Transmit health information to a Third Party using the standards specified at 170.202(a) and (b) (Direct and XDM Validation)
(b)(2)(ii)(c) | Transmit health information to a Third Party using the standards specified at 170.202(b) and (c) (SOAP Protocols)
(f)(3) Ambulatory setting only | Create syndrome-based public health surveillance information for transmission using the standard specified at 170.205(d)(3) (urgent care visit scenario)
Common MU Data Set (15) | Express Procedures according to the standard specified at 170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
Common MU Data Set (15) | Express Procedures according to the standard specified at 170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)
x No optional functionality tested
3.2.6 2014 Edition Certification Criteria* Successfully Tested

Criteria # | TP** | TD***
(a)(1) | 1.2 | 1.5
(a)(2) | 1.2 |
(a)(3) | 1.2 | 1.4
(a)(4) | 1.4 | 1.3
(a)(5) | 1.4 | 1.3
(a)(6) | 1.3 | 1.4
(a)(7) | 1.3 | 1.3
(a)(8) | 1.2 |
(a)(9) | 1.3 | 1.3
(a)(10) | 1.2 | 1.4
(a)(11) | 1.3 |
(a)(12) | 1.3 |
(a)(13) | 1.2 |
(a)(14) | 1.2 |
(a)(15) | 1.5 |
(a)(16) Inpt. only | 1.3 | 1.2
(a)(17) Inpt. only | 1.2 |
(b)(1) | 1.7 | 1.4
(b)(2) | 1.4 | 1.6
(b)(3) | 1.4 | 1.2
(b)(4) | 1.3 | 1.4
(b)(5) | 1.4 | 1.7
(b)(6) Inpt. only | 1.3 | 1.7
(b)(7) | 1.4 | 1.6
(c)(1) | 1.6 | 1.6
(c)(2) | 1.6 | 1.6
(c)(3) | 1.6 | 1.6
(d)(1) | 1.2 |
(d)(2) | 1.5 |
(d)(3) | 1.3 |
(d)(4) | 1.3 |
(d)(5) | 1.2 |
(d)(6) | 1.2 |
(d)(7) | 1.2 |
(d)(8) | 1.2 |
(d)(9) Optional | 1.2 |
(e)(1) | 1.8 | 1.5
(e)(2) Amb. only | 1.2 | 1.6
(e)(3) Amb. only | 1.3 |
(f)(1) | 1.2 | 1.2
(f)(2) | 1.3 | 1.7.1
(f)(3) | 1.3 | 1.7
(f)(4) Inpt. only | 1.3 | 1.7
(f)(5) Optional & Amb. only | 1.2 | 1.2
(f)(6) Optional & Amb. only | 1.3 | 1.0.3
(g)(1) | 1.7 | 1.9
(g)(2) | 1.7 | 1.9
(g)(3) | 1.3 |
(g)(4) | 1.2 |

x No criteria tested

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)
3.2.7 2014 Clinical Quality Measures*

Type of Clinical Quality Measures Successfully Tested:
x Ambulatory
Inpatient
No CQMs tested

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)

Ambulatory CQMs successfully tested (CMS ID and version):
CMS 22 v2, CMS 50 v2, CMS 68 v3, CMS 69 v2, CMS 122 v2, CMS 131 v2, CMS 133 v2, CMS 138 v2, CMS 142 v2, CMS 143 v2, CMS 155 v2, CMS 156 v2, CMS 165 v2, CMS 167 v2

Ambulatory CQMs listed on the form but not tested (CMS ID):
2, 52, 56, 61, 62, 64, 65, 66, 74, 75, 77, 82, 90, 117, 123, 124, 125, 126, 127, 128, 129, 130, 132, 134, 135, 136, 137, 139, 140, 141, 144, 145, 146, 147, 148, 149, 153, 154, 157, 158, 159, 160, 161, 163, 164, 166, 169, 177, 179, 182

Inpatient CQMs (none tested; CMS ID):
9, 26, 30, 31, 32, 53, 55, 60, 71, 72, 73, 91, 100, 102, 104, 105, 107, 108, 109, 110, 111, 113, 114, 171, 172, 178, 185, 188, 190
3.2.8 Automated Numerator Recording and Measure Calculation

3.2.8.1 Automated Numerator Recording
Criteria listed on the form: (a)(1), (a)(3) through (a)(7), (a)(9), (a)(11) through (a)(17), (b)(2) through (b)(6), (e)(1) through (e)(3). None were selected.
x Automated Numerator Recording was not tested

3.2.8.2 Automated Measure Calculation
Automated Measure Calculation successfully tested for:
x (a)(1), x (a)(3), x (a)(4), x (a)(5), x (a)(6), x (a)(7), x (a)(9), x (a)(11), x (a)(12), x (a)(13), x (a)(14), x (a)(15), x (b)(2), x (b)(3), x (b)(4), x (b)(5), x (e)(1), x (e)(2), x (e)(3)
Not selected: (a)(16), (a)(17), (b)(6)

3.2.9 Attestation
Attestation Forms (as applicable) | Appendix
x Safety-Enhanced Design* | A
x Quality Management System** | B
x Privacy and Security | C

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product

3.3 Appendices
Attached below.
Test Results Summary Change History
Test Report ID | Description of Change | Date

2014 Edition Test Report Summary
EHR Usability Test Report of Uprise
Uprise Version 1.0, Release 1.0

Test methodology based on NISTIR 7741 (November 2010), Process Approach for Improving the Usability of EHRs
Test report based on NISTIR 7742, Customized Common Industry Format Template for EHR Usability Testing

Date of Usability Test: 09/23/2013
Date of Report: 09/24/2013
Report Prepared By: Tanya Bateman
EHR Usability Test, Uprise Version 1.0

Table of Contents
1. EXECUTIVE SUMMARY ... 3
2. INTRODUCTION ... 4
3. METHOD ... 4
   3.1 PARTICIPANTS ... 4
   3.2 STUDY DESIGN ... 4
   3.3 TASKS ... 5
   3.4 TEST LOCATION AND ENVIRONMENT ... 6
   3.5 USABILITY METRICS ... 6
   3.6 DATA SCORING ... 7
4. RESULTS ... 8
   4.1 DATA REPORTING ... 8
   4.2 DISCUSSION OF THE FINDINGS ... 11
5. APPENDICES ... 12
   APPENDIX 1: PARTICIPANT DEMOGRAPHICS ... 12
   APPENDIX 2: NON-DISCLOSURE AGREEMENT ... 13
   APPENDIX 3: INFORMED CONSENT FORM ... 14
   APPENDIX 4: FINAL QUESTIONS ... 15
   APPENDIX 5: SYSTEM USABILITY SCALE QUESTIONNAIRE ... 16

Page 2 of 16
1. EXECUTIVE SUMMARY

On 9/23/2013, a usability test of Uprise Version 1.0, Release 1.0 was conducted by VitalHealth Software via online one-on-one GoToMeeting sessions with test participants. The purpose of this test was to verify and validate the usability of the current user interface of the EHR Under Test (EHRUT) and to provide evidence of usability.

For this usability test, four optometrists serving as provider users and/or admin users, and matching the target demographic criteria, served as participants. They used the EHRUT in simulated but representative tasks. The participants completed a series of usability tests using local browser sessions via logins to the cloud-based EHR. The administrator did not give the participants assistance in how to complete the tasks.

The following types of data were collected for each participant:
- Path deviations from the optimal workflow
- Number and types of errors encountered
- Participant's satisfaction ratings of the system
- Suggested necessary improvements

Participant data was de-identified so that no correspondence could be made from the identity of the participants to the data collected. Following the conclusion of the testing, the participants were asked to complete a post-test questionnaire. Based on performance with these tasks, the participants rated their overall subjective satisfaction with the EHRUT between "Easy" and "Very Easy".
2. INTRODUCTION

The EHRUT for this study was Uprise Version 1.0. Designed to record and present medical information to eye care providers in optometry practice settings, the EHRUT consists of modules to document patient visits, including full medical history and current status, eye exams, diagnosis, treatment recommendations and actions, patient education, and coding of the visit. The usability testing attempted to represent realistic exercises and conditions. The purpose of this study was to test and validate the usability of the current user interface and to provide evidence of usability in the EHRUT. To this end, it recorded the end users' subjective opinions on the ease of use of the EHRUT in supporting the various tasks required for documenting patient visits.

3. METHOD

3.1 PARTICIPANTS

For this test, four participants matching the target demographic criteria participated in testing the EHRUT. Three of the participants were optometrists/providers who also served the system administrator role for their practice. One participant was an optometrist who currently serves a system administrator function only. All participants had served as subject matter expert consultants contributing to the requirements definition during the development of the EHRUT. Participants were not employees of the testing or supplier organization. A demographics profile of the participants is provided in Appendix 1.

Participants were greeted by the administrator and asked to review and sign an informed consent/release form (examples included in Appendices 2 and 3). They were instructed that they could withdraw at any time. Participants had prior experience with the Uprise EHR. The administrator introduced the test and instructed participants to complete a series of tasks using the EHRUT. The participants completed the tests using local browser sessions via logins to the cloud-based EHR.
During the testing, the administrator timed the tests and recorded user performance data. The administrator did not give the participants assistance in how to complete the tasks.

3.2 STUDY DESIGN

The User Centered Design process followed for development of the Uprise EHR is based on the guidelines detailed in NIST publication NISTIR 7741 (November 2010). Consequently, the study design and testing process are based on the template detailed in NIST publication NISTIR 7742. Overall, the objective of this test was to uncover areas where the application performed well (that is, effectively, efficiently, and with satisfaction) and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or for comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a means to identify areas where improvements are needed.
3.3 TASKS

The tasks performed by the participants represent a subset of the typical daily tasks that would be performed under normal use of the EHRUT, and included:

170.314(a)(1) Computerized provider order entry
a) Electronically Record Orders in an Ambulatory Setting
b) Electronically Change Orders in an Ambulatory Setting
c) Electronically Access Orders in an Ambulatory Setting

170.314(a)(2) Drug-drug, drug-allergy interaction checks
a) Automatically Generate and Electronically Indicate Drug-drug and Drug-allergy Interventions
b) Adjust Severity Level of Interventions Indicated for Drug-drug Interaction Checking

170.314(a)(6) Medication list
a) Electronically Record Patient Medication List in Ambulatory Setting
b) Electronically Change Patient Medication List in Ambulatory Setting
c) Electronically Access Patient Medication List in Ambulatory Setting

170.314(a)(7) Medication allergy list
a) Electronically Record Patient Medication Allergy List and Medication Allergy List History in Ambulatory Setting
b) Electronically Change Patient Medication Allergy List and Medication Allergy List History in Ambulatory Setting
c) Electronically Access Patient Medication Allergy List and Medication Allergy List History in Ambulatory Setting

170.314(a)(8) Clinical decision support
a) Select/Activate Clinical Decision Support Interventions for Problem List
b) Select/Activate Clinical Decision Support Interventions for Medication List
c) Select/Activate Clinical Decision Support Interventions for Medication Allergy List
d) Select/Activate Clinical Decision Support Interventions for Demographics
e) Select/Activate Clinical Decision Support Interventions for Lab Tests and Results
f) Select/Activate Clinical Decision Support Interventions for Vital Signs
g) Select/Activate Clinical Decision Support Interventions for Demographics and Vital Signs
h) Trigger Clinical Decision Support Interventions for Problem List
i) Trigger Clinical Decision Support Interventions for Medication List
j) Trigger Clinical Decision Support Interventions for Medication Allergy List
k) Trigger Clinical Decision Support Interventions for Demographics
l) Trigger Clinical Decision Support Interventions for Lab Tests and Results
m) Trigger Clinical Decision Support Interventions for Vital Signs
n) Trigger Clinical Decision Support Interventions for a Combination of Demographics and Vital Signs
o) Identify Diagnostic and Therapeutic Reference Resources
p) Configure Clinical Decision Support Interventions and Diagnostic and Therapeutic Reference Resources
q) Review Attributes
r) Trigger Clinical Decision Support Interventions for Incorporated CCDA Problems, Medications, and Medication Allergies
s) Trigger Clinical Decision Support for Incoming Lab Results

170.314(b)(3) Electronic prescribing
a) Electronically Create Prescriptions

170.314(b)(4) Clinical information reconciliation
a) Electronically and Simultaneously Display Medication List Data in a Single View
b) Create a Single Reconciled Medication List
c) Review, Validate, Confirm, and Submit the Final Reconciled Medication List
d) Electronically and Simultaneously Display Problem List Data in a Single View
e) Create a Single Reconciled Problem List
f) Review, Validate, Confirm, and Submit the Final Reconciled Problem List
g) Electronically and Simultaneously Display Medication Allergy List Data in a Single View
h) Create a Single Reconciled Medication Allergy List
i) Review, Validate, Confirm, and Submit the Final Reconciled Medication Allergy List

3.4 TEST LOCATION AND ENVIRONMENT

The test was performed in a virtual setting, using one-on-one online GoToMeeting sessions. The participants joined the sessions from their respective home or work locations via a personal computer and browser. They were given access to the EHRUT via a dedicated and secure login to the cloud-based EHR.

3.5 USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction.
To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:
- Effectiveness of the EHRUT by measuring participant success rates and errors
- Efficiency of the EHRUT by measuring the average task time and path deviations
- Satisfaction with the EHRUT by measuring ease of use ratings
3.6 DATA SCORING

The following details how tasks were evaluated and scored.

Risk: A rating of the level of risk associated with the task, in terms of potential negative impact on the quality of care. Possible values are None, Low, Medium, High.

User(s): Indicates whether a user having the role Optometrist or Admin performed the task, or whether users of Both roles performed the task.

Path Deviation: The participant's path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control, as compared to the optimal path.

Task Errors: If the participant abandoned the task, did not reach the correct answer, or performed it incorrectly, the task was counted as a Failure. No task times were taken into account for errors.

Task Rating: The participant's subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants.
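As an illustration of the scoring rules above, the post-task ratings are averaged across participants while failed attempts are counted separately and contribute no rating. The sketch below is hypothetical (the function name and sample data are not from the study tooling), assuming the 1-to-5 rating scale described in this section:

```python
# Hypothetical sketch of the scoring described above: post-task ratings on
# the 1 (Very Difficult) to 5 (Very Easy) scale are averaged across
# participants; abandoned or incorrect attempts are tallied as Failures and
# do not contribute a rating (and no task times are counted for errors).

def score_task(ratings, failure_count):
    """Return (mean rating or None, failure count) for a single task."""
    mean = sum(ratings) / len(ratings) if ratings else None
    return mean, failure_count

# Example: four participants rated a task 5, 4, 5, 4 with no failures.
mean_rating, failures = score_task([5, 4, 5, 4], failure_count=0)
```

With these sample inputs, the mean rating is 4.5 with zero failures; a task that every participant failed would carry no mean rating at all, which is why ratings and failures are reported as separate columns.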
4. RESULTS

4.1 DATA REPORTING

The following tasks were performed as a realistic representation of the kinds of activities a user might do with the EHRUT, along with a task rating.

Task Ratings: 1 = Very Easy, 2 = Easy, 3 = Neither Easy Nor Difficult, 4 = Difficult, 5 = Very Difficult
Risk Ratings: Low, Medium, High
User (Participant): 1 = Optometrist, 2 = Administrator, 3 = Both

Task | Risk | User(s) | Path Deviations | Errors | User (Participant) | User Suggestions
Record Laboratory Order | Low | Both | None | None | 1,3 | None
Change Laboratory Order | Low | Both | None | None | 1,3 | None
Access Laboratory Order | Low | Both | None | None | 1,3 | None
Record Medication Order | High | Both | Users did not initially realize that orders needed to be expressly moved from the pending to the active list. | None | 1,3 | Workflow requires a little trial-and-error learning curve, but is effective once the process is learned.
Change Medication Order | High | Both | None | None | 1,3 | None
Access Medication Order | Low | Both | None | None | 1,3 | None
Record Imaging Order | Low | Both | None | None | 1,3 | None
Change Imaging Order | Low | Both | None | None | 1,3 | None
Access Imaging Order | Low | Both | None | None | 1,3 | None
Generate and Electronically Indicate Drug-drug and Drug-allergy Interventions | High | Both | None | None | 1,3 | None
Adjust Severity Level of Interventions Indicated for Drug-drug Interaction Checking | Medium | Admin | None | None | 2,3 | None
Record Medication List | Medium | Both | None | None | 1,3 | Unify user interface to match the rest of the EHR.
Change Medication List | Medium | Both | None | None | 1,3 | None
Access Medication List | Low | Both | None | None | 1,3 | None
Record Medication Allergy List and History | Medium | Both | None | None | 1,3 | None
Change Medication Allergy List and History | Medium | Both | None | None | 1,3 | None
Access Medication Allergy List and History | Low | Both | None | None | 1,3 | None
Select/Activate Clinical Decision Support Interventions for Problem List | Low | Admin | None | None | 2,3 | Make this part of physician configuration capabilities instead of it being a system administration function.
Task | Risk | User(s) | Path Deviations | Errors | User (Participant) | User Suggestions
Select/Activate Clinical Decision Support Interventions for Medication List | Low | Admin | None | None | 2,3 | Make this part of physician configuration capabilities instead of it being a system administration function.
Select/Activate Clinical Decision Support Interventions for Medication Allergy List | Low | Admin | None | None | 2,3 | Make this part of physician configuration capabilities instead of it being a system administration function.
Select/Activate Clinical Decision Support Interventions for Demographics | Low | Admin | None | None | 2,3 | Make this part of physician configuration capabilities instead of it being a system administration function.
Select/Activate Clinical Decision Support Interventions for Lab Test and Results | Low | Admin | None | None | 2,3 | Make this part of physician configuration capabilities instead of it being a system administration function.
Select/Activate Clinical Decision Support Interventions for Vital Signs | Low | Admin | None | None | 2,3 | Make this part of physician configuration capabilities instead of it being a system administration function.
Select/Activate Clinical Decision Support Interventions for Vital Signs and Demographics | Low | Admin | None | None | 2,3 | Make this part of physician configuration capabilities instead of it being a system administration function.
Trigger Clinical Decision Support Interventions for Problem List | High | Both | None | None | 1,3 | None
Trigger Clinical Decision Support Interventions for Medication List | High | Both | None | None | 1,3 | None
Trigger Clinical Decision Support Interventions for Medication Allergy List | High | Both | None | None | 1,3 | None
Trigger Clinical Decision Support Interventions for Demographics | Medium | Both | None | None | 1,3 | None
Trigger Clinical Decision Support Interventions for Lab Test and Results | Medium | Both | None | None | 1,3 | None
Trigger Clinical Decision Support Interventions for Vital Signs | High | Both | None | None | 1,3 | None
Trigger Clinical Decision Support Interventions for Vital Signs and Demographics | High | Both | None | None | 1,3 | None
Task | Risk | User(s) | Path Deviations | Errors | User (Participant) | User Suggestions
Identify Diagnostic and Therapeutic Reference Resources | Low | Both | None | None | 2,3 | None
Configure Clinical Decision Support Interventions and Diagnostic and Therapeutic Reference Resources | Medium | Admin | None | None | 2,3 | Make this part of physician configuration capabilities instead of it being a system administration function.
Review Attributes | Medium | Both | None | None | 1,3 | None
Trigger Clinical Decision Support Interventions for CCDA | High | Both | None | None | 2,3 | None
Electronically Create Prescriptions | High | Both | None | None | 1,3 | Unify user interface to match the rest of the EHR.
Electronically and Simultaneously Display Medication List Data in a Single View | Medium | Both | None | None | 1,3 | None
Create a Single Reconciled Medication List | High | Both | None | None | 1,3 | Unify user interface to match the rest of the EHR.
Review, Validate, Confirm, and Submit the Final Reconciled Medication List | High | Both | None | None | 1,3 | None
Electronically and Simultaneously Display Problem List Data in a Single View | Medium | Both | None | None | 1,3 | None
Create a Single Reconciled Problem List | Medium | Both | None | None | 1,3 | None
Review, Validate, Confirm, and Submit the Final Reconciled Problem List | High | Both | None | None | 1,3 | None
Electronically and Simultaneously Display Medication Allergy List Data in a Single View | Medium | Both | None | None | 1,3 | None
Create a Single Reconciled Medication Allergy List | High | Both | None | None | 1,3 | None
Review, Validate, Confirm, and Submit the Final Reconciled Medication Allergy List | High | Both | None | None | 1,3 | None
4.2 DISCUSSION OF THE FINDINGS

EFFECTIVENESS AND EFFICIENCY
Participants did not significantly deviate from optimal workflow paths. The user interface for electronic prescribing and recording of allergies took a little more effort to understand and to reach a level of end-user proficiency, but was found to be effective once the learning curve was completed.

SATISFACTION
Given the subjective comments on effectiveness and efficiency recorded above, the participants recorded an overall score between "Easy" and "Very Easy".

AREAS FOR IMPROVEMENT
The participants would prefer that the user interface styling of the electronic prescribing and allergy recording functions more closely match the rest of the UI style of the EHRUT.
5. APPENDICES

APPENDIX 1: PARTICIPANT DEMOGRAPHICS

Following is a high-level overview of de-identified information about the participants in this study.

Gender:
  Men [3]
  Women [1]
  Total participants [4]

Occupation/Role:
  LPN [0]
  Optometrist [4]
  Admin Staff [4]

Years of Experience:
  All paper [0]
  Some paper, some electronic [25]
  All electronic []
APPENDIX 2: NON-DISCLOSURE AGREEMENT

THIS AGREEMENT is entered into as of (date), between (the "Participant") and the testing organization, VitalHealth Software, located at 80 South 8th Street, Suite 900, Minneapolis, MN 55402.

The Participant acknowledges that his or her voluntary participation in today's usability study may bring the Participant into possession of Confidential Information. The term "Confidential Information" means all technical and commercial information of a proprietary or confidential nature which is disclosed by Test Company, or otherwise acquired by the Participant, in the course of today's study. By way of illustration, but not limitation, Confidential Information includes trade secrets, processes, formulae, data, know-how, products, designs, drawings, computer-aided design files and other computer files, computer software, ideas, improvements, inventions, training methods and materials, marketing techniques, plans, strategies, budgets, financial information, or forecasts.

Any information the Participant acquires relating to this product during this study is confidential and proprietary to Test Company and is being disclosed solely for the purposes of the Participant's participation in today's usability study. By signing this form, the Participant acknowledges that s/he will receive monetary compensation for feedback and will not disclose this confidential information obtained today to anyone else or any other organizations.

Participant's printed name:
Signature:
Date:
APPENDIX 3: INFORMED CONSENT FORM

VitalHealth Software would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last about 120 minutes.

Agreement
I understand and agree that, as a voluntary participant in the present study conducted by VitalHealth Software, I am free to withdraw consent or discontinue participation at any time. I understand and agree to participate in the study conducted by VitalHealth Software. I understand and agree that the purpose of this study is to make software applications more useful and usable in the future. I understand and agree that the data collected from this study may be shared outside of VitalHealth Software. I understand and agree that data confidentiality is assured, because only de-identified data (i.e., identification numbers, not names) will be used in analysis and reporting of the results. I agree to immediately raise any concerns or areas of discomfort with the study administrator. I understand that I can discontinue participation at any time.

Please check one of the following:
  YES, I have read the above statement and agree to be a participant.
  NO, I choose not to participate in this study.

Signature:
Date:
APPENDIX 4: FINAL QUESTIONS

The questionnaire below lists the responses of one of the study participants.

What was your overall impression of this system?
Easy to use, follows a practical clinical workflow.

What aspects of the system did you like most?
Annotation tool, clinical decision support.

What aspects of the system did you like least?
Medication entry required a slight learning curve.

Were there any features that you were surprised to see?
Ease of use of Clinical Decision Support was nice to see.

What features did you expect to encounter but did not see? That is, is there anything that is missing in this application?
Not that I could determine.

Compare this system to other systems you have used. Would you recommend this system to your colleagues?
Yes.

Page 15 of 16
APPENDIX 5: SYSTEM USABILITY SCALE QUESTIONNAIRE

The questionnaire below lists the responses of one of the study participants.

1. I think that I would like to use this system frequently. Answer: Strongly agree
2. I found the system unnecessarily complex. Answer: Strongly disagree
3. I thought the system was easy to use. Answer: Agree
4. I think that I would need the support of a technical person to be able to use this system. Answer: Strongly disagree
5. I found the various functions in this system were well integrated. Answer: Agree
6. I thought there was too much inconsistency in this system. Answer: Disagree
7. I would imagine that most people would learn to use this system very quickly. Answer: Agree
8. I found the system very cumbersome to use. Answer: Strongly disagree
9. I felt very confident using the system. Answer: Strongly agree
10. I needed to learn a lot of things before I could get going with this system. Answer: Disagree

Page 16 of 16
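The response set above can be converted to a single 0-100 score using the standard SUS scoring rule: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5. The Python sketch below applies that published convention to this participant's answers; the scoring formula is the generic SUS method, not something stated in this report.

```python
# Standard SUS scoring: odd items score (response - 1), even items score
# (5 - response); the ten contributions are summed and scaled by 2.5.

LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

# This participant's answers, in question order (from Appendix 5).
responses = [
    "Strongly agree",     # 1. would like to use frequently
    "Strongly disagree",  # 2. unnecessarily complex
    "Agree",              # 3. easy to use
    "Strongly disagree",  # 4. would need technical support
    "Agree",              # 5. functions well integrated
    "Disagree",           # 6. too much inconsistency
    "Agree",              # 7. most people would learn quickly
    "Strongly disagree",  # 8. cumbersome to use
    "Strongly agree",     # 9. felt confident
    "Disagree",           # 10. needed to learn a lot first
]

def sus_score(answers):
    """Return the 0-100 SUS score for a list of ten Likert responses."""
    total = 0
    for i, answer in enumerate(answers, start=1):
        value = LIKERT[answer]
        total += (value - 1) if i % 2 == 1 else (5 - value)
    return total * 2.5

print(sus_score(responses))  # 87.5
```

For this participant the calculation yields 87.5, well above the commonly cited SUS average of 68.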
VisionWeb's Quality Management System 170.314(g)(4) Quality Management System

Applicable Certification Criteria (Number: Name)

170.314(a)(3): Demographics
170.314(d)(1): Authentication, Access Control, and Authorization
170.314(d)(2): Auditable Events and Tamper Resistance
170.314(d)(5): Automatic Log-off
170.314(d)(6): Emergency Access
170.314(d)(7): End-User Device Encryption (N/A)
170.314(d)(8): Integrity
170.314(e)(1): View, Download, and Transmit to 3rd Party
170.314(e)(3): Secure Messaging
170.314(g)(4): Quality Management System

Development

A home-grown QMS is used. Salesforce.com is utilized by all departments to record software feedback from customers, subject-matter experts, and internal departments. Business analysts elicit requirements from users, subject-matter experts, and partners in order to create and distribute Business Requirement Documents (BRDs). The Project Management (PM) team reviews proposed features, determines priorities, estimates tasks, and determines the scope of future releases of the software. Development leads review documentation, resolve any open questions, and discuss any alternatives to proposed solutions with the business analysts. Viewpath.com is utilized by the PM team to create and track development tasks within a project. Team Foundation Server (TFS) is utilized by development leads to assign tasks to developers and to manage their progress and status.

Testing

A home-grown QMS is used. The Quality Assurance (QA) team creates test plan documentation based on software requirements specifications and development understanding documentation. TFS is utilized to record, assign, and track issues with the software. HP LoadRunner is utilized to test application performance. Burp Suite is utilized to test for cross-site scripting vulnerabilities and the overall security of the system. SoapUI Pro is utilized to test the web services used throughout the system. The QA team assigns issues to developers to update code, or to business analysts to clarify requirements.
Uprise by VisionWeb 1
Implementation

A home-grown QMS is used. The following environments are utilized throughout the implementation of the software:
o Developer: each developer has their own environment
o Development: developer code is combined into this environment
o Unit Test: used for developer testing
o Integration: test customer data deployed; integration testing
o Alpha: test customer data deployed; QA testing
o QA: used to test 3rd-party connectivity; end-user acceptance
o Pre-Prod: used for pre-go-live testing
o Production: final release of tested software

Maintenance

A home-grown QMS is used. The following types of releases are utilized throughout the maintenance of the software:
o Functional: planned and prioritized releases based on market analysis
o Emergency: resolution of high-priority/blocking issues

Updates to the software are performed outside of customer hours of operation, based on time zone and day of the week, when possible.

Uprise by VisionWeb 2
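The environment list in the Implementation section describes a linear promotion path from per-developer environments to Production. As a purely illustrative sketch (the environment names come from this document, but the pipeline model itself is an assumption, not VisionWeb's actual tooling), the ordering could be represented as:

```python
# Illustrative model of the promotion path described in the Implementation
# section; the pipeline structure is a hypothetical sketch, not VisionWeb's
# actual release tooling.

PIPELINE = [
    "Developer",     # each developer's own environment
    "Development",   # developer code combined
    "Unit Test",     # developer testing
    "Integration",   # test customer data; integration testing
    "Alpha",         # test customer data; QA testing
    "QA",            # 3rd-party connectivity; end-user acceptance
    "Pre-Prod",      # pre-go-live testing
    "Production",    # final release of tested software
]

def next_environment(current):
    """Return the environment a build is promoted to next, or None at Production."""
    index = PIPELINE.index(current)
    return PIPELINE[index + 1] if index + 1 < len(PIPELINE) else None

print(next_environment("Alpha"))       # QA
print(next_environment("Production"))  # None
```

Modeling the path as an ordered list makes the invariant explicit: every build reaches Production only after passing through each intermediate environment in sequence.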