
Test Results Summary for 2014 Edition EHR Certification 16‐2651‐R‐0004‐PRA V1.0, March 4, 2015  

©2016 InfoGard. May be reproduced only in its original entirety, without revision

2.2 Gap Certification 

The following identifies criterion or criteria certified via gap certification. 

§170.314 

  (a)(1)    (a)(19)  (d)(6)    (h)(1) 

  (a)(6)    (a)(20)  (d)(8)    (h)(2) 

  (a)(7)    (b)(5)*  (d)(9)   

  (a)(17)    (d)(1)  (f)(1)   

  (a)(18)    (d)(5)  (f)(7)*   

*Gap certification allowed for Inpatient setting only 

 No gap certification 

 

2.3 Inherited Certification 

The following identifies criterion or criteria certified via inherited certification. 

§170.314 

  (a)(1)    (a)(16) Inpt. only  (c)(2)    (f)(2) 

  (a)(2)    (a)(17) Inpt. only  (c)(3)    (f)(3) 

  (a)(3)    (a)(18)  (d)(1)    (f)(4) Inpt. only 

  (a)(4)    (a)(19)  (d)(2)    (f)(5) Optional & Amb. only

  (a)(5)    (a)(20)  (d)(3)    (f)(6) Optional & Amb. only

  (a)(6)    (b)(1)  (d)(4)    (f)(7) Amb. only 

  (a)(7)    (b)(2)  (d)(5)    (g)(1) 

  (a)(8)    (b)(3)  (d)(6)    (g)(2) 

  (a)(9)    (b)(4)  (d)(7)    (g)(3) 

  (a)(10)    (b)(5)  (d)(8)    (g)(4) 

  (a)(11)    (b)(6) Inpt. only  (d)(9) Optional    (h)(1) 

  (a)(12)    (b)(7)  (e)(1)    (h)(2) 

  (a)(13)    (b)(8)  (e)(2) Amb. only    (h)(3) 

  (a)(14)    (b)(9)  (e)(3) Amb. only     

  (a)(15)    (c)(1)  (f)(1)     

 No inherited certification 

 


3.2.2 Test Tools 

Test Tool  Version 

 Cypress  2.6.1 

 ePrescribing Validation Tool  1.0.6 

 HL7 CDA Cancer Registry Reporting Validation Tool    

 HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool  1.7.2 

 HL7 v2 Immunization Information System (IIS) Reporting Validation Tool  1.8.2 

 HL7 v2 Laboratory Results Interface (LRI) Validation Tool  1.7.2 

 HL7 v2 Syndromic Surveillance Reporting Validation Tool  1.7.2 

 Transport Testing Tool  182 

 Direct Certificate Discovery Tool  3.0.4 

 Edge Testing Tool   

 No test tools required   

 

3.2.3 Test Data 

 Alteration (customization) to the test data was necessary and is described in Appendix 

 No alteration (customization) to the test data was necessary 

 

3.2.4 Standards 

3.2.4.1 Multiple Standards Permitted 

The following identifies the standard(s) that has been successfully tested where more than one standard is permitted. 

Criterion #  Standard Successfully Tested 

(a)(8)(ii)(A)(2)    §170.204(b)(1) HL7 Version 3 Implementation Guide: URL‐Based Implementations of the Context‐Aware Information Retrieval (Infobutton) Domain 

  §170.204(b)(2) HL7 Version 3 Implementation Guide: Context‐Aware Knowledge Retrieval (Infobutton) Service‐Oriented Architecture Implementation Guide 

(a)(13)    §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release 

  §170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree 

(a)(15)(i)    §170.204(b)(1)  HL7 Version 3 Implementation Guide: URL‐Based Implementations of the Context‐Aware Information Retrieval (Infobutton) Domain 

  §170.204(b)(2) HL7 Version 3 Implementation Guide: Context‐Aware Knowledge Retrieval (Infobutton) Service‐Oriented Architecture Implementation Guide 

(a)(16)(ii)    §170.210(g)  Network Time Protocol Version 3 (RFC 1305)  

  §170.210(g) Network Time Protocol Version 4 (RFC 5905) 

(b)(2)(i)(A)    §170.207(i)  The code set specified at 45 CFR 162.1002(c)(2) (ICD‐10‐CM) for the indicated conditions  

  §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release 


Criterion #  Standard Successfully Tested 

(b)(7)(i)    §170.207(i)  The code set specified at 45 CFR 162.1002(c)(2) (ICD‐10‐CM) for the indicated conditions  

  §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release 

(b)(8)(i)    §170.207(i)  The code set specified at 45 CFR 162.1002(c)(2) (ICD‐10‐CM) for the indicated conditions  

  §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release 

(e)(1)(i)    Annex A of the FIPS Publication 140‐2 

SHA1; AES 

(e)(1)(ii)(A)(2)    §170.210(g)  Network Time Protocol Version 3 (RFC 1305)  

  §170.210(g) Network Time Protocol Version 4 (RFC 5905) 

(e)(3)(ii)    Annex A of the FIPS Publication 140‐2 

SHA1; AES 

Common  MU Data Set (15) 

  §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release 

  §170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT‐4) 

 None of the criteria and corresponding standards listed above are applicable 

 

3.2.4.2 Newer Versions of Standards  

The following identifies the newer version of a minimum standard(s) that has been successfully tested.  

Newer Version  Applicable Criteria 

   

 No newer version of a minimum standard was tested 

 

3.2.5 Optional Functionality 

Criterion #  Optional Functionality Successfully Tested 

(a)(4)(iii)   Plot and display growth charts 

(b)(1)(i)(B)   Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation) 

(b)(1)(i)(C)   Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols) 

(b)(2)(ii)(B)   Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation) 

(b)(2)(ii)(C)   Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols) 

(f)(3)   Ambulatory only – Create syndrome‐based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario) 

(f)(7)   Ambulatory only – transmission to public health agencies – syndromic surveillance ‐ Create Data Elements 

Common MU Data Set (15)  

 Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR162.1002(a)(4): Code on Dental Procedures and Nomenclature) 


Criterion #  Optional Functionality Successfully Tested 

Common MU Data Set (15) 

 Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR162.1002(c)(3): ICD‐10‐PCS) 

 No optional functionality tested 

 

3.2.6 2014 Edition Certification Criteria* Successfully Tested 

Criteria # Version 

Criteria # Version 

TP**  TD***  TP**  TD*** 

  (a)(1)    (c)(1)  1.12 2.6.1

  (a)(2)  1.2      (c)(2)  1.12 2.6.1

  (a)(3)  1.2 1.4   (c)(3)  1.12 2.6.1

  (a)(4)  1.4 1.3   (d)(1) 

  (a)(5)  1.4 1.3   (d)(2)  1.6   

  (a)(6)          (d)(3)  1.3   

  (a)(7)          (d)(4)  1.2   

  (a)(8)  1.3      (d)(5)       

  (a)(9)  1.3 1.3   (d)(6)       

  (a)(10)  1.2 1.4   (d)(7)  1.2   

  (a)(11)  1.3      (d)(8)       

  (a)(12)  1.3      (d)(9) Optional       

  (a)(13)  1.2      (e)(1)  1.11 1.5

  (a)(14)  1.2      (e)(2) Amb. only  1.2 1.6

  (a)(15)  1.5      (e)(3) Amb. only  1.3   

  (a)(16) Inpt. only    (f)(1) 

  (a)(17) Inpt. only    (f)(2)  1.3 1.3.0

  (a)(18)    (f)(3)  1.3 1.3.0

  (a)(19)    (f)(4) Inpt. only 

  (a)(20)    (f)(5) Optional & Amb. only 

  (b)(1)  1.7 1.4   (f)(6) Optional & Amb. only 

  (b)(2)  1.4 1.6   (f)(7) Amb. only 

  (b)(3)  1.4 1.2   (g)(1) 

  (b)(4)  1.3 1.4   (g)(2)  2.0 2.0

  (b)(5)  1.4 1.7.2   (g)(3)  1.4

  (b)(6) Inpt. only        (g)(4)  1.2

  (b)(7)  1.5 1.7   (h)(1) 

  (b)(8)    (h)(2) 

  (b)(9)    (h)(3) 

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method) 

**Indicates the version number for the Test Procedure (TP) 

***Indicates the version number for the Test Data (TD)

 


3.2.7 2014 Clinical Quality Measures* 

Type of Clinical Quality Measures Successfully Tested: 

  Ambulatory 

  Inpatient 

  No CQMs tested 

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures) 

Ambulatory CQMs 

CMS ID  Version  CMS ID  Version  CMS ID  Version  CMS ID  Version 

  2  v4    90  v4  136      155    

  22       117     137      156  v3 

  50  v3    122     138  v3    157    

  52       123     139      158    

  56       124  v3  140      159    

  61       125  v3  141      160    

  62       126     142      161    

  64       127     143      163    

  65       128     144      164    

  66       129     145      165  v3 

  68  v4    130  v3  146      166  v4 

  69  v3    131    147      167    

  74      132    148      169    

  75      133    149      177    

  77      134    153      179    

  82      135    154      182    

 

Inpatient CQMs 

CMS ID  Version  CMS ID  Version  CMS ID  Version  CMS ID  Version 

  9      71    107      172   

  26      72    108      178   

  30      73    109      185   

  31      91    110      188   

  32      100    111      190   

  53      102    113   

   55      104    114   

  60      105    171   


3.2.8 Automated Numerator Recording and Measure Calculation 

3.2.8.1 Automated Numerator Recording 

Automated Numerator Recording Successfully Tested 

  (a)(1)    (a)(11)  (a)(18)    (b)(6) 

  (a)(3)    (a)(12)  (a)(19)    (b)(8) 

  (a)(4)    (a)(13)  (a)(20)    (b)(9) 

  (a)(5)    (a)(14)  (b)(2)    (e)(1) 

  (a)(6)    (a)(15)  (b)(3)    (e)(2) 

  (a)(7)    (a)(16)  (b)(4)    (e)(3) 

  (a)(9)    (a)(17)  (b)(5)     

 Automated Numerator Recording was not tested  

3.2.8.2 Automated Measure Calculation 

Automated Measure Calculation Successfully Tested 

  (a)(1)    (a)(11)  (a)(18)    (b)(6) 

  (a)(3)    (a)(12)  (a)(19)    (b)(8) 

  (a)(4)    (a)(13)  (a)(20)    (b)(9) 

  (a)(5)    (a)(14)  (b)(2)    (e)(1) 

  (a)(6)    (a)(15)  (b)(3)    (e)(2) 

  (a)(7)    (a)(16)  (b)(4)    (e)(3) 

  (a)(9)    (a)(17)  (b)(5)     

 Automated Measure Calculation was not tested  

 

3.2.9 Attestation 

Attestation Forms (as applicable)  Appendix 

 Safety‐Enhanced Design*  A 

 Quality Management System**  B 

 Privacy and Security  C 

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4) 

**Required for every EHR product 

 


Appendix A: Safety Enhanced Design 


EHR Usability Test Report of Medi-EHR

Version 2.1

Report Format based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports and NISTIR 7742:2010 Customized Common Industry Format Template for Electronic Health Record Usability Testing

Name and Version: Medi-EHR Version 2.1
Date of Usability Testing: Feb 2014 – April 2014
Date of Report: May 2014
Report Prepared By: Medi-EHR User Research Team
Matthew D’Alessandro, UI/Product Designer, Medi-EHR, LLC
732-580-1732
[email protected]
90 Washington Valley Road, Bedminster, NJ 07921

Table of Contents

1 EXECUTIVE SUMMARY
2 INTRODUCTION
3 METHOD
3.1 PARTICIPANTS
3.2 STUDY DESIGN
3.3 TASKS
3.4 PROCEDURE
3.5 TEST LOCATION
3.6 TEST ENVIRONMENT
3.7 PARTICIPANT INSTRUCTIONS
3.8 USABILITY METRICS
4 RESULTS
4.1 DATA ANALYSIS AND REPORTING
4.2 DISCUSSION OF THE FINDINGS


EXECUTIVE SUMMARY

A usability test of Medi-EHR, V2.1 was conducted between February 2014 and April 2014. The testing was conducted on-site and online. The purpose of this test was to validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT).

During the usability test, 5 healthcare providers and/or other intended users matching the target demographic criteria served as participants and used the EHRUT in simulated, but representative tasks. This study collected performance data on 7 tasks typically conducted on an EHR:

1. 170.314(a)(1) Computerized provider order entry (CPOE)
2. 170.314(a)(2) Drug-drug, drug-allergy interaction checks
3. 170.314(a)(6) Medication list
4. 170.314(a)(7) Medication allergy list
5. 170.314(a)(8) Clinical decision support
6. 170.314(b)(3) Electronic prescribing
7. 170.314(b)(4) Clinical information reconciliation

During the one-on-one usability tests, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix 3); they were instructed that they could withdraw at any time. Participants had prior experience with the EHR. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data on paper and electronically. The administrator did not give the participant assistance in how to complete the task.

The following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant’s verbalizations
• Participant’s satisfaction ratings of the system

All participant data was de-identified – no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire and were not compensated for their time. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.


INTRODUCTION

The EHRUT tested for this study was Medi-EHR, Version 2.1, a Complete EHR designed to record and present medical information to healthcare providers in ambulatory settings, including surgical centers, behavioral health, ob/gyn, orthopedics, cardiology, physical therapy, and internal medicine. The usability testing attempted to represent realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency and user satisfaction were captured, such as:

• Task success
• Task errors
• Task time
• Path deviation
• Task ratings

Task | Task # | Mean Task Time (sec) | Expected Task Time (sec) | Task Success | Errors | Mean Deviations from Optimal Path | Optimal Path | Mean Task Satisfaction (5 = Easy)
Electronically Access Laboratory Orders | 1,2,3,4,5 | 71 | 60 | 100% | | 0.6 | 2.00 | 5
Electronically Record Laboratory Orders | 1,2,3,4,5 | 85 | 90 | 100% | 0 | 1.6 | 7.00 | 5
Electronically Access Patient Active Medication List | 1,2,3,4,5 | 111 | 90 | 100% | 0 | 0.4 | 2.00 | 5
Electronically Change Patient Active Medication List | 1,2,3,4,5 | 67 | 60 | 100% | 0 | 1.2 | 4.00 | 5
Electronically Record Patient Active Medication List | 1,2,3,4,5 | 104 | 120 | 100% | 0 | 0.4 | 3.00 | 4
View clinical decision support rule based on Problem List data | 1,2,3,4,5 | 71.8 | 60 | 100% | 0 | 0 | 2.00 | 5
View clinical decision support rule based on Medication List data | 1,2,3,4,5 | 68 | 60 | 100% | 0 | 0 | 2.00 | 5
View clinical decision support rule based on Rx allergy data | 1,2,3,4,5 | 64 | 60 | 100% | 0 | 0 | 2.00 | 5
View clinical decision support rule based on lab test data | 1,2,3,4,5 | 63 | 60 | 100% | 0 | 0 | 2.00 | 5
Generate and Electronically Indicate Drug-drug and Drug-allergy Interventions | 1,2,3,4,5 | 60 | 90 | 100% | 0 | 0.6 | 2.00 | 5
Electronically prescribe a medication | 1,2,3,4,5 | 105 | 120 | 100% | 0 | 1 | 5.00 | 3.5
Record a new medication | 1,2,3,4,5 | 93 | 90 | 100% | 0 | 0.4 | 4.00 | 4
Change an existing Medication | 1,2,3,4,5 | 59 | 60 | 100% | 0 | 0.4 | 2.00 | 5
Merge clinical information into record from an external source | 1,2,3,4,5 | 118 | 120 | 100% | 0 | 1.2 | 6.00 | 3
Review and Save Clinical information | 1,2,3,4,5 | 91.4 | 90 | 100% | 0 | 0.2 | 3.00 | 4


METHOD

DESCRIPTION OF ENVISIONED USERS

The envisioned real-world users of the software include physicians, nurses, medical assistants, office managers, billing managers and front desk staff. For this usability test study we planned to use healthcare providers, typically physicians and/or nurse practitioners aged 25-60 years, from different specialties. There was no gender, race, preferred language or ethnicity restriction.

PARTICIPANTS

We confirm that the minimum number of participants was 5 for all tests except the administrative tasks of configuring clinical decision support interventions and drug-drug severity settings, for which the minimum number of participants was two. We also confirm that the participants match the description of the intended users. A total of 5 participants were tested on the EHRUT. Participants in the test were physicians, nurses, a Billing Specialist, a Meaningful Use Specialist, and administrative staff. Participants were recruited by Medi-EHR, LLC. Furthermore, participants had no direct connection to the development of the EHRUT or to the organization producing it. Participants were not from the testing or supplier organization. Participants were given the same positioning and level of training as the actual end users would have received. Below is a table of participants by characteristics, including professional experience, demographics, computing experience and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual’s data cannot be tied back to individual characteristics.

All identifiable information was de-identified such that the data gathered in the current study cannot be linked back to any individual person. All participants were made aware of their rights as voluntary participants as is described in the Informed Consent Form (see: Appendix A: Informed Consent). Before proceeding with the testing session, each participant was required to sign the informed consent form and to provide the test administrator with pertinent demographic information, as is summarized in the following table.

PARTICIPANT DEMOGRAPHICS

N | Part ID | Gender | Age | Occupation/Role | Professional Experience | Computer Experience (5 = high) | EHR Experience | Assistive Technology Needs
1 | P01 | M | 55 | Biller | 12 years | 5 | | NONE
2 | P02 | M | 51 | Practice Manager | 15 years | 5 | | NONE
3 | P03 | M | 46 | Surgeon | 20 years | 3 | | NONE
4 | P04 | F | 25 | Front Desk Associate | 15 years | 5 | | NONE
5 | P05 | F | 38 | MU Software Specialist | 9 years | 4 | | NONE


Study Design

Objective

The primary objective of the current study is to verify that each interface and interaction pattern tested performs effectively, efficiently, and with user satisfaction and to ensure that the system is designed according to the principles of user centered design and safety enhanced design. The methods of the current study were based on the user testing principles and methodologies defined in the NISTIR 7742 and were elaborated upon and executed by user experience professionals at Medi-EHR. Structurally, the current study was comprised of blocks, each of which included tests for related tasks. Tasks within each block and blocks across the study were presented in a randomized order across participants.

Structure

The current study was comprised of 7 blocks, each of which focused on a particular area of Medi-EHR (see the following Tasks section). Each block included several related tests, each of which in turn included relevant tasks. Tasks related to each test included in the current report were analyzed and prioritized according to associated risk by the Development Team at Medi-EHR. Those tasks involving interactions that were deemed to be liable to safety risks were included in testing and are reported in the current analysis.

Methodology

At the start of each session, participants were provided with a brief overview of the general area of focus for the current session. Each participant was asked about their prior experience with the current area of the software before they were given an introduction to the basic functionality of the current workflow.

The introduction was dictated by the test administrator and was aimed at mirroring the training that any new user would have before using that area of the software. Participants were provided with time to ask questions before the testing for each task began.

Following the introduction of each task, participants were asked to complete the task while the test administrator noted user interactions as well as time to complete the task. If the user was able to perform the task within the maximum allotted time, the task was considered to be successfully completed; otherwise, it was considered to be a failure to complete. For all successful tasks completed, the time to complete the task, the interaction path between user and system, and the users’ responses to the ease of use of the system were recorded. Experimenters were not permitted to provide additional instruction during each task in the current study; however, if a user indicated confusion or frustration, the test administrator was permitted to remind the participant of the goals of the current task and to break down the goals of the current task into sub-goals, if necessary according to the discretion of the test administrator. In an effort to reduce any effects due to the interaction between the test administrator and the participant, the same test administrator introduced and oversaw each task for the duration of the study. In order to counterbalance any learning curves that may have occurred throughout each session, sessions were conducted in a randomized order for each participant group.
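The randomized block-and-task ordering described above can be sketched in a few lines. This is a minimal illustration, not the study's actual tooling; the block and task names are hypothetical placeholders.

```python
# Sketch: per-participant randomization of block order and of tasks within
# each block, to counterbalance learning effects. Names are illustrative.
import random

blocks = {
    "CPOE": ["task1", "task2"],
    "Medication list": ["task3"],
    "CDS": ["task4", "task5"],
}

def session_order(participant_id):
    rng = random.Random(participant_id)  # reproducible order per participant
    order = list(blocks)
    rng.shuffle(order)                   # randomize block order
    plan = []
    for name in order:
        tasks = blocks[name][:]
        rng.shuffle(tasks)               # randomize tasks within a block
        plan.append((name, tasks))
    return plan

print(session_order("P01"))
```

Seeding the generator with the participant ID keeps each participant's order stable across analysis reruns while still varying it across participants.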


Following completion of the study, the practice from which each participant was recruited was provided with monetary compensation.

TASKS

We confirm that the user tasks are prioritized in accordance with the risks associated with user errors. A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

§ 170.314(a)(1) Computerized provider order entry

§ 170.314(a)(2) Drug-drug, drug-allergy interaction checks

§ 170.314(a)(6) Medication list

§ 170.314(a)(7) Medication allergy list

§ 170.314(a)(8) Clinical decision support

§ 170.314(b)(3) Electronic prescribing

§ 170.314(b)(4) Clinical information reconciliation

The test scenarios for each participant were centered on common specified tasks. We provided the participants with descriptions of each task. However, we did not provide specific, step-by-step instructions on how to accomplish the designated task. These tasks were ranked based on the risks associated, frequency of use and those that may be troublesome for users.

These tasks are ranked in ascending order of risk (see the risk ranking table below).

PROCEDURES

A participant ID was assigned to each participant. Two staff members conducted this test: one served as data logger and the other as usability administrator. The administrator moderated the session, including managing tasks and instructions. The administrator also recorded task times, took notes on participant comments and collected post-task rating data. The second staff member took notes on task success, path deviations, comments, and the number and type of errors.

Participants were trained to perform the tasks (see specific instructions below):

• Without using a think-aloud protocol.
• Making as few errors and deviations as possible, in as little time as possible.

Risk Ranking | Task Description
1 | 170.314(a)(2) Drug-drug, drug-allergy interaction checks
2 | 170.314(a)(8) Clinical decision support
3 | 170.314(b)(4) Clinical information reconciliation
4 | 170.314(b)(3) Electronic prescribing
5 | 170.314(a)(1) Computerized provider order entry
6 | 170.314(a)(7) Medication allergy list
7 | 170.314(a)(6) Medication list


• Administrators were permitted to clarify tasks and give incidental guidance, but not instructions on how to use the application.

For each task, the participants were given a written copy of the task. As soon as the administrator finished reading the question, task timing began. Once participants indicated that they had successfully finished the task, the task time was stopped. Following the session, the administrator gave the participant the post-test questionnaire and thanked each individual for their participation. Data such as participants' demographic information, time on task, task success rate, errors, deviations, and post-test questionnaire responses were documented in a spreadsheet.

TEST LOCATION

The test was performed in a virtual setting, using online meeting sessions, and in on-site meetings at the participants' offices. The online participants joined the sessions from their respective home or work locations via a personal computer and browser. They were given access to the EHRUT through a stable and secure cloud-based EHR login.

TEST ENVIRONMENT

The EHRUT would usually be used in a facility or healthcare office. For online cases, the testing was conducted from Medi-EHR's corporate office. For both on-site and online cases, a local desktop computer was used running the Windows OS and the Google Chrome browser. The participants used a mouse and a keyboard, and the screen was a 28-inch monitor with a resolution of 1024×768 and default color settings when interacting with the EHRUT.

The application was set up by highly experienced Medi-EHR healthcare IT professionals according to the vendor's documentation describing the system set-up and preparation. The application itself was a training system running on a Linux server using a test database over a LAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Moreover, participants were instructed not to change any of the default system settings.

PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each participant:

Thank you for participating in this study. Your input is very important. Our session today will last about 60 minutes. During that time you will use an instance of an electronic health record. I will ask you to complete a few tasks using this system and answer some questions. You should complete the tasks as quickly as possible while making as few errors as possible. Please try to complete the tasks on your own, following the instructions as closely as possible. Please note that we are not testing your abilities; we are testing how the system will work for you and your peers. Therefore, if you have difficulty, all this means is that something needs to be improved in the system so that other users will have a better experience and patient safety is improved. I will be here in case you need specific help, but I am not able to instruct you or provide help in how to use the application once the test has started.

Our goal is to see how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. Your answers help improve the system, so please be honest with your opinions. All of the information that you provide will be kept confidential and your name will not be associated with your comments at any time. Should you feel it necessary, you are able to withdraw at any time during the testing.

Following the procedural instructions, participants were shown the EHR and, as their first task, were given 10 minutes to explore the system and make comments. Once this task was complete, the administrator gave the following instructions:

For each task, I will read the description to you and say “Begin.” At that point, please perform the task and say “Done” once you believe you have successfully completed the task. I would like to request that you not talk aloud or verbalize while you are doing the task. I will ask you your impressions about the task once you are done.

Participants were then given 7 tasks to complete.

Usability Metrics

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:

• Satisfaction with the EHRUT, by measuring ease-of-use ratings
• Effectiveness of the EHRUT, by measuring participant errors and success rates
• Efficiency of the EHRUT, by measuring path deviations and average task time

DATA SCORING

The following table details how tasks were scored, errors evaluated, and the time data analyzed.


Measure | Rationale and Scoring

Effectiveness: Task Success

A task was counted as a “Success” if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of tasks and the maximum allowed time were recorded for each individual task. Task times were recorded for successes.

Effectiveness: Task Errors

If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as an Error.

Efficiency: Task Deviations

The participant’s path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path.

Efficiency: Task Time

Each task was timed from when the administrator said “Begin” until the participant said “Done.” If he or she failed to say “Done,” the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis.

Satisfaction: Task Rating

Participants’ subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate the task on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants.

To measure participants’ confidence in and likeability of Medi-EHR overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. The questions were:

1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need support to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot before I could get going with this system.
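The report administers the SUS but does not show the scoring arithmetic. As an illustrative sketch (the function name is ours, and the report does not publish its SUS computation), the standard SUS scoring rule converts the ten 1-to-5 responses into a 0-to-100 score:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from 10 item responses (1-5).

    Standard SUS scoring: odd-numbered items contribute (response - 1),
    even-numbered items contribute (5 - response); the sum is scaled by
    2.5 to yield a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Example: a strongly positive respondent (5 on the positively worded
# odd items, 1 on the negatively worded even items) scores 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A neutral respondent (3 on every item) scores 50.0, which is one way to see why SUS scores are not percentages of "satisfaction" but a scaled agreement index.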


RESULTS

DATA ANALYSIS AND REPORTING

We confirm that the test results provide an analysis of use, tested performance, and error rates in order to identify risk-prone errors. The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. No participants or participant data were excluded from the test data analysis and reporting.

The usability testing results for the EHRUT are detailed below, with individual test results first and a complete data analysis at the end:

DATA REPORTING

The following tasks were performed as a realistic representation of the kinds of activities a user might do with the EHRUT, along with a task rating.

Task Ratings:

5 = Very Easy
4 = Easy
3 = Neither Easy nor Difficult
2 = Difficult
1 = Very Difficult

Risk Ratings:

Low

Medium

High


DISCUSSION OF THE FINDINGS

EFFECTIVENESS AND EFFICIENCY

The table below shows the relative performance data based on perceived standards. Some tasks were found to be less difficult than expected, and some more. Overall, most participants were able to successfully complete the tasks without help or instruction. Participants did not significantly deviate from optimal workflow paths. The user interfaces for electronic prescribing and for merging clinical information took more effort to understand, but were found to be effective after completing the training and education.

Task | Participants | Mean Task Time (s) | Expected Task Time (s) | Task Success | Errors | Mean Deviations | Optimal Path | Mean Task Satisfaction (5 = Easy)
--- | --- | --- | --- | --- | --- | --- | --- | ---
Electronically Access Laboratory Orders | 1,2,3,4,5 | 71 | 60 | 100% | - | 0.6 | 2.00 | 5
Electronically Record Laboratory Orders | 1,2,3,4,5 | 85 | 90 | 100% | 0 | 1.6 | 7.00 | 5
Electronically Access Patient Active Medication List | 1,2,3,4,5 | 111 | 90 | 100% | 0 | 0.4 | 2.00 | 5
Electronically Change Patient Active Medication List | 1,2,3,4,5 | 67 | 60 | 100% | 0 | 1.2 | 4.00 | 5
Electronically Record Patient Active Medication List | 1,2,3,4,5 | 104 | 120 | 100% | 0 | 0.4 | 3.00 | 4
View clinical decision support rule based on Problem List data | 1,2,3,4,5 | 71.8 | 60 | 100% | 0 | 0 | 2.00 | 5
View clinical decision support rule based on Medication List data | 1,2,3,4,5 | 68 | 60 | 100% | 0 | 0 | 2.00 | 5
View clinical decision support rule based on Rx allergy data | 1,2,3,4,5 | 64 | 60 | 100% | 0 | 0 | 2.00 | 5
View clinical decision support rule based on lab test data | 1,2,3,4,5 | 63 | 60 | 100% | 0 | 0 | 2.00 | 5
Generate and Electronically Indicate Drug-drug and Drug-allergy Interventions | 1,2,3,4,5 | 60 | 90 | 100% | 0 | 0.6 | 2.00 | 5
Electronically prescribe a medication | 1,2,3,4,5 | 105 | 120 | 100% | 0 | 1 | 5.00 | 3.5
Record a new medication | 1,2,3,4,5 | 93 | 90 | 100% | 0 | 0.4 | 4.00 | 4
Change an existing Medication | 1,2,3,4,5 | 59 | 60 | 100% | 0 | 0.4 | 2.00 | 5
Merge clinical information into record from an external source | 1,2,3,4,5 | 118 | 120 | 100% | 0 | 1.2 | 6.00 | 3
Review and Save Clinical information | 1,2,3,4,5 | 91.4 | 90 | 100% | 0 | 0.2 | 3.00 | 4
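As an illustration of how the summary figures can be recomputed from the per-task data, the following Python sketch averages the satisfaction column and checks how many tasks finished at or under their expected time. The variable names are ours; the tuples are transcribed from the table's mean-time, expected-time, and satisfaction columns.

```python
# Per-task results transcribed from the table:
# (mean_time_s, expected_time_s, satisfaction)
tasks = [
    (71, 60, 5), (85, 90, 5), (111, 90, 5), (67, 60, 5), (104, 120, 4),
    (71.8, 60, 5), (68, 60, 5), (64, 60, 5), (63, 60, 5), (60, 90, 5),
    (105, 120, 3.5), (93, 90, 4), (59, 60, 5), (118, 120, 3), (91.4, 90, 4),
]

# Overall mean satisfaction across the 15 tasks.
mean_satisfaction = sum(s for _, _, s in tasks) / len(tasks)

# Share of tasks whose mean time was at or under the expected time.
on_time = sum(1 for t, e, _ in tasks if t <= e) / len(tasks)

print(round(mean_satisfaction, 2))  # 4.57
print(round(on_time, 2))            # 0.4
```

The column average comes out near the overall satisfaction score of 4.56 reported in the Satisfaction section (small differences are likely rounding).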


SATISFACTION

Participants rated their satisfaction with the Medi-EHR system on a scale of 1 to 5, with task ratings ranging from 3.00 to 5.00. Most notably, ratings of 5.00 were largely attributed to tasks that were self-guided and required very little recollection of training. The overall satisfaction score was 4.56.

MAJOR FINDINGS

Based on the user feedback and ratings, we have found Medi-EHR to be an efficient, effective system that achieves a high level of satisfaction and ease of use.

AREAS FOR IMPROVEMENT

Participants would prefer that the user interface be customized for their individual workflows. It was explained that the customization features had been disabled for testing purposes.

Some participants would prefer a larger font size on certain screens.

Some participants would like to add color indicators or styles to certain user screens.

User Centered Design

Medi-EHR was developed with our home-grown UCD method, which we call “MEDI-UCD,” a method based on ISO 9241. We have direct input from, and observation of, the end user. All features and enhancements are the result of working with end users toward the most efficient user experience.

Our user-centered design process helps our software designers fulfill the goal of a product designed for our users. User requirements are considered from the very beginning and incorporated throughout the whole product cycle. These requirements are noted and refined through investigative methods including questioning our users, prototype development and planning, prototype testing, usability testing, and other methods. Generative methods, such as participatory design sessions, may also be used. In addition, user requirements are inferred through careful analysis of usable products similar to our EHR.


This is an outline of our Medi-UCD process for user-centered design:

The ISO standard describes six key principles that ensure a design is user-centered:

1. The design is based upon an explicit understanding of users, tasks, and environments.
2. Users are involved throughout design and development.
3. The design is driven and refined by user-centered evaluation.
4. The process is iterative.
5. The design addresses the whole user experience.
6. The design team includes multidisciplinary skills and perspectives.

These approaches follow the ISO standard for human-centered design of interactive systems. We also utilize cooperative design, involving designers, users, and other partners, and contextual design, a “customer-centered design” approach applied in the actual context of use, incorporating some ideas from participatory design.


Test Results Summary for 2014 Edition EHR Certification 16‐2651‐R‐0004‐PRA V1.0, March 4, 2015  

©2016 InfoGard. May be reproduced only in its original entirety, without revision   10 

Appendix B: Quality Management System 


Quality Management System Attestation Form-EHR-37-V03

InfoGard Laboratories, Inc. Page 1

For reporting information related to testing of 170.314(g)(4).

Vendor and Product Information

Vendor Name Medi-EHR,LLC

Product Name Medi-EHR

Quality Management System

Type of Quality Management System (QMS) used in the development, testing, implementation, and maintenance of the EHR product.

Based on an industry standard (for example ISO 9001, IEC 62304, ISO 13485, etc.). Standard:

A modified or “home-grown” QMS.

No QMS was used.

Was one QMS used for all certification criteria, or were multiple QMS applied?

One QMS used.

Multiple QMS used.

Description or documentation of the QMS applied to each criterion:

Not Applicable.

Statement of Compliance

I, the undersigned, attest that the statements in this document are complete and accurate.

Vendor Signature by an

Authorized Representative Matthew J D'Alessandro

Date 1/25/2016



Appendix C: Privacy and Security 


Privacy and Security Attestation Form-EHR-36-V04

InfoGard Laboratories, Inc. Page 1

Vendor and Product Information

Vendor Name Medi-EHR,LLC

Product Name Medi-EHR

Privacy and Security

170.314(d)(2) Auditable events and tamper-resistance

Not Applicable (did not test to this criterion)

Audit Log:

Cannot be disabled by any user.

Audit Log can be disabled.

• The EHR enforces that the audit log is enabled by default when initially configured.

Audit Log Status Indicator:

Cannot be disabled by any user.

Audit Log Status can be disabled

• The EHR enforces a default audit log status. Identify the default setting (enabled or disabled):

There is no Audit Log Status Indicator because the Audit Log cannot be disabled.

Encryption Status Indicator (encryption of health information locally on end-user devices):

Cannot be disabled by any user.

Encryption Status Indicator can be disabled

• The EHR enforces a default encryption status. Identify the default setting (enabled or disabled):

There is no Encryption Status Indicator because the EHR does not allow health information to be stored locally on end-user devices.

Identify the submitted documentation that describes the inability of the EHR to allow users to disable the audit logs, the audit log status, and/or the encryption status: §170.314(d)(2) documentation SIGNED

Identify the submitted documentation that describes the method(s) by which the EHR protects 1) recording of actions related to electronic health information, 2) recording of audit log status, and 3) recording of encryption status from being changed, overwritten, or deleted by the EHR technology: §170.314(d)(2) documentation SIGNED

Identify the submitted documentation that describes the method(s) by which the EHR technology detects whether the audit log has been altered: §170.314(d)(2) documentation SIGNED

170.314(d)(7) End-user device encryption

Storing electronic health information locally on end-user devices (i.e., temp files, cookies, or other types of cache approaches).

Not Applicable (did not test to this criterion)

The EHR does not allow health information to be stored locally on end-user devices.

• Identify the submitted documentation that describes the functionality used to prevent health information from being stored locally: §170.314(d)(7) documentation SIGNED

The EHR does allow health information to be stored locally on end-user devices.

• Identify the FIPS 140-2 approved algorithm used for encryption:

• Identify the submitted documentation that describes how health information is encrypted when stored locally on end-user devices:

The EHR enforces default configuration settings that either enforce the encryption of locally stored health information or prevent health information from being stored locally.

• Identify the default setting:

170.314(d)(8) Integrity

Not Applicable (did not test to this criterion)

Identify the hashing algorithm used for integrity (SHA-1 or higher): SHA-1

170.314(e)(1) View, Download, and Transmit to 3rd Party

Not Applicable (did not test to this criterion)

Identify the FIPS 140-2 approved algorithm used for encryption: AES

Identify the FIPS 140-2 approved algorithm used for hashing: SHA-1

170.314(e)(3) Secure Messaging

Not Applicable (did not test to this criterion)

Identify the FIPS 140-2 approved algorithm used for encryption: AES

Identify the FIPS 140-2 approved algorithm used for hashing: SHA-1

Statement of Compliance

I, the undersigned, attest that the statements in this document are accurate.

Vendor Signature by an Authorized Representative: Matthew D'Alessandro

Date: 1/15/2016


  

Test Results Summary Document History  

Version | Description of Change | Date
--- | --- | ---
V1.0 | Initial release | 3/04/2016

  

END OF DOCUMENT 

