TRANSCRIPT
Test Results Summary for 2014 Edition EHR Certification Version EHR-Test-144 Rev 01-Jun-2015
Page 1 of 12
ONC HIT Certification Program
Test Results Summary for 2014 Edition EHR Certification
Part 1: Product and Developer Information
1.1 Certified Product Information
Product Name: CoCENTRIX CCP
Product Version: Version 3.1
Domain: Inpatient
Test Type: Complete EHR
1.2 Developer/Vendor Information
Developer/Vendor Name: CoCENTRIX
Address: 540 N. Tamiami Trail, Sarasota, FL 34236
Website: www.cocentrix.com
Email: [email protected]
Phone: 941-306-4951
Developer/Vendor Contact: Bryan Griffiths
Part 2: ONC-Authorized Certification Body Information
2.1 ONC-Authorized Certification Body Information
ONC-ACB Name: Drummond Group
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: [email protected]
Phone: 817-294-7339
ONC-ACB Contact: Bill Smith
This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:
Bill Smith
Certification Body Manager, ONC-ACB Authorized Representative (Function/Title)
7/17/2015
Signature and Date
2.2 Gap Certification
The following identifies the criterion or criteria certified via gap certification.
§170.314
(a)(1) (a)(19) (d)(6) (h)(1)
(a)(6) (a)(20) (d)(8) (h)(2)
(a)(7) (b)(5)* (d)(9) (h)(3)
(a)(17) (d)(1) (f)(1)
(a)(18) (d)(5) (f)(7)**
*Gap certification allowed for Inpatient setting only
**Gap certification allowed for Ambulatory setting only
x No gap certification
2.3 Inherited Certification
The following identifies the criterion or criteria certified via inherited certification.
§170.314
(a)(1) (a)(16) Inpt. only (c)(2) (f)(2)
x (a)(2) x (a)(17) Inpt. only (c)(3) (f)(3)
x (a)(3) (a)(18) x (d)(1) (f)(4) Inpt. only
x (a)(4) (a)(19) (d)(2) (f)(5) Amb. only
(a)(5) (a)(20) (d)(3)
x (a)(6) (b)(1) x (d)(4) (f)(6) Amb. only
x (a)(7) (b)(2) x (d)(5)
(a)(8) x (b)(3) x (d)(6) (f)(7)
x (a)(9) (b)(4) (d)(7) (g)(1)
x (a)(10) (b)(5) x (d)(8) (g)(2)
x (a)(11) (b)(6) Inpt. only (d)(9) Optional (g)(3)
(a)(12) (b)(7) (e)(1) (g)(4)
(a)(13) (b)(8) (e)(2) Amb. only (h)(1)
(a)(14) (b)(9) (e)(3) Amb. only (h)(2)
(a)(15) (c)(1) x (f)(1) (h)(3)
No inherited certification
Part 3: NVLAP-Accredited Testing Laboratory Information
Report Number: GI-07162015-3092
Test Date(s): 01/22/2013; 10/30/2014; 04/23/2015; 07/16/2015
3.1 NVLAP-Accredited Testing Laboratory Information
ATL Name: Drummond Group EHR Test Lab
Accreditation Number: NVLAP Lab Code 200979-0
Address: 13359 North Hwy 183, Ste B-406-238, Austin, TX 78750
Website: www.drummondgroup.com
Email: [email protected]
Phone: 512-335-5606
ATL Contact: Beth Morrow
For more information on scope of accreditation, please reference NVLAP Lab Code 200979-0.
Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:
Gary Isaac
Test Proctor, ATL Authorized Representative (Function/Title)
7/17/2015
Signature and Date
Sarasota, FL
Location Where Test Conducted
3.2 Test Information
3.2.1 Additional Software Relied Upon for Certification
Additional Software | Applicable Criteria | Functionality Provided by Additional Software
MS Healthvault | §170.314(e)(1) | VDT
DrFirst Rcopia | §170.314(a)(2), (a)(6), (a)(7), (a)(10), (b)(3) | eRx related
No additional software required
3.2.2 Test Tools
Test Tool | Version
x Cypress | 2.5.1
x ePrescribing Validation Tool | 1.0.4
HL7 CDA Cancer Registry Reporting Validation Tool | 1.0.3
x HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool | 1.8
x HL7 v2 Immunization Information System (IIS) Reporting Validation Tool | 1.8
x HL7 v2 Laboratory Results Interface (LRI) Validation Tool | 1.7
x HL7 v2 Syndromic Surveillance Reporting Validation Tool | 1.7
x Transport Testing Tool | 179
x Direct Certificate Discovery Tool | 3.0.2
Edge Testing Tool | 0.0.5
No test tools required
3.2.3 Test Data
Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter]
x No alteration (customization) to the test data was necessary
3.2.4 Standards
3.2.4.1 Multiple Standards Permitted
The following identifies the standard(s) that have been successfully tested where more than one standard is permitted.
Criterion # Standard Successfully Tested
(a)(8)(ii)(A)(2)
§170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide
(a)(13)
x §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
§170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree
(a)(15)(i)
x §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
§170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide
(a)(16)(ii) x §170.210(g)
Network Time Protocol Version 3 (RFC 1305)
§170.210(g) Network Time Protocol Version 4 (RFC 5905)
(b)(2)(i)(A)
x §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
(b)(7)(i)
x §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
(b)(8)(i)
§170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
§170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
(e)(1)(i)
Annex A of the FIPS Publication 140-2 [list encryption and hashing algorithms] AES-256 SHA-256
(e)(1)(ii)(A)(2) x §170.210(g)
Network Time Protocol Version 3 (RFC 1305)
§170.210(g) Network Time Protocol Version 4 (RFC 5905)
(e)(3)(ii)
Annex A of the FIPS Publication 140-2 [list encryption and hashing algorithms]
Common MU Data Set (15)
x §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
§170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)
None of the criteria and corresponding standards listed above are applicable
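Criterion (a)(16)(ii) above was tested against §170.210(g), which specifies clock synchronization via the Network Time Protocol (RFC 1305 / RFC 5905). As background only, and not part of the certified product, the following sketch shows how the 64-bit transmit timestamp in an NTP response packet converts to Unix time:

```python
import struct
from datetime import datetime, timezone

# NTP timestamps (RFC 1305 / RFC 5905) count seconds from the 1900 epoch;
# Unix time counts from 1970. The offset between the two epochs:
NTP_UNIX_DELTA = 2208988800  # seconds from 1900-01-01 to 1970-01-01

def ntp_transmit_time(packet: bytes) -> float:
    """Extract the 64-bit Transmit Timestamp (bytes 40-47) from a 48-byte
    NTP packet and convert it to Unix time (seconds since 1970)."""
    if len(packet) < 48:
        raise ValueError("NTP packet must be at least 48 bytes")
    seconds, fraction = struct.unpack("!II", packet[40:48])
    return seconds - NTP_UNIX_DELTA + fraction / 2**32

# Build a fake response carrying a known timestamp for demonstration
# (a real client would read this from a server reply on UDP port 123).
packet = bytearray(48)
struct.pack_into("!II", packet, 40, NTP_UNIX_DELTA + 1_000_000_000, 0)
t = ntp_transmit_time(bytes(packet))
print(datetime.fromtimestamp(t, tz=timezone.utc))  # 2001-09-09 01:46:40+00:00
```

The 32-bit fraction field carries sub-second precision; dividing by 2**32 converts it to a decimal fraction of a second.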
3.2.4.2 Newer Versions of Standards
The following identifies the newer version of a minimum standard(s) that has been successfully tested.
Newer Version Applicable Criteria
No newer version of a minimum standard was tested
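The (e)(1)(i) row in section 3.2.4.1 lists AES-256 and SHA-256 from FIPS 140-2 Annex A as the algorithms used. As an illustration of the hashing half only, here is a minimal SHA-256 sketch using Python's standard library; AES-256 encryption would come from a FIPS-validated cryptographic module and is not shown:

```python
import hashlib

# SHA-256 (FIPS 180-4, referenced from FIPS 140-2 Annex A) produces a
# 256-bit digest, usable for integrity protection of exported records.
def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

digest = sha256_hex(b"patient summary document")
print(len(digest))  # 64 hex characters = 256 bits
```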
3.2.5 Optional Functionality
Criterion # Optional Functionality Successfully Tested
x (a)(4)(iii) Plot and display growth charts
(b)(1)(i)(B) Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(1)(i)(C) Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(b)(2)(ii)(B) Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(2)(ii)(C) Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(e)(1) View, download and transmit data to a third party utilizing the Edge Protocol IG version 1.1
(f)(3) Ambulatory setting only – Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)
(f)(7) Ambulatory setting only – transmission to public health agencies – syndromic surveillance - Create Data Elements
Common MU Data Set (15)
Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
Common MU Data Set (15)
Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)
No optional functionality tested
3.2.6 2014 Edition Certification Criteria* Successfully Tested
Criteria # TP** TD***
x (a)(1) 1.3 1.5
(a)(2) 1.2
(a)(3) 1.2 1.4
(a)(4) 1.4 1.3
x (a)(5) 1.4 1.3
(a)(6) 1.3 1.4
(a)(7) 1.3 1.3
x (a)(8) 1.3
(a)(9) 1.3 1.3
(a)(10) 1.2 1.4
(a)(11) 1.3
x (a)(12) 1.3
x (a)(13) 1.2
x (a)(14) 1.2
x (a)(15) 1.5
x (a)(16) Inpt. only 1.3 1.2
(a)(17) Inpt. only 1.2
(a)(18) 1.1 1.5
(a)(19) 1.1 1.5
(a)(20) 1.1 1.5
x (b)(1) 1.7 1.4
x (b)(2) 1.4 1.6
(b)(3) 1.4 1.4
x (b)(4) 1.3 1.4
x (b)(5) 1.4 1.2
x (b)(6) Inpt. only 1.3 1.3
x (b)(7) 1.4 1.7
(b)(8) 1.2 1.2
(b)(9) 1.1 1.1
x (c)(1) 1.11 1.11
x (c)(2) 1.11 1.11
x (c)(3) 1.11 1.11
(d)(1) 1.2
x (d)(2) 1.6
x (d)(3) 1.3
(d)(4) 1.3
(d)(5) 1.2
(d)(6) 1.2
x (d)(7) 1.2
(d)(8) 1.2
x (d)(9) Optional 1.2
x (e)(1) 1.11 1.5
(e)(2) Amb. only 1.2 1.6
(e)(3) Amb. only 1.3
(f)(1) 1.2 1.2
x (f)(2) 1.3 1.3
x (f)(3) 1.3 1.3
x (f)(4) Inpt. only 1.3 1.3
(f)(5) Amb. only 1.2 1.2
(f)(6) Amb. only 1.4 1.4
(f)(7) Amb. only 1.1
(g)(1) 2.0 2.0
x (g)(2) 2.0 2.0
x (g)(3) 1.4
x (g)(4) 1.2
(h)(1) 1.1
(h)(2) 1.1
(h)(3) 1.1
No criteria tested
*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)
3.2.7 2014 Clinical Quality Measures*
Type of Clinical Quality Measures Successfully Tested:
Ambulatory
x Inpatient
No CQMs tested
*For a list of the 2014 Clinical Quality Measures, please reference the CMS eCQM Library (navigation: June 2014 and April 2014 updates)
Ambulatory CQMs CMS ID Version CMS ID Version CMS ID Version CMS ID Version
2 90 136 155
22 117 137 156
50 122 138 157
52 123 139 158
56 124 140 159
61 125 141 160
62 126 142 161
64 127 143 163
65 128 144 164
66 129 145 165
68 130 146 166
69 131 147 167
74 132 148 169
75 133 149 177
77 134 153 179
82 135 154 182
Inpatient CQMs CMS ID Version CMS ID Version CMS ID Version CMS ID Version
9 x 71 v4 x 107 v3 172
x 26 v2 72 x 108 v3 178
x 30 v4 x 73 v3 x 109 v3 185
31 x 91 v4 x 110 v3 x 188 v4
32 x 100 v3 111 x 190 v3
53 x 102 v3 113
55 x 104 v3 x 114 v3
60 x 105 v3 171
3.2.8 Automated Numerator Recording and Measure Calculation
3.2.8.1 Automated Numerator Recording
Automated Numerator Recording Successfully Tested
(a)(1) (a)(11) (a)(18) (b)(6)
(a)(3) (a)(12) (a)(19) (b)(8)
(a)(4) (a)(13) (a)(20) (b)(9)
(a)(5) (a)(14) (b)(2) (e)(1)
(a)(6) (a)(15) (b)(3) (e)(2)
(a)(7) (a)(16) (b)(4) (e)(3)
(a)(9) (a)(17) (b)(5)
x Automated Numerator Recording was not tested
3.2.8.2 Automated Measure Calculation
Automated Measure Calculation Successfully Tested
x (a)(1) x (a)(11) (a)(18) x (b)(6)
x (a)(3) x (a)(12) (a)(19) (b)(8)
x (a)(4) x (a)(13) (a)(20) (b)(9)
x (a)(5) (a)(14) x (b)(2) x (e)(1)
x (a)(6) x (a)(15) x (b)(3) (e)(2)
x (a)(7) x (a)(16) x (b)(4) (e)(3)
x (a)(9) x (a)(17) x (b)(5)
Automated Measure Calculation was not tested
3.2.9 Attestation
Attestation Forms (as applicable) Appendix
x Safety-Enhanced Design* A
x Quality Management System** B
x Privacy and Security C
*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (a)(18), (a)(19), (a)(20), (b)(3), (b)(4), (b)(9).
**Required for every EHR product.
3.3 Appendices
Attached below.
Test Results Summary Change History
Test Report ID Description of Change Date
2014 Edition Test Report Summary
EHR Usability Test Report – CCP Version 3.1 1
July 14, 2015
Melissa Martin
EHR Coordinator
Drummond Group, Inc.
13359 North Highway 183, Suite B-406-238
Austin, TX 78750
Re: Usability Study
Dear Ms. Martin,
Attached you will find the Usability Study, conducted by CoCENTRIX, as required for 2014 Edition
Certification. We attest to the veracity and authenticity of the enclosed report.
If you should have any questions or concerns, please feel free to contact me directly. Thank you for your
time and attention to this matter.
Sincerely,
EHR Usability Test Report
Product: CCP
Version: 3.1
Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports
Date of Usability Test: December 19, 2013, January 9, 2014, September 11, 2014, April 15, 2015 and
July 13, 2015.
Date of Report: July 14, 2015
Report Prepared By: CoCentrix
Bryan Griffiths, Senior Director
941-306-4951 x613
540 Tamiami Trail
Sarasota, FL 34236
Note: The following study was developed using the NISTIR 7742 template as a guide for reporting our
findings: Customized Common Industry Format Template for Electronic Health Record Usability Testing.
Executive Summary
A usability test of CCP Version 3.1, an EHR, was conducted on December 19, 2013; January 9, 2014; September 11, 2014; April 15, 2015; and July 13, 2015 by CoCentrix, Inc. The purpose of this test was to test and validate the usability of the current user interface and to provide evidence of usability.
During the usability test, 10 healthcare providers matching the target demographic criteria served as participants and used the EHR in simulated but representative tasks to test medication allergies. Four further groups of six healthcare providers, each matching the target demographic criteria, served as participants and used the EHR in simulated but representative tasks to test, respectively: drug-drug and drug-allergy interventions, medication lists, and prescribing medications; CPOE, Clinical Decision Support interventions, and clinical information reconciliation; Eligible Hospital CPOE medications; and eMAR.
This study collected performance data on 30 tasks typically conducted on an EHR. The tasks conducted
were related to the following:
• Recording a Patient’s Allergy
• Changing a Patient’s Allergy
• Viewing a Patient’s Allergy
• Creating drug-drug and drug-allergy interventions prior to CPOE completion
• Adjustment of severity level of drug-drug interventions
• Record Medication List
• Change Medication List
• Access Medication List
• Create Prescriptions
• Record a Medication Order
• Change a Medication Order
• Access a Medication Order
• Record a Laboratory Order
• Change a Laboratory Order
• Access a Laboratory Order
• Record a Radiology/Imaging Order
• Change a Radiology/Imaging Order
• Access a Radiology/Imaging Order
• Problem List Interventions
• Medication List Interventions
• Medication Allergy List Interventions
• Demographics Interventions
• Lab Tests and Results Interventions
• Vital Signs Interventions
• Identify User Diagnostic and Therapeutic Reference Information
• Configuration of CDS Interventions by user
• Reconcile Patient’s Active Medication List with Another Source
• Reconcile Patient’s Active Problem List with Another Source
• Reconcile Patient’s Active Medication Allergy List with Another Source
• Using assistive technology to verify right patient, med, dose, route and time
During the 15-minute one-on-one usability tests, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in the Appendix). Participants may have had prior experience with an EHR, but no experience with the CCP EHR or DrFirst Rcopia. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the EHR. During the testing, the administrator timed the test and recorded user performance data on paper and electronically. The administrator did not assist participants in how to complete the tasks.
The following types of data were collected for each participant:
• Time to complete each task
• Number of and type of errors
• Path deviations
• Participant verbalizations
• Tasks completed in the allotted time
• Participant’s ease of use rating of each task
• Participant’s satisfaction ratings of the system
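The per-participant measures listed above could be modeled as a simple record. The sketch below is hypothetical; the field names are illustrative and not taken from the actual test logs:

```python
from dataclasses import dataclass, field

# Hypothetical record for the per-participant measures collected during
# each task; names are illustrative, not from the study's instruments.
@dataclass
class TaskObservation:
    task_name: str
    seconds_to_complete: float
    errors: int
    path_deviations: int          # steps observed beyond the optimal path
    completed_in_time: bool
    ease_rating: int              # 1-5 scale, 5 = easy
    verbalizations: list[str] = field(default_factory=list)

obs = TaskObservation("Add Allergy", 67.0, 0, 0, True, 5)
```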
All participant data was de-identified – no correspondence could be made from the identity of the
participant to the data collected. Following the conclusion of the testing, participants were asked to
complete a post-test questionnaire. The following is a summary of the performance and rating data
collected on the EHR.
Task | Participants (#) | Task Success (#) | Time to Complete, avg (seconds) | Errors, total | Errors, avg (observed/completed tasks) | Deviations, total (observed/optimal) | Deviations, avg (observed/optimal) | Task Rating, avg (5 = Easy) | Risk Rating (5 = Least Risk)
1. Add Allergy | 10 | 10 | 67 | 0 | 0 | 160/120 | 1.33 | 4.5 | 2
2. Change Allergy | 10 | 10 | 47 | 0 | 0 | 87/70 | 1.24 | 4.8 | 2
3. Access Allergy | 10 | 10 | 10.1 | 0 | 0 | 33/20 | 1.65 | 4.7 | 4
4. Drug-Drug / Drug-Allergy | 6 | 6 | 13.1 | 0 | 0 | 24/24 | 1.0 | 4.5 | 1
5. Adjust Severity Alert | 6 | 4 | 29.25 | 2 | 33% | 23/20 | 1.15 | 4.1 | 4
6. Record Medication (EP) | 6 | 5 | 17.8 | 1 | 16% | 25/25 | 1.0 | 4.0 | 3
7. Change Medication (EP) | 6 | 5 | 16 | 1 | 16% | 26/25 | 1.04 | 3.8 | 3
9. Access Medication (EP) | 6 | 6 | 4.5 | 0 | 0 | 6/6 | 1.0 | 3.8 | 3
11. Create Prescription | 6 | 4 | 150.8 | 2 | 33% | 142/92 | 1.54 | 3.1 | 2
12. Record Medication Order | 6 | 4 | 150.8 | 2 | 33% | 142/92 | 1.54 | 3.1 | 2
13. Change Medication Order | 6 | 5 | 16 | 1 | 16% | 26/25 | 1.04 | 3.8 | 3
14. Access Medication Order | 6 | 6 | 4.5 | 0 | 0 | 6/6 | 1.0 | 3.8 | 3
15. Record Lab Order | 6 | 6 | 100.0 | 0 | 0 | 75/60 | 1.25 | 4.0 | 2
16. Change Lab Order | 6 | 6 | 57.5 | 0 | 0 | 54/48 | 1.125 | 4.0 | 2
17. Access Lab Order | 6 | 6 | 3.67 | 0 | 0 | 6/6 | 1.0 | 4.67 | 4
18. Record Radiology Order | 6 | 6 | 80.0 | 0 | 0 | 76/66 | 1.15 | 4.0 | 2
19. Change Radiology Order | 6 | 6 | 50.83 | 0 | 0 | 54/48 | 1.125 | 4.0 | 2
20. Access Radiology Order | 6 | 6 | 3.83 | 0 | 0 | 7/6 | 1.17 | 4.67 | 4
21. Problem List Intervention | 6 | 6 | 66.5 | 0 | 0 | 49/42 | 1.17 | 4.67 | 3
22. Medication List Intervention | 6 | 6 | 54.17 | 0 | 0 | 31/24 | 1.29 | 4.5 | 3
23. Medication Allergy List Intervention | 6 | 6 | 92.83 | 0 | 0 | 64/54 | 1.19 | 4.5 | 3
24. Demographics Intervention | 6 | 6 | 25.67 | 0 | 0 | 14/12 | 1.17 | 4.17 | 3
25. Lab Test and Results Intervention | 6 | 6 | 88.67 | 0 | 0 | 124/72 | 1.72 | 4.17 | 3
26. Vital Signs Intervention | 6 | 6 | 11.16 | 0 | 0 | 67/54 | 1.24 | 4.33 | 3
27. Identify User Diagnostic and Therapeutic Reference | 6 | 6 | 13.83 | 0 | 0 | 33/30 | 1.10 | 4.5 | 3
28. Configure CDS Intervention by User | 6 | 6 | 48.83 | 0 | 0 | 42/30 | 1.40 | 4.17 | 4
29. Reconcile Medication List with Other Source | 6 | 6 | 41.5 | 0 | 0 | 39/36 | 1.08 | 4.5 | 3
30. Reconcile Problem List with Other Source | 6 | 6 | 48.33 | 0 | 0 | 42/36 | 1.17 | 4.5 | 3
31. Reconcile Medication Allergy List with Other Source | 6 | 6 | 41.83 | 0 | 0 | 38/36 | 1.05 | 4.5 | 3
32. Order Medication (EH) | 6 | 6 | 100.17 | 0 | 0 | 102/66 | 1.54 | 4.67 | 2
33. Change Medication Order (EH) | 6 | 6 | 40.83 | 0 | 0 | 49/42 | 1.17 | 4.83 | 2
34. View/Access Medication Order (EH) | 6 | 6 | 4.5 | 0 | 0 | 7/6 | 1.17 | 5 | 2
35. Use eMAR | 6 | 6 | 1.58 | 0 | 0 | 77/72 | 1.07 | 4.88 | 2
The results from the Task Rating scored the subjective satisfaction with the CCP system for recording allergies, lab orders, radiology orders, CDS interventions, and information reconciliation, based on performance with these tasks, at 4.17. The results from the Task Rating scored the subjective satisfaction with the DrFirst Rcopia system for receiving drug-drug interventions, adjusting severity levels, recording/changing/accessing medications, prescribing medications, and ordering medications, based on performance with these tasks, at 3.77. The results from the Task Rating scored the subjective satisfaction with the CCP system for Eligible Hospital CPOE medication ordering at 4.67, and for eMAR at 4.83.
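The "Average (Observed/Optimal)" figures in the summary table divide the total steps participants took by the optimal number of steps for the task; task 1's 160 observed steps against 120 optimal steps, for example, give 1.33. A minimal sketch of that calculation:

```python
# The deviation ratio in the summary table divides the total steps observed
# across participants by the optimal number of steps for the task.
def deviation_ratio(observed_steps: int, optimal_steps: int) -> float:
    return round(observed_steps / optimal_steps, 2)

# Task 1 (Add Allergy): 160 observed vs. 120 optimal steps.
print(deviation_ratio(160, 120))  # 1.33
```

A ratio of 1.0 means participants followed the optimal path exactly; higher values indicate more path deviation.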
In addition to the performance data, the following qualitative observations were made:
1. Major Findings – Tasks 1 thru 3 – The majority of user reactions to the workflows were positive, and navigation seemed very straightforward to users once they began recording an allergy and completing tasks 1 thru 3. While some users got lost during their workflow, none experienced an error that prevented them from moving forward or required the test administrator's intervention. With no training and only a quick demonstration, users were able to catch on very quickly and navigate the system, displaying its intuitive nature.
Tasks 4 thru 12 – During completion of tasks 4 thru 12, several users commented that the screen was too busy and cluttered, and they appeared to become easily lost during navigation. One user commented immediately upon sitting down that they did not care for the interface. Users voiced that they were constantly scrolling down and wished that the top of the screen was used more frequently for the actions they were attempting to complete.
Tasks 13 thru 31 – During completion of tasks 13 thru 31, users commented that it was difficult to determine which fields were required, and that drop-down fields were inconsistent. Times and deviations were quite similar, and repeated tasks showed an increase in speed. Users were provided very little direction but seemed to easily find their way through the system with virtually no training. This should indicate rapid training and deployment, which speaks to the ease of use of the system.
Tasks 32 thru 34 – While completing CPOE medication ordering for Eligible Hospitals, users remarked on how clean the interface was and how the workflow simply guided them through the process. Several participants commented on how good the layout was in comparison to other CPOE systems they had used in the past. For any areas marked as difficult, participants commented that training and repetition would easily resolve the issues.
Task 35 – While completing eMAR, participants commented on how easy it was to use scanning to document medication administration. Several nurses expressed satisfaction with the way wrong medications can be caught before being administered. One participant wanted to start using the eMAR the very same day.
2. Areas for Improvement – Tasks 1 thru 3 – For the majority of users, navigating the medication search was a constant source of path deviation. There is a text field and a search button; users frequently attempted to type in the field rather than click the search button, which launches another screen where they could type in a few letters and run a medication search. It was also noted that while all the data elements were on the left-hand side of the screen, the scroll bar was on the right. The data entry screens were long enough that users had to move from entering data, across the screen to scroll, and back to entering data. Laying the screen out in two columns would have given users a more efficient use of time and negated the need for scrolling.
Tasks 4 thru 12 – Limiting the amount of information presented to the user on a single screen would be advisable, or possibly categorizing it in a manner that limits the amount of data displayed at once. A more prominent prompt for a signature when writing a prescription would also reduce searching for the password-entry prompt, or forgetting to complete the action of prescription creation.
Tasks 13 thru 31 – Orders could display brighter, bolder, or larger indications when a field is required, along with more consistency when stepping through drop-down fields and selections. One user remarked how easy it was to see the status of various orders. Reconciliation could give the user an indication when a save or reconcile is occurring, preventing users from clicking repeatedly in the belief that nothing is happening.
Tasks 32 thru 34 – The only area participants commented on was signing a pending order. There seemed to be confusion over whether the order was completed or still in a pending state, as the user is prompted to select the pending order to sign it. This may simply be a training issue, but it was noted for this study. Users also commented that an icon indicating whether an order is signed or unsigned could be helpful. While users rapidly moved through the process of ordering a medication, the largest speed drain was the system pulling back the full orders from First Data Bank. It was also noticed on a few orders that users would select the duration type, say days, then enter the number, which would clear the type.
Task 35 – One participant suggested a different layout for the data when the medication was scanned; they wanted the medication, route, dose, and time to display next to the order detail. One other participant suggested the bar code window not close after scanning the patient, but rather remain open while the medication is being scanned.
Introduction
The EHR tested for this study was CCP, Version 3.1, along with Rcopia version 3. Designed to present clinical information to healthcare providers in ambulatory and inpatient healthcare settings, the EHR allows providers to electronically maintain a complete electronic health record, and is primarily used in the practice of behavioral health. The usability testing attempted to represent realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface and to provide evidence of usability in the EHR. To this end, measures of effectiveness and efficiency (time to perform tasks, total number of deviations, total number of errors, etc.) were captured during the usability testing.
Method
Participants
A total of 34 participants were tested on the EHR. Participants in the test were individuals who work within an ambulatory healthcare environment in the practice of behavioral health. Participants were contacted by CoCentrix, Inc. staff to participate in the study. In addition, participants had no direct connection to the development of the EHR and were not from CoCentrix, Inc. All participants were provided with a short demonstration by the test administrator of how to perform each task. This was similar to end-user training, but in much abbreviated form.
The following is a table of participants by characteristics, including demographics, user role, and clinical
experience. Participant names were replaced with Participant IDs so that an individual’s data cannot be
tied back to individual identities. A summary of the participant demographics can be found in the
Appendix.
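The Participant IDs in the table below appear to combine the session date (YYYYMMDD) with a two-digit sequence number. A minimal sketch of such a de-identification scheme follows; the format is inferred from the IDs in this report, not documented by the study:

```python
# De-identification sketch: replace each participant name with an ID built
# from the session date plus a per-session sequence number. The format
# (YYYYMMDD + two-digit index) is inferred from the IDs in this report.
def assign_ids(session_date: str, names: list[str]) -> dict[str, str]:
    return {name: f"{session_date}{i:02d}" for i, name in enumerate(names, start=1)}

ids = assign_ids("20131218", ["Participant A", "Participant B"])
print(ids["Participant B"])  # 2013121802
```

Once the mapping is discarded, the collected data can no longer be tied back to individual identities, as the report states.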
# | Participant ID | Gender | Age | Staff Role | Clinical Experience (Years) | Computer Experience (Years) | EHR Experience (Years)
1 | 2013121801 | M | 28 | Post-Doctoral Fellow | 5 | 18 | 1
2 | 2013121802 | F | 59 | Neuropsychologist | 7 | 30 | 0
3 | 2013121803 | F | 47 | RN Case Manager | 20 | 20 | 10
4 | 2013121804 | M | 30 | Clinical Associate | 5 | 12 | 2
5 | 2013121805 | M | 31 | Post-Doctoral Neuropsychology Fellow | 11 | 25 | 7
6 | 2013121806 | F | 31 | Clinical Case Manager | 10 | 16 | 2
7 | 2013121807 | M | 26 | Clinical Case Manager | 5 | 20 | 4
8 | 2013121808 | F | 25 | Psychology Student | 2 | 18 | 1.5
9 | 2013121809 | F | 25 | Psychology Student | 4 | 18 | 1
10 | 2013121810 | F | 46 | Program Manager | 18 | 20 | 0
11 | 2014010901 | F | 54 | Director, Outpatient | 24 | 15 | 4
12 | 2014010902 | M | 58 | Psychotherapist | 30 | 15 | 15
13 | 2014010903 | F | 48 | RN | 26 | 20 | 20
14 | 2014010904 | F | 46 | Health IT | 46 | 23 | 10
15 | 2014010905 | F | 44 | Clinician | 16 | 10 | 3
16 | 2014010906 | F | 61 | APN | 30 | 20 | 10
17 | 2014091101 | M | 43 | RN Case Manager | 20 | 20 | 1
18 | 2014091102 | F | 57 | RN | 35 | 20 | 0
19 | 2014091103 | F | 49 | Occupational Therapist | 30 | 20 | 2
20 | 2014091104 | M | 39 | Cognitive Therapist | 12 | 27 | 0
21 | 2014091105 | F | 27 | Therapist | 4 | 23 | 1
22 | 2014091106 | M | 28 | Neuropsychologist | 4 | 4 | 4
23 | 2015041501 | M | 67 | Project Manager | 10 | 30 | 8
24 | 2015041502 | M | 25 | Clinical Operations Navigator | 3 | 21 | 1
25 | 2015041503 | F | 38 | Clinical Operations Coor | 7 | 8 | 7
26 | 2015041504 | F | 60 | Clinical Navigator | 21 | 33 | 1
27 | 2015041505 | F | 32 | Business Analyst | 9 | 20 | 10
28 | 2015041506 | F | 34 | Applications Developer | 10 | 17 | 6
29 | 2015071301 | M | 36 | EHR Trainer and Support | 1 | 15 | 5
30 | 2015071302 | F | 29 | RN | 2 | 10 | 3
31 | 2015071303 | M | 44 | RN | 21 | 10 |
32 | 2015071304 | M | 42 | RN | 10 | 10 | 10
33 | 2015071305 | F | 36 | Clinical Data Specialist | 6 | 20 | 4
34 | 2015071306 | M | 51 | Director of Quality Assurance | 20 | 18 | 7
Thirty-four participants were recruited for the test, and all showed up to participate; no participants failed to show.
Participants were scheduled for 15- to 25-minute sessions with 5 to 15 minutes in between each session for debriefing with the administrator and resetting systems to proper test conditions. A spreadsheet was used to keep track of the participant schedule and included each participant's name.
Study Design
Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or for comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made.
During the usability test, participants interacted with two EHRs. Ten participants used one system in one location, six used the other system in another location, and six in a third location. The same instructions were provided to all participants. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant.
• Time to complete each task
• Number and types of errors
• Path deviations
• Participant verbalizations
• Tasks completed in the allotted time
• Participant’s satisfaction ratings of the system
A number of tasks were constructed that would be realistic and representative of the kinds of activities a
user might do with this EHR, including:
• Recording a patient allergy
• Changing a patient allergy
• Viewing a patient allergy
• Creating drug-drug and drug-allergy interventions prior to CPOE completion
• Adjustment of severity level of drug-drug interventions
• Record Medication List
• Change Medication List
• Access Medication List
• Create Prescriptions
• Record a Medication Order
• Change a Medication Order
• Access a Medication Order
• Record a Laboratory Order
• Change a Laboratory Order
• Access a Laboratory Order
• Record a Radiology/Imaging Order
EHR Usability Test Report – CCP Version 3.1 12
• Change a Radiology/Imaging Order
• Access a Radiology/Imaging Order
• Problem List Interventions
• Medication List Interventions
• Medication Allergy List Interventions
• Demographics Interventions
• Lab Tests and Results Interventions
• Vital Signs Interventions
• Identify User Diagnostic and Therapeutic Reference Information
• Configuration of CDS Interventions by user
• Reconcile Patient’s Active Medication List with Another Source
• Reconcile Patient’s Active Problem List with Another Source
• Reconcile Patient’s Active Medication Allergy List with Another Source
• Using assistive technology to verify right patient, med, dose, route and time
Tasks were selected based on their frequency of use, criticality of function, applicability toward
meaningful use, and their potential to be troublesome for users.
Procedures
Upon arrival, participants were greeted and their identity was verified and matched with the
participant’s name on the schedule. Participants were then assigned a participant ID. All participants
signed an informed consent form prior to the testing. The test administrator witnessed each
participant’s signing of the consent form.
A single CoCentrix staff member acted in the role of both the test administrator and data logger and
worked individually with each participant. Instructions were provided for each test; all comments from
participants were noted, along with all times, deviations, and errors. The CoCentrix staff member has a
clinical/technical background, having worked in the field of behavioral health technology for over 20
years.
Participants were instructed to perform the tasks:
• After listening to the instructions and viewing a brief demonstration of the task from the testing
administrator
• As quickly as possible
• Without assistance
Task timing began after the completion of the verbal instructions from the administrator and an
acknowledgement from the participant that they were ready to begin. The task time was stopped once
the participant indicated they had successfully completed the task.
Following the test, the administrator gave the participant the post-test questionnaire and then thanked
them for their time.
The CoCentrix staff member, responsible for logging data, recorded all participants' demographic
information, task success rate, time on task, errors, and deviations into a spreadsheet.
Test Location
The test was administered in a quiet setting where only the participant and test administrator were
present. To ensure that the environment was comfortable for users, noise levels were kept to a
minimum with the ambient temperature within a normal range.
Test Environment
The computer used for the testing was a laptop running Windows 7. Participants used a mouse while
interacting with the EHR. The CCP EHR and the Rcopia application are web-based solutions, so the
computer ran on a high-speed internet connection. All participants indicated that system
performance during the test was what they were used to seeing during their typical work day.
Test Forms and Tools
During the usability test, various instruments and documents were used, including:
• Informed consent
• Moderator guide
• Post-test questionnaire
Examples of these documents can be found in the Appendices.
Participant Instructions
The Administrator read the following instructions aloud to each participant:
“Thank you for participating in today’s usability study of CCP. In a few minutes, you will be asked to
perform a series of tasks and complete a user survey. Please attempt to complete each task as quickly
as possible. The idea behind this study is for CoCentrix to obtain information on where enhancements
are needed in the application based on how quickly, and easily, tasks are being performed in CCP.
When it is time to perform each task, I will state the instructions and then tell you to begin. Once you
have completed the task, please say ‘Done’. After you have completed the task, I will ask for feedback
on the actions you had taken during the task. You will be given a specified amount of time to complete
each task. This time will not be communicated to you as we are interested in seeing how long each task
does take for you to perform.”
Usability Metrics
The goals of this test were to assess:
1. The efficiency of CCP by measuring the length of time it takes for a user to complete specific
tasks, and the total number of tasks successfully completed during the study.
2. The efficiency of CCP by measuring the path deviations taken by the user during the tasks.
3. The effectiveness of CCP by measuring the number and types of errors experienced by the user
during the tasks.
4. The satisfaction of the user with CCP by logging their comments on the tasks and satisfaction
scores.
Data Scoring
The following details how each task was scored.

Task Time: Timing started when the administrator said 'Begin' and ended when the participant said
'Done'. If the participant finished without saying 'Done', the administrator stopped the clock when it
was clear the participant had completed the task. Task times were only counted if the participant
completed the task in the allotted time. The average time per task was calculated for each task.

Errors: A task resulted in an error if the participant failed to finish it in the allotted time, or
became 'stuck' and could not proceed without asking for assistance. Task time was not counted when a
task resulted in an error. We calculated the error % for each task by dividing the total number of
errors for the task by the total attempts at the task.

Path Deviations: Path deviations were recorded as actions taken during the task that were not part of
the actions needed to complete the task. We calculated path deviations by dividing the total number of
observed deviations by the total number of steps on an optimal path.

Task Success: A task was considered a success if the participant completed it in the allotted time. To
calculate the task success rate, we divided the total number of successful tasks by the total number of
tasks completed. The time designated for each task was determined by taking the optimal time to
complete the task and multiplying it by a factor of 1.25, to allow for users who may not have been
fully trained on the application.

Satisfaction: Participants' subjective impression of the ease of use of the application was measured by
administering a post-session questionnaire. Participants were asked to rate "Overall this task was:" on
a scale of 1 (Very Difficult) to 5 (Very Easy). This data was then averaged across participants.
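The scoring rules above are simple ratios. As an illustrative sketch only (the function names are ours, not part of the study materials), they can be written as:

```python
# Illustrative sketch of the Data Scoring rules above; names are our own.

def error_rate(total_errors, attempts):
    """Error % = total errors for a task / total attempts at the task."""
    return total_errors / attempts

def deviation_ratio(observed_steps, optimal_steps):
    """Path deviation measure = observed steps / steps on the optimal path."""
    return observed_steps / optimal_steps

def success_rate(successful, completed):
    """Task success rate = successful tasks / total tasks completed."""
    return successful / completed

def allotted_time(optimal_seconds, factor=1.25):
    """Time allowed = optimal completion time x 1.25, an allowance for
    participants not fully trained on the application."""
    return optimal_seconds * factor

# Example figures from the Adjust Severity Alert row of the results table:
# 2 errors in 6 attempts, 23 observed steps against an optimal 20.
print(round(error_rate(2, 6) * 100))      # 33 (% error rate)
print(round(deviation_ratio(23, 20), 2))  # 1.15
# The 50-second optimal time below is hypothetical.
print(allotted_time(50))                  # 62.5
```

The example error and deviation figures match the Adjust Severity Alert row of the results table; the 50-second optimal time is hypothetical.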
Results
Data Analysis and Reporting
The results of the usability test were calculated according to the methods specified in the Usability
Metrics section above.
The testing results for CCP and Dr. First Rcopia are detailed below. The table identifies the
tasks performed and the performance level achieved for each task.
Task | Participants | Task Success | Time to Complete (avg, sec) | Errors (total) | Error Rate (errors/attempts) | Deviations (total, observed/optimal) | Deviations (avg, observed/optimal) | Task Rating (avg, 5=Easy) | Risk Rating (5=Least Risk)
1. Add Allergy | 10 | 10 | 67 | 0 | 0 | 160/120 | 1.33 | 4.5 | 2
2. Change Allergy | 10 | 10 | 47 | 0 | 0 | 87/70 | 1.24 | 4.8 | 2
3. Access Allergy | 10 | 10 | 10.1 | 0 | 0 | 33/20 | 1.65 | 4.7 | 4
4. Drug-Drug, Drug-Allergy | 6 | 6 | 13.1 | 0 | 0 | 24/24 | 1.0 | 4.5 | 1
5. Adjust Severity Alert | 6 | 4 | 29.25 | 2 | 33% | 23/20 | 1.15 | 4.1 | 4
6. Record Medication (EP) | 6 | 5 | 17.8 | 1 | 16% | 25/25 | 1.0 | 4.0 | 3
7. Change Medication (EP) | 6 | 5 | 16 | 1 | 16% | 26/25 | 1.04 | 3.8 | 3
9. Access Medication (EP) | 6 | 6 | 4.5 | 0 | 0 | 6/6 | 1.0 | 3.8 | 3
11. Create Prescription | 6 | 4 | 150.8 | 2 | 33% | 142/92 | 1.54 | 3.1 | 2
12. Record Medication Order | 6 | 4 | 150.8 | 2 | 33% | 142/92 | 1.54 | 3.1 | 2
13. Change Medication Order | 6 | 5 | 16 | 1 | 16% | 26/25 | 1.04 | 3.8 | 3
14. Access Medication Order | 6 | 6 | 4.5 | 0 | 0 | 6/6 | 1.0 | 3.8 | 3
15. Record Lab Order | 6 | 6 | 100.0 | 0 | 0 | 75/60 | 1.25 | 4.0 | 2
16. Change Lab Order | 6 | 6 | 57.5 | 0 | 0 | 54/48 | 1.125 | 4.0 | 2
17. Access Lab Order | 6 | 6 | 3.67 | 0 | 0 | 6/6 | 1.0 | 4.67 | 4
18. Record Radiology Order | 6 | 6 | 80.0 | 0 | 0 | 76/66 | 1.15 | 4.0 | 2
19. Change Radiology Order | 6 | 6 | 50.83 | 0 | 0 | 54/48 | 1.125 | 4.0 | 2
20. Access Radiology Order | 6 | 6 | 3.83 | 0 | 0 | 7/6 | 1.17 | 4.67 | 4
21. Problem List Intervention | 6 | 6 | 66.5 | 0 | 0 | 49/42 | 1.17 | 4.67 | 3
22. Medication List Intervention | 6 | 6 | 54.17 | 0 | 0 | 31/24 | 1.29 | 4.5 | 3
23. Medication Allergy List Intervention | 6 | 6 | 92.83 | 0 | 0 | 64/54 | 1.19 | 4.5 | 3
24. Demographics Intervention | 6 | 6 | 25.67 | 0 | 0 | 14/12 | 1.17 | 4.17 | 3
25. Lab Test and Results Intervention | 6 | 6 | 88.67 | 0 | 0 | 124/72 | 1.72 | 4.17 | 3
26. Vital Signs Intervention | 6 | 6 | 11.16 | 0 | 0 | 67/54 | 1.24 | 4.33 | 3
27. Identify User Diagnostic and Therapeutic Reference | 6 | 6 | 13.83 | 0 | 0 | 33/30 | 1.10 | 4.5 | 3
28. Configure CDS Intervention by User | 6 | 6 | 48.83 | 0 | 0 | 42/30 | 1.40 | 4.17 | 4
29. Reconcile Medication List with Other Source | 6 | 6 | 41.5 | 0 | 0 | 39/36 | 1.08 | 4.5 | 3
30. Reconcile Problem List with Other Source | 6 | 6 | 48.33 | 0 | 0 | 42/36 | 1.17 | 4.5 | 3
31. Reconcile Medication Allergy List with Other Source | 6 | 6 | 41.83 | 0 | 0 | 38/36 | 1.05 | 4.5 | 3
32. Order Medication (EH) | 6 | 6 | 100.17 | 0 | 0 | 102/66 | 1.54 | 4.67 | 2
33. Change Medication Order (EH) | 6 | 6 | 40.83 | 0 | 0 | 49/42 | 1.17 | 4.83 | 2
34. View/Access Medication Order (EH) | 6 | 6 | 4.5 | 0 | 0 | 7/6 | 1.17 | 5 | 2
35. Use eMAR | 6 | 6 | 1.38 | 0 | 0 | 77/72 | 1.07 | 4.88 | 2
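As a quick consistency check (an illustrative sketch, not part of the study materials), the reported deviation averages can be recomputed from the observed/optimal step counts in three of the rows above:

```python
# Spot-check of the Deviations columns above: the reported average equals
# the observed step count divided by the optimal step count.
rows = {
    "Add Allergy": (160, 120, 1.33),
    "Change Allergy": (87, 70, 1.24),
    "Create Prescription": (142, 92, 1.54),
}
for task, (observed, optimal, reported) in rows.items():
    assert round(observed / optimal, 2) == reported, task
print("deviation averages consistent")  # prints if every row checks out
```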
Effectiveness
All users were able to effectively record, change and view allergies with no failures from the 10
participants. All users were able to effectively receive a drug-drug and drug-allergy intervention and
access medications with no failures from the 6 participants. 5 out of 6 participants were able to
effectively record and change a medication, with one failure. 4 out of 6 participants were able to
effectively adjust the severity level and create a prescription, with 2 failures. 6 out of 6 participants were
able to record/change/view lab/radiology orders, reconcile meds/allergies/problems and trigger clinical
decision support. With very minimal demonstration, the vast majority of users were able to mimic the
administrator's actions and follow a good system flow. Eligible Hospital CPOE medication ordering,
modifying and viewing found 6 out of 6 participants able to effectively complete these tasks. eMAR
found 6 out of 6 participants able to effectively complete these tasks.
Efficiency
Tasks 1 thru 3 - Interestingly, most users did not take the optimal path to complete the action, yet
their average task time was very close to optimal. Users seemed to pick up speed as they began to
perform similar actions, illustrating that the design of the system provides strong repetition and
requires fewer alternate actions for the user to remember. The majority of wasted time and path
deviations surrounded the user attempting to search for a medication.
Tasks 4 – 12 – Users appeared to spend substantial time hunting for what they were trying to
accomplish. This was reinforced by feedback that the screen was too cluttered and that users would have
preferred a stronger screen focus on the current task. Because of limited screen real estate, they also
appeared to scroll frequently while attempting to complete their tasks. In tasks with only one or two
path deviations, where one or two extra mouse clicks were necessary, times were close to optimal, but
longer tasks widened the gap between actual and optimal times.
Tasks 13 – 31 – Recording lab/radiology orders allowed users to maximize their speed, with save,
continue, and sign buttons placed in logical progression through the screens. This also allowed users
to accelerate when modifying an order, since the process is quite similar. Clinical Decision Support
alerts appeared quickly after the user saved the information and exposed as much or as little
information as the user wanted. Reconciliation appeared slow on the first user action but became
substantially faster after the initial save.
Tasks 32 – 34 – Recording, modifying, and viewing CPOE medication orders found users able to quickly
complete the tasks, since many of the needed SIG fields were auto-populated with the full order from
First Data Bank. The modification of an order is so similar to its original creation that most users
picked up speed; daily use of the product would certainly further improve an already efficient
workflow.
Task 35 – Using assistive technology to administer to the right patient, right medication, right dose, right
route and right time, participants had no problem using the bar code scanner and having the result
populate the correct fields (patient and medication).
Satisfaction
Tasks 1 thru 3 - All users rated each task’s ease of use from 4.5 to 4.8, on a scale of 1 to 5, 5 illustrating
the easiest. They typically finished each exercise, commenting “that was really easy”. They also rated
their overall satisfaction with the system at 4.4 on a scale of 1 to 5, 5 being the highest.
Tasks 4 thru 12 – User task ratings ranged from 3.1 to 4.5, on a scale of 1 to 5, 5 illustrating the
easiest. While on some short tasks users commented "that's it?", the longer tasks showed users becoming
lost and then quickly frustrated. While this may just require more training, these types of responses
hurt overall satisfaction, with users rating the system 3.8 on a scale of 1 to 5, 5 being the
highest.
Tasks 13 – 31 – User task ratings ranged from 4.0 to 4.67, on a scale of 1 to 5, 5 illustrating the
easiest. The users seemed quite satisfied with the amount of effort needed and with the ease and
quality of the information returned. These were fairly high ratings given that very little training was
provided; they would likely increase as users received formal training and used the system daily. The
users also rated their overall satisfaction with the system at 4.33 on a scale of 1 to 5, 5 being the
highest.
Tasks 32 – 34 – User task ratings ranged from 4.67 to 5.0 on a scale of 1 to 5, 5 illustrating the
easiest. The users frequently commented on how intuitive the design was and on the minimal screen
clutter. The users' satisfaction scores certainly reinforce the impression of a very simple user
interface and their satisfaction with the overall product. This group also rated their overall
satisfaction with the system at 4.67 on a scale of 1 to 5, 5 being the highest.
Task 35 – User task ratings ranged from 4.5 to 5.0 on a scale of 1 to 5, 5 illustrating the easiest.
The users frequently commented on how intuitive the design was and on the minimal screen clutter. The
users' satisfaction scores certainly reinforce the impression of a very simple user interface and their
satisfaction with the overall product. This group also rated their overall satisfaction with the system
at 4.83 on a scale of 1 to 5, 5 being the highest.
Major Findings
Tasks 1 thru 3 - The majority of user reactions to the workflows were positive, and navigation seemed
very straightforward to users once they began recording an allergy. While some users may have gotten
lost during their workflow, none experienced an error that prevented them from moving forward or
required the test administrator's intervention. Users, with no training and just a quick
demonstration, were able to catch on very quickly and navigate the system, displaying its intuitive
nature.
Tasks 4 – 12 – While the majority of users were able to complete the tasks, some users became easily
frustrated and began guessing at which button to choose as they quickly became lost. Users were vocal
about the screens displaying too much information, suggesting that some of it should be shown in
separate windows. While some of the standard, daily tasks were quick and efficient, the longer tasks
produced more errors and took much more time. Interestingly, the users gave an average rating of 3.88
for ease of use but an overall satisfaction rating of 3.1, largely citing the user interface as the
reason for some dissatisfaction.
Tasks 13 – 31 – The order entry of labs and radiology was found surprisingly easy by the users. While
these are generally more complex tasks, users found a very intuitive design flow to their work, where
the next step for them to take was logical and well placed on the screen. While they would have
preferred more prominent identification of required fields, when fields were missed the system very
quickly notified them of the missing pieces, which they found easy to correct. Users never felt
overwhelmed by the amount of information, buttons, and options on screen, which was a frequent issue
in medication ordering. The clinical information reconciliation was very quick and easy for the users,
though after clicking to remove an item, the delay typically prompted the user to click again and
again; a redesign might consider showing the user some type of progress status. After the initial save,
though, the repeated actions moved much faster. Users did mention wanting to see a bit more detail,
perhaps more columns and the ability to click through to the full information of an entry. Finally, the
users triggered clinical decision support with ease and could see a quick, condensed view of
information, with an easy ability to expand for further details. This was their first time receiving
interactive pop-ups, and they always paused to review the information. Repeated use, though, should be
measured in the future for pop-up fatigue and a tendency to stop fully reviewing the information and
simply close the windows.
Tasks 32 – 34 – All of the participants complimented the system's intuitive workflow, its clean design,
and the ease with which they learned the product. They all commented that even the minor areas they
found that could use improvement were simply a product of not using the product on a daily basis. They
all felt thorough training and daily product use would easily solve these minor concerns. As medication
ordering can be one of the largest areas of potential patient risk, a clean, simple UI can take a
complex need and simplify it for the end user, as is apparent in this system.
Task 35 – While completing the eMAR tasks, participants commented on how easy it is to use scanning to
document medication administration. Several nurses expressed satisfaction with the way wrong
medications can be caught before being administered. One participant wanted to start using the eMAR the
very same day!
Areas for Improvement
Tasks 1 thru 3 - For the majority of users, navigation involving the search for a medication was a
constant source of path deviation. There is a text field and a search button; users frequently
attempted to type in the field rather than click the search button, which launches another screen where
they could type in a few letters and run a medication search. Users would frequently say, "the keyboard
is not working", obviously expecting the search feature to work differently. It was also noted that
while all the data elements were on the left-hand side of the screen, the scroll bar was on the right.
The data entry screens were long enough that the user had to alternate between entering data and moving
across the screen to scroll. Laying out the screen in two columns would have made more efficient use of
the user's time and removed the need for scrolling.
Tasks 4 thru 12 – A cleaner screen for users to work in, with the items they are currently working on
moved to the top, would certainly be valuable. The screens would benefit from an ability to collapse
the top portion and have it reappear on mouse-over, to make the best use of screen real estate.
Compartmentalizing the data could also help users find the same types of tasks in the same portion of
the screen each time, rather than scrolling to find them. Also, when completing a prescription, it may
be valuable to have a pop-up prompt for signing; users seemed to hunt for it and sometimes forgot to
sign a script.
Tasks 13 – 31 – As mentioned previously for lab/radiology orders, a more prominent indication of which
items were required was a consistent message from users. Screen utilization should also be
reconsidered, weighing the amount of space used for headers and left-hand columns against the area
dedicated to the particular task. Reconciliation should strongly consider notifying the user that an
item is being reconciled/deleted, to avoid multiple unnecessary clicks. Pop-up fatigue on clinical
decision support interventions should also be researched further.
Tasks 32 – 34 – The only two areas of any significance were the selection of duration type prior to the
number and possibly the time taken to return the full orders from First Data Bank. Reported issues of
users not knowing whether an order was signed, or what state it was in, should easily be solved with
thorough user training on the Active, Completed, Discontinued, and Pending order states.
Task 35 – One participant suggested a different layout for the data when the medication was scanned:
they wanted the medication, route, dose, and time to display next to the order detail. Another
participant suggested the bar code window not close after scanning the patient, but rather remain open
while the medication is being scanned.
Appendices
The following appendices include supporting data for this usability study.
1. Sample Consent Form
CCP Usability Study
Practice Consent
CoCentrix would like to thank you for your participation in this study. The purpose of the study is to evaluate the
usability of the CoCentrix CCP electronic health record. Your participation in this study will include performing
specific tasks within CCP; and completing a short survey following the study. The study should take approximately
15 minutes. The information collected by CoCentrix during the study is for research purposes only. Your
participation in this study is voluntary, so you are free to withdraw at any point during the study.
I understand and agree that the purpose of this study is to make software applications more useful and usable in
the future.
By signing below, I agree to participate in the study.
Organization Name
Name of Participant
Gender
Age
Staff Role
Clinical Experience (in years)
Computer Experience (in years)
EHR Experience (in years)
Date of Study
Printed Name Signature Date
2. Sample Post-Test Questionnaire
CoCentrix CCP User Survey
1. Rate the task of adding an allergy?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
2. Rate the task of changing an allergy?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
3. Rate the task of accessing/viewing an allergy?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
4. Rate the task of creating a drug-drug and drug-allergy intervention?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
5. Rate the task of adjusting the level of severity of drug-drug interventions?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
6. Rate the task of recording a medication list?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
7. Rate the task of changing a medication list?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
8. Rate the task of accessing/viewing a medication list?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
9. Rate the task of creating a prescription?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
10. Rate the task of recording an EP medication order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
11. Rate the task of changing an EP medication order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
12. Rate the task of accessing/viewing an EP medication order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
13. Rate the task of recording a lab order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
14. Rate the task of changing a lab order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
15. Rate the task of accessing/viewing a lab order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
16. Rate the task of recording a radiology order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
17. Rate the task of changing a radiology order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
18. Rate the task of accessing/viewing a radiology order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
19. Rate the task of triggering clinical decision support intervention for problems?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
20. Rate the task of triggering clinical decision support intervention for medications?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
21. Rate the task of triggering clinical decision support intervention for med allergies?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
22. Rate the task of triggering clinical decision support intervention for demographics?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
23. Rate the task of triggering clinical decision support intervention for lab tests and results?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
24. Rate the task of triggering clinical decision support intervention for vital signs?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
25. Rate the task of identifying user diagnostic and therapeutic reference information?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
26. Rate the task of configuring of CDS interventions by user?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
27. Rate the task of reconciling medication list with another source?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
28. Rate the task of reconciling medication allergy list with another source?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
29. Rate the task of reconciling problem list with another source?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
30. Rate the task of recording an EH medication order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
31. Rate the task of changing an EH medication order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
32. Rate the task of accessing/viewing an EH medication order?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
33. Rate the task of administering a medication via eMAR using assistive technology?
Overall, this task was: ______
(Very Difficult 1 to Very Easy 5)
34. On a scale of 1-5, how would you rate your overall satisfaction with the system?
Overall, the system was: ______
(Very Dissatisfied 1 to Very Satisfied 5)
3. Moderator Guide
CoCentrix CCP Usability Study
Allergy List
Tasks to perform during study:
Medication Allergy List
Preparation | Task | Goal | Optimal Path (steps)
1. No Allergies | Create new allergy | 50 seconds | 12
2. Existing Allergy | Modify existing allergy | 27 seconds | 7
3. Existing Allergy | Access/View allergy | 7 seconds | 2

Create New Allergy Optimal Path: Click Allergy Button → Click New Allergy Button → Click Allergy Type
Drop Down → Select Allergy from Drop Down → Click Search Button → Type first three letters in search
field → Click Search → Select Allergy from Lookup → Click Reaction Type Drop Down → Select Reaction
from Drop Down → Click Severity Type Drop Down → Select Severity from Drop Down → Click Save

Modify Existing Allergy Optimal Path: Click Allergy Button → Select Allergy from List → Click
Expiration Date → Enter Expiration Date → Click Expiration Reason Drop Down → Select Expiration Reason
from Drop Down → Click Save

Access/View Allergy Optimal Path: Click Allergy Button → Select Allergy from List
Drug-Drug, Drug-Allergy Intervention Checks

Preparation | Task | Goal | Optimal Path (steps)
1. Allergies Present | Create drug-drug and drug-allergy intervention | 8 seconds | 4
2. Severity Level Set | Adjust severity level of drug-drug interventions | 27 seconds | 5

Create Drug-Drug, Drug-Allergy Intervention Optimal Path: Click Favorites → Select medication from
list → Click Use button → View drug-drug and drug-allergy alerts

Adjust Severity Level of Drug-Drug Interventions Optimal Path: Click Options → Click
Preferences-Practice → Find "When checking for drug-drug interactions, show:" and Click Drop Down →
Select "Severe and Contraindicated Only" → Click Make Changes Button
Medication List

Preparation | Task | Goal | Optimal Path (steps)
1. Medications Present | Record medication | 13 seconds | 5
2. Medications Present | Change medication | 11 seconds | 5
3. Medications Present | Access/View medications | 2 seconds | 1

Create New Medication Optimal Path: Click Manage Medications → Click Favorites Drop Down → Select
Medication → Click Add → Click Continue

Modify Existing Medication Optimal Path: Click Manage Medications → Click Modify → Click Sig times
per day Drop Down → Select different amount → Click Continue

Access/View Medication Optimal Path: Click Manage Medications → View Medications
Electronic Prescriptions

Preparation | Task | Goal | Optimal Path (steps)
1. Medications Present | Prescribe new medication | 75 seconds | 23

Create Prescription Optimal Path: Type 4 letters into search → Click Find Button → Select
Medication → Click Sig Action Drop Down → Select Action → Click Amount Drop Down → Select Amount
→ Click Measurement Drop Down → Select Measurement → Click Route Drop Down → Select Route
→ Click Times Per Day Drop Down → Select Times Per Day → Click Direction Drop Down → Select
Direction → Click Duration Drop Down → Select Duration → Click Date Today → Select Continue →
Click Prescribe Anyway → Click OK → Type in Password → Click Send
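To make the step-count bookkeeping concrete, the Create Prescription optimal path above can be encoded as a list of steps; this is an illustrative sketch, not part of the study materials, and the constant name is ours.

```python
# The Create Prescription optimal path above, encoded as a list of steps.
# A participant's deviation average is (steps taken) / len(OPTIMAL_PATH).
OPTIMAL_PATH = [
    "Type 4 letters into search", "Click Find Button", "Select Medication",
    "Click Sig Action Drop Down", "Select Action",
    "Click Amount Drop Down", "Select Amount",
    "Click Measurement Drop Down", "Select Measurement",
    "Click Route Drop Down", "Select Route",
    "Click Times Per Day Drop Down", "Select Times Per Day",
    "Click Direction Drop Down", "Select Direction",
    "Click Duration Drop Down", "Select Duration",
    "Click Date Today", "Select Continue", "Click Prescribe Anyway",
    "Click OK", "Type in Password", "Click Send",
]
assert len(OPTIMAL_PATH) == 23  # matches the 23-step count in the table
print(75 * 1.25)                # 93.75: goal time x the 1.25 allowance factor
```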
CPOE

Preparation | Task | Goal | Optimal Path (steps)
1. Medications Present | Record medication order (EP) | 75 seconds | 23
2. Medications Present | Change medication order (EP) | 11 seconds | 5
3. Medications Present | Access/View medication order (EP) | 2 seconds | 1
1. None Present | Record lab order | 50 seconds | 10
2. None Present | Change lab order | 35 seconds | 8
3. None Present | Access/View lab order | 3 seconds | 1
1. None Present | Record radiology order | 52 seconds | 11
2. None Present | Change radiology order | 35 seconds | 8
3. None Present | Access/View radiology order | 2 seconds | 1
1. None Present | Record medication order (EH) | 70 seconds | 11
2. Medications Present | Change medication order (EH) | 24 seconds | 7
3. Medications Present | Access/View medication order (EH) | 4 seconds | 1
Create Medication Order Optimal Path (EP): Type 4 letters into search → Click Find Button → Select
Medication → Click Sig Action Drop Down → Select Action → Click Amount Drop Down → Select Amount
→ Click Measurement Drop Down → Select Measurement → Click Route Drop Down → Select Route
→ Click Times Per Day Drop Down → Select Times Per Day → Click Direction Drop Down → Select
Direction → Click Duration Drop Down → Select Duration → Click Date Today → Select Continue →
Click Prescribe Anyway → Click OK → Type in Password → Click Send

Modify Existing Medication Order Optimal Path (EP): Click Manage Medications → Click Modify →
Click Sig times per day Drop Down → Select different amount → Click Continue

Access/View Medication Order Optimal Path (EP): Click Manage Medications → View Medications

Create Lab Order Optimal Path: Click Add New Order → Click Lab Button → Click in Lab Field Drop
Down → Click in Frequency → Select Drop Down → Type in Reason → Click Save & Continue Button
→ Click Sign Checkbox → Click Sign Selected Orders → Click Sign

Modify Existing Lab Order Optimal Path: Click Modify → Click Perform Selected Actions → Click Days
→ Click Sun → Click Save & Continue → Click Sign Checkbox → Click Sign Selected Orders → Click Sign

Access/View Lab Order Optimal Path: Click Active Orders

Create Radiology Order Optimal Path: Click Add New Order → Click Imaging/Other Study Button → Click
Radiology/Imaging Button → Click in Test Field → Click in Drop Down → Click in Frequency Drop Down →
Type in Reason → Click Save & Continue Button → Click Sign Checkbox → Click Sign Selected Orders →
Click Sign

Modify Existing Radiology Order Optimal Path: Click Modify → Click Perform Selected Actions → Click
Days → Click Sun → Click Save & Continue → Click Sign Checkbox → Click Sign Selected Orders → Click
Sign

Access/View Radiology Order Optimal Path: Click Active Orders

Create Medication Order Optimal Path (EH): Click New Order → Type synth → Select Synthroid
→ Select Synthroid 100 MCG → Enter Duration 30 → Select Days → Type Hypothyroidism under
Indication → Click Save and Continue → Select Medication → Click Sign Selected Orders → Click Sign

Modify Existing Medication Order Optimal Path (EH): Check Modify → Click Perform Selected Actions
→ Change Duration to 20 days → Click Save & Continue → Select Medication → Click Sign Selected
Orders → Click Sign

Access/View Medication Order Optimal Path (EH): Click Active Orders → View Medications
Clinical Information Reconciliation

Preparation | Task | Time Goal | Steps in Optimal Path
1. 2 Problem Lists | Reconcile problems with another source | 30 seconds | 6
2. 2 Medication Lists | Reconcile medications with another source | 30 seconds | 6
3. 2 Medication Allergy Lists | Reconcile medication allergies with another source | 30 seconds | 6
Reconcile Problems Optimal Path: Click Problem → Click Delete → Confirm Delete → Click Problem → Click Reconcile → Confirm Reconcile
Reconcile Medications Optimal Path: Click Medication → Click Delete → Confirm Delete → Click Medication → Click Reconcile → Confirm Reconcile
Reconcile Medication Allergies Optimal Path: Click Medication Allergy → Click Delete → Confirm Delete → Click Medication Allergy → Click Reconcile → Confirm Reconcile
Clinical Decision Support
Preparation | Task | Time Goal | Steps in Optimal Path
1. Build CDS Intervention for Problems | Trigger Intervention for Problems | 55 seconds | 7
2. Build CDS Intervention for Medications | Trigger Intervention for Medications | 50 seconds | 4
3. Build CDS Intervention for Medication Allergy | Trigger Intervention for Medication Allergy | 65 seconds | 9
4. Build CDS Intervention for Vitals | Trigger Intervention for Vitals | 40 seconds | 9
5. Build CDS Intervention for Demographics | Trigger Intervention for Demographics | 15 seconds | 2
6. Build CDS Intervention for Lab Results | Trigger Intervention for Lab Results | 65 seconds | 12
7. Build Bibliographic and Reference Information | Identify Reference Information | 10 seconds | 5
8. Build CDS Configurations | Disable CDS Interventions | 30 seconds | 5
Trigger CDS Intervention for Problems Optimal Path: Click Diagnosis → Click ICD9 → Type Major Depre → Click Search → Select 296.31 → Select Patient Program → Click Save
Trigger CDS Intervention for Medications Optimal Path: Click Add New Medication → Type Name - Fluoxetine → Type RxNorm 403971 → Click Save
Trigger CDS Intervention for Medication Allergy Optimal Path: Click Add New Allergy → Type Name - Penicillin G → Click Save → Select Allergy Type - Drug Allergy → Select Reaction - Hives → Type penic → Click Search → Select Penicillin G → Click Save
Trigger CDS Intervention for Vitals Optimal Path: Click Add New Clinical Measurement → Click Time Measured → Select Date → Select Time → Enter Height 72 → Enter Height Unit Inches → Enter Weight 300 → Enter Weight Unit lbs → Click Save
Trigger CDS Intervention for Demographics Optimal Path: Enter Date of Birth → Click Save
Trigger CDS Intervention for Lab Results Optimal Path: Click Add New Lab Result → Click Description → Enter Hemoglobin → Click Save → Click Drop Down Arrow → Click Result Values → Click Add New Result Value → Click Result Field → Enter A1c → Click in Value Field → Enter 10% → Click Save
Identify User Diagnostic Therapeutic Information Optimal Path: Click Save → Click CDS Alert → Click Expand Bibliographic Citation → Click Scroll Bar → Click Hyperlink
Configure CDS by User Optimal Path: Click CRM → Click Clinical Decision Support → Select Intervention → Select Physician → Click Save
eMAR
Preparation | Task | Time Goal | Steps in Optimal Path
1. Medication Orders Present | Use assistive technology to administer medication | 65 seconds | 12

Create eMAR Optimal Path: Select patient from caseload list with a click on the name → Click the MAR Button → Click the Due tab → Click the Barcode button → Click in the Patient Barcode field and scan the patient’s ID → Click the Search icon → Click the Barcode button → Click the Medications tab → Click in the NDC field and scan the medication → Click the Search icon → Click the Sign Administration Charting button → Click the Sign button in the Sign Administration Charting window
2. Participant Demographics

Gender | Tasks 1-3 | Tasks 4-12 | Tasks 13-31 | Tasks 32-34 | Task 35
Male | 4 | 1 | 3 | 2 | 4
Female | 6 | 5 | 3 | 4 | 2
Staff Role Tasks 1-3 Tasks 4-12 Tasks 13-31 Tasks 32-34 Task 35
Post-Doctoral Fellow 2
RN Case Manager 1 1 1
Neuropsychologist 1 1
RN 1 3
Clinical Associate 1 1 1
Clinical Case Manager 2
Occupational Therapist 1
Psychology Student 2
Program Manager 1 1 1
Cognitive Therapist 1
Psychotherapist 1 1
Advanced Practice Nurse 1
Health IT 1 6 1
Years of Clinical Experience | Tasks 1-3 | Tasks 4-12 | Tasks 13-31 | Tasks 32-34 | Task 35
<5 | 2 | 1 | 2 | 1 | 2
<10 | 4 | 0 | 0 | 4 | 1
>10 | 4 | 5 | 4 | 1 | 3

Years of Computer Experience | Tasks 1-3 | Tasks 4-12 | Tasks 13-31 | Tasks 32-34 | Task 35
<20 | 5 | 3 | 1 | 2 | 5
>20 | 5 | 3 | 5 | 4 | 1
Years of EHR Experience | Tasks 1-3 | Tasks 4-12 | Tasks 13-31 | Tasks 32-34 | Task 35
<1 | 2 | 0 | 4 | 2 | 0
<3 | 5 | 2 | 1 | 0 | 0
>3 | 3 | 4 | 1 | 4 | 6
January 14, 2014

Melissa Martin
EHR Coordinator
Drummond Group, Inc.
13359 North Highway 183, Suite B-406-238
Austin, TX 78750

Re: Quality Management System

Dear Ms. Martin,

Attached you will find our response regarding our Quality Management System. Please let me know if you have any further questions.

Sincerely,
Neal Tilghman
Vice President, Product
CoCENTRIX QMS is structured around Visual Studio Team Foundation Server for managing our Agile/Scrum projects within our Software Development Life Cycle. It incorporates quality planning, provides a Scrum framework for managing the activities that enable the company to create software which consistently satisfies customer and regulatory requirements, and is a tool for achieving enhanced customer satisfaction. Our QMS definition consists of Scrum’s roles, events, artifacts, and the rules that bind them. Scrum is founded on empirical process control theory, employing frequent inspection and adaptation to optimize predictability and control risk. Activities affecting quality are documented in accordance with written manuals, procedures, instructions, specifications, and charts that contain appropriate criteria for determining whether prescribed activities have been satisfactorily accomplished.

Scrum Roles:
Scrum Master
Product Owner
Team Members
Stakeholders
Users
Our process also provides for continual improvement by monitoring processes based on their significance, measuring their effectiveness against objectives, and managing processes for improvement. After each Sprint, our process calls for a Review meeting in which the team presents to management, customers, users, and the Product Owner the product increment it has built during the Sprint and receives any feedback. In addition to the Review, a Sprint Retrospective meeting takes place in which Scrum Team members reflect on their work, process, and quality, producing an actionable plan for improvement. We are not currently using a Quality Management System for Implementation and Maintenance.
From:
Neal Tilghman
Senior Vice President, Product
CoCentrix, Inc.
540 North Tamiami Trail
Sarasota, FL 34236
(941) 306-4951

October 1, 2014

Dear Gary Isaac,
1. Are the audit log and audit log status record enabled by default?
Yes.
2. Is encryption of electronic health information on end-user devices enabled by default?
Yes. The web application uses “no_cache” headers to prevent storage of information on end-user devices. In addition, MS CRM does not allow users to disable encryption.
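The no-cache mechanism described above can be sketched generically. The helper below is illustrative only, assuming a typical web framework that accepts a header dictionary; the header names and values are the standard HTTP cache-control directives, not CoCENTRIX's actual code:

```python
# Hypothetical sketch: HTTP response headers that tell browsers not to store
# a response carrying health information on the end-user device. Standard
# HTTP directives; the helper function itself is illustrative.

def no_cache_headers() -> dict:
    """Return headers that forbid client-side caching of the response."""
    return {
        "Cache-Control": "no-cache, no-store, must-revalidate",  # HTTP/1.1
        "Pragma": "no-cache",                                    # HTTP/1.0 clients
        "Expires": "0",                                          # treat as already expired
    }

# A framework would merge these into every PHI-bearing response, e.g.:
#   response.headers.update(no_cache_headers())
```

The `no-store` directive is the key one: `no-cache` alone still permits storage as long as the cache revalidates, while `no-store` forbids writing the response to disk at all.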
3. Does the EHR SUT allow a user to disable the following?
The audit log can be disabled. The audit log status and encryption cannot be disabled.
4. Does the EHR SUT permit any users to delete electronic health information?
[IN170.314(d)(2)-3.04]
Yes.
5. Describe how the audit logs are protected from being changed, overwritten or deleted by the EHR technology. [IN170.314(d)(2)-4.01]
6. Describe how the EHR is capable of detecting whether the audit logs have been altered. [IN170.314(d)(2)-5.01]
The audit log is protected by database controls; only authorized users can access the database. Therefore the database audit log would track user access to the log.
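Database-level protection of an audit log, as described above, can be approximated with an append-only table. The SQLite sketch below is a hypothetical illustration of such controls (triggers that abort any UPDATE or DELETE against the log); it is not the EHR's actual schema or database engine:

```python
import sqlite3

# Illustrative sketch: an append-only audit table enforced at the database
# layer. Inserts succeed; triggers reject any UPDATE or DELETE, so entries
# cannot be changed, overwritten, or deleted through normal database access.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE audit_log (
        id     INTEGER PRIMARY KEY,
        actor  TEXT NOT NULL,
        action TEXT NOT NULL,
        at     TEXT NOT NULL
    );
    CREATE TRIGGER audit_no_update BEFORE UPDATE ON audit_log
        BEGIN SELECT RAISE(ABORT, 'audit log is append-only'); END;
    CREATE TRIGGER audit_no_delete BEFORE DELETE ON audit_log
        BEGIN SELECT RAISE(ABORT, 'audit log is append-only'); END;
""")
conn.execute("INSERT INTO audit_log (actor, action, at) "
             "VALUES ('jsmith', 'viewed-record', '2014-10-01T09:00')")

# Any attempt to erase an entry is rejected by the trigger.
try:
    conn.execute("DELETE FROM audit_log")
except sqlite3.DatabaseError as exc:
    print("rejected:", exc)
```

Restricting database accounts (so only authorized users can connect at all) then layers on top of this: the triggers guard the rows, and access control plus the database's own audit trail records who touched the log.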
Sincerely,
Neal Tilghman

10/2/2014