
Test Results Summary for 2014 Edition EHR Certification, Version 1.0, February 20, 2016

ONC HIT Certification Program
Test Results Summary for 2014 Edition EHR Certification
Note: Product inherited certification from CHPL Product Number: 140167R25

Part 1: Product and Developer Information

1.1 Certified Product Information

Product Name:

HELP2 Clinical Desktop

Product Version:

2016.M01.03

Domain:

Inpatient

Test Type:

Modular EHR

1.2 Developer/Vendor Information

Developer/Vendor Name:

Intermountain Healthcare

Address:

36 S. State Street, Salt Lake City, UT 84111

Website:

www.intermountainhealthcare.org

Email:
Phone:
Developer/Vendor Contact:

Rebecca Farr

Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information

ONC-ACB Name:

ICSA Labs, an independent division of Verizon

Address:

1000 Bent Creek Blvd, Suite 200 Mechanicsburg, PA 17050

Website:

https://www.icsalabs.com/technology-program/onc-ehr

Email:

[email protected]

Phone:

717.790.8100

ONC-ACB Contact:

Amit Trivedi

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:

ONC-ACB Authorized Representative: Amit Trivedi
Function/Title: Program Manager – Healthcare
Signature and Date: Amit V. Trivedi, 2/20/2016

Template Version 1

Page 1 of 2

Test Results Summary for 2014 Edition EHR Certification, Version 1.0, December 23, 2015

2.2 Gap Certification

The following identifies the criteria certified via gap certification under §170.314:
(a)(1), (a)(6), (a)(7), (a)(17), (b)(5)*, (d)(1), (d)(5), (d)(6), (d)(8), (d)(9), (f)(1)

*Gap certification allowed for Inpatient setting only

No gap certification

2.3 Inherited Certification

The following identifies the criteria certified via inherited certification under §170.314:
(a)(1)–(a)(15), (a)(16) Inpt. only, (a)(17) Inpt. only, (b)(1)–(b)(5), (b)(6) Inpt. only, (b)(7), (c)(1)–(c)(3), (d)(1)–(d)(8), (d)(9) Optional, (e)(1), (e)(2) Amb. only, (e)(3) Amb. only, (f)(1)–(f)(3), (f)(4) Inpt. only, (f)(5) Optional & Amb. only, (f)(6) Optional & Amb. only, (g)(1)–(g)(4)

No inherited certification


ONC HIT Certification Program
Test Results Summary for 2014 Edition EHR Certification
Note: Product inherited certification from CHPL Product Number: 140167R20

Part 1: Product and Developer Information

1.1 Certified Product Information

Product Name:

HELP2 Clinical Desktop

Product Version:

2015.M10.09

Domain:

Inpatient

Test Type:

Modular EHR

1.2 Developer/Vendor Information

Developer/Vendor Name:

Intermountain Healthcare

Address:

36 S. State Street, Salt Lake City, UT 84111

Website:

www.intermountainhealthcare.org

Email:
Phone:
Developer/Vendor Contact:

Rebecca Farr

Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information

ONC-ACB Name:

ICSA Labs, an independent division of Verizon

Address:

1000 Bent Creek Blvd, Suite 200 Mechanicsburg, PA 17050

Website:

https://www.icsalabs.com/technology-program/onc-ehr

Email:

[email protected]

Phone:

717.790.8100

ONC-ACB Contact:

Amit Trivedi

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:

ONC-ACB Authorized Representative: Amit Trivedi
Function/Title: Program Manager – Healthcare
Signature and Date: Amit V. Trivedi, 12/23/2015

12/23/2015 7:23 PM


2.2 Gap Certification

The following identifies the criteria certified via gap certification under §170.314:
(a)(1), (a)(6), (a)(7), (a)(17), (b)(5)*, (d)(1), (d)(5), (d)(6), (d)(8), (d)(9), (f)(1)

*Gap certification allowed for Inpatient setting only

No gap certification

2.3 Inherited Certification

The following identifies the criteria certified via inherited certification under §170.314:
(a)(1)–(a)(15), (a)(16) Inpt. only, (a)(17) Inpt. only, (b)(1)–(b)(5), (b)(6) Inpt. only, (b)(7), (c)(1)–(c)(3), (d)(1)–(d)(8), (d)(9) Optional, (e)(1), (e)(2) Amb. only, (e)(3) Amb. only, (f)(1)–(f)(3), (f)(4) Inpt. only, (f)(5) Optional & Amb. only, (f)(6) Optional & Amb. only, (g)(1)–(g)(4)

No inherited certification


Part 3: NVLAP-Accredited Testing Laboratory Information

Report Number: 2014-EHRI762563-2014-0628-00
Test Date(s): 6/23/2014

3.1 NVLAP-Accredited Testing Laboratory Information

ATL Name:

ICSA Labs, an independent division of Verizon

Accreditation Number:

200697-0

Address:

1000 Bent Creek Boulevard, Suite 200 Mechanicsburg, PA 17050

Website:

https://www.icsalabs.com/technology-program/onc-ehr

Email:

[email protected]

Phone:

717.790.8100

ATL Contact:

Michelle Knighton

For more information on the scope of accreditation, please reference http://ts.nist.gov/standards/scopes/2006970.htm

Product inherited certification

3.2 Test Information

3.2.1 Additional Software Relied Upon for Certification


Additional Software | Applicable Criteria | Functionality Provided by Additional Software
PopHealth | ONC 314c1, ONC 314c2, ONC 314c3 | Import QRDA I and generate QRDA III
Chrony | ONC 314d2, ONC 314e1 | NTP on Linux servers
Clinical Key | ONC 314a14 | HL7 Infobutton and other method
cPOE | ONC 314a1 | Medication ordering
First DataBank | ONC 314a1 | RxNorm mappings
Foresight | ONC 314a8, ONC 314b2, ONC 314e1 | Rules engine; creates triggers to run rules that check for completeness of data
HELP1 | ONC 314a1, ONC 314f1 | Lab and imaging orders; immunization information
HELP1 Web Services | ONC 314b2 | Provide data for CCDA
HELP2 Web Services | ONC 314b2 | Provide data for CCDA
HIE Services | ONC 314b1, ONC 314b2, ONC 314b7, ONC 314e1 | Receive and incorporate CCDA, C32, and CCR documents; create and transmit CCDA documents; run rules and set complete flag for data content; create export summaries
Immunization Interface | ONC 314f1 | HL7 2.5.1 immunization messages
Micromedex | ONC 314a14 | HL7 Infobutton and other method
My Health Patient Portal | ONC 314e1 | Patient/Proxy UI for View, Download, Transmit
ntpd | ONC 314d2 | NTP on AIX server
Up-to-Date | ONC 314a14 | HL7 Infobutton and other method

No additional software required
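Three of the knowledge resources listed above (Clinical Key, Micromedex, Up-to-Date) are integrated for ONC 314a14 via the HL7 Context-Aware Knowledge Retrieval ("Infobutton") standard, whose URL-based form encodes clinical context as HTTP query parameters. As a rough sketch only, the endpoint URL and code values below are hypothetical examples, not details taken from this report:

```python
from urllib.parse import urlencode

def build_infobutton_url(base_url, code, code_system, display_name, task="MLREV"):
    """Assemble a URL-based Infobutton knowledge request (sketch).

    Parameter names follow the HL7 Context-Aware Knowledge Retrieval
    (Infobutton) URL-based implementation guide; the base URL and the
    default task code are illustrative assumptions.
    """
    params = {
        "taskContext.c.c": task,                 # clinical task context
        "mainSearchCriteria.v.c": code,          # concept code
        "mainSearchCriteria.v.cs": code_system,  # code system OID
        "mainSearchCriteria.v.dn": display_name, # human-readable name
        "knowledgeResponseType": "text/html",    # desired response format
    }
    return base_url + "?" + urlencode(params)

# Example: look up warfarin by RxNorm code (endpoint is hypothetical)
url = build_infobutton_url(
    "https://knowledge.example.org/infobutton",
    code="11289",
    code_system="2.16.840.1.113883.6.88",  # RxNorm OID
    display_name="warfarin",
)
print(url)
```

The receiving knowledge resource parses these parameters and returns context-appropriate content, which is how a single EHR hook can serve several vendors' resources.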

3.2.2 Test Tools

Test Tool | Version
Cypress | 2.4.1
ePrescribing Validation Tool |
HL7 CDA Cancer Registry Reporting Validation Tool |
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool |
HL7 v2 Immunization Information System (IIS) Reporting Validation Tool |
HL7 v2 Laboratory Results Interface (LRI) Validation Tool |
HL7 v2 Syndromic Surveillance Reporting Validation Tool |
Transport Testing Tool |
Direct Certificate Discovery Tool |

No test tools required

3.2.3 Test Data

Alteration (customization) to the test data was necessary and is described in Appendix A
No alteration (customization) to the test data was necessary


3.2.4 Standards

3.2.4.1 Multiple Standards Permitted

The following identifies the standard(s) that have been successfully tested where more than one standard is permitted:

Criterion # | Standard Successfully Tested
(a)(8)(ii)(A)(2) | §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
(a)(13) | §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release; §170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree
(a)(15)(i) | §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain; §170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide
(a)(16)(ii) | §170.210(g) Network Time Protocol Version 3 (RFC 1305); Network Time Protocol Version 4 (RFC 5905)
(b)(2)(i)(A) | §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions; §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
(b)(7)(i) | §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions; §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
(e)(1)(i), (e)(1)(ii)(A)(2), (e)(3)(ii) | §170.210(g) Network Time Protocol Version 3 (RFC 1305); Network Time Protocol Version 4 (RFC 5905); Annex A of the FIPS Publication 140-2 (AES-256, SHA-384)
Common MU Data Set (15) | §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release; §170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)

None of the criteria and corresponding standards listed above are applicable
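The §170.210(g) entries above cite NTP versions 3 and 4 (RFC 1305 and RFC 5905), which express time as seconds since January 1, 1900, while Unix systems count from January 1, 1970; synchronized timestamping ultimately rests on this fixed-offset conversion. A minimal illustrative sketch, not part of the certified product:

```python
# NTP (RFC 1305/5905) timestamps count seconds from 1900-01-01;
# Unix time counts from 1970-01-01. The offset between the two
# epochs is a fixed 2,208,988,800 seconds.
NTP_UNIX_EPOCH_DELTA = 2_208_988_800

def ntp_to_unix(ntp_seconds: int, ntp_fraction: int = 0) -> float:
    """Convert a 64-bit NTP timestamp (32-bit whole seconds plus a
    32-bit binary fraction of a second) to Unix epoch seconds."""
    return (ntp_seconds - NTP_UNIX_EPOCH_DELTA) + ntp_fraction / 2**32

def unix_to_ntp_seconds(unix_seconds: float) -> int:
    """Convert Unix epoch seconds to whole NTP seconds."""
    return int(unix_seconds) + NTP_UNIX_EPOCH_DELTA

# The NTP second counter at the Unix epoch itself:
print(unix_to_ntp_seconds(0))  # 2208988800
```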

3.2.4.2 Newer Versions of Standards

The following identifies the newer version of a minimum standard(s) that has been successfully tested:

Newer Version | Applicable Criteria

No newer version of a minimum standard was tested


3.2.5 Optional Functionality

Criterion # | Optional Functionality Successfully Tested
(a)(4)(iii) | Plot and display growth charts
(b)(1)(i)(B) | Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(1)(i)(C) | Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(b)(2)(ii)(B) | Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(2)(ii)(C) | Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(f)(3) | Ambulatory setting only – Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)
Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)

No optional functionality tested


3.2.6 2014 Edition Certification Criteria* Successfully Tested

Criteria # (Version TP**, TD***):
(a)(1)–(a)(15), (a)(16) Inpt. only, (a)(17) Inpt. only, (b)(1)–(b)(5), (b)(6) Inpt. only, (b)(7), (c)(1)–(c)(3), (d)(1)–(d)(8), (d)(9) Optional, (e)(1), (e)(2) Amb. only, (e)(3) Amb. only, (f)(1)–(f)(3), (f)(4) Inpt. only, (f)(5) Optional & Amb. only, (f)(6) Optional & Amb. only, (g)(1)–(g)(4)

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)


3.2.7 2014 Clinical Quality Measures*

Type of Clinical Quality Measures Successfully Tested: Ambulatory / Inpatient / No CQMs tested

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)

Ambulatory CQMs (CMS ID): 2, 22, 50, 52, 56, 61, 62, 64, 65, 66, 68, 69, 74, 75, 77, 82, 90, 117, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 153, 154, 155, 156, 157, 158, 159, 160, 161, 163, 164, 165, 166, 167, 169, 177, 179, 182

Inpatient CQMs (CMS ID, Version): 9 (V2), 26, 30 (V3), 31, 32, 53, 55, 60, 71, 72 (V3), 73 (V2), 91, 100 (V2), 102 (V2), 104 (V2), 105 (V2), 107, 108, 109, 110 (V2), 111 (V2), 113 (V2), 114, 171 (V2), 172 (V2), 178 (V3), 185 (V2), 188, 190


3.2.8 Automated Numerator Recording and Measure Calculation

3.2.8.1 Automated Numerator Recording

Automated Numerator Recording Successfully Tested:
(a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3)

Automated Numerator Recording was not tested

3.2.8.2 Automated Measure Calculation

Automated Measure Calculation Successfully Tested:
(a)(1), (a)(3), (a)(4), (a)(5), (a)(6), (a)(7), (a)(9), (a)(11), (a)(12), (a)(13), (a)(14), (a)(15), (a)(16), (a)(17), (b)(2), (b)(3), (b)(4), (b)(5), (b)(6), (e)(1), (e)(2), (e)(3)

Automated Measure Calculation was not tested

3.2.9 Attestation

Attestation Forms (as applicable) | Appendix
Safety-Enhanced Design* | B
Quality Management System** | C
Privacy and Security |

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (b)(3), (b)(4)
**Required for every EHR product


3.3 Appendices

Appendix A: Test Data Alterations

The following deviations from the ONC-approved Test Data were utilized during certification testing: None.

Appendix B: Safety-Enhanced Design Attestation

1 170.314(g)(3) Safety-enhanced design

1.1 Identify which of the following criteria are scheduled to be tested or inherited for certification.

1.1.1 170.314(a)(1) Computerized provider order entry
1.1.2 170.314(a)(2) Drug-drug, drug-allergy interaction checks
1.1.3 170.314(a)(6) Medication list
1.1.4 170.314(a)(7) Medication allergy list
1.1.5 170.314(a)(8) Clinical decision support
1.1.6 170.314(a)(16) Electronic medication administration record (inpatient setting only)
1.1.7 170.314(b)(3) Electronic prescribing
1.1.8 170.314(b)(4) Clinical information reconciliation

1.2 Document the applied user-centered design (UCD) processes for each applicable EHR technology capability submitted for testing. Provide the name, description, and citation for all UCD processes used.

• If a single UCD process was used for applicable capabilities, it would only need to be identified once.
• If different UCD processes were applied to specific capabilities, be sure to indicate the criterion or criteria to which each UCD process applies.
• If a modified UCD process was used for any of the applicable capabilities, an outline and short description of the UCD process must be provided. The description must also include identifying any industry-standard UCD process upon which the modified UCD process was based.

A single UCD process was used for all of the above criteria post-development: NISTIR 7741, NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records. See appended document for more information.


1.3 Submit a Usability Test Report for each criterion you selected in Question 1.1.

• Attach the Usability Test Report in a separate document.
• Identify the name of the report(s) and any other supporting documentation materials in the field below. If more than one report is submitted, specify which report applies to which criteria.
• Reports may be supplied in any format, though they must include the necessary information for all of the certification criteria submitted for testing and conform to the content and completion requirements of the Customized Common Industry Format Template for Electronic Health Record Usability Testing per NISTIR 7742. Failure to include all required elements will constitute automatic failure of the SED Attestation.
• The official NISTIR 7742 report template can be located at http://www.nist.gov/itl/hit/upload/LowryNISTIR7742Customized_CIF_Template_for_EHR_Usability_Testing_Publicationl_Versiondoc.pdf

See appended documents

Appendix C: Quality Management System Attestation

The following Quality Management System attestation was submitted during certification testing:

1 170.314(g)(4) Quality management system

1.1 If an industry standard QMS was used during the development, testing, implementation, or maintenance of the EHR technology for any of the certification criteria, specify it/them by name (e.g. ISO 9001, IEC 62304, ISO 13485, 21 CFR Part 820, etc.). If an industry standard QMS was not used, please skip to Question 1.2.
N/A

1.2 If a modified or "home-grown" QMS was used during the development, testing, implementation, or maintenance of the EHR technology for any of the certification criteria, include an outline and short description of the QMS, which could include identifying any industry-standard QMS upon which it was based and modifications to that standard. If a modified or "home-grown" QMS was not used, please skip to Question 1.3.
N/A

1.3 If no QMS was used during the development, testing, implementation, or maintenance of the EHR technology for any of the certification criteria, please state that.
No Quality Management System was used.


Test Results Summary Document History

Version | Description of Change | Date
1.0 | Certification inherited from 140167R09 | January 23, 2015

END OF DOCUMENT


User Centered Design Testing Standard Operating Procedure (SOP)

1.0 Purpose: This SOP defines the User Centered Design (UCD) testing process and deliverables.

2.0 Scope: This SOP should be applied when UCD testing is requested to be conducted on a clinical information system.

3.0 Responsibility: The individuals (e.g. Usability Analysts, Application Systems Technical Analysts, Business Analysts, developers, etc.) who have been designated to conduct, analyze, and produce reports summarizing the findings and resulting recommendations from a UCD test.

4.0 Definitions:

4.1 User-Centered Design (UCD): UCD is a user interface design process that focuses on usability goals, user characteristics, environment, tasks, and workflow in the design of a product (or feature).

4.2 Summative Testing: A usability study that is typically conducted during the latter stages of development or post-development and serves as a quality measure to validate that what was built is usable. This type of study may utilize high-fidelity prototypes or versions of the finished product (or feature) to measure efficiency, effectiveness, and satisfaction with the product (or feature). The feedback provided by test participants may help identify specific problems associated with usability/learnability and may identify opportunities for improvement to the product (or feature). Summative testing should be conducted for new products but may be modified for maintenance releases of existing products.

4.3 Formative Testing: A usability study that is typically conducted during the early stages of requirements gathering and design. This type of study may utilize low- or high-fidelity prototypes to help identify specific problems associated with user interactions. The feedback provided by study participants may allow for changes to the design before delivery to development. Issues and problems discovered during formative testing should be documented and included as part of the design output documentation but are not required to follow the UCD testing steps outlined in the procedure below.

5.0 Procedure:

5.1 Planning: During the planning phase of the project, the development team will, using their professional judgment, plan and request appropriate UCD testing sessions at pertinent points in the product development lifecycle. The number and type of UCD studies utilized for each individual project may vary depending on the nature and complexity of the project.

5.2 Participants:

5.2.1 Usability Study Moderator: The individual(s) assigned to administer and conduct the test with the test participant(s). This individual should not have direct responsibility for the design or development of the product (or feature).

5.2.2 Usability Study Observer: The individual(s) assigned to witness and make annotations about the observed test participant's behaviors and interactions with the product (or feature) being tested. The individuals who should participate are determined by the development team leaders and may include Usability Analysts, Business Analysts, Application Systems Technical Analysts, developers, etc.

5.2.3 Participant: The individual(s) selected, as representative users (e.g. physicians, nurses, therapists, technicians, etc.) of the product, who will perform typical, defined tasks with the product.

5.3 Documentation:

5.3.1 Usability Study Plan (USP): The document used to collect information required to prepare the UCD study. It should define the application to be tested; setup requirements (hardware, software, and environment); participant profiles and tasks to be performed; and any pre-study assumptions related to critical issues or limitations that affect the usability test. The USP should be reviewed and stored in the designated Intermountain repository.

5.3.2 Usability Study Report (USR): The document used to record the summary of findings and resulting recommendations from UCD studies. The USR should be stored in the designated Intermountain repository. It should include:

5.3.2.1 Name and version of the product
5.3.2.2 Date and location of the test
5.3.2.3 Test environment
5.3.2.4 Description of the intended users
5.3.2.5 Total number of participants
5.3.2.6 Description of participants
5.3.2.7 Description of the user tasks that were tested
5.3.2.8 List of the specific metrics captured during the test
5.3.2.9 Data scoring
5.3.2.10 Results of the test and data analysis
5.3.2.11 Major test findings
5.3.2.12 Identified problems and opportunities for improvement

5.4 Evaluation: Following each UCD study, a Usability Study Report will be produced. The USR will identify problems and potential problems; it is not intended to document the solutions to the problems and issues identified.

5.5 Identified problems and opportunities for improvement should be reviewed by the business analyst lead, technical lead, and test lead or their designees.

6.0 Exceptions: None

7.0 Primary References: Software Design Control CIS Guideline (add link); Design Review SOP (add link)

8.0 Secondary Materials: NISTIR 7741: NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records

NISTIR 7742: Customized Common Industry Format Template for Electronic Health Record Usability Testing

9.0 Revision History:
Document Created: October 2013
Effective Date: October 2013
Next Scheduled Review Date: January 2015
Subsequent Review/Revision Dates:

Usability Study Report
Computerized Provider Order Entry – Hospital Inpatient Setting

Product(s) & Version: HELP1 Pharmacy Medication Orders [1] (additional software included in the certification of HELP2 Clinical Desktop 2014.M02.17)

Safety Enhanced Design Requirements Addressed [2]: §170.314(a)(1) Computerized Provider Order Entry, Hospital Setting

Dates of Usability Test: Nov. 26-27, 2013
Date of Report: Jan. 23, 2013

Report Prepared By:
Wendy Sudar, Intermountain Healthcare; Phone Number: 801.507.9165; Email address: [email protected]
Carl Bechtold, Intermountain Healthcare; Phone Number: 801.507.9168; Email address: [email protected]

Footnotes:
1. The HELP product is released weekly. It is not versioned.
2. The Office of the National Coordinator for Health Information Technology, Approved Test Procedures Version 1.2, December 14, 2012, §170.314(a)(1), Computerized provider order entry.

Table of Contents

EXECUTIVE SUMMARY .......... 3
PERFORMANCE RESULTS .......... 3
MAJOR FINDINGS .......... 4
INTRODUCTION .......... 5
METHOD .......... 5
  INTENDED USERS/PARTICIPANTS .......... 5
  STUDY DESIGN .......... 5
  TASKS .......... 6
  PROCEDURES .......... 6
  TEST ENVIRONMENT .......... 7
  TEST FORMS AND TOOLS .......... 7
  PARTICIPANT INSTRUCTIONS .......... 8
  USABILITY METRICS .......... 8
  DATA SCORING .......... 8
MAJOR FINDINGS .......... 10
IDENTIFIED PROBLEMS & OPPORTUNITIES FOR IMPROVEMENT .......... 11
Appendix 1: DEMOGRAPHIC SURVEY .......... 13
Appendix 2: INFORMED CONSENT & RELEASE FORM .......... 14
Appendix 3: SYSTEM USABILITY SCALE QUESTIONNAIRE .......... 15

Jan. 20, 2013

HELP/Tandem Pharmacy Medication Orders

Page 2 of 15

EXECUTIVE SUMMARY

A summative usability study was completed by Intermountain Healthcare's Clinical Information Systems Business Analyst Team for the HELP1/Tandem Pharmacy Orders Module (hereafter referred to as HELP1 Pharmacy Orders). Study sessions were conducted from Nov. 26-27, 2013, at the Intermountain Medical Center, South Office Building. The HELP1 product is not versioned.

The purpose of the study was to evaluate the ease of use of the current user interface and, additionally, to provide evidence of usability for HELP1 Pharmacy Orders. During the usability study, four (4) healthcare providers matching the target demographic criteria served as participants and used the HELP1 Pharmacy Orders module in simulated but representative tasks. The study collected performance data from six (6) tasks typically done within HELP1 Pharmacy Orders. A practice task was also included to allow participants to become familiar with the process of interacting with on-screen task instructions and taking the online post-task surveys. Morae Version 3.3.2 [3] was used to record, observe, and analyze study data.

During the 30-minute one-on-one usability tests, participants were greeted by the Usability Study Moderator and asked to review and sign an informed consent and release form [4], and were told that they could withdraw from the study at any time. All study participants had attended a three-hour orientation and training session on the HELP1 Electronic Medical Record (EMR) system one week prior to the test sessions. Otherwise, the participants were unfamiliar with the HELP1 Pharmacy Orders module. The Usability Study Moderator introduced the study and instructed participants to complete a series of tasks. After the final task of the study, participants were asked to complete a System Usability Scale (SUS) questionnaire.

Participant identification has been removed. Study data can be linked, in some cases, to participants' demographics but not to participants' names.
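The System Usability Scale (SUS) questionnaire mentioned above is scored on a fixed 0-100 scale: each of the ten items is answered 1-5, odd-numbered (positively worded) items contribute (response - 1), even-numbered (negatively worded) items contribute (5 - response), and the summed contributions are multiplied by 2.5. A minimal sketch of that standard calculation; the responses shown are hypothetical, not this study's data:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten
    1-5 Likert responses, item 1 first.

    Odd items (positively worded) contribute response - 1;
    even items (negatively worded) contribute 5 - response;
    the total is scaled by 2.5 to span 0-100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1-5")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical response set for illustration:
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```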
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records (NISTIR 7741), were used to evaluate the effectiveness, efficiency, and satisfaction of HELP1 Pharmacy Medication Orders.

PERFORMANCE RESULTS

CCHIT requirements for this test included 1) creating orders, 2) changing an order, and 3) viewing active orders. HELP1 Pharmacy Orders includes the ability to order complex infusion medications in the hospital environment. The testers included two of those order types to test that added capability.

Task 1 – Review a selected patient's active and inactive medications. (50% successful)
Task 2 – Discontinue the patient's omeprazole order. (50% successful)
Task 3 – View the patient's Coumadin order history. (50% successful)
Task 4 – Your patient has been taking 2.5 mg of Coumadin daily. The physician now wants the patient to take 3 mg daily. Make that change. (75% successful)

3 Morae is produced by TechSmith Corporation. Morae Recorder, Observer, and Manager were used during the course of this study. See http://www.techsmith.com/morae.html for additional product information.
4 See Appendix 3 for more information about the Consent and Release form.

Jan. 20, 2013

HELP/Tandem Pharmacy Medication Orders

Page 3 of 15

Task 5 – Create a medication order for a continuous infusion of sodium chloride 0.9% at 100 ml/hr for a total of 2 liters. (25% successful)
Task 6 – Create a medication order for a Dobutamine continuous infusion starting at 5 mcg/kg/min and enter titration parameters. (0% successful)

Efficiency: None of the participants were able to complete any of the tasks within the adjusted times (1.25 x) of the expert user. See complete data below.

MAJOR FINDINGS

In addition to the performance data collected, the following qualitative observations were made. HELP1 has maintained its Command Line Interface (CLI) with myriad nested menus to the present version, resulting in fast but training-intensive “Pick From Thousands” (PFT) keystroke operation. 5 Observations of actual trained clinical users comport with the effectiveness and efficiency of our expert user, indicating the software can be highly useful when the user is highly trained and experienced. However, all of this test’s novice module users, though they had undergone some demonstration training, found the software not user-friendly.

• The HELP1 Pharmacy Medications Order module maintains a fairly consistent “look and feel” throughout the order workflow. However, inconsistent operation of controls and non-standard vocabulary are significant roadblocks to usability.
• The system does not provide user feedback or warnings at critical interaction points, resulting in lost time and errors.
• Typical of its CLI heritage, HELP1 provides no tool or info tips on mouse hovers, and a novice user is without access to the “secret handshakes” that make the system usable. For example, in our test, users could type a string of letters into a “route” entry field. If the system did not recognize the string as a legal coded input, the user could not proceed. Producing a menu of legal entries requires typing a “?” into the entry field.
• Controls, variously F-keys or numbers, are not consistent. Moreover, their behavior is inconsistent. For example, some menus and toolbars work with a mouse click and others do not.

Additional Information and Suggested Improvements
• A heuristic evaluation could be conducted to identify changes to the software that might reasonably be expected to improve user efficiency, effectiveness, and satisfaction, and to reduce user error. These changes could also be expected to reduce training time.
• Contextual and discount usability techniques should be used to identify vocabulary problems in order to decrease training time and improve user performance.
• User interaction should be made consistent, at least throughout the module if not throughout the HELP1 EMR system.
• User feedback must be implemented, particularly when user error results in loss of data.

5 Sudar, 2013, HELP/Tandem Allergies Module, p. 5


INTRODUCTION

The purpose of this study was to test and validate the usability of the current user interface and to provide evidence of usability. To this end, measures of effectiveness, efficiency, and user satisfaction, including task completion success, task completion times, and user satisfaction ratings, were captured during the usability testing. The application tested for this study is the HELP1 Pharmacy Medication Orders module, used extensively by licensed pharmacists in Intermountain Healthcare hospitals in accordance with Utah State law.

Health Evaluation through Logical Processing (HELP1) is an Intermountain Healthcare application developed over the past three decades. Originally a mainframe-terminal system, the software migrated to Tandem-branded server hardware and operating system, with users interfacing through client PCs. Though the hardware and its proprietary data software became HP NonStop, the Intermountain product is still frequently known as the “Tandem,” as well as HELP1. Later, Intermountain developed a web-based system called HELP2, primarily for ambulatory care, but it is also used in the hospitals. HELP1 (variously known as Tandem or HELP1/Tandem) is one of the core EMR systems for Intermountain Healthcare’s 23 hospitals.

METHOD

INTENDED USERS/PARTICIPANTS

HELP1 Pharmacy Orders is an Intermountain Healthcare product developed for a wide range of medication orders in a hospital environment. Capabilities include ordering of complex infusions. A total of four (4) clinicians took part in the study. Participants were recruited from a pool of Intermountain Healthcare employed clinicians. These individuals were compensated for their time based on Intermountain Healthcare compensation policies. Participants were scheduled for 30-minute sessions at their Intermountain facilities. Participants had a mix of clinical specialties and demographic characteristics as shown in the following table. To ensure anonymity, participant names are replaced with Participant IDs.

Participant ID | Product Experience | E-Health Record Experience | Clinical Experience | Age Range | Gender
1 | Novice - 2 | Expert - 4 | > 15 years | >50 | M
2 | Novice - 1 | Expert - 4 | 6 – 10 years | 18-34 | F
3 | Novice - 2 | Expert - 4 | 11 – 15 years | 35-49 | F
4 | Novice - 1 | Expert - 4 | 6 – 10 years | >50 | M

STUDY DESIGN

Overall, the objective of this study was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this study may serve as a baseline for future studies using an updated version of HELP1. In short, this study serves both as a means to record or benchmark current usability and as a way to identify areas where improvements might be made.


During the usability study, participants interacted with the same implementation of the HELP1 Pharmacy Orders product and were provided with identical instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time and without assistance
• Time to complete each task
• Participant comments
• Participant satisfaction ratings measured by the System Usability Scale

TASKS

Tasks were determined based on the following criteria: 1) prioritization based on fundamental and commonly performed activities; 2) tasks equivalent to, or more complex than, those specified in the ONC’s Test Procedure for §170.314(a)(1) Computerized Provider Order Entry (Hospital) certification criteria, as noted in the Executive Summary above. Tasks were also developed and implemented to evaluate risk, expressed as reduced effectiveness and potential negative outcomes for patients (NISTIR 7742, 3.3). Tasks were presented to participants in an order typical of their clinical workflow, but pass/fail evaluation included prioritization of risk, and risk was used as an absolute criterion for marking task completion as passed or failed. The study collected performance data from six (6) tasks. Morae Version 3.3.2 was used to record, observe, and analyze study data.

Task 1 | 1) Click Start Task to begin; 2) Select your patient; 3) View the patient's active medication list; 4) View the patient's inactive medication list; 5) Click End Task when you have completed the task.
Task 2 | Discontinue the Omeprazole prescription.
Task 3 | View the patient’s Coumadin order history.
Task 4 | Your patient has been taking 2.5 mg of Coumadin daily. The physician now wants the patient to take 3 mg daily. Make that change.
Task 5 | Create a medication order for a continuous infusion of sodium chloride 0.9% at 100 ml/hr for a total of 2 liters.
Task 6 | Create a medication order for a dobutamine continuous infusion starting at 5 mcg/kg/min and enter titration parameters.

PROCEDURES

Upon arrival, participants were greeted and their identity was verified. Each participant reviewed and signed an informed consent and release form. 6 The facilitator witnessed the participant’s signature.

6 See Appendix 3 for more information about the Consent and Release form.


A usability study moderator was assigned to set up the portable usability lab, administer instructions, and conduct post-study interviews. Morae software was used to administer surveys and tasks, record task times, log task successes and failures, register mouse clicks, and capture participant comments.

For each task, the participants were presented with on-screen instructions. Task timing began when the participant indicated they were ready to start the task by clicking the Start Task button displayed in an on-screen Task Instructions dialog box. Task timing stopped when the participant indicated they had completed the task by clicking the End Task button in the Task Instructions dialog.

Following the last task, a post-test questionnaire known as the System Usability Scale (SUS) was administered electronically. Once the post-test questionnaire was completed, participants were asked to discuss their rationale for answering the post-test questions and to discuss the things they liked and did not like about HELP1 Pharmacy Orders. Participants were also invited to suggest improvements they felt would be useful. Participants were then thanked for their time and invited to return to their normal activities.

The Morae recordings captured each participant’s image as a “picture-in-picture” along with live action video of the software screen, mouse clicks, types of mouse clicks (left/right), and mouse movement. Participant demographics, times on tasks, satisfaction surveys, comments, and post-test questionnaire responses were also recorded with Morae. All participant recordings were assembled into a single Morae project, allowing data analysis to be compiled and reviewed for all four participants as a set, as well as by individual participant.

TEST ENVIRONMENT

For the convenience of test participants, all test sessions were conducted in conference rooms at the clinicians’ work facility.
Studies were run on a portable computer system comprising Intermountain’s current standard clinical software and operating system. This included a laptop computer running 64-bit Windows 7 connected to an Intermountain standard-issue Dell keyboard and mouse and a 22-inch 1680 x 1050 pixel Dell monitor. The system also included a Logitech webcam to record audio and video of the participants. The Intermountain Local Area Network provided access to the HELP1 Pharmacy Orders architecture (services, storage, etc.). HELP1 Pharmacy Orders was run from the Test environment. From a technical and practical point of view, system performance (response time, etc.) was representative of an authentic implementation experience.

TEST FORMS AND TOOLS

During the usability study, various documents and instruments were used, including:
1. Informed Consent
2. On-line task prompts

These tasks were followed by a post-test questionnaire, the System Usability Scale (SUS). 7

The participant’s interaction with HELP1 Pharmacy Orders was captured and recorded digitally. A web camera recorded the participant’s facial expressions synced with the screen capture. Comments were recorded by the camera’s microphone.

7 See Appendix 2 for more information about the System Usability Scale (SUS).


PARTICIPANT INSTRUCTIONS

The usability study moderator allowed time for the participant to review the Informed Consent document. The usability study moderator also reviewed the following areas of the Consent & Release document aloud with each participant:

… I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance. I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say. I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation…

USABILITY METRICS

According to the NISTIR Guide to the Processes Approach for Improving the Usability of Electronic Health Records, electronic health record applications should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the study were to assess:
1. Effectiveness, by measuring participant success rates.
2. Efficiency, by measuring the average task time.
3. Satisfaction, by measuring ease of use ratings.
DATA SCORING

Morae software collected data about Time on Task (ToT), success and failure achievement, and the number and severity of errors. Satisfaction data was captured through configurable surveys. The post-test System Usability Scale (SUS) questionnaire is an industry-standard set of ten (10) questions that captures the overall satisfaction rating for each participant. Morae also calculated the average SUS score for the participant set.

DATA ANALYSIS AND REPORTING

The results of the usability study were calculated according to the methods noted in the previous Usability Metrics section, and the following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time and without assistance
• Number of tasks not successfully completed
• Time to complete each task
• Major findings and/or identified problems and opportunities based on issues that came up during the tasks


• Participant’s verbalizations
• Participant’s satisfaction ratings for the product’s ease of use
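Two of the measures above (task success and time on task) reduce to simple arithmetic. The sketch below is illustrative only: the function names and the m:ss parsing are our own assumptions, not the study's actual Morae tooling; the four-participant panel and the 1.25x expert baseline are taken from this report.

```python
# Illustrative sketch (not the study's Morae tooling) of two measures
# used in this report: task success rate over a four-participant panel,
# and the efficiency check against 1.25x the expert user's time.

def success_rate(successes: int, participants: int = 4) -> float:
    """Percent of participants who completed a task without assistance."""
    return 100.0 * successes / participants

def to_seconds(t: str) -> int:
    """Parse a 'm:ss' or ':ss' time string (as used in the ToT table)."""
    minutes, _, seconds = t.rpartition(":")
    return int(minutes or 0) * 60 + int(seconds)

def within_baseline(expert: str, participant: str, factor: float = 1.25) -> bool:
    """True when the participant met the adjusted expert baseline."""
    return to_seconds(participant) <= factor * to_seconds(expert)

print(success_rate(3))                  # 3 of 4 participants -> 75.0
print(within_baseline("1:29", "5:42"))  # 111.25 s baseline -> False
```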

PRODUCT EFFECTIVENESS RESULTS

Task 1 – Review a selected patient’s active and inactive medications. (50% successful)
Task 2 – Discontinue the patient’s Omeprazole order. (50% successful)
Task 3 – View the patient’s Coumadin order history. (50% successful)
Task 4 – Your patient has been taking 2.5 mg of Coumadin daily. The physician now wants the patient to take 3 mg daily. Make that change. (75% successful)
Task 5 – Create a medication order for a continuous infusion of sodium chloride 0.9% at 100 ml/hr for a total of 2 liters. (25% successful)
Task 6 – Create a medication order for a Dobutamine continuous infusion starting at 5 mcg/kg/min and enter titration parameters. (0% successful)

PRODUCT EFFICIENCY RESULTS

None of the participants were able to complete the tasks within the benchmark set by an expert user. (The expert user baseline is calculated as the expert’s actual time x 1.25.)

ToT | Expert (P0) | P0 (x1.25) | P1 | P2 | P3 | P4
Task 1 | 1:03 | 0.11 | 2:01 | 3:11 | 1:25 | :29
Task 2 | :14 | 0.74 | 3:38 | 4:30 | 6:13 | 1:58
Task 3 | :13 | 0.60 | 7:43 | :37 | 2:54 | :36
Task 4 | 1:08 | 0.26 | 1:30 | 8:26 | 11:55 | 3:30
Task 5 | :35 | 0:43 | 4:58 | 8:03 | 6:38 | 3:54
Task 6 | 1:29 | 1:51 | 5:42 | 5:08 | 8:59 | 3:36

(Grey areas indicate tasks failed)

PRODUCT SATISFACTION RESULTS 9

The System Usability Scale (SUS) is a survey containing ten (10) questions. The average score for all ten questions was calculated for each participant. The overall score for the entire product was then taken as the average score across all four (4) participants. A SUS score above 68 is considered above average and anything below 68 is below average. 10 The HELP1 Pharmacy Orders module received a score of 28.13, indicating a below-average level of satisfaction.

Participant | SUS Score
P1 | 30
P2 | 22.25
P3 | 25
P4 | 35
All Participants | 28.13
Standard Dev. | 4.87

9 A participant’s task efficiency is measured against an expert user’s time on the same task multiplied by 1.25.
10 Sauro, J. (2011). Measuring Usability with the System Usability Scale (SUS). http://www.measuringusability.com/sus.php
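Standard SUS scoring, which the report relies on but does not spell out, maps each 1-5 item rating to a 0-4 contribution and scales the sum to 0-100. The sketch below is a generic illustration: the item ratings shown are hypothetical, since per-item responses were not published; only the four participant scores in the table above come from this report.

```python
from statistics import mean, pstdev

def sus_score(ratings):
    """Standard SUS scoring: 10 items rated 1-5.
    Odd-numbered items contribute (rating - 1); even-numbered items
    contribute (5 - rating); the sum is multiplied by 2.5 -> 0-100."""
    assert len(ratings) == 10
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(ratings))
    return total * 2.5

# Hypothetical item ratings (per-item data was not published):
print(sus_score([2, 4, 2, 4, 2, 4, 2, 4, 2, 4]))  # -> 25.0

# Summary statistics over the participant scores in the table above:
scores = [30, 22.25, 25, 35]
print(round(mean(scores), 2))    # -> 28.06
print(round(pstdev(scores), 2))  # -> 4.87
```

Note that the population standard deviation of the four reported scores reproduces the table's 4.87, and the simple mean computes to 28.06, close to the reported group score of 28.13.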


MAJOR FINDINGS

HELP1 has maintained its Command Line Interface (CLI) with myriad nested menus to the present version, resulting in fast but training-intensive “Pick From Thousands” (PFT) keystroke operation. 11 Observations of actual trained clinical users comport with the effectiveness and efficiency of our expert user, indicating the software can be highly useful when the user is highly trained and experienced. However, all of this test’s novice module users, though they had undergone some demonstrations and training, found the software non-intuitive.

• The HELP1 Pharmacy Medications Order module maintains a consistent “look and feel” throughout the order workflow. However, inconsistent operation of controls and non-standard vocabulary are significant roadblocks to usability.
  o One participant working on task 4, for example, found a menu to “Edit Doses.” This user repeatedly attempted to use it to modify the dose size rather than the number of doses.
  o Participants struggled to find a way to select a patient (an uncommon abbreviation).
  o “Input the indices of the drug you want to delete” drew a frustrated “What index?”
  o An unlabeled and seemingly required field followed only by “MG,” together with an “Available Strengths” dialog, caused most users to repeatedly attempt and fail to “resolve dose exactly” as the dialog instructed.
• The system does not provide user feedback or warnings at critical interaction points, resulting in lost time and errors.
  o Repeated verbalizations from users indicated they did not know when their orders were “done.”
  o One user, viewing a drug interaction screen, used [ESC] (the common way to “go back”) rather than a “Y” or “N” to save or abort the order. The order was lost with no feedback or warning to the user.
• Typical of its CLI heritage, HELP1 provides no tool or info tips on mouse hovers, and a novice user is without access to the “secret handshakes” that make the system usable. For example, in our test, users could type a string of letters into a “route” entry field. If the system did not recognize the string as a legal coded input, the user could not proceed. Producing a menu of legal entries requires typing a “?” into the entry field. Our expert user used a “.” followed by a mnemonic to filter menus.
• Controls, variously F-keys or numbers, are not consistent in appearance or function. For example, some work with a mouse click and others do not. In multiple circumstances, users had difficulty understanding how the tool numbers related to item numbers. Further errors resulted from using numbers that activated unintended actions, particularly when two rows of commands were available. Vertical scroll bars appear within some menus, but none of the participants noticed them without hinting.

11 Sudar, 2013, HELP/Tandem Allergies Module, p. 5


IDENTIFIED PROBLEMS & OPPORTUNITIES FOR IMPROVEMENT

• Adherence to established usability norms has been shown to significantly improve efficiency while reducing training costs and errors. For example, “One study at NCR showed a 25% increase in throughput with an additional 25% decrease in errors resulting from redesign of screens to follow basic principles of good design.” 12 The same authors document significant reductions in training costs with improved usability.
• A heuristic evaluation could be conducted to identify changes to the software that could reasonably be expected to improve user efficiency, effectiveness, and satisfaction, and to reduce user error. These changes could also be expected to reduce training time.
• Contextual and discount usability techniques should be used to identify vocabulary problems in order to decrease training time and improve user performance.
• User interaction should be made consistent, at least throughout the module if not throughout the HELP1 EMR system.
• User feedback must be implemented, particularly when user error results in loss of data.
• When coded data is required in a field, users should be given an affordance to open coded-data menus without having to know the “secret handshakes.”

12 Gallaway, 1981, quoted in Bias and Mayhew, 2005, Cost Justifying Usability, Morgan Kaufmann, p. 29.


APPENDICES

The following appendices include supplemental data for this usability test report:

Appendix 1: Demographic Survey
Appendix 2: Informed Consent & Release Form
Appendix 3: System Usability Scale Questionnaire

Appendix 1: DEMOGRAPHIC SURVEY

Thank you for participating in today's study. To help us understand your area of expertise, please take a few minutes to provide answers to the following questions.

1. What is your age group?
( ) 18-34  ( ) 35-49  ( ) 50+

2. What is your gender?
( ) Female  ( ) Male

3. What is your current clinical specialty?
( ) MD  ( ) Medical Assistant  ( ) Nurse Practitioner  ( ) Physician’s Assistant  ( ) Pharmacist  ( ) RN  ( ) Other
Additional Comments:

4. How long have you worked in your area of clinical expertise?
( ) 0-5 years  ( ) 6-10 years  ( ) 11-15 years  ( ) Over 15 years
Additional Comments:

5. Where would you rate your electronic health record computer expertise?
Novice 1 2 3 4 5 Expert
Additional Comments:

6. How familiar are you with the HELP1 Allergies module?
Novice 1 2 3 4 5 Expert
Additional Comments:

7. Which best applies to your current work environment?
( ) Clinic-Outpatient  ( ) Inpatient  ( ) Both
Additional Comments:


Appendix 2: INFORMED CONSENT & RELEASE FORM

For valued contributions received, the recognition and sufficiency of which are hereby acknowledged, I ___________________________________________________ (name of test, interview or research participant) have voluntarily agreed to participate in User Centered sessions, assessments or interviews conducted and/or recorded by: _________________________________________ (name of person conducting tests, interviews or recordings)

I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance.

I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say.

I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation.

Although the sessions, assessments or interviews are not designed to create stress or discomfort, I understand that I may experience some stress or discomfort and that I may withdraw at any time without penalty.

I understand the software designs I view may be confidential and I will not discuss them with anyone outside of this study group. I also understand that these designs may be experimental in nature. I also understand that they may or may not be implemented in any actual products.

I have read the above release and consent prior to its execution; I fully understand the contents and consequences thereof. This agreement shall be binding upon me and my heirs, legal representatives and assigns.

 YES, I have read the above statement and agree to be a participant.

__________________________________________________________
(Signature of Participant)

Date: ________________________


Appendix 3: SYSTEM USABILITY SCALE QUESTIONNAIRE

Each statement is rated on a scale of 1 (Strongly Disagree) to 5 (Strongly Agree).

1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.


Usability Study Report
Computerized Provider Order Entry Imaging Orders – Inpatient Setting

Product(s) & Version: HELP1 Order Communications, Imaging Orders 1 (additional software included in the certification of HELP2 Clinical Desktop 2014.M02.17)

Safety Enhanced Design Requirements Addressed 2: §170.314(a)(1) Computerized Provider Order Entry

Dates of Usability Test: Jan. 15-21, 2014
Date of Report: Jan. 29, 2014

Report Prepared By: Carl Bechtold, Intermountain Healthcare
Phone Number: 801.507.9168
Email address: [email protected]

1 The HELP product is released weekly. It is not versioned.
2 The Office of the National Coordinator for Health Information Technology – Approved Test Procedures Version 1.2, December 14, 2012, §170.314(a)(1), Computerized provider order entry.

Table of Contents

Executive Summary
  Performance Results
  Major Findings Synopsis
  Additional Information and Suggested Improvements
Method
  Introduction
  Intended Users and Participants
  Study Design
  Tasks
  Procedures
  Test Environment
  Test Forms and Tools
  Participant Instructions
  Usability Metrics
  Data Scoring
  Data Analysis and Reporting
Findings
  Product Effectiveness Results
  Product Efficiency Results
  Product Satisfaction Results
  Major Findings
  Opportunities for Improvement
Appendices
  Appendix 1: Demographic Survey
  Appendix 2: Informed Consent & Release Form
  Appendix 3: System Usability Scale Questionnaire

Jan. 29, 2014

Computerized Provider Order Entry, EP-Imaging Orders – HELP Order Communication

Page 2 of 16

Executive Summary

A summative usability study was completed by Intermountain Healthcare’s Clinical Information Systems Business Analyst Team for the HELP1 Order Communication Module (hereafter referred to as Order Comm). Study sessions were conducted from Jan. 15-21, 2014, at the Intermountain Medical Center, South Office Building. This building is proximate to both the inpatient hospital and ambulatory settings where the product under study is used. The HELP1 product releases weekly and is not versioned.

The purpose of the study was to evaluate the ease of use of the current user interface and to provide evidence of usability for Order Comm. During the usability study, six healthcare providers matching the target demographic criteria served as participants and used the Order Comm module in simulated but representative tasks provided by the §170.314(a)(1) test criteria. The study collected performance data from four tasks typically done within Order Comm. Morae Version 3.3.2 3 was used to record, observe, and analyze study data.

During the 30-minute one-on-one usability tests, participants were greeted by the Usability Study Moderator, asked to review and sign an informed consent and release form, 4 and told that they could withdraw from the study at any time. All study participants were employed clinicians who use the product under consideration in their work environments. The Usability Study Moderator introduced the study and instructed participants to complete a series of tasks. After the final task of the study, participants were asked to complete a System Usability Scale (SUS) questionnaire. Participant identification has been removed. Study data can be linked, in some cases, to participants’ demographics but not to participants’ names.
Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records (NISTIR 7741), were used to evaluate the effectiveness, efficiency, and satisfaction of Order Comm.

Performance Results

CCHIT requirements for this test included 1) creating orders, 2) changing an order, and 3) viewing active orders.

Effectiveness: Search failures and users' inability to find feature controls contributed to an overall average task failure rate of 8 percent; other tasks were completed, though some only with considerable difficulty. These issues are discussed in greater detail below.

Task      Fail Rate (%)
1         0
2         0
3         33
4         0
Average   8.25
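The "Average" row above is a plain unweighted mean over the four per-task fail rates; a minimal sketch (values taken from the table):

```python
# Per-task fail rates in percent, from the table above.
fail_rates = {"Task 1": 0.0, "Task 2": 0.0, "Task 3": 33.0, "Task 4": 0.0}

# Unweighted mean across the four tasks.
average_fail_rate = sum(fail_rates.values()) / len(fail_rates)
print(round(average_fail_rate, 2))  # 8.25
```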

Efficiency: Participants were generally unable to complete tasks within the adjusted times (1.25×) of the expert user. Time lags were exacerbated by the search and vocabulary problems mentioned below. Minor confounding issues related to Morae and the menu structure of the test environment also contributed to slower times, as noted below. Generally, these clinicians completed tasks rapidly and without error except for orders requiring them to initiate hidden functionality (Cancel) or when they had difficulty searching for orderable items, as noted below.

Satisfaction: SUS scores averaged 48.75, somewhat short of the 68 industry standard for satisfaction. User comments cited a long learning curve for interaction and vocabulary, inconsistent interaction, and difficulty finding features.

3 Morae is produced by TechSmith Corporation. Morae Recorder, Observer, and Manager were used during the course of this study. See http://www.techsmith.com/morae.html for additional product information.
4 See Appendix 2 for more information about the Consent and Release form.

Major Findings synopsis

• Experienced users were able to place orders quickly, with few keystrokes and almost no mouse gestures – if they were successful at finding target items and features, as noted below. The legacy command line interface can be very fast, but users reported that extensive training time and learning were required to achieve efficiency.

• Inability to find orderable items created the greatest difficulty for these users. Essentially, this is a lack of "smart search" technology. Though they were experienced with the software, they were not always familiar with HELP1 terminology abbreviations. Moreover, the patterns for searching appeared to be inconsistent. These experienced users attested to using alternative methods of finding correct search terms in their own daily work. Their comments indicate the search problems were not peculiar to this study's content.

• Interactions required to complete the test tasks proved difficult to find for four of the six users. The most troublesome included a method of canceling an active order and a clear way to change patient focus. The inability to cancel led to more task failures than any other problem. (See the detailed discussion below.)

Additional Information and Suggested Improvements

• Implement improved search capabilities for orderable items (terminology).
• Heuristic evaluation could be conducted to identify changes to the software that might reasonably be expected to improve user efficiency, effectiveness, and satisfaction, and reduce user error. These changes could also be expected to reduce training time.
• User interaction should be made consistent, at least throughout the module if not throughout the entire HELP1 EMR system.

See additional information in the Findings section.


Method

Introduction
This study tested the usability of the current user interface, workflow, and interaction of the Order Communications (Order Comm) module, a component within Intermountain Healthcare's Health Evaluation through Logical Processing (HELP1) software. To this end, measures of effectiveness, efficiency, and user satisfaction, including task completion success rates, task completion times, and user satisfaction ratings, were captured during the usability testing.

Intermountain Healthcare's HELP1 software has developed over the past three decades. Originally a mainframe-terminal system, the software migrated to Tandem-branded server hardware and operating system, with users interfacing through client PCs. Though the hardware and its proprietary data software became HP NonStop, the Intermountain product is still frequently known as the "Tandem," as well as HELP1. Later, Intermountain developed a web-based system called HELP2, primarily for ambulatory care, but it is also used in the hospitals. HELP1 — variously known as Tandem or HELP1/Tandem — is one of the core EMR systems for Intermountain Healthcare's 23 hospitals and their associated ambulatory services.

Intended Users and Participants
The module tested for this study is used extensively by clinicians in Intermountain Healthcare hospitals and associated ambulatory settings, including emergency departments, in accordance with Utah State law. A total of six Intermountain-employed clinicians took part in the study. These individuals were paid for their time based on Intermountain Healthcare compensation policies. Participants were scheduled for 60-minute sessions, as this test was combined with other Order Comm study tasks. Participants had a mix of clinical specialties and demographic characteristics, as shown in the table. To ensure anonymity, participant names are replaced with Participant IDs.

Participant ID | Product Experience | E-Health Record Experience | Work Environment | Clinical Experience | Age Range | Gender
2              | Expert - 4         | Expert - 4                 | Hospital         | 11-15 Years         | 18-34     | F
3              | Expert - 5         | Expert - 4                 | Hospital         | 0-5 Years           | 18-34     | F
4              | Intermediate - 3   | Intermediate - 3           | Both             | 11-15 Years         | 35-49     | F
5              | Intermediate - 3   | Novice - 2                 | Outpatient       | 0-5 Years           | 18-34     | F
6              | Intermediate - 3   | Expert - 4                 | Hospital         | 0-5 Years           | 18-34     | M
7              | Novice - 2         | Intermediate - 3           | Outpatient       | 0-5 Years           | 35-49     | F

Sessions occurred in an office facility on the hospital campus where the participants all work. All tests utilizing Order Comm software to meet ONC Test §170.314(a)(1) Lab and Imaging criteria were conducted in the same sessions to optimize clinician time. The tool is used in both EH and EP environments, relative to the ONC criteria, with clinicians typically placing both Lab and Imaging orders in their work assignments.

Study Design
Overall, the objective of this study was to discover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this study may serve as a baseline for future studies using an updated version of HELP1. In short, this study serves both as a means to record or benchmark current usability and as a way to identify areas where improvements might be made.


During the usability study, participants interacted with the same implementation of the Order Comm product and were provided with identical instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time and without assistance
• Time to complete each task (Time on Task, or ToT)
• Participant comments
• Participant satisfaction ratings measured by the System Usability Scale

Tasks
Tasks were those specified in the ONC's Test §170.314(a)(1) Computerized Provider Order Entry Hospital certification criteria, presented as fundamental and commonly performed activities within a clinical workflow. Tasks included the ability to order an item, change an item, and view current active orders. Order Comm requires patient and provider selection before orders can be created.

Preparation and orientation: 1) Select patient, 2) Review Morae interface and interaction, and 3) Demographic Survey.

Task 1: CT abdomen & pelvis w/ contrast material.
Task 2: Change the previous order to: CT abdomen & pelvis w/o contrast material.
Task 3: Myocrd image PET perfus single study rest/stress.
Task 4: Review active orders.

Tasks were also developed and implemented to evaluate risk, expressed as reduced effectiveness and potential negative outcomes for patients (NISTIR 7742, 3.3). Areas of significant patient risk, such as patient transportation, were not addressed in this test scenario, as they will be considered later relative to decision support. Tasks were presented to participants in an order typical of their clinical workflow, but pass/fail evaluation included prioritization of risk, and risk, if applicable, was used as an absolute criterion for marking a task as passed or failed.

Procedures
Upon arrival, participants were greeted and their identity was verified. Each participant reviewed and signed an informed consent and release form 5. The facilitator witnessed the participant's signature. A usability study moderator was assigned to set up the portable usability lab, administer instructions, and conduct post-study interviews. Morae software was used to administer surveys and tasks, record task times, log task successes and failures, register mouse clicks, and capture participant comments.

For each task, the participants were presented with on-screen instructions. Task timing began when the participant indicated they were ready to start the task by clicking the Start Task button displayed in an on-screen Task Instructions dialog box. Task time was stopped when the participant indicated they had successfully completed the task by clicking the End Task button displayed in the Task Instructions dialog.

Following the last task, a post-test System Usability Scale (SUS) survey was administered electronically. Once the post-test questionnaire was completed, participants were asked to discuss their rationale for answering the post-test questions and were also requested to discuss the things they liked and did not like about Order Comm.

5 See Appendix 2 for more information about the Consent and Release form.


Participants were also invited to suggest improvements they felt would be useful. Participants were then thanked for their time and invited to return to their normal activities.

The Morae recordings captured each participant's image as a "picture-in-picture" along with live-action video of the software screen, mouse clicks, types of mouse clicks (left/right), and mouse movement. Participant demographics, times on tasks, satisfaction surveys, comments, and post-test questionnaire responses were also recorded with Morae. A compilation of all participant recordings was assembled into a Morae project, allowing data analysis to be compiled and reviewed for all six participants as a set, as well as by individual participant.

Test Environment
Studies were run on a portable computer system comprising Intermountain's current standard clinical software and operating system. This included a laptop computer running 64-bit Windows 7 connected to an Intermountain standard-issue Dell keyboard and mouse, and a 22-inch 1680 × 1050 pixel Dell monitor. The system also included a Logitech webcam to record audio and video of the participants. The Intermountain Local Area Network provided access to the HELP1 architecture (services, storage, etc.). Order Comm was run from the Intermountain Test environment. From a technical and practical point of view, system performance, response time, etc. were representative of an authentic implementation experience.

Potential confounding issues: HELP1, and particularly workflows in Order Comm, use the Esc key as a "go back" command. Unfortunately, Morae uses that key to stop recordings. Users had to click on the HELP1 interface when returning from Morae task functions, which added to click counts and may have contributed to user stress. Screen content for the test environment and test user differed from that in the participants' own work environments, perhaps contributing to minor confusion and slight time lags.

Test Forms and Tools
During the usability study, various documents and instruments were used, including:
1. Informed Consent
2. On-line Task prompts
3. Demographic Survey
4. SUS Questionnaire

Participant Instructions
The usability study moderator allowed time for the participant to review the Informed Consent document 6. The usability study moderator also reviewed the following areas of the Consent & Release document aloud with each participant.

… I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge.

I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance.

I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say.


I understand that I may withdraw from this session, assessment, or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for my participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation.

Usability Metrics
According to the NISTIR Guide to the Processes Approach for Improving the Usability of Electronic Health Records, electronic health record applications should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the study were to assess:
1. Effectiveness, by measuring participant success rates.
2. Efficiency, by measuring the average task time.
3. Satisfaction, by measuring ease of use ratings.

Data Scoring
Morae software collected data about Time on Task (ToT), Success & Failure Achievement, and the Number & Severity of Errors. Satisfaction data captured through surveys was also configurable. The post-test (SUS) questionnaire is an industry-standard set of ten (10) questions that captures the overall satisfaction rating for each participant. Morae also calculated the average SUS score for the participant set.

Data Analysis and Reporting
The results of the usability study were calculated according to the methods noted in the previous Usability Metrics section, and the following types of data were collected for each participant:
• Number of tasks successfully completed within the allotted time and without assistance.
• Number of tasks not successfully completed.
• Time to complete each task.
• Major findings and/or identified problems and opportunities based on issues that came up during the tasks.
• Participant's verbalizations.
• Participant's satisfaction ratings for the product's ease of use.


Findings Product Effectiveness Results As noted above, effectiveness as determined by task completion reached 79 percent overall, with a 21 percent failure rate. The graph below illustrates the tasks where users experienced the greatest difficulties and failures. Task 1 asked participants to order a CT with contrast. Task 2 asked that order to be changed to a CT without contrast. As in other studies in this group, users were unable to find a way of canceling an order so that it might be replaced. They also had difficulty searching for some items.

Product Efficiency Results
The table below compares professional clinicians' Times on Task (ToT) with that of an analyst who was an expert with the task genre and module. (The "expert user" baseline is calculated as actual time × 1.25.)

Participant  | PREP | Task 1 | Task 2 | Task 3 | Task 4
Expert (P1)  | .55  | .79    | .62    | .70    | .27
P1 (× 1.25)  | .69  | .99    | .78    | .88    | .34
P2           | .75  | .29    | .46    | .73    | .26
P3           | 1.24 | .69    | 1.03   | 1.20   | .20
P4           | .81  | 2.88   | 3.61   | 1.36   | .53
P5           | 1.40 | .61    | .04    | .75    | .20
P6           | .78  | .75    | 1.00   | .67    | .17
P7           | 2.62 | .75    | .08    | .95    | .13

Times expressed as minutes in decimal. Grey areas indicate tasks failed.
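The 1.25× baseline comparison described above can be sketched as follows (an illustration only, assuming a task counts as "within the adjusted time" when the participant's ToT does not exceed 1.25 times the expert's ToT; expert times in decimal minutes are taken from the table):

```python
# Expert (P1) Times on Task in decimal minutes, from the table above.
EXPERT_TOT = {"Task 1": 0.79, "Task 2": 0.62, "Task 3": 0.70, "Task 4": 0.27}
BASELINE_FACTOR = 1.25  # the report's adjustment to the expert's time

def within_adjusted_time(task: str, participant_tot: float) -> bool:
    """True when the participant's ToT is within 1.25x the expert's ToT."""
    return participant_tot <= EXPERT_TOT[task] * BASELINE_FACTOR

# P4 took 2.88 minutes on Task 1, well over the 0.99-minute baseline.
print(within_adjusted_time("Task 1", 2.88))  # False
# P2 took 0.26 minutes on Task 4, under the ~0.34-minute baseline.
print(within_adjusted_time("Task 4", 0.26))  # True
```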


Product Satisfaction Results 7
The System Usability Scale (SUS) is a survey containing 10 questions. The average score for all 10 questions was calculated for each participant. The overall score for the entire product was then taken as the average score across all six participants. A SUS score above 68 is considered above average and anything below 68 is below average. 8 The Order Comm module received a score of 48.75, indicating a below-average level of satisfaction.

Participant    | SUS
P2             | 45
P3             | 80
P4             | 55
P5             | 27.5
P6             | 22.5
P7             | 62.5
Standard Dev.  | 19.8
Average        | 48.75
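The report does not show the per-participant SUS arithmetic. Under the standard SUS scoring method (an assumption that the tooling used here follows Brooke's original procedure), odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0-100 score. A sketch with a hypothetical response set, not an actual participant's answers:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 responses.

    Odd-numbered items (positively worded) contribute (r - 1);
    even-numbered items (negatively worded) contribute (5 - r);
    the total is scaled by 2.5 to a 0-100 range.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses, each 1-5")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical response set for illustration only:
print(sus_score([3, 4, 2, 4, 3, 4, 2, 4, 3, 4]))  # 32.5
```

Under this scheme the most negative possible responses yield 0 and the most positive yield 100, which is why the 48.75 average above sits well below the 68 benchmark cited in the report.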

Major Findings
HELP1 has maintained its Command Line Interface (CLI) with myriad nested menus to the present version, resulting in fast but training-intensive "Pick from Thousands" (PFT) keystroke operation. 9 Observations of actual trained clinical users testify to the effectiveness and efficiency of expert users, indicating the software can be highly useful when the user is highly trained and experienced. However, as this study of experienced users demonstrates, efficiency requires consistent formatting and content memorization rather than the ability to intuit the interaction. Any changes in page formatting or content vocabulary significantly slow throughput and increase errors.

For example, these users initially hesitated at the opening screen. Some commented that the screen was "different from mine" at various points in the workflow. The software displays options based on user rights and roles. The test environment allowed all of those, so the screen was more populated than what the test participants would see in their own work environment. In one particular case, users expected to enter "RAD" in a search field to act as a filter showing only Imaging items. The field for that filter was neither available nor defaulted in the test environment, resulting in significant difficulty for some users. The filter and search relationship was not intuitive for them. Similar issues could be expected as clinicians change assignments and/or departments.

Also, the orderable item search tool was repeatedly unhelpful in locating items to be ordered in the tasks. Within the combined group of studies, participants consistently commented on the need for a "better way" of looking up orderable items, and endorsed such terms as "smart search" or "Google-like search." Those search features are available in other Intermountain software products. Because of the weakness of the HELP1 Order Comm search feature, users commented:

• "This is where I look it up in my book." (A printed index of item mnemonics, abbreviations, and names she keeps constantly at hand at her work station. These and various "cheat sheets" decorate desktops and cupboard doors in clinician workspaces.)

7 A participant's task efficiency is measured against an expert user's time on the same task multiplied by 1.25.
8 Sauro, J. (2011). Measuring Usability with the System Usability Scale (SUS). http://www.measuringusability.com/sus.php
9 Sudar, 2013, Intermountain Healthcare Usability Study: HELP/Tandem Allergies Module, p. 5.


• "I would look it up in (other software)." (Separate Intermountain software products can be used for terminology search.)

• "This is when we call the (performing location)." (Phone call to the performing location, a lab or imaging center, or conversation with a tech who might be available at the moment. This is common practice for both Lab and Imaging orders.)

• "We just write it in and let the (performing location) pick the right one." (Utilizing the comments fields to clarify a "close" match. This is different from deferring to performing location expertise. This behavior results from the user's inability to find the right orderable item.)

These options range from inconvenient to potential patient safety risks. That a printed index of orderable items is more effective and efficient than a computerized search tool speaks to a need for technical upgrades in HELP1 and the Order Comm module. Further, the goal of clinical software as a support to the professional clinician is frustrated if feature deficits cause the clinician to depend upon the actions of other learned intermediaries in order to create and process a proper order. Granted, expert clinicians in the performing location might be more adept at selecting tests to confirm particular problems (e.g., indications or differential diagnoses) than some ordering providers, but if the ordering provider knows what he or she wants and simply can't find it, the software is probably underperforming. For these reasons, and due to the frequency and negative performance impacts of the problem, the search feature appears to be the most serious usability problem discovered in these tests.

Second in severity and frequency of negative impact is the problem of hidden features, or "secret handshakes," that one must know to operate the software. Those are discussed generally below, but in these tests, where required criteria called upon clinicians to "change" an order, the inability to find a way to "cancel" proved troublesome.

Participants engaged in three software workflows to accomplish "change" tasks. One involved a batch order workflow initiated but then abandoned by one user. In that workflow, the user could discover an otherwise hidden command to "Delete" one of a list of unsaved orders rather than saving all of the orders. In Order Comm, once an order is made active by invoking a "Save" command, the order must be "Cancelled" and another order created to invoke a "change." "Cancelled," however, is an option found only in an order "Status" menu that proved difficult for users to find.
In some cases, selecting an active order from an Order Review screen and pressing Enter would display a screen with the Status field and menu; in other cases it would not. The other workflow was to select Modify an order from the main order screen and enter the order number—which the user would have had to remember from one or two screens before, depending on the route taken. This single problem caused more task failures than any other in the Order Comm CPOE tests. Participants endorsed the most common workaround: "Call the lab." Imaging and Lab downstream software, which receive Order Comm orders through APIs, are used to cancel items instead of Order Comm.

Other problems
• The Order Comm module maintains a consistent "look and feel" throughout the order workflow. However, inconsistent operation of controls and non-standard vocabulary are significant roadblocks to usability.
  o Participants struggled to find a way to change patients. The main menu features a Function Key toolbar/menu at the bottom of the screen, including "F12-Pdt". None of the participants used this control to change patients without hinting. One participant explained she changes patients by closing the HELP1 software and re-launching it.
  o Controls, variously F-keys or numbers, are not consistent in appearance or function. For example, some work with a mouse click and others do not. In multiple circumstances, users had difficulty understanding how the tool numbers related to item numbers. Further errors resulted from using numbers that activated unintended actions. This was particularly true when two rows of commands were available.
• The system does not provide user feedback or warnings at critical interaction points, resulting in lost time and errors.
• Typical of its CLI heritage, HELP1 provides no tool or info tips on mouse hover, and a novice user is without access to the "secret handshakes" that make the system usable. Producing a menu of legal entries requires typing a "?" into the entry field in some cases. In other cases a "." followed by a string is required. Efficient and effective use of these interactions can only result from training and practice, which the participants said is significant.

Additional Information and Suggested Improvements
• The HELP1 search control should be updated to facilitate "smart search" capability.
• Though "Cancelled" might be understood as a status (adjective), it is also an important action (verb). Consider surfacing this action as a control visible from appropriate screens.
• User interaction should be made consistent, at least throughout the module if not throughout the HELP1 EMR system.
• When coded data is required in a field, users should be given an affordance to open coded data menus without having to know the "secret handshakes."

Opportunities for Improvement
• Adherence to established usability norms has been shown to significantly improve efficiency while reducing training costs and errors. For example, "One study at NCR showed a 25% increase in throughput with an additional 25% decrease in errors resulting from redesign of screens to follow basic principles of good design." 10 The same authors document significant reductions in both training and support costs with improved usability.
• Heuristic evaluation could be conducted to identify changes to the software that could reasonably be expected to improve user efficiency, effectiveness, and satisfaction, and reduce user error. These changes could also be expected to reduce training time.
• Contextual and discount usability techniques should be used to identify vocabulary problems, decrease training time, and improve user performance.

10 Gallaway, 1981, quoted in Bias and Mayhew, 2005, Cost Justifying Usability, Morgan Kaufmann, p. 29.


Appendices
The following appendices include supplemental data for this usability test report.

Appendix 1: Demographic Survey
Appendix 2: Informed Consent and Release Form
Appendix 3: System Usability Scale (SUS) Questionnaire


Appendix 1: Demographic Survey

Thank you for participating in today's study. To help us understand your area of expertise, please take a few minutes to provide answers to the following questions.

1. What is your age group?
   ( ) 18-34  ( ) 35-49  ( ) 50+
2. What is your gender?
   ( ) Female  ( ) Male
3. What is your current clinical specialty?
   ( ) MD  ( ) Medical Assistant  ( ) Nurse Practitioner  ( ) Physician's Assistant  ( ) Pharmacist  ( ) RN  ( ) Other
   Additional Comments:
4. How long have you worked in your area of clinical expertise?
   ( ) 0-5 years  ( ) 6-10 years  ( ) 11-15 years  ( ) Over 15 years
   Additional Comments:
5. Where would you rate your electronic health record computer expertise?
   Novice 1 2 3 4 5 Expert
   Additional Comments:
6. How familiar are you with the HELP1 Allergies module?
   Novice 1 2 3 4 5 Expert
   Additional Comments:
7. Which best applies to your current work environment?
   ( ) Clinic-Outpatient  ( ) Inpatient  ( ) Both
   Additional Comments:


Appendix 2: Informed Consent & Release Form

For valued contributions received, the recognition and sufficiency of which are hereby acknowledged, I ___________________________________________________ (name of test, interview or research participant) have voluntarily agreed to participate in User Centered sessions, assessments or interviews conducted and/or recorded by: _________________________________________ (name of person conducting tests, interviews or recordings)

I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance.

I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say.

I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation.

Although the sessions, assessments or interviews are not designed to create stress or discomfort, I understand that I may experience some stress or discomfort and that I may withdraw at any time without penalty.

I understand the software designs I view may be confidential and I will not discuss them with anyone outside of this study group. I also understand that these designs may be experimental in nature. I also understand that they may or may not be implemented in any actual products.

I have read the above release and consent, prior to its execution; I fully understand the contents and consequences thereof. This agreement shall be binding upon me and my heirs, legal representatives and assigns.

[ ] YES, I have read the above statement and agree to be a participant.

__________________________________________________________
(Signature of Participant)

Date: ________________________


Appendix 3: System Usability Scale Questionnaire

Each statement is rated on a five-point scale from 1 (Strongly Disagree) to 5 (Strongly Agree).

1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.


Usability Study Report
Computerized Provider Order Entry Lab Orders – Inpatient Setting

Product(s) & Version: HELP1 Order Communications, Lab Orders 1 (additional software included in the certification of HELP2 Clinical Desktop 2014.M02.17)
Safety Enhanced Design Requirements Addressed 2: §170.314(a)(1) Computerized Provider Order Entry
Dates of Usability Test: Jan. 15-21, 2014
Date of Report: Jan. 29, 2014
Report Prepared By: Carl Bechtold, Intermountain Healthcare
Phone Number: 801.507.9168
Email address: [email protected]

1 The HELP product is released weekly. It is not versioned.
2 The Office of the National Coordinator for Health Information Technology – Approved Test Procedures Version 1.2, December 14, 2012, §170.314(a)(1), Computerized provider order entry.

Table of Contents

Executive Summary ................................................ 3
    Performance Results .......................................... 3
    Major Findings synopsis ...................................... 4
    Additional Information and Suggested Improvements ............ 4
Method ........................................................... 5
    Introduction ................................................. 5
    Intended Users and Participants .............................. 5
    Study Design ................................................. 5
    Tasks ........................................................ 6
    Procedures ................................................... 6
    Test Environment ............................................. 7
    Test Forms and Tools ......................................... 7
    Participant Instructions ..................................... 7
    Usability Metrics ............................................ 8
        Data Scoring ............................................. 8
        Data Analysis and Reporting .............................. 8
Findings ......................................................... 9
    Product Effectiveness Results ................................ 9
    Product Efficiency Results ................................... 9
    Product Satisfaction Results ................................ 10
    Major Findings .............................................. 10
    Opportunities for Improvement ............................... 12
Appendices ...................................................... 13
    Appendix 1: Demographic Survey .............................. 14
    Appendix 2: Informed Consent & Release Form ................. 15
    Appendix 3: System Usability Scale Questionnaire ............ 16

Jan. 29, 2014

Computerized Provider Order Entry, EP-Lab Orders – HELP Order Communication

Page 2 of 16

Executive Summary

A summative usability study was completed by Intermountain Healthcare’s Clinical Information Systems Business Analyst Team for the HELP1 Order Communication module (hereafter referred to as Order Comm). Study sessions were conducted from Jan. 15-21, 2014, at the Intermountain Medical Center, South Office Building. This building is proximate to both the inpatient hospital and ambulatory settings where the product under study is used. The HELP1 product releases weekly and is not versioned.

The purpose of the study was to evaluate the ease of use of the current user interface and, additionally, to provide evidence of usability for Order Comm. During the usability study, six healthcare providers matching the target demographic criteria served as participants and used the Order Comm module in simulated but representative tasks provided by the §170.314(a)(1) test criteria. The study collected performance data from four tasks typically done within Order Comm. Morae Version 3.3.2 3 was used to record, observe, and analyze study data.

During the 30-minute one-on-one usability tests, participants were greeted by the usability study moderator and asked to review and sign an informed consent and release form 4; they were told that they could withdraw from the study at any time. All study participants were employed clinicians who use the product under consideration in their work environments. The usability study moderator introduced the study and instructed participants to complete a series of tasks. After the final task of the study, participants were asked to complete a System Usability Scale (SUS) questionnaire. Participant identification has been removed; study data can be linked, in some cases, to participants’ demographics but not to participants’ names.

Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records (NISTIR 7741), were used to evaluate the effectiveness, efficiency, and satisfaction of Order Comm.

Performance Results

CCHIT requirements for this test included 1) creating orders, 2) changing an order, and 3) viewing active orders.

Effectiveness: Search failures and users’ inability to find feature controls contributed to an overall average task failure rate of 17 percent. These issues are discussed in greater detail below.

Task            1     2     3     4     5     Average
Fail Rate (%)   0     0     17    67    0     16.8
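The overall figure is the unweighted mean of the five per-task failure rates. A minimal sketch (the report itself gives only the table above; the variable names are illustrative):

```python
# Per-task failure rates in percent, transcribed from the table above.
fail_rates = {1: 0, 2: 0, 3: 17, 4: 67, 5: 0}

# Unweighted mean across tasks: 84 / 5 = 16.8, rounded to 17 percent in the text.
average = sum(fail_rates.values()) / len(fail_rates)
print(average)  # → 16.8
```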

Efficiency: Participants were generally unable to complete tasks within the adjusted times (1.25×) of the expert user. Time lags were exacerbated by the search and vocabulary problems mentioned below. Minor confounding issues related to Morae and the menu structure of the test environment also contributed to slower times, as noted below. Generally, these clinicians completed tasks rapidly and without error except for orders requiring them to initiate hidden functionality (Cancel) or when they had difficulty searching for orderable items, as noted below.

3 Morae is produced by TechSmith Corporation. Morae Recorder, Observer, and Manager were used during the course of this study. See http://www.techsmith.com/morae.html for additional product information.
4 See Appendix 2 for more information about the Consent and Release form.


Satisfaction: SUS scores averaged 48.75, well short of the industry-standard benchmark of 68. User comments cited a long learning curve for interaction and vocabulary, inconsistent interaction, and difficulty finding features.

Major Findings synopsis

• Experienced users were able to place orders quickly, with few keystrokes and almost no mouse gestures – if they were successful at finding target items and features, as noted below. The legacy command line interface can be very fast, but users reported that extensive training time and learning are required to achieve efficiency.
• Inability to find orderable items created the greatest difficulty for these users; essentially, this reflects a lack of “smart search” technology. Though participants were experienced with the software, they were not always familiar with HELP1 terminology abbreviations. Moreover, the patterns for searching appeared to be inconsistent. These experienced users attested to using alternative methods of finding correct search terms in their own daily work; their comments indicate the search problems were not peculiar to this study’s content.
• Interactions required to complete the test tasks proved difficult to find for four of the six users. The most troublesome were a method of canceling an active order and a clear way to change patient focus. The inability to cancel led to more task failures than any other problem. (See the detailed discussion below.)

Additional Information and Suggested Improvements

• Implement improved search capabilities for orderable items (terminology).
• Conduct a heuristic evaluation to identify changes to the software that might reasonably be expected to improve user efficiency, effectiveness, and satisfaction, and to reduce user error. These changes could also be expected to reduce training time.
• Make user interaction consistent, at least throughout the module if not throughout the entire HELP1 EMR system.

See additional information in the Findings section.


Method

Introduction

This study tested the usability of the current user interface, workflow, and interaction of the Order Communications (Order Comm) module, a component within Intermountain Healthcare’s Health Evaluation through Logical Processing (HELP1) software. To this end, measures of effectiveness, efficiency, and user satisfaction, including task completion success rates, task completion times, and user satisfaction ratings, were captured during the usability testing.

Intermountain Healthcare’s HELP1 software has developed over the past three decades. Originally a mainframe-terminal system, the software migrated to Tandem-branded server hardware and operating system, with users interfacing through client PCs. Though the hardware and its proprietary data software became HP NonStop, the Intermountain product is still frequently known as the “Tandem,” as well as HELP1. Later, Intermountain developed a web-based system called HELP2, primarily for ambulatory care, but it is also used in the hospitals. HELP1 – variously known as Tandem or HELP1/Tandem – is one of the core EMR systems for Intermountain Healthcare’s 23 hospitals and their associated ambulatory services.

Intended Users and Participants

The module tested for this study is used extensively by clinicians in Intermountain Healthcare hospitals and associated ambulatory settings, including emergency departments, in accordance with Utah State law. A total of six Intermountain-employed clinicians took part in the study. These individuals were paid for their time based on Intermountain Healthcare compensation policies. Participants were scheduled for 60-minute sessions, as this test was combined with other Order Comm study tasks.

Test participants had a mix of clinical specialties and demographic characteristics, as shown in the table. To ensure anonymity, participant names are replaced with participant IDs.

Participant ID   Product Experience   EHR Experience     Work Environment   Clinical Experience   Age Range   Gender
2                Expert - 4           Expert - 4         Hospital           11–15 Years           18-34       F
3                Expert - 5           Expert - 4         Hospital           0–5 Years             18-34       F
4                Intermediate - 3     Intermediate - 3   Both               11–15 Years           35-49       F
5                Intermediate - 3     Novice - 2         Outpatient         0–5 Years             18-34       F
6                Intermediate - 3     Expert - 4         Hospital           0–5 Years             18-34       M
7                Novice - 2           Intermediate - 3   Outpatient         0–5 Years             35-49       F

Sessions occurred in an office facility on the hospital campus where the participants work. All tests utilizing Order Comm software to meet ONC Test §170.314(a)(1) Lab and Imaging criteria were conducted in the same sessions to optimize clinician time. The tool is used in both EH and EP environments, relative to the ONC criteria, with clinicians typically placing both Lab and Imaging orders in their work assignments.

Study Design

Overall, the objective of this study was to discover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this study may serve as a baseline for future studies using an updated version of HELP1. In short, this study serves both as a means to record or benchmark current usability and as a way to identify areas where improvements might be made.


During the usability study, participants interacted with the same implementation of the Order Comm product and were provided with identical instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time and without assistance
• Time to complete each task (Time on Task, or ToT)
• Participant comments
• Participant satisfaction ratings measured by the System Usability Scale

Tasks

CCHIT tasks for §170.314(a)(1) Computerized Provider Order Entry included the ability to order an item, change an item, and view current active orders. They were presented as fundamental and commonly performed activities within a clinical workflow. Order Comm requires patient and provider selection before orders can be created.

Preparation and orientation: 1) select patient, 2) review Morae interface and interaction, and 3) complete the demographic survey.

Task 1: Complete blood count (hemogram) panel in blood by automated count
Task 2: Hemoglobin A1c/hemoglobin total in blood
Task 3: Gas & carbon monoxide panel in blood
Task 4: Change the previous order to: Gas & carbon monoxide panel in arterial blood
Task 5: Review active orders

Tasks were also developed and implemented to evaluate risk, expressed as reduced effectiveness and potential negative outcomes for patients (NISTIR 7742, 3.3). Areas of significant patient risk, such as patient transportation, were not addressed in this test scenario, as they will be considered later relative to decision support. Tasks were presented to participants in an order typical of their clinical workflow, but pass/fail evaluation included prioritization of risk, and risk, where applicable, was used as an absolute criterion for marking a task as passed or failed.

Procedures

Upon arrival, participants were greeted and their identity was verified. Each participant reviewed and signed an informed consent and release form 5; the facilitator witnessed the participant’s signature. A usability study moderator was assigned to set up the portable usability lab, administer instructions, and conduct post-study interviews. Morae software was used to administer surveys and tasks, record task times, log task successes and failures, register mouse clicks, and capture participant comments.

For each task, participants were presented with on-screen instructions. Task timing began when the participant indicated they were ready to start by clicking the Start Task button displayed in an on-screen Task Instructions dialog box. Task timing stopped when the participant indicated they had successfully completed the task by clicking the End Task button in the same dialog.

Following the last task, a post-test System Usability Scale (SUS) survey was administered electronically. Once the post-test questionnaire was completed, participants were asked to discuss their rationale for their answers and were also asked to discuss the things they liked and did not like about Order Comm.

5 See Appendix 2 for more information about the Consent and Release form.


Participants were also invited to suggest improvements they felt would be useful. Participants were then thanked for their time and invited to return to their normal activities.

The Morae recordings captured each participant’s image as a “picture-in-picture” along with live action video of the software screen, mouse clicks, types of mouse clicks (left/right), and mouse movement. Participant demographics, times on task, satisfaction surveys, comments, and post-test questionnaire responses were also recorded with Morae. All participant recordings were assembled into a single Morae project, allowing data to be compiled and reviewed for all six participants as a set, as well as by individual participant.

Test Environment

Studies were run on a portable computer system comprising Intermountain’s current standard clinical software and operating system. This included a laptop computer running 64-bit Windows 7 connected to an Intermountain standard-issue Dell keyboard and mouse, and a 22-inch 1680 × 1050 pixel Dell monitor. The system also included a Logitech webcam to record audio and video of the participants. The Intermountain Local Area Network provided access to the HELP1 architecture (services, storage, etc.). Order Comm was run from the Intermountain test environment. From a technical and practical point of view, system performance, response time, etc. were representative of an authentic implementation experience.

Potential confounding issues: HELP1, and particularly workflows in Order Comm, use the Esc key as a “go back” command. Unfortunately, Morae uses that key to stop recordings. Users had to click on the HELP1 interface when returning from Morae task functions, which added to click counts and may have contributed to user stress. Screen content for the test environment and test user differed from that used in the participants’ own work environments, perhaps contributing to minor confusion and slight time lags.

Test Forms and Tools

During the usability study, various documents and instruments were used, including:

1. Informed Consent
2. On-line task prompts
3. Demographic Survey
4. SUS Questionnaire

Participant Instructions

The usability study moderator allowed time for the participant to review the Informed Consent document 6. The usability study moderator also reviewed the following areas of the Consent & Release document aloud with each participant.

… I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge.

I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance.

I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say.


I understand that I may withdraw from this session, assessment, or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for my participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation.

Usability Metrics

According to the NISTIR Guide to the Processes Approach for Improving the Usability of Electronic Health Records, electronic health record applications should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the study were to assess:

1. Effectiveness, by measuring participant success rates.
2. Efficiency, by measuring the average task time.
3. Satisfaction, by measuring ease of use ratings.

Data Scoring

Morae software collected data about Time on Task (ToT), success and failure achievement, and the number and severity of errors. Satisfaction data captured through surveys was also configurable. The post-test (SUS) questionnaire is an industry-standard set of ten (10) questions that captures the overall satisfaction rating for each participant. Morae also calculated the average SUS score for the participant set.

Data Analysis and Reporting

The results of the usability study were calculated according to the methods noted in the previous Usability Metrics section, and the following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time and without assistance
• Number of tasks not successfully completed
• Time to complete each task
• Major findings and/or identified problems and opportunities based on issues that came up during the tasks
• Participant’s verbalizations
• Participant’s satisfaction ratings for the product’s ease of use


Findings

Product Effectiveness Results

As noted above, effectiveness as determined by task completion reached 83 percent overall, with a 17 percent failure rate. The graph below illustrates the tasks where users experienced the greatest difficulties and failures.

Task 2 proved to be one of the most difficult search challenges for participants in the entire test series. Ultimately only two participants found the actual item, which required entering at least the leading letters of a six-letter mnemonic. Most participants ordered a hemoglobin test and specified A1c in the comments section, expecting the lab to change the order to the correct coding. Failure in Task 4 involved a hidden control for cancelling an order: participants had to cancel the item ordered in Task 3 in order to replace it. This problem is discussed in greater detail below.

Product Efficiency Results

The table below compares professional clinicians’ Times on Task (ToT) with that of an analyst who was an expert with the task genre and module. (The “expert user” baseline is calculated as actual expert time × 1.25.)

ToT           PREP   Task 1   Task 2   Task 3   Task 4   Task 5
Expert (P1)   .55    .40      .27      .47      .69      .25
P1 (×1.25)    .69    .50      .34      .59      .86      .34
P2            .65    .25      .74      .156     .74      .26
P3            .93    .60      1.51     1.39     1.84     .20
P4            .78    1.55     1.28     2.41     1.12     .53
P5            1.21   1.24     1.05     1.92     .09      .20
P6            .30    .17      1.31     .40      .93      .17
P7            1.93   1.72     1.15     1.75     4.61     .13

Times are expressed as decimal minutes. Grey shading in the source table indicated failed tasks.
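The pass/fail efficiency comparison described above (participant time versus 1.25× the expert's time) can be sketched in a few lines of Python. P3's times are transcribed from the table; the helper function and its name are illustrative, not part of the study's tooling:

```python
# Times on task in decimal minutes, transcribed from the ToT table above.
expert = {"Prep": 0.55, "Task 1": 0.40, "Task 2": 0.27,
          "Task 3": 0.47, "Task 4": 0.69, "Task 5": 0.25}
p3 = {"Prep": 0.93, "Task 1": 0.60, "Task 2": 1.51,
      "Task 3": 1.39, "Task 4": 1.84, "Task 5": 0.20}

def within_baseline(participant, expert, factor=1.25):
    """List the tasks a participant finished within factor x the expert's time."""
    return [task for task, minutes in participant.items()
            if minutes <= expert[task] * factor]

print(within_baseline(p3, expert))  # → ['Task 5']
```

Run against each row of the table, this reproduces the report's observation that participants were generally unable to complete tasks within the adjusted expert times.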


Product Satisfaction Results 7

The System Usability Scale (SUS) is a survey containing 10 questions. The average score for all 10 questions was calculated for each participant. The overall score for the entire product was then taken from the average score of all six participants. A SUS score above 68 is considered above average and anything below 68 is below average. 8 The Order Comm module received a score of 48.75, indicating a below-average level of satisfaction.

Participant   P2    P3    P4    P5     P6     P7
SUS           45    80    55    27.5   22.5   62.5

Average: 48.75    Standard Dev.: 19.8
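Standard Brooke SUS scoring (odd-numbered items contribute response − 1, even-numbered items contribute 5 − response, and the sum is scaled by 2.5) reproduces the summary statistics above. The sketch below uses a hypothetical response set for the per-item step, since the report publishes only per-participant totals:

```python
import statistics

def sus_score(responses):
    """Standard SUS scoring: ten 1-5 responses -> a 0-100 score.

    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response); the sum is scaled by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum(r - 1 if i % 2 == 0 else 5 - r  # 0-based index: even index = odd item
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical response set, for illustration only:
print(sus_score([3, 2, 4, 1, 4, 2, 4, 2, 3, 2]))  # → 72.5

# The published per-participant scores reproduce the report's summary figures:
scores = [45, 80, 55, 27.5, 22.5, 62.5]
print(statistics.mean(scores))              # → 48.75
print(round(statistics.pstdev(scores), 1))  # → 19.8
```

Note that the reported 19.8 corresponds to the population standard deviation (`pstdev`); the sample form (`stdev`) would give roughly 21.7.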

Major Findings

HELP1 has maintained its Command Line Interface (CLI), with myriad nested menus, to the present version, resulting in fast but training-intensive “Pick from Thousands” (PFT) keystroke operation. 9 Observations of actual trained clinical users testify to the effectiveness and efficiency of expert users, indicating the software can be highly useful when the user is highly trained and experienced. However, as this study of experienced users demonstrates, efficiency requires consistent formatting and content memorization rather than the ability to intuit the interaction. Any changes in page formatting or content vocabulary significantly slow throughput and increase errors.

For example, these users initially hesitated at the opening screen. Some commented that the screen was “different from mine” at various points in the workflow. The software displays options based on user rights and roles; the test environment allowed all of those, so the screen was more populated than what the test participants would see in their own work environments. In one particular case, users expected to enter “RAD” in a search field to act as a filter showing only Imaging items. The field for that filter was neither available nor defaulted in the test environment, resulting in significant difficulty for some users. The filter and search relationship was not intuitive for them. Similar issues could be expected as clinicians change assignments and/or departments.

Also, the orderable item search tool was repeatedly unhelpful in locating items to be ordered in the tasks. Within the combined group of studies, participants consistently commented on the need for a “better way” of looking up orderable items, endorsing such terms as “smart search” or “Google-like search.” Those search features are available in other Intermountain software products. Because of the weakness of the HELP1 Order Comm search feature, users commented:

• “This is where I look it up in my book.” (A printed index of item mnemonics, abbreviations, and names she keeps constantly at hand at her work station.)

7 A participant’s task efficiency is measured against an expert user’s time on the same task multiplied by 1.25.
8 Sauro, J. (2011). Measuring Usability with the System Usability Scale (SUS). http://www.measuringusability.com/sus.php
9 Sudar, 2013, Intermountain Healthcare Usability Study: HELP/Tandem Allergies Module, p. 5.


• “I would look it up in LabNet.” (A separate Intermountain software product that includes a terminology search.)
• “This is when we call the lab.” (Phone call to the lab, or conversation with a tech who might be available at the moment.)
• “We just write it in and let the lab pick the right one.” (Utilizing the comments fields to clarify a “close” match. For example, “Fasting Glucose” proved elusive, so some participants ordered the “Glucose” panel, adding “Fasting” to comments. They endorsed the practice of expecting the lab to reassign the proper item code.)

These options range from inconvenient to potential patient safety risks. That a printed index of orderable items is more effective and efficient than a computerized search tool speaks to a need for technical upgrades in HELP1 and the Order Comm module. Further, the goal of clinical software as a support to the professional clinician is frustrated if feature deficits cause the clinician to depend upon the actions of other learned intermediaries in order to create and process a proper order. Granted, expert lab or imaging clinicians might be more adept at selecting tests to confirm particular problems (e.g., indications or differential diagnoses) than some ordering providers, but if the ordering provider knows what he or she wants and simply can’t find it, the software is probably underperforming. For these reasons, and due to the frequency and negative performance impacts of the problem, the search feature appears to be the most serious usability problem discovered in these tests. Second in severity and frequency of negative impact is the problem of hidden features, or “secret handshakes” that one must know to operate the software. Those are discussed generally below, but in these tests, where required criteria called upon clinicians to “change” an order, the inability to find a way to “cancel” proved troublesome. Participants engaged in three software workflows to accomplish “change” tasks. One involved a batch order workflow initiated but then abandoned by one user. In that workflow, the user could discover an otherwise hidden command to “Delete” one of a list of unsaved orders rather than saving all of the orders. In Order Comm, once an order is made active by invoking a “Save” command, the order must be “Cancelled” and another order created to invoke a “change.” “Cancelled,” however, is an option found only in an order “Status” menu that proved difficult for users to find. 
In some cases, selecting an active order from an Order Review screen, and pressing Enter would display a screen with the Status field and menu. Sometimes not. The other workflow was to select to Modify an order from the main order screen, and enter the order number—which the user would have to have remembered from one or two screens before, depending on the route they took. This single problem caused more task failures than any other in the Order Comm CPOE tests. Participants endorsed the most common workaround: “Call the lab.” Imaging and Lab downstream software, which receive Order Comm orders through APIs, are used to cancel items instead of Order Comm. Other problems • The Order Comm module maintains a consistent “look and feel” throughout the order workflow. However, inconsistent operation of controls and non-standard vocabulary are significant roadblocks to usability. o Participants struggled to find a way to change patients. The main menu features a Function Key toolbar/menu at the bottom of the screen, included “F12-Pdt”. None of the Jan. 29, 2014

Computerized Provider Order Entry, EP-Lab Orders – HELP Order Communication

Page 11 of 16

• •

participants used this control to change patients without hinting. One participant explained she changes patients by closing the HELP1 software and re-launching it. o Controls, variously F-keys or numbers, are not consistent in appearance or function. For example, some work with a mouse click and others do not. In multiple circumstances, users had difficulty understanding how the tool numbers related to item numbers. Further errors resulted from using numbers that activated unintended actions. This was particularly true when two rows of commands were available. The system does not provide user feedback or warnings at critical interaction points, resulting in lost time and errors. Typical of its CLI heritage, HELP1 provides no tool or info tips on mouse hovers, and a novice user is without access to “secret handshakes” that make the system usable. Producing a menu of legal entries requires typing a “?” into the entry field in some cases. In other cases a “.” followed by a string is required. Efficient and effective use of these interactions can only result from training and practice, which the participants said is significant.

Additional Information and Suggested Improvements

• The HELP1 search control should be updated to provide “smart search” capability.
• Though “Cancelled” might be understood as a status (adjective), it is also an important action (verb). Consider surfacing this action as a control visible from appropriate screens.
• User interaction should be made consistent, at least throughout the module if not throughout the entire HELP1 EMR system.
• When coded data is required in a field, users should be given an affordance to open coded data menus without having to know the “secret handshakes.”

Opportunities for Improvement

• Adherence to established usability norms has been shown to significantly improve efficiency while reducing training costs and errors. For example, “One study at NCR showed a 25% increase in throughput with an additional 25% decrease in errors resulting from redesign of screens to follow basic principles of good design.” 10 The same authors document significant reductions in both training and support costs with improved usability.
• Heuristic evaluation could be conducted to identify changes to the software that could reasonably be expected to improve user efficiency, effectiveness, and satisfaction, and to reduce user error. These changes could also be expected to reduce training time.
• Contextual and discount usability techniques should be used to identify vocabulary problems to decrease training time and improve user performance.

10 Gallaway, 1981, quoted in Bias and Mayhew, 2005, Cost Justifying Usability, Morgan Kaufmann, p. 29.

Jan. 29, 2014

Computerized Provider Order Entry, EP-Lab Orders – HELP Order Communication

Page 12 of 16

Appendices

The following appendices include supplemental data for this usability test report.

Appendix 1: Demographic Survey
Appendix 2: Informed Consent and Release Form
Appendix 3: System Usability Scale (SUS) Questionnaire

Appendix 1: Demographic Survey

Thank you for participating in today's study. To help us understand your area of expertise, please take a few minutes to provide answers to the following questions.

1. What is your age group?
   ( ) 18-34  ( ) 35-49  ( ) 50+
2. What is your gender?
   ( ) Female  ( ) Male
3. What is your current clinical specialty?
   ( ) MD  ( ) Medical Assistant  ( ) Nurse Practitioner  ( ) Physician's Assistant  ( ) Pharmacist  ( ) RN  ( ) Other
   Additional Comments:
4. How long have you worked in your area of clinical expertise?
   ( ) 0-5 years  ( ) 6-10 years  ( ) 11-15 years  ( ) Over 15 years
   Additional Comments:
5. How would you rate your electronic health record computer expertise?
   Novice 1 2 3 4 5 Expert
   Additional Comments:
6. How familiar are you with the HELP1 Allergies module?
   Novice 1 2 3 4 5 Expert
   Additional Comments:
7. Which best applies to your current work environment?
   ( ) Clinic-Outpatient  ( ) Inpatient  ( ) Both
   Additional Comments:

Appendix 2: Informed Consent & Release Form

For valued contributions received, the recognition and sufficiency of which are hereby acknowledged, I ___________________________________________________ (name of test, interview or research participant) have voluntarily agreed to participate in User Centered sessions, assessments or interviews conducted and/or recorded by: _________________________________________ (name of person conducting tests, interviews or recordings)

I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance.

I understand that I will be recorded with both video and voice recording equipment during this study and that the equipment records all of my keyboard and mouse actions, as well as what I say.

I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation.

Although the sessions, assessments or interviews are not designed to create stress or discomfort, I understand that I may experience some stress or discomfort and that I may withdraw at any time without penalty.

I understand the software designs I view may be confidential and I will not discuss them with anyone outside of this study group. I also understand that these designs may be experimental in nature and may or may not be implemented in any actual products.

I have read the above release and consent prior to its execution; I fully understand the contents and consequences thereof. This agreement shall be binding upon me and my heirs, legal representatives and assigns.

YES, I have read the above statement and agree to be a participant.

__________________________________________________________
(Signature of Participant)

Date: ________________________

Appendix 3: System Usability Scale Questionnaire

Each statement below is rated on a five-point scale from 1 (Strongly Disagree) to 5 (Strongly Agree).

1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.


Usability Study Report
cPOE-A Drug-Drug Interactions
cPOE-A Drug-Medication Allergy Interactions

Product(s) & Version: cPOE 2013, Version 2.0.2.6; HELP2 Version M10.18
Safety Enhanced Design Requirements Addressed 1: §170.314 (a)(2) Drug-drug, drug-allergy interaction checks

------------------------------------
Date of Usability Test: November 26-27, 2013
Date of Report: December 3, 2013
Report Prepared By: Wendy Sudar, Intermountain Healthcare
Phone Number: 801.507.9165
Email address: [email protected]

1 The Office of the National Coordinator for Health Information Technology ♦ Approved Test Procedures Version 1.2 ♦ December 14, 2012 ♦ §170.314a2drug_allergy_interaction_checks_2014_tp_approvedv1.2

Table of Contents

EXECUTIVE SUMMARY .......... 3
PERFORMANCE RESULTS .......... 4
  EFFECTIVENESS .......... 4
  EFFICIENCY .......... 4
MAJOR FINDINGS .......... 4
AREAS FOR IMPROVEMENT .......... 4
INTRODUCTION .......... 4
METHOD .......... 4
  PARTICIPANTS .......... 4
  STUDY DESIGN .......... 5
  TASKS .......... 5
  PROCEDURES .......... 6
  TEST ENVIRONMENT .......... 7
  TEST FORMS AND TOOLS .......... 7
  PARTICIPANT INSTRUCTIONS .......... 7
  USABILITY METRICS .......... 8
  DATA SCORING .......... 8
  DATA ANALYSIS AND REPORTING .......... 8
MAJOR FINDINGS .......... 10
AREAS FOR IMPROVEMENT .......... 10
APPENDICES .......... 11
  Appendix 1: DEMOGRAPHIC SURVEY .......... 12
  Appendix 2: CONSENT & RELEASE .......... 13
  Appendix 3: SYSTEM USABILITY SCALE QUESTIONNAIRE .......... 14

EXECUTIVE SUMMARY

A summative usability study was completed by Intermountain Healthcare's Clinical Information Systems Business Analyst Team for the cPOE 2013 - Version 2.0.2.6 product (hereafter referred to as cPOE). Study sessions were conducted on November 26-27, 2013 at the Intermountain Medical Center, South Office Building.

The purpose of the study was to validate the ease of use of the current user interface and to provide evidence of usability for the cPOE product as it specifically relates to alert information for Drug-Drug and Drug-Medication Allergy Interaction checks when ordering medications. Usability study tasks and background setup allowed for the appearance of Drug-Drug Interaction information (Task 2), thus allowing clinicians to determine whether to continue with their medication order. In Task 3, existing medications and medication allergies in the patient's record exposed both Drug-Drug and Drug-Medication Interaction alerts, again allowing clinicians to determine whether to continue with their medication order.

During the usability study, four (4) healthcare providers matching the target demographic criteria served as participants and used the cPOE product in simulated but representative tasks. The study collected performance data from three (3) tasks typically done with a cPOE product when ordering medications. Morae Version 3.3.2 2 was used to record, observe, and analyze study data.

During the 30-minute one-on-one usability test, each participant was greeted by the administrator, asked to review and sign an informed consent/release form 3, and told they could withdraw from the study at any time. Study participants included those with experience using the current cPOE product or the product's predecessor application, as well as those unfamiliar with the product. The administrator introduced the study and instructed participants to complete a series of tasks (presented one at a time). As part of the study protocol, the administrator did not provide assistance to the participant on how to complete any of the tasks. All participant data was de-identified; no correspondence can be made from the identity of the participant to the data collected.

After the final task of the study, participants were asked to complete a post-test questionnaire, also known as the System Usability Scale (SUS), and were compensated according to Intermountain Healthcare payment guidelines for the use of their time. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records (NISTIR 7741), were used to evaluate the effectiveness, efficiency, and satisfaction of the cPOE product.

2 cPOE 2013 - Version 2.0.2 (Build 6) is also known as cPOE 2013 - Version 2.0.2.6.
3 Morae is produced by TechSmith Corporation. Three Morae products were used during the course of this study. See http://www.techsmith.com/morae.html for additional product information.

December 3, 2013

Summative Usability Testing Results: cPOE 2013, 2.0.2.4 Drug-Drug Interactions and Drug Medication Interactions

Page 3 of 14

Performance Results

EFFECTIVENESS
• Study results indicate that all four (4) participants were unable to successfully complete the first two tasks (completing a medication order).

EFFICIENCY
• Study results indicate that all four (4) participants were unable to successfully complete any of the tasks within the time frame (including a buffer) of an expert user.

MAJOR FINDINGS
• Participants were able to easily locate a medication using the medication search feature.
• For novice users, there is a learning curve in understanding the purpose and usage of the following buttons: Apply, Apply & Print, and Issue Order.
• Interaction alerts were easy to understand and address, as evidenced by comments such as: "Uh, oh, we're getting a drug interaction alert. I am a clinician and am clinically capable of making a decision. I will put in my reason."
• Overall product satisfaction was rated significantly above average. One participant comment indicated satisfaction with the overall ease of use: "The system was easy to use once I got my bearings."

AREAS FOR IMPROVEMENT
• Moving the Apply button to the end of the button row instead of the beginning might reduce problems with the initial learning curve.

INTRODUCTION
The purpose of this study was to test and validate the usability of the current user interface. To this end, measures of effectiveness, efficiency, and user satisfaction, such as task completion success, task completion times, and user satisfaction ratings, were captured during the usability testing. The application tested for this study was cPOE 2013, Version 2.0.2.6, which was designed to present medical information (e.g., the patient's known medications list) to healthcare providers in outpatient and inpatient (Emergency Department) settings. Specific areas for this study were the ordering of medications, some of which resulted in Drug-Drug Interaction alerts and Drug-Medication Allergy alerts.

METHOD

PARTICIPANTS
A total of four (4) participants, Clinical Analysts, tested the cPOE product. Participants were recruited from a pool of Intermountain Healthcare Clinical Analysts. These participants were compensated for their time based on Intermountain Healthcare compensation policies.


Participants had a mix of clinical specialties and demographic characteristics. The following table describes each participant. To ensure anonymity, participant names were replaced with Participant IDs so that an individual's data cannot be tied to their actual identity.

Participant ID              P1             P2             P3             P4
Gender                      Male           Female         Female         Male
Age Range                   50+            18-34          35-49          50+
Profession                  Lab Medicine   RN             RN             RN
Clinical Experience         >15 years      6-10 years     11-15 years    6-10 years
Work Environment            Inpatient      Outpatient     Inpatient      Outpatient
Product Expertise           Average        Below Average  Novice         Average
E-Health Record Experience  Almost Expert  Almost Expert  Almost Expert  Almost Expert

Participants were scheduled for 30-minute sessions. This pace allowed time between sessions for debriefing among the usability analyst, the data logger, and the observation group facilitator. It also allowed time to reset the system back to proper test conditions.

STUDY DESIGN
Overall, the objective of this study was to uncover areas where the application performed well (that is, effectively, efficiently, and with satisfaction) and areas where it failed to meet the needs of the participants. The data from this study may serve as a baseline for future studies using an updated version of the cPOE product. In short, this study serves both to benchmark current usability and to identify areas where improvements must be made. During the usability study, participants interacted with the same version of the cPOE product and were provided with identical instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:
• Number of tasks successfully completed within the allotted time and without assistance
• Time to complete each task
• Participant comments
• Participant satisfaction ratings

TASKS
Since no previous usability studies were performed on the CPOE Drug-Drug and Drug-Medication Allergy module, tasks were determined based on the following criteria: 1) fundamental and commonly performed medication orders that interact with other common medication orders; 2) common medication allergies that interact with common medication orders. Tasks were developed and implemented to evaluate risk, expressed as reduced effectiveness and potential negative outcomes for patients. Tasks were presented to participants in an order typical of their clinical workflow, but the pass/fail evaluation included prioritization of risk, and risk was used as the absolute criterion for marking task completion as passed or failed.


The study collected performance data from three (3) tasks. Morae Version 3.3.2 was used to record, observe, and analyze study data.

Task 1: Create a medication order for Baclofen 20 mg; include information for Dose, Route, Frequency, Quantity, and Refills.
Task 2: Create a medication order for Celebrex 200 mg; include information for Dose, Route, Frequency, Quantity, and Refills.
Task 3: Create an order for Erythromycin 250 mg Oral Capsule, Delayed Release; include information for Dose, Route, Frequency, Quantity, and Refills.

PROCEDURES
Upon arrival, participants were greeted and their identity was verified. Each participant reviewed and signed an informed consent and release form 4. The facilitator witnessed the participant's signature. A usability analyst moderated the study sessions, which included setting up the portable usability lab, administering instructions, and conducting post-study interviews. Morae Version 3.3.2 5 was used to administer surveys and tasks, record task times, log task successes and failures, register mouse clicks, and capture participant comments. A clinical analyst marked key areas of the session recording to be reviewed after the study.

For each task, participants were presented with on-screen instructions and were also given a paper copy of each task. Task timing began when the participant indicated they were ready to start the task by clicking the Start Task button found on the on-screen Task Instructions dialog box. Task timing stopped when the participant indicated they had completed the task by clicking the End Task button on the same dialog.

Following Task 3, a post-test questionnaire known as the System Usability Scale (SUS) was administered electronically. 6 Once the post-test questionnaire was completed, participants were asked to discuss their rationale for answering the post-test questions and to discuss the things they liked and did not like about the cPOE product. Participants were also invited to suggest improvements they felt would be useful. Participants were then thanked for their time and invited to return to their work environment.

Morae allowed the participant's image to be captured in a picture-in-picture screen that included the live action of the software screen, mouse clicks, types of mouse clicks (left/right), and mouse movement. Participant demographics, time on task, satisfaction surveys, comments, and post-test questionnaire responses were recorded using Morae. All participant recordings were assembled into a single Morae project, allowing data analysis to be compiled and reviewed for all four participants as a set, as well as by individual participant.

4 See Appendix 2 for the Consent and Release form.
5 Morae is produced by TechSmith Corporation. Morae Recorder, Observer, and Manager were used during the course of this study. See http://www.techsmith.com/morae.html for additional product information.
6 See Appendix 3 for more information about the System Usability Scale Questionnaire.


TEST ENVIRONMENT
As part of the portable usability lab, a single 22” monitor, keyboard, and mouse were attached to a laptop running the cPOE product. The monitor used the following settings: Resolution 1680 x 1050; Color Depth 32-bit; Refresh Rate 59 Hz. Since all participants are part of Intermountain Healthcare, the Intermountain LAN provided access to the cPOE product and no additional connections were required. The cPOE product was run from the production server and was used in conjunction with HELP2 Version M10.18. From a technical and practical point of view, system performance (response time, etc.) was representative of an authentic implementation experience.

TEST FORMS AND TOOLS
During the usability study, various documents and instruments were used, including:
1. Informed Consent
2. Online task prompts
3. Post-test Questionnaire
The participant's interaction with the cPOE product was captured and recorded digitally. A Logitech web camera recorded the participant's facial expressions, synced with the screen capture. Comments were recorded by the camera's microphone.

PARTICIPANT INSTRUCTIONS
The administrator allowed time for the participant to review the Informed Consent document 7. The administrator also read the following highlighted areas of the Consent & Release document aloud to each participant:

… I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance. I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say. I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation …


USABILITY METRICS
According to the NISTIR Guide to the Processes Approach for Improving the Usability of Electronic Health Records, electronic health record applications should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the study were to assess:
1. Effectiveness, by measuring participant success rates.
2. Efficiency, by measuring the average task time.
3. Satisfaction, by measuring ease-of-use ratings.

DATA SCORING
Morae can be configured to collect data about Time on Task (ToT), success and failure achievement, and the number and severity of errors. For this study, Time on Task and success/failure were captured. Satisfaction data was captured through the post-test questionnaire, the System Usability Scale (SUS): a set of ten (10) questions that captures the average satisfaction rating per participant. Morae allows for the calculation of an overall SUS score for the entire set of participants, which was also used as part of this study.

DATA ANALYSIS AND REPORTING
The results of the usability study were calculated according to the methods noted in the previous Usability Metrics section, and the following types of data were collected for each participant:
• Number of tasks successfully completed within the allotted time and without assistance
• Number of tasks not successfully completed
• Time to complete each task
• Major findings and/or identified problems and opportunities based on issues that arose during the tasks
• Participant verbalizations
• Participant satisfaction ratings for the product's ease of use

PRODUCT EFFECTIVENESS RESULTS
Study results indicate that all four (4) participants were unable to successfully complete the first two tasks. When a medication order is created, the clinician must use the Apply & Print button or the Issue button; however, the UI also allows medication orders to be placed on hold through the use of an Apply button. For the first two tasks, all four (4) participants used the Apply button, which does not constitute the completion of an order. Thus all four participants were documented as failing to complete the task. Task 3 presented both a Drug-Medication Allergy Interaction and a Drug-Drug Interaction problem. As a result, two (2) of the clinicians chose to cancel the medication order rather than proceed, due to the interaction risk. Since an order cancellation is considered a clinically appropriate task completion, these two (2) clinicians were given a task completion score. Since the other two (2) participants decided to continue the order but used the Apply button, the task was marked as Failure to Complete.

December 3, 2013

Summative Usability Testing Results: cPOE 2013, 2.0.2.4 Drug-Drug Interactions and Drug Medication Interactions

Page 8 of 14

Success Distribution by Task

[Chart: success distribution by task. Task 1: 100% failed to complete; Task 2: 100% failed to complete; Task 3: 50% completed, 50% failed to complete. Legend: Failed to complete; Completed with difficulty; Completed with ease; Score not set.]

Success Distribution By Task

Participant ID   Task 1               Task 2               Task 3
Participant 1    Failed to complete   Failed to complete   Completed task
Participant 2    Failed to complete   Failed to complete   Failed to complete
Participant 3    Failed to complete   Failed to complete   Completed task
Participant 4    Failed to complete   Failed to complete   Failed to complete
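The per-task percentages in the success distribution can be derived directly from the per-participant outcomes. The sketch below is illustrative only (not part of the study tooling); the `results` mapping and `completion_rate` helper are hypothetical names, with order cancellation in Task 3 counted as a completion per the scoring rule above.

```python
# Hypothetical sketch: deriving per-task success percentages from
# per-participant outcomes ("Completed" includes a clinically
# appropriate order cancellation, per the study's scoring rule).
results = {
    "Task 1": ["Failed", "Failed", "Failed", "Failed"],
    "Task 2": ["Failed", "Failed", "Failed", "Failed"],
    "Task 3": ["Completed", "Failed", "Completed", "Failed"],
}

def completion_rate(task):
    """Percent of participants whose outcome counts as task completion."""
    outcomes = results[task]
    return 100.0 * outcomes.count("Completed") / len(outcomes)

for task in results:
    print(f"{task}: {completion_rate(task):.0f}% completed")
```

Running this reproduces the chart's distribution: 0% completion for Tasks 1 and 2, and 50% for Task 3.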

PRODUCT EFFICIENCY RESULTS
Average Time on Task was compared to that of an expert user plus an additional 25% buffer. All four participants were unable to complete any of the tasks within the buffered expert time (+25%), so efficiency was 0% for each task.

ToT (in min)     Task 1   Task 2   Task 3
Expert           0.62     0.39     0.66
Expert + 25%     0.78     0.49     0.83
Participant 1    3.75     3.39     1.82
Participant 2    1.84     0.84     2.02
Participant 3    1.74     1.17     1.35
Participant 4    2.39     2.48     4.09
Efficiency       0%       0%       0%
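The "expert + 25% buffer" measure amounts to a simple threshold check on each participant's Time on Task. The following sketch is illustrative only (not part of the study tooling); the `expert_tot`, `participant_tot`, and `efficiency` names are hypothetical, and the values are the Time-on-Task figures (in minutes) from the table above.

```python
# Hypothetical sketch: the "expert + 25% buffer" efficiency measure.
# A participant passes a task if their Time on Task is within
# expert time * 1.25; efficiency is the percent of participants passing.
expert_tot = {"Task 1": 0.62, "Task 2": 0.39, "Task 3": 0.66}  # minutes
participant_tot = {
    "Task 1": [3.75, 1.84, 1.74, 2.39],
    "Task 2": [3.39, 0.84, 1.17, 2.48],
    "Task 3": [1.82, 2.02, 1.35, 4.09],
}

def efficiency(task):
    """Percent of participants finishing within the buffered expert time."""
    limit = expert_tot[task] * 1.25
    times = participant_tot[task]
    return 100.0 * sum(t <= limit for t in times) / len(times)

for task in expert_tot:
    print(f"{task}: {efficiency(task):.0f}%")  # 0% for every task
```

With these values, even the fastest participant time (0.84 min on Task 2) exceeds the buffered expert time (0.49 min), which is why every task's efficiency is 0%.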


PRODUCT SATISFACTION RESULTS

The System Usability Scale (SUS) is a survey containing ten (10) questions. The average score for all ten questions was calculated for each participant. The overall score for the product was then taken as the average of the scores from all four (4) participants. A SUS score above 68 is considered above average, and anything below 68 is below average 8. The cPOE product received an overall score of 81.25, indicating an above-average level of satisfaction.

Participant        SUS Score
Participant 1      82.50
Participant 2      55.00
Participant 3      97.50
Participant 4      90.00
Average (Mean)     81.25
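For reference, a per-participant SUS score is computed from the ten 1-5 responses using the standard SUS scoring rule, and the product-level score is the mean across participants. The sketch below is illustrative only (not part of the study tooling); the `sus_score` helper is a hypothetical name, and the `scores` list holds the per-participant values from the table above.

```python
# Hypothetical sketch: standard SUS scoring and the product-level mean.
def sus_score(responses):
    """Standard SUS scoring: responses is a list of ten ratings (1-5) in
    question order. Odd-numbered items contribute (rating - 1), even-numbered
    items contribute (5 - rating); the sum is scaled by 2.5 to give 0-100."""
    odd = sum(responses[i] - 1 for i in range(0, 10, 2))
    even = sum(5 - responses[i] for i in range(1, 10, 2))
    return (odd + even) * 2.5

# Per-participant scores from the table, averaged and compared to the
# 68-point benchmark (Sauro, 2011).
scores = [82.50, 55.00, 97.50, 90.00]
mean_sus = sum(scores) / len(scores)
print(f"Mean SUS: {mean_sus:.2f}")  # 81.25, above the 68 benchmark
```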

MAJOR FINDINGS
• Participants were able to easily locate a medication using the medication search feature.
• Participants readily added Dose, Frequency, Quantity, and Refill amounts without difficulty.
• While adding a start date for a medication was easily accomplished, adding an end date proved troublesome due to dependencies on medication quantities and refills.
• For novice users, there is a learning curve in understanding the purpose and usage of the following buttons: Apply, Apply & Print, and Issue Order. Novice users select the Apply button believing it submits an order; they may also choose it because it is the first of the three buttons.
• Interaction alerts were easy to understand and address, as evidenced by comments such as: "Uh, oh, we're getting a drug interaction alert. I am a clinician and am clinically capable of making a decision. I will put in my reason."
• Overall product satisfaction was rated significantly above average. One participant comment indicated satisfaction with the overall ease of use: "The system was easy to use once I got my bearings."

AREAS FOR IMPROVEMENT
• Moving the Apply button to the end of the button row instead of the beginning might reduce problems with the initial learning curve.

8 Sauro, J. (2011). Measuring Usability with the System Usability Scale (SUS). http://www.measuringusability.com/sus.php


APPENDICES


Appendix 1: DEMOGRAPHIC SURVEY

Thank you for participating in today's study. One of the requirements for the final usability report is participant demographic information. Please take a few minutes to provide answers to the following questions.

1. What is your current clinical specialty?
   ( ) MD  ( ) NP, PA  ( ) RN  ( ) Other (add comments in the box below, if desired)
   Additional Comments:
2. How long have you worked in your area of clinical expertise?
   ( ) 0-5 years  ( ) 6-10 years  ( ) 11-15 years  ( ) Over 15 years
   Additional Comments:
3. Which best applies to your current work environment?
   [ ] Clinic - Outpatient  [ ] Inpatient  [ ] Other: Please provide additional information in the comment box below
   Additional Comments:
4. How would you rate your electronic health record computer expertise?
   Novice 1 2 3 4 5 Expert
   Additional Comments:
5. How familiar are you with CPOE?
   Novice 1 2 3 4 5 Expert
   Additional Comments:

Appendix 2: CONSENT & RELEASE

For valued contributions received, the recognition and sufficiency of which are hereby acknowledged, I ___________________________________________________ (name of test, interview or research participant) have voluntarily agreed to participate in User Centered sessions, assessments or interviews conducted and/or recorded by: _________________________________________ (name of person conducting tests, interviews or recordings)

I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance.

I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say.

I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation.

Although the sessions, assessments or interviews are not designed to create stress or discomfort, I understand that I may experience some stress or discomfort and that I may withdraw at any time without penalty.

I understand the software designs I view may be confidential and I will not discuss them with anyone outside of this study group. I also understand that these designs may be experimental in nature and may or may not be implemented in any actual products.

I have read the above release and consent, prior to its execution; I fully understand the contents and consequences thereof. This agreement shall be binding upon me and my heirs, legal representatives and assigns.

YES, I have read the above statement and agree to be a participant.

__________________________________________________________ (Signature of Participant) Date: ________________________

December 3, 2013


Appendix 3:

SYSTEM USABILITY SCALE QUESTIONNAIRE

1. I think that I would like to use this system frequently.
   Strongly Disagree   1   2   3   4   5   Strongly Agree

2. I found the system unnecessarily complex.
   Strongly Disagree   1   2   3   4   5   Strongly Agree

3. I thought the system was easy to use.
   Strongly Disagree   1   2   3   4   5   Strongly Agree

4. I think that I would need the support of a technical person to be able to use this system.
   Strongly Disagree   1   2   3   4   5   Strongly Agree

5. I found the various functions in this system were well integrated.
   Strongly Disagree   1   2   3   4   5   Strongly Agree

6. I thought there was too much inconsistency in this system.
   Strongly Disagree   1   2   3   4   5   Strongly Agree

7. I would imagine that most people would learn to use this system very quickly.
   Strongly Disagree   1   2   3   4   5   Strongly Agree

8. I found the system very cumbersome to use.
   Strongly Disagree   1   2   3   4   5   Strongly Agree

9. I felt very confident using the system.
   Strongly Disagree   1   2   3   4   5   Strongly Agree

10. I needed to learn a lot of things before I could get going with this system.
    Strongly Disagree   1   2   3   4   5   Strongly Agree


Usability Study Report: HELP/Tandem Medication List (Pharmacy Orders)

Product(s) & Version: HELP Pharmacy Medication Orders [1]
Safety Enhanced Design Requirements Addressed [2]: §170.314(a)(6) Medication List, Hospital
Dates of Usability Test: Nov. 26-27, 2013
Date of Report: Dec. 3, 2013
Report Prepared By: Carl Bechtold, Intermountain Healthcare
    Phone Number: 801.507.9168
    Email address: [email protected]

[1] The HELP product is released weekly. It is not versioned. Module versions are not referenced.
[2] The Office of the National Coordinator for Health Information Technology ♦ Approved Test Procedures Version 1.2 ♦ December 14, 2012 ♦ §170.314a2drug_allergy_interaction_checks_2014_tp_approvedv1.2

Table of Contents

EXECUTIVE SUMMARY ............................... 4
PERFORMANCE RESULTS ............................. 4
MAJOR FINDINGS .................................. 5
INTRODUCTION .................................... 5
METHOD .......................................... 6
INTENDED USERS/PARTICIPANTS ..................... 6
STUDY DESIGN .................................... 6
TASKS ........................................... 7
PROCEDURES ...................................... 7
TEST ENVIRONMENT ................................ 8
TEST FORMS AND TOOLS ............................ 8
PARTICIPANT INSTRUCTIONS ........................ 8
USABILITY METRICS ............................... 9
DATA SCORING .................................... 9
Appendix 1: DEMOGRAPHIC SURVEY .................. 14
Appendix 2: INFORMED CONSENT & RELEASE FORM ..... 15
Appendix 3: SYSTEM USABILITY SCALE QUESTIONNAIRE  16

EXECUTIVE SUMMARY

A summative usability study was completed by Intermountain Healthcare's Clinical Information Systems Business Analyst Team for the HELP/Tandem Pharmacy Orders Module (hereafter referred to as HELP Pharmacy Orders). Study sessions were conducted Nov. 26-27, 2013, at the Intermountain Medical Center, South Office Building. As mentioned in Footnote 1, the HELP product is not versioned.

The purpose of the study was to evaluate the ease of use of the current user interface and to provide evidence of usability for HELP Pharmacy Orders. During the usability study, four (4) healthcare providers matching the target demographic criteria served as participants and used the HELP Pharmacy Orders module in simulated but representative tasks. The study collected performance data from six (6) tasks typically done within HELP Pharmacy Orders. A practice task was also included to allow participants to become familiar with the process of interacting with on-screen task instructions and taking the online post-task surveys. Morae Version 3.3.2 [3] was used to record, observe, and analyze study data.

During the 30-minute one-on-one usability tests, participants were greeted by the Usability Study Moderator, asked to review and sign an informed consent and release form [4], and told that they could withdraw from the study at any time. All study participants had attended a three-hour orientation and training session on the HELP Electronic Medical Record (EMR) system one week prior to the test sessions. Otherwise, the participants were unfamiliar with the HELP Pharmacy Orders module. The Usability Study Moderator introduced the study and instructed participants to complete a series of tasks. After the final task of the study, participants were asked to complete a System Usability Scale (SUS) questionnaire. Participant identification has been removed. Study data can be linked, in some cases, to participants' demographics but not to participants' names.

Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records (NISTIR 7741), were used to evaluate the effectiveness, efficiency, and satisfaction of HELP Pharmacy Orders.

PERFORMANCE RESULTS

Task 1 – Review a selected patient's active and inactive medications. (50% successful)
Task 2 – Discontinue the patient's Omeprazole order. (50% successful)
Task 3 – View the patient's Coumadin order history. (50% successful)
Task 4 – Your patient has been taking 2.5 mg of Coumadin daily. The physician now wants the patient to take 3 mg daily. Make that change. (75% successful)
Task 5 – Create a medication order for a continuous infusion of sodium chloride 0.9% at 100 ml/hr for a total of 2 liters. (25% successful)
Task 6 – Create a medication order for a dobutamine continuous infusion starting at 5 mcg/kg/min and enter titration parameters. (0% successful)

[3] Morae is produced by TechSmith Corporation. Morae Recorder, Observer, and Manager were used during the course of this study. See http://www.techsmith.com/morae.html for additional product information.
[4] See Appendix 3 for more information about the Consent and Release form.

Dec. 3, 2013

HELP/Tandem Med Orders & Med List

Page 4 of 13

Efficiency: None of the participants were able to complete any of the tasks within the adjusted times (1.25x) of the expert user. See complete data below.

MAJOR FINDINGS

In addition to the performance data collected, the following qualitative observations were made. HELP has maintained its Command Line Interface (CLI) with myriad nested menus to the present version, resulting in fast but training-intensive "Pick From Thousands" (PFT) keystroke operation. [5] Observations of actual trained clinical users comport with the effectiveness and efficiency of our expert user, indicating the software can be highly useful when the user is highly trained and experienced. However, all of this test's novice module users, though they had undergone some demonstration training, did not find the software user-friendly.

• The HELP Pharmacy Medication Orders module maintains a fairly consistent "look and feel" throughout the order workflow. However, inconsistent operation of controls and non-standard vocabulary are significant roadblocks to usability.
• The system does not provide user feedback or warnings at critical interaction points, resulting in lost time and errors.
• Typical of its CLI heritage, HELP provides no tool or info tips on mouse hover, and a novice user is without access to the "secret handshakes" that make the system usable. For example, in our test, users could type a string of letters into a "route" entry field. If the system did not recognize the string as a legal coded input, the user could not proceed. Producing a menu of legal entries requires typing a "?" into the entry field.
• Controls, variously F-keys or numbers, are not consistent. Moreover, their behavior is inconsistent. For example, some work with a mouse click and others do not.

Additional Information and Suggested Improvements

• Heuristic evaluation could be conducted to identify changes to the software that could reasonably be expected to improve user efficiency, effectiveness, and satisfaction, and to reduce user error. These changes could also be expected to reduce training time.
• Contextual and discount usability techniques should be used to identify vocabulary problems, decrease training time, and improve user performance.
• User interaction should be made consistent, at least throughout the module if not throughout the HELP EMR system.
• User feedback must be implemented, particularly when user error results in loss of data.

INTRODUCTION

The purpose of this study was to test and validate the usability of the current user interface and to provide evidence of usability. To this end, measures of effectiveness, efficiency, and user satisfaction, including task completion success, task completion times, and satisfaction ratings, were captured during the usability testing.

[5] Sudar, 2013, HELP/Tandem Allergies Module, p. 5


The application tested for this study was the HELP Pharmacy Medication Orders module, used extensively by pharmacists in Intermountain Healthcare hospitals. Physicians prepare written orders and do not interact directly with the module. Pharmacy Technicians use the module to process billing for dispensed items.

Health Evaluation through Logical Processing (HELP) is an Intermountain Healthcare application developed over the past three decades. Originally a mainframe-terminal system, the software migrated to Tandem-branded server hardware and operating system, with users interfacing through client PCs. Though the hardware and its proprietary data software became HP NonStop, the Intermountain product is still frequently known as the "Tandem" as well as HELP. Later, Intermountain developed a web-based system called Help2, primarily for ambulatory care, but it is also used in the hospitals. HELP1, or as it is variously known, Tandem or Help1/Tandem, is one of the core EMR systems for Intermountain Healthcare's 23 hospitals.

METHOD

INTENDED USERS/PARTICIPANTS

The intended users for HELP Pharmacy Orders are pharmacists. A total of four (4) clinical analysts took part in the study. Participants were recruited from a pool of Intermountain Healthcare employed clinicians. These individuals were compensated for their time based on Intermountain Healthcare compensation policies. Participants were scheduled for 30-minute sessions at their Intermountain facilities. Participants had a mix of clinical specialties and demographic characteristics, as shown in the following table. To ensure anonymity, participant names are replaced with Participant IDs.

Participant ID | Product Experience | E-Health Record Experience | Work Environment | Clinical Experience | Profession   | Age Range | Gender
1              | Novice - 2         | Expert - 4                 | Hospital         | >15 years           | Lab Medicine | >50       | M
2              | Novice - 1         | Expert - 4                 | IT               | 6-10 years          | RN           | 18-34     | F
3              | Novice - 2         | Expert - 4                 | Hospital         | 11-15 years         | RN           | 35-49     | F
4              | Novice - 1         | Expert - 4                 | IT               | 6-10 years          | RN           | >50       | M

STUDY DESIGN

Overall, the objective of this study was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this study may serve as a baseline for future studies using an updated version of HELP. In short, this study serves both to benchmark current usability and to identify areas where improvements might be made. During the usability study, participants interacted with the same version of the Help/Tandem Pharmacy Order product and were provided with identical instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time and without assistance
• Time to complete each task
• Participant comments
• Participant satisfaction ratings measured by Likert Scale surveys

TASKS

Tasks were determined based on the following criteria: 1) prioritization based on fundamental and commonly performed activities; 2) tasks specified in the ONC's §170.314(a)(1) Medication Orders, Hospital and §170.314(a)(6) Medication List, Hospital certification criteria. Tasks were also developed and implemented to evaluate risk, expressed as reduced effectiveness and potential negative outcomes for patients (NISTIR 7742, 3.3). Tasks were presented to participants in an order typical of their clinical workflow, but pass/fail evaluation included prioritization of risk, and risk was used as an absolute criterion for marking task completion as passed or failed. The study collected performance data from six (6) tasks. Morae Version 3.3.2 was used to record, observe, and analyze study data.

Task 1 1) Click Start Task to begin; 2) Select your patient. 3) View the patient's active medication list. 4) View the patient's inactive medication list. 5) Click End Task when you have completed the task.

Task 2 Discontinue the Omeprazole prescription.

Task 3 View the patient’s Coumadin order history.

Task 4 Your patient has been taking 2.5 mg of Coumadin daily. The physician now wants the patient to take 3 mg daily. Make that change.

Task 5 Create a medication order for a continuous infusion of sodium chloride 0.9% 100/ml/hr for a total of 2 liters.

Task 6 Create a medication order for a dobutamine continuous infusion starting at 5mcg/kg/min and enter titration parameters.

PROCEDURES

Upon arrival, participants were greeted and their identity was verified. Each participant reviewed and signed an informed consent and release form. [6] The facilitator witnessed the participant's signature. A usability study moderator was assigned to set up the portable usability lab, administer instructions, and conduct post-study interviews. Morae software was used to administer surveys and tasks, record task times, log task successes and failures, register mouse clicks, and capture participant comments.

For each task, the participants were presented with on-screen instructions. Task timing began when the participant indicated they were ready to start the task by clicking the Start Task button displayed in an on-screen Task Instructions dialog box. Task timing stopped when the participant indicated they had completed the task by clicking the End Task button displayed in the Task Instructions dialog.

[6] See Appendix 3 for more information about the Consent and Release form.


Following the last task, a post-test questionnaire known as the System Usability Scale (SUS) was administered electronically. Once the post-test questionnaire was completed, participants were asked to discuss their rationale for answering the post-test questions and to discuss the things they liked and did not like about HELP Pharmacy Orders. Participants were also invited to suggest improvements they felt would be useful. Participants were then thanked for their time and invited to return to their normal activities.

The Morae recordings captured each participant's image as a "picture-in-picture" along with live-action video of the software screen, mouse clicks, types of mouse clicks (left/right), and mouse movement. Participant demographics, time on tasks, satisfaction surveys, comments, and post-test questionnaire responses were also recorded with Morae. All participant recordings were assembled into a Morae project, allowing data to be compiled and reviewed for all four participants as a set, as well as by individual participant.

TEST ENVIRONMENT

For the convenience of test participants, all test sessions were conducted in conference rooms at the clinical analysts' work facility. Studies were run on a portable computer system comprising Intermountain's current standard clinical software and operating system. This included a laptop computer running 64-bit Windows 7 connected to a standard-issue Dell keyboard and mouse, and a 22-inch 1680 x 1050 pixel Dell monitor. The system also included a Logitech web cam to record audio and video of the participants. Since all participants are Intermountain Healthcare employees, the Intermountain LAN provided access to the Help/Tandem Pharmacy Order architecture (services, storage, etc.).

Help/Tandem Pharmacy Order was run from the Test environment. The module was used in conjunction with the HELP application. From a technical and practical point of view, system performance, response time, etc., were representative of an authentic implementation experience.

TEST FORMS AND TOOLS

During the usability study, various documents and instruments were used, including:
1. Informed Consent
2. On-line task prompts
These tasks were followed by a post-test questionnaire, the System Usability Scale (SUS). [7]

The participant's interaction with the Help/Tandem Pharmacy Order was captured and recorded digitally. A web camera recorded the participant's facial expressions synced with the screen capture. Comments were recorded by the camera's microphone.

PARTICIPANT INSTRUCTIONS

The usability study moderator allowed time for the participant to review the Informed Consent document. [8] The usability study moderator also read the following highlighted areas of the Consent & Release document aloud to each participant:

… I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any

[7] See Appendix 2 for more information about the System Usability Scale (SUS).


way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance. I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say. I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation…

USABILITY METRICS

According to the NISTIR Guide to the Processes Approach for Improving the Usability of Electronic Health Records, electronic health record applications should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the study were to assess:

1. Effectiveness, by measuring participant success rates.
2. Efficiency, by measuring the average task time.
3. Satisfaction, by measuring ease-of-use ratings.

DATA SCORING

Morae can be configured to collect data about Time on Task (ToT), success and failure, and the number and severity of errors. For this study, Time on Task and success/failure were captured. Satisfaction data captured through surveys was also configurable. The post-test System Usability Scale (SUS) questionnaire is an industry-standard set of ten (10) questions that captures an overall satisfaction rating for each participant. Morae also allows for the calculation of an overall SUS score for the entire set of participants.
DATA ANALYSIS AND REPORTING

The results of the usability study were calculated according to the methods noted in the previous Usability Metrics section, and the following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time and without assistance
• Number of tasks not successfully completed
• Time to complete each task
• Major findings and/or identified problems and opportunities based on issues that arose during the tasks
• Participant verbalizations
• Participant satisfaction ratings for the product's ease of use
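The per-task effectiveness percentages reported in the results section (e.g., Task 1: 50% successful) are simple success proportions over the four participants. Below is a minimal sketch of that computation; the pass/fail assignments are hypothetical, since the report does not identify which participants passed which task, and the function name is ours.

```python
# Sketch: per-task effectiveness as the share of participants who completed
# the task within the allotted time and without assistance.

def success_rate(results):
    """results: list of booleans, one per participant; True = task passed."""
    return 100.0 * sum(results) / len(results)

# Hypothetical pass/fail log for the four participants (P1-P4); the totals
# mirror the report's percentages, not the actual per-participant outcomes.
outcomes = {
    "Task 1": [True, True, False, False],    # 50% successful
    "Task 5": [True, False, False, False],   # 25% successful
    "Task 6": [False, False, False, False],  # 0% successful
}

for task, results in outcomes.items():
    print(f"{task}: {success_rate(results):.0f}% successful")
```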


PRODUCT EFFECTIVENESS RESULTS

Task 1 – Review a selected patient's active and inactive medications. (50% successful)
Task 2 – Discontinue the patient's Omeprazole order. (50% successful)
Task 3 – View the patient's Coumadin order history. (50% successful)
Task 4 – Your patient has been taking 2.5 mg of Coumadin daily. The physician now wants the patient to take 3 mg daily. Make that change. (75% successful)
Task 5 – Create a medication order for a continuous infusion of sodium chloride 0.9% at 100 ml/hr for a total of 2 liters. (25% successful)
Task 6 – Create a medication order for a dobutamine continuous infusion starting at 5 mcg/kg/min and enter titration parameters. (0% successful)

PRODUCT EFFICIENCY RESULTS

None of the participants were able to successfully complete the tasks within the adjusted expert timeframe. (The expert user baseline is calculated as actual time x 1.25.)

ToT      Expert (P0)  P0 (x1.25)  P1    P2    P3     P4
Task 1   1:03         0.11        2:01  3:11  1:25   :29
Task 2   :14          0.74        3:38  4:30  6:13   1:58
Task 3   :13          0.60        7:43  :37   2:54   :36
Task 4   1:08         0.26        1:30  8:26  11:55  3:30
Task 5   :35          0:43        4:58  8:03  6:38   3:54
Task 6   1:29         1:51        5:42  5:08  8:59   3:36
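The 1.25x expert-time rule described above can be sketched as follows. This is a minimal illustration; the helper names are ours, and the times are taken from the Task 6 row of the table (expert 1:29, so the adjusted threshold is 1:51).

```python
# Sketch of the efficiency rule: a task attempt passes only if it finishes
# within 1.25x the expert user's time on the same task.

def to_seconds(t: str) -> int:
    """Convert an 'm:ss' or ':ss' time string to seconds."""
    minutes, _, seconds = t.rpartition(":")
    return int(minutes or 0) * 60 + int(seconds)

def within_adjusted_expert_time(participant: str, expert: str,
                                factor: float = 1.25) -> bool:
    """True if the participant's time is within factor * the expert's time."""
    return to_seconds(participant) <= to_seconds(expert) * factor

expert_task6 = "1:29"  # expert (P0) time on Task 6
attempts = {"P1": "5:42", "P2": "5:08", "P3": "8:59", "P4": "3:36"}

for pid, t in attempts.items():
    verdict = "pass" if within_adjusted_expert_time(t, expert_task6) else "fail"
    print(pid, verdict)  # all four attempts exceed 1:51, so all fail
```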

(Grey shading in the original table indicates failed tasks.)

PRODUCT SATISFACTION RESULTS [9]

The System Usability Scale (SUS) is a survey containing ten (10) questions. The average score for all ten questions was calculated for each participant. The overall score for the entire product was then taken as the average score across all four (4) participants. A SUS score above 68 is considered above average, and anything below 68 is below average. [10] The HELP Pharmacy Orders module received a score of 28.13, indicating a below-average level of satisfaction.

Participant          SUS Score
P1                   30
P2                   22.25
P3                   25
P4                   35
All Participants     28.13
Standard Deviation   4.87

[9] A participant's task efficiency is measured against an expert user's time on the same task multiplied by 1.25.
[10] Sauro, J. (2011). Measuring Usability with the System Usability Scale (SUS). http://www.measuringusability.com/sus.php
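For reference, here is a minimal sketch of conventional SUS scoring (Brooke's method): odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is scaled by 2.5 to a 0-100 range. The report describes its per-participant computation as a per-question average, so this sketch illustrates the standard method rather than reproducing the exact figures above; the example responses are hypothetical, since individual item responses are not given in the report.

```python
# Standard SUS scoring (Brooke): ten 1-5 Likert responses -> 0-100 score.

def sus_score(responses):
    """responses: ten 1-5 ratings, in questionnaire order (item 1 first)."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded; even items are negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# A hypothetical participant who rates the system poorly on most items:
print(sus_score([2, 4, 2, 4, 2, 4, 2, 4, 2, 4]))  # -> 25.0
```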


MAJOR FINDINGS

HELP has maintained its Command Line Interface (CLI) with myriad nested menus to the present version, resulting in fast but training-intensive "Pick From Thousands" (PFT) keystroke operation. [11] Observations of actual trained clinical users comport with the effectiveness and efficiency of our expert user, indicating the software can be highly useful when the user is highly trained and experienced. However, none of this test's novice module users, though they had undergone some demonstration training, found the software intuitive.

• The HELP Pharmacy Medication Orders module maintains a consistent "look and feel" throughout the order workflow. However, inconsistent operation of controls and non-standard vocabulary are significant roadblocks to usability.
  o One participant working on Task 4, for example, found a menu to "Edit Doses." This user repeatedly attempted to use it to modify the dose size rather than the number of doses.
  o Participants struggled to find a way to select a patient (uncommon abbreviation).
  o "Input the indices of the drug you want to delete" drew a frustrated "What index?"
  o An unlabeled and seemingly required field followed only by "MG", together with an "Available Strengths" dialog, caused most users to repeatedly attempt and fail to "resolve dose exactly" as the dialog instructed.
• The system does not provide user feedback or warnings at critical interaction points, resulting in lost time and errors.
  o Repeated verbalizations from users indicated they did not know when their orders were "done."
  o One user, viewing a drug interaction screen, pressed [ESC] (the common way to "go back") rather than "Y" or "N" to save or abort the order. The order was lost with no feedback or warning to the user.
• Typical of its CLI heritage, HELP provides no tool or info tips on mouse hover, and a novice user is without access to the "secret handshakes" that make the system usable. For example, in our test, users could type a string of letters into a "route" entry field. If the system did not recognize the string as a legal coded input, the user could not proceed. Producing a menu of legal entries requires typing a "?" into the entry field. Our expert user used a "." followed by a mnemonic to filter menus.
• Controls, variously F-keys or numbers, are not consistent in appearance or function. For example, some work with a mouse click and others do not. In multiple circumstances, users had difficulty understanding how the tool numbers related to item numbers. Further errors resulted from using numbers that activated unintended actions, particularly when two rows of commands were available. Vertical scroll bars appear within some menus, but none of the participants noticed them without hinting.

[11] Sudar, 2013, HELP/Tandem Allergies Module, p. 5


IDENTIFIED PROBLEMS & OPPORTUNITIES FOR IMPROVEMENT

• Adherence to established usability norms has been shown to significantly improve efficiency while reducing training costs and errors. For example, "One study at NCR showed a 25% increase in throughput with an additional 25% decrease in errors resulting from redesign of screens to follow basic principles of good design." [12] The same authors document significant reduction in training costs with improved usability.
• Heuristic evaluation could be conducted to identify changes to the software that could reasonably be expected to improve user efficiency, effectiveness, and satisfaction, and to reduce user error. These changes could also be expected to reduce training time.
• Contextual and discount usability techniques should be used to identify vocabulary problems, decrease training time, and improve user performance.
• User interaction should be made consistent, at least throughout the module if not throughout the HELP EMR system.
• User feedback must be implemented, particularly when user error results in loss of data.
• When coded data is required in a field, users should be given an affordance to open coded-data menus without having to know the "secret handshakes."

[12] Gallaway, 1981, quoted in Bias and Mayhew, 2005, Cost-Justifying Usability, Morgan Kaufmann, p. 29.


APPENDICES The following appendices include supplemental data for this usability test report. Following is a list of the appendices provided:

Appendix 1: Demographic Survey Appendix 2: System Usability Scale (SUS) Questionnaire Appendix 3: Informed Consent and Release Form


Appendix 1: DEMOGRAPHIC SURVEY

Thank you for participating in today's study. To help us understand your area of expertise, please take a few minutes to provide answers to the following questions.

1. What is your age group?
   ( ) 18-34   ( ) 35-49   ( ) 50+
2. What is your gender?
   ( ) Female   ( ) Male
3. What is your current clinical specialty?
   ( ) MD   ( ) Medical Assistant   ( ) Nurse Practitioner   ( ) Physician's Assistant   ( ) Pharmacist   ( ) RN   ( ) Other
   Additional Comments:
4. How long have you worked in your area of clinical expertise?
   ( ) 0-5 years   ( ) 6-10 years   ( ) 11-15 years   ( ) Over 15 years
   Additional Comments:
5. Where would you rate your electronic health record computer expertise?
   Novice   1   2   3   4   5   Expert
   Additional Comments:
6. How familiar are you with the HELP Pharmacy Orders module?
   Novice   1   2   3   4   5   Expert
   Additional Comments:
7. Which best applies to your current work environment?
   ( ) Clinic-Outpatient   ( ) Inpatient   ( ) Both
   Additional Comments:


Appendix 2: INFORMED CONSENT & RELEASE FORM

For valued contributions received, the recognition and sufficiency of which are hereby acknowledged, I ___________________________________________________ (name of test, interview or research participant) have voluntarily agreed to participate in User Centered sessions, assessments or interviews conducted and/or recorded by: _________________________________________ (name of person conducting tests, interviews or recordings)

I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance.

I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say.

I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation.

Although the sessions, assessments or interviews are not designed to create stress or discomfort, I understand that I may experience some stress or discomfort and that I may withdraw at any time without penalty.

I understand the software designs I view may be confidential and I will not discuss them with anyone outside of this study group. I also understand that these designs may be experimental in nature and may or may not be implemented in any actual products.

I have read the above release and consent, prior to its execution; I fully understand the contents and consequences thereof. This agreement shall be binding upon me and my heirs, legal representatives and assigns.

YES, I have read the above statement and agree to be a participant.

__________________________________________________________ (Signature of Participant) Date: ________________________


Appendix 3: SYSTEM USABILITY SCALE QUESTIONNAIRE

Participants rated each statement on a five-point scale, from 1 (Strongly Disagree) to 5 (Strongly Agree):

1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.


HELP 1/Tandem Allergies Module

Module Studied: HELP Allergies 1
Referenced Test Procedures 2: §170.314(a)(7) Medication Allergy List - Hospital
Dates of Usability Test: 11/1/2013 – 11/12/2013
Date of Report: 11/22/2013
Report Prepared By: Wendy Sudar, Intermountain Healthcare
Phone Number: 801.507.9165
Email address: [email protected]

1 HELP/Tandem software is not versioned.
2 The Office of the National Coordinator for Health Information Technology ♦ 2014 Edition: Test Procedure for § 170.314a7medicationallergylist_2014_tp_approvedv1.2 ♦ Approved Test Procedure Version 1.2 ♦ December 14, 2012

Table of Contents

EXECUTIVE SUMMARY ............................................... 4
PERFORMANCE RESULTS ............................................. 4
MAJOR FINDINGS .................................................. 5
INTRODUCTION .................................................... 5
METHOD .......................................................... 5
INTENDED USERS/PARTICIPANTS .................................... 5
STUDY DESIGN .................................................... 6
TASKS ........................................................... 6
PROCEDURES ...................................................... 7
TEST ENVIRONMENT ................................................ 7
TEST FORMS AND TOOLS ............................................ 8
PARTICIPANT INSTRUCTIONS ........................................ 8
USABILITY METRICS ............................................... 8
DATA SCORING .................................................... 9
MAJOR FINDINGS .................................................. 11
Appendix 1: DEMOGRAPHIC SURVEY .................................. 13
Appendix 2: INFORMED CONSENT & RELEASE FORM ..................... 14
Appendix 3: SYSTEM USABILITY SCALE QUESTIONNAIRE ................ 15

EXECUTIVE SUMMARY
A summative usability study was completed by Intermountain Healthcare's Clinical Information Systems Business Analyst Team for the HELP/Tandem Allergies Module (hereafter referred to as HELP Allergies). As noted in Footnote 1 of the cover page, HELP is not versioned. Study sessions were conducted from 11/11/2013 – 11/12/2013 at the Intermountain Medical Center, South Office Building. The purpose of the study was to evaluate the ease of use of the current user interface and, additionally, to provide evidence of usability for HELP Allergies.

During the usability study, four (4) healthcare providers matching the target demographic criteria served as participants and used the HELP Allergies module in simulated but representative tasks. The study collected performance data from six (6) tasks typically done using HELP Allergies. A practice task was also included to allow participants to become familiar with the process of interacting with on-screen task instructions and taking the online post-task surveys. Morae Version 3.3.2 3 was used to record, observe, and analyze study data.

During the 30-minute one-on-one usability tests, each participant was greeted by the Usability Study Moderator, asked to review and sign an informed consent and release form 4, and told that they could withdraw from the study at any time. All study participants were unfamiliar with the HELP Allergies product. Three of the four were also unfamiliar with HELP. The Usability Study Moderator introduced the study and instructed participants to complete a series of tasks. After the final task of the study, participants were asked to complete a System Usability Scale (SUS) questionnaire. Participant names were removed from the questionnaire. Study data can be linked, in some cases, to participants' demographics but not to participants' names.

Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records (NISTIR 7741), were used to evaluate the effectiveness, efficiency, and satisfaction of HELP Allergies.

PERFORMANCE RESULTS
Task 1: Import the Patient's Allergies from HELP2 – 75% Success, 0% Efficiency
Task 2: Discontinue an Allergy – 66% Success, 0% Efficiency
Task 3: Review the list of no longer applicable allergies – 50% Success, 0% Efficiency
Task 4: Add a Medication Allergy – 100% Success, 0% Efficiency
Task 5: Add a Food Allergy – 100% Success, 0% Efficiency
Task 6: Add a Second Allergy Reaction – 30% Success, 0% Efficiency

3 Morae is produced by TechSmith Corporation. Morae Recorder, Observer, and Manager were used during the course of this study. See http://www.techsmith.com/morae.html for additional product information.
4 See Appendix 2 for more information about the Informed Consent & Release Form.

Nov. 22, 2013

HELP/Tandem Allergies Module

Page 4 of 13

MAJOR FINDINGS
• While experienced users can speedily move through the Command Line Interface (CLI), novice users found the non-GUI interface challenging to use, since functionality is hidden or obscured deep inside nested menu sets.
• The keystrokes to access the Pick from Thousands (PFT) list were not familiar to novice users, so a rapid way to obtain data went unnoticed.
• "The UI is not user friendly and it is cumbersome in the way you have to get places. When you are not familiar with the system and until you learn the system it is not easy. When you are unfamiliar with the process it is slow to chart. It doesn't give you all the buttons so you know where to go next." (Participant 2)
• During the HELP2 allergy list import, the message "No Ingredient has a HIC Code. Allergy will not be stored" appeared.
• "It was nice that it [HELP] pulled in the allergies from the outpatient program. This is a huge benefit." (Participant 2)
• Adding multiple reactions was troublesome. There was no visible display for a second or third line entry, but the option was actually available. This caused significant confusion about how to enter multiple reactions.

INTRODUCTION
The purpose of this study was to test and validate the usability of the current user interface and to provide evidence of usability. To this end, measures of effectiveness, efficiency, and user satisfaction (such as task completion success and task completion time) were captured during the usability testing. The application tested for this study was the HELP Allergies module, which was designed to present medical information (e.g., the patient's known allergy list) to healthcare providers in an outpatient setting. Specific areas for this study were the display and functionality of the patient's allergies and the user interaction required to add, delete, and modify allergies using the software.

METHOD
INTENDED USERS/PARTICIPANTS
The intended users for HELP Allergies are physicians, mid-levels, nurses, and Health Unit Coordinators (HUCs). A total of four (4) clinical analysts took part in the study. Participants were recruited from a pool of Intermountain Healthcare employed clinicians and were compensated for their time based on Intermountain Healthcare compensation policies. Participants were scheduled for 30-minute sessions at their Intermountain facilities.


Participants had a mix of clinical specialties and demographic characteristics. The following table lists each participant by characteristic. To ensure anonymity, participant names were replaced with Participant IDs.

Participant ID  Gender  Age Range  Profession          Clinical Experience  Work Environment  E-Health Record Experience  Product Experience
1               Female  18-34      RN                  0-5                  Clinic            Above Average               Average
2               Female  35-49      RN                  11-15                Both              Above Average               Above Average
3               Male    50+        RN                  0-5                  Both              Almost Expert               Above Average
4               Male    50+        Medical Technology  11-15                Both              Almost Expert               Above Average

STUDY DESIGN
Overall, the objective of this study was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this study may serve as a baseline for future studies using an updated version of HELP Allergies. In short, this study serves both as a means to record or benchmark current usability and to identify areas where improvements might be made. During the usability study, participants interacted with the same version of the HELP Allergies product and were provided with identical instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time and without assistance
• Time to complete each task
• Participant comments
• Participant satisfaction ratings measured by Likert Scale surveys

TASKS
Tasks were determined based on the following criteria: 1) prioritization based on fundamental and commonly performed activities; 2) tasks specified in the ONC's Test Procedure for the §170.314.xx HELP Allergies certification criterion. Tasks were developed and implemented to evaluate risk, expressed as reduced effectiveness and potential negative outcomes for patients (NIST IR 7742 3.3). Tasks were presented to participants in an order typical of their clinical workflow, but pass/fail evaluation included prioritization of risk, and risk was used as an absolute criterion for marking task completion as passed or failed. The study collected performance data from seven (7) tasks. A practice task was also included to allow participants to become familiar with the process of interacting with on-screen task instructions and taking an online post-task survey. Morae Version 3.3.2 5 was used to record, observe, and analyze study data.

Practice Task: Review the patient's Allergies list.
Task 1: Import the patient's medication allergies from HELP2.

5 Morae is produced by TechSmith Corporation. Morae Recorder, Observer, and Manager were used during the course of this study. See http://www.techsmith.com/morae.html for additional product information.

HELP/Tandem Allergies Module

Page 6 of 13

Task 2: The patient reports that they do not have an allergy to COMPAZINE (Prochlorperazine). Make this adjustment.
Task 3: Review the list of allergies that are no longer applicable for this patient.
Task 4: The patient reports that when she took Penicillin on January 15, 2013, she broke out in a rash. You have reviewed her medication record and confirmed that she indeed received Penicillin G on that date. Document her penicillin allergy.
Task 5: The patient reports that she has an allergy to Shrimp. It is anaphylaxis. Document this food allergy.
Task 6: The patient also mentions that she now recalls that in addition to the rash she had when taking Penicillin, she also began wheezing. Add this information to the Penicillin allergy information.

PROCEDURES
Upon arrival, participants were greeted and their identity was verified. Each participant reviewed and signed an informed consent and release form 6. The facilitator witnessed the participant's signature. A usability study moderator was assigned to set up the portable usability lab, administer instructions, and conduct post-study interviews. Morae software was used to administer surveys and tasks, record task times, log task successes and failures, register mouse clicks, and capture participant comments. For each task, participants were presented with on-screen instructions and were also given a paper copy of the task. Task timing began when the participant indicated they were ready to start by clicking the Start Task button displayed in an on-screen Task Instructions dialog box, and stopped when the participant indicated they had completed the task by clicking the End Task button in the same dialog. Following the last task, a post-test questionnaire known as the System Usability Scale (SUS) was administered electronically. Once the post-test questionnaire was completed, participants were asked to discuss their rationale for their answers, the things they liked and did not like about HELP Allergies, and any improvements they felt would be useful. Participants were then thanked for their time and invited to return to their normal activities. The Morae recordings captured each participant's image as a "picture-in-picture" along with live-action video of the software screen, mouse clicks, types of mouse clicks (left/right), and mouse movement.
Participant demographics, time on task, satisfaction surveys, comments, and post-test questionnaire responses were also recorded with Morae. All participant recordings were assembled into a single Morae project, allowing data analysis to be compiled and reviewed for all participants as a set, as well as by individual participant.

TEST ENVIRONMENT
For the convenience of test participants, all test sessions were conducted in conference rooms at the clinical analysts' work facility.

6 See Appendix 2 for more information about the Consent and Release form.


Studies were run on a portable computer system comprising Intermountain's current standard clinical software and operating system. This included a laptop computer running 64-bit Windows 7 connected to a standard-issue Dell keyboard and mouse, and a 22-inch 1680 x 1050 pixel Dell monitor. The system also included a Logitech web cam to record audio and video of the participants. Since all participants were Intermountain Healthcare employees, the Intermountain LAN provided access to the HELP Allergies architecture (services, storage, etc.). HELP Allergies was run from the Production environment and was used in conjunction with the HELP application. From a technical and practical point of view, system performance (response time, etc.) was representative of an authentic implementation experience.

TEST FORMS AND TOOLS
During the usability study, various documents and instruments were used, including:

1. Informed Consent
2. Online task prompts
3. Post-test Questionnaire, or System Usability Scale (SUS) 7

The participant's interaction with HELP Allergies was captured and recorded digitally. A web camera recorded the participant's facial expressions synced with the screen capture. Comments were recorded by the camera's microphone.

PARTICIPANT INSTRUCTIONS
The usability study moderator allowed time for the participant to review the Informed Consent document 8 and read the following highlighted areas of the Consent & Release document aloud to each participant.

… I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance.

I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say. I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation…

USABILITY METRICS
According to the NISTIR Guide to the Processes Approach for Improving the Usability of Electronic Health Records, electronic health record applications should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the study were to assess:

1. Effectiveness, by measuring participant success rates.

7 See Appendix 3 for more information about the System Usability Scale (SUS).


2. Efficiency, by measuring the average task time.
3. Satisfaction, by measuring ease of use ratings.

DATA SCORING
Morae can be configured to collect data about Time on Task (ToT), success and failure achievement, and the number and severity of errors. For this study, Time on Task and success and failure were captured. Satisfaction data captured through surveys was also configurable. The post-test System Usability Scale (SUS) questionnaire is an industry-standard set of ten (10) questions that captures the overall satisfaction rating for each participant. Morae also allows for the calculation of an overall SUS score for the entire set of participants.

DATA ANALYSIS AND REPORTING
The results of the usability study were calculated according to the methods noted in the previous Usability Metrics section, and the following types of data were collected for each participant:

• Number of tasks successfully completed within the allotted time and without assistance
• Number of tasks not successfully completed
• Time to complete each task
• Major findings and/or identified problems and opportunities based on issues that came up during the tasks
• Participant's verbalizations
• Participant's satisfaction ratings for the product's ease of use
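The report does not spell out how Morae computes the SUS score, but the questionnaire's standard scoring formula (Brooke's SUS) is well established. As a minimal sketch, assuming the standard formula applies here:

```python
def sus_score(responses):
    """Standard SUS scoring: for ten responses on a 1-5 scale,
    odd-numbered items contribute (response - 1), even-numbered items
    contribute (5 - response); the sum is multiplied by 2.5 to give
    a score on a 0-100 scale."""
    assert len(responses) == 10
    total = 0
    for item, r in enumerate(responses, start=1):
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# A neutral respondent (all 3s) lands exactly at the scale midpoint.
print(sus_score([3] * 10))  # 50.0
```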

PRODUCT EFFECTIVENESS RESULTS

Task 1: Import the Patient's Allergies from HELP2 – 75% Successful
Three (3) of the four (4) participants successfully completed this task.

Task 2: Discontinue an Allergy – 66% Successful
Due to a system crash, only three (3) participants were able to perform this task. Of these three (3), two (2) completed the task successfully.

Task 3: Review the list of no longer applicable allergies – 50% Successful
Two (2) of the four (4) participants successfully completed this task.

Task 4: Add a Medication Allergy – 100% Successful
Due to a system crash, only three (3) participants were able to perform this task. All three (3) completed it successfully.

Task 5: Add a Food Allergy – 100% Successful
Due to a system crash, only three (3) participants were able to perform this task. All three (3) completed it successfully.

Task 6: Add a Second Reaction – 30% Successful
Due to a system crash, only three (3) participants were able to perform this task. Of these three, only one (1) successfully completed the task.
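The success percentages above are simple completion ratios over the participants who were able to attempt each task (some reported figures appear to be rounded). A sketch of that arithmetic, using the counts stated in the results:

```python
def success_pct(completed, attempted):
    """Task success rate: percentage of attempting participants who
    completed the task within the allotted time and without assistance."""
    return 100.0 * completed / attempted

# Task 1: three of the four participants completed the import.
print(success_pct(3, 4))  # 75.0
# Task 3: two of four participants completed the review.
print(success_pct(2, 4))  # 50.0
```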


PRODUCT EFFICIENCY RESULTS
None of the participants were able to complete the tasks within the timeframe set by an expert user.

Task 1: Import the Patient's Allergies from HELP2 – 0% Efficiency
Task 2: Discontinue an Allergy – 66% Success, 0% Efficiency
Task 3: Review the list of no longer applicable allergies – 50% Success, 0% Efficiency
Task 4: Add a Medication Allergy – 100% Success, 0% Efficiency
Task 5: Add a Food Allergy – 100% Success, 0% Efficiency
Task 6: Add a Second Allergy Reaction – 30% Success, 0% Efficiency

ToT           Task 1   Task 2   Task 3   Task 4   Task 5   Task 6   Task 7
Expert (P0)     0.09     0.59     0.48     0.21     0.67     0.70     0.89
P0 (x 1.25)     0.11     0.74     0.60     0.26     0.84     0.88     1.11
P1             15.94   118.89   141.57    12.59    94.80    73.87    83.21
P2             14.29   201.76    49.84    14.04   168.11    77.28    69.66
P3             57.13   316.98   251.33     N/A      N/A      N/A      N/A
P4             66.35   414.29   208.48     9.14   168.68   216.56   142.62
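Per the report's footnote, a participant counts as efficient on a task only when their time on task falls within the expert's time multiplied by 1.25 (the "P0 (x 1.25)" row). A hedged sketch of that criterion, using the Task 1 column as an example:

```python
def efficiency_pct(times, expert_time, allowance=1.25):
    """Percentage of attempting participants whose time on task falls
    within the expert benchmark multiplied by the allowance factor.
    None marks a participant who could not attempt the task (N/A)."""
    threshold = expert_time * allowance
    attempted = [t for t in times if t is not None]
    within = sum(1 for t in attempted if t <= threshold)
    return 100.0 * within / len(attempted)

# Task 1: expert time 0.09, so the benchmark is about 0.11.
# No participant came close, hence the reported 0% efficiency.
print(efficiency_pct([15.94, 14.29, 57.13, 66.35], 0.09))  # 0.0
```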

Legend: Dark Gray represents tasks that were not successfully completed.

PRODUCT SATISFACTION RESULTS 9

The System Usability Scale (SUS) is a survey containing ten (10) questions. The average score for all ten questions was calculated for each participant. The overall score for the product was then taken as the average across the participants who completed the questionnaire. A SUS score above 68 is considered above average and anything below 68 is below average. 10 The HELP Allergies module received a score of 40.83, indicating a below-average level of satisfaction.

Participant     SUS Score
P1              45.00
P2              40.00
P3              N/A
P4              37.50
All (average)   40.83
Standard Dev.    3.82

9 A participant's task efficiency is measured against an expert user's time on the same task multiplied by 1.25.
10 Sauro, J. (2011). Measuring Usability with the System Usability Scale (SUS) ♦ http://www.measuringusability.com/sus.php
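The reported overall score and standard deviation follow directly from the three available participant scores (P3's score is N/A in the report). A quick check of that aggregation:

```python
import statistics

# SUS scores for the participants with recorded questionnaires
# (P3's score is N/A in the report, so only three values contribute).
scores = [45.00, 40.00, 37.50]

overall = statistics.mean(scores)
spread = statistics.stdev(scores)  # sample standard deviation

print(round(overall, 2))  # 40.83
print(round(spread, 2))   # 3.82
```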


MAJOR FINDINGS
In addition to the performance data collected, the following qualitative observations were made:

• While experienced users can speedily move through the Command Line Interface (CLI), novice users found the non-GUI interface challenging to use, since functionality is hidden or obscured deep inside nested menu sets.

Command Line Interface (CLI)
• Several layers into the Allergy Module menus, a message appeared that said "Press Any Key" (implying to continue); however, when the participant pressed a number of keys, there was no response. It was not until the participant pressed the Enter key that the screen changed.
• One participant made the following comment for improved interface efficiency: "If there is only one option, selecting it should automatically move you along and not require you to hit Enter again."
• The keystrokes to access the Pick from Thousands (PFT) list were not familiar to novice users, so a rapid way to obtain data went unnoticed.
• During the import of HELP2 allergies into HELP, a dialog displaying HELP2 Allergy Information appeared. The dialog had a recognizable bright red banner but required a keystroke to dismiss it and move on to the next screen. Novice users did not recognize this functionality, and while they believed they had completed their task, they were actually still trapped in an incomplete state and the data was unrecorded.
• "The UI is not user friendly and it is cumbersome in the way you have to get places. When you are not familiar with the system and until you learn the system it is not easy. When you are unfamiliar with the process it is slow to chart. It doesn't give you all the buttons so you know where to go next." (Participant 2)

Importing Allergies from HELP2
• Finding: The novice participants had difficulty locating the menu to import allergies from the outpatient product, HELP2.
• Finding: During the HELP2 allergy list import, the message "No Ingredient has a HIC Code. Allergy will not be stored" appeared for all four (4) participants.
• P2 Quote: "It was nice that it [HELP] pulled in the allergies from the outpatient program. This is a huge benefit."

Adding an Allergy
• The patient was allergic to Shrimp; however, Shrimp was not available as an input option, so a cognitive step was required to enter the correct allergy type, Shellfish.
• Adding multiple reactions was troublesome. There was no visible display for a second or third line entry, but the option was actually available. This caused significant confusion about how to enter multiple reactions.
• Novice users did not understand the Pick from Thousands (PFT) lists; thus many did not choose a reason for marking an allergy as discontinued.


APPENDICES The following appendices include supplemental data for this usability test report. Following is a list of the appendices provided:

Appendix 1: Demographic Survey Appendix 2: System Usability Scale (SUS) Questionnaire Appendix 3: Informed Consent and Release Form


Appendix 1: DEMOGRAPHIC SURVEY

Thank you for participating in today's study. To help us understand your area of expertise, please take a few minutes to provide answers to the following questions.

1. What is your age group?
( ) 18-34  ( ) 35-49  ( ) 50+

2. What is your gender?
( ) Female  ( ) Male

3. What is your current clinical specialty?
( ) MD  ( ) Medical Assistant  ( ) Nurse Practitioner  ( ) Physician's Assistant  ( ) Pharmacist  ( ) RN  ( ) Other
Additional Comments:

4. How long have you worked in your area of clinical expertise?
( ) 0-5 years  ( ) 6-10 years  ( ) 11-15 years  ( ) Over 15 years
Additional Comments:

5. How would you rate your electronic health record computer expertise?
Novice 1 2 3 4 5 Expert
Additional Comments:

6. How familiar are you with the HELP Allergies module?
Novice 1 2 3 4 5 Expert
Additional Comments:

7. Which best applies to your current work environment?
( ) Clinic-Outpatient  ( ) Inpatient  ( ) Both
Additional Comments:


Appendix 2: INFORMED CONSENT & RELEASE FORM

For valued contributions received, the recognition and sufficiency of which are hereby acknowledged, I ___________________________________________________ (name of test, interview or research participant) have voluntarily agreed to participate in User Centered sessions, assessments or interviews conducted and/or recorded by: _________________________________________ (name of person conducting tests, interviews or recordings)

I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance.

I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say.

I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation.

Although the sessions, assessments or interviews are not designed to create stress or discomfort, I understand that I may experience some stress or discomfort and that I may withdraw at any time without penalty.

I understand the software designs I view may be confidential and I will not discuss them with anyone outside of this study group. I also understand that these designs may be experimental in nature and may or may not be implemented in any actual products.

I have read the above release and consent, prior to its execution; I fully understand the contents and consequences thereof. This agreement shall be binding upon me and my heirs, legal representatives and assigns.

 YES, I have read the above statement and agree to be a participant.

__________________________________________________________ (Signature of Participant)

Date: ________________________


Appendix 3: SYSTEM USABILITY SCALE QUESTIONNAIRE

Participants rated each statement on a five-point scale, from 1 (Strongly Disagree) to 5 (Strongly Agree):

1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.


Usability Study Report

Clinical Decision Support §170.314(a)(8)

Product(s) & Version: Patient Advisory v. 1.17 and CDS via Help2 Clinical Desktop v. 2014-M05.15
Safety Enhanced Design Requirements Addressed or Referenced Test Procedures 1: §170.314(a)(8) Clinical Decision Support, Ambulatory and Inpatient Setting
Dates of Usability Test: April 23-28, 2014
Date of Report: May 9, 2014
Report Prepared By: Carl Bechtold, Intermountain Healthcare
Phone Number: (801) 507-9168
Email address: [email protected]

1 The Office of the National Coordinator for Health Information Technology ♦ 2014 Edition: Test Script for §170.314(a)(8) Clinical Decision Support ♦ CCHIT Approved Test Procedure Version 1.2 ♦ Jan. 2, 2013

Table of Contents

EXECUTIVE SUMMARY ............................................... 3
PERFORMANCE RESULTS ............................................. 3
MAJOR FINDINGS .................................................. 3
METHOD .......................................................... 4
INTENDED USERS/PARTICIPANTS .................................... 4
STUDY DESIGN .................................................... 4
TASKS ........................................................... 5
PROCEDURES ...................................................... 7
TEST ENVIRONMENT ................................................ 7
TEST FORMS AND TOOLS ............................................ 8
PARTICIPANT INSTRUCTIONS ........................................ 8
USABILITY METRICS ............................................... 8
DATA SCORING .................................................... 8
DATA ANALYSIS AND REPORTING ..................................... 9
PRODUCT EFFECTIVENESS RESULTS ................................... 9
PRODUCT EFFICIENCY RESULTS ...................................... 9
PRODUCT SATISFACTION RESULTS .................................... 10
MAJOR FINDINGS .................................................. 10
IDENTIFIED PROBLEMS & OPPORTUNITIES FOR IMPROVEMENT ............. 10
APPENDICES ...................................................... 12
Appendix 1: DEMOGRAPHIC SURVEY .................................. 13
Appendix 2: INFORMED CONSENT & RELEASE FORM ..................... 14
Appendix 3: SYSTEM USABILITY SCALE QUESTIONNAIRE ................ 15

EXECUTIVE SUMMARY
A summative usability study was completed by Intermountain Healthcare's Clinical Information Systems Business Analyst Team for the Patient Advisory v. 1.17 and CDS via Help2 Clinical Desktop v. 2014-M05.15 product (hereafter referred to as Patient Advisory and Help2). Study sessions were conducted from April 23-28, 2014 at Intermountain Medical Center. The purpose of the study was to further evaluate the ease of use of the current user interface and, additionally, to provide evidence of usability for Clinical Decision Support within the Help2 product. During the usability study, five healthcare providers served as participants and used the Help2 modules in simulated but representative tasks. The study collected performance data from two similar tasks typically done within an episode of care, with particular emphasis on transfer of care. Morae Version 3.3.3 2 was used to record, observe, and analyze study data. Participants also completed the study for 170.314(b)(3) Electronic Prescribing during their scheduled appointments, to accommodate the brevity of the e-Prescribing tasks.

During the one-hour scheduled appointments, each participant was greeted by the Usability Study Moderator, asked to review and sign an informed consent and release form 3, and told that they could withdraw from the studies at any time. The Usability Study Moderator introduced the study and instructed participants to complete a series of tasks. As part of the study protocol, the Usability Study Moderator did not assist the participant unless task completion was necessary to begin a subsequent task. If assistance was required, the task results are noted as having been "failed." Participant identification has been removed. Study data can be linked, in some cases, to participants' demographics but not to participants' names.
After the final task of the study, participants were asked to complete a System Usability Scale (SUS) questionnaire and were compensated for their time according to Intermountain Healthcare payment guidelines. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records (NISTIR 7741) were used to evaluate the effectiveness, efficiency, and satisfaction of Patient Advisory and Help2. PERFORMANCE RESULTS • For purposes of this study, the software proved 100 percent effective relative to the Clinical Decision Support tasks and criteria. Both effectiveness and efficiency are discussed in greater detail below. MAJOR FINDINGS In addition to the performance data collected, the following qualitative observations were made: •

Patient Advisory was well accepted as a concept, although three of the participants remained undecided about where use of the tool might fit in their workflow.

2 Morae usability test suite, v. 3.3.3, produced by TechSmith Corporation. Morae Recorder, Observer, and Manager were used during the course of this study. See http://www.techsmith.com/morae.html for additional product information.

3 See Appendix 2 for more information about the Informed Consent and Release form.

May 8, 2014

Summative Usability Testing Results, Clinical Decision Support, Patient Advisory via Help2

Page 3 of 13





Participants also expressed an interest in having alerts become active in relevant modules within Help2. For example, an alert discussing creatinine might display in the medications module since the alert was tied to an active Metformin order. The presence of notifications is denoted by a blue icon at the top of Help2’s left navigation menu. It is not intrusive, and thus not easily discoverable without training.

Other problems and areas for possible improvement are discussed below.

METHOD

INTENDED USERS/PARTICIPANTS
The intended users for Help2 are licensed providers and their legal proxies. Five clinicians took part in the study. Participants were recruited from a pool of Intermountain Healthcare employees. These individuals were compensated for their time based on Intermountain Healthcare compensation policies. Participants had a mix of clinical specialties and demographic characteristics. The following table describes each participant. To ensure anonymity, participant names are replaced with Participant IDs. A Technical Analyst highly skilled with the cPOE also completed the test scenario to provide benchmark metrics.

Subject        Age    Gender  Years     EHR  Help2 Exp.  Ambulatory/Acute care
Benchmark      50+    M       6 to 10   4    5           Ambulatory
Participant 1  50+    M       >15       5    3           Both
Participant 2  50+    M       >15       4    3           Ambulatory
Participant 3  35-49  M       0 to 5    3    3           Both
Participant 4  35-49  F       11 to 15  4    3           Ambulatory
Participant 5  35-49  F       11 to 15  5    4           Both

Participants were scheduled for one-hour sessions at Intermountain Medical Center.

STUDY DESIGN
Overall, the objective of this study was to uncover areas where the application performed well, that is, effectively, efficiently, and with satisfaction, and areas where the application failed to meet the needs of the participants. The data from this study may serve as a baseline for future studies using an updated version of the Patient Advisory product. In short, this study serves both as a means to record or benchmark current usability and as a way to identify areas where improvements might be made. During the usability study, participants interacted with the same versions of Help2 and all of its relevant modules, and were provided with identical instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

• Number of tasks successfully completed within the allotted time and without assistance
• Time to complete each task
• Participant comments
• Participant satisfaction ratings measured by Likert Scale surveys

TASKS
Tasks were determined based on the following criteria: 1) prioritized tasks based on fundamental and commonly performed activities; 2) tasks specified in the ONC's Test Procedure for the § 170.314(a)(8) Clinical Decision Support certification criteria. Task prioritization included consideration of severity of risk, as required in NISTIR 7741. The study collected performance data from 12 tasks. A practice task was also included to allow participants to become familiar with the process of interacting with on-screen task instructions and taking an online post-task survey. Morae Version 3.3.3 was used to record, observe, and analyze study data.

Tasks for this study are listed below. Inset notations in italics indicate 170.314 (a) (8) criteria relevant to the task. These criteria also support the clinical scenario workflow and the investigation of the general usability of the products under consideration.


Task 1: Select the patient "XTESTUS, UCD-G".
(Facilitates Transfer of Care or Continuation of Care)

Task 2: View this patient's Problems.
(Provides access to diagnostic and therapeutic references)

Task 3: This patient has Supranuclear Palsy. Find more information about this condition.
(Tests access to diagnostic and therapeutic references)

Task 4: A light blue icon has appeared in the left navigation column of Help2. What is it? What does it tell you about the patient?
(Facilitates first discovery of the Patient Advisory CDS affordance)

Task 5: Can you tell where this advice came from?
(Displays information required in 170.314(a)(8))

Task 6: Find more information about the patient's Creatinine Level. Find information about the test itself, including ranges and diagnostic information (about the test, rather than this set of results).
(CDS combined alert from two or more sources (Labs and Medications); discovery of diagnostic and therapeutic references relevant to the CDS notification)

Task 7: Change Patient: Patient XTESTUS, UCD-H has returned from her winter residence. Intermountain has received information about her care while she was away. Select this patient.
(Transfer of care; the incoming care record has triggered CDS messages)

Task 8: View Problems: What problems does this patient have?

Task 9: View Advisories: What Patient Advisory notices does this patient have? Show what actions you would take based on this information.
(Demonstrates actions triggered by CDS in Medications, Vitals, Problems, Allergies, etc.)

Task 10: e-Prescribe through the North Ogden Walgreens: Jantoven 2 mg oral tablet, QD, 30 tablets, no refills. If an alert appears, find the source of the information.
(Drug-Drug alert exposes the source/authority of the support statement)

Task 11: In the patient's medication list, what does the "N" icon mean?
(A non-managed medication is the source of a CDS message in Patient Advisories)
Replace Medication: Discontinue Cough Control DM and replace with e-Prescription: Dextromethorphan-Guaifenesin 10 mg-200 mg capsule.
(Triggers a dual alert from two issues, Medication Allergy and Drug-Drug Interaction; tests the ability to determine the source/authority of the alert)

Task 12: Find more information about Marplan.
(Discovery of diagnostic and therapeutic references)


PROCEDURES
Upon arrival, participants were greeted and their identity was verified. Each participant reviewed and signed an informed consent and release form 4. The facilitator witnessed the participant's signature. A usability study moderator was assigned to set up the portable usability lab, administer instructions, and conduct post-study interviews. Morae software was used to administer surveys and tasks, record task times, log task successes and failures, register mouse clicks, and capture participant comments. A study recorder/observer assisted in marking key areas of the session recording to be reviewed after the study, and captured comments made by stakeholders.

For each task, the participants were presented with on-screen instructions. Task timing began when the participant indicated they were ready to start the task by clicking the Start Task button displayed in an on-screen Task Instructions dialog box. Task timing stopped when the participant indicated they had successfully completed the task by clicking the End Task button displayed in the Task Instructions dialog.

Following the last task, a post-test questionnaire known as the System Usability Scale (SUS) was administered electronically. Once the post-test questionnaire was completed, participants were asked to discuss their rationale for answering the post-test questions and were also requested to discuss the things they liked and did not like about the modules under consideration. Participants were also invited to suggest improvements they felt would be useful. Participants were then thanked for their time and invited to return to their normal activities.

The Morae recordings captured each participant's image as a "picture-in-picture" along with live-action video of the software screen, mouse clicks, types of mouse clicks (left/right), and mouse movement. Participant demographics, time on tasks, satisfaction surveys, comments, and post-test questionnaire responses were also recorded with Morae. A compilation of all participant recordings was assembled into a Morae project, allowing data analysis to be compiled and reviewed for all five participants as a set, as well as by individual participant.

TEST ENVIRONMENT
Help2 is used in a variety of clinical settings, ranging from portable or countertop computers in Emergency Departments to private physician offices or cubicle spaces. To ensure more consistent metrics for these summative sessions, the studies were conducted in a small conference room where participants were not likely to be interrupted. Studies were run on a portable computer system comprising Intermountain's current standard clinical software and operating system. This included a laptop computer running 64-bit Windows 7 connected to a standard-issue Microsoft keyboard and mouse, and a 22-inch 1680 x 1050 pixel Dell monitor. The system also included a Logitech web cam to record audio and video of the participants. Since all participants are part of the Intermountain Healthcare Corporation, the Intermountain LAN provided access to Help2 and its architecture (services, storage, etc.).

Help2 was run from the Test-Verification environment. Patient Advisory and all other modules for this study were used in conjunction with the Help2 desktop application. From a technical and practical point of view, system performance, response time, etc., were representative of an authentic implementation experience.

4 See Appendix 2 for more information about the Consent and Release form.


TEST FORMS AND TOOLS
During the usability study, various documents and instruments were used, including:
1. Informed Consent
2. Online task prompts
3. Post-test questionnaire, the System Usability Scale (SUS) 5

The participant's interaction with the software under consideration was captured and recorded digitally. A web camera recorded the participant's facial expressions, synced with the screen capture. Comments were recorded by the camera's microphone.

PARTICIPANT INSTRUCTIONS
The usability study moderator allowed time for the participant to review the Informed Consent document. The usability study moderator also read the following highlighted areas of the Consent & Release document aloud to each participant:

… I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance. I understand that I will be recorded with both video and voice recording equipment during this study and the equipment records all of my keyboard and mouse actions, as well as what I say. I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation…

USABILITY METRICS
According to the NISTIR Guide to the Processes Approach for Improving the Usability of Electronic Health Records, electronic health record applications should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the study were to assess:
1. Effectiveness, by measuring participant success rates.
2. Efficiency, by measuring the average task time.
3. Satisfaction, by measuring ease of use ratings.

DATA SCORING
Morae can be configured to collect data about Time on Task (ToT), success and failure, and the number and severity of errors. For this study, Time on Task and success/failure were captured. Satisfaction data captured through surveys was also configurable. The post-test System Usability Scale (SUS) questionnaire is an industry-standard set of 10 questions that captures the overall satisfaction rating for each participant. Morae also allows for the calculation of an overall SUS score for the entire set of participants.

5 See Appendix 3 for more information about the System Usability Scale (SUS).

DATA ANALYSIS AND REPORTING
The results of the usability study were calculated according to the methods noted in the previous Usability Metrics section, and the following types of data were collected for each participant:

• Number of tasks successfully completed, within the allotted time and without assistance
• Number of tasks not successfully completed
• Time to complete each task
• Major findings and/or identified problems and opportunities based on issues that came up during the tasks
• Participant verbalizations
• Participant satisfaction ratings for the product's ease of use

PRODUCT EFFECTIVENESS RESULTS
All participants were able to complete all of the tasks relevant to Decision Support activities except one: all participants failed to find the source of information indicating the basis of drug allergy and drug-drug interaction alerts. The affordance for that discovery is a "secret handshake," an info tip exposed by hovering over a label. (See Opportunities for Improvement below.) Participants also had difficulty with e-Prescription culminating gestures. Those, however, were extensively covered in a separate report of tasks completed in the same sessions as those used for this report 7 and are not documented here.

PRODUCT EFFICIENCY RESULTS
The study's Time on Task (ToT) goal was for average participant times to come within about 125 percent of the time of a benchmark "super user." In this case, the total mean time for all participants was 19.16 minutes compared to the benchmark total of 5.53 minutes, or roughly 346 percent. Given that some of the functionality was new, and that the participants were often verbose during the sessions, this metric appears exaggerated.

Time on Task by task (minutes):

Task           1      2      3      4      5      6      7      8      9      10     11     12
Benchmark      0.08   0.24   0.27   0.23   0.12   0.52   0.15   0.25   0.73   1.62   1.08   0.24
Participant 1  0.32   0.15   0.16   0.59   0.57   1.04   0.30   0.38   0.93   2.97   3.79   0.38
Participant 2  0.60   0.29   1.37   4.30   1.06   1.65   0.73   0.54   0.94   5.20   5.92   0.72
Participant 3  0.42   0.26   0.39   1.27   0.74   0.98   0.50   0.26   3.05   4.80   2.61   2.58
Participant 4  0.48   0.19   2.28   3.94   0.55   2.81   3.82   3.40   5.31   1.16   2.19   2.26
Participant 5  0.36   0.52   0.73   0.79   0.43   0.45   0.25   0.31   1.96   5.49   2.80   0.60
Minimum        0.32   0.15   0.16   0.59   0.43   0.45   0.25   0.26   0.93   1.16   2.19   0.38
Maximum        0.60   0.52   2.28   4.30   1.06   2.81   3.82   3.40   5.31   5.49   5.92   2.58
Mean           0.44   0.28   0.98   2.18   0.67   1.38   1.12   0.98   2.44   3.92   3.46   1.31
Stnd. Dev.     0.11   0.15   0.85   1.79   0.24   0.90   1.52   1.36   1.83   1.83   1.49   1.03
Efficiency %   18.2   85.7   27.6   10.6   17.9   37.7   13.4   25.5   29.9   41.3   31.2   18.3

7 Bechtold, Carl. (2014). Usability Study Report, Electronic Prescribing § 170.314(b)(3). Available from the author.
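The table's derived rows can be checked mechanically. The following sketch (illustrative only, not part of the study's own tooling) reproduces the Task 1 mean, sample standard deviation, and efficiency percentage, where efficiency is the benchmark time expressed as a percentage of the rounded participant mean:

```python
import statistics

# Task 1 times in minutes, from the Time-on-Task table above
benchmark = 0.08
times = [0.32, 0.60, 0.42, 0.48, 0.36]  # Participants 1 through 5

mean = round(statistics.mean(times), 2)        # 0.44
sd = round(statistics.stdev(times), 2)         # sample std. dev., 0.11
efficiency = round(benchmark / mean * 100, 1)  # benchmark as % of mean, 18.2

print(mean, sd, efficiency)

# The overall figure: total participant mean vs. benchmark total,
# 19.16 / 5.53 is roughly 346 percent, as stated in the text
print(round(19.16 / 5.53 * 100))  # 346
```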

PRODUCT SATISFACTION RESULTS
The System Usability Scale (SUS) is a survey containing ten (10) questions. The score was calculated for each participant from the ten responses, and the overall score for the product was then taken from the average of the individual scores. A SUS score above 68 is considered above average, and anything below 68 is below average. 8 The Help2 products evaluated in this study received a satisfaction score of 77.08.

Satisfaction Scores   Score
Benchmark             75
Participant 1         85
Participant 2         67.5
Participant 3         92.5
Participant 4         72.5
Participant 5         70
Minimum               67.5
Maximum               92.5
Mean                  77.08
Standard Dev.         9.67
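For reference, the SUS values above follow the standard Brooke scoring method: each odd-numbered item contributes (response - 1), each even-numbered item contributes (5 - response), and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch (the all-neutral response set is a hypothetical example, not study data):

```python
import statistics

def sus_score(responses):
    """Score a 10-item SUS questionnaire (responses are 1-5).
    Odd items contribute (r - 1); even items contribute (5 - r);
    the total is scaled by 2.5 to give a 0-100 score."""
    assert len(responses) == 10
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# A hypothetical all-neutral questionnaire (every answer 3) scores 50
print(sus_score([3] * 10))  # 50.0

# Scores reported in the table above (benchmark plus five participants)
reported = [75, 85, 67.5, 92.5, 72.5, 70]
print(round(statistics.mean(reported), 2))   # 77.08
print(round(statistics.stdev(reported), 2))  # 9.67
```

Note that the reported mean and standard deviation are reproduced from the six scores in the table, including the benchmark.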

MAJOR FINDINGS
As stated in the Executive Summary, the following are the major findings for this study:

• Patient Advisory was well accepted as a concept, although three of the participants remained undecided about where use of the tool might fit in their workflow.

• Participants also expressed an interest in having alerts become active in relevant modules within Help2. For example, an alert discussing creatinine might display in the medications module, since the alert was tied to an active Metformin order. The presence of notifications is denoted by a blue icon at the top of Help2's left navigation menu. It is not intrusive, and thus not easily discoverable without training.

IDENTIFIED PROBLEMS & OPPORTUNITIES FOR IMPROVEMENT
As stated in the Executive Summary, the following bulleted items are areas of improvement identified in this study:

• The presence of Advisories is indicated only by an icon in the Help2 left navigation panel. In early pre-session trials of the scenario it was not discovered by experienced Help2 users. As a result, a task was added to point the users' attention to it and to activate it. The following are related issues expressed by the participants during the studies and retrospective discussions:

8 Sauro, J. (2011). Measuring Usability with the System Usability Scale (SUS). http://www.measuringusability.com/sus.php


  o Other Help2 alerts scroll across the screen, often in red type. The user must "go find" these advisories, even though some are critical.
  o The Advisories icon has no hover state. Some users thought it was decorative or that they could only hover over it and read the tool tip.
  o The Advisories icon also has no label, though it might have one, given the display of other icons in Help2.

• Participants clicked on items in the advisory rows, expecting to be taken to relevant modules. (For example, clicking on Creatinine or Metformin might take them to the Labs or Medications modules.)

• The Info button in Patient Advisories provides information unlike that found in other modules. In other Help2 modules, the button produces diagnostic or therapeutic information. In retrospectives, all participants questioned the need for the information displayed, and some suggested that it be provided by a link titled "About," or in the "Help" link to User Assistance, as it is in other Help2 modules.

• The source of information in Drug-Allergy alerts was not discoverable by any of the participants. It is activated as an info tip over a label. That said, none of the participants thought the information revealed was helpful, especially at the moment of decision when the alert window was open. (They could not think of any reason why they would want that information at that moment in the workflow.)

• Two participants suggested that a report of advisories for all of their patients might be useful.


APPENDICES
The following appendices include supplemental data for this usability test report:

Appendix 1: Demographic Survey
Appendix 2: Informed Consent and Release Form
Appendix 3: System Usability Scale (SUS) Questionnaire


Appendix 1: DEMOGRAPHIC SURVEY

Demographic Survey

Thank you for participating in today's study. To help us understand your area of expertise, please take a few minutes to provide answers to the following questions.

1. What is your age group?
( ) 18-34  ( ) 35-49  ( ) 50+

2. What is your gender?
( ) Female  ( ) Male

3. What is your current clinical specialty?
( ) MD  ( ) Medical Assistant  ( ) Nurse Practitioner  ( ) Physician's Assistant  ( ) Pharmacist  ( ) RN  ( ) Other
Additional Comments:

4. How long have you worked in your area of clinical expertise?
( ) 0-5 years  ( ) 6-10 years  ( ) 11-15 years  ( ) Over 15 years
Additional Comments:

5. How would you rate your electronic health record computer expertise?
Novice  1  2  3  4  5  Expert
Additional Comments:

6. How familiar are you with Product Name?
Novice  1  2  3  4  5  Expert
Additional Comments:

7. Which best applies to your current work environment?
( ) Clinic-Outpatient  ( ) Inpatient  ( ) Both
Additional Comments:

Appendix 2: INFORMED CONSENT & RELEASE FORM

For valued contributions received, the recognition and sufficiency of which are hereby acknowledged, I ___________________________________________________ (name of test, interview or research participant) have voluntarily agreed to participate in User Centered sessions, assessments or interviews conducted and/or recorded by: _________________________________________ (name of person conducting tests, interviews or recordings)

I understand these sessions are examinations of software or processes and not studies of my abilities or knowledge. I understand that my participation in these sessions, assessments or interviews does not in any way, positively or negatively, affect my employment or position within my organization, and will not be used in any way to evaluate my abilities or performance.

I understand that I will be recorded with both video and voice recording equipment during this study and that the equipment records all of my keyboard and mouse actions, as well as what I say.

I understand that I may withdraw from this session, assessment or interview at any time and request that the results and/or recordings not be used. I also understand that withdrawing will not affect any consideration given for participation and will not be used in any way to affect my employment or position within my organization or my annual corporate performance evaluation.

Although the sessions, assessments or interviews are not designed to create stress or discomfort, I understand that I may experience some stress or discomfort and that I may withdraw at any time without penalty.

I understand the software designs I view may be confidential and I will not discuss them with anyone outside of this study group. I also understand that these designs may be experimental in nature and may or may not be implemented in any actual products.

I have read the above release and consent prior to its execution; I fully understand the contents and consequences thereof. This agreement shall be binding upon me and my heirs, legal representatives and assigns.

YES, I have read the above statement and agree to be a participant.

____________________________________________ (Signature of Participant)

_______________________ (Date)

_____________________________________________ (Usability Study Topic/Modules)


Appendix 3:

SYSTEM USABILITY SCALE QUESTIONNAIRE

Each statement below is rated on a five-point scale from 1 (Strongly Disagree) to 5 (Strongly Agree).

1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.

