“Think aloud” and “Near live” usability testing of two complex clinical decision support tools

Safiya Richardson, Rebecca Mishuris, Alexander O'Connell, David Feldstein, Rachel Hess, Paul Smith, Lauren McCullagh, Thomas McGinn, Devin Mann

Research output: Contribution to journal › Article › peer-review

67 Scopus citations

Abstract

Objectives: Low provider adoption continues to be a significant barrier to realizing the potential of clinical decision support. “Think Aloud” and “Near Live” usability testing were conducted on two clinical decision support tools. Each was composed of an alert, a clinical prediction rule estimating the risk of either group A Streptococcus pharyngitis or pneumonia, and an automatic order set based on risk. The objective of this study was to further our understanding of the facilitators of usability and to evaluate the types of additional information gained from proceeding to “Near Live” testing after completing “Think Aloud” testing.

Methods: This was a qualitative observational study conducted at a large academic health care system with 12 primary care providers. During “Think Aloud” testing, participants were given written clinical scenarios and asked to verbalize their thought process while interacting with the tool. During “Near Live” testing, participants interacted with a mock patient. Morae usability software was used to record full-screen capture and audio during every session. Participant comments were placed into coding categories and analyzed for generalizable themes, and themes were compared across usability methods.

Results: “Think Aloud” and “Near Live” usability testing generated similar themes under the coding categories visibility, workflow, content, understandability, and navigation. However, they generated significantly different themes under the coding categories usability, practical usefulness, and medical usefulness. During both types of testing, participants found the tool easier to use when important text was distinct in its appearance, alerts were passive and appropriately timed, content was up to date, language was clear and simple, and each component of the tool included obvious indicators of next steps. Participant comments reflected higher expectations for usability and usefulness during “Near Live” testing. For example, visit aids such as automatically generated order sets were felt to be less useful during “Near Live” testing because they would not be all-inclusive for the visit.

Conclusions: These complementary types of usability testing generated unique and generalizable insights. Feedback during “Think Aloud” testing primarily helped to improve the tools’ ease of use. The additional feedback from “Near Live” testing, which mimics a real clinical encounter, was helpful for eliciting key barriers and facilitators to provider workflow and adoption.

Original language: English
Pages (from-to): 1-8
Number of pages: 8
Journal: International Journal of Medical Informatics
Volume: 106
DOIs
State: Published - Oct 2017
Externally published: Yes

Keywords

  • Clinical decision support
  • Health informatics
  • Provider adoption
  • Usability
  • Usability testing
  • User experience
  • Workflow

