
Fewer alerts, higher response rates, safer patients

  • Steven Shaha, Ph.D., DBA
  • 09/01/2015

Applications of electronic health record (EHR) technology and capabilities often do NOT consider the needs of pediatric settings. As healthcare clinicians know, pediatric patients are not just small adults. Children have unique needs and require different interventions than their adult counterparts, so many adult-ready HIT solutions do not fit comfortably with pediatric patients or their caregivers.

One of our clients, a pediatric specialty hospital, had been successfully using the EHR with full computerized physician order entry (CPOE) for three years. As part of its ongoing emphasis on continuous improvements, the organization programmed and adapted the EHR to better manage dosing-related computations. Concurrently, the organization was focused on reducing alerts due to alert fatigue issues clinicians were experiencing.

Would clinicians value computer-generated recommendations for dosing?

The guiding imperative was simple: clinicians value, and therefore act on, computer-generated dosing recommendations they own. Without a programmable EHR, by contrast, pediatric caregivers are forced to rely either on old-school pocket notes and manual computations to determine best dosing, or on black-box-like, inflexible requirements imposed by the EHR from elsewhere, with no capacity for local adaptation or innovation.

This facility had the benefit of a programmable EHR, Allscripts Sunrise™. The clinicians had their EHR locally programmed to calculate doses for each child invisibly in the background, drawing on clinical documentation such as age, weight, and other drugs on board. Dosing recommendations for prioritized drugs were built into the EHR as genuine, real-time clinical decision support, appearing within informed, intelligent order sets for prescribers. Clinicians then use the EHR’s real-time, ad-hoc analytics and reporting capabilities to monitor efficacy and guide continuous improvement.
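To make the idea concrete, here is a minimal sketch of the kind of background, weight-based dose calculation and limit check described above. It is purely illustrative: the drug names, mg/kg values, and caps are invented placeholders, not the facility’s actual Sunrise order-set logic.

# Hypothetical sketch of weight-based pediatric dose recommendation and checking.
# Drug names, mg/kg limits, and absolute caps are illustrative placeholders,
# not the facility's actual Sunrise order-set rules.

DOSING_RULES = {
    # drug: (recommended mg/kg/dose, maximum mg/kg/dose, absolute max mg/dose)
    "acetaminophen": (15.0, 15.0, 1000.0),
    "ibuprofen": (10.0, 10.0, 600.0),
}

def recommend_dose(drug, weight_kg):
    """Return a recommended single dose in mg, capped at the absolute maximum."""
    rec_per_kg, _, abs_max = DOSING_RULES[drug]
    return round(min(rec_per_kg * weight_kg, abs_max), 1)

def check_order(drug, weight_kg, ordered_dose_mg):
    """Flag an ordered dose that exceeds the weight-based or absolute maximum."""
    _, max_per_kg, abs_max = DOSING_RULES[drug]
    limit = min(max_per_kg * weight_kg, abs_max)
    if ordered_dose_mg > limit:
        return f"ALERT: {ordered_dose_mg} mg exceeds the {limit:.1f} mg limit for a {weight_kg} kg patient"
    return "OK"

if __name__ == "__main__":
    print(recommend_dose("acetaminophen", 18.0))    # 270.0 mg for an 18 kg child
    print(check_order("acetaminophen", 18.0, 500))  # exceeds the limit, so an alert fires

In the real system, of course, such checks run inside the EHR’s order sets rather than as standalone code, and the rules were defined by the facility’s own clinicians.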

Again, the imperative was for the EHR to generate dosing recommendations for improving clinician decisions and their impact on pediatric patients. To do so, the organization engaged local clinicians in defining best practices to improve responsiveness to alerts and adherence to recommendations.

Analyses revealed statistically significant and favorable results for the locally-defined computer-generated dosing recommendations for three metrics:

31% fewer alerts for questionable doses

The alert rate for pediatric orders placed through the computer-calculated order sets was 31.2% lower (p<0.001) than for orders with CPOE alone; prescribers with computer-assisted dosing simply triggered fewer questionable-dose alerts. The alert rate was already under 10% thanks to previous CPOE- and EHR-based alert-reduction improvements, so this reduction, while statistically significant, was not as large as it might have been under more typical circumstances.

Prescribers twice as likely to respond to alerts

While alerts were less frequent, prescribers were 122.8% more likely (p<0.001) to respond to the alerts they received than before, more than double the likelihood of responding prior to the implementation. Prescribers were more attentive to alerts, perceiving them as valuable and relevant even when they reflected computationally based recommendations. This response rate confirmed that the former black-box-like alerts had been viewed as nuisance alerts and sources of “alert fatigue,” and thus drew lower responsiveness.

59% fewer medication-related incidents

Finally, and most importantly, the percentage of orders that led to medication-related errors or incidents fell by 59.2% (p<0.001) compared with the period before the new alert-based approach.

With guidance and decision-making assistance through the EHR, prescribers received fewer alerts, were more likely to respond to and act on those alerts, and exposed their patients to significantly fewer errors and incidents. Bottom line: when the EHR computes the dosing, with local definitions and engagement, the results are more precise and lead to significantly and substantially fewer errors.

Only four of every 10 errors that would previously have occurred now reached a patient. The organization improved quality of care through the power of the programmable EHR. And that programming reflected locally specified parameters, not those dictated by a distant HIT vendor or by a legacy model closed to innovation and improvement.
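For readers who like to see the arithmetic, here is a back-of-the-envelope sketch of how relative-change figures like these are computed. The before and after rates below are invented placeholders, chosen only so the results land near the magnitudes reported above; the point is the formula, not the numbers.

# Illustration of the percent-change arithmetic behind the reported metrics.
# The before/after rates are made-up placeholders, not the study's actual data.

def relative_change(before, after):
    """Percent change from a baseline rate to a post-implementation rate."""
    return (after - before) / before * 100

# Hypothetical: an alert rate falling from 9.0% of orders to 6.2%
print(f"Alert rate change: {relative_change(0.090, 0.062):+.1f}%")    # about -31%

# Hypothetical: an alert response rate rising from 35% to 78%
print(f"Response rate change: {relative_change(0.35, 0.78):+.1f}%")   # about +123%

# A 59.2% drop in incidents leaves 100 - 59.2 = 40.8% of the prior level,
# i.e. roughly four of every 10 previous errors still reach a patient.
print(f"Incidents remaining: {100 - 59.2:.1f}% of the prior level")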

It remains curious and mysterious to me that so many healthcare leaders are willing to hold up clinicians’ blind adherence to inflexible, EHR-vendor-determined standards of care as a measure of quality or excellence. Programmable and adaptable EHRs, ready for innovation and progress in care models and delivery, seem a more desirable ideal.

For Sunrise clients interested in learning about more of our clients’ best outcomes, visit our Client Outcomes Collaboration Program on ClientConnect.

Editor’s Note: Learn more about how Sunrise helped Phoenix Children’s Hospital with dose range checking in a recent blog post: Kid-sized doses in an adult-sized world


About the author

Steven H. Shaha, Ph.D., DBA, is a Professor at the Center for Policy & Public Administration, and the Principal Outcomes Consultant for Allscripts. Dr. Shaha received his first doctorate in Research Methods and Applied Statistics from UCLA and has taught and lectured at universities including Harvard, University of Utah, UCLA, Princeton, Cambridge and others. An internationally recognized thought leader, lecturer, consultant and outcomes researcher, Dr. Shaha has provided advisory and consulting work to healthcare organizations including the National Institutes of Health (NIH), and to over 50 non-healthcare corporations including RAND Corp, AT&T, Coca-Cola, Disney, IBM, Johnson & Johnson, Kodak, and Time Warner. Dr. Shaha has presented over 200 professional papers, has over 100 peer-reviewed publications in print, over 35 technical notes and two books. He served on the 15-member team that authored and piloted the Malcolm Baldrige National Quality Award for Health Care, and he contributed to the Baldrige for Education.
