Wednesday, February 08, 2012

Cullen and friends helped show the way

Putting aside my occasionally snarky attitude about its slow-to-innovate health care market, one of the pleasures of living in Boston is casually running into some of the intellectual greats in the field.  These are extremely well-intentioned and thoughtful people who have done some of the seminal work in health care quality improvement.

One such person is David Cullen, whom I ran into at our dentist's office.  (I trust it is not a HIPAA violation here to report that he was smiling as he left!)  David, a former anesthesiologist at MGH, participated in a very important systems analysis of adverse drug events (ADEs), published in JAMA in 1995.*

This was a great piece of work, conducted with Lucian Leape, David Bates, and other notable folks.  Here were the conclusions:

Results. —During this period, 334 errors were detected as the causes of 264 preventable ADEs and potential ADEs. Sixteen major systems failures were identified as the underlying causes of the errors. The most common systems failure was in the dissemination of drug knowledge, particularly to physicians, accounting for 29% of the 334 errors. Inadequate availability of patient information, such as the results of laboratory tests, was associated with 18% of errors. Seven systems failures accounted for 78% of the errors; all could be improved by better information systems. 

Conclusions. —Hospital personnel willingly participated in the detection and investigation of drug use errors and were able to identify underlying systems failures. The most common defects were in systems to disseminate knowledge about drugs and to make drug and patient information readily accessible at the time it is needed. Systems changes to improve dissemination and display of drug and patient data should make errors in the use of drugs less likely. 

Importantly, Cullen and his colleagues found that incident reporting systems in hospitals did not capture these events.  In a second study, in the Joint Commission Journal on Quality Improvement, they noted:

Of 54 adverse drug events identified by the study, only 3 patients (6%) had a corresponding incident report submitted to the hospital's quality assurance program or called into the pharmacy hotline. One additional ADE was identified by an IR, but not by the ADE study. Of the 55 ADEs, 15 were preventable, and 26 were serious or life-threatening, yet only 2 of the 26 led to an incident report. The three voting groups agreed that most ADEs justified an IR, but judged that in actual practice, an IR would infrequently have been filed.

After these studies, Cullen and the other authors quantified the costs of these types of errors.  As reported in JAMA,

After adjusting for our sampling strategy, the estimated postevent costs attributable to an ADE were $2595 for all ADEs and $4685 for preventable ADEs. Based on these costs and data about the incidence of ADEs, we estimate that the annual costs attributable to all ADEs and preventable ADEs for a 700-bed teaching hospital are $5.6 million and $2.8 million, respectively.
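
As a quick sanity check (my arithmetic, not the paper's): if we assume the annual totals are simply the per-event cost multiplied by the number of events, those figures imply on the order of 2,100 ADEs per year, roughly 600 of them preventable, at such a hospital.

```python
# Back-of-the-envelope check of the implied annual ADE counts for a
# 700-bed teaching hospital. The assumption that annual cost equals
# cost per event times event count is mine, not the paper's.
COST_PER_ADE = 2595              # dollars, all ADEs
COST_PER_PREVENTABLE_ADE = 4685  # dollars, preventable ADEs
ANNUAL_COST_ALL = 5_600_000      # dollars per year
ANNUAL_COST_PREVENTABLE = 2_800_000

implied_all = ANNUAL_COST_ALL / COST_PER_ADE
implied_preventable = ANNUAL_COST_PREVENTABLE / COST_PER_PREVENTABLE_ADE

print(f"Implied ADEs per year: {implied_all:.0f}")                      # ~2158
print(f"Implied preventable ADEs per year: {implied_preventable:.0f}")  # ~598
```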

I was not involved in health care in those days, and so I don't know how these reports were received by the profession.  Reading them today, you have to be impressed with the methodologies employed and the clear statement of conclusions.  They should have been hard to ignore.

I am guessing that the articles led to an accelerated adoption of computerized provider order entry (CPOE) systems.  CPOE can be an excellent technological fix to several of the problems noted in the study.  The algorithms in CPOE can protect against drug-drug interactions, can ensure that dosing is proportional to the size and weight of the patient, can avoid allergic reactions, and the like.  CPOE also gets rid of transcription errors resulting from bad handwriting!
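
To make that concrete, here is a minimal sketch of the kinds of checks a CPOE system can run at order entry. Everything in it (the data structures, the interaction table, the dose limit) is hypothetical and for illustration only; a real system would draw on licensed drug databases and far richer patient records.

```python
# A minimal, illustrative sketch of CPOE-style order checking.
# All lookup tables and thresholds here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Patient:
    weight_kg: float
    allergies: set = field(default_factory=set)
    active_drugs: set = field(default_factory=set)

# Hypothetical stand-ins for a real drug knowledge base.
INTERACTIONS = {frozenset({"warfarin", "aspirin"})}
MAX_DOSE_MG_PER_KG = {"gentamicin": 7.0}

def check_order(patient: Patient, drug: str, dose_mg: float) -> list[str]:
    """Return a list of warnings for a proposed medication order."""
    warnings = []
    # Allergy check.
    if drug in patient.allergies:
        warnings.append(f"ALLERGY: patient is allergic to {drug}")
    # Drug-drug interaction check against active medications.
    for other in patient.active_drugs:
        if frozenset({drug, other}) in INTERACTIONS:
            warnings.append(f"INTERACTION: {drug} interacts with {other}")
    # Weight-based dose-range check.
    limit = MAX_DOSE_MG_PER_KG.get(drug)
    if limit is not None and dose_mg > limit * patient.weight_kg:
        warnings.append(f"DOSE: {dose_mg} mg exceeds {limit} mg/kg limit")
    return warnings

pt = Patient(weight_kg=70, allergies={"penicillin"}, active_drugs={"warfarin"})
print(check_order(pt, "aspirin", 325))  # flags the warfarin interaction
```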

That being said, "accelerated" may be an overstatement, as the roll-out of CPOE in the nation's hospitals was very, very slow.  When I was at BIDMC, we had a very good system in place, but at the time, fewer than 10% of the nation's hospitals did.  In this article published on KevinMD.com, our CIO John Halamka summarized our results:  "Our experience with CPOE over the past 7 years is that it has reduced medication error by 50%, it paid for itself within 2 years, and clinicians have embraced it."  He then catalogued the obstacles faced by others.  These, ironically, are often indicative of systems failures in themselves.  John explains with this example:

Automating a bad process does not improve anything. When I was a resident, I was told that heparin should be dosed as a 5000 unit bolus then an infusion of 1500 units per hour for every patient. I was not taught about relating heparin dosing to body mass index, creatinine clearance or the presence of other medications. Unfortunately, it often took days to get the heparin dosing right because 5000/1500 is definitely not a one size fits all rule. Creating an automated CPOE order for 5000/1500 is not going to improve the safety or efficacy of heparin dosing. Implementing a new protocol for dosing based on evidence that includes diagnosis, labs, and body mass index will improve care. Our experience is that it is best to fix the process, then automate the fixed process. By doing this, no one can blame the software for the pain of adapting to the process change.
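
To make John's example concrete, here is a minimal sketch contrasting the fixed 5000/1500 order with a weight-based rule. The 80 units/kg bolus and 18 units/kg/hr infusion figures follow a commonly published weight-based nomogram, but the bolus cap and rounding are my assumptions, and none of this is clinical guidance; a real protocol would also fold in labs, diagnosis, and interacting medications, as John notes.

```python
# Contrast between the "one size fits all" heparin order John describes
# and a weight-based rule. The 80 units/kg and 18 units/kg/hr figures
# follow a commonly published nomogram; the bolus cap and rounding are
# my assumptions. Illustration only, not clinical guidance.

def heparin_fixed() -> tuple[int, int]:
    """The old fixed order: 5000 unit bolus, 1500 units/hr infusion."""
    return 5000, 1500

def heparin_weight_based(weight_kg: float,
                         max_bolus: int = 10_000) -> tuple[int, int]:
    """Weight-based dosing: 80 units/kg bolus (capped), 18 units/kg/hr."""
    bolus = min(round(weight_kg * 80), max_bolus)
    infusion = round(weight_kg * 18)
    return bolus, infusion

for weight in (50, 70, 110):
    print(f"{weight} kg: fixed={heparin_fixed()}, "
          f"weight-based={heparin_weight_based(weight)}")
# A 50 kg patient and a 110 kg patient get very different weight-based
# doses, which is exactly why 5000/1500 was never one size fits all.
```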

What are the lessons of all this?

There is no substitute for good research.  In a data-driven world, well-performed studies help demonstrate that there is a scientific basis for quality improvement.  We could be making even more progress if academic medical centers supported and rewarded this kind of work -- in terms of faculty career advancement and recognition -- as much as they reward basic science research.  Likewise, medical schools and residency programs should be teaching their students how to conduct this kind of research.

The spread of technology in the health care field is as slow as molasses.  My theory is that this is a function, in part, of the difference in priorities between clinicians and administrators in hospitals.  Shared governance of health care quality and safety issues is essential to moving forward.  Technological solutions are expensive and resource-intensive.  The care delivery and management teams must have a common vision of needs and priorities to make this happen.

The manner in which information technology is applied in the health care field is often counterproductive.  Building on Halamka's observations, unless a hospital first studies and enhances the manner in which its work is done, it will simply codify bad processes into machine language.  That is no solution to a quality or safety problem.  Shared governance and commitment between clinicians and management to Lean process improvement or some other philosophy is essential.

Thanks to David Cullen for reminding me of the great work done by him and his colleagues and giving me an excuse to riff off of those studies to offer these observations fifteen years later.

---
*   (JAMA being JAMA, you still can't get a full text version through their site without paying for it -- even though they claim that anything older than six months is available for free.  Luckily, you can read the whole thing here.)

2 comments:

David Cullen said...

Allow me to add a few thoughts to your comments. First, as to how these reports were received by the profession, within a few months it was clear that the medical profession as well as the media were very taken with the fact that we even did the study and that the MGH and the BWH were willing to report these errors in the public arena. This led fairly quickly to the first of many Patient Safety conferences involving a coalition of many concerned groups like JCAHO, consumer advocates, the AMA, malpractice insurers, etc. Subsequently, this led to the formation of the National Patient Safety Foundation, modeled after the Anesthesia Patient Safety Foundation under the great leadership of Ellison “Jeep” Pierce and Jeff Cooper.

Because we had a natural experiment with the adoption of CPOE at the Brigham but not at the MGH, we were able to do a second series of studies comparing the effect of introducing CPOE hospital-wide at the Brigham with no CPOE at the MGH. At the same time, we were able to compare adverse drug event rates in two medical intensive care units at the MGH, the intervention in this case being the introduction of an extensive and deliberate role for clinical pharmacists in one ICU but not the other. The clinical pharmacy intervention at the MGH was as dramatic in reducing the adverse drug event rate as the introduction of CPOE at the Brigham. Since the Brigham was going to introduce CPOE with or without our study, I can't say that our first set of studies led to the adoption of CPOE. But once the studies showed that the intervention worked in dramatic fashion, they probably did accelerate the adoption of CPOE in other hospitals. I'm not sure that there was an increased role for clinical pharmacists at hospitals around the country, although there certainly should have been.

These were the first studies of adverse drug events with denominator data and served as a paradigm for human error studies in general, building on the Harvard Adverse Event Study led by Howard Hiatt and published mainly in the New England Journal of Medicine a few years earlier. We are convinced that these two groups of studies served as the scientific foundation for the Institute of Medicine's human error report in 1999 and made studies of human error in healthcare "socially acceptable". Unfortunately, as you point out time and again, hospital leadership has too often been passive in dealing with the problem of systems failures leading to errors, injury, and death in healthcare.

Anonymous said...

Excellent comment, Dr. Cullen. As I keep reading these things, I become more and more convinced that the reason there is not a larger public outcry is that we only kill people one at a time, rather than in large numbers, such as in a plane crash (including the pilot!), or in scary fashion, as in a nuclear plant accident. The public doesn't think through that these one-by-ones add up to very large numbers.

As for the general physician community, they almost uniformly contest the validity of the IOM report, even questioning the ethics of its authors. THAT boggles my mind.

nonlocal MD