Putting aside my occasionally snarky attitude about its slow-to-innovate health care market, one of the pleasures of living in Boston is casually running into some of the intellectual greats in the field. These are extremely well-intentioned and thoughtful people who have done some of the seminal work in health care quality improvement.
One such person is David Cullen, whom I ran into at our dentist's office. (I trust it is not a HIPAA violation here to report that he was smiling as he left!) David, a former anesthesiologist at MGH, participated in a very important systems analysis of adverse drug events (ADEs), published in JAMA in 1995.*
This was a great piece of work, conducted with Lucian Leape, David Bates, and other notable folks. Here were the conclusions:
Results. —During this period, 334 errors were detected as the causes of 264 preventable ADEs and potential ADEs. Sixteen major systems failures were identified as the underlying causes of the errors. The most common systems failure was in the dissemination of drug knowledge, particularly to physicians, accounting for 29% of the 334 errors. Inadequate availability of patient information, such as the results of laboratory tests, was associated with 18% of errors. Seven systems failures accounted for 78% of the errors; all could be improved by better information systems.
Conclusions. —Hospital personnel willingly participated in the detection and investigation of drug use errors and were able to identify underlying systems failures. The most common defects were in systems to disseminate knowledge about drugs and to make drug and patient information readily accessible at the time it is needed. Systems changes to improve dissemination and display of drug and patient data should make errors in the use of drugs less likely.
Importantly, Cullen and his colleagues found that incident reporting systems in hospitals did not capture these events. In a second study in the Journal of Quality Improvement, they noted:
Of 54 adverse drug events identified by the study, only 3 patients (6%) had a corresponding incident report submitted to the hospital's quality assurance program or called into the pharmacy hotline. One additional ADE was identified by an IR, but not by the ADE study. Of the 55 ADEs, 15 were preventable, and 26 were serious or life-threatening, yet only 2 of the 26 led to an incident report. The three voting groups agreed that most ADEs justified an IR, but judged that in actual practice, an IR would infrequently have been filed.
After these studies, Cullen and the other authors quantified the costs of these types of errors. As reported in JAMA,
After adjusting for our sampling strategy, the estimated postevent costs attributable to an ADE were $2595 for all ADEs and $4685 for preventable ADEs. Based on these costs and data about the incidence of ADEs, we estimate that the annual costs attributable to all ADEs and preventable ADEs for a 700-bed teaching hospital are $5.6 million and $2.8 million, respectively.
I was not involved in health care in those days, and so I don't know how these reports were received by the profession. Reading them today, you have to be impressed with the methodologies employed and the clear statement of conclusions. They should have been hard to ignore.
I am guessing that the articles led to an accelerated adoption of computerized provider order entry (CPOE) systems. CPOE can be an excellent technological fix to several of the problems noted in the study. The algorithms in CPOE can protect against drug-drug interactions, can ensure that dosing is proportional to the size and weight of the patient, can avoid allergic reactions, and the like. CPOE also gets rid of transcription errors resulting from bad handwriting!
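The kinds of checks described above can be sketched in a few lines. This is only an illustrative toy, assuming made-up interaction and dose-limit tables; the drug names, thresholds, and warning strings are hypothetical and are not any real CPOE system's rules or clinical guidance.

```python
# A minimal sketch of the kinds of safety checks a CPOE system can run
# at order entry. All drugs, thresholds, and data here are hypothetical
# illustrations, not any real system's rules or clinical guidance.

INTERACTIONS = {frozenset({"warfarin", "aspirin"}): "increased bleeding risk"}
MAX_DOSE_PER_KG = {"gentamicin": 7.0}  # mg per kg (illustrative limit only)

def check_order(drug, dose_mg, patient):
    """Return a list of warnings for a proposed medication order."""
    warnings = []
    # Allergy check: flag drugs on the patient's allergy list.
    if drug in patient["allergies"]:
        warnings.append(f"ALLERGY: patient is allergic to {drug}")
    # Drug-drug interaction check against the patient's active medications.
    for active in patient["active_meds"]:
        reason = INTERACTIONS.get(frozenset({drug, active}))
        if reason:
            warnings.append(f"INTERACTION with {active}: {reason}")
    # Weight-based dose-range check.
    limit = MAX_DOSE_PER_KG.get(drug)
    if limit is not None and dose_mg > limit * patient["weight_kg"]:
        warnings.append(f"DOSE exceeds {limit} mg/kg for {patient['weight_kg']} kg patient")
    return warnings

patient = {"weight_kg": 60, "allergies": {"penicillin"}, "active_meds": {"warfarin"}}
print(check_order("aspirin", 81, patient))
```

Because the order is structured data rather than handwriting, every one of these checks runs before the prescription ever reaches the pharmacy, which is precisely the advantage over a paper chart.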
That being said, "accelerated" may be an overstatement, as the roll-out of CPOE in the nation's hospitals was very, very slow. When I was at BIDMC, we had a very good system in place, but at the time, fewer than 10% of the nation's hospitals did. In this article published on KevinMD, our CIO John Halamka summarized our results: "Our experience with CPOE over the past 7 years is that it has reduced medication error by 50%, it paid for itself within 2 years, and clinicians have embraced it." He then catalogued obstacles faced by others. These, ironically, are often indicative of systems failures in themselves. John explains with this example:
Automating a bad process does not improve anything. When I was a resident, I was told that heparin should be dosed as a 5000 unit bolus then an infusion of 1500 units per hour for every patient. I was not taught about relating heparin dosing to body mass index, creatinine clearance or the presence of other medications. Unfortunately, it often took days to get the heparin dosing right because 5000/1500 is definitely not a one-size-fits-all rule. Creating an automated CPOE order for 5000/1500 is not going to improve the safety or efficacy of heparin dosing. Implementing a new protocol for dosing based on evidence that includes diagnosis, labs, and body mass index will improve care. Our experience is that it is best to fix the process, then automate the fixed process. By doing this, no one can blame the software for the pain of adapting to the process change.
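The contrast Halamka draws can be made concrete. The sketch below compares the fixed 5000/1500 order he describes with a weight-based calculation; the coefficients (80 units/kg bolus, 18 units/kg/hr infusion) echo commonly published weight-based nomograms, but they are shown here only as an illustration of the idea, not as clinical guidance.

```python
# Illustrative contrast between a fixed, one-size-fits-all heparin order
# and a protocol-driven, weight-based calculation. Coefficients are
# illustrative only, not clinical guidance.

def fixed_heparin_order():
    """The one-size-fits-all order: identical for every patient."""
    return {"bolus_units": 5000, "infusion_units_per_hr": 1500}

def weight_based_heparin_order(weight_kg):
    """A protocol-driven order scaled to the individual patient."""
    return {
        "bolus_units": round(80 * weight_kg),
        "infusion_units_per_hr": round(18 * weight_kg),
    }

# A 50 kg patient and a 120 kg patient receive the same fixed order,
# but very different weight-based orders.
print(fixed_heparin_order())
print(weight_based_heparin_order(50))
print(weight_based_heparin_order(120))
```

Automating the first function simply makes the wrong dose arrive faster; automating the second encodes the improved protocol, which is Halamka's point about fixing the process before computerizing it.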
What are the lessons of all this?
There is no substitute for good research. In a data-driven world, well-performed studies help demonstrate that there is a scientific basis for quality improvement. We could be making even more progress if academic medical centers supported and rewarded this kind of work -- in terms of faculty career advancement and recognition -- as much as they reward basic science research. Likewise, medical schools and residency programs should be teaching their students how to conduct this kind of research.
The spread of technology in the health care field is as slow as molasses. My theory is that this is a function, in part, of the difference in priorities between clinicians and administrators in hospitals. Shared governance of health care quality and safety issues is essential to moving forward. Technological solutions are expensive and resource-intensive. The care delivery and management teams must have a common vision of needs and priorities to make this happen.
The manner in which information technology is applied in the health care field is often counterproductive. Building on Halamka's observations, unless a hospital first studies and enhances the manner in which its work is done, it will simply codify bad processes into machine language. That is no solution to a quality or safety problem. Shared governance and commitment between clinicians and management to Lean process improvement or some other philosophy is essential.
Thanks to David Cullen for reminding me of the great work done by him and his colleagues and giving me an excuse to riff off of those studies to offer these observations fifteen years later.