The causes of the crapification are legion, but one that is having a bigger impact on health care than is widely recognized is bad information technology implementation. And I don’t mean the healthcare.gov website.
In case you missed it, the Federal government is in the midst of a $1 trillion experiment to promote (as in force) the use of Electronic Health Records, or EHRs. Astonishingly, this program has been launched with no evidence to support the idea that rendering records in electronic form will save patient lives. A Freedom of Information Act filing by the American Association for Physicians and Surgeons got this response, which was reprinted in its April newsletter (emphasis ours):
The American Recovery and Reinvestment Act of 2009 (ARRA) created the Medicare and Medicaid Electronic Health Record (EHR) Incentive Programs. While our Office of E-Health Standards and Services works to implement the provisions of the ARRA, we do not have any information that supports or refutes claims that a broader adoption of EHRs can save lives.
Now of course, one might argue based on intuition that surely electronic data would help patient care. Think of all those illegible doctor scrawls that get misread from time to time. But you need to weigh those errors against those of bad data entry, difficult-to-read file formats, difficulty in converting records to electronic form, and the greater risk of losing patient data (hard disk crashes and faulty backups).
In fact, I’ve seen good health care information technology in action. When I lived in Sydney from 2002 to 2004, every doctor I saw had a little black flat panel screen in their office or examination room, and most would enter data during the session. The doctors I saw were in solo or small practices. Their fee levels (assuming a dollar-for-dollar exchange rate, which was not the case at the time) were 25% to 35% of New York City rates for comparable services. That suggests that the use of IT wasn’t a costly addition to their practice overheads.
But could the US adopt the sensible course, which would be to look for successful health care information technology implementations overseas and learn from them? No way. As Informatics MD notes at the Health Care Renewal blog (emphasis ours):
I know from personal development and implementation experience that when “done well”, that is, when good health IT and good implementation practices are offered and with patient safety as a priority, health IT can save lives and improve care. It’s just that the commercial for-profit health IT sector does not meet those expectations, due largely to its leadership model from the merchant-computing culture. Instead, bad health IT is the norm.
We’ll get to the lousy patient outcomes part in due course. But I wanted to focus on a less obvious but no less significant element of this health care information technology push: that it is accelerating the death of solo practices. Mind you, this was already well underway, as reader Juneau noted in our recent post on corporatized medicine:
Going from working for a large corporate healthcare entity to working alone, I have seen insurance rates cut by 40 percent simply for going from “group” to “solo” status. Those who can afford to “do it right” (maybe those without kids or a mortgage or 3 divorces to pay for) feel like dopes. Colleagues who put themselves first survive. Those who made sacrifices, provide free care to indigent patients, accept insurance, etc…..are now the low tier low status docs who work 60 plus hours to make overhead and stay afloat.
This article from UTSanDiego explains the impact of the health care information technology requirements from the doctor perspective:
…doctors who see patients under Medicare and Medi-Cal programs have been forced by the phase-in of a 2009 federal “stimulus” law to install expensive, complex software systems that sharply reduce time for patients….
Yet the unkindest cut has been the electronic records mandate.
Nearly 70 percent of physicians say digitizing patient records has not been worth the cost, according to a survey by Medical Economics magazine. This negative cost-benefit view comes even after $27 billion in subsidies to health care providers for the systems.
One big problem is the dozens of systems don’t talk to each other, because the feds didn’t mandate interoperability before the rollout.
So communication gains among hospitals, clinics and doctors offices aren’t happening. Adding insult, doctors can be criminally liable if hackers get hold of patient data.
Worse is the hit to productivity. Doug says he once aimed to see four patients an hour for normal office visits. Now he struggles to see three each hour, and colleagues report much the same.
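The throughput numbers above are worth translating into annual terms. A minimal sketch, with the caveat that only the four-versus-three patients per hour figure comes from the article; the eight-hour clinic day and 250 working days per year are my own illustrative assumptions:

```python
# Rough annualization of the productivity hit described in the article.
# Only the 4 -> 3 patients/hour figure is from the source; the clinic-day
# length and working days per year are assumptions for illustration.

patients_per_hour_before = 4   # from the article
patients_per_hour_after = 3    # from the article
hours_per_day = 8              # assumption
days_per_year = 250            # assumption

# Percentage drop in throughput
loss_pct = (1 - patients_per_hour_after / patients_per_hour_before) * 100

# Visits no longer seen over a year under the assumptions above
lost_visits = (patients_per_hour_before - patients_per_hour_after) \
    * hours_per_day * days_per_year

print(loss_pct)    # 25.0
print(lost_visits) # 2000
```

Under those (hypothetical) scheduling assumptions, a one-patient-per-hour slowdown is a 25% throughput cut, on the order of two thousand fewer visits per doctor per year.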
The article concedes that the productivity loss should decline over time. But we have the hard dollar cost combined with the toll on doctors’ time….and worse outcomes from a medical safety standpoint. From a Sunday post at the Health Care Renewal website:
From a colleague, a physician and blogger and fellow AMIA member with an eclectic background, on the state of healthcare information technology. Reposted with his permission.
The fact of the matter is that the EMR remains in the United States a tool for maximization of reimbursement and as such is not a technological destination but rather a technological dead end. The driver for proliferation of this ‘dead end’ is the government being willing to fund its expansion with their fervent hope that it will be their magic bullet for finding the cheats and cheaters of Medicare….
The reality is the train has left, those of us addicted to patient care watch in dismayed horror as our productivity plunges and we struggle to restructure not our workflows but our clinical thought processes to badly designed, logically flawed, and obscenely overpriced documentation tools that distract the expert clinician from a high quality clinical encounter.
Quite honestly gentleman and gentlewomen of the jury, I don’t give a ‘rats a**’ about superior documentation, I am obsessed with superior outcomes, and as somebody who actually has to work with this junk, it all sucks………. and will continue to suck until such time as real world clinicians have veto power over the efforts of systems design teams with respect to their information design efforts…. What information design efforts? My point precisely…….
So in other words, the implementation of EMR had nothing to do with improving patient care. Nada. It was all about the money: supposedly improving doctors’ ability to get money back from insurers and helping the government catch cheats.
And if you think I’m exaggerating the risk to patients, the latest ECRI Institute report puts health care information technology as the top risk in its 2014 Patient Safety Concerns for Large Health Care Organizations report. Note that this ranking is based on the collection and analysis of over 300,000 events since 2009.
ECRI did an earlier deep dive on the health care information technology issue in 2012. The results were not pretty. The Institute found 171 technology-induced problems were reported in 9 weeks by 36 “facilities,” which were mainly hospitals. Eight of the incidents resulted in harm to the patient and three may have contributed to deaths.
Health Care Renewal gave a back-of-the-envelope calculation as to how to extrapolate these results to the US as a whole:
I note that’s 36 of 5,724 hospital in the U.S. per data from the American Hospital Association (link), or appx. 0.6 %. A very crude correction factor in extrapolation would be about x 159 on the hospital count issue alone, not including the effects of the voluntary nature of the study, of non-hospital EHR users, etc. Extrapolating from 9 week to a year, the figure becomes about x 1000. Accounting for the voluntary nature of the reporting (5% of cases per Koppel), the corrective figure approaches x20,000. Extrapolation of course would be less crude if # total beds, degree of participant EHR implementation/use, and numerous other factors were known, but the present reported numbers are a cause for concern
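The multipliers in that quote can be reproduced directly from the figures it cites (the 5,724 hospital count, the 36 reporting facilities, the 9-week window, and Koppel’s estimate that only about 5% of cases get reported). A quick sketch of the arithmetic, using only numbers from the quote:

```python
# Reproducing the back-of-the-envelope extrapolation factors from the
# Health Care Renewal quote above. All inputs are taken from the quote;
# nothing here is new data.

hospitals_us = 5724   # AHA hospital count cited in the quote
reporting = 36        # facilities in the ECRI sample
study_weeks = 9       # length of the ECRI reporting window
reported_share = 0.05 # ~5% of cases reported, per Koppel

# Scale from the 36-facility sample to all US hospitals
hospital_factor = hospitals_us / reporting            # 159x

# Scale the 9-week window to a full year ("about x 1000" in the quote)
annual_factor = hospital_factor * (52 / study_weeks)  # ~919x

# Correct for voluntary under-reporting ("approaches x20,000" in the quote)
voluntary_factor = annual_factor / reported_share     # ~18,373x

print(round(hospital_factor))   # 159
print(round(annual_factor))     # 919
print(round(voluntary_factor))  # 18373
```

The quote rounds these to roughly ×1,000 and ×20,000; the point stands either way, which is that the 171 reported problems are a small visible fraction of a much larger national total.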
One of the basic concepts I learned many years ago was that managing was all about making decisions under uncertainty, and that there was a cost to obtaining information to try to reduce uncertainty. The gains in certainty had to be weighed against other costs.
But that isn’t what is operating here. It’s not hard to see that this is an enormously expensive exercise relative to the promised gains in billing efficiency and in catching cheaters. Even before you get to the loss of doctor productivity or the harm done to patients, this IT boondoggle doesn’t remotely pass muster as an investment to lower costs. So it should be no surprise that it was thrown into the 2009 stimulus package, as something that on a superficial level sounded like it was worth doing and wasn’t as controversial as other spending options. After all, throwing money at white collar workers is not a hard sell, particularly if you pretend you can increase government efficiency, rather than helping struggling borrowers or *gasp* poor people.
And our latest example of crapification is well beyond the point of no return. High levels of disapproval by doctors and bad patient outcomes are irrelevant, since each group’s welfare was never the object of this exercise. This is kleptocracy, designed and executed to occur where the grifters were confident the public would never take notice.