[Graphic: New York Times]
Yet another case study in how the most educated of our professionals are not fail-safe. Not just not fail-safe, but not even able-to-tell-up-from-down safe. The New York Times has an incredible story today, apparently one of many, on the dangers of a new radiation treatment called Intensity-Modulated Radiation Therapy (IMRT).
It covers a lot of ground, but one anecdote that sticks out is that of Alexandra Jn-Charles, who underwent IMRT to treat breast cancer. IMRT delivers radiation as a precisely shaped beam aimed at the tumor…a great way to avoid the healthy-cell-killing side effects of traditional radiation treatment.
However, Ms. Jn-Charles ended up with a hole in her chest so big that “you could just see my ribs in there.”
How did it happen? Numerous therapists, and even physicists, failed to notice a simple binary error:
One therapist mistakenly programmed the computer for “wedge out” rather than “wedge in,” as the plan required. Another therapist failed to catch the error. And the physics staff repeatedly failed to notice it during their weekly checks of treatment records.
Even worse, therapists failed to notice that during treatment, their computer screen clearly showed that the wedge was missing. Only weeks earlier, state health officials had sent a notice, reminding hospitals that therapists “must closely monitor” their computer screens.
…
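What strikes me is that this is precisely the kind of mismatch software can catch on its own: the plan said “wedge in,” the machine was programmed “wedge out,” and nothing complained for 27 sessions. Here’s a minimal sketch of what an automated plan-versus-machine cross-check could look like; every name in it is hypothetical and has nothing to do with Varian’s actual code:

```python
from dataclasses import dataclass

# Hypothetical types: a tiny slice of a treatment plan, just enough to
# show the idea of a machine-checkable plan-versus-machine comparison.
@dataclass(frozen=True)
class BeamConfig:
    dose_cgy: float  # dose per fraction, in centigray
    wedge: str       # "in" or "out"

def verify_against_plan(plan: BeamConfig, machine: BeamConfig) -> None:
    """Refuse to run the beam unless the machine state matches the plan."""
    mismatches = []
    if machine.wedge != plan.wedge:
        mismatches.append(f"wedge: plan={plan.wedge!r}, machine={machine.wedge!r}")
    if abs(machine.dose_cgy - plan.dose_cgy) > 1e-6:
        mismatches.append(f"dose: plan={plan.dose_cgy}, machine={machine.dose_cgy}")
    if mismatches:
        raise RuntimeError("treatment blocked: " + "; ".join(mismatches))

plan = BeamConfig(dose_cgy=180.0, wedge="in")         # what the plan required
programmed = BeamConfig(dose_cgy=180.0, wedge="out")  # what was actually entered

try:
    verify_against_plan(plan, programmed)
except RuntimeError as err:
    print(err)  # treatment blocked: wedge: plan='in', machine='out'
```

A check like this fails loudly on day one, before any radiation is delivered, instead of depending on a human noticing a line on a screen for the 27th time.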
The series of moronic, tragic errors calls to mind Atul Gawande’s story of the checklist, in which a five-step list of tasks for doctors, as simple as washing their hands, reduced the infection rate for central-line insertions to zero.
What’s the checklist for this cutting-edge radiation therapy?
Maybe there would be one if hospitals weren’t underreporting their accidents, according to NYC’s health department, by “several orders of magnitude.” (According to the NYT, the department apparently did not realize this until the Times started asking.)
And then there’s the bad-software angle. Varian Medical Systems gets criticized for code that, while allowing for the delivery of a precise and powerful stream of electrons to a tumor, has the stability and error-recovery ability of Windows ME. In the case of Scott Jerome-Parks, the patient whose story anchors the article, an IMRT machine delivered radiation “from the base of his skull to his larynx” instead of just at the tumor. The reported problem: crash-prone software with poor or non-existent data recovery:
The investigation into what happened to Mr. Jerome-Parks quickly turned to the Varian software that powered the linear accelerator.
The software required that three essential programming instructions be saved in sequence: first, the quantity or dose of radiation in the beam; then a digital image of the treatment area; and finally, instructions that guide the multileaf collimator.
When the computer kept crashing, Ms. Kalach, the medical physicist, did not realize that her instructions for the collimator had not been saved, state records show. She proceeded as though the problem had been fixed.
“We were just stunned that a company could make technology that could administer that amount of radiation, that extreme amount of radiation, without some fail-safe mechanism,” said Ms. Weir-Bryan, Ms. Jerome-Parks’s friend from Toronto. “It’s always something we keep harkening back to: How could this happen? What accountability do these companies have to create something safe?”
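Her question has an unglamorous, well-known technical answer: treat the three-part record as all-or-nothing, write it atomically, and re-verify it from disk before the beam is ever allowed to turn on. Here’s a minimal sketch of that defense, assuming a simple JSON record; the file format and field names are purely illustrative, not Varian’s:

```python
import json
import os
import tempfile

# Hypothetical field names standing in for the three components the
# article says must be saved: dose, treatment image, collimator instructions.
REQUIRED = ("dose", "treatment_image", "collimator_instructions")

def save_record(path: str, record: dict) -> None:
    """Write the whole record atomically; refuse to write a partial one."""
    missing = [k for k in REQUIRED if k not in record]
    if missing:
        raise ValueError(f"refusing to save incomplete record: missing {missing}")
    # Write to a temp file, force it to disk, then rename into place.
    # A crash mid-write leaves the old record intact, never a half-saved one.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(record, f)
        f.flush()
        os.fsync(f.fileno())
    os.replace(tmp, path)

def load_for_treatment(path: str) -> dict:
    """Re-read the record from disk and verify it before treating."""
    with open(path) as f:
        record = json.load(f)
    missing = [k for k in REQUIRED if k not in record]
    if missing:
        raise RuntimeError(f"treatment blocked: saved record is missing {missing}")
    return record
```

Under a scheme like this, the crash Ms. Kalach hit would have left the old record in place, and the missing collimator instructions would have blocked treatment instead of letting anyone proceed as though the problem had been fixed.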
Just incredible. Read the whole story here.
Also, a great animated graphic illustrating how IMRT can go awry.
My blog headline says “doctors” when it was “therapists” who apparently missed the “out” and “in” difference 27 times…though, presumably, doctors are involved somewhere in the operational process, even if they aren’t programming the machine themselves.
An NYT reader who says he’s an engineer has this insight:
What did Wedge in / Wedge out really imply to the software programmer? Did he understand the true consequences of the two setting options? Did he have any understanding of medicine at all? Or was his knowledge just limited to what the lines of software code could do?
This person might previously have written software for operating a sprinkler in a garden, where he provided options for turning the sprinkler on and off. Thus, a line of software code could manage Sprinkler On / Sprinkler Off. A similar line of code could also manage Wedge In / Wedge Out. The software is not really all that different; very often, all it does is activate or deactivate one relay or another.

But what were the relative levels of importance of the selected options in these two cases? Sprinkler Off would mean the lawn didn’t get watered one day. No big deal, and easily fixed. What about Wedge Out? Did he know what that could mean for the patient, and how many checks and verifications he would need to include in order to account for situations like the operators of the equipment being mentally distracted, careless, etc.? Should he make lights flash and warning sounds play, and add confirmation prompts and checklists each time? To make the system 100% foolproof, would the operator in this case require additional reminders and actions that would not be required in the case of the gardener?
I think, now that technology is here to stay and we are growing increasingly dependent on it, that every person in the chain, including electricians, mechanics, software programmers, and others, needs to become more medically aware of the implications of his or her particular role in that chain. They should no longer be distanced from the ultimate outcome as they are now, focused on local actions and the completion of job targets.
For instance, this programmer must be made aware that he is setting the radiation scope that could destroy a person. He must think deeply about practical issues and about how to take things like human error into account. He should not get away with just thinking he has met his daily target for number of lines of code written.
I usually don’t use the word “paradigm”, but I think what we need here is a major paradigm shift regarding what we should expect from technology and its providers in medicine. The old saying, “A chain is only as strong as its weakest link”, applies very strongly here.
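His sprinkler analogy maps directly onto code. Here’s a toy sketch of the distinction he’s asking for, in which a safety-critical setting simply cannot be flipped the way a sprinkler valve can; everything in it is illustrative, not any real medical-device API:

```python
class SafetyCriticalSetting:
    """A setting that cannot be flipped like a sprinkler valve: every
    change needs a named confirmer and leaves an auditable trail."""

    def __init__(self, name: str, value: str, allowed: tuple):
        self.name = name
        self.allowed = allowed
        self._value = value
        self.audit_log = []  # what the weekly physics check should be diffing

    @property
    def value(self) -> str:
        return self._value

    def change(self, new_value: str, confirmed_by: str) -> None:
        if new_value not in self.allowed:
            raise ValueError(f"{self.name}: {new_value!r} is not a valid setting")
        if not confirmed_by:
            raise PermissionError(f"{self.name}: change requires a named confirmer")
        self.audit_log.append(
            f"{self._value} -> {new_value} (confirmed by {confirmed_by})")
        self._value = new_value

sprinkler_on = False  # worst case of a wrong value here: a dry lawn

wedge = SafetyCriticalSetting("wedge", "in", allowed=("in", "out"))
try:
    wedge.change("out", confirmed_by="")  # one person acting alone
except PermissionError as err:
    print(err)  # wedge: change requires a named confirmer
```

The point isn’t this particular mechanism; it’s that the one line of code controlling the wedge deserves a categorically different level of defensive engineering than the one controlling the sprinkler, and that difference has to be designed in, not hoped for.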