Nash and Wohlforth ’86 Tackle the Chaos of COVID Head-on in Latest Book

Photo by Dallas Hetherington

By Carlett Spike

Published Oct. 24, 2022


The book: Authors David B. Nash and Charles Wohlforth ’86 set out to dissect why America’s health-care system failed during the COVID pandemic. Using compelling stories, How COVID Crashed the System (Rowman & Littlefield) presents answers to some of the hardest questions to arise from the crisis: How could this happen? Why did the U.S. lose more patients than any other nation? And what can we learn from it? After analyzing the problems, Nash and Wohlforth conclude on a positive note, highlighting how the pandemic has presented an opportunity to rethink and reform the country’s health system.


The authors: David B. Nash is the founding dean emeritus of the Jefferson College of Population Health and a professor at Thomas Jefferson University in Philadelphia, Pennsylvania. He is an expert in the quality and safety of medical care, health reform, the coronavirus pandemic, and medical education. Charles Wohlforth ’86 is an award-winning writer and the author or coauthor of more than 10 books, including The Whale and the Supercomputer. His work has appeared in a variety of outlets, including The Daily Beast, The New Republic, and PAW.

Excerpt

Our Investigation

This is an investigation, but we are not looking for a culprit. Amid the Covid pandemic, villains are abundant and unhidden—and heroes, too. Faced by a test as great and deadly as a war, humanity once again has produced its wondrous extremes: of selfishness and ignorance, of courage and ingenuity, of death and endurance. The stories will be retold for ages. Our investigation goes deeper.

Think of an aircraft lying disintegrated and cratered in a farmer’s field, just outside the fencing beyond the runway. Fires are out and ambulances have departed. At the airport, the bright lights of news cameras glare on scenes of grief. Out in the scorched field, a crew of investigators in black baseball caps is alone, picking through the pieces, searching the fragments for clues. A plane roars overhead, departing from the airport. The work here in the field is to keep those above safe and to protect everyone on airplanes everywhere. Blame won’t save lives now. Preventing errors may.

The American health care system crashed as surely as the most magnificent of technological machines that has ever fallen from the sky. The novel coronavirus posed a deadly threat, but it should have been manageable, and for some nations, it was. But against the United States, with the world’s most expensive and scientifically advanced medical system, the virus won. America recorded more Covid deaths than any other nation. Short of death, Covid also caused long-term illnesses, bankrupt businesses, blocked educations, the mentally devastating isolation of two years of repeated lockdowns and social disruptions, and bitter divisions about vaccination and public health measures. More than thirty-six hundred US health care workers lost their lives taking care of Covid patients in the first year, a fourteenfold increase over deaths in an ordinary year.

Why? Why did the United States fare uniquely poorly among developed nations? Why did the 18 percent of gross domestic product we spend on health care fail to protect us? It’s as if the US military, mightier than the forces of all other nations combined, suddenly was routed by a mainland invader. And our spending on the military is “only” 3 percent of GDP, not 18 percent.

A partial, obvious explanation for the Covid failure could be that hospitals were never intended to handle the millions of sick people suddenly arriving at their doors. But the health system reaches far beyond hospital doors. Why did Americans get sick in such great numbers?

Our investigation will answer that question. We will look deeply into this complex system. And, as in any investigation of a complex system that failed, we will find more than one answer. An airliner’s crash always requires more than one mistake. So says our friend, the pilot and author John Nance, who first taught David about the National Transportation Safety Board model of investigations. “This is the lesson that we learned in aviation safety in the late ’70s and early ’80s,” he said. “Every single solitary contributing cause must be addressed or you will see it as part of a causal chain in another accident.”

Airline crashes almost never happen anymore. In the 1970s, aviation also was considered safe, but if the rate of accidents back then were happening now, with skies many times busier, we would be seeing the equivalent of a jumbo jet of passengers dying every other day, Nance told us. Instead, years pass without a single death. That improvement happened because the NTSB investigated crashes looking for every cause—and without looking for anyone to blame.

“Blame has absolutely nothing to say to us about how to prevent things in the future,” Nance said. “That’s an ethical thing, but it is definitely not for repairing, through understanding causation, what needs to be corrected.”

David has frequently taken these ideas into investigations of medical errors. In 2005, he was at his daughter’s field hockey game when Connecticut’s health commissioner called to ask him to investigate three recent deaths at the Connecticut Children’s Medical Center in Hartford. A child had died overnight because an emergency room doctor misread an X-ray and failed to order a needed test. A victim of a car crash with an injured heart could have been saved by emergency surgery but died because no one looked at his X-ray for ninety-nine minutes. A visiting seven-year-old boy was allowed to wander alone into the room of an unrelated, severely disabled baby and dropped her on the floor. The baby died.

In an understaffed, overtaxed, undertrained emergency room with an inexperienced, interim head, the doctors’ errors were understandable. No one could blame the seven-year-old. But when David visited the hospital board’s chair, he immediately saw a deeper cause. The chairman—who was a banker, very smooth, with a beautiful office—said he was not responsible. David informed him that he was wrong. Legal precedent going back to 1961 made the board responsible for the quality of patient care. But more important than the law, the leaders hadn’t created a culture of patient safety. “Members of the Board of Directors of CCMC do not have a clear strategic-level commitment to quality measurement and safety improvement,” David wrote in his report (which was then leaked to the local newspaper).

To understand that conclusion, set aside how you think of investigations. TV detectives sleuth for clues to narrow responsibility for a crime to a single person at a discrete moment. Investigating a flawed system—whether in aviation, medicine, or any other complex human enterprise—is more like diagnosing an ill patient. The system has produced an error or a close call, so we know something is not right, but errors are unintentional, so knowing of the error alone does not, by itself, reveal the cause. Errors point to a cause, like the symptoms of a disease. A symptom is not the same as the disease, and a disease is not the same as the infectious agent that caused it, and the ability of that agent to produce the disease also is influenced by the environment and perhaps other issues that weakened the body’s defenses.

Errors can never be reduced to zero. To err is human. But some complex systems have reduced accidents to near zero, including commercial aviation, by interposing layers of safeguards. A pilot makes a mistake, but the copilot catches it. Or the copilot also misses the mistake, but a checklist helps, or a computer. Each layer has some holes, like a slice of Swiss cheese, but with enough layers in place, the probability drops to near zero that the holes in all the layers will line up and allow an error to slip through. Investigating errors in a complex system is more complicated than finding a culprit because the goal is larger: to find systemic vulnerability and fix it. Where are the missing layers or the oversized holes?

When David entered that emergency room in Connecticut, he didn’t see confusion, or disorder, or improper health care. He didn’t ask primarily about the three fatal errors or the sequence of events in those particular incidents. His questions were diagnostic, using techniques developed in aviation and, over the last twenty years, by pioneers in performance improvement and population health, fields in which David has worked for decades. Since we are planning to use this framework to investigate the Covid disaster—the biggest medical error in history—it’s worth explaining the process step by step.

In an emergency room, junior doctors frequently have expert attending physicians they can call on at home, who are paid to be available for those questions. Sometimes doctors get in trouble and need advice or simply lack the expert knowledge of a specialist. David asked each young pediatrician in the emergency department what he or she would do when uncertain. Would they call? He got eye-rolling. Follow-up question: Calling should be routine; why not call for help? “Well, Doctor So-and-So reams you out when you call.” Another senior doctor just wouldn’t pick up the phone.

That answered diagnostic question one: What was the authority gradient? The idea of the authority gradient goes back to a 1977 aviation disaster in Spain’s Canary Islands, the deadliest crash in history. The extremely experienced, high-ranking pilot of a KLM 747 attempted to take off in a fog without getting clearance and hit another 747 taxiing on the runway. Cockpit tapes of the KLM crew—who all died—indicated that the copilot and flight engineer had recognized the pilot’s mistake but were too intimidated by his superior authority to forcefully point out his error. The lesson for health care: flattening the authority gradient and creating collaborative teams in hospitals can help stop errors.

David’s next diagnostic question checked for loners among the doctors in the ER. Often, full-time doctors in an emergency department have a cowboy attitude—highly confident in their own abilities, moving rapidly and skillfully in a crisis, and without the burden of follow-up contact with their patients. A certain personality type gravitates to the work. Other doctors rotate into the department and can bring a more collaborative style, different skill sets, and a greater willingness to reach out for help. But at the hospital in Connecticut, a few conversations made it clear that the cowboys ran the show and set the standard. Loners might not make more errors than the rest of us, but the errors they do make are less likely to be caught by colleagues before doing harm.

The third diagnostic question was for the nurses. David asked—pulling each aside in quiet moments—whether, when seeing something that was clearly wrong, they felt empowered to say “stop” and call a halt to a procedure, as if pulling an emergency brake or “stopping the line” in manufacturing. They looked at David as if he were from another planet. Impossible! That response isn’t unusual in a hospital with patient safety problems—which is the great majority of hospitals. Numerous examples have been documented of nurses staying silent even when they knew a surgeon was operating on the wrong part of a patient’s body, something that still happens hundreds of times a year in the United States. A toxic culture makes it impossible to challenge an exalted doctor.

A baby was dropped, an X-ray misread, a patient overlooked—but the real problem at the hospital in Connecticut was cultural. And it permeated the organization, all the way up to the board chair, with his refusal to take responsibility.

This is the final and most important question for a medical error investigation: Is the culture just? That is, does it treat everyone equally, respect the contributions of all members of the team, and avoid assigning blame to individuals for failures of the system? In that Connecticut hospital, the culture was unjust, and patients suffered. Is America’s health care culture just? That question is among the most important for investigating why we failed the Covid challenge.

Excerpted from the book How Covid Crashed the System: A Guide to Fixing American Health Care by David B. Nash, MD, and Charles Wohlforth. Used by permission of the publisher Rowman & Littlefield. All rights reserved.

Review

“This book does an unparalleled job of explaining what went wrong as it relates to Covid and health care delivery. It will take dedication to the principles of responsible innovation to ensure that we do not repeat the same mistakes.” — Hemant Taneja, managing partner, General Catalyst
