
In Seattle, Alter had been trained in Bayesian analysis, a mathematical approach to making decisions in the face of uncertainty. When he sees a patient, he calculates a numerical probability for each possible diagnosis. In Nathan's case, he was at a loss to assign such a probability. There simply was no database to refer to for a compression fracture of the tenth thoracic vertebra in an overweight but otherwise healthy ten-year-old Hopi boy. So Alter tried to approach the problem not by sophisticated mathematics but by common sense. "The event was nothing," Alter said, "a student jumping on Nathan for a piggyback ride, and it didn't seem enough to explain the injury." But the pediatrician seemed to think looking for another answer was unnecessary. "I was at a loss," Alter told me. "This was a specialist talking to me. I should bow to his authority."
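To see what such a calculation involves, consider a simplified example (the numbers here are invented for illustration, not drawn from Alter's cases). Suppose a disease has a prior probability of 10 percent among the patients a doctor sees, and a physical finding appears in 80 percent of patients who have the disease but also in 20 percent of those who don't. Bayes' theorem combines these figures: the probability of the disease given the finding is (0.8 × 0.1) divided by (0.8 × 0.1 + 0.2 × 0.9), or 0.08/0.26, roughly 31 percent. The finding has tripled the probability, yet the diagnosis is still more likely wrong than right. That is the discipline Alter was trained in, and it works only when there are numbers to start from, which in Nathan's case there were not.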

I heard a familiar resonance in Nathan's story, not because I trained in emergency medicine or take care of children, but because I've heard specialists say, "We see this sometimes." It has the ring of confidence, a statement based on long experience, and is meant to lift the burden of further investigation off everyone's shoulders. But it should be said only after an exhaustive search for an answer and ongoing monitoring of a patient. Said glibly, it betrays worrisome ignorance rather than reassuring knowledge. In effect, it tells everyone to stop thinking.

Alter had no choice but to move on. He had done the best he could, more than most would have done in evaluating Nathan. But he couldn't stop thinking about him. He was forced to wait until, in clinical jargon, the problem would "declare itself."

That declaration took place some weeks later when Nathan got out of bed and immediately collapsed in pain. He was rushed back to the ER. Alter examined him and confirmed that neurologically everything was in order: that Nathan's legs were not weak and his reflexes were intact. He ordered another set of x-rays. Now there were four wedge fractures of the spine. Alter transferred Nathan to a hospital in Phoenix. An orthopedic surgeon there performed a bone biopsy and sent it to the pathology laboratory. Peering into the microscope, the pathologist in Phoenix saw sheets of large round cells inside the bone; each cell resembled the next, dark blue with a convoluted nucleus. Special tests identified the enzymes within the cells and the proteins on their surface. The diagnosis soon was clear: Nathan had acute lymphoblastic leukemia. The leukemia had so weakened the vertebra that it collapsed under the weight of a piggyback ride. Now the case made sense. As Alter had surmised, what the pediatrician had said did not. "No one—no doctor, no patient—should ever accept, as a first answer to a serious event, 'We see this sometimes,'" Alter said. "When you hear that sentence, reply, 'Let's keep looking until we figure out what is wrong or know the problem has passed.'"

 

 

One winter day in the same year that Nathan Talumpqewa fell ill, a Navajo woman in her sixties named Blanche Begaye came to the emergency department because she was having trouble breathing. Mrs. Begaye was a compact woman with slate-gray hair gathered in a bun who worked in a grocery store on the reservation. Over the past few weeks, a nasty virus had been moving through the close-knit community, and scores of patients like Blanche Begaye had come to the hospital with viral pneumonia. Mrs. Begaye said that she first thought she had just "a bad head cold." So she had drunk lots of orange juice and tea and taken a few aspirin, but the symptoms worsened and now she felt terrible.

Alter noted that she was running a low-grade fever, 100.2° F, and her respiratory rate was almost twice normal. He examined her lungs and heard the air rapidly moving in and out, but none of the harsh sounds, called rhonchi, that are caused by accumulated mucus. Alter obtained blood tests. Blanche Begaye's white blood cell count was not elevated, but her electrolytes showed that the acid-base balance of her blood had tipped toward acid, not uncommon in someone with a major infection. Her chest x-ray did not show the characteristic white streaks of viral pneumonia.

Alter made the diagnosis of "subclinical viral pneumonia." He told Mrs. Begaye that she was in the early stages of the infection, subclinical, so that the footprints of the microbe were not yet evident on her chest x-ray. Like many other patients with pneumonia whom he had seen recently, she should be admitted to the hospital and given intravenous fluids and medicine to keep her fever down. At her age, he said, viral pneumonia can tax the heart and sometimes cause it to fail, so it was prudent to keep her under observation.

Alter handed off the case to an internist on the staff and began evaluating another patient, a middle-aged Navajo man who also had fever and shortness of breath. A few minutes later, the internist approached Alter and took him aside. "That's not a case of viral pneumonia," he said. "She has aspirin toxicity."

Even years later in retelling the story, Alter groaned. "Aspirin poisoning, bread-and-butter toxicology," he said, "something that was drilled into me throughout my training. She was an absolutely classic case—the rapid breathing, the shift in her blood electrolytes—and I missed it. I got cavalier."

As there are classic clinical maladies, there are classic cognitive errors. Alter's misdiagnosis resulted from such an error, the use of a heuristic called "availability." Amos Tversky and Daniel Kahneman, psychologists from the Hebrew University in Jerusalem, explored this shortcut in a seminal paper more than two decades ago. Kahneman won the Nobel Prize in economics in 2002 for work illuminating the way certain patterns of thinking cause irrational decisions in the marketplace; Tversky certainly would have shared the prize had he not died an untimely death in 1996.

"Availability" means the tendency to judge the likelihood of an event by the ease with which relevant examples come to mind. Alter's diagnosis of subclinical pneumonia was readily available to him because he had seen numerous cases of the infection over recent weeks. As in any environment, there is an ecology in medical clinics. For example, large numbers of patients who abuse alcohol populate inner-city hospitals like Cook County in Chicago, Highland in Oakland, or Bellevue in Manhattan; over the course of a week, an intern in one of these hospitals may evaluate ten trembling alcoholics, all of whom have DTs—delirium tremens, a violent shaking due to withdrawal. He will tend to judge it as highly likely that the eleventh jittery alcoholic has DTs because it readily comes to mind, although there is a long list of diagnostic possibilities for uncontrolled shaking. DTs is the most available hypothesis based on his most recent experience. He is familiar with DTs, and that familiarity points his thinking that way.

Alter experienced what might be called "distorted pattern recognition," caused by the background "ecology" of Begaye's case. Instead of integrating all the key information, he cherry-picked only a few features of her illness: her fever, her rapid breathing, and the shift in the acid-base balance in her blood. He rationalized the contradictory data—the absence of any streaking on the chest x-ray, the normal white blood cell count—as simply reflecting the earliest stage of an infection. In fact, these discrepancies should have signaled to him that his hypothesis was wrong.

Such cognitive cherry-picking is termed "confirmation bias." This fallacy, confirming what you expect to find by selectively accepting or ignoring information, follows what Tversky and Kahneman referred to as "anchoring." Anchoring is a shortcut in thinking where a person doesn't consider multiple possibilities but quickly and firmly latches on to a single one, sure that he has thrown his anchor down just where he needs to be. You look at your map but your mind plays tricks on you—confirmation bias—because you see only the landmarks you expect to see and neglect those that should tell you that in fact you're still at sea. Your skewed reading of the map "confirms" your mistaken assumption that you have reached your destination. Affective error resembles confirmation bias in selectively surveying the data. The former is driven by a wish for a certain outcome, the latter by the expectation that your initial diagnosis was correct, even if it was bad for the patient.

After the internist made the correct diagnosis, Alter replayed in his mind his conversation with Blanche Begaye. When he asked whether she had taken any medication, including over-the-counter drugs, she replied, "A few aspirin." He heard this as further evidence for his anchored assumption that she had a viral syndrome that began as a cold and now had blossomed into pneumonia. "I didn't define with her what 'a few' meant," Alter said. It turned out to be several dozen.

The irony is that Alter had suspended judgment about Nathan Talumpqewa's diagnosis, had not anchored his thinking at all, because he could not estimate a probability for a particular disease or identify a biological mechanism causing a vertebra to collapse. This had held him back from accepting the pediatrician's glib assurance. Yet he jumped to a conclusion with Blanche Begaye, in effect assigning a probability of 100 percent to a single diagnosis. "I learned from this to always hold back, to make sure that even when I think I have the answer, to generate a short list of alternatives," Alter said. That simple strategy is one of the strongest safeguards against cognitive errors.

 

 

Imagine that you are an emergency physician like Harrison Alter or Pat Croskerry. In most instances, you don't know the patients you see. So you have to rely on a snapshot view of their illness—unlike an internist in his office, who is familiar with his patients and their families, knows their character and their behavior, and can observe the evolution of a clinical problem over time. Imagine that it is a typically busy evening, and the triage nurse has assigned three patients to you over a half-hour period. Each patient has a host of complaints. Pat Croskerry told me that at moments like this he feels as if he is "plate-spinning on sticks," like a circus performer using sticks to spin plates without letting them slow down or fall.

Actually, it's harder than spinning plates, because plate-spinning requires a single rotary motion and all the plates are of similar size and weight. Each patient, of course, is different, and for each you may have to go through different motions quickly to reach a working diagnosis, treat any urgent problems, and then decide on the safest disposition: admission to the hospital, transfer to another institution, or discharge to home. Now consider what you must do to meet these goals of diagnosis, treatment, and disposition. First you have to figure out the main reason each patient has come for emergency care. While that may seem straightforward, it is not. Patients may give a triage nurse or a doctor a reason that is tangential to the real, more serious underlying problem, or they may offer the symptoms that bother them most but may be unrelated to their underlying disease. All doctors work under time constraints, and this is especially true in the emergency department. So, as we saw with Alter, the questions you choose to ask and how you ask them will shape the patient's answers and guide your thinking. You may go off on a tangent if you try to elicit a history too quickly, but you will neglect your other tasks if you take too long to hear what is wrong.

I recall an elderly man who arrived at the ER complaining of pain in his ankle after tripping on the street. All he wanted was to be reassured it wasn't broken and be given a painkiller. Everyone focused on his ankle. No one thought about why he might have tripped. Only much later did we learn that he'd fallen because he was weak from an undiagnosed anemia. The cause of his anemia turned out to be colon cancer. To compound matters, patients may not remember key aspects of their past medical history, and without a hospital chart or office record, you lack any independent source to help fill in the gaps. This is especially true with regard to medications. "I take a blue pill and a pink pill for my heart," a dizzy patient may say, but he doesn't recall the names and doses of the pills, and you are at a loss to assess whether his nausea and dizziness are related to his therapy.

After you determine your patient's primary complaint, you must then decide what blood tests and x-rays to order, if any. Harrison Alter returned to Oakland's Highland Hospital after three years in Tuba City. He told me that he emphasizes to his interns and residents in the emergency department there that they should not order a test unless they know how that test performs in a patient with the condition they assume he has. That way, they can properly weigh the result in their assessment. This is not as easy as it may sound. Take, for example, Pat Croskerry's encounter with Evan McKinley, the forest ranger whose chest pain was not typical of angina. As Croskerry's colleague pointed out, he went the extra mile in ordering tests on McKinley, not only an EKG and a chest x-ray but also a test for cardiac enzymes. For each, Croskerry had to judge whether the result was normal, abnormal, or spurious. Laboratory, x-ray, and EKG technicians make mistakes. I recall when I incorrectly placed the EKG leads on a patient's chest and, not realizing my mistake, concluded he had a serious problem with the electrical conduction pathways of his heart. He had no such thing; the EKG was an artifact generated by my mistake. Other errors can be more subtle, like taking a chest x-ray when the patient has not held his breath. This can cause white streaking in the lower lungs that mimics a sign of pneumonia.
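Alter's rule can be made concrete with a worked example (the figures are hypothetical, chosen only to show the arithmetic). Suppose a test turns positive in 90 percent of patients who have a disease but also in 10 percent of those who don't, and suppose that on clinical grounds the patient in front of you has about a 5 percent chance of having the disease. A positive result then raises that chance to (0.9 × 0.05) divided by (0.9 × 0.05 + 0.1 × 0.95), or 0.045/0.14, about 32 percent; roughly two out of three positive results in such patients are false positives. Without knowing both how the test performs and how likely the disease was beforehand, there is no way to weigh the result properly.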

For each patient, you are making scores of decisions about his symptoms, physical findings, blood tests, EKG, and x-rays. Now, multiply all of those decisions you made for each patient by the three patients the triage nurse assigned to you in thirty minutes; the total can reach several hundred. The circus performer spins only a handful of plates. A more accurate analogy might be to stacks of plates, one on top of another, and another, and another, all of different shapes and weights. Add to these factors the ecology of an emergency department, the number of people tugging at your sleeve, interrupting and distracting you with requests and demands as you spin your plates. And don't forget you are in an era of managed care, with limited money, so you have to set priorities and allocate resources parsimoniously: it costs less if you can take several plates off their sticks, meaning if you limit testing and rapidly send the patient home.
