A moment with ... Talya Miron-Shatz

Talya Miron-Shatz, a visiting research scholar at Princeton’s Center for Health and Wellbeing, professor of marketing at the Ono Academic College in Israel, and lecturer on consumer behavior at the Wharton School of the University of Pennsylvania, focuses on how people interpret and misunderstand medical information and the implications for their health. She writes a blog for Psychology Today called Baffled By Numbers, at www.psychologytoday.com/blog/baffled-numbers.

In what ways do you see people being rational and irrational in thinking about health care?

For me, the word “rational” flew out the window a long time ago, because what it implies is that people are thinking almost like calculation machines, which is not the case. We do, however, follow a logic that makes us predictably fallible in our medical consumerism — for example, to expect certainty when none exists, or to be myopic in our behavior. Right now, a lot of wonderful remedies and technologies are available, but we’re seeing that to develop a phenomenal medical product or concept is not enough — merely giving people information about what is right doesn’t necessarily help.

Have new technologies, such as screening tests, made it harder or easier for patients to make decisions about their care?

It’s a little bit of both. The premise that people can make decisions about their care is a relatively new ethical advance that dovetails with the fact that we now have more information. We’re living in a technologically advanced society, but our brains are not necessarily as advanced. So when given an abundance of information, people often use hints, rules of thumb, impressions, and sometimes their wishes to make a decision, and they expect certainty even when it cannot be delivered.

Also, we don’t always think about whether people necessarily need all the information. Now that you have the technology to test for so many conditions, do you want to know the results? If you’re 30 years old and you’re going to find out whether you have the gene for increased probability of Alzheimer’s disease, are you thinking of the implications of having this knowledge for the rest of your life? I’m not sure. The gap between the wonderful technological advances and people’s ability to handle them properly can run the whole gamut, from having expectations that cannot be met to not realizing what dealing with new information will involve.  

What should we make of the recent controversy over a national medical panel’s recommendation that most women under 50 not receive regular screening mammograms?

It shows that placing something as a default gives it credibility — if you already have screening for women, to remove it seems outrageous, and the rhetoric can become “nobody cares about women’s health.” But if it means 70 women receive a false positive in a mammogram and have a biopsy and all the anxiety involved in order for one woman’s cancer to be picked up, is it worth it? There’s no easy equation to place these numbers into, but I think they need to be part of the discussion. It’s impossible to be an informed patient when you’re not really being informed of the numbers.

Is medical information, even when explained properly, more difficult to understand than other topics?

Probably not the numbers themselves, but their emotional meaning. Doing taxes can be annoying, but you go to your accountant’s office and they figure it out for you and you write a check and you don’t stress over it. Whether or not you should get a prophylactic mastectomy is not something that you can just ask and have a doctor tell you, because sometimes there isn’t a right answer. Systems are moving toward engaging patients in the decision-making, sometimes leaving it in their hands, and I think these systems need to account for the human dynamic.

It’s a fine line a doctor is treading between giving a patient all the information, which can be too much, and making sure the important things are sinking in. A lot of it is about the doctor-patient communication, but people — patients as well as experts — often do not understand probabilistic information. Too much information too quickly can get confusing, and more so in a stressful situation, which has nothing to do with health care and everything to do with the way our minds are wired.

I think life could be easier and more efficient if information environments were designed to be simpler. Colleagues and I have written a book chapter — it’s coming out soon — arguing that information about risk should be standardized the same way that caloric information is on food packaging. You don’t have to have a master’s degree in nutrition to know which cornflakes have more fat and then make a choice. 

— Interview conducted and condensed by Rachel Lieff Axelbank ’06

1 Response

Brian Zack ’72

“Talya Miron-Shatz, on medical misunderstandings” (A Moment with, March 17) admirably addresses the importance of providing patients with information in an understandable format, so that they may best make decisions when “there isn’t a right answer.”

Of equal importance, but shamefully receiving much less attention, is the urgent need for development of medical fail-safe mechanisms to ensure that in the many cases in which there is a “right answer,” it is that answer that is chosen.

Consider the recent New York Times articles detailing inexcusable and sometimes fatal errors in the provision of radiation therapy to cancer patients, when there would presumably be 100 percent agreement that the right answer is to provide the correct dose. Formal checklists to prevent such errors were not followed.

Less dramatic, but far more common, is the scenario in which a physician chooses a medication based on what free samples recently have been provided by a pharmaceutical representative, rather than on what has been demonstrated to be most effective.

Imagine what might happen if pilots did not go through their routine pre-flight checklists, or if airline mechanics chose replacement parts based on which manufacturer had gifted them with free samples and taken them out to dinner. If we ran our aerospace industry the way we provide medical care, planes would be dropping out of the sky with unfortunate frequency.

Much of the fault lies with the myth of the all-knowing physician. We doctors must be much more willing to acknowledge when we are unsure of the best approach, even when all agree on the desired outcome, and must accept the professional ignominy of developing fail-safe checklists rather than pretending to an infallible intuition. Equally, patients must learn to appreciate doctors’ candor in this respect, rather than gravitating to those physicians who seem to know everything. None of us do.
