A Glimpse into the Brain-Bending Way We Appraise Health Information

The following post was created as part of the assigned work for Henry Jenkins's PhD seminar, Public Intellectuals. The goal of the class is to help communication and media studies students develop the skills and conceptual framework necessary to do more public-facing work. They learn how to write op-eds, blog posts, interviews, podcasts, and dialogic writing, and they consider examples of contemporary and historic public intellectuals from around the world. The definition of public intellectuals goes beyond a celebrity-focused approach to think about all of the work that gets done to engage publics – at all scales – with scholarship and critiques concerning the media, politics, and everyday life. Our assumption is that most scholars and many nonscholars do work that informs the public sphere, whether that means speaking on national television or at a local PTA meeting.


Since the early days of newspaper printing in the United States, mass media and public health scholars have observed spikes in the prevalence of mis/disinformation related to major public health advancements, including the discovery of immunizations, the introduction of water fluoridation, and the implementation of tobacco regulations. While the problem of mis/disinformation is not new, the advent of social media has magnified the reach and impact of unverified and harmful health information. The cost has never been more visible than it is today, amid a global pandemic that has claimed more than 6.5 million lives, disproportionately affected communities of color, and erased decades of women’s progress in the labor market.

Despite the toll of the COVID-19 pandemic, vaccine hesitancy remains a significant public health challenge, and it’s only one of many health arenas plagued by uninformed or misinformed decision-making. To make sense of it, let’s begin with some of the sources that produce mis/disinformation, putting aside for a moment the personal and cognitive attributes that make someone susceptible to fake news/conspiratorial thinking.

Increasingly, junk science is being introduced through various digital media sources. One recent study found that 36% of U.S. adults cited the internet as their most common source of health information (roughly half cited their health professional, and a third cited television). Alternative medicine websites, such as NaturalNews.com, serve as examples of online worlds that promote false and misleading claims to millions of users through a network of 200+ interlinked websites, such as VaccineHolocaust.org and Healthfreedom.news. In more sophisticated cases, peddlers of false health information employ a common yet insidious tactic – embedding a kernel of truth within layers of spurious claims. The single piece of truth serves to lend credibility to a series of unfounded connections.

Take, for example, the Brownstone Institute, which positions itself as a nonprofit “think tank” staffed with credentialed academics and physicians. In one recent Brownstone article, “Have the Children Been Poisoned?”, the author claims that the use of masks and hand sanitizer leads to toxic poisoning in children (spoiler: this is not true). Upon closer evaluation, more skeptical readers will find that the Brownstone Institute is a special interest group comprising anarcho-capitalist physicians who have been condemned by the U.S. medical establishment and reprimanded by their respective state medical boards.

figure 1

Now let’s return to the question of personal susceptibility: why do some people stop to question the claim that face masks poison children while others don’t? What underlying skills, competencies and biases allow some people to sail past junk science while others capsize? One piece of the answer is media literacy, or one’s ability to “access, analyze, evaluate, create and participate with messages in a variety of forms” (Center for Media Literacy). In this context, numerous studies examining the relationship between media literacy and health beliefs have consistently shown that low levels of media literacy are linked with greater susceptibility to digital mis/disinformation. There is also a body of evidence showing an independent, linear relationship between low functional literacy (i.e., basic reading and writing skills) in adults and poor health outcomes, including higher rates of hospitalization and mortality. Translation: if you struggle to understand what your doctor is saying, or you can’t follow the instructions on your prescription bottle, you’re probably less likely to achieve positive health outcomes.

Given these findings, there is ample groundwork to suggest a relationship between the intersection of media and health literacies – referred to here as Media Health Literacy (MHL) – and health beliefs/outcomes (that is, if you have a high level of MHL, you are more likely to dismiss junk science and less likely to suffer from preventable health issues). But before I could test that hypothesis, I needed a way to measure Media Health Literacy – conceptualized here as how well adults in the U.S. access, produce, identify, critically evaluate and engage with mediated health information. The complete definition shown in Figure 2 is the result of an exploratory research project conducted last year, which also yielded an accompanying checklist of operationalized competencies (for those interested, that list is available here).

figure 2

But what does MHL look like in ‘real life’?

It turns out, Media Health Literacy is easy to define but hard to capture. Three weeks ago, I wrapped up a limited set of cognitive interviews with six people who agreed to take an MHL survey while talking out loud about their thoughts and decision-making processes. The survey itself seeks to measure a diverse range of media and health literacy competencies, including critical reasoning, under the umbrella term Media Health Literacy. The “think-aloud” component began as a way to pilot test the survey’s wording and design, but it ultimately revealed something far more interesting: the derailing effect of health-specific stimuli on an otherwise linear cognitive track.

One interviewee in particular (let’s call her Jennifer) provides a helpful illustration of the type of meandering heuristics that surfaced in all six think-aloud interviews. For background, Jennifer is a white woman in her early 40s who lives in Los Angeles, has a high school education, and identifies as Jewish and a Democrat. As part of the think-aloud interview, we spent about an hour going through 20 questions, stopping at each one to discuss and debrief.

Let’s unpack one example related to online source credibility. The question asked Jennifer to evaluate the reliability of two different websites – 1) a brazen junkyard of health mis/disinformation (NaturalNews.com) and 2) an evidence-based website hosted by the National Institutes of Health (MedlinePlus.gov). Surprisingly, Jennifer ranked them as equally unreliable on a scale of 1 to 6, awarding both websites a “2” for “Unreliable,” even though the two sources represent dramatically different (and often contradictory) ideologies, presentations, agendas and revenue sources. Here’s how she explained her rankings:

“I would rank ‘NaturalNews’ as 2 (Unreliable), but I would not say ‘Very unreliable’ because someone did take the time to gather all this information so there is some accuracy I imagine. [But] anytime I see anything like this, in this format, it makes me feel like I’m being lied to, or that they’re trying to sell something. For the second one [MedlinePlus.gov], I would say 2, unreliable, again. Because there’s room that there is some truth but I don’t trust it. I feel like there’s Western medicine that I don’t trust completely and it looks like all this information comes from Western medicine. I would say 2 because there’s probably some truth in there but I’m just really skeptical.”

figure 3

Interestingly, when shown a TikTok video that argued against the use of antibacterial soap, Jennifer found the source – immunologist and pharmaceutical scientist Morgan McSweeney, PhD – highly credible even though he presents as highly “Western.” She did not address the contradiction but did explain that seeing the information presented by one person in a video format was much more compelling than reading online articles without a “face” to them. She explained:

“[The video] confirms what I already believe. You know what this video makes me realize? If those news articles [NaturalNews and MedlinePlus] were done ‘in person’ by someone like that, who sounds like he knows what he’s talking about, I would be more apt to follow it rather than reading it. I need someone to be confident and actually sell it.”

In the previous example, the appearance of “trying to sell something” was cited as a reason not to trust the source; in this one, Jennifer contradicts that reasoning and commends Dr. McSweeney for assuming the role of a confident salesperson. While the medium (text vs. video) may be a contributing factor, as she claims, her assessment is also likely influenced by confirmation bias – our natural inclination to reject information that conflicts with our pre-existing beliefs. Here, she appears to accept new information that validates her existing beliefs, which favor natural remedies and alternative medicine, regardless of the source or format.

In a different part of the survey interview, Jennifer was asked to consider which of the five different sources on her screen – such as her personal doctor, a family member or a celebrity – would most influence her decision-making on a health matter (in this specific scenario, it was related to weight loss). As she thought through the pros and cons of each source, she explicitly acknowledged the role confirmation bias plays in her decision-making process:

“Definitely not my family. I don’t trust anything they say. ‘Advice from your doctor’… I think if they have the same beliefs, I’d say highly influential. If it was a vegan doctor, I’d say highly influential, but if it was a doctor who says ‘you gotta eat animal protein to live,’ I would say not at all influential.”


What are the implications of confirmation bias in the appraisal of health information?

As I reflected on what all six participants shared with me, it became apparent that four of the five MHL competencies – access, production, identification and engagement – followed a relatively predictable pattern when accounting for demographic covariates that would normally be associated with certain responses (for example, a younger participant was more likely than an older participant to produce media content, or a Democrat was more likely than a Republican to trust a government website). But when it came to the competency labeled critical reasoning, it was hard to square participants’ reasoning with the usual demographic determinants. It was as if the introduction of health topics and questions triggered a cognitive escape hatch. Was exposure to health information cueing a substantially different response process than other types of information stimuli?

As I transcribed the interviews, I heard participants processing different stimuli in ways that ran counter to off-the-shelf behavioral theories. I heard a physical connection to the decision-making process, an overriding sense that, on some level, they were assessing personal risk more than the actual health information at hand – risk of irreparable bodily harm (will this vaccine hurt me?), risk of radiation exposure or pain (do I really need a mammogram?), risk of confronting long-held beliefs that may not be true (anything that says “anti-bacterial” must be better at preventing illness!). I heard hesitation and reflection as participants thought through their answers, but I also heard persistent attempts to contort answers in a way that aligned with pre-existing beliefs.

figure 4

As I worked through the mental gymnastics on display, I was reminded of the game Plinko, which some of you may remember from the show The Price is Right. For those of you unfamiliar with Plinko, imagine you drop a disc at the top of the board, expecting it will follow a straight shot down to your intended end-point – but instead, it changes direction unpredictably, bouncing off different pegs until it lands. It might end up where you intended, or it might take a sharp left turn. In this analogy, the disc represents a specific health stimulus, such as exposure to an alternative medicine website that claims COVID-19 vaccines cause hair loss; the pegs are different considerations that influence the decision-making process, such as personal anecdotes and experiences or fear of bodily harm; and the slots at the end represent the different outcomes, such as accurately identifying fake health news and rejecting the information as false.
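
For readers who think better in code than in game-show metaphors, here is a minimal, purely illustrative sketch of the Plinko analogy. It is not part of the study and not how MHL is measured; the peg labels, the bias value, and the outcome slots are my own assumptions for illustration. The only point it makes is that a modest pull toward one's prior belief at each peg is enough to make the disc land in the belief-confirming slot most of the time.

```python
import random

# Toy Plinko sketch: each "peg" is a consideration that can nudge the appraisal
# toward or away from the reader's prior belief. Peg labels, the bias value,
# and the outcome slots are illustrative assumptions, not measured quantities.

PEGS = [
    "personal anecdote",
    "fear of bodily harm",
    "source format (video vs. text)",
    "perceived agenda ('are they selling something?')",
    "fit with prior beliefs",
]


def appraise(prior_favors_claim: bool, bias_strength: float = 0.7, rng=random) -> str:
    """Drop one disc (a health claim) through the pegs and return the landing slot."""
    toward_prior = 1 if prior_favors_claim else -1
    position = 0
    for _ in PEGS:
        # With probability `bias_strength`, the peg bounces the disc toward the
        # prior belief; otherwise it bounces toward the opposite side.
        step = toward_prior if rng.random() < bias_strength else -toward_prior
        position += step
    return "accepts the claim" if position > 0 else "rejects the claim"


if __name__ == "__main__":
    random.seed(0)
    trials = 10_000
    accepted = sum(
        appraise(prior_favors_claim=True) == "accepts the claim" for _ in range(trials)
    )
    print(f"Favorable prior, bias 0.7: {accepted / trials:.0%} of discs land in the "
          f"'accepts the claim' slot.")
```

Even in this toy version, the disc rarely ends up where the evidence alone would send it – which is roughly the kind of belief-confirming landing the think-aloud interviews kept surfacing.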

As the Plinko example illustrates, an individual’s appraisal of health information appears to be more susceptible to confirmation bias than more disembodied behavioral decision-making processes, such as product research and purchasing decisions. A relatively inconsequential decision about which laptop to buy or which hotel to book is often preceded by reading product reviews or flipping through customer photos on TripAdvisor. When our money is at stake, we tend to do our research, ask questions, dig deeper and negotiate. But when it comes to routine yet consequential health decisions, we often take heuristic shortcuts; we choose short-term comfort over long-term wellness; we forgo fact-checking and critical thinking for information that validates what we already know: I got the flu shot last year and still got the flu, so why bother? One of my close friends told me that vaping was healthier than drinking, so why quit vaping? My doctor gave me terrible advice that one time, so why trust her or any other doctor?

There’s an irony here that’s hard to ignore: as individuals, we don’t value our health until it’s too late; and as a society, we value the dollar more than a human life. Even our legal discourse places a greater emphasis on protecting consumer rights than it does on human rights like health care, reproductive autonomy and education.

For media and public health scholars, this incongruity presents an opportunity to work together to rebalance the scales – to confront our biases, to change the structural incentives that prompt us to use a calculator instead of a magnifying glass, and to imagine a society in which our personal and collective well-being is more valuable than the plastic cards sitting in our pockets.  

Shadee Ashtari is a second-year PhD student at USC Annenberg. Her research interests include health and risk communication, media/health literacy, strategic communication, and group cohesion and decision-making. Before joining Annenberg, Shadee served as Advisor to the Dean of the UCLA School of Medicine and chief editor of UCLA Health’s patient, executive and crisis communications. Prior to UCLA, Shadee worked as a political reporter and editor at The Huffington Post, a congressional campaign policy analyst and speechwriter, and a health communication researcher at the Johns Hopkins Institute of Genetic Medicine and City of Hope. Shadee holds a Master of Science in Public Health from the Johns Hopkins Bloomberg School of Public Health, with a concentration in Health Policy and Management, and a Bachelor of Arts in Communication from UCLA with a minor in Political Science.