
Jason Riis


    1. The Problem

    Consumer exposure to food science is largely through headlines and conclusions: This ingredient is safe. That food has a health benefit. A celebrity says that diet works.

    Scientists, of course, do not communicate with each other that way. Scientists communicate via papers and talks. The scientific method requires clear statements, careful causal inference, and keen awareness of what is known and not known. To be scientific is to be obsessed with evidence.

    By contrast, consider the mindset of a typical reader of social media. They browse Facebook and Twitter to unwind, not to scrutinize evidence. David Rand, a behavioral scientist at MIT who studies false news on social media, calls it “mental laziness” (Gonzalez 2018).

    But the communication challenge in food science goes deeper than social media. Decades of research in behavioral science (mainly cognitive psychology) have shown that routine human thought is not scientific thought. Human thought relies heavily on intuitions and heuristics. Human thought routinely falls short of scientific standards of logical reasoning and self-reflection.

    Communicators must be realistic about this. Simply providing more and better information will have limited success in influencing people. Science communicators must find ways to support what is at the heart of scientific thinking: sound reasoning and humble self-reflection.

    2. Human Thought Is Fast And Automatic, So It Is Prone to Reality Gaps.

    Science is a systematic attempt to understand reality. Human thought, on the other hand, is primarily directed at assessing immediate threats and opportunities so that they can quickly be acted upon.

    Our uniquely human thought processes are baked into a casserole of evolutionarily ancient attentional, perceptual, and affective processes, as well as some newer “analytical” ones.

    Mostly, this casserole contains processes that are automatic, and very, very fast. It takes much less than a second to recognize a familiar face or to read a word.

    First impressions of people and situations happen almost instantly, and we can’t stop our first impressions. You don’t choose your emotions. They just happen. Very fast. And automatically.

    The human thought process happens quickly, and sometimes it gets in the way of critical thinking. © artvea/DigitalVision Vectors/Getty Images

    Fast and automatic thoughts are helpful when you need to spot predators and other dangers. But there is no “off” switch for the systems that produce these thoughts. They are always on, and always influencing us.

    Cognitive psychologists, most famously Nobel Laureate Daniel Kahneman, refer to the vast set of fast and automatic processes as “System 1” (Kahneman 2011). System 1 produces our first thoughts.

    We have many such first thoughts every second. Some of them capture reality. Some of them don’t. Visual illusions result from fast and automatic visual processes taking shortcuts; they are a good example of how our intuitions and first thoughts can seem right but actually be very wrong.

    Even though we can often convince ourselves that a visual illusion is “wrong” in some way, we still see the illusion. It doesn’t just go away because we recognize that it is an illusion. Similarly, even if we come to recognize that some of our emotional reactions are misguided, we still feel the emotion. The emotion keeps coming back, automatically. This makes it hard to hold onto the recognition that the emotion is “wrong.” Phobias are an extreme case.

    Because our initial reactions from System 1—perceptions, intuitions, emotions—can be so visceral, we find it hard to believe that they are wrong. If we get an intuition that something feels unsafe, then we tend to believe that the thing is, in fact, unsafe.

    Oftentimes, logic and reason are not quite enough. Our intuitions come first and can get in the way of sound reason and logic.

    Tamar Gendler coined the term “alief” to capture the difference between what we know to be true, and our intuitive sense about it (Gendler 2008). While standing on a balcony with a glass floor, you may believe you are safe. But when you look down, you may become terrified. You believe you are safe, but you do not alieve it. And if an engineer comes along and explains very reasonably and logically that the glass floor is completely safe, and that an elephant could walk on it and it would be fine, you might nod your head and believe every word of it. But you still might not be able to convince yourself that it’s OK to jump on the glass floor.

    Food can be especially prone to alief formation. In a famous 1986 study, Paul Rozin and colleagues showed that people were averse to eating fudge in the shape of dog feces even though they knew it to be fudge (Rozin et al. 1986).

    But in both these situations, the individuals are aware of the true state of the world. Imagine if they were not aware, and you were simply trying to convince them that the glass balcony is safe or that the “dog doo” is fudge. Intuition kicks in first, and it is a strong intuition that even very reasonable people have a hard time overcoming even when they know the truth. Ultimately, those who do not know the truth will probably side with their intuition and call it reason.

    3. It’s Not Enough to Give Better Information, Yet We Act as Though It Is.

    When intuitions are powerful, they are not easily corrected with new information. Yet in science communication, there is a tendency to fight misperceptions only with better information.

    Information availability is hardly a problem in the digital age. We have Wikipedia. We have Google. We have TED Talks.

    An assessment of risks and benefits isn’t always based on rational thinking. © AndreyPopov/iStock/Getty Images Plus

    More information, or even better information, will not be much help when people don’t want to hear it. As the psychologist Philip Fernbach and colleagues note in a paper on the psychology of GMO opposition, “Extremists think they understand this stuff already, so they are not going to be very receptive to education. You first need to get them to appreciate the gaps in their knowledge” (Fernbach et al. 2019).

    Information faces the additional challenge of getting through emotional barriers. People tend to judge risks through “gut feeling,” a process psychologists call the affect heuristic: if something feels wrong or risky, then it is judged to be wrong or risky. Feelings are often triggered by factors such as lack of control or novelty, which may have nothing to do with the costs, benefits, or probabilities that would go into an objective risk assessment. As the psychologist Yoel Inbar and colleagues note, “Not only are perceptions of risks and benefits often affectively based, but at least in some cases, affectively backed moral values are associated with willingness to disregard risks and benefits entirely.”

    What communicators really need to do is a) help people make better inferences (e.g., by seeing beyond their immediate emotional reactions) and b) help people self-reflect to better distinguish between what they really know and what they don’t. Yet providing more and better information remains the priority. When we surveyed almost 1,000 registered dietitians in partnership with the trade publication Today’s Dietitian, we found that the majority saw providing better information as the key to helping consumers avoid critical thinking errors (Riis et al. 2019). Only a minority saw helping with better inference or better self-reflection as more important.

    4. We Have to Support Better Inference.

    How can communicators help people make better inferences?

    First, we need to appreciate the common difficulties of probabilistic and causal inference and find ways to increase consistency and raise the quality of probabilistic messaging. Even a simple distinction like the difference between relative and absolute risk rarely gets appropriate treatment.

    How much does bacon consumption increase the risk of colorectal cancer? A widely referenced study drew attention to relative risk rather than absolute risk. © Bestfotostudio/iStock/Getty Images Plus

    For example, a widely cited research finding is that eating 50 g of processed meat (e.g., four slices of bacon) increases the risk of developing colorectal cancer by about 16% (Aubrey 2015). That’s a statement about relative risk. But relative to what?

    The absolute risk of colorectal cancer in the United States is approximately 4%. The 16% increase in relative risk increases the absolute risk from 4% to about 4.6%. Some will see that as an important difference and some won’t, but at least the risk becomes a little easier to visualize.

    We need to help people put risks in context. The finding implies that, if 1,000 people ate four slices of bacon daily, 46 of them would develop colorectal cancer. If none ate four slices of bacon daily and all other risk factors were equal, 40 of them would still develop colorectal cancer.
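    To make the arithmetic explicit, here is a minimal Python sketch of the relative-to-absolute risk conversion used above. It is only an illustration: the 4% baseline and 16% relative increase are the figures cited in this section, while the function names and rounding choices are assumptions of the sketch, not part of the original analysis.

        # Convert a relative risk increase into an absolute risk, then express it
        # as expected cases per 1,000 people, as in the bacon example above.

        def absolute_risk_after_increase(baseline_risk, relative_increase):
            """Apply a relative increase to a baseline absolute risk."""
            return baseline_risk * (1 + relative_increase)

        def cases_per_thousand(absolute_risk):
            """Express an absolute risk as expected cases in a group of 1,000 people."""
            return round(absolute_risk * 1000)

        baseline = 0.04           # ~4% baseline risk of colorectal cancer (cited above)
        relative_increase = 0.16  # ~16% relative increase for 50 g/day of processed meat

        new_risk = absolute_risk_after_increase(baseline, relative_increase)  # 0.0464, about 4.6%

        print(cases_per_thousand(baseline))  # 40 cases per 1,000 people without the exposure
        print(cases_per_thousand(new_risk))  # 46 cases per 1,000 people with the exposure

    Running the sketch reproduces the comparison above: roughly 40 versus 46 cases per 1,000 people.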

    If the risk is easier to understand, the tradeoff is easier to consider. Some people will think the tradeoff of increased risk for bacon consumption is worth making; some people won’t. But it’s difficult to begin to understand the tradeoff without understanding the absolute risk change.

    By the way, the absolute risk reduction from many narrow dietary changes (like those offered in fad diet books or over-sensationalized headlines) will be much smaller, and in many cases effectively zero.

    In addition to supporting inferences with better framing of risk information, communicators must also be prepared to deactivate emotion. Fear and disgust are common emotional reactions to food technologies and additives. Activists often try to trigger such emotions, frequently by taking advantage of chemophobia (Saleh 2019).

    In 2018 the activist organization Friends of the Earth, Washington, D.C., produced a report called “Gene Editing in Agriculture Poses New Risks to Health, Environment.” In it, they say that “several new studies [are] revealing genetic havoc as a result of gene editing” (Friends of the Earth 2018).

    Genetic havoc sounds really bad. It sounds like something that maybe only humans could cause. But biologists would argue that this kind of genetic change is a natural process that occurs in all genomes because genomes are dynamic. Such “havoc” is therefore natural, and gene editing is not making it meaningfully worse. But the activists want to give readers a visceral, emotional reaction to this story about gene editing, and a word like “havoc” does that very nicely. Imagine how your clients or audiences would react to seeing a claim like that. The claim could easily activate disgust or fear, and once those strong emotions are there, the details may no longer matter. The context, and the claim’s lack of relevance, are lost in the emotional reaction that the word “havoc” provokes.

    There are many evidence-based ways to help support better inference by overcoming, or at least diminishing, the emotion-based reasoning that so often occurs in the realm of food. One of my favorites is the use of conversion stories—that is, the personal testimonial of someone who has changed his or her mind on an issue. In a recent study, for example, researchers used video clips of the environmentalist Mark Lynas talking about his own “conversion” from a genetic modification (GM) crop opponent to an advocate (Lyons et al. 2019).

    Lynas, a former Greenpeace activist, began reading about the science of GM, and seeing that the preponderance of evidence suggested that these were useful technologies that could be used safely, he began to change his view and eventually began advocating for them. The study’s lead author notes: “People exposed to the conversion message rather than a simple pro-GM message had a more favorable attitude toward GM foods . . . The two-sided nature of the conversion message—presenting old beliefs and then refuting them—was more effective than a straightforward argument in favor of GM crops.”

    Conversion messages help audiences connect to the speaker’s emotional sensibility because that person embodies trust and shared values.

    5. We Have to Support Better Self-Reflection.

    People often don’t know how much they don’t know. Scientists are trained to carefully articulate what they know and what they don’t. We do this when we state our research questions in the context of past research, when we discuss the limitations of our studies, and when we discuss next steps in a research program to address open questions. But these practices require science culture and science training. They require self-reflection. This kind of self-reflection is not something people just do all the time, naturally, unprompted.

    Personal testimonials from someone who has changed his or her opinion about a controversial topic like GMOs can be an effective approach to changing others’ opinions. © Михаил Руденко/iStock/Getty Images Plus

    But there are ways to prompt self-reflection and to help people see gaps in their knowledge. One approach investigated by Phil Fernbach and his colleagues is to encourage people to give explanations (Fernbach et al. 2013). The idea is that most of us are prone to knowledge illusions—beliefs that we understand things just because we are familiar with them. Consider the case of the bicycle. Or a zipper. Do you really know how they work? For most of us, the familiarity of using these items, and the fact that humans have been producing them for decades, lulls us into thinking that each of us actually knows how the object works. In the case of the bicycle, we get the impression that we know the mechanism by which pedaling leads to forward movement. But we don’t really know the mechanism. And we don’t realize that we don’t know until we try to actually explain it.

    In food media, we commonly see claims about products that boost metabolism. Many people believe such claims. Inviting credulous consumers to explain how a metabolism booster works could, appropriately, reduce their confidence that such “boosters” are effective. How exactly does it work? How does one product affect the many, many processes that constitute “metabolism”? Most people would find it hard to speak confidently on those questions.

    Another approach to prompting self-reflection is simpler, though admittedly the evidence for its effectiveness is anecdotal. Annie Duke, a former professional poker player, suggests asking people, “How sure are you?” Duke explains: “Instead of asking, ‘Are you sure?’ Try asking, ‘How sure are you?’ ‘Are you sure?’ is a yes or no question. It demands unreasonable certainty. ‘How sure are you?’ allows for shades of gray. It says uncertainty is okay. How often in a day do you casually ask, ‘You sure?’” (Duke 2019).

    This tweak can nudge people to reevaluate the sources of their opinions on their own. It’s a nice option when you’re looking for a quick (and gentle) way to push back.

    These aren’t silver bullets, of course. People don’t have to answer questions. They can walk away from discussions. They can change the subject. But many consumers are sincere and do want to know the reality. Finding ways to respectfully engage them in self-reflection can provide the invitation they need to identify their knowledge gaps.

    6. A New Approach to Communication

    Critical thinking and scientific thinking have developed through centuries of human culture and activity, and millions of incremental improvements in our methods and institutions. Humans born into the twentieth and twenty-first centuries do not simply absorb those thinking styles by some kind of osmosis. We are born with the same brains that people were born with millennia ago. Those brains are capable of critical thinking, but it is not some kind of natural process that we default into, effortlessly.

    Critical thinking isn’t easy. Communicators must always remember this. We must help people think critically and not merely give them the information that we think good critical thinkers should have. We must find ways to support what is at the heart of scientific and critical thinking: sound reasoning and humble self-reflection.

    REFERENCES

    Aubrey, A. 2015. “Bad Day For Bacon: Processed Meats Cause Cancer, WHO Says.” The Salt, Oct. 26. https://www.npr.org/sections/thesalt/2015/10/26/451211964/bad-day-for-bacon-processed-red-meats-cause-cancer-says-who.

    Duke, A. 2019. “Twitter Post.” Twitter, March 18. https://twitter.com/AnnieDuke/status/1107645252817424385.

    Fernbach, P. M., N. Light, S. E. Scott, et al. 2019. “Extreme Opponents of Genetically Modified Foods Know the Least but Think They Know the Most.” Nature Human Behaviour 3. https://doi.org/10.1038/s41562-018-0520-3.

    Fernbach, P. M., T. Rogers, C. R. Fox, et al. 2013. “Political Extremism Is Supported by an Illusion of Understanding.” Psychological Science 24(6). https://doi.org/10.1177/0956797612464058.

    Friends of the Earth. 2018. “New Report: Gene Editing in Agriculture Poses New Risks to Health, Environment.” Friends of the Earth, Sept. 12. https://foe.org/news/new-report-gene-editing-agriculture-poses-new-risks-health-environment/.

    Gendler, T. S. 2008. “Alief and Belief.” The Journal of Philosophy 105(10): 634–663.

    Gonzalez, R. 2018. “Don’t Want to Fall for Fake News? Don’t Be Lazy.” Wired, Nov. 9. https://www.wired.com/story/dont-want-to-fall-for-fake-news-dont-be-lazy/.

    Kahneman, D. 2011. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

    Lyons, B. A., A. Hassel, M. Tallapragada, et al. 2019. “Conversion Messages and Attitude Change: Strong Arguments, Not Costly Signals.” Public Understanding of Science 28(3). https://doi.org/10.1177/0963662518821017.

    Riis, J., B. McFadden, and K. Collins. 2019. “Thinking Critically About Nutrition.” Today’s Dietitian, August. https://www.todaysdietitian.com/newarchives/0819p36.shtml.

    Rozin, P., L. Millman, and C. Nemeroff. 1986. “Operation of the Laws of Sympathetic Magic in Disgust and Other Domains.” Journal of Personality and Social Psychology 50(4): 703–12. https://doi.org/10.1037/0022-3514.50.4.703.

    Saleh, R., A. Bearth, and M. Siegrist. 2019. “‘Chemophobia’ Today: Consumers’ Knowledge and Perceptions of Chemicals.” Risk Analysis 39(12). https://doi.org/10.1111/risa.13375.

    About the Author

    Jason Riis, PhD, is the founder, CEO, and chief behavioral scientist of Behavioralize, Philadelphia ([email protected]).
