From QAnon to anti-vaccination, scholar Andy Norman says we face a scourge of mind parasites

Author Andy Norman talks about his new book, "Mental Immunity," and how noxious ideas spread virulently

Salon/September 24, 2021

We are in the midst of an ignorance outbreak. QAnon's account of global politics, despite being both irrational and implausible, has enraptured thousands. Specious anti-vaccine rhetoric abounds even among the educated. Everywhere we turn, bad ideas are spreading like a, well, virus.

Author Andy Norman takes that problem literally. In his provocative book "Mental Immunity: Infectious Ideas, Mind-Parasites, and the Search for a Better Way to Think," the director of the Humanism Initiative at Carnegie Mellon University and founder of the Cognitive Immunology Research Collaborative reveals a growing scourge — and explores what we can do to fend it off. As he explains why thinking for yourself is a poor strategy and why everyone is not, in fact, entitled to their own opinion, Norman offers a compelling case for a regimen of mental resistance. Salon talked to the author recently about how to survive an era where misinformation is more common than the flu, and why "humility is a really important, under-appreciated cognitive virtue."

As always, our conversation has been edited and condensed for clarity.

When you say that bad ideas are mind parasites, you mean just that. Tell me about that, because it is hard to get our heads around, especially for those of us with a philosophical bent.

It's certainly stretching the concept of a parasite in a new direction, right? Words evolve as people find analogies or metaphors that are useful. One way to get used to the idea is to acknowledge that I'm stretching the concept of parasite into a new domain. It doesn't have to change our concept of bad idea; it's another way to look at it. Parasites require a host, bad ideas require a host. Parasites often compromise the health of their hosts. Bad ideas can also compromise the mental wellbeing of their hosts. Parasites can leap from body to body. Bad ideas can leap from mind to mind.

Parasites can induce behavior that spreads the parasite to other bodies, such as a sneeze, and bad ideas can induce their hosts to spread them by, for example, proselytizing or sharing things on Facebook. If you go through and create a list of attributes that parasites have, bad ideas have all of the relevant properties. It makes sense to think of bad ideas as mind parasites, and it's illuminating in ways that give us new strategies for combating them.

When we think about inoculating the mind — how do we do that when so many of us really seem to be very adamantly anti-vaccine?

Convincing the anti-vaxxers to get their minds inoculated is of course probably the hardest problem of them all. One of the things the science says is that it's easier to prevent a mind infection than to cure one. If you can teach somebody critical thinking when they're young, hopefully you don't have to do any cult deprogramming later in their lives. You don't have to try to rid your crazy uncle of QAnon beliefs if that crazy uncle learned critical thinking young and never has to be deprogrammed.

Psychologists have been studying mind inoculation for about fifty years now. Exposure to certain kinds of arguments and objections can strengthen the mind's resistance to bad ideas. It can also strengthen our mind's resistance to good ideas and arguments. In other words, you can manipulate a mental immune system in such a way as to make it reject good arguments. Propagandists and demagogues use such strategies almost instinctively. The famous logical fallacy of straw man argumentation works because it sort of hijacks the mind's immune system and induces it to overreact to a good argument.

One of the things I found interesting in the book is this idea of a social immunity building. In the same way that a collective ideology can be dangerous, collaborative understanding can strengthen us.

Jonathan Rauch's new book, "The Constitution of Knowledge," is very good on this. He points out that the best exemplars we have in our culture of knowledge construction are all profoundly collaborative. Good investigative reporting often involves many people collaborating. Research often requires peer review and an entire community of researchers checking results. The best thinking is deeply collaborative. That means that the advice we often give young people today, "Think for yourself," is terribly ambiguous and not necessarily good advice. We worry that when people think together, they'll slip into groupthink, and that's why we run around telling young people to think for themselves. What we now need to do is teach people how to think together and to help each other identify each other's mind parasites. No one of us can do enough to keep our minds healthy unless we learn how to collaborate with others who see things differently and can spot things we miss.

And that is tricky, because you talk about the Lake Wobegon effect, where we all think we're above average. How do we get unstuck from that, and admit, "I don't know everything"?

Humility is a really important, under-appreciated cognitive virtue. One way to cultivate it is to engage in Socratic dialogue. Socrates, the ancient Greek philosopher, would pose questions that would raise people's awareness of the gaps in their understanding. The more aware you become of the gaps in your understanding, the more humble you'll be, and the wiser you become. We need to learn a form of discourse, a form of conversation and a mode of idea testing, that gives much more time and attention to the questions and that fixates less on easy answers. When you spend time considering other people's doubts and questions and challenges, those are essentially the mind's antibodies.

Think of doubts as the antibodies of the mind, and your mind will generate some of them. Other people's minds can generate other questions, other doubts. By paying attention to them in a conversation that's full of clarifying questions, that turns out to be one of the most powerful mind inoculants of all. It requires getting the hang of a really fascinating mode of discourse, which is kind of philosophical. It places a great deal of emphasis on clarifying and trying to, together, figure out the shape of the gaps in our understanding and starting to get our heads around what it would take to fill those gaps.

It requires being okay with the discomfort of maybe being wrong, and not getting the high that we get from our confirmation biases.

The great British philosopher Bertrand Russell said one of the premier virtues of philosophy is that it helps us become comfortable with uncertainty. There's a great deal of uncertainty in the world, and especially in this breakneck society we have right now. Once you learn how to be comfortable with uncertainty, you can navigate that uncertainty primarily with questions. The fast pace of modern life can make you really anxious. And when you're anxious, you don't think clearly.

I'm wondering where you see the place of faith and spirituality in this really strong interrogation. When you say a belief is reasonable if it can withstand challenges, where do people of faith fit into that?

I have both a cautionary word for people of faith and a word of encouragement. When people use the word "faith" to excuse irresponsible believing, believing that has no basis, that weakens their mind's immune system. You can accustom yourself to believing things without support or believing things without empirical validation or believing by simply brushing aside questions. When your mind becomes comfortable with that mode, your resistance to mind parasites declines. There's research out of Canada now that says that if you raise a child to accept things on faith, they're more likely to become a conspiracy theorist later in life. Many forms of religious faith are very problematic from a mental health standpoint. They provide a kind of superficial comfort at the expense of our long-term ability to spot and remove bad ideas.

But many people use the word "faith" in a somewhat different way that I want to express sympathy and approval of. That's to say, we need hope. There's something profoundly admirable about being resolutely hopeful, being determined, just being willfully hopeful. I think it's great to be willfully hopeful. I think it's bad to indulge in willful believing. Depending on what we mean by faith, it's either a good or a very bad thing. Faith understood as resolute hopefulness is a wonderful thing; faith understood as willful belief, I think, is profoundly harmful to our collective prospects.

One of the important things that I took from this book was this idea that our beliefs are not private. This both-sides-ism, this "everyone is entitled to their own opinion," doesn't help and is actively toxic. You say there are certain worldviews that are just toxic.

When people start doing things that harm others, I think we start to develop grounds for objecting to those behaviors, and belief is just like that. If you believe things that don't harm others, fine. But if you believe things that indirectly do harm others, they become a matter of public concern. So if I believe that vaccines are the spawn of the devil and refuse to vaccinate my kids, my kids end up being harmed. My beliefs can harm others. If I believe irresponsible things and end up casting my vote for a would-be authoritarian leader, I end up harming the entire public.

When you dwell on examples of beliefs that harm others, you realize that it's perfectly irresponsible to indulge the idea that everyone is entitled to their beliefs. It lets our beliefs drift away from what's genuinely helpful and moral, but it also lets them drift away from reality and from each other. And when our worldviews drift too far apart, as we're seeing now, it gets really hard to have productive conversations and to keep this shared experiment together. So for social reasons, for truth and honesty reasons, and for moral reasons, we need to get rid of the idea that everyone is entitled to their opinion, and instead adopt a more public-spirited concept of belief.

How do we then translate that to our combative relatives and neighbors in a way that doesn't escalate the polarization?

The same way your body's immune system can overreact and attack your body, the mind's immune system can overreact and attack good information, even in your own mind. The trick is learning how to calm your mind so you don't feel defensive, so that we can actually dialogue together in fruitful ways. I think that the emerging science of mental immunity can help us learn how to maintain that calm demeanor, where we aren't made defensive by other people's arguments and reasons. The problem is that when we get defensive, we stop listening to one another. We stop hearing each other's reasons and learning from them.