The survival value of wrong beliefs

NPR has an alarming piece on school science.

She tells her students — like Nick Gurol, whose middle-schoolers believe the Earth is flat — that, as hard as they try, science teachers aren’t likely to change a student’s misconceptions just by correcting them. Gurol says his students got the idea of a flat planet from basketball star Kyrie Irving, who said as much on a podcast.

“And immediately I start to panic. How have I failed these kids so badly they think the Earth is flat just because a basketball player says it?” He says he tried reasoning with the students and showed them a video. Nothing worked.

“They think that I’m part of this larger conspiracy of being a round-Earther. That’s definitely hard for me because it feels like science isn’t real to them.”

For cases like this, Yoon suggests teachers give students the tools to think like a scientist. Teach them to gather evidence, check sources, deduce, hypothesize and synthesize results. Hopefully, then, they will come to the truth on their own.

This called to mind a post from way back, in which I considered reasons for the survival of antiscientific views.

It’s basically a matter of evolution. When crazy ideas negatively affect survival, they die out. But evolutionary forces are vastly diminished under some conditions, or even point the wrong way:

  1. Non-experimental science (reliance on observations of natural experiments; no controls or randomized assignment)
  2. Infrequent replication (few examples within the experience of an individual or community)
  3. High noise (more specifically, low signal-to-noise ratio)
  4. Complexity (nonlinearity, integrations or long delays between cause and effect, multiple agents, emergent phenomena)
  5. “Unsalience” (you can’t touch, taste, see, hear, or smell the variables in question)
  6. Cost (there’s some social or economic penalty imposed by the policy implications of the theory)
  7. Commons (the risk of being wrong accrues to society more than to the individual)

These are, incidentally, some of the same circumstances that make medical trials difficult, such that most published findings are false.

Consider the flat earth idea. What cost accrues to students who hold this belief? None whatsoever, I think. A flat earth model will make terrible predictions of all kinds of things, but students are not making or relying on such predictions. The roundness of the earth is obviously not salient. So really, the only survival value that matters to students is the benefit of tribal allegiance.

If there are intertemporal dynamics, the situation is even worse. For any resource or capability investment problem, there’s worse-before-better behavior. Recovering depleted fish stocks requires diminished effort, and less to eat, in the near term. If a correct belief implies good long-run stock management, adherents of the incorrect belief will have an advantage in the short run. You can’t count on selection weeding out the “dumb tribes” for planetary-scale problems, because we’re all in one.
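
A toy illustration (my own sketch, not part of the original post): a logistically growing fish stock harvested at two constant effort levels. The parameter values are arbitrary assumptions; what matters is the qualitative pattern of worse-before-better when effort is cut.

```python
# Minimal stock-and-harvest sketch (illustrative assumptions, not a calibrated model).

def simulate(effort, stock=200.0, capacity=1000.0, growth=0.3, years=40):
    """Yearly catch from a logistic fish stock harvested at a constant effort
    (catch = effort * stock each year)."""
    catches = []
    for _ in range(years):
        catch = effort * stock
        stock += growth * stock * (1.0 - stock / capacity) - catch
        stock = max(stock, 0.0)
        catches.append(catch)
    return catches

hard = simulate(effort=0.30)   # keep fishing hard
eased = simulate(effort=0.10)  # cut effort and let the stock rebuild

for year in (0, 5, 20, 39):
    print(f"year {year:2d}: hard-effort catch = {hard[year]:6.1f}, "
          f"eased-effort catch = {eased[year]:6.1f}")
```

In the early years the reduced-effort fleet catches less; a couple of decades in, it catches more, while the hard-fishing fleet has run its stock down. That early gap is exactly the window in which the wrong belief looks like the winning one.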

This seems like a pretty intractable problem. If there’s a way out, it has to be cultural. If there were a bit more recognition of the value of making correct predictions, the halo from that would spill over to diminish the attractiveness of silly theories. That’s a case that ought to be compelling for basketball fans. Who wants to play on a team that can’t predict what the opponents will do, or how the ball will bounce?

3 thoughts on “The survival value of wrong beliefs”

  1. Tom, couldn’t agree more on the value of critical thinking in education. At the same time, it’s worth noting a recursion in this story: namely, that the challenges of replicability also apply to studies of the “backfire effect” — i.e., Ms. Yoon’s assertion that “as hard as they try, science teachers aren’t likely to change a student’s misconceptions just by correcting them.” Context matters.

    A recent interview (“Walking Back the Backfire Effect”) with @BrendanNyhan (a media/political context rather than an educational/physical-science one):
    http://www.wnyc.org/story/walking-back-backfire-effect/
