Changing Minds
Most scientists’ and programmers’ implicit model of belief is roughly Bayesian: when someone who believes something about the world receives new evidence, they update their beliefs in proportion to how well that evidence supports each hypothesis. This model is (mostly) accurate in domains that people aren’t invested in emotionally, but it fails in predictable ways for beliefs that are tied to group membership.
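To make "roughly Bayesian" concrete, here is a minimal sketch of a single Bayes'-rule update; the function name and the probabilities are illustrative, not drawn from any particular study:

```python
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Posterior probability of hypothesis H after observing evidence E.

    prior          -- P(H) before seeing the evidence
    p_e_given_h    -- P(E | H), how likely the evidence is if H is true
    p_e_given_not_h -- P(E | not H), how likely it is if H is false
    """
    numerator = p_e_given_h * prior
    total_evidence = numerator + p_e_given_not_h * (1 - prior)
    return numerator / total_evidence

# Someone 70% confident in a claim sees evidence twice as likely
# under the claim as under its negation:
posterior = bayes_update(prior=0.7, p_e_given_h=0.8, p_e_given_not_h=0.4)
print(round(posterior, 3))  # → 0.824: confidence rises, but only modestly
```

An ideal Bayesian reasoner moves smoothly like this with every piece of evidence; the rest of this section is about the ways real people, on identity-laden questions, do not.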
Research in social psychology has established that beliefs about contested political and social issues function primarily as signals of group identity rather than as conclusions from evidence. Holding the wrong belief does not just mean being misinformed: it puts you outside the group, so updating the belief means leaving that group. The social cost of updating is therefore often higher than the mental cost of the cognitive dissonance incurred by staying wrong.
The backfire effect is the most striking manifestation of this pattern. In a range of experimental settings, presenting people with accurate evidence that contradicts a strongly held belief does not lead them to update that belief. Instead, the person exposed to the corrective information often becomes more confident in the original belief, not less.
Not every study of the backfire effect has replicated cleanly, and its size and generality are still debated, but the underlying mechanism, reasoning that protects identity-defining beliefs, is robust.
Motivated reasoning is a related and better-established phenomenon. People do not evaluate arguments neutrally. Instead, they are significantly better at identifying flaws in arguments that lead to conclusions they dislike than in arguments that lead to conclusions they support. For example, a trained scientist who is also a gun owner will scrutinize studies on gun violence more skeptically than studies on climate change if their social circle treats the former as identity-threatening but not the latter. This isn’t dishonesty in the ordinary sense because the person doesn’t know they’re doing it. They believe their reasoning is rigorous because the biased scrutiny is real scrutiny: the flaws they find are often genuine flaws. But the asymmetric attention means that they have reached a conclusion before their evaluation begins.
A colleague once told me that people want data, but believe stories. This makes sense in light of motivated reasoning: data provides cover for a decision already reached on other grounds, while stories transmit the emotional and social context that actually drives belief change. It’s also why the most effective public health campaigns don’t focus on presenting statistics; instead, they present specific, named people in specific situations.
This dynamic is also why industries that want to prevent people from changing their beliefs fund think tanks that produce reports full of data. The data itself is not meant to persuade; what matters is the appearance of rigor, which mimics the form of legitimate evidence without the substance.
This pattern occurs again and again in the history of regulatory resistance. When an industry faces regulatory pressure, its most effective response isn’t to dispute the facts directly, but to fund alternative research and elevate credentialed contrarians in order to give the appearance of genuine scientific controversy. The goal isn’t to win the argument, but to prevent the argument from ending. As long as the question appears open, the status quo remains the default, and regulatory action requires legislators to take sides amid a lack of consensus that the industry has deliberately manufactured. The tobacco industry did this for forty years; the fossil fuel industry has been doing it for the last thirty, and now big tech companies are borrowing their methods (and in some cases hiring the same PR firms).
If we want AI, social media, and the software industry in general to be regulated in meaningful ways, finding and presenting evidence of harm is, by itself, a naïve strategy. Instead, we need to change the social context in which beliefs are held: finding trusted messengers within the relevant communities, reframing the issue so that updating does not require identity betrayal, and working through social networks rather than through arguments. This is not manipulation: it is taking the actual psychology of belief seriously.