~ Shmalpha ~
I like the You Are Not So Smart podcast. It's one of those shows that makes you feel smarter after listening. Then you forget most of it.
First podcast:
We don’t treat all of our beliefs the same.
The research shows that when a strong-yet-erroneous belief is challenged, yes, you might experience some temporary weakening of your convictions, some softening of your certainty, but most people rebound from that and not only reassert their original belief at its original strength, but go beyond that and dig in their heels, deepening their resolve over the long run.
Psychologists call this the backfire effect, and this episode is the first of three shows exploring this well-documented and much-studied psychological phenomenon, one that you’ve likely encountered quite a bit lately.
In this episode, we explore its neurological underpinning as two neuroscientists at the University of Southern California’s Brain and Creativity Institute explain how their latest research sheds new light on how the brain reacts when its deepest beliefs are challenged.
Second podcast:
If you try to correct someone who you know is wrong, you run the risk of alerting their brain to a sort of existential, epistemic threat, and if you do that, when that person expends effortful thinking to escape, that effort can strengthen their beliefs instead of weakening them.
In this episode you'll hear from three experts who explain why trying to correct misinformation can end up causing more harm than good.
Third podcast:
If dumping evidence into people’s laps often just makes their beliefs stronger, would we just be better off trying some other tactic, or does the truth ever win?
Do people ever come around, or are we causing more harm than good by leaning on facts instead of some other technique?
In this episode we learn from two scientists how to combat the backfire effect. One used an ingenious research method to identify the breaking point at which people stop resisting and begin accepting the fact that they might be wrong. The other literally wrote the instruction manual for avoiding the backfire effect and debunking myths using the latest psychological research into effective persuasive techniques.
So then what happened?
The backfire effect seemed to disappear when trying to repeat the experiments.
Its popularity as an explanation might even have been almost entirely the result of a complex question which was then simplified by journalists.
Fourth podcast:
Last year on this show, we did three episodes about the backfire effect, and by far, those episodes were the most popular we’ve ever done.
In fact, the famous web comic The Oatmeal turned them into a sort of special feature, and that comic of those episodes was shared on Facebook a gazillion times, which led to stories about the comic in popular media, and then more people listened to the shows, and on and on it went. You can go see it at The Oatmeal right now at the top of their page. It's titled "You're not going to believe what I'm about to tell you."
The popularity of the backfire effect extends into academia. The original paper has been cited hundreds of times, and there have been more than 300 articles written about it since it first came out.
The backfire effect has a special allure to it because, on the surface, it seems to explain something we've all experienced: when we argue with people who believe differently than we do, who see the world through a different ideological lens, they often resist our views and refuse to accept our way of seeing things, and it often seems like we do more harm than good, because they walk away seemingly more entrenched in their beliefs than before the argument began.
But since those shows last year, researchers have produced a series of new studies into the backfire effect that complicate things. Yes, we are observing something here, and yes, we are calling it the backfire effect, but everything is not exactly as it seems, and so I thought we should invite these new researchers on the show and add a fourth episode to the backfire effect series based on what they've found. And this is that episode.
SO?
It seems that we tend not to change our attitudes, and there is a potential for backfire in some rare cases.
New information can be absorbed. Corrections to factual inaccuracies can change our factual beliefs.
But will these new beliefs change our behaviour? Not in the way they would if we were perfectly rational beings.
Man as the rationalising animal.
The roadmap for development of our attitudes, it seems, goes a little like:
Regional cultural norms
--through normative influence--> tell us what we should or should not be doing
--informs--> values, attitudes, and opinions
--which we protect and justify with--> factual beliefs
--through--> cherry-picking facts and evidence
Educating a person about specific, erroneous factual beliefs does not have enough force, usually, to go back up that chain and change values and attitudes and behaviour.
So, people can change their false factual beliefs. But persuasion is hard, so "the backfire effect" is blamed for failed persuasion.
Values and opinions relating to identity, self, and tribe are particularly stubborn to shift.
Trump supporters are capable of agreeing that Trump lied about a specific fact, such as unemployment in some states (he said it was rising when it was falling), but they won't change their attitude toward Trump, instead finding other factual beliefs to protect and justify the attitude.
The anti-vaxxer will still tend not to vaccinate their kids, even though they know that there is no connection to autism. In fact, in the vaccination study, the backfire wasn't about beliefs, it was about attitudes. The backfire still happened: intention to vaccinate dropped from 70% to 45%.
Factual beliefs, it seems, are cheap and expendable when protecting our opinions, especially when that opinion is formed out of a regional, self-identity value.
Or at least that is what I got out of this series.