Facts don't change minds
There's an article (it's rather long, so below are some excerpts) that says that when people are exposed to facts, they become even more entrenched in their wrongheaded beliefs rather than changing their minds. Many of us have seen this in our daily lives, especially when we are accused of being the one at fault when we are in fact the victim.
But the article also says that the most sophisticated thinkers make that same mistake.
I'm wondering whether or how this applies to Aspies. Are we more logical and less likely to be swayed by misinformation? Or more rigid and therefore more likely to stick to our assumptions, even if they are wrong?
Link to article.
In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds... They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
A 2006 study showed that sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong.
Yes, you have to always be aware of your own cognitive biases, or you're in danger of not being able to correct your mistakes. We already know that people pay much more attention to information that supports conclusions they've already drawn--which makes it difficult to change their opinions.
I think your title's a bit misleading, though: "Facts don't change opinions" is not quite right; "Facts are extremely inefficient at changing opinions" is probably closer to the truth.
It's possible, of course. Expose yourself to enough facts, while aware of your own tendency to ignore the ones that don't support your current opinions, and you become capable of identifying your own errors. If I'm not mistaken, the people who did the study in question only tested their subjects once--they didn't test what happened after those people had read a book or two, or done intensive research, on the ideas they were initially opposed to.
It's actually deciding to go and do the research that's the difficult part.
If you think about it, though, people do this for a reason. It's adaptive to keep the same idea about something until something really impressive causes you to change it. If you're forever thrown into cognitive dissonance by every new fact that comes along, you spend too much time thinking, not enough time doing, and eventually a lot of time in a saber-toothed tiger's belly.
This is why we need the scientific method. Done right, it helps us overcome our own cognitive biases and pick out the ideas most likely to be fact.
_________________
Reports from a Resident Alien:
http://chaoticidealism.livejournal.com
Autism Memorial:
http://autism-memorial.livejournal.com
Well, I think it could go either way with Aspies. I think I am more likely to go with the facts, whereas I know someone with AS who is extremely fixed in their views, including prejudices, and won't change their mind despite the facts.
It's the same old story: you can't generalize people with AS, because we all have our own unique personalities.
I, for example, have a liberal political cognitive bias. Fox News could report something that is 100% factually accurate and I won't believe them until I hear the exact same news story on CNN.
That's a better title (more...factual). I think facts are like raindrops. If you feel just one it doesn't change your conviction that you are dry and it isn't raining. If you feel 10, you might look up at the sky and be open to changing your conviction but still not change it yet. If you feel 1,000 you reverse your position entirely and are now convinced that it's raining and you are wet.
It's actually deciding to go and do the research that's the difficult part.
So very true. And I think you need some emotional investment in the subject at hand to go do the research. Somebody may tell me some absolutely true thing that sounds preposterous to me and I just ignore them. Perhaps if I invested 20 hours researching I'd discover they are right. But if I don't really care it isn't really worth the time for me to research it.
Exactly! Over the years, I have probably heard plenty of things that sounded preposterous to me that I would have discovered were true if I had spent X hours researching them. It would be maladaptive to believe every single thing somebody tells me on the chance that it may be true. It would also be maladaptive for me to invest lots of time researching every single thing somebody tells me just to find out if it's true. So I only research the things that I believe are important. Due to this, I have probably disbelieved a large number of true facts over my life just for the sake of mental efficiency. But when it really matters, I discount nothing, no matter how preposterous it sounds on its face. When I served on a jury, I listened with an unusually open mind to some things that I would have discounted as preposterous outside of a courtroom. But a courtroom is no place for the efficiency of reflexive disbelief. The jury system depends on the majority of people being able to do this when it really matters.
Yes! That's why one experiment is never enough. Repeatability is mandatory for a reason.
But if you did firmly believe that facts change minds, would you change your mind the instant you read this study? (My attempt at a paradox.)

If I thought I already knew the result I wouldn't be as interested in reading the study so I probably wouldn't read it and see that I was wrong. At least I wouldn't read it *carefully* enough to discover my factual misconceptions.
I'm also not going to read a long article arguing both sides of the creationism vs. evolution debate, since I've already made up my mind that creationism is illogical nonsense. I'm more interested in actually reading stuff carefully when I feel it *could* potentially change my opinion.
It's actually deciding to go and do the research that's the difficult part.
I agree it would be too cumbersome to go and research every single thing, and that it would be maladaptive since one would be constantly chasing his/her tail instead of getting things done.
And I hope it would be true that with enough research anyone would be able to change his/her position. But according to the studies in the article, that doesn't happen, not even with people who are knowledgeable enough to get 90% of their facts right and are presumably the kind of people who are willing to research quite a bit. (I don't know how reliable the study is, but it seems plausible, just from personal experience.)
Do you really think most jurors usually do this? Look at how many people are convicted or acquitted on just a gut feeling, and how much effort lawyers put into appealing to the jury's emotions and prejudices. To cite an example everyone is familiar with, look at the O.J. Simpson trial.
I think I don't have as much faith in humans as the two of you.

This is something that has been... somewhat known for some time. Just look at how people think with regard to conservative/liberal ideas; it's not a matter of which facts are correct but of whose side they help. It's the emotional appeal versus the factual appeal, and more often than not, the emotional appeal is what wins even if it's incorrect.
It's been a long-standing human failing, in my opinion.
_________________
Current obsessions: Miatas, Investing
Currently playing: Amnesia: The Dark Descent
Currently watching: SRW OG2: The Inspectors
Come check out my photography!
http://dmausf.deviantart.com/
I can be easily convinced to change my mind about something if the inciting argument is rational. If somebody presents me, or I present myself, with a good argument completely in opposition to something I believed previously, I cannot ignore it. If such a thing happens, I must reserve judgment on the matter, stripping my previous views of their presumed certainty until I can reach a conclusion. Facts change my mind.
This very same article was coincidentally posted as a link on the Facebook profile of a friend of mine yesterday. As others have said, this phenomenon is not news to me. I've noticed the human need to stick to one's guns regardless of empirical evidence to the contrary. I've also noticed that the propensity for increased entrenchment when challenged is common.
Personally, I think aspies would be less likely to succumb to this because we are less prone to let emotions colour our cognitive reasoning. But, we're still human and it's a good thing to be aware of. Try pointing it out to most NTs though and you are liable to be in for a frustrating discussion :S
Callista made a valid observation about the utility of entrenchment for survival. We can't spend all our time questioning everything, else we become paralyzed.
The answer is to also be adaptive and able to change our beliefs in the face of new evidence.
Besides the empirical scientific evidence that protects us in the physical world, there is another area in which I see entrenchment, and it has more to do with what this study was testing. The reason for stubborn entrenchment against facts in these cases has less to do with physical survival than with ego defense mechanisms.
Most people make up their minds (opinions) about the world by the time they graduate high school, and some will change while going to college or university. Most, though, continue to hold the biases they formed by age 17, and most, in my experience, will hold those biases for a lifetime unless they do a considerable amount of psychological work on self-awareness.
Why, for instance, do people believe that more people are on welfare than need to be and are taking advantage of it, when the facts say otherwise? They believe this to defend their egos. Pretty much every sociological "belief system" is an ego defense.
To change a person's opinion or belief, you can't just feed them facts; that only makes them defensive.
Instead you have to "frame the issue" or put it into a context that makes them WANT to change their opinion.
Only they can change their minds, not you. So for them to do so, they have to want to.... You can only help get them to want to do it.
Sadly, I am not good at doing this... If you want to know how to do it, the best people to ask are preachers, self-help gurus, con men, and people of similar ilk. They're masters of it. (Though I'm not 100% sure the technique is backward compatible... these people work at making you believe lies, not the truth; there are inherent differences.)
But if you're an aspie, give up trying; it's not in us to do it. Just record the facts you know, and you can smugly smile and say "I told you so" later.
This is so true. As for others of "similar ilk", you could add lawyers, politicians, salesmen, news media, and advertisers to that list. They are all masters of the lie. In fact, advertising is probably the best example of influence that changes people's opinions. As for the difference between making someone believe a lie and making them see the truth, I think it depends on the person, the issue, and the defenses involved, but the same machinations come into play.
The way con men and advertisers work is to pander to your deepest insecurities and most ingrained beliefs of who you are - your sense of self - and then play you. Thus, for you to prove you are who you say you are, you have to end up "buying" into their lie.
One successful example I can think of that's a positive in this regard is the anti-drunk-driving campaign. Not that many years ago, few people looked twice at anyone getting behind the wheel after drinking. Now, after a couple of decades of advertising campaigns, most people believe that "Friends don't let friends drive drunk," which is true.
Groups like M.A.D.D. have changed people's opinions about what the right thing to do is by tugging on the image most people hold of themselves as a good friend and a responsible person. This has worked far better than any picture of a mangled car, facts about drunk driving, or statistics about deaths.
I'd say it was the fact that you can lose your car and your driver's license over ridiculously low levels of alcohol in your system, without even getting into an accident. The only way they changed people's minds was by putting a gun to their head.