This is why some people are so confident in their “facts”.
Why is it sometimes so hard to convince someone that the world is indeed a globe, or that climate change is actually caused by human activity, despite the overwhelming evidence?
Scientists think they might have the answer, and it has less to do with a lack of understanding and more to do with the feedback people are getting.
Getting positive or negative reactions to something you do or say has a greater influence on your thinking than logic and reasoning, the new research suggests – so if you’re in a group of like-minded people, that’s going to reinforce your views.
Receiving good feedback also encourages us to think we know more than we actually do.
In other words, the more sure we become that our current position is right, the less likely we are to take into account other opinions or even cold, hard scientific data.
“If you think you know a lot about something, even though you don’t, you’re less likely to be curious enough to explore the topic further, and will fail to learn how little you know,” says one of the team members behind the new study, Louis Marti from the University of California, Berkeley.
For the research, more than 500 participants were recruited and shown a series of colored shapes. As each shape appeared, the participants were asked whether it was a “Daxxy” – a word made up for these experiments.
The test takers had no clues as to what a Daxxy was or wasn’t, but they did get feedback after guessing one way or the other – the system would tell them whether the shape they were looking at qualified as a Daxxy. At the same time, they were also asked how sure they were about what a Daxxy actually was.
In this way the researchers were able to measure certainty in relation to feedback. The results showed that participants’ confidence was largely based on their last four or five guesses, not their performance overall.
The team behind the tests says this plays into something we already know about learning – that for it to happen, learners need to recognise that there is a gap between what they currently know and what they could know. If they don’t think that gap is there, they won’t take on board new information.
“What we found interesting is that they could get the first 19 guesses in a row wrong, but if they got the last five right, they felt very confident,” says Marti. “It’s not that they weren’t paying attention, they were learning what a Daxxy was, but they weren’t using most of what they learned to inform their certainty.”
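To make that pattern concrete, here is a minimal sketch – not from the study itself; the five-guess window and the 24-guess sequence are assumptions for illustration – comparing overall accuracy with a confidence score based only on the most recent guesses:

    # Illustrative sketch (not the study's actual model): compare overall
    # accuracy with a "recency-based" confidence computed from the last
    # few guesses. The window size (5) and the outcome sequence are
    # assumptions chosen to mirror the example Marti describes.

    def recency_confidence(outcomes, window=5):
        """Fraction of correct answers among the most recent `window` guesses."""
        recent = outcomes[-window:]
        return sum(recent) / len(recent)

    def overall_accuracy(outcomes):
        """Fraction of correct answers across every guess so far."""
        return sum(outcomes) / len(outcomes)

    # 1 = correct guess, 0 = incorrect guess.
    # First 19 guesses wrong, last 5 right, as in the quoted example.
    outcomes = [0] * 19 + [1] * 5

    print(f"Overall accuracy:         {overall_accuracy(outcomes):.2f}")   # 0.21
    print(f"Recency-based confidence: {recency_confidence(outcomes):.2f}") # 1.00

Under this toy scoring, a participant who was wrong 19 times in a row but right on the last 5 guesses would report near-total confidence, even though their overall accuracy is only about 21 percent – the mismatch the researchers observed.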
Recent feedback has more of an effect than accumulated evidence, the experiments showed, and that might apply in a broader sense too – whether you’re learning something new or trying to differentiate between right and wrong.
And while in this case the study participants were trying to identify a made-up shape, the same cognitive processes could be at work when it comes to echo chambers on social media or on news channels – where views are constantly reinforced.
“If you use a crazy theory to make a correct prediction a couple of times, you can get stuck in that belief and may not be as interested in gathering more information,” says one of the team, psychologist Celeste Kidd from UC Berkeley.
So if you think vaccinations are harmful, for example, the new study suggests you might be basing that on the most recent feedback you’ve had on your views, rather than the overall evidence one way or the other.
Ideally, the researchers say, learning should be based on more considered observations over time – even if that’s not quite how the brain works sometimes.
“If your goal is to arrive at the truth, the strategy of using your most recent feedback, rather than all of the data you’ve accumulated, is not a great tactic,” says Marti.
The research has been published in Open Mind.
Source: sciencealert.com