Campus Review Vol. 30 Issue 12 Dec 2020 | Page 23

believing fake news or information that they encounter online. One of those central mechanisms is the role of familiarity and repetition. One of the most robust findings in cognitive psychology in this area of belief formation is that when information is repeated over and over again, we're more likely to believe it.
So one of the things we do in the handbook is look at some of the different strategies that you can think about in terms of debunking misinformation, or protecting yourself from believing information that's not true or unreliable.
Some of the different approaches we talk about include warning people that they may encounter information that is false. If you have a look at the science of false news and the science of how people come to believe things are true, a simple warning as people are encountering misinformation, or just a slight nudge to remind them that some information might be true and some might be false, can reduce people's likelihood of showing this familiarity-based increase in belief.
What we know about misinformation is that it's particularly sticky. For example, let's say you fall sick and your suspicion is that it's because of something you ate recently at a restaurant. And I say to you, "Actually, Wade, it wasn't something you ate at a restaurant, it was something else that caused you to fall sick."
While you might update your beliefs about the reason you're sick, what we see in psychological research is that you'll still act in ways that are consistent with the information you initially believed.
So you might, for example, avoid a certain restaurant that you think caused you to be sick, even when you know that wasn't the real reason you fell ill. This is called the continued influence effect: misinformation can be sticky, and it's hard to correct.
When we talk about different ways that you can correct misinformation, one of the core ways is giving people a clear and easily comprehensible alternative explanation. We talk about making the truth sticky, so that the truth comes to mind quite easily and quite rapidly when people think about a particular topic.
People often make the mistake of doing what we call an internal search within a particular website. So, if I'm reading a story, I might make an assessment of whether it's true based on how consistent it is or how coherent the narrative is. But a much more effective way of assessing whether something is credible is doing what we call a lateral search. That's searching across a bunch of different highly credible websites to see whether there is consensus across websites, rather than within a particular story.
Related to that is the problem of 'echo chambers', because the information you're getting within a particular social circle is likely to solidify your existing beliefs rather than expose you to the breadth of a lateral search. If you search within your echo chamber, you're unlikely to encounter information that's going to correct the belief you already have in mind.
A lot of academics were involved in the process of writing this book. Was that difficult, or did it make it easier in some ways?

I was really curious about the process. The lead authors on the handbook went through psychological databases, looking at psychological research, and highlighted the academics who were working very closely in this area to find the experts on misinformation, and then they got us all together.
The first step was rating what we thought were reliable and important phenomena when it comes to the science of debunking. And then, after we made those ratings, we had a couple of Zoom meetings, which I thought was quite interesting, because these are people who take different positions in the scientific literature. Scientists don't always agree.
So this was a very interesting process of getting everybody at the table to nut out what the important mechanisms were. What did we all agree were things that were highly replicable and important for the science?
It turned out to be a highly collaborative exercise, and new research was born out of the whole process. I wouldn't say it was difficult or easier. I think it was a really novel and exciting process for science, because it forced us to have the kinds of discussions you often don't get to have when you're in the process of publishing scientific research.
Everybody had a voice at the table, and we also pre-registered the whole process. This is the starting point for how we might think of producing these kinds of consensus-type documents in the future.
Are you concerned about the level of critical literacy or media literacy in this era of people getting their news and facts from Facebook feeds and Google?

Yes, I'm concerned. When you look at some of the data coming out of the US on the percentage of people who get their news and facts from Facebook or other social media feeds, it's the large majority. Social media feeds are a key source of news.
Another reason to be concerned is that people are juggling a lot of information. We're all multitasking these days, and that has consequences for how we distribute our cognitive resources. For example, if you ask a bunch of journalists about how people engage with news online, they'll tell you that, based on the analytics for their websites, people don't necessarily get through a whole article.
So you might have an article that's aimed at debunking, but people only get through the headline and maybe the first couple of paragraphs. Journalists then have to be very strategic in the way they set up a debunking article. In the handbook, you'll see us suggest that you lead with a true statement in the headline. The insidious thing here is that a fake headline is much more engaging; it grabs people's attention. But we should really be doing the opposite. We should be focusing on how we can hook people's attention with the truth, and then explain why the truth is built on empirical evidence and why some of these rumours are actually wrong.
There are reasons to be concerned, but if you have a look at the handbook, you'll see there are ways we can work on correcting it. And the goal here is to translate all of the science that we use within our own circles into a format that is accessible, so that people can actually put it into practice: not just journalists, not just policymakers, but everyday people who are concerned about the volume of misinformation. ■