Prebunking trumps debunking

Dr Sally Stephens, Director of Science

Fake news is about taking dubious content and making it appear genuine. We have seen its results in the political arena, but science is also susceptible to its menace. High-priority topics, such as climate change and health care, have been affected by misinformation. The cognitive errors that sustain fake news, and the reasons it is so difficult to debunk the myths propagated by science sceptics, are plain to see in the measles vaccination furore.

In 1998, Dr Andrew Wakefield, a British medical researcher, claimed to have discovered, among other things, a relationship between autism and the Measles, Mumps and Rubella (MMR) vaccine. Dr Wakefield espoused his views to anyone who would listen, and sensationalist coverage ensued. In 2004, however, after further clinical research and the discovery that Dr Wakefield had been funded by lawyers involved in lawsuits against vaccine-producing companies, the claim of an autism-vaccine link was withdrawn. In 2010, the entire paper was retracted and Dr Wakefield was struck off the UK medical register (Willingham & Helft, 2014).

The 12 years between publication and retraction were ample time for the groundless vaccine-autism link to become established in the minds of worried parents. The consequence was that vaccination rates began to suffer. It is important to note here that measles is not a harmless childhood disease: in 2016, the World Health Organisation reported that, despite the existence of a safe and highly effective vaccine, nearly 400 people were dying from measles each day worldwide (WHO, 2016). In 1997, the year before the Wakefield paper was published, measles vaccination rates in the United Kingdom were over 91 per cent. They started to fall in 1998 and in 2003-2004 bottomed out at 80 per cent (NHS Digital, n.d.). Herd immunity against measles is achieved at vaccination levels of around 95 per cent (Funk, 2017).

The effects persist. A 2017 survey of Australian parents' views on vaccinations found that 12 per cent of parents were unsure if vaccines were safe and one per cent felt they were not. Twelve per cent of parents believed that vaccines weakened the immune system when they are, in fact, designed to strengthen it. Despite extensive medical research showing no causal link, nine per cent of Australian parents believed that vaccines can cause autism, and a further 30 per cent were unsure (Royal Children's Hospital Melbourne, 2017). A global study (n = 29,133 across 38 countries) found that 55 per cent of Australian respondents reported not knowing that there was no link between autism and vaccines (Statista, n.d.).
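That 95 per cent figure can be sanity-checked against the standard epidemiological relationship between the herd-immunity threshold and the basic reproduction number R0. The sketch below uses the textbook formula 1 − 1/R0 and a commonly cited R0 range for measles of 12 to 18; neither the formula nor those values appear in the article itself, so treat this as an illustrative back-of-envelope calculation.

```python
# Illustrative sketch: herd-immunity threshold from the basic
# reproduction number R0, via the standard formula 1 - 1/R0.
# The measles R0 range (12-18) is a commonly cited estimate,
# assumed here, not drawn from the article.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of a population that must be immune to halt spread."""
    return 1.0 - 1.0 / r0

for r0 in (12, 18):
    print(f"R0 = {r0}: threshold = {herd_immunity_threshold(r0):.1%}")
```

For R0 between 12 and 18 the threshold works out to roughly 92-94 per cent of the population immune, which, once imperfect vaccine efficacy is factored in, is consistent with the approximately 95 per cent vaccination coverage cited above.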

Why is it so difficult to debunk this kind of sham science? How can a single discredited study outweigh the overwhelming body of evidence that confirms the safety of vaccinations? Psychologists suggest that confirmation bias and belief perseverance interfere with our ability to reason, and that both factors seem well fortified against debunking.

Belief perseverance is the inclination to hold on to initial beliefs even in the face of conflicting evidence. Confirmation bias allows us to do so via the tendency to search for, and/or interpret, evidence in ways that support our existing beliefs. These two cognitive dispositions explain why our reasoning is sometimes illogical and biased. We humans have difficulty processing information in a rational, unbiased manner once we have adopted a stance on an issue, because it is important to us as reasoners that we have consistency between our stance and any related evidence that comes to our attention. Unfortunately, we can maintain this consistency in erroneous ways: by being selective with the evidence we seek; by accepting evidence that confirms our beliefs and rejecting evidence that contradicts it; and by apportioning more weight or credence to evidence that supports our favoured theory than it warrants (Nickerson, 1998).

Humans are particularly susceptible to confirmation bias and belief perseverance when engaged in complex cognitive tasks or those suffused with emotion, like the health of children. Even when we are not personally invested—and one would expect that we could be unbiased about appraising evidence and reaching a rational, justified conclusion—we can still fall prey to confirmation bias. The more important something is to our social and personal identities, the more difficult it is for us to be impartial: those elements of our perceived selves that guide decisions and actions seek primacy.

Neuroscientists have investigated this bias. When people are confronted with an idea that conflicts with their personal beliefs, the parts of the brain associated with fear are activated. When the brain is in fear mode, it suppresses the activity of the frontal part of the brain that allows us to reason and listen to arguments. Conversely, when a stance is affirmed or reinforced, the reward centres in the brain are activated, releasing dopamine, resulting in a pleasurable feeling (Gorman, 2017).

My contention, in the face of the power of fake science and the collusive power of our thinking (or unthinking) processes, is that offence might be a more successful strategy than defence in the battle for the truth: prebunking might be the key. By prebunking, I mean that when individuals are exposed to what is for them a new scientific claim, they should approach it with an appropriate degree of circumspection. Before a belief has the chance to become entrenched, the claim must be evaluated for currency, relevance, authority, accuracy and agenda, and the claim's purveyor must be accorded only the credibility they deserve. Furthermore, prebunking acknowledges that confidence in a claim should be proportional to the body of evidence supporting it.

Belief perseverance and confirmation bias are powerful human cognitive tendencies from which no one is immune, and those who spread fake news take advantage of the ubiquity of these biases. The Wakefield saga reveals the cost of this duplicity not only to public health but also to the cause of science and its status in society. Knowing that the brain is hardwired to confirm, not falsify, beliefs, the scientific community has put protocols in place to minimise the effect of bias on scientific research. Scientists rely on empirical evidence to produce scientific knowledge, and public scrutiny and peer review are designed to expose the theoretical and cultural biases that might affect a scientist’s objectivity. Scientists have an ethical responsibility to ensure the highest standards of design, analysis and interpretation of findings. These same practices can help us prosper in our daily lives and serve as armour against fake news.

Students are immersed in the culture of science from the moment they enter their first science lesson and continue to amass the tools of science throughout their science education. Every day we strive to ensure that our graduates will be able to live an evidentiary life with all of the social and cultural benefits that that entails. We have seen the negative outcomes when fake news gains traction and becomes ‘real’, but in the end, I am confident that science will be victorious because it is a reliable method with which to make sense of the world.


Funk, S. (2017). Critical immunity thresholds for measles elimination.

Gorman, S. (2017, December 3). Why we deny the science. All in the Mind. ABC Radio National.

NHS Digital. (n.d.). Childhood vaccination coverage statistics.

Nickerson, R. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175-220.

Royal Children's Hospital Melbourne. (2017). Vaccination: Perspectives of Australian parents.

Statista. (n.d.).

Willingham, E., & Helft, L. (2014, September 5). The autism-vaccine myth. NOVA.

World Health Organisation. (2016, November 10). Measles jab saves over 20 million young lives in 15 years, but hundreds of children still die of the disease every day.