In 2020, fact-checkers around the world found themselves at the frontlines fighting the Covid-19 infodemic, which the World Health Organization defines as an “overabundance of information, both online and offline”.
The flood of false and misleading information continues, threatening to worsen the impact of the pandemic, undermine efforts to tackle the disease and reduce the public’s trust in the global health system.
As the problem unfolded, we realised how little we knew about the spread of the Covid-19 infodemic in an African context, especially on encrypted messaging platforms like WhatsApp.
Tackling the challenge head on, Africa Check partnered with the University of Johannesburg’s Africa Centre for Evidence (ACE) in a research project to explore the spread of Covid-19 misinformation in South Africa, Senegal, Nigeria and Kenya, with a specific focus on WhatsApp. Funded by the Konrad Adenauer Stiftung, the research project aimed to answer three questions:
What is the nature of misinformation on Covid-19 shared across WhatsApp in Africa and what level of risk does this misinformation present?
How are WhatsApp users engaging with health misinformation, and are they responding in ways that reduce the risk it presents?
What do we know about mitigating misinformation on social media?
The work is best described as “rapid research”. We collected and analysed a large amount of data. In less than four months from contract to report, we conducted three rapid evidence assessments, a survey of social media users, a series of interviews with fact-checkers, and an analysis of claims reported to Africa Check. The full research report can be read here.
The study synthesised the limited evidence about how users respond to health misinformation in Africa, and what we could do to mitigate it. This work lays a foundation for continued and deeper analyses in the future, as well as ongoing rapid responses to emerging challenges.
We have applied what we’ve learned from the data and proposed five strategies for fact-checkers to consider during this and future infodemics.
These are summarised here.
1. Understand the different risks posed by false information
Clear classification of the risks the infodemic presents could help fact-checkers determine where they are making a difference and guide them to be more deliberate in their editorial decisions and discussions with donors.
The research identified five major risks related to health misinformation. The first four are harm to physical health, economic harm, social harm and political harm. The fifth risk, harm to psychological wellbeing, cuts across the other four. Although Covid-19 has largely been treated as a health crisis, the evidence suggests there is a mental health crisis following close behind, in which misinformation plays an active role.
2. Guide the public to be helpful in a time of crisis by providing accurate information
Helpfulness is a theme that runs through the research. Health misinformation on WhatsApp often takes on a personal, helpful tone that makes it seem sincere and earnest. Respondents in our survey said they shared Covid-19 information because they wanted to raise awareness about the pandemic and provide helpful information to people they cared about. The literature also confirmed that users shared information if they thought it was helpful to others, out of a sense of civic duty – especially in a time of crisis.
The finding that information consumers have a social desire to help during a crisis has implications for fact-checkers. Media literacy campaigns, for example, could promote helpfulness in key messages. Debunks posted on social media could include phrases like “help us bring an end to misinformation by sharing this message”. And fundraising strategies could appeal to audiences’ desire to be helpful by asking for individual donations.
3. Tap into trusted personal networks
Linked to the idea of helpfulness is the concept that WhatsApp users are strongly influenced by their own social circles, specifically friends and family.
From the literature, we learned that users were more likely to share information received from within a trusted personal network. From a positive perspective, they were more likely to reshare corrections received from a family member, close friend or like-minded individual. Yet, more negatively, they were also more likely to act on misinformation if they received it from a family member or friend.
When people have access to the antidote against misinformation – such as accurate, fact-based information or a clear correction that debunks a false claim – they are able to widely disseminate this information to their social circles.
Africa Check is piloting a group of “Fact Ambassadors” who will distribute fact-checking reports, factsheets, guides and media literacy information to a network of peers, through their social media accounts. The lessons learned from these innovative approaches may be important in crafting future mitigation strategies that use the idea of social influence.
4. Build trusted partnerships with media, government, civil society, religious leaders and tech companies
The research emphasised the important role of trust in information sources.
For example, users seem to trust official organisations, such as government departments of health, but they may also trust anyone who claims to be, say, a medical professional or religious leader. Our survey found that users have greater trust in messages from legitimate news sources and people they consider to be in positions of greater knowledge or authority, such as health professionals.
Many fact-checkers already partner with media organisations and others to achieve greater impact and reach. Partnerships involving government departments of health, civil society organisations and vetted religious leaders could be explored further.
In our rapid review of the evidence of mitigating strategies, we also highlight the important role that social media companies could play in discouraging misinformation on their platforms. As an example, we refer to initiatives such as Health Alert, where WhatsApp formed a partnership with the World Health Organization to fight health misinformation. We also cite the impactful #CoronaVirusFacts Alliance and WhatsApp chat bot led by the International Fact-Checking Network.
5. Empower people to identify and recognise false information on social media through self-efficacy campaigns
Self-efficacy refers to people’s ability to control their motivation, behaviour and social environment. In a misinformation context, the literature refers to building self-efficacy as a strategy to teach people – on social media, for example – how to identify and recognise misinformation.
Self-efficacy to detect misinformation was identified as an overarching strategy in all three of our data sets: the literature, our user behaviour survey and interviews with fact-checkers. This strategy relies on the individual information consumer’s ability to choose how they respond to and act on misinformation.
Messages such as “Covid-19 Misinformation – it’s within your control” could counter the mental health consequences of the infodemic. Such messages may help users realise they have agency in their response to health misinformation. And campaigns that focus on users’ personal control of misinformation may make them feel less anxious and overwhelmed.
The evidence affirms that individual empowerment may be a crucial strategy against health infodemics and one that fact-checkers should integrate into editorial messaging and social media campaigns.
As the pandemic and its consequences continue to affect Africa and the world, our research and the strategies we identify may make a real difference. Tackling misinformation is an ongoing task.
With the emergence of new vaccines and a reinvigorated anti-vaxxer movement, we still have work to do.