Introduction by Croakey: As community leaders raise concerns that Federal Government education campaigns are “not cutting through” COVID vaccine misinformation, a new report has found that 65 percent of misinformation about vaccines on social media can be attributed to just 12 anti-vaccine activists.
The report also highlights the failures of social media platforms to prevent the spread of potentially dangerous anti-vax misinformation.
The report was produced by the Center for Countering Digital Hate (CCDH), a not-for-profit organisation with offices in London and Washington, which aims to combat the negative impact of what it terms the “Digital Counter Enlightenment”.
This movement encompasses individuals and groups who promote online hate and misinformation around a diverse range of issues, including anti-feminism, ethnic nationalism and the denial of scientific consensus on climate change.
Croakey editor Jennifer Doggett summarises the report and examines its implications for Australia’s current regulatory approach to social media platforms.
Jennifer Doggett writes:
The ‘Disinformation Dozen’ report from the CCDH focuses on the role of social media platforms in spreading misinformation about COVID-19 vaccines.
Croakey has chosen not to amplify the profiles of these individuals by naming them here but those currently active on Twitter include:
- A medical doctor with 113,000 followers who recently tweeted that her “health plan” for combating COVID is to “refrain from wearing masks, avoid all vaccines, get together with friends and give free hugs”
- A high profile lawyer and activist with 235,000 followers who claims that COVID testing and mortality data have been manipulated by Bill Gates in order to force lockdowns to accelerate the construction of a 5G network
- An osteopathic physician with 290,000 followers who claims that the COVID-19 pandemic was planned by a cabal of “transhumanistic, technocratic overlords” and that COVID is no more dangerous than the seasonal flu.
Key findings
The report’s key findings are as follows:
- Anti-vaccine activists on Facebook, YouTube, Instagram and Twitter reach more than 59 million followers, making these the largest and most important social media platforms for anti-vaxxers.
- Anti-vaxxers are using social media platforms to target minority groups, including communities of colour, to spread conspiracies and lies about the safety of COVID vaccines.
- Facebook, Google and Twitter have put policies into place to prevent the spread of vaccine misinformation but thus far have failed to satisfactorily enforce those policies.
- Anti-Vax Watch analysed a sample of anti-vaccine content that was shared or posted on Facebook and Twitter a total of 812,000 times between 1 February and 16 March 2021. The analysis shows that 65 percent of this anti-vaccine content is attributable to just 12 prominent anti-vax activists, whom the report terms “the Disinformation Dozen”.
- Research conducted by CCDH last year showed that platforms fail to act on 95 percent of the COVID and vaccine misinformation reported to them.
- In particular, these platforms have failed to remove the accounts of prominent anti-vaxxers who have repeatedly violated their terms of service.
- Nine of the “Disinformation Dozen” remain on all three platforms, while only three have been comprehensively removed from even one platform.
Recommendations
The report recommends a number of measures to combat the dissemination of harmful anti-vax misinformation, including:
- De-platforming the most highly visible repeat offenders, including the organisations these individuals control or fund, as well as any backup accounts they have established to evade removal.
- Establishing a clear threshold for enforcement action, such as two strikes, after which restrictions short of de-platforming are applied to accounts.
- Presenting warning screens when users attempt to follow links to sites known to host vaccine misinformation, and showing effective corrections to users who have been exposed to posts containing misinformation.
- Banning private and secret Facebook groups in which anti-vaccine disinformation can spread.
Australian approach
The ‘Disinformation Dozen’ report highlights the gaps and weaknesses in Australia’s self-regulatory code to address online misinformation and disinformation, which have been repeatedly identified by public health and public interest journalism experts (see Croakey Health Media’s submission).
This code was launched in February this year, following a recommendation from the Digital Platforms Inquiry conducted by the Australian Competition and Consumer Commission (ACCC).
The voluntary industry code of practice was developed by the digital industry association DIGI and will be overseen by the Australian Communications and Media Authority (ACMA). It has been adopted by Twitter, Google, Facebook, Microsoft, Redbubble and TikTok.
Croakey has previously reported on the public health community’s response to this voluntary code, which critics slammed as falling far short of the action needed to prevent growing harms to public health and democracy.
Critics also pointed to the lack of consultation with public health organisations and experts, and with the communities who bear the brunt of racism on digital platforms, including Aboriginal and Torres Strait Islander people.
When the code was launched, Terry Slevin, CEO of the Public Health Association of Australia (PHAA), said that while the PHAA welcomed its general direction, for the most part “voluntary codes are an unenforced gossamer-thin veil of pretence, with the objective of avoiding genuine regulation”.
“What we need is a mandatory vigorously enforced code with access to meaningful penalties for breaches, founded on scientific evidence, if we want to see any change in the prioritisation of public health over private profits,” Slevin said.
Chris Cooper, executive director of Reset Australia, the Australian arm of the global Reset initiative to counter digital threats to democracy, was scathing about the new code, saying it “does nothing but reinforce the arrogance of giants like Facebook” and that the Federal Government should not accept the “insouciant contempt” it shows for the Australian public.
Rob Moodie, Professor in Public Health at Melbourne University, described self-regulation as “a complete smokescreen” and “window dressing”, pointing to past lessons from the food, tobacco and gambling industries.
The Public Interest Journalism Initiative (PIJI) also expressed “deep concern” with both the process and the final iteration of the code.
In a statement, it said the code “falls well short”, lacking detail and clear expectations for its signatories and for the role of public interest journalism in helping to combat misinformation.
PIJI CEO Anna Draffin said that the code provided little detail on governance mechanisms, administrative oversight, selection and composition of independent members, and complaint handling, and did not include agreed or uniform metrics for reporting or requirements for record-keeping.
PIJI was also concerned that the harm caused by misinformation was “significantly underestimated” by the code’s definition, which refers to “harms which pose an imminent and serious threat”.
Draffin said that this failed to recognise that the accumulation of small amounts of misinformation disseminated over time “can culminate in serious threats to the democratic process and community safety.”
Future action
As part of the code, participating companies have agreed to release an annual transparency report about their efforts to improve understanding of online misinformation and disinformation in Australia.
The first set of reports is due to be released in May and will provide an indication of the code’s effectiveness (or lack thereof) in combating the current “infodemic”.
Of particular interest, given the ‘Disinformation Dozen’ report, will be whether the code has been successful in restricting the presence of “super spreaders” of misinformation and disinformation on social media.
If not, the release of the reports will be an opportunity for the public health and public interest journalism communities to advocate for a more stringent approach to regulating social media platforms in the future.
See Croakey’s archive of stories about digital platforms and their implications for public health.