The spread of misinformation and disinformation on digital platforms is a global threat that is undermining health, peace, democracy and human rights, the United Nations has warned in a policy brief urging wide-ranging responses.
Misinformation and disinformation about the climate emergency are delaying urgently needed action to ensure a liveable future for the planet, and are also undermining progress on gender equality and the Sustainable Development Goals, says the brief.
Disinformation can “involve bigotry and hate speech aimed at minorities, women and any so-called ‘others’, posing threats not only to those directly targeted, but also to inclusion and social cohesion”, it says.
The brief, which includes recommendations for governments and policymakers, digital platforms and communities, outlines principles to inform development of a United Nations Code of Conduct for Information Integrity on Digital Platforms.
These include commitment to information integrity, respect for human rights, support for independent media, increased transparency, user empowerment, strengthened research and data access, scaled up responses, stronger disincentives, and enhanced trust and safety.
The brief says that digital platforms should move away from business models that prioritise engagement above human rights, privacy and safety, and states that “strengthening information integrity on digital platforms is an urgent priority for the international community”.
“Digital platforms are being misused to subvert science and spread disinformation and hate to billions of people,” said UN chief Antonio Guterres, who has been sounding this alarm for some time. “Our policy brief on information integrity on digital platforms puts forward a framework for a concerted international response anchored in human rights.”
The brief’s release is timely for Australia, coming as the Federal Government seeks feedback on plans for new laws to tighten regulation of digital platforms, and amid concerns about the spread of misinformation and disinformation on the referendum for a constitutionally enshrined Voice to Parliament for Aboriginal and Torres Strait Islander people.
Consultation focus
Croakey readers have until 6 August to comment on the exposure draft of the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023 (whose development was foreshadowed earlier this year).
According to a statement by Communications Minister Michelle Rowland, this would give the Australian Communications and Media Authority (ACMA) information-gathering, record-keeping, code registration and standard-making powers to compel digital platforms to do more to address online misinformation and disinformation.
ACMA would not have the power to determine what is true or false or to remove individual content or posts, and its code and standard-making powers would not apply to professional news content or authorised electoral content.
Platforms would continue to be responsible for the content they host and promote. If platforms fail to act to combat misinformation and disinformation over time, the ACMA would be able to register enforceable industry codes with significant penalties for non-compliance, or create a standard requiring platforms to lift the bar on their efforts. Maximum penalties could reach $6.88 million.
Dr Becky White, a member of the Australian Health Promotion Association and Adjunct Research Fellow at Curtin University, stressed the need for comprehensive approaches.
“Measures that compel internet platforms to moderate misinformation and disinformation and increase transparency on how they do so are important. Misinformation and disinformation have significant impacts on public health, and it is vital to recognise that regulation is just one part of working to create a safer and more equitable online information environment,” she told Croakey in a statement.
“Like many health issues, we know that the social determinants of health have an impact. Research in Australia has shown that those with lower education levels, lower digital and health literacy and lower trust in government are more susceptible to misinformation. A multifaceted approach is needed that includes consideration of the many factors that influence the information environment.”
However, Australia does not have a comprehensive national strategy and plan for tackling misinformation and disinformation that encompasses wider actions. For example, the bill does not address issues of concern identified by the UN, including how the business models of digital platforms are often built on algorithms that “reward and amplify mis- and disinformation and hate speech”.
In response to these questions from Croakey, the Department of Health and Aged Care provided this statement:
“Misinformation and disinformation pose a threat to the safety and wellbeing of Australians, as well as our democracy, society and economy. Australia takes a holistic approach to combatting this growing challenge that includes educating Australians to critically engage with false and misleading information, supporting public interest journalism and media diversity, and providing support to the national broadcasters (SBS and ABC) to deliver trusted news and information to Australians, including multicultural and multilingual communities.
“The draft legislation to empower ACMA will hold digital platforms to account for harmful misinformation and disinformation online and also complement these efforts…Through a whole-of-government process, the Department of Health and Aged Care contributed to the development of the proposed legislation, which is aimed at holding digital platform services to account and creating transparency around their efforts in responding to misinformation and disinformation in Australia. The Department of Health and Aged Care takes the issue of misinformation and disinformation very seriously and actively monitors comments posted on its social media accounts to ensure Australians have access to credible, evidence-based health information.”
Undermining wellbeing
Meanwhile, the UN brief includes an overview of how misinformation and disinformation are affecting multiple global health goals.
Media matters
The UN brief also highlights the importance of media policy in addressing misinformation and disinformation, and notes the need to address concerns such as “news deserts”, where communities lose trustworthy local news sources.
National and global studies of digital news consumption identify additional challenges for addressing misinformation and disinformation, including newsroom closures and financial constraints, declining levels of trust in media, and significant levels of “news avoidance” as people turn away from challenging topics such as climate change.
“Evidence that some people are turning away from important news subjects, like the war in Ukraine, national politics, and even climate change is extremely challenging for the news industry and for those who believe the news media have a critical role in informing the public as part of a healthy democracy,” wrote the authors of the Reuters Institute Digital News Report 2023, released earlier this year.
The Australian version of the report, produced by the University of Canberra News and Media Research Centre and released earlier this month, found that 69 percent of Australians say they avoid the news occasionally, sometimes or often. They are most likely to avoid stories about social justice issues.
Ongoing cuts to the ABC and similar public service media organisations around the world also undermine efforts to address misinformation and disinformation and declining trust.
Platform power
The immense power of the digital platforms to shape our news and information environments extends beyond their spread of misinformation and disinformation, as Meta reminded us this week in announcing plans to end access to news on Facebook and Instagram in Canada.
So while it’s important that health advocates engage with the current consultation, it’s clear that much more needs to be done to address the wide-ranging health impacts of these powerful corporations, including as vectors for harmful and hateful information.
See Croakey’s archive of articles on digital platforms