***This post was updated on 25 January with a link to a statement by Communications Minister Michelle Rowland***
International organisations, including the World Health Organization and UNICEF, have identified online misinformation and disinformation as one of the most significant current threats to the health and wellbeing of the global community.
Yet in Australia we do not have a national strategy to tackle misinformation and instead rely on a voluntary code of practice, developed by an industry association, to address the explosion of false and misleading information online.
A recent review by this industry body of its code of practice highlights the inadequacies of this approach, in particular through its ongoing failure to engage with the public health sector and communities most impacted by health-related misinformation and disinformation.
Croakey editor Jennifer Doggett reports below.
Jennifer Doggett writes:
On 22 December, the Digital Industry Group Inc (DIGI) released a report responding to submissions received during the public consultation on the 2022 Review of the Australian Code of Practice on Disinformation and Misinformation.
DIGI is an Australian peak body, representing digital platforms such as Google, Meta, Twitter, Apple, Yahoo and TikTok, which developed the voluntary Australian Code of Practice on Disinformation and Misinformation.
The stated aim of this code is to provide the public, industry and government with different avenues to strengthen tech efforts to combat misinformation.
Limitations of the code
Croakey has previously written extensively about the threat to health and democracy posed by Australia’s current media landscape, including the opportunity this provided to spread disinformation about health issues during last year’s federal election.
Our reporting has focussed on the impact of the lack of media diversity on public health, an issue that is particularly important for Aboriginal and Torres Strait Islander people, and ethnically and linguistically diverse communities.
We have also reported on concerns from public health experts and groups about the limitations of the DIGI Code in addressing the health impacts of digital platforms' role in the Australian media landscape.
In our submission to the review of the code, Croakey raised these issues and stated that our preferred option would be a mandatory code developed independently of industry, with public interest and public health expertise informing the process.
We also provided responses to the specific questions raised in the DIGI discussion paper developed for the review.
These included the following recommendations:
- the code should cover all digital platforms; popular digital platform services like Facebook Messenger, WhatsApp and WeChat are currently not covered, despite growing concerns that these platforms are potential hotspots for misinformation;
- the code should take an opt-out rather than an opt-in approach, requiring signatories to provide a clear explanation of why they are opting out;
- the definition of harm should be revised and expanded in consultation with public health and First Nations groups, including health experts, representatives and community leaders, noting the code’s lack of detailed consideration of harms through a public health and health equity lens, and should explicitly include consideration of racism and hate speech;
- there should be wider consultation and discussion about the current exemption for professional news content, including with community, public interest, public health and media representatives, as well as about harmonisation of regulation across media and digital platforms;
- there would be substantial public benefit in including messaging services within the scope of the code, with some caveats, given their role in spreading and amplifying misinformation;
- there should be wider consultation and discussion about the dissemination of misinformation and disinformation as part of issues-based and political content in the media and across other platforms, including with community, public interest, public health, First Nations and media representatives; and
- the code should define sponsored content.
Croakey also asked DIGI the following questions highlighting the need for increased engagement with the public health community on this issue:
- Has DIGI reached out to public health organisations, such as the Public Health Association of Australia (PHAA), the National Aboriginal Community Controlled Health Organisation (NACCHO), the Australian Health Promotion Association (AHPA) and the Lowitja Institute for input into the code’s operations, and with invitations to participate in reviewing frameworks, complaints etc? Despite misinformation and disinformation being a significant concern for public health, as acknowledged by the ACMA report, it seems there is little engagement of public health expertise in the code’s operations.
- What measures has DIGI taken to ensure relevant health and community sector organisations and leaders are aware of the code, the review, and have had full opportunity to participate and contribute?
The revised code
Croakey is disappointed that, in its response to this consultation process, DIGI appears to have ignored many of the recommendations made in our submission and, in particular, has not directly addressed our repeated calls for increased engagement with public health experts and groups and with First Nations leaders and communities.
We acknowledge that DIGI has made some minor improvements to the code in the review report, including the following:
- Allowing smaller digital platforms to be signatories to the code (with reduced requirements)
- Changing the definition of the threshold for harm from a ‘serious and imminent’ to a ‘serious and credible’ threat and updating the definition of “harm”
- Clarifiying the scope of the term ‘political advertising’ and adding a new section providing that platforms may deal with transparency of other forms of political advertising, such as issues based advertising
- Adding a definition of professional news exempted from the definition of misinformation and a list of criteria for news sources
- Updating the definition of political advertising and clarifying that some Signatories may choose to define and implement policies on issues-based advertising
- Adding new definitions of digital advertising services and sponsored content.
However, we feel that these changes fall far short of the changes needed to ensure the code can adequately address the serious and growing health threat of misinformation and disinformation online.
Health sector advocacy
Croakey will continue to report on the digital determinants of health and the importance of increased regulation of digital platforms by an independent body and for close consultation with public health groups and experts and community leaders (in particular those from Aboriginal and Torres Strait Islander communities).
We see the recent revision of the code as a missed opportunity to strengthen its role in addressing the health impacts of online misinformation and disinformation.
We urge other health groups to get involved in this issue and to advocate for a public health and equity lens to be applied to the regulation of digital platforms, noting that only two other health-related groups (Alannah and Madeleine Foundation and Breastfeeding Advocacy) provided submissions to the review.
Note: Croakey has approached DIGI for comment on the issues raised in this article.
Update
On 20 January, Communications Minister Michelle Rowland released this statement, published below, New ACMA powers to combat harmful online misinformation and disinformation.
The Albanese Government will legislate to provide the Australian Communications and Media Authority (ACMA) with new powers to hold digital platforms to account and improve efforts to combat harmful misinformation and disinformation in Australia.
This marks a major step forward in addressing the spread of online misinformation and disinformation which has grown rapidly in scale and speed.
The ACMA will be given new information-gathering and record-keeping powers to create transparency around efforts by digital platforms to respond to misinformation and disinformation on their services, while balancing the right to freedom of expression so fundamental to democracy.
The ACMA will also be empowered to register an enforceable industry code and to make a standard, should industry self-regulation measures prove insufficient in addressing the threat posed by misinformation and disinformation. This graduated set of powers includes measures to protect Australians, such as stronger tools to empower users to identify and report relevant cases.
These powers are consistent with the key recommendations in the ACMA’s June 2021 report to government on the adequacy of digital platforms’ disinformation and news quality measures. They are intended to strengthen and support the voluntary code arrangements undertaken by industry through the Digital Industry Group Inc (DIGI) and will also extend to non-signatories of the DIGI Code,
The new framework will focus on systemic issues which pose a risk of harm on digital platforms, rather than individual pieces of content posted online.
Digital platforms will continue to be responsible for the content they host and promote to users. In balancing freedom of expression with the need to address online harm, the code and standard-making powers will not apply to professional news and authorised electoral content, nor will the ACMA have a role in determining what is considered truthful.
The Government intends to undertake public consultation on the powers through the release of an exposure draft Bill in the first half of 2023 and introduce legislation in Parliament later this year following consultation.
Minister Rowland said: “Misinformation and disinformation poses a threat to the safety and wellbeing of Australians, as well as to our democracy, society and economy.
“A new and graduated set of powers will enable the ACMA to monitor efforts and compel digital platforms to do more, placing Australia at the forefront in tackling harmful online misinformation and disinformation. The Albanese Government will consult with industry and the public on an exposure draft of legislation in the first half of this year and looks forward to constructive engagement with stakeholders and industry.”
See here for Croakey’s archive of stories on the regulation of digital platforms.