Regulation of digital platforms such as Facebook is a public health necessity, according to Jordan Guiao and Peter Lewis from the Centre for Responsible Technology.
Hear more from Peter Lewis at a #CroakeyLIVE webinar this Friday, from 11am to 12 noon AEST.
He will join Croakey Health Media Co-chair Associate Professor Megan Williams and Editor-in-Chief Dr Melissa Sweet in examining the public health imperatives for enhanced regulation of digital platforms. Register here to attend.
Jordan Guiao and Peter Lewis write:
Moves this week by the Therapeutic Goods Administration (TGA) to stymie Clive Palmer’s latest foray into political advertising highlight the different rules that apply to traditional media and to the new social media platforms.
Whereas the TGA has warned that Palmer and the regional radio station running his anti-vax ads breach their responsibilities as advertisers and broadcasters, in the online environment it is up to the platforms to make their own call.
On Facebook and other social networks, this sort of disinformation is circulating in groups and targeted networks, far away from the gaze of health professionals.
When dangerous misinformation does come to attention, platforms can be prompted to act – Facebook, to its credit, has taken down content from MP Craig Kelly. But such actions remain at the discretion of the platform.
Free of enforceable rules, and driven by a business model that favours content that excites and enrages users so as to keep them producing behavioural data for longer, these digital platforms have become a public health problem in their own right.
Efforts to mitigate disinformation have been minimalist. In Australia and abroad, the preference for voluntary industry codes and protocols has produced feel-good statements of intent without sheeting home legal responsibility.
Underwhelming
In Australia, the main effort aimed at addressing online disinformation – the Australian Code of Practice on Disinformation and Misinformation, overseen by the Australian Communications and Media Authority (ACMA) – has so far been an underwhelming endeavour.
It was developed by the industry lobby group DIGI, which volunteered to lead the project and then delivered a code that even the ACMA chair was sceptical would deliver on its brief.
The Code itself is voluntary, relying on tech companies to clean up their platforms. There are no material penalties or fines for any breaches, and therefore no real incentives for platforms to make any substantive changes.
The elephant in the room remains ignored: the core business model of many online platforms is driven by engagement algorithms that actively encourage and profit from disinformation. Until this is addressed, any initiatives will merely be band-aid solutions, peripheral to the central issue.
The current Code asserts that an annual status report is enough to keep track of the tech companies’ efforts to combat disinformation, aggregating the thousands of content issues that manifest daily.
To its credit, DIGI consulted civic groups that voiced concerns about the Code, but the concessions were predictably ineffectual. After consultation, DIGI recommended an oversight board that would meet every six months to address issues, with no clear guidance on the make-up of the board or whether it would include members with public health credentials.
This type of self-regulation has echoes of the Facebook Oversight Board, which has been roundly criticised as lacking the power and resources to meet its lofty brief to act as a virtual Supreme Court advising Facebook on contentious takedowns and suspensions. It is fully funded by Facebook, and several exceptionally qualified people have joined the Board, including academics, journalists and former politicians.
To date the Board has made fewer than a dozen rulings, the most notable being the decision to uphold the ban on President Trump for a defined period, after which Facebook will have the discretion to readmit him. Facebook was perhaps the only beneficiary: through this process it is seen to be acting on disinformation, while outsourcing accountability and action to third parties without being compelled to change its policies in any meaningful way.
One of the main drivers of disinformation on online platforms is Section 230 of the US Communications Decency Act, which indemnifies platforms against liability for the content they host. For decades, platforms have invoked Section 230 as a shield against acting on disinformation and other content issues.
Several legislators are now calling for reform of Section 230, and some for its outright repeal, which could create massive legal liabilities for any platform that hosts user-generated content.
Others caution against the unintended consequences of reform, including further entrenching Big Tech’s power, and instead call for more targeted proposals, such as updating criminal laws to cover digital platforms (for example, to include incitement to violence online).
Get real
Penalties for online platforms need to be real if they are to take their responsibilities seriously. Weak reforms like the Australian Disinformation Code are inadequate responses.
But it’s not enough to just regulate online platforms. Policy makers must also support the journalists and media organisations that strive to stem the tide of disinformation. Public interest journalism is critical for providing accurate, scientific and credible information when it’s needed most.
Online platforms have used public interest journalism and news media organisations, including not-for-profit outlets, to legitimise their platforms while at the same time undermining those publishers’ core businesses.
Legislation like the News Media Bargaining Code, which forces a commercial agreement between platforms and registered news organisations, is a start. But broader initiatives are needed to bolster funding and support for critical journalists and public interest media organisations.
This should include more stable funding and multiple policy levers, both short and long term, to support not-for-profit and community media. A digital tax could be an ongoing mechanism to ensure Big Tech continues to pay its dues to the Australian media landscape.
Journalists have been on the frontline of providing safe and accurate information during the pandemic. Just as vaccines save lives, we need public interest journalism to inoculate us against the harms of disinformation.
Declaration: Peter Lewis is a director of Croakey Health Media
See Croakey’s archive of stories on the digital platforms.