Efforts to address the public health and safety concerns surrounding digital platforms are underway, but many questions and concerns remain. Alison Barrett takes a deep dive into regulatory developments in Australia and elsewhere.
Alison Barrett writes:
The Australian Government is taking long-overdue action to regulate the digital environment in a number of areas, but public health experts warn that the efforts risk being ad hoc and piecemeal.
Governments must take a more systematic approach, informed by public health evidence and expertise, if powerful commercial online and digital entities are to be held responsible for creating safer online and digital environments for users, they say.
Currently, an array of government plans, inquiries, draft bills and amendments is at various stages of progress as the Australian Government attempts to “modernise Australia’s laws for the digital age” and address the harmful impacts of digital and online technology.
While welcoming government concern and action on the issue, Professor Kathryn Backholer, from the Institute for Health Transformation at Deakin University and Vice President of Policy for the Public Health Association of Australia (PHAA), told Croakey “the approach seems ad hoc, potentially missing opportunities for a coordinated, comprehensive approach”.
Dr Aimee Brownbill, Policy and Research Manager at the Foundation for Alcohol Research and Education (FARE), said “the current discussions and processes are falling short of addressing fundamental issues with the design of digital platforms themselves, particularly when it comes to the design of these platforms as data-driven digital powerhouses”.
Brownbill told Croakey that it is time to place responsibility on digital platforms to design their systems in ways that are safe for everyone, including children and people most at risk of harm.
Current developments
The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill is open for feedback until 30 September, the Scams Prevention Framework exposure draft legislation is open for feedback until 4 October, and the Joint Select Committee on Social Media and Australian Society is due to submit its final report on 18 November 2024.
Also relevant is the Privacy and Other Legislation Amendment Bill, tabled in Parliament last week. As Croakey reported recently, a range of media policy options are also in play, including consideration of a ‘digital tech tax’, and a response to Meta’s refusal to renew media funding agreements under the News Media Bargaining Code.
Professor Fran Baum AO, at The Stretton Institute, The University of Adelaide, told Croakey that “Meta not paying for news content is a threat to our democracy because it means conventional news outlets are less viable, and we need a mix and variety of media to help democracy flourish”.
In addition, Prime Minister Anthony Albanese recently announced plans to ban social media for children. Federal legislation will be informed by engagement with States and Territories through National Cabinet and will draw upon recent work by former Chief Justice Robert French, who was commissioned by the South Australian Government to review possible approaches for state legislation preventing children younger than 14 years from accessing social media.
Meanwhile, the Digital Platform Regulators Forum (DP-REG), established in March 2022, is “an information-sharing and collaboration initiative between Australian independent regulators with a shared goal of ensuring Australia’s digital economy is a safe, trusted, fair, innovative and competitive space”.
Its members are the Australian Competition and Consumer Commission, the Australian Communications and Media Authority, the Office of the Australian Information Commissioner and the eSafety Commissioner, and it seeks to focus on “regulatory coherence and clarity”.
Chaired by Melanie Drayton, Acting Deputy Commissioner of the Office of the Australian Information Commissioner, the Forum has produced joint working papers on emerging technologies and digital platforms and submissions to relevant inquiries and consultations.
It has examined the impact of algorithms, and prioritised understanding and assessing the benefits, risks and harms of generative AI and how the technology intersects with the regulatory remit of each DP-REG member.
States and territories
At a state and territory level, SA Premier Peter Malinauskas supports legislation for age restrictions on social media in SA, but said “a national framework will work best”.
The Report of the Independent Legal Examination into Banning Children’s Access to Social Media, undertaken by French, outlines a draft bill with a proposed model for South Australia based on two statutory duties of care for social media providers:
- not to allow access to a non-exempt social media service by children younger than 14, nor by children aged 14-15 without parental consent
- to take positive, reasonable steps to prevent such access by children in those age ranges.
He recommends that access to beneficial and low-risk social media services – including dedicated educational or eHealth services – should be allowed, and that ongoing research and evaluation is required.
South Australian and New South Wales governments are partnering to deliver a two-day, two-state Social Media Summit on 10 and 11 October to further explore and address the impacts of social media.
NSW Premier Chris Minns has been supportive of addressing the harms of social media and digital technology on children, last year banning the use of mobile phones in schools.
The Victorian, Queensland and Western Australian premiers have stated their support for the work commissioned by the South Australian Government and for age restrictions on social media. The ACT, Northern Territory and Tasmanian Governments have not explicitly commented either way.
The Victorian Government will undertake a consultation with parents, schools and children in the coming months to consider diverse needs and circumstances.
Queensland Health recommends parents limit access to social media for children younger than 14 years old.
Federal Opposition Leader Peter Dutton has previously stated his support for an age restriction on social media.
Global context
Global momentum is growing for more effective regulation of digital platforms and emerging technologies such as artificial intelligence (AI).
In a call to action published on 23 September, UNICEF urged all stakeholders to prioritise children’s rights in the provision, regulation, design, management and use of digital technologies.
“Digital environments that work for adults may create risks for children,” said the UNICEF statement.
“We see how the rapid uptake and development of digital technologies have not been accompanied by the legislation, services, and education required to ensure their safe and empowering use.”
Further details are in UNICEF’s full set of recommendations for the Global Digital Compact, a new international agreement annexed to the Pact for the Future, which was adopted by world leaders at the recent Summit of the Future at the United Nations.
The Global Digital Compact is billed as the first comprehensive global framework for digital cooperation and AI governance, and includes commitments by world leaders to make the online space safe for all, especially children.
As the pace and power of emerging technologies creates new possibilities, but also new risks for humanity – “some of which are not yet fully known” – risks need to be identified and mitigated “in ways that advance sustainable development and the full enjoyment of human rights”, the Pact says.
Varying approaches
In his review of legislation, French found that a variety of regulatory approaches exist globally.
Under the European Union’s Digital Services Act, fully applicable since 17 February 2024, providers of online platforms accessible to minors should implement appropriate and proportionate measures to “ensure a high level of privacy, safety and security of minors”, according to French.
The Act does not require age verification in all cases, but providers are prohibited from serving targeted advertising to minors. The European Commission has established a task force with member states on age verification.
The Online Safety Act in the United Kingdom imposes duties of care on providers of regulated services to identify, mitigate and manage the risks that illegal and harmful content poses to children.
In May 2024, the UK’s online safety regulator, Ofcom, published a set of proposed measures to improve children’s safety online, including robust age-checks to stop children accessing harmful content, configuring algorithms so they do not recommend harmful content to children, and better moderation of harmful content.
Currently no federal, age-based legislation exists in the United States prohibiting children’s use of social media. However, the Children’s Online Privacy Protection Act 1998 restricts the collection and use of personal data from children younger than 13 years “by any commercial website or online service”.
As a result, most social media services officially require users to be at least 13 years old to open an account.
Kristy Schirmer, Principal Consultant at Zockmelon Health Promotion and Social Media Consulting, told Croakey it is important to remember that the social media age limit of 13 years “is not based on the social and emotional needs of young people”, but is related to the US online privacy and data laws.
The Kids Online Safety Act, under consideration in the US Congress, focuses on duties of care, the establishment of a set of safeguards for minors, parental tools, and identifying which users are minors.
Responses in Australia
The public health and wider health sectors are already stretched and their capacity to respond to policy developments in this space is further undermined by the short consultation periods involved.
As Charles Maskell-Knight noted in The Zap this week, “submission deadlines of under a fortnight for complex pieces of legislation places incredible pressure on potential submitters, and probably detracts from the number and quality of submissions received”.
However, public health and health promotion experts Kristy Schirmer and Carmel Williams welcome the Government’s recognition of the harms caused by Big Tech.
Schirmer said that “a co-ordinated approach to regulating Big Tech might be led by an overarching body, made up of relevant groups including public health, to oversee and lobby for a safer online experience, putting the needs of the most vulnerable users first”.
Williams, Director of the Centre for Health in All Policies (HiAP) Research Translation, based at the South Australian Health and Medical Research Institute and the School of Public Health, University of Adelaide, told Croakey a multi-pronged strategy involving governments and communities is needed.
“It’s good to see governments beginning to invest and explore in this space,” she said. Williams said that, as well as incrementally increasing legislation and regulation, it is important to grow community understanding of the benefits and risks of social media and the digital determinants of health.
A better-informed community can be “activated to…call for and demand and support the work that governments are trying to do”, Williams said. “Ultimately, to truly address this, we need some sort of global action…and [these companies] need to be made more accountable”.
Social media ban
The Prime Minister’s plans to introduce a minimum age for access to social media and relevant digital platforms have drawn mixed reactions.
While it is generally agreed that social media can be harmful to children, including for their mental and physical health, focus and development, some groups say that an outright ban is not the right strategy.
Mental health organisations including Black Dog Institute, Gayaa Dhuwi (Proud Spirit) Australia, Beyond Blue and headspace, said in a joint statement that the proposed ban risks cutting young people off from mental health support.
“We agree reform is necessary. But a blanket ban is not the answer,” they said. According to the statement, 73 percent of young people use social media for mental health support, with platforms often serving as their “front door to the mental health system”.
The statement recommends co-designing reform and new safety features with young people, giving users control to reset their algorithm – and thus limit the harmful content they are exposed to – and mandating safety features and increased social media literacy programs for users under 16.
“We also need social media platforms to step up, and take responsibility for their products, and make sure that young people are not exposed to harmful content,” they said.
Schirmer said that as a social media consultant, registered health promotion practitioner and mum to a 10-year-old and a 13-year-old, she supports the ban in principle.
However, she also recognises the benefits of being online, “especially for isolated and marginalised groups and that leading mental health groups have advocated for a more nuanced and balanced approach, with social media being an important gateway into health and social supports”.
She told Croakey she would like to see more resources for marginalised and isolated young people including in-person groups, sport, arts and recreation offerings, more regulated and safe online platforms and strengthened school mental health and wellbeing services “so kids weren’t forced to build community in settings that can cause harm and expose them to other negative inputs”.
Backholer told Croakey we need an online environment designed with our children’s health as the priority, rather than the profit motives of Big Tech.
“The social media ban must not absolve platforms of their responsibility to create safer spaces for all users,” she said.
“If the Government pursues this policy, the focus should shift to holding platforms accountable, making it less about ‘banning kids’ and more about enforcing rules that compel platforms to comply with the law.”
The Federal Government intends to introduce the legislation before the end of this year, and is running an age assurance trial for people aged 13 to 16 years to inform implementation approaches and policy design.
Meanwhile, in the same week that the Federal Government announced its plan for age restrictions on social media, Instagram introduced its Teen Accounts, with age restrictions for people younger than 16 years.
Teen Accounts will default to private accounts, sensitive content controls and strict messaging settings, with time limit reminders and a sleep mode enabled.
Misinformation and disinformation
In a stakeholder briefing with the Department of Infrastructure, Transport, Regional Development, Communications and the Arts yesterday, Croakey heard about the Combatting Misinformation and Disinformation Bill’s aims of increasing the transparency and accountability of major digital platforms and their responses to seriously harmful mis- and disinformation.
After receiving more than 24,000 responses to the draft legislation released for public consultation in 2023, the Department undertook further targeted consultations, resulting in revisions so that the current bill puts more emphasis on safeguarding freedom of speech, amongst other changes.
The current bill does not intend to cover all dissemination of content that may be considered false, but addresses dissemination of content that is reasonably verifiable as false, misleading or deceptive, and reasonably likely to cause or contribute to serious harm.
Serious harms identified are: harm to the operation or integrity of an electoral or referendum process in Australia; harm to public health in Australia including the efficacy of preventative health measures; vilification of a group in Australian society; intentionally inflicted physical injury to an individual in Australia; imminent damage to critical infrastructure or disruption of emergency services in Australia; and imminent harm to the Australian economy.
Some concerns have been raised about the definition of public health used in the explanatory memorandum, which is “intended to include the government system for providing for the health needs and services of all Australians, including preventative health measures, on the understanding that, if this system and these measures are undermined, the health of Australians will consequentially be undermined”.
Professor Sharon Friel told Croakey that it is fantastic that the Government is developing legislation to tackle online misinformation and disinformation, but that it will do little to prevent harm to public health if the legislation focuses only on health services. “The legislation must address the misinformation and disinformation related to the environmental, social and commercial determinants of health such as fossil fuels, plastics and junk foods,” she said.
Professor Kathryn Backholer also suggested the definition of public health was overly narrow and wouldn’t encompass other instances of misinformation and disinformation that may cause serious harm to public health in Australia.
“The Bill needs to walk a fine line between protecting the Australian people from misinformation and disinformation that may cause serious harm, and freedom of speech and expression,” she said.
“So, whilst I fully believe that more needs to be done to prevent mis/dis information about broader public health issues that certainly cause serious harm, such as climate change, I’m not sure if this Bill is the place to do it. There has already been major push back and I think taking the Bill further risks any political momentum on the issue at all.”
In a snap briefing last night, the Australian Democracy Network encouraged stakeholders to make submissions on the bill urging tougher regulatory action, and calling for climate denial to be included as a type of misinformation. Rebecca Zosel’s posts from the discussion are here.
Professor Daniel Angus, Director of the Digital Media Research Centre at Queensland University of Technology, wrote in The Conversation that “the new version of the bill suggests the Government listened to some expert recommendations from the consultation process, but ignored many others”.
According to Angus, the draft legislation was criticised for its potential to limit free speech. He says the new bill “remains cautious” with protections for political discourse and public interest communication including satire, humour, and professional news content.
However, intent is very hard to prove, he says, and the bill also doesn’t cover mainstream media, which he says is problematic as some mainstream media outlets are “prominent contributors to the spread of misinformation”.
The Department told Croakey it has seen significant misinformation in the media about the bill.
Federal inquiry into social media
The Joint Select Committee on Social Media and Australian Society was appointed in mid-May to inquire into the influence and impacts of social media on Australian society, with its final report due on or before 18 November 2024.
Chaired by Sharon Claydon, Federal Member for Newcastle, the committee is exploring Meta’s decision to abandon deals under the News Media Bargaining Code, the role of Australian journalism, news and public interest media in countering misinformation and disinformation on digital platforms, the impact of algorithms on mental health, and the use of age verification to protect Australian children from social media.
The committee received 217 submissions, including from social media platforms, News Corp, human rights organisations and the Local and Independent News Association, as well as from approximately 20 health and related organisations and professionals, including FARE, PHAA and the Australian Medical Association.
Scams Prevention Framework
The Department of Treasury is seeking feedback on the Scams Prevention Framework exposure draft legislation before 4 October 2024.
Informed by a consultation, the Framework introduces principles-based obligations requiring regulated entities “to take reasonable steps to prevent, detect, report, disrupt and respond to scams”.
Many of the measures in the draft are designed to put more responsibility on banks, telecommunications and digital platforms, including social media, paid search engine advertising and direct messaging services.
Given that scams cost Australians $2.7 billion in 2023, and have emotional, psychological and physical impacts on victims, this is another important piece of legislation.
Mohiuddin Ahmed, Senior Lecturer in Computing and Security at Edith Cowan University, said in The Conversation that the framework needs to be “broader in scope if it is to achieve its aim”.
“It is important to focus on the entire ecosystem of scams,” Ahmed writes. We should look at the “entire end-to-end pipeline of scams and hold everyone who is a part of that accountable”, including how criminals obtain people’s contact details, whether via the dark web, automated random number generators, or Google.
A summary of the Scams Prevention Framework reforms can be read here.
Privacy and Other Legislation Amendment Bill
The Privacy and Other Legislation Amendment Bill 2024 was tabled last week and referred to the Legal and Constitutional Affairs Legislation Committee for inquiry and report by 14 November 2024. Submissions close on 11 October.
Brownbill, at FARE, told Croakey that while “there were small steps in the right direction” in the bill, including the requirement for the Office of the Australian Information Commissioner to develop a Children’s Online Privacy code, it “fell short of addressing foundational issues when it comes to protecting people’s privacy online”.
She said this includes measures to “help curtail” the extensive use of data that enables harmful digital marketing practices.
“One of the key issues of concern is the way platforms use data-driven marketing models that target people in harmful ways,” Brownbill said.
“When we consider companies marketing alcohol using these digital platforms, this means that people most at risk of harm are being targeted with alcohol ads, including when they are feeling their most vulnerable.
“It also means that children are being tagged as interested in alcohol and targeted with alcohol advertising by these platforms – and this is all happening under the radar”.
Similar concerns were raised in The Conversation this week by Dr Priya Dev, Lecturer and Academic in Data Science, Digital Assets and Distributed Ledgers at the Australian National University.
Dev traced the origins of a spate of unsolicited phone calls back to a 2014 marketing campaign and a “network of hidden connections between data brokers, telemarketers and large organisations”.
Dev said the government’s privacy reforms “are a small step in the right direction. But until data brokers are required to obtain explicit consent before trading personal information, they fall far short of being a giant leap forward”.
Peter Lewis, Executive Director of Essential Media, wrote in The Guardian this week that measures to verify age on social media platforms “will create honey pots of high-value personal information”, especially when our privacy laws are not tight.
A Guardian Essential report found that, overall, 67 percent of adults support the Government’s proposed ban on social media for children, while more than three-quarters support stricter policy measures on privacy and personal information.
AI disclaimer
Interviews conducted in researching this story were recorded and transcribed using the artificial intelligence and machine learning speech-to-text transcription app Otter.ai. Comments quoted in this story were verified against the audio recordings.
See Croakey’s archive of articles on digital platforms