Introduction by Croakey: On Twitter, Croakey is using the hashtag #RegulateDigitalPlatforms as a resource for archiving useful articles about the wide-ranging ways Big Tech companies are shaping our health.
Recent reports shared at the hashtag examine how Facebook failed to enforce its own rules to curb an oil and gas industry misinformation campaign over the climate crisis during the United States presidential election.
Another recent article argues that social media companies could do much more to stop the spread of misinformation and disinformation about COVID-19 vaccines.
At the hashtag you will also find this pithy tweet, “Disinformation pays”, referring to Facebook’s latest reported earnings, with more than US$29 billion revenue recorded for the second quarter of 2021. CNBC also noted that Facebook has 3.51 billion monthly users, across its main app, Instagram, Messenger and WhatsApp.
Meanwhile, Hannah Pierce and Julia Stafford from Cancer Council Western Australia have been investigating how online platforms are enabling alcohol marketing to children. (You may wish to share the article using the hashtag, #RegulateDigitalPlatforms…).
Hannah Pierce and Julia Stafford write:
Facebook, Instagram, Google, YouTube, TikTok. Digital platforms have opened a world of opportunities for alcohol companies to promote their products, in a setting that’s often described as ‘dark’ – only visible to those in the target audience, and not open to public scrutiny.
This raises a concerning question that researchers and policy-makers struggle to answer: what are Australian children actually seeing online when it comes to alcohol?
Digital platforms give alcohol marketers the ability to activate age restriction controls that prevent children from accessing alcohol promotions, and using these controls has been a requirement of the industry's marketing code of practice since 2017.
But with no compliance monitoring mechanisms, it is unknown if alcohol companies are indeed using these controls.
Our new research, published in Public Health Research and Practice, aimed to examine just this – the extent to which the dominant alcohol companies in Australia have activated age restriction controls on their official brand accounts on Facebook and Instagram. Our findings paint a concerning picture about what kids might be seeing online.
What did we find?
Accessing children
We identified 195 brands owned by the top three beer, top three wine, and top three spirit companies in Australia (nine companies in total), and located 153 official Facebook and 151 official Instagram brand accounts.
In assessing the presence of age restriction controls on the accounts, we found 28 percent of Instagram and five percent of Facebook accounts did not have controls activated. Only one wine company and one spirit company had activated controls across all their accounts.
What does this mean?
Despite the requirement that alcohol marketers activate age restriction controls on social media, some alcohol brand accounts on Facebook and Instagram are accessible to Australian kids.
It appears some of the top alcohol companies are not complying with even the most basic marketing rules, despite all nine companies being signatories to the industry code.
Following the release of our research, one alcohol company blamed “a process error” as the reason for the absence of age restriction controls on two of their accounts, noting they had “acted swiftly to put these settings in place again”.
But don’t be fooled into thinking this will stop harmful marketing from reaching children online.
Firstly, the controls rely on users registering with their correct date of birth, and adults not sharing devices with children – two conditions that are unlikely to be met in many households.
Secondly, research has shown young people are exposed to alcohol promotions through other avenues online, such as in e-sports games and via social media influencers. Some alcohol producers in Australia are also using TikTok to promote their products, despite the platform’s young audience.
Voluntary codes don’t work
Our findings add to the body of evidence demonstrating that voluntary industry codes fail to protect kids from harmful alcohol marketing. It’s evident the alcohol industry cannot be trusted to regulate its own marketing.
Do digital platforms have a role in regulating unhealthy marketing?
Some may argue that there could be a role for digital platforms in implementing restrictions on alcohol marketing. Currently, Facebook’s advertising policy, which also applies to Instagram, simply requires companies to comply with all applicable industry codes.
But implementing stronger controls presents an inherent conflict of interest for Facebook, given the advertising revenue it receives. It is questionable how far the company would voluntarily go to restrict marketing in a way that effectively protects kids.
As an example, Facebook recently announced changes to “how advertisers can reach young people” to give “young people a safer, more private experience”.
Soon, advertisers will only be allowed to target ads to people under 18 based on their age, gender, and location – they will no longer be able to target teens based on their ‘interests’.
While this may sound like a step forward, organisations including the Foundation for Alcohol Research and Education and Reset Australia have highlighted that Facebook will still be collecting children’s data, and advertisers can still target them based on other personal information.
This latest move from Facebook could be seen more as an attempt to avoid stronger external regulation than as a genuine effort to prioritise children’s wellbeing.
If children are really to be protected from harmful marketing online, mandatory government regulation, including an effective monitoring system, is likely to be required.
But how do we achieve this?
Can we turn to other jurisdictions for guidance?
Australia isn’t alone in wrestling with how to effectively regulate alcohol marketing on digital platforms.
Global perspectives
The World Health Organization reports that while most countries have at least some restrictions on alcohol marketing, almost half have no restrictions on alcohol marketing via the internet and social media. Possible reasons for this regulatory gap include the sheer pace at which digital platforms have evolved, leaving policy-makers playing constant catch-up, and the vast power of these platforms, which allows them to delay government regulation.
Despite the challenges, several jurisdictions have introduced controls on digital marketing.
In 2015, Finland placed some restrictions on alcohol advertising on social media, largely focussed on banning the use of user-generated content (for example, alcohol companies sharing pictures or videos created by consumers). In 2018, a complete ban on all alcohol marketing, including online, was introduced in Lithuania.
And while the UK doesn’t have comprehensive restrictions on alcohol marketing, in June 2021 the UK Government announced that online paid-for advertising of unhealthy food would be restricted to help address obesity. While this move is not perfect, as brands can continue to advertise within ‘owned media’ spaces online (including blogs, websites, or social media pages), it demonstrates that if there is political will, governments can take action to regulate marketing on digital platforms.
So where to from here?
Governments face immense challenges in regulating digital platforms – as Associate Professor Kathryn Backholer recently wrote, Big Tech “arguably has more power and influence over public health than Big Tobacco, Big Alcohol and Big Food combined.”
For those in public health, adding Big Tech to the list of industry opponents may be demoralising.
But while our children can access alcohol-related content on social media, stronger controls are urgently needed.
The Federal Government must create higher standards for how alcohol is promoted across all media channels, including social media. Continuing to call out the alcohol and tech industries’ tactics remains crucial in achieving a mandatory, effective regulatory system.
Hannah Pierce is the Alcohol Policy & Research Coordinator and Julia Stafford is the Alcohol Program Manager at Cancer Council Western Australia.
See Croakey’s archive of stories about the impact of digital platforms upon health.