The immense market power of Facebook and Google is creating wide-ranging but poorly understood public health challenges, as outlined in a recent landmark report from the Australian Competition and Consumer Commission (ACCC).
The ACCC’s final report from its Digital Platforms Inquiry, released on 26 July, makes sweeping recommendations for increased regulation of these platforms, including to better protect the privacy and interests of consumers.
It is important reading for public health advocates, some of whom are holding a videoconference meeting this Friday (30 August) to discuss the public health opportunities arising from the report’s 23 recommendations.
The meeting, co-hosted by the Public Health Association of Australia (PHAA) and the Foundation for Alcohol Research and Education (FARE), aims to encourage the sector to engage with the Government’s consultation process, which is seeking views on “practical options for implementation, timing and any impediments or challenges”.
Submissions are due on 12 September, and Treasury, together with the Attorney-General’s Department and the Department of Communications and the Arts, will then undertake a targeted consultation process.
Among the many public health concerns exposed in the report are Facebook and Google’s lack of accountability for spreading disinformation, low levels of digital media literacy in the community, and declining media coverage of health and science.
Further scrutiny required
The ACCC recommendations span competition law, consumer protection, media regulation and privacy law, and also call for an ongoing watch over Facebook and Google to “provide more consistent scrutiny of potentially anti-competitive behaviour and consumer harms”.
The report also recommends reforms to support public interest journalism, especially regional, local and non-profit journalism, including enabling deductible gift recipient (DGR) status for not-for-profit organisations that “create, promote or assist the production of public interest journalism”.
It says that philanthropically-funded and not-for-profit journalism could perform a more significant role in addressing the risk of under-provision of public interest journalism in Australia, noting the increasing prevalence and success of this kind of journalism overseas.
Journalist Margaret Simons, Associate Professor at Monash University, has described the report as “potentially one of the most important documents in recent national history, with the potential to affect every area of life”.
Michael Thorn, Chief Executive of FARE, told Croakey the report’s recommendations offered an important opportunity for addressing critical public health concerns, including around consumer data protection, digital marketing and the need for a strong regulatory framework for the platforms.
“We want people to be aware of the report and the recommendations, and to make a submission or joint submissions to the consultation process,” he said. This briefing was prepared for the meeting.
Dr Ruth Armstrong, a Croakey editor, will join the meeting on Friday on behalf of Croakey Health Media, whose submission to the inquiry is cited several times in the report.
Addressing harmful market power
When releasing the 619-page report, ACCC Chair Rod Sims called for governments and regulators to ensure much greater transparency and oversight of Facebook and Google.
He said the ACCC now had five separate investigations underway into the platforms as a result of information that had come to light during the inquiry, and he believed that more would follow.
He said the report’s recommendations sought to address harms arising from the market power of Google and Facebook, including:
- Consumers do not understand how much of their data is being collected and how it is being used. The “privacy” policies are long, complex and opaque.
- Consumer trust in the data economy is vital but is being undermined by the platforms. As consumers become more aware of these issues, Sims warned, they may withdraw from engaging in the data economy, reducing the potential societal benefits of data usage.
- The use of data and analytics in advertising can exploit behavioural biases and consumer vulnerabilities “on a scale that we’ve never seen before”, with political implications.
- The rise of the digital platforms has caused significant harm to public interest journalism.
- The digital advertising market is opaque and unfair.
- Trust in news and information is in decline, increasing the potential for misinformation and bias, and affecting debate on key societal issues.
Setting the scene
Each month, about 19.2 million Australians use Google Search, 17.3 million access Facebook, 17.6 million watch YouTube (which is owned by Google) and 11.2 million access Instagram (which is owned by Facebook).
The Google and Facebook business models are based on attracting large numbers of users, building rich data sets about them, and using these data to sell advertising.
About 95 percent of general searches in Australia are performed through Google, with this company earning almost 96 percent of all search advertising revenue in Australia.
The ACCC concluded that Google is largely insulated from dynamic competition.
Facebook and Instagram together are estimated to have 51 percent of the online display advertising market in Australia, with no other online supplier of display advertising having a market share of greater than five percent.
The ACCC found that no other businesses come close to the level of tracking undertaken by Google and Facebook.
It is estimated that more than 70 percent of websites have a Google tracker and more than 20 percent of websites have a Facebook tracker. It is also estimated that of the apps available on the Google Play store, 88 percent send user data back to Google and 43 percent send user data back to Facebook.
Advocating for public interest journalism
The ACCC’s research has highlighted concerns with the reduced production of particular types of news and journalism, including local government and local court reporting, which it says are important for the healthy functioning of the democratic process (as outlined forcefully from a United States perspective in this article by Senator Bernie Sanders).
The report recommends that grants of $50 million per annum be made to support local reporting, which it defines as original journalistic coverage of matters relevant to local and regional communities – such as local courts, local issues and local government (what a shame health is not mentioned here!).
Census data shows that from 2006 to 2016, the number of Australians in journalism-related occupations fell by nine percent overall, and by 26 percent for traditional print journalists (including those journalists working for print/online news media businesses).
Data provided by the main media companies show the number of journalists in traditional print media businesses fell by 20 percent from 2014 to 2018. This is at a time when Australia’s population and economy were growing strongly.
Highlighting significant issues for rural health, data collected by the ACCC show that between 2008 and 2018, 106 local and regional newspaper titles closed across Australia, representing a net 15 percent decrease in the number of these publications.
These closures have left 21 local government areas previously covered by these titles without coverage from a single local newspaper (in either print or online formats), including 16 local government areas in regional Australia.
The ACCC also carried out a quantitative assessment of print articles published in all metropolitan and national daily newspapers by the three largest Australian news publisher groups, and found a significant fall in the number of articles published covering local government, local court, health and science issues during the past 15 years.
The reduction was for both the absolute number of articles published in each of these categories and the percentage of total articles published attributed to these categories.
This analysis found that the publications:
- published 26 percent fewer articles on local government issues in 2018 than at the peak of local government coverage in 2005 (a drop from around 11,400 articles a year to around 8,400 articles a year)
- published 40 percent fewer articles on local court matters in 2018 than at the peak of local court reporting in 2005 (a drop from around 11,900 articles a year to around 7,200 articles a year)
- published 30 percent fewer articles on health issues in 2018 than at the peak of health reporting in 2004 (a drop from around 21,600 articles a year to around 13,300 articles a year)
- published 42 percent fewer articles on science in 2018 than at the peak of science reporting in 2006 (a drop from around 6,400 articles a year to around 3,700 articles a year).
The report found that digital platforms now act as gateways to news and information on the internet for many Australians, wielding considerable influence over the news Australians see and performing curatorial functions when surfacing information.
However, the atomisation of media content and the risk of misinformation and disinformation being spread on digital platforms make it difficult for consumers to evaluate the veracity, trustworthiness and quality of the news content they receive online.
The report says this may have the effect of undermining democratic processes, as the ability of consumers to recognise high-quality news is essential for a well-functioning democracy.
As a consequence of digital platforms’ personalisation of content to users, it can also be difficult to establish the level of disinformation or misinformation presented to consumers on digital platforms.
There have been frequent examples of disinformation and malinformation campaigns attempting to affect democratic processes in the United States, the United Kingdom and the European Union; and there is growing public concern about highly inaccurate and misleading information being surfaced to Australian consumers.
These issues present a compelling argument for addressing these concerns as a public policy issue in Australia. Disinformation and malinformation are not accidental. Some individuals and businesses deliberately spread inaccurate information in a systematic way to try to influence public opinion by targeting individuals or groups, or simply to make money.
While public interest journalism contributes to a healthy democracy, disinformation and malinformation do the opposite.
To the degree that online consumption makes it harder for public interest journalism to reach audiences, but easier for disinformation and malinformation to do so, this is clearly a significant public policy concern.
The ACCC concluded that few consumers are fully informed of, fully understand, or effectively control, the scope of data collected and the bargain they are entering into with digital platforms when they sign up for, or use, their services.
The report says:
“There is a substantial disconnect between how consumers think their data should be treated and how it is actually treated.
Digital platforms collect vast troves of data on consumers from ever-expanding sources and have significant discretion over how this user data is used and disclosed to other businesses and organisations, both now and in the future.
Consumers also relinquish considerable control over how their uploaded content is used by digital platforms.”
Digital platforms also tend to understate to consumers the extent of their data collection practices while overstating the level of consumer control over their personal user data.
The report identifies that some groups are at increased risk of manipulation:
“The extensive amount of data collected by digital platforms may include data that identifies (or infers) an individual’s vulnerabilities.
The detriments identified above can be especially harmful to vulnerable consumers by placing them at risk of being targeted with inappropriate products or scams, discriminated against, or inappropriately excluded from markets.”
Submissions highlighted that the risks associated with data collection and use could be particularly acute for children. The ACCC recommends that digital platforms be required to minimise the collection of children’s personal information, with additional restrictions where children’s personal information is collected, used or disclosed for targeted advertising or online profiling purposes.
In addition to risks posed to young children, the ACCC said psychological profiling of consumers may facilitate discrimination against certain groups on the basis of their willingness to pay as well as for their gender, race or sexual orientation.
Tools that target consumers based on their online profiles and browsing history may also unfairly exclude them from products and services. For example, consumers from low socioeconomic backgrounds would be harmed if online profiling were used to distinguish between high-value and low-value customers, particularly in essential services markets.
Surveillance in action
In June 2018, an ACCC staff member downloaded their Facebook data. They found that Facebook had stored their ‘active’ user activity information, such as photos and comments posted on Facebook.
They also found that Facebook stored data that had been collected passively, such as names and phone numbers of the user’s contacts from the user’s mobile device, even though those contacts were not the user’s Facebook friends.
Despite having location tracking turned off in their Facebook account settings, the staff member’s downloaded data showed Facebook had a comprehensive record of IP addresses matched to 53 different locations where the user had logged into their Facebook account.
The Facebook data showed that Facebook had also linked over 500 ad interests to the user’s profile and matched the user to contact lists provided by 127 advertisers, including frequent flyer programs and private health insurance companies.
In November 2018, an ACCC staff member downloaded the Google data attached to their Google family account.
The data downloaded covered 51 products and services, accessed through Google, that the Google family account had interacted with between 2011 and 2018.
The ACCC staff member found a wide variety of data had been stored to the account, including some data collected from 2011, covering a period which included multiple additions and changes to devices used by the family.
This data included a non-chronological list of every Android mobile app installed from 2014 to 2018 (comprising 2,482 Excel rows of data); orders made in the Google Play Store, including time of purchase, phone number, card type and expiry date, as well as the IP address from which each purchase was made; and the names and email addresses from a Google group set up and used in April 2011.
It also included a recording of every question asked to the family’s Google Assistant (by various family members including children) between January 2018 and June 2018 (when the staff member’s Google Home was active).
Location data was collected by a number of different products and services. For example, every photo stored had attached geodata: latitude, longitude and a timestamp of when the photo was taken.
Data stored with location history included latitude and longitude information.
The staff member also found that Google had stored copies of photos from 2011 to 2018, including photos which came from previous devices, and that had not been transferred to new devices or stored on the cloud.
The ACCC noted that digital platforms have also provided an important new avenue for scammers to exploit consumers and businesses. The number and sophistication of these scams is rapidly increasing.
Based on complaints received by the ACCC between 2014 and 2018, reports of scams occurring via social media increased by 188 percent over that period, and the value of losses incurred via scams on social media jumped by 165 percent.
As an example, in the week of 6-12 May 2019, the ACCC Scamwatch team received 165 reports of scams where Facebook was mentioned, with an estimated $70,000 in losses.