Introduction by Croakey: The power of Big Tech companies like Google/Alphabet and Meta is a critical determinant of health, and it is exerted in many ways, some obvious and others less so.
An important federal court judgement in the United States recently found Google/Alphabet had breached antitrust legislation in two areas of operations: general search services and general search text advertising.
The judgement mentions “power” 141 times, mostly in relation to monopoly power, “the power to control prices or exclude competition”.
It also examines the “power of default”; for example, how Google pays vast sums (more than US$26 billion in 2021) to browser developers, mobile device manufacturers and wireless carriers to ensure the Google search engine is installed as the default option and that no rival search engines are pre-loaded.
The implications of these arrangements are probably not widely understood or appreciated by the general community, much less how they shape access to information, advertising and news, and thus influence the health of people and planet.
The judgement noted that Google’s dominance has gone unchallenged for well over a decade. In 2009, 80 percent of all search queries in the US already went through Google; by 2020, it was nearly 90 percent, and even higher on mobile devices at almost 95 percent.
Bing, during that same period, never held a market share above 11 percent, and today it stands at less than 6 percent, while Yahoo, once considered Google’s closest competitor, now holds less than 2.5 percent of the market (presumably the judgement is referring to the US market).
The trial also heard concerns, CNN reported this week, that Google’s market power has given it an unfair advantage in the development of artificial intelligence because the enormous amount of search data provided through these default agreements is used to develop AI models.
The outcomes of the federal court decision, described by CNN as a “staggering court defeat” for Google, are not yet clear as the company plans to appeal, and penalties are yet to be determined.
However, the case has renewed political and public attention on the power of digital platforms; an analysis published by The Conversation suggests the US judgement, alongside recent EU rulings, may open up the tech market to other competitors.
The judgement also noted that Meta has been “wildly successful” in selling social ads on Facebook and Instagram; between 2018 and 2021, the company’s ad revenue grew by about 150 percent.
All of which underscores the importance of research that interrogates how Big Tech companies exercise their power and with what effect, especially for public health.
On that note, an interesting new publication by researchers from the University of Queensland, Queensland University of Technology, Curtin University, Monash University and the Foundation for Alcohol Research and Education (FARE) has taken a deep dive into what is known as “tuned advertising”, which is explained as “a dynamic and unfolding process where ads are continuously algorithmically ‘optimised’ to users in real time”.
The paper encourages readers to reconceptualise the nature of advertising online, and to understand that the power of Alphabet and Meta’s advertising models rests “both in their capacity to translate social life into data that trains algorithmic models and the opacity of those models to public observability”.
The authors propose “the emergence of a new cultural form where power is located in the capacity to tune and shape our patterns of consumption and ways of life”.
Below, two of the researchers involved, Dr Aimee Brownbill and Lauren Hayden, encourage Croakey readers to engage with current policy opportunities to address the power of Big Tech.
Aimee Brownbill and Lauren Hayden write:
In our new paper, published in Internet Policy Review, we take a deep dive into tuned advertising on digital platforms and ask how this dark, dynamic and hyper-targeted advertising model can be observed to hold companies accountable for their digital marketing practices.
What is ‘tuned’ advertising? You’ve likely noticed it when you pause for a second and think: how bizarre, I was just talking about this product with my friend and now it’s showing up in my feed.
We present tuned advertising as a dynamic and unfolding process where ads are continuously algorithmically ‘optimised’ to people in real time.
While we might commonly think of ‘targeted’ advertising as targeting characteristics like our age, gender, ethnicity and interests, the hyper-targeted tuned advertising that shapes what we see online is far more algorithmically driven and insidious.
The concept of ‘tuned’ advertising better captures the real-time experimentation that digital platforms conduct on us, utilising the extensive data they access and generate about us to see which ads are most likely to lead us to buy a product. Think: what you click on, the comments you make, what you share, what you bought 30 minutes ago, the locations you visited yesterday, what the friends you caught up with last night are searching for online.
Targeted advertising assumes a static audience, such as women or pet-owners. By contrast, tuned advertising aims to connect with our changing moods, interests and desires.
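To make the contrast concrete, here is a minimal, hypothetical sketch of the two models in Python. It uses an epsilon-greedy bandit loop, a simple textbook form of the kind of online optimisation our paper describes; it is not a reconstruction of any platform’s actual system, and the ad names, the `targeted_ad` rule, the `TunedAdServer` class and the click rates are all invented for illustration.

```python
# Hypothetical illustration only: an epsilon-greedy bandit loop of the kind
# used for online ad optimisation. All names and numbers are invented; this
# is not a reconstruction of any platform's actual system.
import random

ADS = ["ad_A", "ad_B", "ad_C"]  # invented ad creatives


def targeted_ad(user: dict) -> str:
    """Static 'targeted' advertising: one fixed rule per audience segment."""
    return "ad_A" if user["segment"] == "pet-owner" else "ad_B"


class TunedAdServer:
    """Dynamic 'tuned' advertising: the choice of ad shifts in real time as
    click feedback arrives, so the audience is re-estimated continuously."""

    def __init__(self, ads, epsilon=0.1):
        self.ads = ads
        self.epsilon = epsilon              # exploration rate
        self.shows = {ad: 0 for ad in ads}  # impressions per ad
        self.clicks = {ad: 0 for ad in ads}  # clicks per ad

    def choose(self) -> str:
        if random.random() < self.epsilon:
            return random.choice(self.ads)  # explore: try a random ad
        # Exploit: pick the ad with the best observed click rate so far;
        # untried ads score infinity so each gets shown at least once.
        return max(self.ads, key=lambda ad: (self.clicks[ad] / self.shows[ad])
                   if self.shows[ad] else float("inf"))

    def record(self, ad: str, clicked: bool) -> None:
        self.shows[ad] += 1
        self.clicks[ad] += int(clicked)


# Simulated feedback loop: the server converges on whichever ad this
# (invented) user is most likely to click, without being told why.
hidden_click_rate = {"ad_A": 0.02, "ad_B": 0.05, "ad_C": 0.11}
server = TunedAdServer(ADS)
for _ in range(5000):
    ad = server.choose()
    server.record(ad, random.random() < hidden_click_rate[ad])
best = max(ADS, key=lambda ad: server.clicks[ad] / max(server.shows[ad], 1))
print(best)  # almost always "ad_C" after enough impressions
```

The point of the sketch is the feedback loop itself: nothing in the code knows who the user is or why they click, yet the system still homes in on whatever works from moment to moment, which is part of what makes tuned advertising so difficult to observe from the outside.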
What does this all mean and why should we care?
The deeply invasive capacity for digital advertising to be tuned to a specific audience, a specific person, or a specific moment makes it all the more powerful as persuasive communication, aiming to get us to click the ad and buy more of the advertised products.
This can have many harmful implications for ourselves and our communities, and we share one example for you to ponder: what does it mean when advertising for harmful and addictive products is tuned to an individual?
What harms will it create when a person is identified, at the moment they are feeling the most vulnerable, as the perfect target of advertisements for harmful and addictive products like alcohol and gambling?
When a person is seeking help online for gambling or alcohol use, only for marketing algorithms to flag them as the perfect target for more gambling and alcohol ads?
What about when social media companies are monitoring kids’ moods, to identify when they are feeling overwhelmed and anxious, and target them with ads in these moments?
Observing this advertising is a challenge that we, like many others, are taking up to create more transparency and accountability for digital platforms and the companies engaged in the data-driven digital marketing economy. The aim is for harms to be promptly identified and, better yet, prevented, so that we can all safely enjoy the benefits of engaging online in this digital era.
What can you do right now?
The ability of companies to collect, use and share extensive amounts of data for marketing purposes underpins the tuned, hyper-targeted advertising model.
At the moment, the Australian Government is looking to implement privacy reforms through amending the Privacy Act, which determines what data companies can collect about us and what they can do with this data.
The Government is proposing important measures, including:
- Prohibiting direct marketing and targeting to children and the trading of children’s personal information
- Implementing requirements for companies to regard a child’s best interest when considering the collection and use of their data.
The Privacy Act has now been under review for years, but momentum is lagging.
Digital Rights Watch is asking people to prompt the Government to act now, by submitting a brief letter to decision makers calling for government action before the next Federal election. Click here to submit a letter via their website.
Decision makers must implement measures that give us more power over how our data is collected and used and stop this real-time experimentation without our consent.
These are crucial changes to help keep us all safe online.
• Learn more about what FARE is doing to keep Australians safe online here.
Further reading
The Conversation: Social media algorithms are shrouded in secrecy. We’re trying to change that
See Croakey’s archive of articles on digital platforms