Introduction by Croakey: Many red flags about the use of artificial intelligence (AI) in healthcare are raised in a recent Australian Medical Association submission to a Senate inquiry into AI.
The submission calls for stringent regulation of AI, warning that potential risks from “a rushed or unregulated adoption of AI in healthcare are considerable”.
“It is imperative that AI technologies are applied in a way that does not exacerbate disparities in healthcare, including but not limited to those related to race, gender or socioeconomic status,” says the submission.
Interestingly, it does not address concerns about the environmental toll of AI, which the article below suggests is an important consideration for those working to make health services more environmentally sustainable.
The article below was first published at The Conversation under the headline, ‘Power-hungry AI is driving a surge in tech giant carbon emissions. Nobody knows what to do about it’.
(Below the article, Professor Enrico Coiera, Director of the Centre for Health Informatics, Australian Institute of Health Innovation, and Foundation Professor in Medical Informatics at Macquarie University, responds to some of the issues it raises.)
Gordon Noble and Fiona Berry write:
Since the release of ChatGPT in November 2022, the world has seen an incredible surge in investment, development and use of artificial intelligence (AI) applications. According to one estimate, the amount of computational power used for AI is doubling roughly every 100 days.
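To put that growth rate in perspective, here is a hypothetical back-of-envelope calculation (not from the article): doubling every 100 days compounds to roughly a twelve-fold increase in a single year.

```python
# Back-of-envelope: if AI compute doubles every 100 days,
# how much does it grow over one year?
doubling_period_days = 100
days_per_year = 365

annual_growth = 2 ** (days_per_year / doubling_period_days)
print(f"Annual growth factor: {annual_growth:.1f}x")  # ~12.6x
```

At that pace, compute demand grows by more than an order of magnitude per year, which is why even efficient data centres see rapidly rising totals.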
The social and economic impacts of this boom have provoked reactions around the world.
European regulators recently pushed Meta to pause plans to train AI models on users’ Facebook and Instagram data. The Bank for International Settlements, which coordinates the world’s central banks, has warned AI adoption may change the way inflation works.
The environmental impacts have so far received less attention.
A single query to an AI-powered chatbot can use up to ten times as much energy as an old-fashioned Google search.
Broadly speaking, a generative AI system may use 33 times more energy to complete a task than traditional, task-specific software would.
This enormous demand for energy translates into surges in carbon emissions and water use, and may place further stress on electricity grids already strained by climate change.
Energy
Most AI applications run on servers in data centres. In 2023, before the AI boom really kicked off, the International Energy Agency estimated data centres already accounted for 1–1.5 percent of global electricity use and around one percent of the world’s energy-related CO₂ emissions.
For comparison, in 2022, the aviation sector accounted for two percent of global energy-related CO₂ emissions while the steel sector was responsible for seven to nine percent.
How is the rapid growth in AI use changing these figures?
Recent environmental reporting by Microsoft, Meta and Google provides some insight.
Microsoft has significant investments in AI, with a large stake in ChatGPT-maker OpenAI as well as its own Copilot applications for Windows. Between 2020 and 2023, Microsoft’s disclosed annual emissions increased by around 40 percent, from the equivalent of 12.2 million tonnes of CO₂ to 17.1 million tonnes.
These figures include not only direct emissions but also indirect emissions, such as those caused by generating the electricity used to run data centres and those that result from the use of the company’s products. (These three categories of emissions are referred to as Scope 1, 2 and 3 emissions, respectively.)
Meta too is sinking huge resources into AI. In 2023, the company disclosed its Scope 3 emissions had increased by over 65 percent in just two years, from the equivalent of five million tonnes of CO₂ in 2020 to 8.4 million tonnes in 2022.
Google’s emissions were almost 50 percent higher in 2023 than in 2019. The tech giant’s 2024 environmental report notes that planned emissions reductions will be difficult “due to increasing energy demands from the greater intensity of AI compute”.
Water
Data centres generate a lot of heat and consume large amounts of water to cool their servers. According to a 2021 study, data centres in the United States use about 7,100 litres of water for each megawatt-hour of energy they consume.
Google’s US data centres alone consumed an estimated 12.7 billion litres of fresh water in 2021.
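Combining the two figures above gives a rough cross-check (an illustrative estimate only, assuming the 2021 US average of about 7,100 litres per megawatt-hour applies uniformly):

```python
# Rough estimate: the energy use implied by Google's reported
# US data centre water consumption, at the 2021 US average rate.
litres_per_mwh = 7_100          # water used per MWh of energy (2021 US study)
google_water_litres = 12.7e9    # Google US data centres, 2021

implied_mwh = google_water_litres / litres_per_mwh
print(f"Implied energy use: ~{implied_mwh / 1e6:.1f} million MWh")
```

Actual figures vary with cooling technology and location, so this is a sanity check of scale rather than a precise estimate.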
In regions where climate change is increasing water stress, the water use of data centres is becoming a particular concern.
The recent drought in California, where many tech companies are based, has led companies including Google, Amazon and Meta to start “water positive” initiatives.
These big tech firms have announced commitments to replenish more water than they consume by 2030. Their plans include projects such as designing ecologically resilient watershed landscapes and boosting community water conservation to strengthen water security.
Climate risk
Where data centres are located in or near cities, they may also end up competing with people for resources in times of scarcity. Extreme heat events are one example.
Globally, the total number of days above 50°C has increased in each decade since 1980. July 2023 was the hottest month ever recorded.
Extreme heat translates to health impacts on local populations. A 2022 Lancet study found that even a 1°C increase in temperature is associated with increased mortality and morbidity.
On days of extreme heat, air conditioning can save lives. Data centres also need to stay cool, so their power use spikes with the temperature, raising the risk of blackouts and instability in electricity grids.
What’s next?
So what now? As we have seen, tech companies are increasingly aware of the issue. How is that translating into action?
When we surveyed Australian sustainability professionals in July 2023, we found only six percent believed data centre operators provided detailed sustainability data.
Earlier this year we surveyed IT managers in Australia and New Zealand to ask what they thought about how AI applications are driving increased energy use. We found 72 percent are already adopting or piloting AI technologies.
More than two-thirds (68 percent) said they were concerned about increased energy consumption for AI needs. However, there is also significant uncertainty about the size of the increase.
Many IT managers also lack the necessary skills to adequately address these sustainability impacts, regardless of corporate sustainability commitments.
Education and training to help IT managers understand and address the sustainability impacts of AI are urgently needed.
Author details
Gordon Noble is a Research Director with the Institute for Sustainable Futures at the University of Technology Sydney (UTS) focusing on sustainable finance.
Fiona Berry is Research Principal at the Institute for Sustainable Futures at University of Technology Sydney. She is passionate about local food systems, community engagement and interdisciplinary research.
Both authors work on projects funded by corporations, government and philanthropic organisations.
Comments by Professor Enrico Coiera
The six-point statement below was provided in response to Croakey’s request for a health-related comment on the article above.
1. Large-scale machine learning, and especially Large Language Models (LLMs), do require a lot of power when training.
Here’s a recent review – https://www.nature.com/articles/s42256-023-00670-0.pdf
Quotes from the JAMIA special issue editorial on digital health and climate, which was edited by AIHI’s Prof Enrico Coiera and Prof Farah Magrabi.
“Computational resources such as cloud computing are energy intensive and data center energy use represented 1% of global energy usage in 2018”: Masanet E, Shehabi A, Lei N, Smith S, Koomey J. Recalibrating global data center energy-use estimates. Science 2020;367:984-6.
“The energy cost of developing multi-billion parameter AI models is also not inconsiderable”: Dodge J, Prewitt T, Tachet des Combes R, et al. Measuring the Carbon Intensity of AI in Cloud Instances. 2022 ACM Conference on Fairness, Accountability, and Transparency; 2022. p. 1877-94.
2. The big companies doing AI are moving to renewables or are already there – data centres for Google and AWS are all green. Google was the first major company to match energy use with 100% renewable energy in 2017.
3. The use of AI is not especially power hungry – it’s building the model that consumes the power.
4. There are few reliable estimates of this use, and they are dated – the recent models are just so much bigger.
5. These large models are generic rather than healthcare specific (e.g. ChatGPT) – so if they are being reused in healthcare, that might be a good thing in terms of sustainability, with the appropriate governance in place.
6. We are moving to small language models optimised for healthcare, so this might all be moot.
See Croakey’s archive of articles on AI and health