Will Artificial Intelligence Cause an Emissions Crisis?
With the rise of deep learning artificial intelligence (AI) and large language models (LLMs), excess energy usage and greenhouse gas (GHG) emissions have become increasingly significant topics for some investors. We believe these concerns are valid; however, improvements in technology, the use of clean energy sources, the proliferation of hyper-specific AI applications, and AI-generated industry efficiencies should offset the industry’s growing energy demands.
AI and Energy Consumption
LLM AI models like ChatGPT, Bard, and Llama are trained on variations of GPUs. These devices are extremely energy intensive: the industry-standard H100 chip draws up to 700W – which, assuming 61% utilisation over a year, amounts to more electricity than the average US household consumes.1 Queries submitted to ChatGPT and other generative AI tools are also estimated to consume more than 10x the energy of a standard Google search.2 In a 2023 study, analysts estimated that ChatGPT was consuming around 564 megawatt hours of electricity per day (at peak popularity) – roughly equivalent to the daily usage of 19,000 households.
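The per-chip and per-household figures cited above can be reproduced with simple back-of-envelope arithmetic (a sketch assuming the cited 700W draw, 61% average utilisation, and an 8,760-hour year):

```python
# Back-of-envelope check of the figures cited above.
H100_MAX_DRAW_KW = 0.7      # 700 W peak power draw
UTILISATION = 0.61          # assumed average annual utilisation
HOURS_PER_YEAR = 24 * 365   # 8,760 hours

# Annual electricity consumption of a single H100 at 61% utilisation
annual_kwh = H100_MAX_DRAW_KW * UTILISATION * HOURS_PER_YEAR
print(f"Annual consumption per H100: {annual_kwh:,.0f} kWh")

# ChatGPT estimate: 564 MWh per day spread across 19,000 households
per_household_kwh_day = 564_000 / 19_000
print(f"Implied daily household usage: {per_household_kwh_day:.1f} kWh")
```

The second figure (just under 30 kWh per household per day) is consistent with typical US residential consumption, which lends the 19,000-household comparison some credibility.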
Furthermore, as AI models have advanced, both their size and the volume of data required to train them have grown exponentially. GPT-1 was trained on a dataset of roughly 11,000 unpublished books. GPT-2 scaled to 1.5 billion parameters, and GPT-3.5, released early last year, to around 175 billion.3 GPT-4, the most advanced iteration, is estimated to contain 1.8 trillion parameters.4 Naturally, larger models trained on larger datasets require more GPU power and usage – further contributing to emission fears.
Technology Advancement
While the above metrics may heighten concerns over the growth of AI and its consequences for energy consumption and emissions, it is key to note that a version of this exact concern has already played out in the datacentre industry.
Over the past decade, growing internet usage and the infrastructure needed to support it invited similar fears of excessive energy consumption by datacentres. Between 2015 and 2022, the number of internet users almost doubled and global internet traffic increased 8x, requiring significant infrastructure development for global firms to keep up.5 Despite this, datacentre energy use over the period only increased by roughly 20-70%, implying a relatively muted CAGR of 2.6-7.8%.6 This was largely thanks to efficiency improvements in technology, increased usage of cheaper and cleaner energy, and the consolidation of less-efficient, small-scale data warehouses into more efficient hyperscale datacentres.
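The implied CAGR range follows directly from the cited figures (a sketch assuming a seven-year window from 2015 to 2022 and total growth of 20% to 70%):

```python
# Compound annual growth rate implied by total growth over a period.
def cagr(total_growth: float, years: int) -> float:
    """Annualised growth rate for a given total growth fraction."""
    return (1 + total_growth) ** (1 / years) - 1

years = 2022 - 2015       # 7-year window
low = cagr(0.20, years)   # +20% total datacentre energy growth
high = cagr(0.70, years)  # +70% total datacentre energy growth
print(f"CAGR range: {low:.1%} to {high:.1%}")  # matches the ~2.6-7.8% cited
```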
The trajectory of the AI megatrend is unlikely to differ greatly from that of the internet. Just as datacentres were quickly optimised to accommodate exponential growth at minimal energy cost, companies will seek to improve the efficiency of AI because there is a clear economic incentive to do so.
Jensen Huang, the CEO of Nvidia, recently stated in an interview that while he believes AI datacentre capacity will likely double within the next five years, “you can’t assume that (companies) will just buy more computers… you have to also assume computers are going to become much faster”.7 The same can be said of energy consumption. The GPU architecture of the future is unlikely to be defined by how much power it can consume, but rather by how efficiently it can use that power.
Hyper-Specific AI Models
Increasingly efficient hardware is not the end of the story: how AI software is utilised will also play a significant role in preventing an emissions explosion. A recent study by AI app-builder Hugging Face found that AI models built on smaller datasets can perform just as well as models built on much larger datasets when optimised for specific tasks.8 The study also revealed that these task-focused AI tools had carbon footprints up to 30x smaller than those of a larger generalised model performing the same task.9
These results have significant implications for future implementations of deep-learning AI. It is our view that while there may be room in the market for one or two truly generalised AI tools aimed at retail audiences, not all companies will employ such tools in their workflows. Task-specific (and thus resource-efficient) AI will likely become the main use case for the majority of institutions.
Using Clean Energy
There is little doubt that the training and deployment of AI will require an immense amount of energy over the next decade, and with energy comes emissions. But that relationship can be severed through the use of renewable and clean energy sources.
Most AI hardware is being deployed in existing datacentre infrastructure, and almost all of the world’s major datacentre operators are committed to clean energy solutions. Microsoft and Google have pledged to reach 100% clean energy usage for their datacentres by 2030.10 Amazon is aiming to do the same by 2025.11 These three companies alone account for more than 50% of the world’s large-scale datacentre capacity.12 Notably, they are also major proponents of AI software: Microsoft holds a major stake in ChatGPT owner OpenAI, Google develops LLMs such as Bard and PaLM, and Amazon is increasingly embedding AI across AWS. On a global scale, America dominates the datacentre industry, hosting 40% of the world’s datacentres, followed by China with 8% and Japan with 6%.13 In this context, the overall decarbonisation of electricity grids will be key to keeping emissions in check.
Datacentres also have a strong track record of minimising emissions. As of 2023, roughly 1.5% of global energy consumption could be attributed to datacentres, yet they accounted for only 0.6% of global GHG emissions.14 The AI industry itself, while nascent, is also unlikely to contribute excessive emissions beyond hardware implementation. The below chart shows the average Bloomberg ESG environment pillar score across multiple investment universes, among which AI neither significantly deviates nor underperforms.15
AI-Generated Efficiencies
Because AI can effectively analyse volumes of data that would otherwise be impossible to process, it can generate significant efficiencies across multiple industries – efficiencies that may assist in combating climate change and reducing emissions.
Climate-friendly use cases of AI include analysing weather patterns to reduce water needs in agriculture, creating new materials that avoid resource wastage, increasing efficiency in energy grids, forecasting GHG emissions, monitoring deforestation, and much more. A recent report by McKinsey estimates that AI-driven technologies could reduce CO2 emissions in companies by 10% and reduce energy use by up to 20%.16 Another study estimates that AI will reduce global CO2 emissions by 20% by 2030, and that these technologies can save 9.7x more emissions than they generate.17
These forecasts assume rapid growth of efficient AI integration across numerous industries. Nonetheless, they paint an optimistic picture of how AI could be beneficial, rather than detrimental, to the environment.
AI Growth Will Limit Environmental Impact
AI hit an inflection point in 2022 when ChatGPT showed the world the technology’s true potential. However, it is important to recognise that the industry remains extremely nascent and the technology supporting it today is far from optimal or complete. It is our view that, as with other technologies before it, the steep costs that loom over AI today will be driven down in the pursuit of efficiency over time. The datacentre industry, which will be most impacted by the proliferation of AI, has strong clean energy commitments, and AI itself has the potential to reduce emissions across many fields. Overall, the proliferation of AI over the next decade should not represent a significant risk to the climate, especially when compared to legacy industries that remain in the process of transformation.
Related Funds
FANG: The Global X FANG+ ETF (ASX: FANG) invests in 10 companies at the leading edge of next-generation technology that includes household names and newcomers.
SEMI: The Global X Semiconductor ETF (ASX: SEMI) invests in leading companies along the semiconductor value chain including designers, technology developers and manufacturers.