AI Backlash Coming?
By Rob Sigler, MBA
October 8, 2024
The inconvenient truth about AI, and the data centers that power it, is that they consume huge quantities of power to perform their computationally intense “learning.” According to Business Insider, ChatGPT, one of the most well-known AI models, uses approximately 500,000 kilowatt-hours of electricity per day to respond to its roughly 200 million daily requests. To put that in context, the average US household uses 30 kilowatt-hours per day, according to the United States Energy Information Administration. Said differently, ChatGPT uses enough energy each day to power nearly 17,000 homes. At what point do data centers start to drive up electricity prices, cause power shortages, and seed a retail consumer revolt?
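For readers who want to check the arithmetic, the homes-equivalent figure is simple division. The sketch below uses only the Business Insider and EIA numbers quoted above; it is a back-of-the-envelope check, not a precise energy model.

```python
# Back-of-the-envelope check of the "nearly 17,000 homes" figure,
# using the figures quoted above (Business Insider / EIA).
chatgpt_kwh_per_day = 500_000     # ChatGPT's estimated daily electricity use (kWh)
household_kwh_per_day = 30        # average US household daily use (kWh)
daily_requests = 200_000_000      # approximate daily ChatGPT requests

homes_equivalent = chatgpt_kwh_per_day / household_kwh_per_day
kwh_per_request = chatgpt_kwh_per_day / daily_requests

print(f"Homes powered per day: {homes_equivalent:,.0f}")       # ~16,667, i.e. "nearly 17,000"
print(f"Energy per request: {kwh_per_request * 1000:.1f} Wh")  # ~2.5 Wh per request
```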
The 10/8/24 Wall Street Journal article, titled “Arizona Voters at Breaking Point Over Cost of Electricity,” illustrates that this problem isn’t merely theoretical. Data centers consume 7% of total power in Arizona, below a few other states but toward the top of the range. The issue only becomes trickier from here. If estimates prove accurate, Nvidia is on pace to sell enough AI chips to power the equivalent of 1,000 ChatGPT models by the end of 2026 (source: Empirical Research, J.P. Morgan). Not all of these will be based in the United States, of course, but if they were, that would equate to enough energy demand to power nearly 12% of all homes in America.
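Scaling the same arithmetic to the Nvidia estimate reproduces the “nearly 12%” figure. One caveat: the article does not state the US home count behind that percentage; the 140 million housing-unit figure below is our assumption, chosen because it is roughly consistent with Census data and with the article’s result.

```python
# Scaling the per-model arithmetic to 1,000 ChatGPT-equivalents
# (the Empirical Research / J.P. Morgan estimate quoted above).
chatgpt_kwh_per_day = 500_000    # per-model daily use, as above (kWh)
household_kwh_per_day = 30       # average US household daily use (kWh)
model_equivalents = 1_000        # estimated ChatGPT-scale models by end of 2026
us_homes = 140_000_000           # ASSUMED US housing-unit count; not stated in the article

homes_equivalent = model_equivalents * chatgpt_kwh_per_day / household_kwh_per_day
share_of_us_homes = homes_equivalent / us_homes

print(f"Homes' worth of demand: {homes_equivalent:,.0f}")  # ~16.7 million homes
print(f"Share of US homes: {share_of_us_homes:.1%}")       # ~11.9%, i.e. "nearly 12%"
```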
Our point here is simple. In the not-too-distant future, the siting of data centers is likely to become a NIMBY (not in my backyard) issue. That represents an exogenous risk that needs to be considered when investing in many of the most popular technology companies.
[Figure: Data centers’ share of total power consumption, by state]
Source: Electric Power Research Institute, Apollo Chief Economist. Note: States shown in grey lack reliable data.