
Google processes over 5 trillion search queries per year. Attaching an AI inference call to most, if not all, of those will increase electricity consumption by at least an order of magnitude.
Edit: using their own 0.24 Wh per query number, that equates to 1.2 billion kWh (1.2 TWh) per year, roughly the annual electricity consumption of 114,000 USA homes.
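The arithmetic, as a quick Python sketch. The ~10,500 kWh/year figure for an average USA home is my assumption (roughly the EIA residential average); the other inputs are the numbers above.

    # Back-of-the-envelope for the figures above.
    # Assumption (mine): the average USA home uses ~10,500 kWh/year.
    queries_per_year = 5e12        # ~5 trillion searches/year
    wh_per_query = 0.24            # Google's published per-prompt figure
    home_kwh_per_year = 10_500

    total_kwh = queries_per_year * wh_per_query / 1_000    # Wh -> kWh
    print(f"{total_kwh:.2g} kWh/year")                      # ~1.2e+09 kWh = 1.2 TWh
    print(f"{total_kwh / home_kwh_per_year:,.0f} homes")    # ~114,286 homes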

AI is the driver of the parabolic spike in global data center buildouts. No other use case comes close in terms of driving new YoY growth in tech infra capex spend.

Please show your math.
One Nvidia H100 DGX AI server consumes 10.2 kW at 100% utilization. One hour at full load is 10.2 kWh, roughly a third of what the average USA home uses in a day; run continuously, a single server burns through about as much electricity in a year as eight or nine homes. And this is just a single 8-GPU server; it excludes the electricity required by the networking and storage hardware elsewhere in the data center, let alone the electricity required to run the facility's climate control.
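To sanity-check that comparison (again assuming ~10,500 kWh/year for the average home, which is my number, not a measured one):

    # One DGX H100 at full load: 10.2 kW, server IT load only.
    dgx_kw = 10.2
    home_kwh_per_year = 10_500                  # assumption, see above
    home_kwh_per_day = home_kwh_per_year / 365

    print(dgx_kw / home_kwh_per_day)            # ~0.35: one server-hour is about a third of a home-day
    print(dgx_kw * 8_760 / home_kwh_per_year)   # ~8.5: one server-year is roughly 8-9 home-years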
xAI alone has deployed hundreds of thousands of H100 or newer GPUs. Let’s SWAG 160K GPUs = ~20K DGX servers = >200MW for compute alone.
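Spelling out the SWAG, using the 10.2 kW per 8-GPU DGX figure from above:

    gpus = 160_000
    servers = gpus // 8                    # 20,000 DGX servers
    compute_mw = servers * 10.2 / 1_000    # 204 MW for the servers alone
    print(servers, compute_mw)             # 20000 204.0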
The H100 is old. The state-of-the-art GB200 NVL72 draws 120 kW per rack.
Musk is targeting not 160K, but literally one million GPUs deployed by the end of this year. He has built multiple new natural gas power plants, which he is now operating without environmental permits or controls, to the detriment of the locals in Memphis.
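For a rough sense of where a million GPUs lands, here's a sketch at GB200 NVL72 density. It assumes 72 GPUs and ~120 kW per rack, counts rack power only (no networking, storage, or cooling), and pretends the whole fleet is NVL72, which it won't be.

    gpu_target = 1_000_000
    racks = gpu_target / 72          # ~13,900 NVL72 racks
    rack_gw = racks * 120 / 1e6      # kW -> GW
    print(f"{racks:,.0f} racks, ~{rack_gw:.1f} GW of rack power")   # ~1.7 GW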
And this is just one company training a single frontier model. There are many competitors operating at similar scale, and sadly the vast majority of their new capacity runs on hydrocarbons, because that's what they can deploy at the scale they need today.