88 comments
  • I found a blog post that cites a Business Insider article and argues that this claim, as formulated, is way off:

    Reported energy use implies that ChatGPT consumes about as much energy as 20,000 American homes. An average US coal plant generates enough energy for 80,000 American homes every day. This means that even if OpenAI decided to power every one of its billion ChatGPT queries per day entirely on coal, all those queries together would only need one quarter of a single coal plant. ChatGPT is not the reason new coal plants are being opened to power AI data centers.
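
    As a quick sanity check on that arithmetic, here is a minimal Python sketch; the homes-equivalent figures are just the ones quoted above, taken at face value:

        # Figures as quoted: ChatGPT ~ 20,000 US homes; an average US coal
        # plant ~ 80,000 homes. Same units, so they divide out directly.
        chatgpt_homes = 20_000
        coal_plant_homes = 80_000

        fraction_of_plant = chatgpt_homes / coal_plant_homes
        print(f"ChatGPT ~ {fraction_of_plant:.2f} of one coal plant")  # prints 0.25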

    It goes on to argue that while it's true that AI-related electricity use is booming, it's not because of LLM chatbots:

    AI energy use is going to be a massive problem over the next 5 years. Projections say that by 2030 US data centers could use 9% of the country’s energy (they currently use 4%, mostly due to the internet rather than AI). Globally, data centers might rise from using 1% of the global energy grid to 21% of the grid by 2030. ...

    97% of the total energy used by AI as of late 2024 is not being used by ChatGPT or similar apps, it’s being used for other services. What are those services? The actual data on which services are using how much energy is fuzzy, but the activities using the most energy are roughly in this order:

     
    * Recommender Systems - Content recommendation engines and personalization models used by streaming platforms, e-commerce sites, social media feeds, and online advertising networks.

    * Enterprise Analytics & Predictive AI - AI used in business and enterprise settings for data analytics, forecasting, and decision support.

    * Search & Ad Targeting - The machine learning algorithms behind web search engines and online advertising networks.

    * Computer Vision - AI tasks involving image and video analysis. It includes models for image classification, object detection, facial recognition, video content analysis, medical image diagnostics, and content moderation (automatically flagging inappropriate images/videos). Examples are the face recognition algorithms used in photo tagging and surveillance, the object detection in self-driving car systems (though inference for autonomous vehicles largely runs on-board, not in data centers, the training of those models is data-center based), and the vision models that power services like Google Lens or Amazon’s image-based product search.

    * Voice and Audio AI - AI systems that process spoken language or audio signals. The most prominent examples are voice assistants and speech recognition systems – such as Amazon’s Alexa, Google Assistant, Apple’s Siri, and voice-to-text dictation services.

  • There's a misconception regarding the "consumption" of water, and also a bit of a bias towards AI data centers: most of the water used actually goes to energy production (via coal, fuel, or even hydroelectric), which is a factor that should be considered when calculating actual water use and consumption.

    Regarding energy production and water "consumption": I read some papers, and as far as I could understand, the numbers fluctuate wildly. Some 5-40% of the water that runs through the system ends up being consumed via evaporation (going from water that is potentially drinkable or usable for agriculture to water that mostly ends up in the sea).

    What I'm trying to say is that, yes, we should be very aware of the water consumed by our big data centers, but we should also put a great focus on the water used by the energy that fuels the data center itself. Much of the discourse ends up as "haha use water for email silly" when it should be a catalyst for a more informed approach to water consumption.
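
    To illustrate why that indirect share can dominate, here is a minimal Python sketch. The per-kWh water figures below are placeholder assumptions for illustration only, not measured values; only the 5-40% evaporation range comes from the papers mentioned above:

        # All per-kWh figures are illustrative assumptions, not measurements.
        energy_kwh = 1000.0              # electricity drawn by a data center, kWh

        direct_cooling_l_per_kwh = 1.8   # assumed on-site cooling water consumed per kWh
        withdrawal_l_per_kwh = 40.0      # assumed water withdrawn per kWh of thermoelectric generation
        evaporated_fraction = 0.20       # within the 5-40% evaporation range cited above

        direct = energy_kwh * direct_cooling_l_per_kwh
        indirect = energy_kwh * withdrawal_l_per_kwh * evaporated_fraction

        print(f"direct cooling water consumed:  {direct:,.0f} L")    # 1,800 L
        print(f"indirect water via generation:  {indirect:,.0f} L")  # 8,000 L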

    Basically, I fear that the AI industry can take advantage of our ignorance and appease us with some "net zero" BS while completely ignoring where most of the water is consumed and how.

    And yes, there are solutions that avoid using fresh water for energy production: solar/wind, using sea water, using polluted water, and more sophisticated systems that actually "consume" as little water as possible. These methods have drawbacks that our governments and industry refuse to face; they would rather consume and abuse our resources. I really want people to focus on that.

  • You're not gonna save the world by not using ChatGPT, just like you won't save all those slaves in Zambia by not buying from Apple, and just like you didn't destroy Twitter by joining Bluesky.

    To have real effect requires systemic change, so if you want to actually make a difference you can do things like canvassing, running for local office positions and school boards, educating friends and family about politics, or try killing a few politicians and tech CEOs. You know, basic stuff.

    Also I asked Gemini's Deep Research to research this for me because why not UwU

    Executive Summary

    Estimates for the energy consumed by ChatGPT during its training and inference phases vary considerably across different studies, reflecting the complexity of the models and the proprietary nature of the data. Training a model like GPT-3 is estimated to require around 1.3 GWh of electricity [1], while more advanced models such as GPT-4 may consume significantly more, with estimates ranging from 1.75 GWh to over 62 GWh [2]. Models comparable to GPT-4o are estimated to consume between 43.2 GWh and 54 GWh during training [3]. These figures represent substantial energy demands, with the training of GPT-4 potentially exceeding the annual electricity consumption of very small nations multiple times over.

    The energy used during ChatGPT inference, the process of generating responses to user queries, also spans a wide range of estimates, from 0.3 watt-hours to 2.9 watt-hours per query [4]. This translates to an estimated annual energy consumption for inference of approximately 0.23 TWh to 1.06 TWh, comparable to the entire annual electricity consumption of smaller countries like Barbados. The lack of official data from OpenAI and the diverse methodologies employed by researchers contribute to the variability in these estimates, highlighting the challenges in precisely quantifying the energy footprint of these advanced AI systems [4].
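
    The endpoints of that annual range fall straight out of query volume times per-query energy. A quick Python sketch (the query volumes are assumptions for illustration; the high one matches the roughly one billion queries per day quoted earlier in the thread, the low one the ~78 billion queries per year figure behind source [3]):

        WH_PER_TWH = 1e12  # watt-hours per terawatt-hour

        # (queries per year, Wh per query); volumes assumed for illustration
        scenarios = [
            (78e9, 2.9),   # ~78B queries/year at the high per-query estimate
            (365e9, 2.9),  # ~1B queries/day at the high per-query estimate
            (365e9, 0.3),  # ~1B queries/day at the low per-query estimate
        ]

        for queries_per_year, wh_per_query in scenarios:
            twh = queries_per_year * wh_per_query / WH_PER_TWH
            print(f"{queries_per_year:.2e} q/yr x {wh_per_query} Wh -> {twh:.2f} TWh/yr")

    The first and second scenarios reproduce the 0.23 TWh and 1.06 TWh endpoints quoted above.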

    [1] https://balkangreenenergynews.com/chatgpt-consumes-enough-power-in-one-year-to-charge-over-three-million-electric-cars/
    [2] https://www.baeldung.com/cs/chatgpt-large-language-models-power-consumption
    [3] https://www.bestbrokers.com/forex-brokers/ais-power-demand-calculating-chatgpts-electricity-consumption-for-handling-over-78-billion-user-queries-every-year/
    [4] https://epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use