One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, already consumes as much energy as 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.

  • geissi@feddit.de
    9 months ago

    Tbf, talking about the environmental cost of generative AI specifically is just framing.
    The real issue is the environmental cost of electricity, no matter what it is used for.
    If we want that cost to be considered in consumption decisions, it needs to be part of the electricity price. And of course all other power sources, like combustion engines, also need to price in their external costs.

    • 🔰Hurling⚜️Durling🔱@lemmy.world
      9 months ago

      It should be considered. An ultra-light laptop or a Raspberry Pi consumes WAY less power than a full-on gaming rig, and the same can be said of a data server used for e-commerce versus a server running AI: the AI server has higher power requirements (the margin may not be as wide as in my first comparison, but it exists). Now multiply that AI server by hundreds more and you start seeing a considerable uptick in power usage.
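      The scaling argument above can be sketched as a back-of-envelope calculation. Every wattage below is a rough illustrative assumption picked for the sake of the argument, not a measured figure:

```python
# Back-of-envelope power comparison. All wattages are illustrative
# assumptions for the sake of the argument, not measured figures.

WATTS = {
    "raspberry_pi": 5,        # assumed: light single-board computer
    "gaming_rig": 500,        # assumed: desktop under load
    "ecommerce_server": 300,  # assumed: typical rack server
    "ai_server": 3000,        # assumed: multi-GPU training/inference node
}

def annual_kwh(watts: float) -> float:
    """Energy in kWh for a device running 24/7 for a year."""
    return watts * 24 * 365 / 1000

# One AI server vs. one e-commerce server (ratio of annual energy)
ratio = annual_kwh(WATTS["ai_server"]) / annual_kwh(WATTS["ecommerce_server"])
print(f"AI server uses {ratio:.0f}x the energy of an e-commerce server")

# Now multiply that AI server by hundreds
fleet = 500
fleet_kwh = fleet * annual_kwh(WATTS["ai_server"])
print(f"{fleet} AI servers: {fleet_kwh:,.0f} kWh/year")
```

      Under these assumed figures, one AI server draws 10x the energy of one e-commerce server, and a fleet of 500 of them works out to roughly 13 million kWh per year.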

      • geissi@feddit.de
        9 months ago

        An ultra-light laptop or a Raspberry Pi consumes WAY less power than a full-on gaming rig, and the same can be said of a data server used for e-commerce versus a server running AI

        And if external costs are priced into the cost of electricity, then that will be reflected in the cost of operating these devices.
        Also, there are far more conventional data servers than servers running AI, which increases their total effect.