• Armok_the_bunny@lemmy.world
    14 days ago

    Cool, now how much power was consumed before even a single prompt was run in training that model, and how much power is consumed on an ongoing basis adding new data to those AI models even without user prompts? Also, how much power was consumed with each query before AI was shoved down our throats, and how many prompts does an average user make per day?

    • Grimy@lemmy.world
      14 days ago

      I did some quick math with Meta's Llama model, and the training cost was about a flight to Europe's worth of energy. Not a lot when you consider the number of people who use it compared to the number on the flight.

      Whatever you’re imagining the impact to be, it’s probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep-sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.

      • taiyang@lemmy.world
        14 days ago

        I usually liken it to video games, ya. Is it worse than nothing? Sure, but that flight or road trip, etc., is a bigger concern. Not to mention that even before AI we had industrial energy and water usage that isn’t sustainable… almonds in CA alone are a bigger problem than AI, for instance.

        Not that I’m pro-AI cause it’s a huge headache from so many other perspectives, but the environmental argument isn’t enough. Corpo greed is probably the biggest argument against it, imo.

      • boor@lemmy.world
        9 days ago

        Please show your math.

        One Nvidia H100 DGX AI server consumes 10.2 kW at 100% utilization, meaning that 42 days’ use of one server is equivalent to the electricity consumption of the average USA home in one year. This is just a single 8-GPU server; it excludes the electricity required by the networking and storage hardware elsewhere in the data center, let alone the electricity required to run the facility’s climate control.

        xAI alone has deployed hundreds of thousands of H100 or newer GPUs. Let’s SWAG 160K GPUs = ~20K DGX servers = >200MW for compute alone.
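
        For anyone who wants to check that napkin math, here's a quick sketch. All the inputs are this thread's assumptions (10.2 kW per 8-GPU DGX at full load, ~10.5 MWh/year for the average US home), not measurements:

        ```python
        # Napkin math for the figures above, using the thread's assumed values.
        DGX_POWER_KW = 10.2        # one H100 DGX server at 100% utilization
        GPUS_PER_DGX = 8
        HOME_KWH_PER_YEAR = 10_500 # average US home, per this thread

        # 42 days of one server vs. one home-year of electricity.
        server_42_days_kwh = DGX_POWER_KW * 24 * 42   # ~10,280 kWh
        print(f"42 DGX-days: {server_42_days_kwh:,.0f} kWh, "
              f"~{server_42_days_kwh / HOME_KWH_PER_YEAR:.2f} home-years")

        # Fleet-level SWAG: 160K GPUs -> DGX servers -> compute-only megawatts.
        gpus = 160_000
        servers = gpus // GPUS_PER_DGX                # 20,000 servers
        fleet_mw = servers * DGX_POWER_KW / 1_000     # ~204 MW
        print(f"{gpus:,} GPUs ~= {servers:,} servers ~= {fleet_mw:,.0f} MW")
        ```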

        H100 is old. State of the art GB200 NVL72 is 120kW per rack.

        Musk is targeting not 160K, but literally one million GPUs deployed by the end of this year. He has built multiple new natural gas power plants which he is now operating without any environmental permits or controls, to the detriment of the locals in Memphis.

        This is just one company training one typical frontier model. There are many competitors operating at similar scale and sadly the vast majority of their new capacity is running on hydrocarbons because that’s what they can deploy at the scale they need today.

        • Grimy@lemmy.world
          9 days ago

          I should have specified it was an earlier Llama model; they have since scaled up to more than a flight or two. You are mostly right, except for how much a house uses: it’s about 10,500 kWh per year, so you’re off by a factor of a thousand. In one hour a server uses about 8 hours of house time, which is still a lot though, especially when you consider Musk’s 1 million GPUs.

          https://kaspergroesludvigsen.medium.com/facebook-disclose-the-carbon-footprint-of-their-new-llama-models-9629a3c5c28b

          Their first model took 2,600,000 kWh; a flight takes about 500,000 kWh, so the actual napkin math was ~5 flights. I did the math about 2 years ago, but yeah, I was mistaken and should have at least specified it was for their first model. Their more recent models have been a lot more energy-intensive, I think.
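
          Putting both corrections into one sketch (the 2.6 GWh Llama-1 figure is from the linked article; the flight, home, and server numbers are this thread's rough assumptions):

          ```python
          # Napkin check of the two corrections above, using thread figures.
          LLAMA1_TRAINING_KWH = 2_600_000  # Llama-1 training, per the linked article
          FLIGHT_KWH = 500_000             # rough estimate for one flight to Europe
          HOME_KWH_PER_YEAR = 10_500       # average US home
          DGX_POWER_KW = 10.2              # one H100 DGX server at full load

          # Training energy expressed in flights: ~5.
          print(f"Llama-1 ~= {LLAMA1_TRAINING_KWH / FLIGHT_KWH:.1f} flights")

          # One server-hour expressed in "house time": ~8.5 hours.
          home_kwh_per_hour = HOME_KWH_PER_YEAR / (365 * 24)   # ~1.2 kWh/hour
          print(f"1 DGX-hour ~= {DGX_POWER_KW / home_kwh_per_hour:.1f} home-hours")
          ```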

          • boor@lemmy.world
            9 days ago

            Thanks for catching that, you are right that the average USA home uses 10.5 MWh/year, not kWh. I was mistaken. :)

            Regarding the remainder, my point is that the scale of modern frontier-model training, and the total net-new electricity demand that AI is creating, is not trivial. Worrying about other traditional sources of CO2 emissions like air travel and so forth is reasonable, but I disagree with the conclusion that AI infrastructure is not a major environmental and climate-change concern. The latest projects are on the scale of 2-5 GW per site, and the vast majority of that new electricity capacity will come from natural gas or other hydrocarbons.
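
            As a rough illustration of that scale in household terms (assuming continuous full draw, an upper bound, and the same ~10.5 MWh/year home figure from above):

            ```python
            # Rough scale of a 2-5 GW data center site in household terms,
            # assuming continuous full draw (an upper bound, not a measurement).
            HOME_KWH_PER_YEAR = 10_500

            for site_gw in (2, 5):
                site_kwh_per_year = site_gw * 1_000_000 * 24 * 365  # GW -> kWh/yr
                homes_millions = site_kwh_per_year / HOME_KWH_PER_YEAR / 1e6
                print(f"{site_gw} GW site ~= {homes_millions:.1f} million homes")
            ```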

      • douglasg14b@lemmy.world
        13 days ago

        A flight to Europe’s worth of energy is a pretty asinine way to measure this. Is it not?

        It’s also not that small a number, being ~600 MWh of energy.

        However, training cost is considerably less than prompting cost, which makes your argument incredibly biased.

        Similarly, the numbers released by Google seem artificially low; perhaps their TPUs are massively more efficient, given that they are ASICs. But they did not seem to disclose which model they used for this measurement. It could be their smallest, least capable, most energy-efficient model, which would be disingenuous.

  • salty_chief@lemmy.world
    14 days ago

    So, as I thought, virtually no impact. AI is here and not leaving. It will probably outlast humans on earth.