• @XeroxCool@lemmy.world
    7 months ago

    If I make a gas engine with 100% heat efficiency but only run it in my backyard, do the greenhouse gases not count because it’s so efficient? Of course they do. The high efficiency of a data center is great, but that’s not what the article laments. The problem it calls out is the absurdly wasteful reason these farms will flourish: powering elaborately animated programs that feign intelligence, burning energy on tasks a simple program was already handling.

    It’s the same story with lighting. LEDs seemed like a savior for energy consumption because they were so efficient. They do save energy overall (for now), but that very cheapness prompted people to multiply the number of lights and their total output by an order of magnitude. That in turn creates a secondary problem: even more light pollution and intrusion.

    Greater efficiency doesn’t make things right if it comes with a matching increase in use (economists call this the Jevons paradox).
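
    The LED story is the classic rebound effect, and the arithmetic is easy to sketch. All wattages and bulb counts below are hypothetical, chosen only to illustrate how an efficiency gain can be eaten by extra usage:

```python
# Back-of-envelope rebound-effect check (hypothetical numbers).
incandescent_w = 60   # one old bulb
led_w = 10            # one LED, ~6x more efficient for similar output

old_total = 10 * incandescent_w   # 10 bulbs before the switch -> 600 W
new_total = 60 * led_w            # 60 fixtures after the switch -> 600 W

# 6x the efficiency, 6x the usage: total draw is unchanged.
assert new_total == old_total
```

    The per-bulb saving is real; the aggregate saving disappears once usage scales up with the cheapness.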

    • MudMan
      7 months ago

      For one thing, it’s absolutely not true that what these apps provide is the same as what we had. That’s another place where the AI grifters and the AI fearmongers are both lying. This is not a 1:1 replacement for older tech. Not on search, where LLM queries are less reliable at finding facts and being accurate but better at matching fuzzy searches without specific parameters. Not with image generation, obviously. Not with tools like upscaling, frame interpolation and so on.

      For another, some of the numbers being thrown around are not realistic or factual, are presented without context, or are part of a power-increase trend that was already ongoing with earlier applications. The average high-end desktop PC ran on 250W in the 90s and 500W in the 2000s; mine now runs at 1000W. Playing a videogame used to burn as much power as a couple of lightbulbs; now it’s the equivalent of running your microwave oven.
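
      As a rough sanity check on that comparison (the wattages are this comment’s own ballpark figures, not measurements):

```python
# Rough hourly energy use for the devices mentioned above.
def kwh(watts: float, hours: float) -> float:
    """Energy in kilowatt-hours for a device drawing `watts` for `hours`."""
    return watts * hours / 1000

per_hour = {
    "two 60 W bulbs (90s-era gaming)": kwh(2 * 60, 1),  # 0.12 kWh
    "1000 W modern gaming rig": kwh(1000, 1),           # 1.00 kWh
    "1000 W microwave": kwh(1000, 1),                    # 1.00 kWh
}
for device, energy in per_hour.items():
    print(f"{device}: {energy:.2f} kWh per hour")
```

      At those assumed wattages, an hour of modern gaming really is in the same energy ballpark as running a microwave for an hour, roughly an eightfold jump from the two-lightbulb era.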

      The argument that we’re burning more power because we’re using more compute for entertainment isn’t factually incorrect, but it’s hyperbolic (some of the cost estimates being shared virally are deliberate overestimates taken out of context), and it’s entirely consistent with how we’ve been using other computer features for ages.

      The only reason you’re so mad about me wasting some energy asking an AI to generate a cute picture, but not about me using an AI to generate frames for my videogame, is that one of them is a viral panic that maps neatly onto the crypto panic people already had, while the other is a frog that’s been slowly boiling for three decades, so nobody has had a reason to form an opinion about it.