• sacredmelon@slrpnk.net · 4 months ago

    This is concerning. Why don't they just stop the never-ending updates and stick with what we have for a moment? Isn't all the tech we already have sufficient for the world to keep going?

  • Daxtron2@startrek.website · 4 months ago

    The bigger companies focus instead on huge, ever-increasing model sizes. Lots of advances are being made with smaller, more affordable models that can run on consumer devices, but the big companies don't focus on those because they can't generate as much profit.

    • Sonori@beehaw.org · 4 months ago

      The problem is that all of the current discussion and hype is about ChatGPT and similar whole-internet models. They're not as useful as more specialized small models, but small models also aren't as easy to hype.

  • Immersive_Matthew@sh.itjust.works · 4 months ago

    The issue is how the electricity is generated, not that it is needed in the first place. It's such a great distraction from the real issue that it has got to be big oil spinning the story this way.

    Let’s all hate on AI and crypto because they are ruining the entire environment and if we just stopped them, all would be fine with the planet again /s.

  • stabby_cicada@slrpnk.net · 4 months ago

    So I did a little math.

    This site says a single ChatGPT query consumes 0.00396 kWh.

    Assume an average LED light bulb is 10 watts, or 0.01 kWh per hour. So if I did the math right, no guarantees there, a single ChatGPT query is roughly equivalent to leaving a light bulb on for 24 minutes.

    So if you assume the average light bulb in your house is on almost 4 hours a day, making 10 ChatGPT queries per day is the equivalent of adding a new light bulb to your house.

    Which is definitely not nothing. But isn’t the end of the world either.
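    The back-of-the-envelope math above can be sketched out explicitly (a rough check, assuming the site's 0.00396 kWh/query figure and a 10 W bulb):

```python
# Rough check of the light-bulb comparison above.
# Assumptions: 0.00396 kWh per ChatGPT query, a 10 W (0.01 kWh/h) LED bulb.
KWH_PER_QUERY = 0.00396
BULB_KWH_PER_HOUR = 0.01

minutes_per_query = KWH_PER_QUERY / BULB_KWH_PER_HOUR * 60
print(f"one query ≈ bulb on for {minutes_per_query:.0f} minutes")  # ≈ 24 minutes

daily_kwh = 10 * KWH_PER_QUERY  # 10 queries per day
hours_of_bulb = daily_kwh / BULB_KWH_PER_HOUR
print(f"10 queries/day ≈ bulb on for {hours_of_bulb:.1f} hours")  # ≈ 4.0 hours
```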

    • Zink@programming.dev · 4 months ago

      I have a feeling it’s not going to be the ordinary individual user that’s going to drive the usage to problematic levels.

      If a company can make money off of it, consuming a ridiculous amount of energy to do it is just another cost on the P & L.

      (Assuming of course that the company using it either pays the electric bill, or pays a marked-up fee to some AI/cloud provider)

  • Zaktor@sopuli.xyz · 4 months ago

    de Vries, who now works for the Netherlands’ central bank, estimated that if Google were to integrate generative A.I. into every search, its electricity use would rise to something like twenty-nine billion kilowatt-hours per year. This is more than is consumed by many countries, including Kenya, Guatemala, and Croatia.

    Why on earth would they do that? Just cache the common questions.

    It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

    Ok, so the actual real-world estimate is somewhere on the order of half a million kilowatt-hours per day, for the entire globe. Even if we assume that's all US usage, there are 125M US households, so that's 4 watt-hours per household per day. An LED light bulb consumes 8 watts. Turn one of those off for half an hour and you've balanced out one household's worth of ChatGPT energy use.

    This feels very much like the "turn off your lights to do your part for climate change" distraction from industry and air travel. They've mixed and matched units in their comparisons to make this seem like a massive amount of electricity, but it's basically irrelevant. Even the big AI-in-every-search number only works out to 0.6 kWh/day (again, if all search were done only by Americans), which isn't great, but is still on the order of not spending hours watching a big-screen TV or playing on a gaming computer, and compares against the 29 kWh a household already uses.

    Math, because this result is so irrelevant it feels like I’ve done something wrong:

    • 500,000 kWh/day / 125,000,000 US households = 0.004 kWh/household/day
    • 29,000,000,000 kWh/yr / 365 days/yr / 125,000,000 households = 0.6 kWh/household/day, compared to the 29 kWh baseline
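    A quick sanity check of those two bullets (same inputs: 500,000 kWh/day for ChatGPT, 29 billion kWh/yr for AI-in-every-search, 125M US households):

```python
# Sanity-check the per-household estimates above.
US_HOUSEHOLDS = 125_000_000

# ChatGPT today: ~500,000 kWh/day globally, attributed entirely to the US.
chatgpt_kwh = 500_000 / US_HOUSEHOLDS
print(f"ChatGPT: {chatgpt_kwh * 1000:.0f} Wh/household/day")  # 4 Wh

# Hypothetical generative AI in every Google search: ~29 billion kWh/yr.
search_kwh = 29_000_000_000 / 365 / US_HOUSEHOLDS
print(f"AI search: {search_kwh:.2f} kWh/household/day vs ~29 kWh base usage")
```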
    • kibiz0r@midwest.social · 4 months ago

      Just cache the common questions.

      There are only two hard things in Computer Science: cache invalidation and naming things.

      • Zaktor@sopuli.xyz · 4 months ago

        It’s a good thing that Google has a massive pre-existing business about caching and updating search responses then. The naming things side of their business could probably use some more work though.

      • boonhet@lemm.ee · 4 months ago

        You mean: two hard things - cache invalidation, naming things and off-by-one errors

        • kibiz0r@midwest.social · 4 months ago

          Reminds me of the two hard things in distributed systems:

          • 2: Exactly-once delivery
          • 1: Guaranteed order
          • 2: Exactly-once delivery
    • frezik@midwest.social · 4 months ago

      Just cache the common questions.

      AI models work in a feedback loop. The fact that you’re asking the question becomes part of the response next time. They could cache it, but the model is worse off for it.

      Also, they are Google/Microsoft/OpenAI. They will do it because they can and nobody is stopping them.

  • mindbleach@sh.itjust.works · 4 months ago

    ‘How dare technology keep doing stuff?’ is a deeply weird criticism.

    This isn’t like crypto bullshit, where finance-bro jackasses did databases in the least efficient possible way. We’re pushing the boundaries of results-driven artificial intelligence, modeled on how biological brains work. Is it miraculous? Not exactly. But it’s answering a lot of questions that were exciting forty-odd years ago and suddenly exploded into relevance due to parallel computing… intended for video games.

    Bemoaning the last year-ish of outright witchcraft, based on the up-front costs of training models that will run on a phone, is a perspective that seems more performative than plausible.

    • stabby_cicada@slrpnk.net · 4 months ago (edited)

      I deeply dislike the line of argument that goes “we shouldn’t bother reducing our personal energy consumption because 100 corporations produce 70% of greenhouse gases” or similar arguments. Of course we should. Because it’s the right thing to do.

      But it’s also true: those 100 corporations and their ilk absolutely promote a false narrative that personal responsibility is the solution to climate change, in order to prevent climate regulation that might harm their bottom line.

      And frankly, I think that's what's going on here with the panic over AI power consumption: corporate lobbyists and PR creating yet another distraction to slow the course of climate regulation, and guilting ordinary people for doing ordinary things in the process.

      • skuzz@discuss.tchncs.de · 4 months ago

        Personal responsibility has always been capitalism's mechanism for normalizing corpo behavior: the fake Native American anti-littering commercial in the 70s, banning home cleaners that businesses can still use at industrial scale, buying a new electric car every five minutes being somehow carbon-better than just not being a vehicle consumer. There are examples going even further back in time, but my brain doesn't currently have enough caffeine to dig them up.