• 1984@lemmy.today · ↑42 ↓1 · 7 months ago

    Oh really? Let’s calculate the power usage of billions of devices downloading and serving ads.
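
    Just to sketch what that calculation might look like (every number here is a made-up placeholder, not a measurement), in Python:

        # Hypothetical back-of-envelope estimate of energy spent serving ads to devices.
        devices = 3e9            # assumed number of ad-serving devices online
        ad_power_watts = 0.5     # assumed average extra draw per device for ad traffic/rendering
        hours_per_day = 4        # assumed daily hours of active ad exposure per device

        daily_energy_kwh = devices * ad_power_watts * hours_per_day / 1000
        print(f"{daily_energy_kwh / 1e6:.1f} GWh per day")  # ~6.0 GWh/day under these assumptions

    Plug in whatever per-device figures you trust; the point is that it scales with billions of devices.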

    • freebee@sh.itjust.works · ↑28 · 7 months ago

      How much would we save if we’d somehow be able to debloat and deshittify the Internet and all devices? Climate impact, overconsumption of unnecessary crap, mental health care…

  • Karna@lemmy.ml · ↑21 ↓2 · 7 months ago

    Well, since when has Silicon Valley cared about the environment or global warming? It’s always been about $.

  • Aurenkin@sh.itjust.works · ↑19 ↓1 · edited · 7 months ago

    An interesting topic, but the article has virtually no information on it, and what little was there was unsourced and confusing. Maybe I’m just tired and not seeing it, but damn, the “taking 50 Belgians to the moon” comparison really got me confused. I agree in general though: new technologies take energy, and we need to decarbonise our energy generation as quickly as possible.

    I’d actually be really interested in a proper deep dive into this topic though. What kinds of tasks are people using these assistants for, and how does the energy use compare with how people did those tasks before? I’m sure it’s more energy intensive, but it’d be interesting to understand more, at least for me.

    • erwan@lemmy.ml · ↑12 · 7 months ago

      I agree that the article is a bit confusing, but we can’t keep increasing energy consumption and hope decarbonization will fix it.

      From an environmental point of view, energy is never free. Also, as long as we still use fossil fuels, any new use of renewables (e.g. running AI on solar panels) is energy that could have been used to replace fossil usage.

      • Kashif Shah@lemmy.sdf.org · ↑4 · 7 months ago

        Energy consumption has a dark underbelly of rare-earth mineral consumption that often just gets swept under the rug of the shiny new thing. Ooh.

      • Aurenkin@sh.itjust.works · ↑3 ↓2 · 7 months ago

        What do you propose, exactly? We have the technology right now to decarbonise our grid; it’s even the sensible move economically. Are you saying we should all stop having kids and stop building anything new that uses electricity? I’m assuming that’s not your position, but that’s what I took from reading your comment.

        • erwan@lemmy.ml · ↑5 · 7 months ago

          Simply be mindful of our energy usage, and not just rely on decarbonization. We need both because decarbonization will not happen overnight.

          Historically, worldwide production of renewables has kept growing, and so has its share of the mix, but fossil fuel usage has also kept growing.

          Now we get a new technology that uses even more energy. Maybe we should work on energy efficiency and use that tech sparingly, instead of building more data centers so incels can get their voice-chat AI girlfriends and saying “we’ll just install more solar panels and windmills”.

          • Aurenkin@sh.itjust.works · ↑1 · 7 months ago

            Sure, but… what do you propose? Saying “be mindful of our energy use” isn’t actionable. Are you saying we should cap energy use and have a bidding system for industries that want new capacity? Have a carbon price so industries are encouraged to use non-carbon-producing energy? I still don’t understand what you’re suggesting. Or, if you think entertainment is a waste of energy, should we ban non-educational use of video on the internet? I’m sure that’s an insane amount of energy use worldwide.

      • Aurenkin@sh.itjust.works · ↑8 ↓1 · 7 months ago

        I have no idea; that’s kind of my point. I’m not trying to argue that it’s not much, or that it’s a lot, or whether it’s worth it. I’m just saying I have no idea, and neither that article nor any of the ones you linked gave me the answer.

        I think it’s an important consideration, so I’d love more information but it seems that it’s not available. Maybe it’s hard to calculate because things like the energy used and exact amount of compute are trade secrets or something, I don’t know. It’d be nice to know though.

        • Kashif Shah@lemmy.sdf.org · ↑5 · 7 months ago

          From earth.org: “Data centres are typically considered black box compared to other industries reporting their carbon footprint; thus, while researchers have estimated emissions, there is no explicit figure documenting the total power used by ChatGPT. The rapid growth of the AI sector combined with limited transparency means that the total electricity use and carbon emissions attributed to AI are unknown, and major cloud providers are not providing the necessary information.”

  • GolfNovemberUniform@lemmy.ml · ↑9 ↓5 · 7 months ago

    I don’t really care about other commenters saying that the article doesn’t have a reliable enough source. I know that commercial LLMs are terrible resource consumers, and since I don’t support their development, I think they should be legally banned for this very reason.

    • Kashif Shah@lemmy.sdf.org · ↑7 ↓2 · 7 months ago

      That is a very valid and reasonable opinion, sorry to see it downvoted.

      There will be strong disagreement with you, however, on the claim that LLMs are a big enough resource hog to require outright banning for that reason alone.

      If you are looking for Big Tech hit boxes, try things like writing laws that require all energy consumption in datacenters to be monitored and reported using established cross-disciplinary methods.

      Or getting people to stop buying phones every year. Or banning disposable vapes.

      • GolfNovemberUniform@lemmy.ml · ↑7 · 7 months ago

        I knew it was going to be downvoted. People here mostly support AI. But I don’t, and what I meant is that I would simply love for governments to ban it (obviously). Energy efficiency is the simplest reason to give them, so yeah. Sorry everyone, but I’m old school. Put your fancy AI bells and whistles away and embrace efficient, old, proven ways of computing such as a GUI, a TTY and search engines (which still consume a lot, but not as inefficiently). They at least don’t consume 10 MW (or a few seconds of full-load CPU time and 200 GB of space if it’s a local LLM) to calculate 2+2*2, or give you a link to a Wikipedia article that explains what a helicopter is (cough cough Bing cough cough). And they hallucinate way less often too.

      • BlameThePeacock@lemmy.ca · ↑1 ↓6 · 7 months ago

        It’s an opinion, but it’s hardly valid. It’s a knee-jerk fear reaction to something new.

        People had the same opinion about computers, cellphones, even electricity…

        • nossaquesapao@lemmy.eco.br · ↑2 · 7 months ago

          Well, if someone had questioned the environmental consequences of combustion engines a hundred years ago, they would have been laughed off, but decades later we came to the conclusion that they’re terrible for the environment. Jumping on new things without pondering the consequences, as we mostly do as a society, isn’t very different from fearing everything new. I think it’s a good thing to have some caution and discuss the possible consequences of generative AI. I would prefer more data and less sentiment, though.

          • BlameThePeacock@lemmy.ca · ↑1 ↓1 · 7 months ago

            VALID - (of an argument or point) having a sound basis in logic or fact; reasonable or cogent.

            No, not all opinions are valid.

              • BlameThePeacock@lemmy.ca · ↑1 · 7 months ago

                You can have an opinion that is grounded in logic or fact.

                In my opinion, the sunset appears pinky/purple. The basic foundation of this opinion (which others may disagree with due to slight variations in atmospheric conditions) is still rooted in fact. Someone else may think it looks red/purple. Both are basically correct, reasonably speaking.

    • BlameThePeacock@lemmy.ca · ↑4 ↓6 · 7 months ago

      You can’t ban LLMs at this point: they’re too useful, it’s impossible to track their use, they can be run anywhere on the globe, and open source models that you can run locally already exist.

      The cat is out of the bag as they say.

      • GolfNovemberUniform@lemmy.ml · ↑6 ↓1 · 7 months ago

        1. They are basically just toys.
        2. Local LLMs are not that bad. Of course they’re 100x less efficient than a native calculator or search engine, but only a very small percentage of people use them, and tracking them would probably use even more energy, so it’s not that big of a deal. I don’t have much against AI research either, so training is quite justified too (in terms of energy, not of using data without permission). It’s only the large commercial cloud-based solutions with enormous infrastructures that should probably be banned.
        • BlameThePeacock@lemmy.ca · ↑2 ↓1 · 7 months ago

          1. No they aren’t. They’re saving me multiple hours a week at my job. They’re a productivity multiplier for many tasks even in their current early state.

          2. If you think every single cellphone manufacturer isn’t trying to jam a local model into their newest devices, I’ve got a bridge to sell you.

  • IcePee@lemmy.beru.co · ↑4 ↓2 · 7 months ago

    I wonder how they measured this. Could it just be that they get more utilisation? Even per capita probably isn’t adequate. You would need a measure that’s an analogue of per capita. Maybe per result? For instance, I could spend half an hour trying to find just the right set of keywords to bring up the right result, or I could spend 5 minutes in a chat session with an AI honing the correct response.
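
    Something like this, say, where the per-query energy figures are purely illustrative placeholders:

        # Hypothetical energy-per-result comparison, not real measurements.
        search_wh_per_query = 0.3   # assumed Wh per conventional search query
        llm_wh_per_prompt = 7.5     # assumed Wh per LLM prompt (25x the search figure)

        searches_needed = 25        # queries spent honing keywords for one good result
        prompts_needed = 3          # chat turns to hone the same result with an AI

        print("search route:", searches_needed * search_wh_per_query, "Wh")  # 7.5 Wh per result
        print("LLM route:   ", prompts_needed * llm_wh_per_prompt, "Wh")     # 22.5 Wh per result

    Depending on how many attempts each route actually takes, the per-result gap could be much smaller than the per-query one.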

    • SkyNTP@lemmy.ml · ↑6 · edited · 7 months ago

      The wording of the article implies an apples-to-apples comparison, so 1 Google search == 1 question successfully answered by an LLM. Remember, a Google search in layspeak is not the act of clicking the search button; rather, it’s the act of going to Google to find a website that has the information you want. The equivalent with ChatGPT would be to start a “conversation” and get the information you want on a particular topic.

      How many search engine queries or LLM prompts that involves, or how broad the topic is, is a level of technical detail that one assumes the source for the 25x figure has already controlled for. (Feel free to ask the author for the source and share it with us though!)

      Anyone who’s even remotely used any kind of deep learning will know right away that it uses an order of magnitude or two more power (and an order of magnitude or two more performance!) compared to algorithmic, rules-based software, so a number like 25x for a similar effective outcome would not be surprising at all if the approach used is unnecessarily complex.

      For example, I could write a neural network to compute 2+2, or I could use an arithmetic calculator. One requires a $500 GPU consuming 300 watts; the other is a $2 pocket calculator running on 5 watts that returns the answer before the neural network is even done booting.
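
      As a toy sketch of that point (nobody’s real benchmark, just a single linear unit learning to add):

          import numpy as np

          # Toy "neural network": one linear unit trained by gradient descent to learn y = a + b.
          rng = np.random.default_rng(0)
          X = rng.uniform(0, 10, size=(1000, 2))   # random (a, b) training pairs
          y = X.sum(axis=1)                        # targets: a + b

          w = np.zeros(2)                          # weights to learn (ideal: [1.0, 1.0])
          for _ in range(500):                     # gradient descent on mean squared error
              grad = 2 * X.T @ (X @ w - y) / len(X)
              w -= 0.01 * grad

          print(np.array([2, 2]) @ w)              # the network's answer to 2 + 2, approximately 4
          print(2 + 2)                             # the calculator route: exact and instant

      A thousand training examples and five hundred optimisation steps to approximate what a single add instruction does exactly.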

      • Kashif Shah@lemmy.sdf.org · ↑1 · edited · 7 months ago

        However many years it takes for these LLM fools to wake up, hopefully they can find a way to laugh at themselves for thinking it was cutting-edge to jam the internet into a fake jellyfish brain and call it GPT. I haven’t looked recently, but I still haven’t seen anyone talking about neuroglial networks and how they will revolutionize the applications of AI.

        There’s a big*** book, but apparently no public takers in the deep neural network space?

  • utopiah@lemmy.ml · ↑2 · 7 months ago

    Might be correct but without any source for the number I can’t even share this back. Asked them, will update if I get an answer.

  • antlion@lemmy.dbzer0.com · ↑3 ↓5 · edited · 7 months ago

    At this point I basically need to do 25 Google searches to find what I’m looking for anyway, so this is a stupid comparison. When I have cabbage and beer, my digestive tract releases more GHGs than my whole day of using ChatGPT does (zero). I just need to figure out how to harvest and burn my own methane so I can do more ChatGPT queries guilt-free.