A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

  • MrLLM@ani.social · 19 hours ago

    won’t ruin your career

    Granted, but it will still suck up a fuck ton of coal-produced electricity.

    • Womble@lemmy.world · 11 hours ago

      One chat request to an LLM produces about as much CO2 as burning one droplet of gasoline (if the electricity comes from coal-fired power; less if it comes from cleaner sources). Talking to a chatbot for hours upon hours produces far less CO2 than a ten-minute drive to see a therapist once a week.
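
      Back-of-the-envelope, that comparison looks something like the Python sketch below. Every figure in it is a rough assumption (per-request energy, coal-grid carbon intensity, gasoline emissions, trip length), not a measurement from this thread, so the exact ratio shifts with the assumptions; the orders of magnitude are the point.

      ```python
      # Rough, assumed figures (not from this thread): published estimates put one
      # chat request at roughly 0.3-3 Wh; coal power at ~1 kg CO2 per kWh; gasoline
      # at ~2.3 kg CO2 per litre; a short drive at ~0.12 kg CO2 per km.
      WH_PER_REQUEST = 0.3              # assumed energy per chat request (Wh)
      KG_CO2_PER_KWH_COAL = 1.0         # assumed carbon intensity of coal power
      KG_CO2_PER_LITRE_GASOLINE = 2.3
      DROPLET_LITRES = 0.05e-3          # ~0.05 mL droplet
      DRIVE_KM = 8.0                    # assumed length of a ten-minute drive
      KG_CO2_PER_KM = 0.12

      co2_request = WH_PER_REQUEST / 1000 * KG_CO2_PER_KWH_COAL
      co2_droplet = DROPLET_LITRES * KG_CO2_PER_LITRE_GASOLINE
      co2_drive = DRIVE_KM * KG_CO2_PER_KM

      print(f"one chat request (coal power):  {co2_request * 1000:.2f} g CO2")
      print(f"one droplet of gasoline burned: {co2_droplet * 1000:.2f} g CO2")
      print(f"ten-minute drive:               {co2_drive * 1000:.0f} g CO2")
      print(f"chat requests per drive:        {co2_drive / co2_request:.0f}")
      ```

      With these assumed numbers, one coal-powered request lands in the same ballpark as a couple of droplets of gasoline, and a single ten-minute drive covers a few thousand requests.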

      • MrLLM@ani.social · 4 hours ago

        Sorry, you’re right. I meant that training the LLM is what uses a lot of energy; I guess that’s not the end user’s fault.

        • Glog78@digitalcourage.social · 4 hours ago

          @MrLLM @Womble

          Question … has anyone ever done a study comparing a regular full-text-index-based search vs AI in terms of energy consumption? ;)

          Second … if people kept using “old” tech, wouldn’t that be better for employment and therefore for social stability on this planet?

          • MrLLM@ani.social · 4 hours ago

            To your first question, nope, I have no idea how much energy it takes to index the web in the traditional way (e.g. MapReduce). But I think it’s become pretty clear in recent years that training AI consumes more energy (so much so that big corporations are investing in nuclear energy; I think there was an article about companies giving up on meeting their 2030 [or 2050?] carbon emission goals, but I couldn’t find it).

            About the second… I agree with you, but I also think the problem is much bigger and more complex than that.