• Traister101@lemmy.today
      3 hours ago

      They can’t reason. LLMs, which is what all the latest and greatest models like GPT-5 still are, generate output by taking every previous token (simplified) and using them to predict the most likely next token. Thanks to their training, this produces pretty convincing human-looking language, along with other things like somewhat effective code output (thanks to sites like Stack Overflow being included in the training data).
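
      To make the “most likely next token” point concrete, here’s a minimal sketch of that loop. model.next_token_probs is a hypothetical stand-in for the actual network, not any real API:

      ```python
      # Toy sketch of autoregressive generation: the model only ever
      # predicts "what token comes next", nothing more.
      # `model` is a hypothetical stand-in, not a real library object.
      import random

      def generate(model, prompt_tokens, max_new_tokens=50):
          tokens = list(prompt_tokens)
          for _ in range(max_new_tokens):
              # The model scores every token in its vocabulary given all
              # previous tokens, yielding a probability distribution.
              probs = model.next_token_probs(tokens)
              # Pick the next token from that distribution and repeat.
              next_token = random.choices(range(len(probs)), weights=probs)[0]
              tokens.append(next_token)
          return tokens
      ```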

      Generating images works essentially the same way, though it’s more easily described as reverse JPG compression. You think I’m joking? No, really: they start out with pure static and then transform that static using a bunch of wave functions they came up with during training. LLMs and the image generators are equally capable of reasoning, that being not at all, whatsoever.
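
      Here’s a rough sketch of that “start from static” process, assuming a hypothetical noise_predictor network and skipping the real math of a diffusion schedule:

      ```python
      # Toy sketch of diffusion-style image generation: start from pure
      # static and repeatedly subtract the model's noise estimate.
      # `noise_predictor` is a hypothetical stand-in for the trained network,
      # and this update rule is a simplification, not a real scheduler.
      import numpy as np

      def generate_image(noise_predictor, shape=(64, 64, 3), steps=50):
          image = np.random.randn(*shape)  # start from pure static
          for step in reversed(range(steps)):
              predicted_noise = noise_predictor(image, step)
              # Peel away a fraction of the predicted noise each step;
              # after enough steps the static has become an image.
              image = image - predicted_noise / steps
          return image
      ```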

        • Traister101@lemmy.today
          28 minutes ago

          If you truly believe that, you fundamentally misunderstand the definition of that word, or you’re being purposely disingenuous, as you AI brown-nose folk tend to be. Pretending for a second that you genuinely just don’t understand: LLMs, the most advanced “AI” they’re trying to sell everybody, are as capable of reasoning as any compression algorithm, JPG, PNG, WebP, ZIP, TAR, whatever you want. They cannot reason. They take some input and generate an output deterministically. The reason the output changes slightly between runs is that randomness is deliberately injected into the sampling, for complicated but important reasons.
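
          Here’s what “deterministic plus injected randomness” means in practice. The names are illustrative, but the temperature-and-seed mechanics are the real thing:

          ```python
          # Sampling from fixed scores ("logits"): the computation itself is
          # deterministic; the variation comes from deliberately injected
          # randomness (a temperature and a random seed).
          import math
          import random

          def sample_with_temperature(logits, temperature=1.0, seed=None):
              rng = random.Random(seed)  # fix the seed -> same output every run
              scaled = [l / temperature for l in logits]
              total = sum(math.exp(s) for s in scaled)
              probs = [math.exp(s) / total for s in scaled]
              return rng.choices(range(len(probs)), weights=probs)[0]

          # Same logits, same seed: identical "choice" every single run.
          print(sample_with_temperature([2.0, 1.0, 0.5], temperature=0.8, seed=42))
          ```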

          Again, to recap: LLMs and similar neural-network “AI” are as capable of reasoning as any other computer program you interact with, knowingly or unknowingly, that being not at all. Your silly Wikipedia page is about a very specific term, “reasoning system”, which would include things like standard video-game NPC AI, such as the zombies in Minecraft. I hope you aren’t stupid enough to say those are capable of reasoning.
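
          For the record, this is roughly the kind of “reasoning system” that page covers, a toy rule engine (the rules below are made up for illustration, not Minecraft’s actual zombie logic):

          ```python
          # A trivial rule-based "reasoning system": fixed if/else rules,
          # no understanding anywhere. Rules are invented for illustration.
          def zombie_ai(distance_to_player, is_daytime):
              if is_daytime:
                  return "burn"
              if distance_to_player < 2:
                  return "attack"
              if distance_to_player < 35:
                  return "chase"
              return "wander"
          ```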

        • cmhe@lemmy.world
          28 minutes ago

          That link is about reasoning systems, not reasoning. Reasoning involves actually understanding the knowledge, not just having it: testing and validating it where the knowledge is contradictory.

          An LLM doesn’t understand the difference between hard and soft rules of the world. Everything is up for debate; everything is just text and words that can be ordered with some probabilities.

          It cannot check whether something is true; it just ‘knows’ that someone on the internet talked about something, sometimes with a resolution, and often without one or with contradictory ones…

          It is a gossip machine that tries to ‘reason’ about whatever it has heard people say.