• elucubra@sopuli.xyz
    27 days ago

I’m trying out Google’s Gemma4 LLM, which runs locally and is touted as a 100% private model.

When I asked it some questions about itself, at one point it acknowledged that chats were sent to “developers”.

    • natebluehooves@pawb.social
      27 days ago

      llama.cpp doesn’t gain the ability to send telemetry just because the next-word predictor says it does. You can confirm with Wireshark.
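
      A minimal sketch of the same check without a packet capture, on Linux only: inspect a process's file descriptors under `/proc` and see whether any of them are sockets. The `sleep` subprocess here is just a hypothetical stand-in for a local inference process; the `socket_fds` helper is illustrative, not part of llama.cpp.

      ```python
      import os
      import subprocess
      import time

      def socket_fds(pid):
          """Return the fds of `pid` that are sockets (Linux /proc only)."""
          fd_dir = f"/proc/{pid}/fd"
          socks = []
          for fd in os.listdir(fd_dir):
              try:
                  target = os.readlink(os.path.join(fd_dir, fd))
              except OSError:
                  continue  # fd closed while we were iterating
              if target.startswith("socket:"):
                  socks.append(fd)
          return socks

      # Stand-in for a local model process: does no networking, so it
      # should hold no socket file descriptors.
      proc = subprocess.Popen(["sleep", "5"])
      time.sleep(0.5)
      print(socket_fds(proc.pid))  # expected: [] on Linux
      proc.terminate()
      proc.wait()
      ```

      A process that never opens a socket has nothing to send telemetry over; Wireshark answers the stronger question of what actually crosses the wire.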

    • nightlily@leminal.space
      27 days ago

      You mean it hallucinated a positive response to your leading question, as it is designed to? You are operating on a fundamental misunderstanding of what LLMs do. Even if what you said were true, an LLM would have no knowledge of it unless it was explicitly told so in its input - and why would they be stupid enough to do that?