Lol, saying you are “beginning a process designed to delete your data” is a very different thing from actually deleting your data.

  • dropped_packet@lemmy.zip · 2 days ago

    I’m not disputing the technical aspect. But given these realities, I prefer to drastically limit the services I interact with.

    • djmikeale@feddit.dk · 2 days ago

      Aha I misunderstood, thanks for clarifying.

      Actually, for this specific context there’s an easy solution: for LLMs I reckon self-hosting would be the way to go, if your hardware supports it. I’ve heard a lot of the smaller models have gotten much more capable over the last year. Something like the sketch below is what I have in mind.
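
      Just a rough sketch of what self-hosting looks like in practice, assuming you’ve installed Ollama and pulled a small model (the model tag and prompt here are placeholders, not recommendations):

      ```python
      # Minimal sketch: talking to a self-hosted model through Ollama's local HTTP API.
      # Assumes Ollama is running on its default port and a small model has already
      # been pulled (e.g. `ollama pull llama3.2`); the tag below is just an example.
      import requests

      resp = requests.post(
          "http://localhost:11434/api/generate",
          json={
              "model": "llama3.2",   # pick whatever fits your hardware
              "prompt": "Explain in two sentences why someone might self-host an LLM.",
              "stream": False,       # return one JSON object instead of a token stream
          },
          timeout=120,
      )
      resp.raise_for_status()
      print(resp.json()["response"])  # prompt and output never leave your machine
      ```

      The point being that nothing gets sent to a third-party service, which is kind of the whole appeal here.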

      • dropped_packet@lemmy.zip · 2 days ago

        Small, fine-tuned models seem to be where the market as a whole is headed. Even the big players like OpenAI/Google/Meta are doing this as a means to optimize infrastructure. The Qwen3 models have been really interesting to work with.
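
        If anyone wants to poke at one of the smaller Qwen3 checkpoints directly, this is roughly what it looks like with Hugging Face transformers (just a sketch; the checkpoint name and generation settings are examples, not a recommendation):

        ```python
        # Rough sketch: running a small Qwen3 checkpoint locally with transformers.
        # Assumes transformers, torch and accelerate are installed; the model name
        # below is an example of one of the smaller Qwen3 checkpoints.
        from transformers import AutoModelForCausalLM, AutoTokenizer

        model_name = "Qwen/Qwen3-0.6B"
        tokenizer = AutoTokenizer.from_pretrained(model_name)
        model = AutoModelForCausalLM.from_pretrained(
            model_name, torch_dtype="auto", device_map="auto"
        )

        messages = [{"role": "user", "content": "What are small local models actually good at?"}]
        inputs = tokenizer.apply_chat_template(
            messages, add_generation_prompt=True, return_tensors="pt"
        ).to(model.device)

        outputs = model.generate(inputs, max_new_tokens=256)
        # Strip the prompt tokens so only the model's reply gets printed
        print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
        ```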