• 0 Posts
  • 52 Comments
Joined 1 year ago
Cake day: March 4th, 2024

  • Just for what it’s worth, you don’t need CSAM in the training material for a generative AI to produce CSAM. The models know what children look like, and what naked adults look like, so they can readily extrapolate from there.

    The fact that you don’t need to supply any real CSAM in the training material is the reasoning being offered in support of AI CSAM. It’s gross, but it’s also hard to argue with. We allow all kinds of illegal subjects to be depicted in porn: incest, rape, murder, etc. While most mainstream sites won’t host those kinds of material, none of them are technically outlawed - partly because of freedom of speech and artistic expression and yadda yadda, but also partly because it all comes with the understanding that it’s a fake, made-for-film production and that nobody involved had their consent violated. It’s okay because none of it was actually rape, incest, or murder. So if AI CSAM can be made without violating the consent of any real person, what makes it different?

    I don’t know how I feel about it, myself. The idea of “ethically sourced” CSAM doesn’t exactly sit right with me, but if it’s possible to make it in a truly victimless manner, then I find it hard to argue for outright banning something just because I don’t like it.


  • I assume that this is using a highly-curated, custom model, and not some off-the-shelf GPT that just anybody can use, so it probably won’t be suggesting that patients eat glue or anything crazy.

    From what I can tell, this sounds like a fairly valid use for a chatbot: handling a lot of the tedious tasks that nurses are charged with. Most of what it seems to be doing, any untrained receptionist could also do (like scheduling appointments or reading out dosage instructions), so this would free up nurses for genuinely important tasks like administering medications and triaging patients. It doesn’t seem like it’s going to be issuing prescriptions or doing anything where real judgment would be necessary.

    As long as hospital staff are realistic about what tasks the chatbot should handle, this actually seems like a pretty decent place to implement a (properly-tuned) LLM.


  • I’ve been wanting a replacement for ages now. The problem is that Discord does nearly everything it does very well (with a few exceptions), way better than any of its competitors. It’s incredibly hard to replace, because no other product really matches it in any category. Cost, ease of use, feature set, cross-app API support… nobody else comes close; even if you paid a ton of money for premium services to replace Discord, you’d still likely be downgrading your overall experience.

    I really want to see more competition in this space.