Ergorotic literature.
Hail Satan.
THE FINALS fanatic, join us at !THE_FINALS@fedia.io
THE FINALS: Season 4 Power Shift - #45 Worldwide
Always give them a wide berth, because there’s no way to know whether they see you, and they’ll often behave erratically and unpredictably in crosswalks.
All of this applies to dealing with human drivers, too.
Australia’s usually really strict when it comes to violence in video games, but the Silent Hill series isn’t really known for intense gore. That said, the trailer looked like it was going to be fairly body horror-focused (I got a lot of Junji Ito vibes from it), so maybe SH:F will actually be a bit bloodier than other SH games.
Erdogan is a dictator who has the power to block X altogether.
Does it really make a difference whether it’s Erdogan or Musk who silences these people?
I assume that this is using a highly-curated, custom model, and not some off-the-shelf GPT that just anybody can use, so it probably won’t be suggesting that patients eat glue or anything crazy.
From what I can tell, it sounds like this is actually a fairly valid use for a chatbot, handling a lot of the tedious tasks that nurses are charged with. Most of what it seems to be doing, any untrained receptionist could also do (like scheduling appointments or reading dosage instructions), so this would free up nurses for actually important tasks like administering medications and triaging patients. It doesn’t seem like it’s going to be issuing prescriptions or anything where real judgement would be necessary.
As long as hospital staff are realistic about what tasks the chatbot should handle, this actually seems like a pretty decent place to implement a (properly-tuned) LLM.
Like most consumer tech, they started out pretty awesome and slowly got worse and worse as features got stripped away after countless patent infringement cases. Now pretty much all the “smart speakers” available are absolutely trash.
I’ve since disabled the mics on mine, and just use them for manually casting music.
For its part, Tesla has been trying to boost its image with the help of President Trump.
Yeah, that’s part of the problem, Elon.
Didn’t another car manufacturer have a similar “glitch” with in-car ads fairly recently? This story feels so familiar.
Good! Hopefully more devs follow suit and pull traffic away from Fandom.
I’m also hopeful that they manage to fix the SEO on the site, as Fandom is still the top result for “Vampire Survivors wiki”.
Anything to avoid discussing TES6.
I believe it’s supposed to be more of a spiritual successor to The Sims.
You had a golden opportunity to say “we are an old-school tech company.”
Sony doesn’t want to tread on Nintendo’s turf.
Uhh, isn’t that kinda against the whole point of being a spokesperson in the first place? To put a name and a face behind a message?
Dunno why The Verge plays along.
I’ve been wanting a replacement for ages now. The problem is that Discord does everything it does very well (with a few exceptions), way better than any of its competitors. It’s incredibly hard to replace, because no other product really matches it in any category. Cost, ease of use, feature set, cross-app API support… Nobody else comes close; even if you paid a ton of money for premium services to replace Discord, you’re still likely going to downgrade your overall experience.
I really want to see more competition in this space.
I’m pretty sure that just doing “quick searches” is exactly how he ended up with AI answers to begin with.
“Man who works 10 hours per year tells underlings to work 60 hours per week.”
Dang, there went all my legitimate plans for signal jamming.
Just for what it’s worth, you don’t need CSAM in the training material for a generative AI to produce CSAM. The models know what children look like, and what naked adults look like, so they can readily extrapolate from there.
The fact that you don’t need to supply any real CSAM for the training material is the reasoning being offered in support of AI-generated CSAM. It’s gross, but it’s also hard to argue with. We allow all kinds of illegal subjects to be depicted in porn: incest, rape, murder, etc. While most mainstream sites won’t host that kind of material, none of it is technically outlawed - partly because of freedom of speech and artistic expression and yadda yadda, but also partly because it comes with the understanding that it’s a fake, made-for-film production in which nobody had their consent violated; it’s okay because none of it was actually rape, incest, or murder. And if AI CSAM can be made without violating the consent of any real person, then what makes it different?
I don’t know how I feel about it, myself. The idea of “ethically sourced” CSAM doesn’t exactly sit right with me, but if it really can be made in a truly victimless manner, then it’s hard to argue for outright banning something just because I don’t like it.