It’s not AI winter just yet, though there is a distinct chill in the air. Meta is shaking up and downsizing its artificial intelligence division. A new report out of MIT finds that 95 percent of companies’ generative AI programs have failed to earn any profit whatsoever. Tech stocks tanked Tuesday amid broader fears that […]
This is totally expected and also absolutely peanuts compared to Intel, who once released a processor that managed to perform floating point long division incorrectly in fascinating (if you’re the right type of nerd) and subtle ways. Hands up everyone who remembers that debacle!
Nobody? Just me?
Anyway, I totally had — and probably still have, somewhere — one of the affected chips. You could check whether yours was one of the flawed ones literally just by using the Windows calculator.
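For anyone who never saw it in person: the classic check used the division 4195835 / 3145727, whose bit pattern hits the flawed lookup-table entries. A sketch of the test (in Python here, but the same arithmetic works in any calculator):

```python
# Classic Pentium FDIV check. On correct hardware the residue is
# essentially zero; on a flawed Pentium the division comes back
# wrong and the residue works out to 256.
x = 4195835.0
y = 3145727.0
residue = x - (x / y) * y
print(residue)  # ~0 on a correct FPU
```

On an affected chip the quotient itself was only wrong around the fifth significant digit, which is exactly why it took a number theorist doing prime-reciprocal computations to notice.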
Hah! That was my first thought, too, when I saw the headline.
Getting a few digits wrong way down in the not-very-significant bits of a division is far better than encouraging all your users to use an LLM to generate the answers for their quarterly reports / tax forms / do-we-have-enough-food-for-the-winter calculations. The Pentium division fuckup was barely worth fixing unless you were doing some kind of numerical analysis or simulation, which is why it slipped past all the testing initially. This is an astronomically worse fuck-up.
They even say not to use it for financial calculations or high stakes scenarios. They can’t provide an example of using it in any way that is useful for getting actual work done. It’s a solution in search of a problem.
Yeah, and I’m only supposed to use this bong for smoking tobacco. It said so very very clearly when I bought it so you know they mean it.
Oh no, I remember that well. I was in high school 👴
If only that recall had actually bankrupted the company. I wonder where we would be today…
But we can’t bankrupt Microsoft. Bill Gates can jump over a chair.❤️
The floating point bug we are talking about was in Intel Pentium processors. Also we need to bring back that news clip of Gates more often.
They’re talking about another company
OEM Problem, right?
Can you expand on this question? I don’t understand.
@dual_sport_dork @silence7 good times
I remember having to compensate for the Pentium float bug in the Turbo Pascal programs I was writing back then. I really didn’t understand what I was doing at the time, and the 90s version of StackOverflow (A Tripod blog?) wasn’t that enlightening…
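If memory serves, the widely circulated software workaround (Intel documented one along these lines) was to pre-scale both operands by 15/16 before dividing: the ratio is mathematically unchanged, but the scaled divisor avoids the bit patterns that triggered the flaw. A sketch, in Python rather than the Turbo Pascal of the era:

```python
def safe_div(x: float, y: float) -> float:
    # FDIV-era workaround sketch: multiply numerator and denominator
    # by 15/16 (exactly representable in binary floating point), which
    # leaves the quotient the same but shifts the divisor off the
    # at-risk bit patterns in the Pentium's lookup table.
    k = 15.0 / 16.0
    return (x * k) / (y * k)
```

Compiler vendors ended up baking checks like this into their runtime libraries, which is presumably what those Turbo Pascal patches were doing under the hood.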
If I remember correctly the Intel floating point thing didn’t come up as a negative for most users like AI does.
Does AI come up negative for most users? Here on Lemmy, surely yes. But out there I see/hear people using it (for dumb shit, mind you) all the time and being happy about it.
A lot of people are fine with getting wrong answers about shit they don’t know already. That’s what gets spread in social media and what was used for a large portion of the training data and what is available when AI does a web search.
It presents something that looks right, and that is what most people care about.
I remember too, buddy. It’s important to never forget.
Edit: oh, I guess it’s important to forget.