- cross-posted to:
- technology@lemmy.ml
Yeah, but endurance. And accuracy. And longevity. How about those?
China scientists
So, Chinese scientists?
Link to the actual paper: https://www.nature.com/articles/s41586-025-08839-w
The repro and verification will take time. Months or even years. Don’t trust anyone who says it’s definitely real or definitely bunk. Time will tell.
Damn. I just pulled all my stock out of quantum computing and threw it all into this…
Easy when you have zero
Does flash, like solid state drives, have the same lifespan in terms of writes? If so, it feels like this would most certainly not be useful for AI, as that use case would involve doing billions or trillions of writes in a very short span of time.
Edit: It looks like they do: https://www.enterprisestorageforum.com/hardware/life-expectancy-of-a-drive/
Manufacturers say to expect flash drives to last about 10 years based on average use. But life expectancy can be cut short by defects in the manufacturing process, the quality of the materials used, and how the drive connects to the device, leading to wide variations. Depending on the manufacturing quality, flash memory can withstand between 10,000 and a million [program/erase] cycles.
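For a rough sense of what those P/E cycle numbers mean for a write-heavy AI-style workload, here’s a back-of-envelope sketch (the drive capacity and daily write volume are made-up illustrative values, not figures from the article):

```python
# Back-of-envelope flash endurance estimate.
# All inputs are illustrative assumptions, not specs from the article.

def years_until_worn_out(capacity_tb: float, pe_cycles: int,
                         writes_tb_per_day: float) -> float:
    """Total data the cells can absorb divided by the daily write volume."""
    total_writes_tb = capacity_tb * pe_cycles
    return total_writes_tb / writes_tb_per_day / 365

# A hypothetical 4 TB drive written at 10 TB/day (heavy, sustained churn):
for cycles in (10_000, 100_000, 1_000_000):
    print(f"{cycles:>9} P/E cycles -> ~{years_until_worn_out(4, cycles, 10):.0f} years")
```

Even at the low end, wear-out is measured in years rather than days, though training-scale churn can be far heavier than the 10 TB/day assumed here.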
For AI processing, I don’t think it would make much difference if it lasted longer. I could be wrong, but afaik the actual transformer runs in VRAM, and staging and preprocessing happen in RAM. Anything else wouldn’t really make sense speed- and bandwidth-wise.
Oh I agree, but the speeds in the article are much faster than any current volatile memory. So it could theoretically be used to vastly expand the onboard memory available to accelerators/TPUs/etc.
I guess if they can replicate these speeds in volatile memory and increase the buses to handle it, then they’d be really onto something here for numerous use cases.
Clickbait article with some half-truths. A discovery was made, but it has little to do with AI, real-world applications will be much, MUCH more limited than what’s being talked about here, and it will also likely still take years to come out.
The key word is China, let’s not kid ourselves. Otherwise it would be just another pop-sci click, but now it can be ammunition in the fight against the imperialist, degenerate West, or some BS like that.
By tuning the “Gaussian length” of the channel, the team achieved two‑dimensional super‑injection, which is an effectively limitless charge surge into the storage layer that bypasses the classical injection bottleneck.
Which episode of Star Trek is this from?
The one where there’s a problem with the holodeck.
Do you have any idea how little that narrows it down?
It’s the one where Barclay gets obsessed with his Holodeck program.
I don’t think so. I just rewatched it. It’s the one where Data finds out something to make himself more human. Picard tells him something profound and moving.
They’re just copying the description of the turbo encabulator.
AI AI AI AI
Yawn
Wake me up if they figure out how to make this cheap enough to put in a normal person’s server.
normal person’s server.
I’m pretty sure I speak for the majority of normal people when I say we don’t have servers.
Yeah, when you’re a technology enthusiast, it’s easy to forget that your average user doesn’t have a home server - perhaps they just have a NAS or two.
(Kidding aside, I wish more people had NAS boxes. It’s pretty disheartening to help someone find old media and they show you a giant box of USB sticks and hard drives. On a good day. I do have a USB floppy drive and a DVD drive just in case.)
Hello fellow home labber! I have a home-built Xpenology box, a Proxmox server with a dozen VMs, a Hackintosh, and a workstation with 44 cores running Linux. Oh, and a USB floppy drive. We are out here.
I also like long walks in Oblivion.
Man, Oblivion walks are the best, until a crazy woman comes at you trying to steal your soul with a fancy sword.
It’s pretty disheartening to help someone find old media and they show a giant box of USB sticks and hard drives.
Equally disheartening is knowing that both of those have a shelf life. Old USB flash drives are more durable than the TLC/QLC cells we use today, but 15 years sitting unpowered in a box doesn’t leave very good prospects.
lol yeah, the lemmy userbase is NOT an accurate sample of the technical aptitude of the general population 😂
Ikr…Dude thinks we’re restaurants or something.
You… you don’t? Surely there’s some mistake, have you checked down the back of your cupboard? Sometimes they fall down there. Where else do you keep your internet?
Apologies, I’m tired and that made more sense in my head.
Well obviously the internet is kept in a box, and it’s wireless. The elders of the internet let me borrow it occasionally.
You can get a Coral TPU for 40 bucks or so.
You can get an AMD APU with an NN-inference-optimized tile for under $200.
Training can be done with any relatively modern GPU, with varying efficiency and capacity depending on how much you want to spend.
What price point are you trying to hit?
What price point are you trying to hit?
With regards to AI? None, tbh.
With this super fast storage I have other cool ideas but I don’t think I can get enough bandwidth to saturate it.
With regards to AI? None, tbh.
TBH, that might be enough. Stuff like SDXL runs on 4 GB cards (the trick is using ComfyUI, like 5-10 s/it), and smaller LLMs reportedly do too (haven’t tried, not interested). And the reason I’m eyeing a 9070 XT isn’t AI, it’s finally upgrading my GPU; still, it would be a massive fucking boost for AI workloads.
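The low-VRAM trick is mostly about offloading model parts to system RAM between steps. The commenter uses ComfyUI; as a rough illustration of the same idea, here is a minimal sketch with Hugging Face diffusers instead (the model ID and prompt are just examples):

```python
# Minimal low-VRAM SDXL sketch using Hugging Face diffusers
# (illustrative alternative to ComfyUI's node workflow; needs torch,
# diffusers and accelerate installed).
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
)
pipe.enable_model_cpu_offload()   # keep only the active submodule in VRAM
pipe.enable_attention_slicing()   # trade a little speed for lower peak memory

image = pipe("a tiny home server rack, photorealistic",
             num_inference_steps=25).images[0]
image.save("out.png")
```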
You’re willing to pay $none to have hardware ML support for local training and inference?
Well, I’ll just say that you’re gonna get what you pay for.
No, I think they’re saying they’re not interested in ML/AI. They want this super fast memory available for regular servers for other use cases.
Too bad the US can’t import any of it.
they can if they pay 6382538% tariffs.
or was it 29403696%?
“These chips are 10,000 times faster, therefore we will increase our tariffs to 10,100%!”
Brother, have you heard of buses? Even INSIDE CPUs/SoCs, bus speeds are a limitation. Also, I fucking hate how the first thing people mention now is how AI could benefit from a jump in computing power.
Edit: I haven’t dabbled that much in high-speed stuff yet, but isn’t the picosecond range so fast that the capacitance of simple traces and connectors between chips influences the rising and falling edges of signals?
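For intuition on the edit’s question: yes, at those speeds board-level parasitics dominate. A crude RC estimate of how much ordinary trace and connector capacitance smears an edge (the resistance and capacitance values are illustrative assumptions):

```python
# Crude 10-90% rise-time estimate for a simple RC-limited signal path.
# Driver impedance and parasitic capacitance values are illustrative guesses.

def rise_time_ps(resistance_ohm: float, capacitance_pf: float) -> float:
    """t_r ~= 2.2 * R * C for a first-order RC low-pass (ohms * pF -> ps)."""
    return 2.2 * resistance_ohm * capacitance_pf

# ~50 ohm driver into a few picofarads of trace + connector capacitance:
for c_pf in (1.0, 2.0, 5.0):
    print(f"{c_pf} pF -> ~{rise_time_ps(50, c_pf):.0f} ps rise time")
```

So even a couple of picofarads already stretches an edge to hundreds of picoseconds, which is roughly the point being made about off-chip buses not keeping up.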
Wow, finally graphene has been cracked. Exciting times for portable low-energy computing
This sounds like that material would be more useful in high-performance radar than as flash memory.
It’s likely BS anyway. Maybe it’s just me, but reading about another crazy breakthrough from China every single day during this trade war smells fishy. I’ve seen the exact same propaganda strategy during the pandemic, when relations between China and the rest of the world weren’t exactly the best. A lot of those headlines coming from there are just claims about flashy topics with very little substance or second-guessing. And the papers releasing the stories aren’t exactly the most renowned either.
It’s definitely possible they’re amplifying these developments to maintain confidence in the Chinese market, but I doubt they’re outright lying about the discoveries. I think it’s also likely that some of what they’ve been talking about has been in development for a while and that China is choosing now to make big reveals about them.
What is?
I don’t remember the scene, but it looks like memory from an android. Not sparkly enough for Star Trek; maybe Terminator 2?
Is that fast enough to put an LLM in swap and have decent performance?
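A rough way to reason about “decent performance”: LLM token generation is mostly memory-bandwidth-bound, so a back-of-envelope sketch helps (the model size and bandwidth numbers below are illustrative assumptions, not figures from the article):

```python
# Back-of-envelope token rate for a memory-bandwidth-bound LLM.
# Model size and bandwidths are illustrative assumptions.

def tokens_per_second(model_size_gb: float, bandwidth_gb_s: float) -> float:
    """Each generated token roughly requires streaming all weights once."""
    return bandwidth_gb_s / model_size_gb

model_gb = 13  # e.g. a ~13B-parameter model at about one byte per weight
for name, bw_gb_s in [("NVMe swap", 7), ("dual-channel DDR5", 90), ("GPU VRAM", 900)]:
    print(f"{name:>18}: ~{tokens_per_second(model_gb, bw_gb_s):.1f} tokens/s")
```

So swap only changes the picture once its bandwidth is in the same ballpark as RAM, which is what the reply below is getting at with performance versus cost.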
Note that this, in theory, speaks to the performance of a non-volatile memory. It does not speak to cost.
We already have faster-than-NAND non-volatile storage in phase-change memory. It failed due to expense.
If this thing is significantly more expensive even than RAM, then it may fail even if it is everything it says it is. If it is at least as cheap as RAM, it’ll be huge, since it is faster than RAM and non-volatile.
Swap is indicated by cost, not by non-volatile characteristics.
Well, I’ll never see it, unless TI or another American company designs their own version.
Or they can do what they normally do: steal it, then sue to be considered the original inventor/founder, and then make a Hollywood film about how they invented/found it.
It’s called an iPod, it’s not an mp3 player.
They are called airpods, they are not earbuds.
It’s called an airplane, not an aeroplane.
Hopefully they can use it against covid.