- cross-posted to:
- technology@lemmy.ml
Not possible.
Why? If they looked at how current tech works then they could easily develop the same tech 10000x faster
Yeah… At best click baity as fuck, at worst a complete scam.
Any time there is a 10x or more in a headline you are 10x or more likely to be right by calling it BS.
Whenever they say X-whatever-times, I doubt it right away, because they always interpret the statistics in the dumbest ways possible. You have a solar panel that’s 28% efficient; there’s no way it can be 20 times as efficient, that’s just clickbait.
You just fucking wait. Trump is bringing manufacturing to the US. And when that plant opens someday you’ll be so sorry you doubted.
I’m sure the Foxconn plant in Wisconsin will fire up ANY DAY NOW! drums fingers
This article appeared in my feed just above another article about how China has the world’s first operational thorium reactor. Meanwhile, the US is about to fight a civil war over whether vaccination causes measles and stripping away the last of our social programs in order to get our wealthiest people another 2% subsidy.
China and Russia worked very hard to get these rich stupid people in power.
It really started in 2016 when US security agencies released a joint report showing Russia was spreading misinformation to help Trump win the election.
Surprisingly, the “liberal tears compilations” and “something about an email server people didn’t understand wasn’t actually illegal” actually worked and drowned out the warnings from our security agencies.
I don’t think China will be any better of a world leader tbh.
I see humanity’s future as a boot stepping on a human face forever, unless humanity globally rejects kings, oligarchs, and dictators.
Don’t forget the genius DNC folk, including HRC, thought a pied-piper strategy of boosting the circus peanut was a good idea.
If the Russians and Chinese did anything, it was just capitalizing on an unforced error born of centrist hubris. Once again, Bernie would have won, but that was more distasteful to the ruling class than fascism.
Oddly, the DNC’s position on the republican candidate in the circus that was the 2016 primary wasn’t likely all that influential or determinative.
trump figured out that running a political campaign as entertainment and leveraging the power of, well, just lying about everything was possible in the modern media environment. republicans had been working for decades on tilling the ground for an authoritarian that they could manage, but got themselves owned instead. Oops.
Not my future, I will try to die in a way that even an omnipotent AI can’t bring back.
Gonna make sure to bring as many of those fuckers with me as possible.
You rely on professional fabricators of misinformation to tell you the truth about who is producing misinformation? Don’t fall for crude propaganda. When empires end they do some self-destructive things. It’s normal.
This is observable reality.
You should pay attention
It really started in 2016 when US security agencies released a joint report showing Russia was spreading misinformation to help Trump win the election.
Compare Russia to the British and consider who is the bigger villain.
Fuck the idiotic Americans that won’t bother to immunize, never mind understanding science as a whole.
No. We don’t want them to breed…
China scientists
So, Chinese scientists?
Chientists
Probably because Chinese is both an ethnicity and a nationality. There are ethnic Chinese people all over the world, and a few countries and regions have ethnic-Chinese majorities but aren’t related to China. Calling them the same thing plays into the PRC’s “all ethnic Chinese pledge their allegiance to China” nonsense.
Isn’t that true for every (older) country though?
Perhaps but I haven’t encountered that myself. I’m ethnic Chinese that’s a citizen of another ethnic Chinese majority nation so I’ve encountered this specific type a lot more.
Israel likes that game of pretend. They believe anti-Zionist Jews are traitors.
It’s a reasonable assumption that someone in China is Chinese.
The reverse, however, isn’t true. It may be somewhat understandable but not entirely reasonable to assume someone who is Chinese is from China which is what I’m trying to say.
I think it’s a slightly different connotation. “China scientists” suggests scientists residing in China without presuming their ethnicity, while “Chinese scientists” implies their ethnicity but not their location.
You literally never hear “America scientists” even if some of them might be from another country. Same with every single other country I can think of, except China.
US scientists works in the same way.
No they are people who study the China Science.
Real talk, why is discussion around people and subjects in China so fucking weird?
Whether it’s referring to the entire population with a blanket “the Chinese” when it only applies to the government or a subset of them, or silly shit like “China scientists”, everyone’s grammatical skills suddenly tank when broaching a topic even tangential to the PRC.
No, it’s people who study fine tableware.
Seriously, for me a “China scientist” is someone doing research on China, like a space scientist would do research on astronomy and similar. But I’m not a native English speaker, so, idk
Someone doing research on China is a chiologist.
Same as someone doing research on biology is a biologist.
Biology -> biolog -> biologist
China -> chin -> chinist?
The wording of the headline would be different if it were trying to convey that.
Me stutter? No think so!
Him legend.
Yeah, but endurance. and accuracy. and longevity. How about those?
And price. And maybe write more than a single bit.
Link to the actual paper: https://www.nature.com/articles/s41586-025-08839-w
The repro and verification will take time. Months or even years. Don’t trust anyone who says it’s definitely real or definitely bunk. Time will tell.
Speaking of, did you hear there’s a new room temperature super conductor?
Damn. I just pulled all my stock out of quantum computing and threw it all into this…
Easy when you have zero
By tuning the “Gaussian length” of the channel, the team achieved two‑dimensional super‑injection, which is an effectively limitless charge surge into the storage layer that bypasses the classical injection bottleneck.
Which episode of Star Trek is this from?
The one where there’s a problem with the holodeck.
Do you have any idea how little that narrows it down?
It’s the one where Barclay gets obsessed with his Holodeck program.
I don’t think so. I just rewatched it. It’s the one where Data finds out something to make himself more human. Picard tells him something profound and moving.
I think I saw that one. It’s the one where Riker sits down on a chair like he’s mounting a small horse.
Maybe it’s the episode where Picard does this:
They’re just copying the description of the turbo encabulator.
AI AI AI AI
Yawn
Wake me up if they figure out how to make this cheap enough to put in a normal person’s server.
normal person’s server.
I’m pretty sure I speak for the majority of normal people, but we don’t have servers.
“Normal person” is a modifier of “server.” It doesn’t state any expectation that every normal person has a server; instead, it sets the expectation that they’re talking about servers owned by normal people. I have a server. I am norm… crap.
Yeah, when you’re a technology enthusiast, it’s easy to forget that your average user doesn’t have a home server - perhaps they just have a NAS or two.
(Kidding aside, I wish more people had NAS boxes. It’s pretty disheartening to help someone find old media and they show you a giant box of USB sticks and hard drives. On a good day. I do have a USB floppy drive and a DVD drive just in case.)
Hello fellow home labber! I have a home-built Xpenology box, a Proxmox server with a dozen VMs, a hackintosh, and a workstation with 44 cores running Linux. Oh, and a USB floppy drive. We are out here.
I also like long walks in Oblivion.
Man oblivion walks are the best until a crazy woman comes at you trying to steal your soul with a fancy sword
lol yeah, the lemmy userbase is NOT an accurate sample of the technical aptitude of the general population 😂
It’s pretty disheartening to help someone find old media and they show a giant box of USB sticks and hard drives.
Equally disheartening is knowing that both of those have a shelf-life. Old USB flash drives are more durable than the TLC/QLC cells we use today, but 15 years sitting unpowered in a box doesn’t have very good prospects.
Ikr…Dude thinks we’re restaurants or something.
You… you don’t? Surely there’s some mistake, have you checked down the back of your cupboard? Sometimes they fall down there. Where else do you keep your internet?
Apologies, I’m tired and that made more sense in my head.
Well obviously the internet is kept in a box, and it’s wireless. The elders of the internet let me borrow it occasionally.
You can get a Coral TPU for 40 bucks or so.
You can get an AMD APU with a NN-inference-optimized tile for under 200.
Training can be done with any relatively modern GPU, with varying efficiency and capacity depending on how much you want to spend.
What price point are you trying to hit?
I just use pre-made AIs and write some detailed instructions for them, and then watch them churn out basic documents over hours… I need a better laptop.
What price point are you trying to hit?
With regards to AI? None, tbh.
With this super fast storage I have other cool ideas but I don’t think I can get enough bandwidth to saturate it.
With regards to AI? None, tbh.
TBH, that might be enough. Stuff like SDXL runs on 4 GB cards (the trick is using ComfyUI; roughly 5–10 s/it), and reportedly smaller LLMs do too (haven’t tried, not interested). And the reason I’m eyeing a 9070 XT isn’t AI, it’s finally upgrading my GPU; still, it would be a massive fucking boost for AI workloads.
You’re willing to pay $none to have hardware ML support for local training and inference?
Well, I’ll just say that you’re gonna get what you pay for.
No, I think they’re saying they’re not interested in ML/AI. They want this super fast memory available for regular servers for other use cases.
Precisely.
I have a hard time believing anybody wants AI. I mean, AI as it is being sold to them right now.
I mean the image generators can be cool and LLMs are great for bouncing ideas off them at 4 AM when everyone else is sleeping. But I can’t imagine paying for AI, don’t want it integrated into most products, or put a lot of effort into hosting a low parameter model that performs way worse than ChatGPT without a paid plan. So you’re exactly right, it’s not being sold to me in a way that I would want to pay for it, or invest in hardware resources to host better models.
Clickbait article with some half-truths. A discovery was made, but it has little to do with AI, real-world applications will be much, MUCH more limited than what’s being talked about here, and it will also likely still take years to come out.
The key word is China, let’s not kid ourselves. Otherwise it would be just another pop-sci click, but now it can be ammunition in the fight against the imperialist, degenerate West, or some BS like that.
Too bad the US can’t import any of it.
they can if they pay 6382538% tariffs.
or was it 29403696%?
“These chips are 10,000 times faster, therefore we will increase our tariffs to 10,100%!”
Brother, have you heard of buses? Even INSIDE CPUs/SoCs, bus speeds are a limitation. Also, I fucking hate how the first thing people mention now is how AI could benefit from a jump in computing power.
Edit: I haven’t dabbled that much in high-speed stuff yet, but isn’t the picosecond range so fast that the capacitance of simple traces and connectors between chips influences the rising and falling edges?
That’s pretty much my understanding. Most of the advancements in memory speeds are related to the physical proximity of the memory and more efficient transmission/decoding.
GDDR7 chips for example are packed as close as physically possible to the GPU die, and have insane read speeds of 28 Gbps/pin (and a 5090 has a 512-bit bus). Most of the limitation is the connection between GPU and RAM, so speeding up the chips internally 1000x won’t have a noticeable impact without also improving the memory bus.
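The per-pin rate and bus width quoted above multiply out to the card’s total bandwidth; a quick back-of-the-envelope sketch (figures taken from the comment, real cards vary):

```python
# Peak memory bandwidth for a 512-bit bus at 28 Gbps per pin
# (the GDDR7 figures quoted above for a 5090).
pin_rate_gbit = 28            # per-pin data rate, Gbit/s
bus_width_bits = 512          # bus width = number of data pins
bandwidth_gbyte = pin_rate_gbit * bus_width_bits / 8  # Gbit/s -> GB/s
print(f"~{bandwidth_gbyte:.0f} GB/s")  # -> ~1792 GB/s
```

So even today the memory subsystem moves nearly 1.8 TB/s; making the cells internally faster does nothing unless that link gets wider or faster too.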
Does flash, like in solid-state drives, have the same lifespan in terms of writes? If so, it feels like this would most certainly not be useful for AI, as that use case would involve doing billions/trillions of writes in a very short span of time.
Edit: It looks like they do: https://www.enterprisestorageforum.com/hardware/life-expectancy-of-a-drive/
Manufacturers say to expect flash drives to last about 10 years based on average use. But life expectancy can be cut short by defects in the manufacturing process, the quality of the materials used, and how the drive connects to the device, leading to wide variations. Depending on the manufacturing quality, flash memory can withstand between 10,000 and a million [program/erase] cycles.
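That P/E-cycle range translates into a rough ceiling on total bytes written before wear-out. A minimal sketch, assuming a hypothetical 1 TB device and perfect wear leveling (real drives over-provision and throttle, so treat these as upper bounds):

```python
# Rough total-bytes-written ceiling: capacity times P/E cycles,
# using the 10,000-to-1,000,000 cycle range quoted above.
capacity_tb = 1  # hypothetical drive size
for pe_cycles in (10_000, 1_000_000):
    total_pb = capacity_tb * pe_cycles / 1_000   # TB written -> PB
    print(f"{pe_cycles:>9,} P/E cycles -> ~{total_pb:,.0f} PB written")
```

Even the optimistic end (~1,000 PB) is finite, which is why write-heavy workloads on flash are an endurance question, not just a speed one.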
For AI processing, I don’t think it would make much difference if it lasted longer. I could be wrong, but afaik, running the actual transformer for AI is done in VRAM, and staging and preprocessing is done in RAM. Anything else wouldn’t really make sense speed and bandwidth wise.
Oh I agree, but the speeds in the article are much faster than any current volatile memory. So it could theoretically be used to vastly expand memory availability for accelerators/TPUs/etc for their onboard memory.
I guess if they can replicate these speeds in volatile memory and increase the buses to handle it, then they’d be really onto something here for numerous use cases.
Wow, finally graphene has been cracked. Exciting times for portable low-energy computing