rvlobato@lemmy.ml to Open Source@lemmy.ml · 9 months ago
AMD Quietly Funded A Drop-In CUDA Implementation Built On ROCm: It's Now Open-Source (www.phoronix.com)
Cross-posted to: programming@programming.dev, technology@lemmy.world, linux@lemmy.world, programming@lemmy.ml, linux@lemmy.ml, hackernews@lemmy.smeargle.fans
mayooooo@beehaw.org · 9 months ago
A serious question: when will Nvidia stop selling their products and start charging rent? Like, 50 bucks a month gets you a 4070; your hardware can be a 4090, but that's 100 a month. I give it a year.
poVoq@slrpnk.net · 9 months ago
It's more efficient to rent the same GPU to multiple people at the same time, and Nvidia is already doing that with GeForce Now.
umbrella@lemmy.ml · 9 months ago
Whenever the infrastructure is good enough, they can keep the hardware and stream your workload to you.
Ludrol@szmer.info · 9 months ago
When the AI and data center hardware stops being profitable.