corbin@infosec.pub to Lemmy Shitpost@lemmy.world · English · 4 days ago — "can't beat the classics" (image post, 44 comments)
Pennomi@lemmy.world · 4 days ago: As always, technology isn't the enemy; it's the corporations controlling it that are. And honestly, the freely available local LLMs aren't too far behind the big ones.
lmuel@sopuli.xyz · 4 days ago: Well, in some ways they are. It also depends a lot on the hardware you have, of course — a normal 16 GB GPU won't fit huge LLMs. The smaller ones are getting impressively good at some things, but many of them still struggle with non-English languages, for example.
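The "16 GB GPU won't fit huge LLMs" point comes down to simple arithmetic: weight memory is roughly parameter count times bytes per parameter. A minimal sketch (the helper name and the flat per-parameter estimate are illustrative; real deployments also need memory for the KV cache and activations, so treat these as lower bounds):

```python
# Back-of-envelope estimate of the memory needed just to hold an
# LLM's weights, ignoring KV cache and activation overhead.

def weight_vram_gb(n_params_billion: float, bits_per_param: int) -> float:
    """Approximate gigabytes required to load the weights alone."""
    bytes_per_param = bits_per_param / 8
    # params (in billions) * 1e9 * bytes each, converted back to GB
    return n_params_billion * bytes_per_param

# A 70B-parameter model in fp16 needs ~140 GB -- far beyond 16 GB.
print(weight_vram_gb(70, 16))  # 140.0
# A 7B model quantized to 4 bits fits comfortably: ~3.5 GB.
print(weight_vram_gb(7, 4))    # 3.5
```

This is why quantized small models run locally on consumer cards while the largest models do not, regardless of software.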