misk@sopuli.xyz to Technology@lemmy.world · English · 1 year ago
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation (www.404media.co)
WilliamTheWicked@lemmy.world · 1 year ago
Did you even read the explanation part of the article??? Thanks for the grammar correction while ignoring literally all context, though. You certainly put me in my place, milord.
kromem@lemmy.world · 1 year ago
What’s your beef with Google researchers probing the safety mechanisms of the SotA model? How was that evil?
andrai@feddit.de · 1 year ago
Now that Google has spilled the beans, WilliamTheWicked can no longer extract the contact information of females from the ChatGPT training data.