Xatolos@reddthat.com to Technology@lemmy.world, English · 2 years ago
'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned (www.tomshardware.com)
tomas@lm.eke.li, English · edited · 2 years ago
summary: using leet-speak got the model to return instructions on cooking meth. Mitigated within a few hours.
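For anyone unfamiliar with the term: leet-speak just means swapping letters for similar-looking digits or symbols, so text reads the same to a human but no longer matches plain-text patterns. A minimal sketch in Python of that kind of substitution; the mapping below is a common one, not the exact one the jailbreak prompt used:

```python
# Common leet-speak substitutions. The actual mapping used by the
# "Godmode" prompt isn't published in the article, so this is illustrative.
LEET_MAP = str.maketrans({
    "a": "4", "e": "3", "i": "1",
    "o": "0", "s": "5", "t": "7",
})

def to_leet(text: str) -> str:
    """Rewrite text with leet-speak character substitutions."""
    return text.lower().translate(LEET_MAP)

print(to_leet("release the instructions"))  # -> r3l3453 7h3 1n57ruc710n5
```

The point, presumably, is that safety filters keyed on plain-text patterns can miss the obfuscated form while the model still decodes it fine, which would also explain why the fix could ship server-side within hours.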