None of this is news; this jailbreak has been around forever.
It’s literally just a spoof of authority.
Thing is, GPT still sucks at coding, and I don’t think that’s changing any time soon. These models get their power from whatever’s done most commonly, but, as we know, the most common patterns can be vulnerable, or can break the moment a new version of the library drops (quick example below).
Coding isn’t deterministic.
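To make that concrete, here’s a hypothetical sketch (my own example, not anything a model actually produced): string-built SQL is exactly the kind of “most common” pattern that’s all over old tutorials, and it’s a textbook injection vuln next to the parameterized version.

```python
import sqlite3

# Hypothetical example: the pattern old tutorials repeat endlessly,
# which is exactly what a model trained on them will reproduce.
def get_user_bad(conn: sqlite3.Connection, username: str):
    # String interpolation into SQL -- injectable.
    query = f"SELECT * FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

# The safe version: a parameterized query, escaped by the driver.
def get_user_safe(conn: sqlite3.Connection, username: str):
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (username,)
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'admin')")
    payload = "' OR '1'='1"        # classic injection payload
    print(get_user_bad(conn, payload))   # leaks every row
    print(get_user_safe(conn, payload))  # returns nothing
```

Both versions “work” on the happy path, which is why the bad one survives in training data.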