News

A researcher is warning of an unpatchable bug affecting hundreds of millions of iPhones that gives attackers system-level access to handsets via an unblockable jailbreak hack.
A ChatGPT jailbreak used rare languages to bypass built-in safety features, but OpenAI might have already fixed it.
Be warned that jailbreak tweaks aren't going to work as well, if at all.

Live Photos ought to work much more smoothly on older iPhones, since the feature doesn't actually require new hardware.