“It’s a double-edged sword,” says Bill Marczak, a senior researcher at the cybersecurity watchdog Citizen Lab. “You’re going to keep out a lot of the riffraff by making it harder to break iPhones. But the 1% of top hackers are going to find a way in and, once they’re inside, the impenetrable fortress of the iPhone protects them.”
Marczak has spent the last eight years hunting those top-tier hackers. His research includes the groundbreaking 2016 “Million Dollar Dissident” report that introduced the world to the Israeli hacking company NSO Group. And in December, he was the lead author of a report titled “The Great iPwn,” detailing how the same hackers allegedly targeted dozens of Al Jazeera journalists.
He argues that while the iPhone’s security is getting tighter as Apple invests millions to raise the wall, the best hackers have their own millions to buy or develop zero-click exploits that let them take over iPhones invisibly. These allow attackers to burrow into the restricted parts of the phone without ever giving the target any indication of having been compromised. And once they’re that deep inside, the security becomes a barrier that keeps investigators from spotting or understanding nefarious behavior—to the point where Marczak suspects they’re missing all but a small fraction of attacks because they cannot see behind the curtain.
This means that even to know you’re under attack, you may have to rely on luck or vague suspicion rather than clear evidence. The Al Jazeera journalist Tamer Almisshal contacted Citizen Lab after he received death threats about his work in January 2020, but Marczak’s team initially found no direct evidence of hacking on his iPhone. They persevered by looking indirectly at the phone’s internet traffic to see who it was whispering to, until finally, in July last year, researchers saw the phone pinging servers belonging to NSO. It was strong evidence pointing toward a hack using the Israeli company’s software, but it didn’t expose the hack itself.
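The technique Citizen Lab used here, watching where the phone's traffic goes rather than what is on the phone, can be sketched in outline: collect the domains a device looks up and flag any that match a watchlist of suspicious infrastructure. Below is a minimal, hypothetical Python illustration; the watchlist domains and the log format are invented for the example, and a real investigation works from far richer network captures:

```python
# Hypothetical sketch: flag DNS lookups that match a watchlist of
# suspicious domains, the way investigators might triage traffic logs.
# The domain names below are invented placeholders, not real infrastructure.

SUSPICIOUS_DOMAINS = {
    "update-cdn.example-spyware.net",
    "sync.example-exploit-server.com",
}

def flag_lookups(dns_log_lines):
    """Return log lines whose queried domain (or any parent domain)
    appears on the watchlist."""
    hits = []
    for line in dns_log_lines:
        # Assume each log line is "<timestamp> <queried-domain>".
        _, _, domain = line.strip().partition(" ")
        labels = domain.split(".")
        # Check the full domain and each parent suffix against the list.
        for i in range(len(labels) - 1):
            if ".".join(labels[i:]) in SUSPICIOUS_DOMAINS:
                hits.append(line.strip())
                break
    return hits

log = [
    "2020-07-03T10:12:01 gateway.icloud.com",
    "2020-07-03T10:12:09 sync.example-exploit-server.com",
]
print(flag_lookups(log))
```

As in the Almisshal case, a hit like this is circumstantial: it shows the phone talking to suspect infrastructure, not the implant itself.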
Sometimes the locked-down system can backfire even more directly. When Apple released a new version of iOS last summer in the middle of Marczak’s investigation, the phone’s new security features killed an unauthorized “jailbreak” tool Citizen Lab used to open up the iPhone. The update locked him out of the private areas of the phone, including a folder for new updates—which turned out to be exactly where hackers were hiding.
Faced with these blocks, “we just kind of threw our hands up,” says Marczak. “We can’t get anything from this—there’s just no way.”
Beyond the phone
Ryan Storz is a security engineer at the firm Trail of Bits. He leads development of iVerify, a rare Apple-approved security app that does its best to peer inside iPhones while still playing by the rules set in Cupertino. iVerify looks for security anomalies on the iPhone, such as unexplained file modifications—the sort of indirect clues that can point to a deeper problem. Installing the app is a little like setting up trip wires in the castle that is the iPhone: if something doesn’t look the way you expect it to, you know a problem exists.
But like the systems used by Marczak and others, the app can’t directly observe unknown malware that breaks the rules, and it is blocked from reading through the iPhone’s memory in the same way that security apps on other devices do. The trip wire is useful, but it isn’t the same as a guard who can walk through every room to look for invaders.
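The trip-wire idea itself is simple to sketch: record a baseline fingerprint of files you expect to stay stable, then re-check later and report anything that changed, vanished, or appeared. The Python below is a hedged illustration of that general pattern, not iVerify's actual implementation; the watched file and the contents are invented for the demo:

```python
import hashlib
import tempfile
from pathlib import Path

def fingerprint(paths):
    """Map each existing file to a SHA-256 digest of its contents."""
    return {
        str(p): hashlib.sha256(Path(p).read_bytes()).hexdigest()
        for p in paths
        if Path(p).is_file()
    }

def tripped_wires(baseline, current):
    """Report files that changed, vanished, or newly appeared
    since the baseline fingerprint was taken."""
    alerts = []
    for path, digest in baseline.items():
        if path not in current:
            alerts.append(f"missing: {path}")
        elif current[path] != digest:
            alerts.append(f"modified: {path}")
    for path in current:
        if path not in baseline:
            alerts.append(f"new file: {path}")
    return alerts

# Demo with a temporary file standing in for a watched system file.
with tempfile.TemporaryDirectory() as d:
    watched = Path(d) / "config.plist"
    watched.write_text("original contents")
    baseline = fingerprint([watched])
    watched.write_text("tampered contents")  # simulate an unexplained change
    print(tripped_wires(baseline, fingerprint([watched])))  # reports the file as modified
```

Note what the sketch shares with the real app's limitation: it can only say that something changed, not what changed it, which is why such clues point to a problem rather than prove one.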
Despite these difficulties, Storz says, modern computers are converging on the lockdown philosophy—and he thinks the trade-off is worth it. “As we lock these things down, you reduce the damage of malware and spying,” he says.
This approach is spreading far beyond the iPhone. In a recent briefing with journalists, an Apple spokesperson described how the company's Mac computers are increasingly adopting the iPhone's security philosophy: its newest laptops and desktops run on custom-built M1 chips that make them more powerful and secure, in part by locking down the computer in the same way mobile devices are.
“iOS is incredibly secure. Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction,” says security researcher Patrick Wardle.