Security Researchers Aren’t Inclined to Participate in Apple’s Bug Bounty Program; Here’s Why

BY Chandraveer Mathur

Published 9 Sep 2021


For five years now, Apple has lured ethical hackers with a bounty of up to $1 million to point out critical security flaws. However, many familiar with Apple’s program claim the company is slow to fix reported bugs and doesn’t always pay hackers.

Bug bounty programs have become a preferred way to find vulnerabilities in the tech community. They encourage hackers to report problems instead of abusing them. However, Apple’s “insular culture” has reportedly damaged the program, creating a security blind spot.

Luta Security helped the Department of Defense set up its bug bounty program. The company's founder and CEO, Katie Moussouris, claims that in Apple's bug bounty program, "the house always wins." She argued that consumers will ultimately pay the price for Apple's poor reputation in the security industry in the form of insecure products.

Apple's bug bounty program launched in 2016, and by 2019 the company had opened the initiative to all researchers. One former and one current Apple employee said that Apple has a massive backlog of bugs waiting to be patched. Moussouris pointed out that companies should have healthy internal bug-fixing mechanisms in place before inviting outside researchers to report vulnerabilities and scaling operations. "What do you expect is going to happen if they report a bug that you already knew about but haven't fixed? Or if they report something that takes you 500 days to fix?" she said.

Apparently, Delayed Bug Fixes Aren’t the Only Issue

On the financial front, researchers say Apple's payout system also has problems. Case in point: the program pays up to $100,000 for vulnerabilities that allow attackers to gain "unauthorized access to sensitive data."

Earlier this year, researcher Cedric Owens found one such vulnerability, which could have allowed bad actors to bypass the Mac's security and install malicious software. He shared his findings with Apple, and the company fixed the bug. However, Owens was paid only $5,000, five percent of what he believed he deserved. Other researchers concur that the vulnerability could have allowed access to "sensitive data."

Omaha-based security researcher Sam Curry teamed up with friends and submitted a new bug report to Apple every couple of days. Apple paid $50,000 for one of the bugs. In total, the group earned approximately $500,000. Curry noted that Apple takes longer than the rest of the industry to pay researchers for bug bounties. Curry believes this is because Apple is well aware of its poor reputation in the tight-knit security research community.

At least one iOS engineer, Tian Zhang, went on the record to say that Apple ignored his bug reports and didn't pay him for discovering a vulnerability, even though the company went ahead and fixed the bug he reported. He remarked that, as an engineer, one wants to ensure the safety of products built for other people. "On the other hand, it seems like Apple thinks people reporting bugs are annoying and they want to discourage people from doing so," he continued.

Apple's Culture of Secrecy Is at Odds with the Transparency Ethical Hackers Stand For

Jay Kaplan, the founder of crowdsourced security research company Synack, claims that Apple was forced to launch the bug bounty program and embrace the public security researcher culture. He noted that thanks to Apple's aforementioned poor reputation, researchers aren't inclined to report bugs to the company. Instead, they're "going to security conferences and speaking about it publicly and selling it on the black market (sic)," said Kaplan.

Exploits for Apple platforms can fetch approximately $2 million on grey and black markets, just shy of the $2.5 million for equivalent Android vulnerabilities.

Apple Remains Adamant, Terms Bug Bounty Program ‘Runaway Success’

Ivan Krstic, Apple's head of security engineering, categorically labeled the bounty program a "runaway success." However, when asked why a researcher wasn't paid for a flaw he discovered, Krstic said, "when we make mistakes, we work hard to correct them quickly, and learn from them to rapidly improve the program."

[Via The Washington Post]