Starting in September, Apple will launch a bug bounty program that pays outside security researchers to find vulnerabilities in iOS and iCloud, the company announced today.
In a presentation at the Black Hat cybersecurity conference in Las Vegas, Apple Head of Security Engineering and Architecture Ivan Krstić announced the program, which is the first of its kind for the company.
The bounty program will initially comprise a preselected group of several dozen researchers and will pay anywhere from $25,000, for vulnerabilities that let a sandboxed process access user data outside its protective "sandbox," to $200,000, for vulnerabilities in iOS devices' most fundamental code. The latter prize, among the biggest in the industry, reflects the seriousness of an exploit that could give a hacker total access to a locked iPhone.
“If you sell those to a government, those are million dollar bugs,” said Rich Mogull, an analyst at Securosis.
A market for Apple software vulnerabilities has existed among private security companies (and likely government agencies) for some time. In 2015, security startup Zerodium agreed to pay a seven-figure sum to hackers for an iOS exploit. And the FBI implied earlier this year that it paid $1 million to an outside firm to gain access to the phone of San Bernardino shooter Syed Farook.
Now Apple wants to get hackers working on its behalf. Bug bounty programs, which incentivize hackers to improve software security rather than undermine it, date back at least two decades to Netscape, though they have become far more popular in recent years. Some of tech's biggest companies, including Facebook, Google, and Microsoft, have their own bounties. And a range of startups now exists to set up bug bounty programs, of which there are hundreds. Apple, which is known for tightly controlling its software, had been a notable bug bounty holdout until now.
Though many bug bounty programs are open to the public — meaning anyone can hunt down and submit security flaws — invitation-only programs such as Apple's have grown more common in recent years, according to Bugcrowd's 2016 "The State of Bug Bounty" report. (Google, Facebook, and Microsoft all run public programs.) Apple did not disclose which security researchers it has picked to participate in its bounty program.
According to Ben Bajarin, principal analyst at the market research firm Creative Strategies, Apple has worked with outside security firms in the past. Going public, Bajarin said, is a way to formalize those relationships and expand the program, perhaps eventually beyond the group of preselected researchers Apple is starting off with.
“The goal here is to broaden this and let more people in,” Bajarin said. “I think they’re just slightly controlling it to begin with.”
Indeed, bug bounties can be a smart way for companies to improve relationships with hackers and the security research community in general. Good bug bounty programs pay fairly and account for the difficulty of hacks. The companies behind them are quick to fix discovered flaws, and they also engender goodwill through features like leaderboards and halls of fame for the top hackers. Bad programs pay poorly, and patches don't appear for months.
Apple, which earlier this year fought a much-publicized battle with the FBI over access to the San Bernardino shooter's iPhone, is reportedly working to make iOS devices so secure that even the company can't crack them. In this context, the introduction of a bug bounty program can be seen as an indication that vulnerabilities are becoming increasingly difficult for Apple's internal security teams to find on their own.
According to Mogull, the Apple bug bounty requires so-called "exploitable proofs of concept" — not just annoying bugs, but actual vulnerabilities that could be used in the real world.
“Apple produces the most secure consumer devices on the market,” he said. “The specific bugs they are looking for are not easy to find.”