The Expert's View with Jeremy Kirk

Governance & Risk Management, Insider Threat, IT Risk Management

Is Apple's Top $1 Million Bug Bounty Too Much?

Why One Bug-Hunting Expert Says Big Bounties May Actually Undermine Security Programs
Luta Security CEO Katie Moussouris speaks at Gartner's Security and Risk Management Summit in Sydney. (Photo: Gartner)

Progressive companies seeking to improve their security are increasingly adopting bug bounty programs. The theory is that rewarding outside researchers improves security outcomes. But in practice, bug bounty programs can be messy and actually create perverse incentives, says bug-hunting expert Katie Moussouris.


I caught up with Moussouris on the sidelines of Gartner's Security and Risk Management Summit in Sydney earlier this week. During her keynote, Moussouris addressed Apple's recent announcement that it would dramatically increase the rewards it pays for certain kinds of vulnerabilities. The top reward now is $1 million for a remote, persistent iOS attack (see: Apple Expands Bug Bounty; Raises Max Reward to $1 Million).

She contends that such a large sum of cash can negatively influence the human dynamics behind bug bounties. Paying money for vulnerabilities may seem straightforward, but it's anything but: bounties involve complicated human psychology, and getting the balance right is difficult.

"If Apple's stated goal was to compete with the offense market, it failed," Moussouris tells me. "If the stated goal is to attract new security researchers - well, that might work in the short term, but why would they join you if they just got a million dollars?"

Priced Out

Moussouris has unique experience with bug bounties: She launched Microsoft's bug bounty program in 2013 as well as a bounty program for the U.S. Department of Defense. She also has worked as a penetration tester and has been a co-editor of ISO standards on vulnerability management and disclosure. She founded and is the CEO of Luta Security, which consults on bug bounty and vulnerability management approaches.

Apple's previous bounty for a remote iOS compromise was $200,000. Increasing that to $1 million put Apple's bounty closer to those offered by vulnerability brokers such as Zerodium, which buy and then resell some of the most dangerous kinds of software flaws.

Moussouris says offensive vulnerability brokers will always increase their prices to stay ahead of the defensive market. In fact, Zerodium in January increased its payment for a remote iOS jailbreak to $2 million.

And there's little chance of competing with nation-states that have black budgets. "They just add another zero (to their payout)," she says.

Zerodium's chart of payouts for certain classes of vulnerabilities (Source: Zerodium)

The downside of high bug bounties is the potential to demoralize employees. Moussouris says she asked Apple how it planned to handle its internal security team in light of such a large reward being offered to outsiders. "They didn't have a great answer," she tells me.

In fact, when Apple introduced its $200,000 bounty around three years ago, she says, a security engineer approached a security manager to point out that they'd found four bugs that would each have qualified for $200,000 had the engineer not been an employee.

The engineer didn't expect an $800,000 bonus, of course, but wanted some sort of incentive. The manager responded: "Hey, that's what I pay you for," Moussouris says. The engineer later quit.

Warding Off Collusion

There's also an increased risk of collusion. With such high outside payouts on offer, insiders may be tempted to withhold a bug and instead work with someone outside the company to collect the reward, she says.

The same risk applies to consultants who triage bugs - the messy process of deciding which bugs should take priority in patching cycles, she says. Generally, those consultants, who usually are also bug hunters, are not supposed to hunt for bugs in programs they are triaging. "But you can absolutely tip off your buddies, and that happens all the time," she adds.

Even more potentially profitable is when a triage worker gets tipped off to a new exploitation technique that can be applied across other organizations' systems. Someone could replay reports of the technique to other organizations with bug bounty programs, essentially "just generating a lot of cash for a lot of security theater," she says.

Those who triage bugs may be especially susceptible to such side scheming. Bug triage is a thankless job. Moussouris humorously notes in her presentation that Popular Science once ranked "Microsoft security grunt" - which includes bug triage duties - as one of the 10 worst jobs in science, worse than whale feces researcher but better than elephant vasectomist.

When she was at Microsoft, the company tried to blunt the temptation to collude by keeping outside bounties from becoming excessive and by awarding some cash and stock to employees who found significant bugs.

"When you creep up those prices that compete with what reasonable people will do for money, that is where you see these problems," Moussouris says. "There's a logical limit above which the defense market cannot rise, or you will end up shanking your own hiring pipeline and creating these perverse incentives."



About the Author

Jeremy Kirk

Executive Editor, Security and Technology, ISMG

Kirk was executive editor for security and technology for Information Security Media Group. Reporting from Sydney, Australia, he created "The Ransomware Files" podcast, which tells the harrowing stories of IT pros who have fought back against ransomware.



