Bug Bounties and the Cobra Effect

Are bug bounty programs allowing software companies to skirt their responsibility to make better, more secure products from the get-go?

Oleg Brodt, R&D Director of Deutsche Telekom Innovation Labs, Israel, and Chief Innovation Officer for Cyber@Ben-Gurion University

May 26, 2021

5 Min Read

During British rule in India in the second half of the 19th century, the colonial government grew concerned about the abundance of venomous cobras in Delhi. To mitigate the threat, it began offering a bounty for every dead cobra that hunters turned in.

As expected, the bounty program was a success, and dead cobras started pouring in. Soon enough, more volunteers joined the effort, and ridding Delhi of its deadly snakes seemed only a matter of time. Nevertheless, as months went by, the pace of beheaded-snake deliveries did not decline. Contrary to what the government anticipated, it increased.

Authorities started to investigate. To their surprise, Delhi was full of cobra breeding farms. Once the street snake population had started to decline, it became much harder for hunters to earn their bounty rewards. So locals set up snake breeding farms, giving themselves an unlimited supply of dead cobras and a steady stream of reward money.

Once the authorities realized their original intentions had backfired, they shut the program down in disappointment. But the breeders, now stuck with worthless cobras, set the snakes free, yielding an even larger population of venomous animals wandering the streets. In other words, the program contributed to an increase, rather than a decrease, in the number of wild snakes, at taxpayers' expense. The phenomenon of achieving the opposite of an intended result was consequently dubbed the "cobra effect."

What the Cobra Effect Has to Do With Bug Bounties
This anecdote holds a valuable lesson for cybersecurity bug bounty programs. Most of them offer monetary compensation, corporate swag, and leaderboard "glory" to bug hunters who disclose cybersecurity vulnerabilities. The intentions are good, but an entire ecosystem has grown up around bug bounty hunting: specialized courses, trainings, books, conferences, and program management companies. Bug hunting has become an industry of its own, almost none of which existed a decade ago, with a growing army of bug hunters.

Unlike cobras, vulnerabilities cannot be "bred." Therefore, at least theoretically, the bug hunters should at some point drain the swamp of vulnerabilities. In practice, however, it is rather convenient for software vendors to transfer responsibility for eliminating vulnerabilities in their products to bug hunters, who cost far less than dedicated in-house security personnel.

You see, there are two main ways to tackle software vulnerabilities. You can prevent them with secure-by-design development, code testing, static and dynamic analysis, and fuzzing, or you can detect them after the code is in production. Hopefully, you are doing both. However, while secure-by-design coding entails hiring experienced security personnel, properly training developers, and delaying release cycles until security testing is finished, bug hunting is much cheaper. Vendors can simply delegate their bug-finding responsibilities to an army of freelancers and, instead of paying salaries, pay for success.
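To make the prevention side concrete, here is a minimal fuzz-harness sketch in Python, assuming Google's Atheris fuzzer is installed (pip install atheris); parse_header is a hypothetical stand-in for any code that consumes untrusted input, and the point is the harness structure, not the parser itself:

```python
# A minimal fuzz-harness sketch using Atheris, Google's coverage-guided
# fuzzer for Python. parse_header is a hypothetical example of code that
# consumes untrusted input.
import sys

import atheris


@atheris.instrument_func  # collect coverage feedback for this function
def parse_header(data: bytes) -> dict:
    # Hypothetical parser: "key:value" pairs separated by semicolons.
    result = {}
    for pair in data.decode("utf-8", errors="ignore").split(";"):
        if ":" in pair:
            key, value = pair.split(":", 1)
            result[key.strip()] = value.strip()
    return result


def test_one_input(data: bytes) -> None:
    # Atheris calls this with mutated inputs; any uncaught exception
    # is reported as a finding, long before a bug hunter sees the code.
    parse_header(data)


if __name__ == "__main__":
    atheris.Setup(sys.argv, test_one_input)
    atheris.Fuzz()  # runs until stopped or a crash is found
```

A harness like this runs in-house, before release, which is exactly the kind of up-front investment that outsourcing to bug hunters lets vendors avoid.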

In other words, in a world without bug hunters, companies would have to invest more in better coding practices and bear more responsibility for their products' security before those products hit the market. The cobra effect of cybersecurity bug bounty programs is that they allow vendors to shirk their responsibility to make better, more secure products from the get-go and to solve the problem at its source, or at least try.

Carrots or Sticks?
Indeed, there is room for bug bounty programs. However, as time goes by, it has become clear that bug bounties are not a magical solution to a difficult problem. We must make sure we set the right incentives for them to work as intended.

We must give carrots to companies that take security seriously by providing them with legal "safe harbor" provisions that protect them from cybersecurity-related civil litigation. If they work to protect society, society should repay them in kind. To be eligible for legal protection, they must, at a minimum, demonstrate that all their software developers are properly trained to write secure code; that their code complies with secure development standards; that they bake security into their products from the start; and that they take their vulnerability disclosure policy seriously by fixing reported bugs within a predefined time frame. Such disclosures can, of course, come from bug hunters, but bug bounties should be a small part of the overall solution.
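As an illustration of that last requirement, one common way to publish a disclosure channel is a security.txt file (RFC 9116) served at /.well-known/security.txt; the contact details and URLs below are placeholders:

```
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59Z
Policy: https://example.com/vulnerability-disclosure-policy
Preferred-Languages: en
```

A published policy with a committed fix time frame is the kind of verifiable behavior a safe-harbor regime could check.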

Conversely, companies that don't take society's security seriously must get the stick. They should not be eligible for any legal protection if they are sued over lousy security. Affording whistleblower-style protection to security researchers is also important, since some companies respond to researchers who find vulnerabilities in their products by threatening legal action.

Let's try to solve the vulnerability problem at its source. The result will never be perfect, but we should try nonetheless. In most developed countries, clean water arrives at our homes, already sanitized and filtered by water companies and authorities. The situation is quite different in developing countries, where people who are lucky enough to get water at home must filter it themselves. For society as a whole, central filtration is cheaper, more effective, and less time-consuming. The same is true of cybersecurity. Let's shift the focus to cleaning out vulnerabilities at the source, rather than shipping insecure software and counting on bug hunters to solve the problem.

Indeed, we have made progress since the days of Charlie Miller and the "no more free bugs" movement, and bug bounty programs strengthened cybersecurity for a while. But it's time to rethink the balance they strike and make sure we are on the right path. After all, our goal is to make security better, not to create more cobra effects.

About the Author

Oleg Brodt

R&D Director of Deutsche Telekom Innovation Labs, Israel, and Chief Innovation Officer for Cyber@Ben-Gurion University

Oleg Brodt serves as the R&D Director of Deutsche Telekom Innovation Labs, Israel. He also serves as the Chief Innovation Officer for Cyber@Ben-Gurion University, an umbrella organization responsible for cybersecurity-related research at Ben Gurion University, Israel. Prior to joining DT Labs and Cyber@BGU, Oleg was an attorney specializing in technology and high tech and represented a broad spectrum of local and international clients.

Oleg is a veteran of an elite technological unit of the Israel Defense Forces and the author of several cybersecurity patents and research papers. In addition to CISSP, CCNP, Linux LFCA, and other technology certifications, Oleg holds bachelor's and master's degrees in international business law as well as a degree in business and management from the Interdisciplinary Center, Herzliya, Israel. Oleg serves as a member of the Israeli National Committee on Artificial Intelligence, Ethics, and Law, and is a member of the Israel Bar's High-Tech committee.

