What Drives A Developer To Use Security Tools -- Or Not

National Science Foundation (NSF)-funded research by Microsoft Research, NC State, and UNC-Charlotte sheds light on what really makes a software developer scan his or her code for security bugs.


Software developers are most likely to run security vulnerability scans of their code if their peers are doing so.

The infamous cultural gap between software developers and cybersecurity may be as much about mindset and psychology as about technology when it comes to the use of security tools, a team of computer science and psychology researchers from Microsoft Research, North Carolina State University (NC State), and the University of North Carolina-Charlotte has found.

"The power of seeing someone you know using the tool was the most substantial way it predicted their likelihood of using" it, says Emerson Murphy-Hill, the lead researcher for the National Science Foundation (NSF)-funded project, and an associate professor of computer science at NC State University. That's because seeing a tool in a real-world setting--especially in use by someone you work with--is a more effective "testimonial" than an outside recommendation, he says.

Another major factor in security tool adoption is corporate culture: whether managers encourage developers to employ security tools, according to the research. "Developers are fairly independent people. I figured management wouldn't have that big of a role to play" in determining their use of the tools, Murphy-Hill says. "[But] What management says made a difference: developers actually paid attention to whether management encourages the use of tools."

And interestingly, developers who said they worked on products in which security was important were not much more likely to use security tools than other programmers, the researchers found in a survey of developers from 14 companies and five mailing lists. They came up with nearly 40 predictors of security tool use, which they detail in an academic paper they will present next week at the Symposium on the Foundations of Software Engineering in Bergamo, Italy. They also will present two other related papers: "Questions Developers Ask While Diagnosing Potential Security Vulnerabilities with Static Analysis" and "Bespoke Tools: Adapted to the Concepts Developers Know."

The second study (and paper) looked at whether security tools provide the information developers really need to determine whether there's a legitimate problem in the code and, if so, how to fix it. They armed 10 developers--novice and experienced--with an open-source static-analysis security tool called Find Security Bugs to scan for bugs in a vulnerability-ridden open-source program.

The programmers found that the tool offered multiple resolution options, but not sufficient contextual information about the pros and cons of each fix. "We found that this made it difficult for programmers to select the best course of action," Murphy-Hill says. The tool also failed to connect the dots between notifications that were related to the same problem, for instance, which caused more confusion.
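To make that concrete, here is a hypothetical Java sketch, not taken from the study, of the kind of warning a static analyzer such as Find Security Bugs raises and the competing fixes a developer is then left to weigh on their own; the class and method names are invented for illustration.

```java
// Hypothetical example (not from the study) of an issue a static analyzer
// commonly flags: SQL built by concatenating user-controlled input.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class UserLookup {

    // Flagged: user input concatenated into a SQL string
    // (typically reported as potential SQL injection).
    public ResultSet findUserUnsafe(Connection conn, String name) throws SQLException {
        Statement stmt = conn.createStatement();
        return stmt.executeQuery("SELECT * FROM users WHERE name = '" + name + "'");
    }

    // Fix option 1: a parameterized query, which removes the injection risk
    // but may require broader refactoring if queries are assembled dynamically.
    public ResultSet findUserParameterized(Connection conn, String name) throws SQLException {
        PreparedStatement ps = conn.prepareStatement("SELECT * FROM users WHERE name = ?");
        ps.setString(1, name);
        return ps.executeQuery();
    }

    // Fix option 2: reject suspicious input before querying, which is easy to
    // bolt on but weaker on its own; the tool's warning rarely spells out such
    // trade-offs, which is the gap the researchers describe.
    public ResultSet findUserValidated(Connection conn, String name) throws SQLException {
        if (!name.matches("[A-Za-z0-9_]+")) {
            throw new IllegalArgumentException("invalid user name");
        }
        return findUserParameterized(conn, name);
    }
}
```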

"A lot of research has been about the technical stuff, how we can add more power, do more to make it more sophisticated, how to find more and more bugs," Murphy-Hill says. "But there's the other side of it, too: some [developer] has to deal with the output of those tools. What's making the difference for them? How do they make a choice of whether to do something with the vulnerabilities, ignore the tool, or spend more time on code … Where do they spend their time and how do we make these tools better."

He acknowledges that a more polished commercial tool might have resulted in different experiences for the developers in the research, but the study wasn't focused on any individual tool. "When you're looking at a vulnerability with tainted data potential, you have to figure out where the data came from," he says.
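As an illustration of what that tracing involves, the following hypothetical Java sketch (again invented, not drawn from the study) shows a tainted value flowing from a request parameter through an intermediate method to the query the analyzer flags.

```java
// Hypothetical sketch of a tainted-data flow: the value flagged at the query
// is attacker-controlled input that entered the program two methods earlier,
// so judging the warning means tracing the whole path back to its source.
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class ReportService {

    // Source: the value arrives from an HTTP request parameter.
    public ResultSet handleRequest(Connection conn, String requestParam) throws SQLException {
        String normalized = normalize(requestParam);
        return runQuery(conn, normalized);
    }

    // Intermediate step: trimming does not make the value safe, but it adds a
    // hop the developer has to walk through when reading the warning.
    private String normalize(String value) {
        return value == null ? "" : value.trim();
    }

    // Sink: the warning points here, at the concatenated SQL, not at the
    // request parameter where the tainted data actually originated.
    private ResultSet runQuery(Connection conn, String filter) throws SQLException {
        Statement stmt = conn.createStatement();
        return stmt.executeQuery("SELECT * FROM reports WHERE owner = '" + filter + "'");
    }
}
```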

Plus, one security tool may be better for one user than another, he says.

That's where the so-called "bespoke" security tools come in for developers: "Tools that consider the programmer as a person are more valuable," he says. The researchers are developing prototype tools that automatically learn and adapt to the programmer's expertise and interest, and they hope that will inspire tool vendors. 

About the Author

Kelly Jackson Higgins, Editor-in-Chief, Dark Reading

Kelly Jackson Higgins is the Editor-in-Chief of Dark Reading. She is an award-winning veteran technology and business journalist with more than two decades of experience in reporting and editing for various publications, including Network Computing, Secure Enterprise Magazine, Virginia Business magazine, and other major media properties. Jackson Higgins was recently selected as one of the Top 10 Cybersecurity Journalists in the US, and named as one of Folio's 2019 Top Women in Media. She began her career as a sports writer in the Washington, DC metropolitan area, and earned her BA at William & Mary. Follow her on Twitter @kjhiggins.
