Rethinking How You Work With Detection and Response Metrics

Airbnb's Allyn Stott introduces a maturity model, inspired by the Hunting Maturity Model (HMM), that complements MITRE ATT&CK and aims to improve security metrics analysis.

Image: Dart on a target, pinning a piece of paper reading "METRICS." Source: Dzmitry Skazau via Alamy Stock Photo

Sorting the false positives from the true positives: Ask any security operations center professional, and they'll tell you it's one of the most challenging aspects of developing a detection and response program.

As the volume of threats continues to rise, an effective approach to measuring and analyzing this kind of performance data has become more critical to an organization's detection and response program. On Friday at the Black Hat Asia conference in Singapore, Allyn Stott, senior staff engineer at Airbnb, encouraged security professionals to reconsider how they use such metrics in their detection and response programs, a topic he broached at last year's Black Hat Europe.

"At the end of that talk, a lot of the feedback I received was, 'This is great, but we really want to know how we can get better at metrics,'" Stott tells Dark Reading. "That's an area where I've seen a lot of struggles."

The Importance of Metrics

Metrics are critical in assessing the effectiveness of a detection and response program because they drive improvement, reduce the impact of threats, and validate investment by demonstrating how the program lowers risk to the business, Stott says.

"Metrics help us communicate what we do and why people should care," Stott says. "That's especially important in detection and response because it's very difficult to understand from a business perspective.”

The most critical area for delivering effective metrics is alert volume: "Every security operations center I've ever worked in or ever walked foot in, it's their primary metric," Stott says.

Knowing how many alerts are coming in is important but, by itself, is still not enough, he adds.

"The question is always, 'How many alerts are we seeing?'" Stott says. "And that doesn't tell you anything. I mean, it tells you how many alerts the organization receives. But it doesn't actually tell you if your detection and response program is catching more things."

Leveraging metrics well can be complex and labor-intensive, which adds to the challenge of measuring threat data, Stott says. He acknowledges that he has made his share of mistakes when it comes to engineering metrics to assess the effectiveness of security operations.

As an engineer, Stott routinely evaluates the effectiveness of the searches he conducts and the tools he uses, seeking to get accurate true- and false-positive rates for detected threats. The challenge for him and most security professionals is connecting that information to the business.
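As a rough illustration of that kind of measurement, the sketch below computes per-detection true- and false-positive counts from triage outcomes and converts the false positives into analyst hours, one possible way to express the cost in business terms. The rule names, outcome labels, and time estimate are assumptions, not figures from Stott's program.

```python
from collections import defaultdict

# Hypothetical triage results keyed by detection rule; labels are illustrative.
outcomes = [
    ("suspicious_powershell", True), ("suspicious_powershell", False),
    ("suspicious_powershell", True), ("impossible_travel", False),
    ("impossible_travel", False), ("impossible_travel", True),
]

MINUTES_PER_FALSE_POSITIVE = 20  # assumed average triage time per benign alert

stats = defaultdict(lambda: {"tp": 0, "fp": 0})
for rule, is_true_positive in outcomes:
    stats[rule]["tp" if is_true_positive else "fp"] += 1

for rule, s in stats.items():
    total = s["tp"] + s["fp"]
    precision = s["tp"] / total
    wasted_hours = s["fp"] * MINUTES_PER_FALSE_POSITIVE / 60
    print(f"{rule}: precision {precision:.0%}, "
          f"~{wasted_hours:.1f} analyst hours spent on false positives")
```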

Implementing Frameworks Properly Is Critical 

One of his biggest mistakes, he says, was focusing too much on the MITRE ATT&CK framework. While Stott believes it provides critical detail on threat actors' techniques and activities, and that organizations should use it, that doesn't mean they should apply it to everything.

"Every technique can have 10, 15, 20, or 100 different variations," he says. "And so having 100% coverage is kind of a crazy endeavor."

Besides MITRE ATT&CK, Stott is proposing a new maturity model that he developed and calls the Threat Detection and Response (TDR_ Maturity Model. The TDR model was inspired by the the SANS Institute's Hunting Maturity Model (HMM), which helps describe an organization's existing threat-hunting capability and provides a blueprint for improving it.

"Rather than test across all of the MITRE ATT&CK framework, you're actually working on a prioritized list of techniques, which includes using MITRE ATT&CK as a tool," he says. "That way, you're not just looking at your threat intel but also at security incidents and threats that would be critical risks for the organization."

"It gives you the ability to, as a metric, say where you're at as far as your maturity today and how the investments you're planning to make or the projects you're planning to do will increase your maturity," Stott says.

Stott also developed the Streamlined, Awareness, Vigilance, Exploration, and Readiness (SAVER) framework, which aims to help detection and response practitioners build better metrics.

Using these guidelines for metrics requires buy-in from CISOs, since it means getting the organization to adhere to these maturity models. In practice, though, adoption tends to be driven from the bottom up, with threat intelligence engineers as the early drivers.

About the Author

Jeffrey Schwartz, Contributing Writer

Jeffrey Schwartz is a journalist who has covered information security and all forms of business and enterprise IT, including client computing, data center and cloud infrastructure, and application development for more than 30 years. Jeff is a regular contributor to Channel Futures. Previously, he was editor-in-chief of Redmond magazine and contributed to its sister titles Redmond Channel Partner, Application Development Trends, and Virtualization Review. Earlier, he held editorial roles with CommunicationsWeek, InternetWeek, and VARBusiness. Jeff is based in the New York City suburb of Long Island.
