5 Considerations For Post-Breach Security Analytics
Preparing collection mechanisms ahead of time, preserving chain of custody on forensics data, and performing focused analysis are all key to inspecting security data after a compromise
November 19, 2013
Some of the most important security analytics tasks that organizations perform are done under the pressure of a running clock and to exacting standards for how data is preserved and manipulated. Unlike day-to-day log analysis, post-breach inspection of security data requires special considerations in the collection and handling of information following a compromise.
1. Collecting Relevant Data
The ticking clock in a compromise situation is one of the most crucial factors to keep in mind when conducting analytics on forensic data, for two reasons. First, investigators need to figure out what went wrong in order to stop active compromise situations and prevent further damage from occurring. Second, minimizing the breach notification window while still providing ample public information is crucial from a regulatory, legal, and PR perspective.
"When a breach has been detected, it's really important to have instant visibility from multiple viewpoints because you need to actually understand the breach, scope out the damage, and remediate," says Lucas Zaichkowsky, enterprise defense architect for AccessData.
The types of data that can come into play in a forensic analysis include log files from multiple sources; information on affected endpoints, such as structured file data or data in memory; and volatile data, such as open network connections or running processes on systems, says J.J. Thompson, managing partner at Rook Consulting.
"You're going to want to collect anything that is in scope for the incident, so you're going to want to make sure you collect all of the system logs, database logs, and network logs that you can possibly get your hands on," he says, "and make sure that those are accessible and available for future analytics. That's step one."
Depending on where an initial log review leads the incident response team, that is where deeper collection of host log data will occur. This is in contrast to standard security operations analytics, where host data collection happens "significantly less frequently," Thompson says.
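To make the idea concrete, the sketch below shows roughly what a first-pass snapshot of volatile host data might look like, using the third-party psutil library (an assumption made for this illustration, not a tool the investigators quoted here endorse); real responders would rely on forensically sound, purpose-built collectors.

# A minimal sketch of volatile-data collection on a live host, assuming the
# psutil library is installed. It captures the kinds of data the article
# mentions: running processes and open network connections.
import json
import time

import psutil  # hypothetical library choice for this illustration


def snapshot_volatile_state(out_path="volatile_snapshot.json"):
    """Capture running processes and open network connections to a JSON file."""
    processes = []
    for proc in psutil.process_iter(["pid", "ppid", "name", "exe", "cmdline", "create_time"]):
        try:
            processes.append(proc.info)
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue  # process exited or is off-limits; skip rather than fail

    connections = []
    for conn in psutil.net_connections(kind="inet"):
        connections.append({
            "pid": conn.pid,
            "status": conn.status,
            "local": f"{conn.laddr.ip}:{conn.laddr.port}" if conn.laddr else None,
            "remote": f"{conn.raddr.ip}:{conn.raddr.port}" if conn.raddr else None,
        })

    snapshot = {
        "collected_at": time.time(),
        "processes": processes,
        "connections": connections,
    }
    with open(out_path, "w") as fh:
        json.dump(snapshot, fh, indent=2, default=str)
    return out_path


if __name__ == "__main__":
    print("Snapshot written to", snapshot_volatile_state())

The point of the sketch is the ordering: volatile state is captured and written out before anything on the host is changed, so it remains available for later analysis.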
[How do you know if you've been breached? See Top 15 Indicators of Compromise.]
2. Make Data Collection A Possibility
Unfortunately, many organizations struggle to gain timely visibility into security data because they did not put enough data collection mechanisms in place ahead of the incident to offer that immediate lens into what happened within the infrastructure impacted by a breach.
"A lot of time, people will find out what they need to collect once they see the indicators of compromise and realize that collecting that information from then on is kind of a moot point," says Chris Novak, global managing principal of investigative response for Verizon Enterprise Solutions, who recommends that organizations test themselves with mock incidents and walk through a collection scenario before their hair is on fire. "A mock incident is a way to really have those teachable moments as to what exactly it is that you need to be prepared for."
In addition to shortfalls in data collection mechanisms, the mock incident could uncover a frequently lacking piece of foundational information: namely, an up-to-date network diagram. Novak says he is frequently surprised by how many organizations might have a fully detailed rendering of the physical building a data center is hosted in while lacking a network map counterpart.
3. Preserve Data For Longer Than You Think You'll Need It
As organizations think about what types of data to routinely collect, they should also be mindful of retaining it long enough, as a precautionary measure, to allow a backward look at the data that reaches all the way to the initial compromise. According to Zaichkowsky, the longest span he has witnessed between the initial discovery of a compromise and the forensic trail back to the initial infiltration of "victim zero" was 456 days.
"That's a long attack life cycle that they need to be able to reconstruct what happened," he says.
As a rule of thumb, Zaichkowsky recommends organizations retain at least a year's worth of relevant log data, with three months' worth online and ready to search at a moment's notice.
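As a rough illustration of that tiered rule of thumb, the sketch below ages log files out of a hypothetical "hot" directory after 90 days and purges them entirely after a year. The paths and thresholds are assumptions for the example; in practice, the log management platform itself would enforce the policy.

# A minimal sketch of a tiered retention policy: roughly a year retained,
# three months kept "hot" and searchable. Directory paths are hypothetical.
import shutil
import time
from pathlib import Path

HOT_DAYS = 90        # keep online and ready to search
RETAIN_DAYS = 365    # keep at all (archive tier)

HOT_DIR = Path("/var/log/collected")      # hypothetical hot storage
ARCHIVE_DIR = Path("/mnt/archive/logs")   # hypothetical cold storage


def apply_retention(now=None):
    now = now or time.time()
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)

    for log_file in HOT_DIR.glob("*.log"):
        age_days = (now - log_file.stat().st_mtime) / 86400
        if age_days > HOT_DAYS:
            # Move out of hot storage but keep for the full retention window.
            shutil.move(str(log_file), str(ARCHIVE_DIR / log_file.name))

    for archived in ARCHIVE_DIR.glob("*.log"):
        age_days = (now - archived.stat().st_mtime) / 86400
        if age_days > RETAIN_DAYS:
            # Once a breach is declared, retention on in-scope data should be
            # extended rather than letting this purge run (see below).
            archived.unlink()


if __name__ == "__main__":
    apply_retention()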
In addition to this precautionary groundwork, once a breach has been discovered, the retention windows on in-scope data should lengthen considerably. After an investigation is complete, organizations should secure and archive the collected data in case it is needed down the road, whether for legal purposes or because the compromise went deeper than initially thought.
"A lot of times companies will go through the process, remediate, and then when they find three months later the attack was resumed, they realize the attacker is still in the system, but all of the relevant data was deleted after the investigation," he says.
4. Establish A Chain Of Custody
As Zaichkowsky noted, forensic analytics leads to inspection of data that is rarely looked into on a day-to-day basis. As an investigation team digs into a collection of volatile and legally sensitive data, it must think not only about preserving data in a way that enables swift mitigation of risk, but also about preserving evidence in a legally admissible way.
"Things typically start with the preservation of the evidence: not powering off systems so we can collect volatile data and maintaining a proper chain of custody," Novak says.
Establishing chain of custody is an imperative for cases where litigation or legal proceedings could occur. The key is being able to document how the data was obtained, by whom, and when, and to maintain the integrity of the data to prove it was never tampered with during the investigation process, Thompson says.
"It's really about making sure that you can show counsel that this evidence was obtained using forensically sound mechanisms, it was not altered, and you have that evidence available for opposing counsels, advisers, consultants, and experts to analyze it there themselves and see if they come to the same conclusions," he says.
Typically, the best practice is to pull the entire binary or data set in full, duplicate it, and keep a hash of the original prior to running analytics on the working copy in order to show the data hasn't been altered in any way, Zaichkowsky says.
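A minimal sketch of that practice might look like the following: hash the original evidence, duplicate it, work only on the copy, and record everything in a custody log. The file names and the log format here are hypothetical; real cases rely on dedicated forensic imaging tools.

# Sketch of "hash before you touch it": fingerprint the original, analyze
# only a duplicate, and keep an auditable record of both hashes.
import getpass
import hashlib
import json
import shutil
from datetime import datetime, timezone
from pathlib import Path


def sha256_of(path, chunk_size=1 << 20):
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def preserve_evidence(original, working_dir="working", custody_log="custody_log.jsonl"):
    original = Path(original)
    working_copy = Path(working_dir) / original.name
    working_copy.parent.mkdir(parents=True, exist_ok=True)

    original_hash = sha256_of(original)    # fingerprint before any analysis
    shutil.copy2(original, working_copy)   # analysts work on the copy only
    copy_hash = sha256_of(working_copy)

    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "collected_by": getpass.getuser(),
        "original": str(original),
        "working_copy": str(working_copy),
        "sha256_original": original_hash,
        "sha256_working_copy": copy_hash,
        "hashes_match": original_hash == copy_hash,
    }
    with open(custody_log, "a") as fh:
        fh.write(json.dumps(entry) + "\n")
    return entry


if __name__ == "__main__":
    print(preserve_evidence("evidence/disk_image.dd"))  # hypothetical evidence file

Re-running the hash at any later point and comparing it against the logged value is what lets an investigator demonstrate to counsel that the evidence was never altered.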
5. Go Down The Rabbit Hole Without Getting Lost
With evidence bagged and tagged and the data ready for analysis, the hard work still awaits: investigators must roll up their sleeves and inspect it. While the mantra for forensic data collection is to gather as much as could conceivably tie to the incident, that scope needs to be tightened once it is time to run the analysis.
"Usually what happens is you have massive scope creep and an overconsumption of that forensics data -- you collect so much you feel like you have to analyze the same amount," says Novak, who instead recommends customers use an "evidence-based" approach to the investigation. "How did you recognize the problem? Start there and only expand as much as you need."
Thompson agrees, saying organizations should let the indicators of compromise lead the investigation down its paths of analysis. One way he gets his analysts to tighten their focus is an exercise in which they literally draw a box on a piece of paper and write out the components that led them to believe there was a compromise. The idea is to draw lines and brainstorm within that box, much as a detective would work through evidence in a physical crime case. With that picture in front of them, it is easier for analysts to list the investigative techniques to start with so they can jump down potential rabbit holes without getting lost.
"That really helps them keep on track so that they don't end up veering off course," he says.