Behavioral Analytics in Cybersecurity: Who Benefits Most?

As the cost of data breaches continues to climb, the role of user and entity behavioral analytics (UEBA) has never been more important.

Jackie Wyatt, Adjunct Professor of Cyber Studies, University of Tulsa

February 7, 2025


COMMENTARY

Last year, the average cost of a data breach rose 10%, from $4.45 million to $4.88 million, according to IBM's annual "Cost of a Data Breach Report." Meanwhile, cybersecurity firm Vectra AI reports that more than 70% of security operations center (SOC) leaders fear that a real attack will be hidden under an overwhelming flood of false-positive alerts and other security noise. The resulting burnout may be contributing to the labor shortage plaguing the industry.

As the cost of data breaches continues to climb and the deluge of meaningless alerts wears on an increasingly stressed workforce, the role of behavioral analytics in cybersecurity, or user and entity behavioral analytics (UEBA), has never been more important. That role is even more critical for schools, government agencies, and hospitals and other healthcare facilities. These entities play crucial roles in our day-to-day lives yet operate with limited resources: they tend to have smaller cybersecurity teams and less money, which amplifies the potential fallout of a breach. In fact, the smaller the team, the greater the need for UEBA, which offers a number of benefits.

Weeds Out the Noise 

In the absence of behavioral analytics, every login and machine connection can generate an alert for the SOC. Without the ability to differentiate true threats from false positives, every detection is treated as high priority, which means the true positives may be missed or not addressed in a timely fashion. In places like schools, government offices, and medical facilities, the consequences of missing an actual threat are monumental. These establishments depend on their reputations and hold critical information that, in some cases, can mean life or death for a person.

The danger of alert fatigue is real: SOC analysts eventually ignore certain signals or try to address them all with no sense of which ones indicate actual risk. UEBA tracks access patterns across people, machines, and systems, learning what normal behavior looks like and alerting the SOC only when there is a true risk. Not only does this approach cut out alert noise, it also limits the information bombarding the security team, reduces false positives, and virtually eliminates alert fatigue so analysts can focus on important alerts. Using behavioral analytics to learn the patterns that are normal makes it much easier to spot the ones that aren't, as the sketch below illustrates.
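To make the idea concrete, here is a minimal, illustrative sketch of the baselining concept behind UEBA: learn each user's normal daily login volume, then flag only the days that deviate sharply from that baseline. The sample data, threshold, and names are assumptions for illustration, not a description of any particular UEBA product.

# Minimal sketch of behavioral baselining: learn each user's normal daily
# login volume, then flag days that deviate sharply from that baseline.
# The history, threshold, and account names are illustrative assumptions.
from statistics import mean, pstdev

# Hypothetical history: user -> daily login counts observed so far.
history = {
    "alice": [12, 10, 11, 13, 12, 11, 10],
    "svc-backup": [2, 2, 3, 2, 2, 2, 2],
}

def is_anomalous(user: str, todays_logins: int, z_threshold: float = 3.0) -> bool:
    """Return True when today's activity deviates sharply from the user's baseline."""
    baseline = history.get(user)
    if not baseline or len(baseline) < 5:
        return False  # not enough history to judge; a real system keeps learning
    mu, sigma = mean(baseline), pstdev(baseline)
    if sigma == 0:
        return todays_logins != mu
    return abs(todays_logins - mu) / sigma > z_threshold

# Only large deviations surface as alerts; routine variation stays quiet.
print(is_anomalous("alice", 11))       # False: within the user's normal range
print(is_anomalous("svc-backup", 40))  # True: a quiet service account suddenly very active

Real UEBA platforms model far richer behavior (hosts, geolocation, data volumes, peer groups), but the principle is the same: routine variation stays quiet, and only genuine deviations become alerts.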

Allows for Prioritization

Using UEBA is already a no-brainer for many SOCs, but it has yet to be implemented at most hospitals, government institutions, and schools. Analyzing behavior patterns could have an even bigger payoff for those types of entities, in part because their teams are small. With a fully functioning, automated UEBA system, analysts know alerts are far more likely to be credible and worth their time to investigate.

When discussing pattern detection and automation, AI naturally springs to mind, and for good reason: UEBA is an excellent use case for AI. The kind of public sector and public good entities that have the most limited resources are best positioned to leverage the power of AI combined with UEBA. It could virtually eliminate the disadvantage of fewer resources, because a well-trained AI would surface only credible threats for human vetting. AI is not susceptible to alert fatigue, burnout, or the lure of a higher salary elsewhere. An automated system may be able to handle threat response, but its greatest potential lies in prioritizing potential threats so SOC analysts can work through them more efficiently, as sketched below. This saves engineers and analysts countless hours, since they do not have to craft rules and queries to achieve the desired outcome.
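The following sketch illustrates the prioritization idea in its simplest form: instead of handing analysts every alert, rank alerts by a composite risk score so the credible ones rise to the top of the queue. The scoring weights and fields here are assumptions chosen for illustration, not an industry standard.

# Minimal sketch of alert prioritization: rank alerts by a composite risk
# score so analysts see the most credible threats first. Weights and fields
# are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Alert:
    user: str
    anomaly_score: float      # deviation from the learned baseline, 0-1
    asset_criticality: float  # importance of the system touched, 0-1 (e.g., patient records)
    privileged: bool          # whether a privileged or service account is involved

def risk_score(alert: Alert) -> float:
    """Combine behavioral deviation with context; the weights are assumed."""
    score = 0.6 * alert.anomaly_score + 0.3 * alert.asset_criticality
    if alert.privileged:
        score += 0.1
    return min(score, 1.0)

alerts = [
    Alert("alice", anomaly_score=0.2, asset_criticality=0.3, privileged=False),
    Alert("svc-backup", anomaly_score=0.95, asset_criticality=0.9, privileged=True),
]

# Analysts work the queue from the top; low-risk noise never demands attention first.
for a in sorted(alerts, key=risk_score, reverse=True):
    print(f"{a.user}: {risk_score(a):.2f}")

In practice the scoring would be learned rather than hand-weighted, but the effect is the same: a small team spends its limited hours on the alerts most likely to matter.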

Reduces Risk 

It may be difficult for some executives to wrap their heads around placing an automated system at the heart of their security operations. Some worry about all of the company's data being in one place. Others fear AI might miss something, but the risk posed by a small, overwhelmed team prone to burnout is arguably greater. While it's possible for a problem to surface with AI, it's more likely that a problem will arise with stressed-out people.

Some companies have already seen the benefit of involving AI in their security operations. About 70% of those who responded to a Vectra AI survey said AI has already reduced burnout and improved their threat detection and response capabilities. Imagine moving the needle that much for the places that educate our kids, provide healthcare, and keep the lights on.

About the Author

Jackie Wyatt

Adjunct Professor of Cyber Studies, University of Tulsa

Jackie Wyatt is an adjunct professor of cyber studies at the University of Tulsa and a cybersecurity operations analyst at QuikTrip. Jackie received a master's degree in cybersecurity from Maryville University. She holds the GIAC Certified Enterprise Defender (GCED) certification and a challenge coin from SANS, reflecting her expertise in network defense and incident response. Her combination of advanced education and hands-on experience makes her a valuable contributor to the cybersecurity field.
