Cloud Creates SIEM Blind Spot
Current SIEM and log management approaches for network and security devices are 'moot' in the cloud
After spending buckets of money and years of intense human effort deploying comprehensive security information and event management (SIEM) tools and techniques, many organizations are watching these precious investments lose much of their value in the rush to the cloud. Public cloud initiatives and, to some degree, even virtualization deployments are adding security black holes to the enterprise security monitoring framework. No "light" of visibility goes in or out of these blind spots.
To maintain security standards within increasingly distributed, virtualized, and outsourced IT infrastructure, enterprises will have to adjust if they want to understand the events that affect their infrastructure and the flow of users and data across the traditional enterprise network perimeter.
"Virtualization and cloud breaks the current model," says Mike Rothman, analyst for Securosis. "You've got no visibility of the infrastructure, by definition. So the existing methods of SIEM/log management for network and security devices anyway are moot."
This issue is amplified in the public cloud arena, where enterprises dynamically share infrastructure with other organizations and have little control over, or even a glimpse into, how everything is put together and how data flows.
"Most SIEM products have no difficulty providing complete visibility in virtual environments and private clouds where you have control over both the physical and virtual environments. Where system access and control are limited, so is the visibility," says Michael Maloof, CTO for TriGeo Network Security. "While cloud-based applications are a boon to productivity and data access, they rarely provide the same level of activity monitoring that you have in more traditional environments. For example, a cloud application linked to Active Directory can provide you with access control data, but not that the access originated in China."
But even when virtualized environments are actually controlled within the organization's infrastructure boundary, there remain complications in keeping track of all the activity that occurs at different virtualized layers.
"I think people can make an assumption that anything that's inside the environment is safe, but you may see rogue virtual environments coming in," says Bill Roth, chief marketing officer for LogLogic. He warns that the first step in keeping tabs on virtualized environments is to make sure there are only as many VMs as absolutely necessary. "It's so easy to spin up, and storage and processing is becoming so cheap that you run the risk of VM sprawl; companies need to be vigilant against it," he says.
Whether in a public or private cloud, organizations need to ensure their applications are better instrumented to output monitoring information, Securosis' Rothman advises.
"We need to start instrumenting the applications to provide monitoring information and the like to provide some means of visibility," he says. "The reality is most application folks don't do a good job of building visibility into their applications. But they'll need to, given that organizations want the flexibility to run some or all of the applications in a cloud-type of environment."
Most important within the cloud environment, though, is acquiring the key logs that offer a better view into how the infrastructure handling enterprise data is actually working.
"If you're going into a cloud, demand your logs so you understand how your system is moving around so you understand what kind of performance you're getting," Roth says. "Demand your logs and demand transparency. If you can't get them, it should be a deal-breaker. Just because the cloud metaphor is opaque doesn't mean the service should be."
Maloof agrees, explaining that organizations will not be able to shift responsibility for a data breach onto their cloud provider, so they need to be proactive about keeping an eye out for potential problems.
"The fact is that while you can 'cloud-source' many applications today, that doesn't eliminate the liability associated with data loss and the need to demonstrate sound monitoring policies for compliance requirements," Maloof says.
And it shouldn't stop at logs. Organizations also need to work with providers to get a clear picture of user activity and data-access trends for cloud-based pools of information. This starts with improved cloud access control.
"Identity and access management systems are a critical piece of the puzzle, closely linked with well-defined policies and application-layer policy enforcement," Maloof says. "While the data and the applications exist outside the traditional network boundaries, the identity and access control systems will bridge both the physical and virtual systems."
The success of these drives for visibility hinges on cloud provider participation, though. According to LogLogic's Roth, who is an active participant in the Cloud Security Alliance (CSA), customers are still having a hard time convincing larger providers, such as Amazon, to improve their transparency. He believes customers need to keep up the pressure. In addition, participating in organizations such as the CSA can help the industry develop standards for security monitoring within the cloud.
"We are working on a number of things that I think are going to be really important," Roth says of the CSA, which is expected to release drafts of a security monitoring matrix in November. "These are things that I think are going to be facilitate the development of cloud security."