Cloud Security: To Scale Safely, Think Small

Why today's enterprises need an adaptable cloud infrastructure centered around flexibility, portability, and speed.

Amrit Williams, CTO, CloudPassage

January 19, 2016


“Intelligence is the ability to adapt to change.” —Stephen Hawking

Enterprises would be wise to follow Hawking’s definition of intelligence. The modern data center, in all its incarnations, is becoming increasingly dynamic and elastic. Chances are, your network was designed according to last century’s perimeter-based security principles and is likely composed of a hodgepodge of legacy infrastructure. From a security perspective, this is untenable. While throwing everything out and starting from scratch is both financially unfeasible and operationally unproductive, enterprises cannot continue to use last century’s techniques to deal with today’s threats.

Many enterprises are moving to adopt cloud infrastructure both to reduce their hardware footprint, along with the cost and effort of housing and maintaining servers, and to take advantage of on-demand compute and storage resources. Whether you are an enterprise evolving your data center to adopt private, public, and/or hybrid cloud computing solutions, or an infrastructure-as-a-service provider offering compute and storage services to organizations, your security strategy must evolve, too. Securing data beyond that now-mythical perimeter is imperative. To accomplish this, security professionals have to let go of any residual antipathy toward automation and sever any attachments to the infrastructure-centric security mindset. In a word: adapt.

So where to begin? If you want to scale safely, you have to start by thinking small…very small.

In the DevOps world, where agile development is all the rage, we’re seeing the emergence of containers and microservices -- systems and applications broken down into smaller, modular, self-contained components. The microservices movement decomposes applications into smaller, independent processes, each focused on a specific task, that communicate with one another.

The security use case

In this article, I’m going to focus on the security use case for network infrastructure. You still need firewalls and intrusion detection for traffic coming into and out of the network (north-south traffic). But because attackers can move laterally between applications and compute resources (for example, compromising a relatively insecure resource and then using that access to pivot to a more critical application or internal resource), an adaptable security strategy must now also focus on what’s going on inside the data center (east-west traffic) and, in cloud environments, at the workload level itself.
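For readers who think in code, here is a minimal sketch (Python, purely illustrative) of that north-south/east-west distinction. It assumes “internal” simply means RFC 1918 private address space; a real environment would substitute its own inventory of workload addresses.

```python
# Minimal sketch: classifying observed flows as north-south or east-west.
# Assumption: "internal" means RFC 1918 private address space; real
# deployments would use their own inventory of workload addresses.
from ipaddress import ip_address, ip_network

INTERNAL_NETS = [ip_network(n) for n in ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")]

def is_internal(addr: str) -> bool:
    ip = ip_address(addr)
    return any(ip in net for net in INTERNAL_NETS)

def classify_flow(src: str, dst: str) -> str:
    """East-west if both ends are inside the environment, north-south otherwise."""
    return "east-west" if is_internal(src) and is_internal(dst) else "north-south"

print(classify_flow("10.1.4.7", "10.2.9.3"))     # east-west: workload-to-workload
print(classify_flow("203.0.113.5", "10.1.4.7"))  # north-south: crosses the perimeter
```

East-west traffic like the first example never touches the perimeter firewall, which is exactly why lateral movement goes unnoticed in a purely perimeter-centric design.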

To reduce the attack surface, micro-segmentation can be used to partition the workloads and their interactions with each other into logical application groupings. Those groupings form smaller protectable units, each accompanied by its own lightweight layer of security. You still have the firewall monitoring the source of traffic with coarse-grained controls, but it’s no longer the primary sentry; it’s just one of a number of safeguards in a multilayer, multidirectional defense structure. And now, micro-segmentation at the workload level itself, and not just at the network level, offers an additional layer of fine-grained controls.
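To make the grouping idea tangible, here is a hypothetical sketch of a group-based policy. The tier names, addresses, and ports are invented for illustration and do not reflect any particular product’s policy format; the point is that only explicitly declared group-to-group paths are allowed, and everything else is denied.

```python
# Illustrative sketch of micro-segmentation policy: workloads are partitioned
# into logical application groupings, and traffic is only permitted along
# explicitly declared group-to-group paths. All names and values are
# hypothetical examples.

GROUPS = {
    "web-tier": ["10.1.1.10", "10.1.1.11"],
    "app-tier": ["10.1.2.10", "10.1.2.11"],
    "db-tier":  ["10.1.3.10"],
}

# Allowed east-west paths: (source group, destination group, destination port)
ALLOWED_PATHS = [
    ("web-tier", "app-tier", 8080),
    ("app-tier", "db-tier", 5432),
]

def rules_for(group: str):
    """Expand the coarse group policy into per-workload allow rules."""
    rules = []
    for src_grp, dst_grp, port in ALLOWED_PATHS:
        if dst_grp != group:
            continue
        for src in GROUPS[src_grp]:
            for dst in GROUPS[dst_grp]:
                rules.append({"src": src, "dst": dst, "port": port, "action": "allow"})
    return rules

# Anything not expanded here is implicitly denied, so a compromised web
# server cannot open a connection straight to the database tier.
for rule in rules_for("db-tier"):
    print(rule)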

This is important because some of the more nefarious attacks have been able to bypass the network-level controls and easily move between workloads, compromising machine-to-machine communication. It’s an important construct to understand, especially when moving to cloud computing, since the workloads lose some of the natural perimeter provided by traditional data centers.

Automated traffic discovery

Management is actually easier at this level with micro-segmentation. The partitioning is too complex to manage manually, but automated traffic discovery and firewall orchestration tools enable both the micro-segmentation itself and its ongoing management. These tools allow network security admins to collect, aggregate, and visualize all the intricate traffic behavior. They also define and orchestrate the security policies and parameters, which can then be applied and enforced automatically throughout the system. Automation provides both visibility and a means to manage that complexity, so the data can be better protected.
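As a rough illustration of what such discovery tooling does under the hood (the flow records and group mapping below are invented), traffic observed on the wire can be aggregated into group-to-group edges that become candidate allow rules for an admin or policy engine to review and push out.

```python
# Minimal sketch of automated traffic discovery feeding policy orchestration:
# observed flows are aggregated into group-level edges, which become candidate
# allow rules. The data and group names are invented for illustration.
from collections import Counter

WORKLOAD_GROUP = {
    "10.1.1.10": "web-tier",
    "10.1.2.10": "app-tier",
    "10.1.3.10": "db-tier",
}

observed_flows = [
    ("10.1.1.10", "10.1.2.10", 8080),
    ("10.1.1.10", "10.1.2.10", 8080),
    ("10.1.2.10", "10.1.3.10", 5432),
]

def discover_policy(flows):
    """Aggregate raw flows into group-to-group edges with observation counts."""
    edges = Counter()
    for src, dst, port in flows:
        edges[(WORKLOAD_GROUP[src], WORKLOAD_GROUP[dst], port)] += 1
    # Each discovered edge becomes a candidate allow rule; everything else
    # stays denied until it is explicitly approved.
    return [{"src_group": s, "dst_group": d, "port": p, "seen": n}
            for (s, d, p), n in edges.items()]

for candidate in discover_policy(observed_flows):
    print(candidate)
```

In practice the same pipeline also visualizes the discovered edges, so admins can spot unexpected east-west paths before codifying the policy.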

The migration from traditional servers to IaaS can be tricky for organizations that need strong access controls, continuous monitoring, logging, and sensitive-data inventory for compliance purposes. Micro-segmentation shifts the burden of protecting dynamic computing environments away from the lower-level network infrastructure (such as firewalls and VLANs) that network security admin teams must otherwise configure. It also allows server owners themselves to set finer-grained controls for their organization’s compliance and security needs. So enterprises can get on-demand, fully automated workloads at any scale, along with system integrity and security, but with the oversight and control they need.

We’ve moved from a world of manual control and hardware to one of automation, virtualization, and the cloud. The new model offers flexibility, portability, and speed that the old paradigm just couldn’t match. New technologies such as micro-segmentation add security by keeping things small and contained, while allowing the environment to expand to cloud scale. And most importantly, they provide the ability to adapt to meet the needs of the modern enterprise.


About the Author

Amrit Williams

CTO, CloudPassage

Amrit Williams has over 20 years of experience in information security and is currently the chief technology officer of CloudPassage. Amrit held a variety of engineering, management, and consulting positions prior to joining CloudPassage. Previously, Williams was the director of emerging security technologies and CTO for mobile computing at IBM, which acquired BigFix, an enterprise systems and security management company where Williams was CTO. Prior to BigFix, Williams was a research director in the Information Security and Risk Research Practice at Gartner, Inc., where he covered vulnerability and threat management, network security, security information and event management, risk management, and secure application development. Before IBM, Williams was a director of engineering for nCircle Network Security, and held leadership positions at Consilient Inc., Network Associates, and McAfee Associates, where he worked to develop market-leading security and systems management solutions.

