NSA Winds Down Secure Virtualization Platform Development
The National Security Agency's High Assurance Platform integrates security and virtualization technology into a framework that's been commercialized and adopted elsewhere in government.
After several years in the making and two releases, the National Security Agency is winding down new development of its secure client virtualization framework, the High Assurance Platform (HAP).
At HAP's inception, NSA wanted an integrated, networked framework of virtualization and security technology, but the market had yet to deliver one. So NSA set out to piece together the disparate hardware and software that commercial vendors had already placed on the market. "We saw all of these things," Neil Kittleson, the commercial solutions center's trusted computing portfolio manager, said in an interview. "And we saw the need to create custom policy around it to get them all to work in parallel."
Historically, intelligence agencies have used different computers for working with differing levels of classified data, but HAP allows multiple security levels -- from unclassified to top secret -- to operate on the same machine. HAP is managed by NSA's commercial solutions center, a group focused on engaging industry. The intent of the six-year-old program was to leverage purely commercial technologies, rather than relying on custom code and products designed specifically for government, as was long the norm for the intelligence community.
The HAP program was intended to push both NSA's tech boundaries and the industry's own virtualization and security offerings. This close work with vendors is central to the commercial solutions center's broader mission; for example, the office runs an outreach effort that brings vendors in to talk about emerging capabilities. "We want to know where they're going, understand that, and help influence development," Mike Lamont, chief of the NSA's network solutions office, said in an interview. Vendors of products being used in the HAP project include IBM, VMware, Wave Systems, and others.
"By developing this proof of concept, we've been able to take what we're doing, weave it together, and prove that it works," Kittleson said. NSA now offers a HAP developer kit, including source code and documentation, to allow organizations and vendors to plan their HAP deployment. HAP-compliant technologies have even made their way into the marketplace, with General Dynamics partnering with Dell in 2008 and HP in 2010 with its Trusted Virtual Environment.
When an organization decides to bring a new computer into the HAP environment, baseline measurements of the device (such as, for example, the BIOS configuration) are taken and stored on a remote attestation server. At boot time, HAP workstations go through a process where certain measurements get stored on the Trusted Platform Module, an embedded security chip that's found on many enterprise PCs these days. Then, upon connecting to a network, the remote attestation server verifies the new measurements against the pre-determined baselines. This remote attestation functionality is based on the open Trusted Network Connect network access control architecture.
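To make that flow concrete, here is a minimal Python sketch of the baseline-versus-boot-time comparison an attestation server performs. The component names, hash choice, and attest helper are illustrative assumptions only; they are not NSA's implementation or the Trusted Network Connect wire protocol.

```python
import hashlib

def measure(component_bytes: bytes) -> str:
    """Hash a component image/config, loosely mimicking a TPM measurement."""
    return hashlib.sha256(component_bytes).hexdigest()

# Baseline measurements captured when the workstation was enrolled
# (stored on the remote attestation server).
enrollment = {
    "bios": measure(b"BIOS v1.02 config blob"),
    "bootloader": measure(b"bootloader image"),
    "hypervisor": measure(b"hypervisor image"),
}

# Measurements the workstation reports when it boots and connects.
boot_report = {
    "bios": measure(b"BIOS v1.02 config blob"),
    "bootloader": measure(b"bootloader image"),
    "hypervisor": measure(b"tampered hypervisor image"),  # simulated change
}

def attest(report: dict, baselines: dict) -> list:
    """Return components whose reported measurement differs from baseline."""
    return [name for name, digest in baselines.items()
            if report.get(name) != digest]

failed = attest(boot_report, enrollment)
if failed:
    print("Deny network access; mismatched components:", failed)
else:
    print("Grant network access; platform matches baseline.")
```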
HAP uses hardware like Intel TXT to protect memory and execution space, Intel VT-d to isolate I/O devices attached to the computer and operating system, and virtualization tweaks to enforce strict security policies. This prevents exploits such as heap- and stack-based buffer overflows, as well as breaches due to direct memory access vulnerabilities and insufficient process isolation.
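As a rough illustration, the Linux-side script below checks for the CPU and chipset features this kind of design depends on. The flag names (vmx for VT-x, smx for the extensions underpinning TXT) and the ACPI DMAR table path are standard Linux conventions, but the script itself is a hypothetical sketch, not part of HAP.

```python
from pathlib import Path

def cpu_flags() -> set:
    """Collect CPU feature flags from /proc/cpuinfo (Linux only)."""
    flags = set()
    for line in Path("/proc/cpuinfo").read_text().splitlines():
        if line.startswith("flags"):
            flags.update(line.split(":", 1)[1].split())
    return flags

flags = cpu_flags()
print("VT-x (vmx):", "vmx" in flags)   # hardware virtualization support
print("TXT (smx):", "smx" in flags)    # safer mode extensions underpin TXT
# VT-d is a chipset feature; Linux exposes it via the ACPI DMAR table.
print("VT-d (DMAR table):", Path("/sys/firmware/acpi/tables/DMAR").exists())
```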
While many organizations have pieces of HAP's security architecture in place, the integrated security functionality is what, even today, largely distinguishes HAP from other client virtualization frameworks.
While HAP didn't require much coding, Kittleson said, it wasn't a simple integration. Security authorization was one of the biggest challenges for the HAP team. In terms of time alone, the federal government and General Dynamics spent 18 months just pushing General Dynamics' Trusted Virtual Environment through the certification and accreditation process, meaning that by the time the integrated system was authorized for deployment in government, its component parts were already a generation behind. It also was difficult, he said, to develop sufficient confidence that there was zero data leakage across virtual machines.
Lamont said he looks back at how HAP was executed and sees it as "a bridge between where we used to be with [government-specific technology] and where we want to go with commercial off-the-shelf [technology]. For future solutions, there's now a different process."
Among the other challenges were performance and enterprise integration. For example, in early tests, machines took as long as 40 minutes to boot (boot times have since come down to roughly normal), and HAP required some configuration to ensure Active Directory would work with it.
There have been two releases of HAP thus far. The first release, made available in 2009, used local administration, manual provisioning, and manual key management, and required different wires for each security domain. It also made sharing information across domains difficult, and supported only three simultaneous virtual machines (VMs).
The more current version of HAP supports better measurements of system integrity, enterprise administration and remote provisioning, automated key management, VPN tunneling to enable the use of only one wire to connect to multiple networks, data-at-rest encryption, and support for more VMs.
Though new development is winding down, HAP has found a home in the government, where, for example, the military, through the Socrates High Assurance Program, has used HAP workstations to reduce its PC footprint. According to Kittleson, there are multiple "ongoing pilots" of HAP within the Department of Defense. In addition, a number of federal agencies such as the Defense Intelligence Agency have been able to leverage some of NSA's best practices in their own client virtualization efforts.
For the past few years, NSA has put on an annual conference that dives into HAP technologies. Last year, the conference had about 500 attendees, with dozens of government organizations represented. An exec from PricewaterhouseCoopers, which is using Trusted Platform Modules internally for strong authentication, was among the speakers. The conference will continue even as HAP's new development is wound down, though it will be refocused as a broader trusted computing conference. "Anonymously, we've talked to a lot of companies internally about this," Lamont said.
So what does the future hold for HAP? Last year's budget documents signaled that NSA would begin work on a third generation of HAP, which other documents show would have added even more security and virtualization features, but that plan was scrapped as the commercial market began offering similar capabilities in integrated packages. Going forward, Lamont said, NSA will continue keeping a close eye on integrated security, as it always has.
"Our focus remains working on the next generation of commercial technologies that can be leveraged for information assurance purposes," he said. "It's the next generation of trusted computing technologies, and whatever the next generation of technologies are."