The Present and Future of Confidential Computing
Confidential computing protects sensitive information during processing and has recently expanded to include GPUs as well as CPUs.
July 15, 2024
Data protection lies at the heart of cybersecurity. Conventional encryption methods mainly focus on protecting data at rest and data in transit. Meanwhile, data in use — while in RAM or being computed at the central processing unit (CPU) or graphics processing unit (GPU) level — has historically lacked robust protection at scale. Confidential computing has emerged to fill that gap. By protecting data in use, in addition to existing means of protecting data at rest and in transit, confidential computing helps ensure that data remains secure during its entire life cycle.
Confidential computing has already been enabled on CPUs and was recently introduced as an additional security feature on GPUs, securing data while it is being processed there. This opens new possibilities for industries that rely on these powerful processors for accelerated computing, including artificial intelligence (AI).
What Is Confidential Computing?
Confidential computing is defined as the protection of data in use by performing computation within a hardware-based, attested Trusted Execution Environment (TEE). These secure and isolated environments prevent unauthorized access or tampering with applications and data while they are in use. For cybersecurity, confidential computing addresses a critical vulnerability of data exposure during processing.
Confidential Computing, Hardware, and the Cloud
The effectiveness and foundational security of confidential computing depend on the relationship between physical hardware and cloud infrastructure. At the heart of this relationship is the concept of attestation.
The Confidential Computing Consortium explains that attestation is the evidence you use to evaluate whether or not to trust a confidential computing program or environment. The resulting deployment should provide a mechanism to validate an assertion that the program or environment is running in a TEE whose physical characteristics and configuration settings are as expected. In confidential computing, evidence of the hardware's validity is provided by a hardware-signed attestation report. This proof helps organizations trust that the software is operating on secure hardware before keys are released and sensitive data is processed.
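The flow above, verifying a signed attestation report and its claims before releasing keys, can be sketched as follows. This is a minimal illustration only: it stands in the vendor's hardware-rooted asymmetric signature with an HMAC over a shared key, and the claim names and measurement values are hypothetical, not any vendor's actual report schema.

```python
import hashlib
import hmac
import json

# Hypothetical stand-in for the vendor's signing key. Real TEEs sign
# reports with asymmetric keys rooted in the silicon and verified via
# a vendor certificate chain, not a shared secret.
VENDOR_KEY = b"demo-vendor-key"

# The code measurement the relying party expects (hypothetical value).
EXPECTED_MEASUREMENT = "abc123"

def sign_report(claims: dict) -> dict:
    """Simulate the hardware producing a signed attestation report."""
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(VENDOR_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": sig}

def verify_report(report: dict) -> bool:
    """Relying party: check the signature first, then the claims,
    before releasing any keys or sensitive data."""
    payload = json.dumps(report["claims"], sort_keys=True).encode()
    expected = hmac.new(VENDOR_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, report["signature"]):
        return False  # not signed by the hardware, or tampered with
    claims = report["claims"]
    return bool(claims.get("tee_enabled")
                and claims.get("measurement") == EXPECTED_MEASUREMENT)

report = sign_report({"tee_enabled": True, "measurement": "abc123"})
print(verify_report(report))  # True: signature and claims both check out
```

The key point the sketch captures is the ordering: the signature is validated before any claim is trusted, and sensitive material is released only after both checks pass.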
Confidential virtual machines (VMs) play a crucial role in this ecosystem. A confidential VM keeps sensitive data private and secure even while it is being processed in memory, and it enables seamless onboarding of applications with no code changes required.
Confidential GPUs are an extension of confidential VMs. These GPUs are instrumental in enabling complex computations, such as those executed when using AI, while ensuring data also remains protected within GPU memory. Like confidential VMs, they have their own attestation process that verifies the GPU is in confidential computing mode before sensitive data is processed within it. Such advancements aim to foster innovation, enabling multiple parties to collaborate on sensitive data without compromising security or performance.
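Because both the confidential VM and the GPU attest independently, a relying party typically gates key release on both results. A minimal policy sketch is below; the claim fields (`tee_type`, `debug_disabled`, `cc_mode`) are illustrative assumptions, not any vendor's actual attestation schema, and real deployments would check signed, verified claims rather than raw dictionaries.

```python
from typing import Optional

def release_key(cpu_claims: dict, gpu_claims: dict,
                wrapped_key: bytes) -> Optional[bytes]:
    """Release the data key only if BOTH attestations pass.

    cpu_claims / gpu_claims are assumed to be claims already extracted
    from signature-verified attestation reports (hypothetical fields).
    """
    cpu_ok = (cpu_claims.get("tee_type") == "confidential-vm"
              and cpu_claims.get("debug_disabled"))
    gpu_ok = gpu_claims.get("cc_mode") == "on"
    if cpu_ok and gpu_ok:
        return wrapped_key  # safe to hand over the key
    return None  # one side failed attestation; withhold the key

# Key is released only when the VM and the GPU both pass.
key = release_key({"tee_type": "confidential-vm", "debug_disabled": True},
                  {"cc_mode": "on"}, b"data-key")
print(key)  # b'data-key'
```

If either side fails (for example, the GPU is not in confidential computing mode), the function returns `None` and no sensitive data is exposed.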
Confidential computing allows the cloud hypervisor and host operating system (OS) to remain untrusted, effectively isolating sensitive data and processing tasks from potential threats. Even if malware or unauthorized code attempts to access encryption keys or alter authorized application code, the TEE denies access. This separation is critical for maintaining data integrity and confidentiality in cloud environments.
Bringing Confidential Computing to GPUs
While confidential computing has been available for CPUs for years, new developments extend a complete, secure computing stack from the VM to the GPU. GPU use cases include high-performance computing and parallel processing, such as 3D graphics and visualization, scientific simulation and modeling, and AI and machine learning.
Confidential computing can protect model confidentiality, ensuring that proprietary AI algorithms and data remain secure. This is particularly important in scenarios such as inference, where AI models make predictions based on input data, and fine-tuning, where models are adjusted using private datasets.
With the advent of GPU-based confidential computing, businesses can now extend the level of security traditionally afforded to CPU operations to GPU-intensive tasks.
Industry use cases that illustrate the potential of GPU-based confidential computing include:
Healthcare: Confidential computing can be applied to the analysis of medical images, such as X-rays, CT scans, and MRIs. This protects patient privacy and ensures compliance with stringent data protection regulations.
Finance: Confidential computing can secure the analysis of transactions, portfolios, and risk models. By protecting sensitive financial data and models during processing, organizations can maintain the confidentiality of proprietary information and prevent unauthorized access.
Government: Government agencies collect and process vast amounts of data, ranging from census information to tax records and national security data. Confidential computing can protect the privacy of citizens and bolster national security by ensuring that sensitive data is processed securely.
Confidential computing for GPUs is currently available for small to medium-sized models. As technology advances, Microsoft and NVIDIA plan to offer solutions that will scale to support large language models. The integration of GPUs into confidential computing frameworks represents a paradigm shift for secure computing at the highest levels of performance. Now businesses can see better results and enhanced security for a wide range of applications, from AI to sensitive data analytics.