
Confidential computing in the cloud

Philip Wainhouse
Jun 26, 2024

Public cloud providers are growing their managed service offerings in the Confidential Computing space, and Capgemini is helping customers design the solutions most appropriate to their use cases and data classifications

Capgemini works with a variety of organizations that handle sensitive data in the cloud, such as financial, health, or personally identifiable information (PII), and all of them can benefit from the additional security that Confidential Computing offers. The market is expected to continue growing, especially in the public sector and sovereign cloud domains, where the additional protection helps with compliance with regulations and local privacy laws.

The Capgemini Cloud Team helps customers navigate these complex challenges and partners with leading cloud providers such as AWS, Azure and GCP, as well as startups such as the German company Edgeless Systems, to leverage best-in-class solutions. Constellation and Continuum AI from Edgeless Systems are examples of pioneering Confidential Computing software that runs in the cloud: the former isolates your entire Kubernetes cluster from the underlying cloud infrastructure, while the latter protects user prompts from the AI model provider.

In Switzerland, Capgemini works with clients across Finance, Life Sciences, Manufacturing and Luxury Goods to help protect – and process – their most sensitive data assets.

Confidential Computing in the Cloud with AI in 2024

In cloud computing, things move fast. The landscape has already shifted since the Confidential Computing Consortium was founded in 2019. What’s new? AI has hit its tipping point and Generative AI has entered the public consciousness.

With that in mind, let’s look at the latest in Confidential Computing services from the major cloud providers, how you can run Confidential AI workloads and how Capgemini is using the technology to help customers.

A Refresher: What is Confidential Computing?

Confidential Computing is a technique aimed at protecting your data during processing.

It offers an extra encryption capability you can use in your solutions, either on-premises or in the cloud, to keep your most sensitive data private while it is in use.

In today’s data-driven environments, protecting your most valuable assets is paramount. In the cloud, this is an ongoing activity involving many layers of defence, from the edge, through the network, to your cloud resources and down to your applications and sensitive data.

Securing the data itself starts with encrypting it, broadly broken down into three categories:

  • Data at Rest: Encrypt your data while it is inactive.
  • Data in Transit: Encrypt your data when moving it, over the internet or on internal networks.
  • Data in Use: Encrypt your data when it is being processed in memory.
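
To see why that third category matters, here is a short illustrative sketch in Python, using the open-source cryptography package: even when a record is encrypted at rest, the application has to decrypt it into ordinary memory the moment it wants to work with the value, and that is exactly the gap Confidential Computing aims to close. The sample value is purely illustrative.

    # Illustration: at-rest encryption alone still exposes data during processing.
    # Requires: pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # in practice the key would live in a KMS/HSM
    fernet = Fernet(key)

    # Data at rest: the stored ciphertext is unreadable without the key.
    ciphertext = fernet.encrypt(b"IBAN CH93 0076 2011 6238 5295 7")

    # Data in use: to compute on the value, it must be decrypted into RAM.
    # On a conventional VM this plaintext is visible to anything that can read
    # the process memory; inside a TEE the memory itself is encrypted.
    plaintext = fernet.decrypt(ciphertext)
    print(plaintext.decode())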

Cloud native managed services have been helping with the first two for some time now, providing many options as a default service, guiding on best practice, and offering customization for more sophisticated implementations.

Data in Use is the growing third category that Confidential Computing techniques seek to tackle, by providing a Trusted Execution Environment (TEE, or secure enclave): a hardware-based, isolated compute environment for processing tasks, whose identity can be attested and verified.
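
To make “attestable” concrete, the toy sketch below mimics the core idea of remote attestation: release a secret to the enclave only if the measurement it reports matches a trusted reference value. Real attestation flows use hardware-rooted certificates and provider-specific report formats; the values and helper names here are purely illustrative.

    # Toy illustration of the attestation idea (not a real TEE protocol):
    # release a secret only if the reported enclave measurement matches the
    # reference value we trust. Real reports are signed by hardware-rooted keys.
    import hashlib
    import hmac

    EXPECTED_MEASUREMENT = hashlib.sha256(b"trusted-enclave-image-v1").hexdigest()

    def verify_report(reported_measurement: str) -> bool:
        # constant-time comparison, as for any security-sensitive check
        return hmac.compare_digest(reported_measurement, EXPECTED_MEASUREMENT)

    def release_secret_if_trusted(reported_measurement: str) -> str:
        if not verify_report(reported_measurement):
            raise PermissionError("Attestation failed: enclave is not the code we expect")
        return "database-credentials-or-decryption-key"

    # A report from an untampered enclave passes; anything else is rejected.
    print(release_secret_if_trusted(EXPECTED_MEASUREMENT))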

TEE technologies and standards have been driven with the help of the Confidential Computing Consortium (CCC), a Linux Foundation community project initiated by a set of cloud, hardware and software providers who collaborate via open-source projects to accelerate the adoption of TEE standards and hardware-based protection of data in use.

Confidential Computing Cloud Services

At the core of Confidential Computing is the ability to process data securely and privately on any type of infrastructure or public cloud. This is reassuring for customers who want to trust as little as possible in their solutions and are increasingly adopting zero-trust security architectures.

As a result, the public cloud providers are growing their managed service offerings in the Confidential Computing space, and Capgemini is helping customers design the solutions most appropriate to their use cases and data classifications.

When choosing from the service offerings available for your Trusted Execution Environment, there are many options at different levels of the stack and with different hardware flavours. Let’s start with the most common and easiest to use, Confidential Virtual Machines.

Confidential Virtual Machines (VMs)

Confidential VMs use silicon-level defences to encrypt data in use, via security technologies offered by modern CPUs and hardware from providers such as AMD, Intel, Nvidia and AWS. Memory is encrypted in RAM, and there are options for OS or temp disk encryption. Your workload is isolated from the hypervisor, host OS and other VMs, and there is no cloud operator access, because the encryption keys are generated by, and reside only in, the hardware. This protects against a wide range of attacks, as only the VM can read and write its memory.

You can create confidential VM instances in the cloud in much the same way as creating standard VMs, simply by selecting and enabling Confidential Computing, then choosing a machine type and CPU platform that best fit your security, performance, and cost criteria. Common options are:

  • Intel’s Software Guard Extensions or Trust Domain Extensions (Intel SGX / TDX).
  • AMD’s Secure Encrypted Virtualization (AMD SEV / SEV-SNP).
  • AWS offers its own EC2 solution based on the AWS Nitro System and Nitro Enclaves technology.

In general, the more security features offered, the more resource intensive the technology, so the trade-off comes in higher cost, lower network bandwidth and higher latency. For example, AMD SEV-SNP (Secure Nested Paging) extends SEV with additional features to prevent hypervisor attacks such as data replay and memory remapping.

You can usually get going on Confidential VMs without modifications to your existing applications and code.
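
For example, here is a minimal sketch of provisioning an AMD SEV-based Confidential VM on Google Cloud with the google-cloud-compute Python client. The project, zone, names and image are placeholders, the exact fields should be checked against the current API, and the Azure and AWS SDKs expose equivalent options.

    # Minimal sketch: create a Confidential VM (AMD SEV) on Google Cloud.
    # Requires: pip install google-cloud-compute, plus application default credentials.
    from google.cloud import compute_v1

    PROJECT_ID = "my-project"      # placeholder
    ZONE = "europe-west6-a"        # placeholder (Zurich region)

    instance = compute_v1.Instance(
        name="confidential-demo",
        # Confidential VMs on GCP require an AMD SEV-capable machine type (e.g. N2D).
        machine_type=f"zones/{ZONE}/machineTypes/n2d-standard-2",
        confidential_instance_config=compute_v1.ConfidentialInstanceConfig(
            enable_confidential_compute=True,
        ),
        # Confidential VMs cannot be live-migrated, so host maintenance must terminate them.
        scheduling=compute_v1.Scheduling(on_host_maintenance="TERMINATE"),
        disks=[
            compute_v1.AttachedDisk(
                boot=True,
                auto_delete=True,
                initialize_params=compute_v1.AttachedDiskInitializeParams(
                    source_image="projects/ubuntu-os-cloud/global/images/family/ubuntu-2204-lts",
                ),
            )
        ],
        network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
    )

    operation = compute_v1.InstancesClient().insert(
        project=PROJECT_ID, zone=ZONE, instance_resource=instance
    )
    operation.result()  # wait for the create operation to finish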

VMs with Confidential App Enclaves

If you don’t need an entire VM but prefer protection for a smaller security boundary, then you can opt for a Confidential App Enclave, which protects app-specific workloads hosted on a Virtual Machine. A portion of memory is encrypted when in use and isolation can be created down to specific modules within applications. The security boundary is applied to the portion of memory within a VM used by the enclave, which means you can be more surgical with your protection needs and more optimized with cost and latency.
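
One cloud-managed flavour of this pattern is AWS Nitro Enclaves, which attach an isolated enclave to a parent EC2 instance. The boto3 sketch below simply launches a parent instance with enclave support enabled; the AMI, key pair and instance type are placeholders, the instance type must be one that supports Nitro Enclaves, and building and running the enclave image itself is a separate step with the Nitro CLI.

    # Sketch: launch an EC2 parent instance with Nitro Enclaves enabled (boto3).
    # Requires: pip install boto3, plus AWS credentials configured.
    import boto3

    ec2 = boto3.client("ec2", region_name="eu-central-1")  # placeholder region

    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder AMI
        InstanceType="m5.xlarge",          # must be a Nitro Enclaves-capable type
        MinCount=1,
        MaxCount=1,
        EnclaveOptions={"Enabled": True},  # reserves resources for an enclave
        KeyName="my-keypair",              # placeholder
    )

    print(response["Instances"][0]["InstanceId"])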


Confidential Containers

If you have containerized workloads you would like to make confidential, there are now container services to support your needs. Confidential worker nodes facilitate encryption in use for data processed inside your Kubernetes clusters, and container workloads can be deployed the same way as on a standard cluster. Using the same techniques as Confidential VMs, data is encrypted in memory with node-specific, dedicated keys that are generated and managed by the processor. Keys are generated by the hardware during node creation and remain there, so they are unavailable to the host or cloud operator.
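
As a sketch of how little changes compared with a standard cluster, the example below creates a GKE cluster with Confidential Nodes enabled using the google-cloud-container Python client. The project, location, names and machine type are placeholders and the exact fields should be verified against the current API; Azure AKS and other managed Kubernetes services expose equivalent options.

    # Sketch: create a GKE cluster with Confidential Nodes (AMD SEV) enabled.
    # Requires: pip install google-cloud-container, plus application default credentials.
    from google.cloud import container_v1

    PROJECT_ID = "my-project"       # placeholder
    LOCATION = "europe-west6"       # placeholder

    cluster = container_v1.Cluster(
        name="confidential-cluster",
        initial_node_count=1,
        # Confidential Nodes require an AMD SEV-capable machine type such as N2D.
        node_config=container_v1.NodeConfig(machine_type="n2d-standard-4"),
        confidential_nodes=container_v1.ConfidentialNodes(enabled=True),
    )

    client = container_v1.ClusterManagerClient()
    operation = client.create_cluster(
        parent=f"projects/{PROJECT_ID}/locations/{LOCATION}",
        cluster=cluster,
    )
    print(operation.name)  # long-running operation; poll it for completion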

Confidential Data Collaboration

In addition to processing your own data privately, Confidential Computing services now offer capabilities for sharing and collaboration, where you can aggregate data from different sources while each source remains confidential. Organizations can perform joint data analysis or train machine learning models together in TEEs that guarantee data protection from all parties, including hardened protection against the cloud provider itself.

Other Confidential Cloud Services

More confidential services are emerging as the space grows, such as Confidential SQL Enclaves, Confidential Databricks, or Confidential Blockchain Ledgers. All these developments help clients pick the right level of protection for different parts of their solutions, based on their cost, performance, team, or regulatory needs. And often services can simply be spun up by opting into the confidential option.

Although choice is increasing and the technology is maturing, there are still important architectural considerations and trade-offs to be made that will impact choice – for example cost sensitivity, instance startup times, or that some confidential services may not integrate so easily with other standard cloud services. Use the flexibility of the cloud to experiment with your solution before committing.

Confidential Neural Computing

For AI and machine learning, where larger scale solutions are often required, similar techniques are evolving to support AI Confidential Computing, with cloud offerings to match, powered by Nvidia H100 GPUs or Intel AMX CPU acceleration.

Confidential Neural Computing offers a framework in which to conduct AI training and inference within secure enclaves, for use cases that require end-to-end privacy or private Generative AI. These solutions leverage powerful CPU acceleration and Confidential GPUs to ensure data is protected throughout the processing pipeline, from entering the GPU to results generation, whilst also providing the necessary performance required for deep learning and inference workloads.
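
Notably, the application code inside such a pipeline can stay standard. The sketch below is ordinary Hugging Face transformers inference; the confidentiality comes from running it inside an attested Confidential VM with a GPU in Confidential Computing mode, not from anything in the code itself. The model name is purely illustrative.

    # Sketch: ordinary inference code, unchanged when run inside a confidential
    # GPU environment. The protection comes from the attested CVM and GPU TEE,
    # not from the application code.
    # Requires: pip install torch transformers
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")  # illustrative small model
    prompt = "Summarise the key risks in this confidential report:"
    result = generator(prompt, max_new_tokens=50)
    print(result[0]["generated_text"])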

H100s that support Confidential Computing mode are, however, in high demand, so there are cost considerations for such architectures. Common emerging use cases are in mobile Generative AI, where it is not yet possible to conduct certain tasks on device due to the resource needs of LLMs, so processing needs to be done in the cloud. The ability to use TEEs and remote attestation to keep such data processing events private is an attractive proposition for many customers innovating in this space.

If you would like more information on data processing use cases such as fraud prevention, drug development, proprietary analytics, or even collaborating on multi-party AI/ML training, get in touch to see how we can help.

Meet our expert

Philip Wainhouse

Cloud Lead & Architect, Capgemini Switzerland
Hi, I’m Phil. Together with the cloud team here in Switzerland, we’re excited to hear how you would like to leverage the cloud for your business, whether you are just starting out or are already fully cloud native and experimenting with Serverless and Generative AI. Like many of my colleagues in our Cloud CoE, I’m an engineer and builder at heart, with over 15 years of hands-on experience building solutions in the cloud across Banking, Insurance, Retail and Automotive. Together we’re here to help you innovate, so get in touch and say hello.