In Ernest Hemingway’s novel “The Sun Also Rises,” there is a memorable exchange in which the character Mike Campbell is asked how he went bankrupt. His answer is concise yet profound: “Two ways. Gradually, then suddenly.”

Innovation happens in much the same way. Gradually, then suddenly. Ideas simmer and evolve, gaining traction until they reach a tipping point. When that moment comes, innovation bursts forth, capturing the attention and imagination of all. What was once gradual, incremental change becomes a game-changer, transforming the landscape in unforeseen ways.

This is the story of confidential computing, and today, we transport ourselves back in time to explore its origins. Think of it as detective work, peeling back the layers of history to discover those who paved the way for cutting-edge concepts. And what better place to start than with the visionary minds who penned the very first paper on privacy homomorphisms, “On Data Banks and Privacy Homomorphisms,” published in October 1978 by Ronald L. Rivest, Len Adleman, and Michael L. Dertouzos.


The limitations of conventional cryptography

The thought-provoking article begins by acknowledging encryption as a well-known technique for safeguarding the privacy of confidential information. However, conventional encryption schemes, such as AES (Advanced Encryption Standard), pose a limitation: data typically has to be decrypted before any operations can be performed on it. The authors suggest that this limitation arises from the encryption functions used, and hypothesise the existence of encryption functions, called “privacy homomorphisms”, that allow operations on encrypted data without prior decryption. In doing so, the authors planted the seed of the field of privacy-preserving computation, and in the process also “stumbled upon” the concept of confidential computing.

To better explain the problem statement, the authors introduce us to the example of a small bank which uses a commercial time-sharing service to store its customers’ data. For today’s reader, the public cloud can be thought of as the equivalent of the paper’s time-sharing service, and we will refer to it as such for the rest of this blog. 

Because this customer data is security-sensitive, the bank takes the sensible approach of storing it encrypted on the public cloud servers. However, ciphertext cannot provide any meaningful insights or information; it only protects the data while it sits in secondary storage. All computation and analysis tasks require the data to be decrypted beforehand. Therefore, in order to answer any of its business questions, even simple ones such as “what is the average loan I approved for customers last year?”, the bank needs to first decrypt the data. This is problematic, because the data’s confidentiality and integrity can then be compromised by the thousands of lines of code that make up the cloud’s system software (host OS, hypervisor, firmware), by its DMA-capable devices, or by its potentially malicious operators.
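To make this concrete, here is a minimal Python sketch of the decrypt-then-compute workflow the authors are worried about. It uses the third-party cryptography package, and the loan figures are made up for illustration:

```python
# Conventional encryption protects data at rest, but not during computation:
# to answer even a simple query, every record must be decrypted first.
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()   # held by the bank
cipher = Fernet(key)

# The bank uploads encrypted loan amounts to the cloud.
loans = [12_000, 8_500, 15_000, 9_750]                    # illustrative figures
stored = [cipher.encrypt(str(m).encode()) for m in loans]

# To answer "what is the average loan I approved last year?", the ciphertexts
# must be decrypted first -- at which point the plaintext is exposed to the
# host OS, the hypervisor, DMA-capable devices and the cloud's operators.
decrypted = [int(cipher.decrypt(c)) for c in stored]
print(sum(decrypted) / len(decrypted))                    # 11312.5
```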

In order to resolve this tension between computing on the data and preserving its confidentiality, the authors discuss several approaches. Let’s explore them.  

Is private cloud the solution?

One option is for the bank to abandon the public cloud service and establish its own private cloud deployment. By doing so, it can have full control over the data and perform computation directly on the cleartext within its own premises. While such an approach does provide better data governance, it does not solve the fundamental problem at hand. After all, the private data centre still needs to run third-party privileged system software that can be malicious or simply contain security vulnerabilities, and it would still need to address insider threats from within the organisation.

Homomorphic cryptographic schemes

The authors then propose the most intuitive solution one can think of: if decryption is the problem, why not look for an encryption scheme that allows computation to be performed on the ciphertext itself? Data could then remain encrypted throughout the computation process, thereby preserving privacy. The authors introduce the notion of an algebraic system and hypothesise the likely existence of encryption functions, termed privacy homomorphisms, that preserve the algebraic properties of the underlying data.
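In modern notation, the property they were after can be written as E(a ⊕ b) = E(a) ⊗ E(b) for all plaintexts a and b, where ⊕ is an operation on plaintexts and ⊗ is a corresponding, publicly computable operation on ciphertexts. The cloud evaluates ⊗ directly on ciphertexts and returns an encryption of the result, without ever decrypting the operands. (The notation here is ours, a sketch of the idea rather than the paper’s exact formalism.)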

They then present several sample privacy homomorphism constructions, highlighting that while some may be weak from a cryptographic perspective, they serve as illustrations of the potential existence of useful privacy homomorphisms. These examples cover different algebraic systems, such as integers modulo a prime, integers modulo a composite number, and radix representations.
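As a familiar modern illustration of the same principle (offered here as an aside rather than a reproduction of the paper’s own constructions), textbook RSA without padding is multiplicatively homomorphic. A toy Python sketch, with deliberately tiny and insecure parameters:

```python
# Textbook (unpadded) RSA is multiplicatively homomorphic:
# E(a) * E(b) mod n = (a^e * b^e) mod n = (a*b)^e mod n = E(a*b).
# Toy parameters only -- unpadded RSA must never be used in practice.
p, q = 61, 53                          # tiny primes for illustration
n = p * q                              # 3233
e = 17                                 # public exponent, coprime to (p-1)*(q-1)
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent

def enc(m: int) -> int: return pow(m, e, n)
def dec(c: int) -> int: return pow(c, d, n)

a, b = 7, 11
product_under_encryption = enc(a) * enc(b) % n   # computed without decrypting
assert dec(product_under_encryption) == (a * b) % n
print(dec(product_under_encryption))             # 77
```

Multiplying two ciphertexts yields a valid ciphertext of the product of the plaintexts, which is exactly the kind of property the authors hypothesised, albeit for a single operation only.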

Their hypothesis was right. Homomorphic encryption schemes do exist, and through a long, gradual process of innovation and iteration, the world was introduced in 2009 to the first fully homomorphic encryption (FHE) scheme, presented by Craig Gentry in his paper titled “A Fully Homomorphic Encryption Scheme”. The scheme is based on computationally hard problems over ideal lattices.

Despite this innovation, FHE could not be used immediately in practice, because it was extremely slow. In fact, it initially took around 30 minutes to evaluate a single logic gate on standard x86 hardware. Unfortunately, many people are still under the impression that FHE is not practical today. This could not be further from the truth. Today, homomorphically encrypted searches can be performed over hundreds of millions of data records and returned within seconds rather than days or even weeks. And this is just the beginning. The biggest speedup will come from dedicated hardware acceleration. Several companies are currently working on this, targeting a 1,000x speedup for their first generation (planned for 2025) and up to 10,000x for their second.
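To give a concrete taste of computing on ciphertexts, the sketch below revisits the bank’s average-loan question using the Paillier cryptosystem from 1999. Paillier is additively homomorphic, a partially homomorphic stepping stone rather than FHE, and the parameters here are toy values chosen for readability, not security:

```python
# Toy Paillier demo: the cloud adds up encrypted loan amounts without ever
# decrypting them. Tiny, insecure parameters -- for illustration only.
import math
import secrets

# Key generation (real deployments use ~2048-bit primes).
p, q = 104_729, 104_723        # two small primes, chosen arbitrarily
n = p * q
n_sq = n * n
g = n + 1                      # standard simplified choice of generator
lam = math.lcm(p - 1, q - 1)   # Carmichael's lambda for n = p*q (Python 3.9+)
mu = pow(lam, -1, n)           # valid because g = n + 1

def encrypt(m: int) -> int:
    """E(m) = g^m * r^n mod n^2, for a random r coprime to n."""
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """m = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    return (pow(c, lam, n_sq) - 1) // n * mu % n

# The bank encrypts its loan amounts before uploading them.
loans = [12_000, 8_500, 15_000, 9_750]
ciphertexts = [encrypt(m) for m in loans]

# The cloud multiplies ciphertexts, which adds the plaintexts underneath.
encrypted_sum = 1
for c in ciphertexts:
    encrypted_sum = encrypted_sum * c % n_sq

# Only the bank, holding the private key, can read the result.
total = decrypt(encrypted_sum)
print(total == sum(loans))                    # True
print("average loan:", total / len(loans))    # 11312.5
```

The cloud only ever multiplies ciphertexts; the bank alone, holding the private key, decrypts the final total. FHE generalises this idea from a single supported operation to arbitrary computation.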

Confidential computing

The authors then go on to discuss another option open to the bank: convincing public cloud providers and silicon companies to modify their hardware so that data exists in decrypted form only temporarily, and only within the central processing unit (CPU). The hardware modification should also ensure that the data remains encrypted in system memory.

In the authors’ own words: “In addition to the standard register set and ALU (A,B), a physically secure register set and ALU (C,D) are added. All communication of data between main memory and the physically secure register set passes through an encoder-decoder (E) supplied with the user’s key, so that unencrypted data can exist only within the physically secure register set. All sensitive data in main memory, in the data bank files, in the ordinary register set, and on the communications channel will be encrypted. During operation, a load/store instruction between main memory and the secure register set will automatically cause the appropriate decryption/encryption operations to be performed.”

If this definition sounds familiar, it’s because it is. What the authors proposed back in 1978 is effectively confidential computing as we know it today. Making it a reality required decades of advances to take place in hardware and system design. 

On the hardware side, silicon providers have been investing considerable resources into maturing their offerings. Just to cite a few, we have Intel SGX, Intel TDX, and AMD SEV on the x86 architecture; TrustZone and the upcoming ARM CCA for the ARM ecosystem; Keystone for RISC-V; and Nvidia H100 for GPUs.

Ubuntu at the heart of confidential computing

Enabling widespread adoption of confidential computing is a collaborative endeavour involving multiple stakeholders in the industry. Among them, public cloud providers (PCPs) have emerged as key proponents of hardware trusted execution environments (TEEs). Their focus has been on facilitating a seamless migration process by adopting a “lift and shift” approach, allowing entire virtual machines (VMs) to run unchanged within the TEE.

This approach offers significant advantages. Developers can avoid the need to refactor or rewrite their applications, streamlining the transition to confidential workloads. However, it does require guest operating systems to be optimised to fully support user applications and leverage the underlying hardware TEE capabilities. These optimisations also enhance VM security during boot-up and at rest. This is exactly what Canonical has been working on.


Ubuntu confidential VMs leading confidential computing in the public cloud

Thanks to close collaboration with the major cloud providers, Ubuntu was the first Linux operating system to support both AMD SEV and Intel TDX in the public cloud. Today, it only takes a few clicks to start using Ubuntu confidential VMs on Azure, AWS, and Google Cloud.

Looking ahead, Ubuntu is committed to bringing innovation across all layers of the confidential computing ecosystem, from confidential VMs to confidential containers and beyond.

Ubuntu’s appeal as a natural choice for building and deploying confidential VMs is further strengthened by the availability of Ubuntu Confidential VMs at no extra cost on major cloud providers. 

The path forward

As the history of confidential computing shows, innovation is a testament to perseverance, patience, and continuous improvement. It is not always an overnight success, but a persistent voyage of growth and exploration.

At Canonical, we remain committed to our ideas, recognising that breakthroughs can manifest when the time is right. For confidential computing, that time is now.

Ubuntu’s confidential computing portfolio is shaping a future where privacy and computation coexist harmoniously. We invite you to step into that future with us, today.

Learn more about Ubuntu security

If you would like to know more about the Canonical approach to security at large, contact us. 

Additional resources

  • Ubuntu Pro | product page
  • Ubuntu Pro 20.04 on the Microsoft Azure Marketplace
  • Watch our webinar to learn more about confidential computing
  • Read our blog post for “What is confidential computing? A high-level explanation for CISOs”
  • Read our blog post for “Confidential computing in public clouds: isolation and remote attestation explained”
  • Start creating and using Ubuntu CVMs on Azure
  • Is Linux Secure?
