Cryptography, at a fundamental level, is the science of keeping secrets.

As a child, you may have used secret messages or languages to communicate with friends or siblings, and you have likely noticed cryptography at work across our society – maintaining the confidentiality of personal, consumer, corporate, and government data. Beyond this, cryptography’s status as an indispensable building block of digital infrastructure continues to grow with the perpetual increase in online connectivity – securing online transactions, authentication, and access to resources.

Cryptographic systems are often built on the premise that certain math problems are computationally very hard to solve. Many of these problems, such as factoring certain types of large numbers, have been studied by mathematicians for anywhere from decades to centuries. In fact, mathematicians often estimate the projected security of such systems by plotting the evolution in ‘running time’ of the best-known attacks. These predictions work well, but only in the absence of major disruptions – new algorithms or technologies that drastically improve the running time of attacks.

Over the last two decades the field of cryptography has seen a number of trends, including the potential to link encryption and decryption capabilities to a person’s attributes; efficiency gains in areas such as Secure Multi-Party Computation (MPC), which allows multiple parties to interact on confidential data sets; new solutions for homomorphic encryption using lattice-based cryptography (which we will explore below); and, now, the threat and promise of quantum computers pushing the field to develop post-quantum cryptographic systems.

An emerging threat: the Quantum Computer

Many researchers, industrial labs, and governments are actively working to develop a quantum computer that can handle large-scale computation, such as the work at Station Q. While classical computers – phones, tablets, laptops, servers, and so on – store and process information in the form of bits (strings of zeros and ones), quantum computers process quantum bits, or qubits: two-state quantum-mechanical systems. Unlike a ‘regular’ bit, a qubit can hold a value of one and zero simultaneously. The theoretical advantage is that a machine built from qubits can, in effect, explore many computational paths simultaneously – dramatically shortening computation times and permitting calculation of significantly more complex processes.

Small-scale quantum computers already exist, and estimates vary as to how many years it will take before researchers and engineers succeed in building a quantum computer that can handle computations involving thousands of qubits.  However, when that day does inevitably come, the consequences for the world’s e-commerce and security infrastructure will be enormous.

In 1994, Shor introduced a quantum algorithm which can factor large integers in polynomial time, provided it runs on a quantum computer that can accurately process those computations on a large enough number of qubits. A variant of this idea will also allow polynomial-time quantum attacks on all of the other currently widely deployed public-key cryptosystems used in industry and government today.

For reference, polynomial time is a category of complexity, used to place different algorithms into classes of computational complexity based on how long they take to run. ‘Big O notation’ is a way to express this computational complexity. Algorithms running in polynomial time – which is considered fast – have running times that can be written as O(nᵃ), where ‘n’ is the number of bits needed for the input and ‘a’ is a fixed constant.
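To see why polynomial time is considered fast, a small sketch (with hypothetical step counts) can compare how an O(n²) algorithm and an O(2ⁿ) algorithm scale as the input size n grows:

```python
# Illustrative comparison of polynomial vs. exponential running time.
# The step counts are hypothetical; the point is how quickly they diverge.

def polynomial_steps(n, a=2):
    """Steps taken by an O(n^a) algorithm on an n-bit input."""
    return n ** a

def exponential_steps(n):
    """Steps taken by an O(2^n) algorithm, e.g. brute-force key search."""
    return 2 ** n

for n in (10, 20, 40):
    print(n, polynomial_steps(n), exponential_steps(n))
```

At n = 40 the polynomial algorithm takes 1,600 steps while the exponential one takes over a trillion – which is why an attack running in polynomial time, classical or quantum, breaks a cryptosystem in practice.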

Post-Quantum Cryptography

In 2017, the National Institute of Standards and Technology (NIST) in the U.S. launched an international multi-year Post-Quantum Cryptography (PQC) competition to select cryptographic systems for the future.  A post-quantum cryptosystem is one which is not known to be breakable in polynomial time (described above) by a full-scale quantum computer.

There are currently five main proposals for post-quantum systems using hard math problems, each of which has been studied for a decade or more:

Code-based systems based on the difficulty of decoding random linear codes

An example of a linear expression in one variable is: 10 + 3x. Linear codes consist of codewords, but instead of simple components such as ‘a’ or ‘b’, each component of a codeword is a multivariable linear expression of the input. Given a vector in which some of the entries are wrong (there is an ‘error’), the decoding challenge is to find the closest codeword.
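A toy illustration of decoding with errors (not a real code-based cryptosystem) is the three-fold repetition code, where the closest codeword can be recovered by majority vote:

```python
# Toy illustration of nearest-codeword decoding, using a 3-fold repetition
# code. Real code-based cryptosystems use random linear codes in thousands
# of dimensions, where this decoding problem is believed to be hard.

def encode(bits):
    """Repeat each bit three times: [1, 0, 1] -> [1,1,1, 0,0,0, 1,1,1]."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each block of three finds the closest codeword."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

codeword = encode([1, 0, 1])
codeword[4] = 1                  # introduce a single-bit error
assert decode(codeword) == [1, 0, 1]
```

With one error per block, majority voting always recovers the message; in the high-dimensional random codes used for cryptography, no comparably efficient decoder is known.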

Multi-variate cryptosystems based on the difficulty of solving systems of many non-linear equations in many variables

An example of a non-linear equation in several variables is: y = 2x³z + 3x²w + z²w². Current proposals are based on scores of equations in scores of variables.
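The asymmetry that makes such systems useful can be sketched with that example equation and hypothetical small values: checking a proposed solution is fast, while finding one generally requires searching the variable space.

```python
from itertools import product

# Toy sketch using the article's example polynomial with hypothetical
# values. Real multivariate schemes use scores of equations in scores of
# variables over a small finite field.

def f(x, z, w):
    return 2 * x**3 * z + 3 * x**2 * w + z**2 * w**2

# Verifying a candidate solution is fast...
assert f(1, 2, 1) == 11

# ...but finding one generally requires search through the variable space.
solutions = [(x, z, w) for x, z, w in product(range(4), repeat=3)
             if f(x, z, w) == 11]
print(solutions)
```

This exhaustive search is feasible for three variables over a tiny range, but its cost grows exponentially with the number of variables.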

Lattice-based systems based on the hardness of finding short vectors in lattices

Lattices are regularly-spaced grids of points stretching out to infinity. Think of a polka dot shirt for a basic example in two dimensions, but typical systems might use lattices in thousands of dimensions. A lattice is typically described in terms of a set of basis vectors (objects with both a magnitude and a direction) whose integer combinations generate the whole lattice. However, it can be exponentially hard to find a short combination of those basis vectors.
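A minimal sketch, with a hypothetical two-dimensional basis, shows the phenomenon: the basis vectors themselves are long, yet the lattice they generate contains a very short vector, which brute-force search can find in two dimensions but not in thousands.

```python
from itertools import product

# Toy 2-D lattice generated by a deliberately "skewed" basis (hypothetical
# numbers). Both basis vectors are long, yet some integer combination of
# them is very short. Brute force works in 2 dimensions; in thousands of
# dimensions, finding a short vector is believed to be exponentially hard.
b1 = (87, 56)
b2 = (31, 20)

def length_sq(v):
    return v[0] ** 2 + v[1] ** 2

candidates = (
    (a * b1[0] + b * b2[0], a * b1[1] + b * b2[1])
    for a, b in product(range(-20, 21), repeat=2)
    if (a, b) != (0, 0)
)
shortest = min(candidates, key=length_sq)
print(shortest, length_sq(shortest))
```

Here the combination 5·b1 − 14·b2 collapses to the vector (1, 0), far shorter than either basis vector – exactly the kind of ‘short combination’ that lattice-based systems rely on being hard to find in high dimensions.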

Systems based on Merkle-hash trees

These are signature schemes built from hash functions with certain properties. Merkle hash trees are also a fundamental part of blockchain technology, where their hash-based architecture allows large data structures to be validated at scale.
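The core data structure can be sketched in a few lines: leaves are hashed, then hashed together pairwise up the tree until a single root remains. This is an illustration of the tree itself, not of a full hash-based signature scheme.

```python
import hashlib

# Minimal sketch of a Merkle hash tree. The root commits to all the leaves:
# changing any leaf changes the root.

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Hash leaves, then combine pairwise up the tree to a single root."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node if odd count
            level.append(level[-1])
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"tx1", b"tx2", b"tx3", b"tx4"])
print(root.hex())
```

Because the root depends on every leaf, publishing just the 32-byte root commits to an arbitrarily large data set – the property both hash-based signatures and blockchains exploit.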

Supersingular Isogeny Graph (SIG) systems that are based on the difficulty of finding paths between two points in a large, seemingly random, graph (see the example below).

The newest of these problems – the hardness of finding paths in SIGs – was introduced as a hard problem into cryptography in a paper presented at the NIST hash function contest in 2005. Here is a picture of a very small SIG, which appeared in Science magazine in 2008.

Image of a basic Supersingular Isogeny Graph System

To get a feel for the hard problem underlying this proposal, pick two random points in the graph and try to find a path between them. Then, try to imagine a graph which is 2²⁵⁰ (roughly 1.8 × 10⁷⁵) times bigger than this one.

For more information on each of the proposed approaches, see the IEEE Security and Privacy magazine award-winning issue on Post-Quantum Cryptography which has short articles on each proposed candidate.

Homomorphic Encryption

One of the advantages of the lattice-based cryptosystems mentioned earlier is that they can be used to build systems that securely and privately compute on encrypted data. The term homomorphic encryption may seem esoteric, but it just means encryption which you can compute on. Standard cryptosystems in use today don’t have this property: if you try to add together two ciphertexts (encrypted texts) encrypted under the Advanced Encryption Standard (AES), the result will be gibberish. Homomorphic encryption differentiates itself by preserving the structure of the data: it encodes data in a mathematical object, and then encrypts it in a way that doesn’t destroy the contained information.
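A minimal sketch of what ‘computing on ciphertexts’ means uses textbook RSA with tiny, insecure toy parameters (for illustration only, and not a lattice-based scheme): multiplying two ciphertexts yields a valid encryption of the product of the plaintexts.

```python
# Toy demonstration of a homomorphic property using textbook RSA with
# deliberately tiny, insecure parameters. Lattice-based homomorphic schemes
# support both addition and multiplication on encrypted data in this spirit.

n, e, d = 3233, 17, 2753      # hypothetical toy key: n = 61 * 53

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

c1, c2 = encrypt(7), encrypt(6)
c_product = (c1 * c2) % n     # compute on ciphertexts, never decrypting
assert decrypt(c_product) == 7 * 6   # decrypts to 42
```

The party holding only the ciphertexts performed a meaningful computation without ever seeing the plaintexts – the same principle that lets a cloud service compute on encrypted client data.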

Image depicting the basic mathematical process that homomorphic encryption follows

Homomorphic encryption is important because it allows consumers and enterprises to store their sensitive data in the cloud in encrypted form. The cloud can compute on the data in its encrypted form and then return encrypted predictions or encrypted analytics, all without ever decrypting the data or learning anything about the client’s sensitive information. A recently formed consortium of researchers from industry, government, and academia is focusing on standardizing this new cryptographic primitive. My group at Microsoft Research has been working hard for more than seven years on homomorphic encryption, demonstrating practical applications for cloud privacy and security, and in 2015 publicly released SEAL (Simple Encrypted Arithmetic Library).

The importance of research in mathematics, computer science and engineering

The engineers and researchers who design, build, and analyse secure systems are indispensable in today’s society. By highlighting and honouring the work of engineers, the Queen Elizabeth Prize helps to encourage future generations to study and excel in engineering, and I find it particularly inspiring that the prize carries a woman’s name. In cryptography, there are copious examples of women doing great work, and we can also look back to the code-breaking women of Bletchley Park during World War II, who helped allied forces to win the war.  I hope the QEPrize will encourage many young girls to study mathematics, computer science, or engineering, and perhaps to even build the next generation of cryptographic systems that ensure our privacy and security in the future.


Kristin Lauter


Principal Researcher, Research Manager at Microsoft
Kristin Lauter is a Principal Researcher and Research Manager for the Cryptography group at Microsoft Research. Her research areas are number theory and algebraic geometry, with applications to cryptography. She served as President of the Association for Women in Mathematics from 2015 to 2017. She is a Fellow of the American Mathematical Society and the Association for Women in Mathematics, and the Pólya Lecturer for the Mathematical Association of America for 2018–2020. In 2008, Lauter and her coauthors were awarded the Selfridge Prize in Computational Number Theory.