Tuesday, September 19, 2017

Master's Thesis Paper on Post-Quantum Algorithms for Digital Signing in PKI Available Now

Mikael Sjöberg has been with us here at PrimeKey this spring, exploring different algorithms suitable for use in PKI even after quantum computers become a reality.

As Mikael's mentor, I am very pleased with his work and that he chose to do the study at PrimeKey.

The final version of the master's thesis has now been approved. Read the abstract below or download the full report as a PDF.

Markus Kilås
Product Owner SignServer

Post-quantum algorithms for digital signing in Public Key Infrastructures

One emerging threat to Public Key Infrastructures is the possible development of large-scale quantum computers, which would be able to break the public-key cryptosystems used today. Several possibly post-quantum secure cryptographic algorithms have been proposed, but so far they have not been used in many practical settings. The purpose of this thesis was to find post-quantum digital signature algorithms that might be suitable for use in Public Key Infrastructures today.

To answer the research question, an extensive literature study was conducted in which relevant algorithms were surveyed. Algorithms with high-grade implementations in different cryptographic libraries were benchmarked for performance. Hash-based XMSS and SPHINCS, multivariate-based Rainbow and lattice-based BLISS-B were benchmarked, and the results showed that BLISS-B offered the best performance, on par with RSA and ECDSA. All the algorithms, however, had relatively large signature sizes and/or key sizes.
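As background to the size trade-off mentioned above: hash-based schemes such as XMSS and SPHINCS are built from one-time signatures, and the sizes involved are easy to see in the classic Lamport one-time signature, their conceptual ancestor. The sketch below is purely illustrative and is not taken from the thesis; it uses only SHA-256 from the Python standard library. Note that a Lamport key pair must never sign more than one message.

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    """SHA-256, the one-way function the whole scheme rests on."""
    return hashlib.sha256(data).digest()

def keygen():
    # One pair of 32-byte secrets per bit of the message digest (256 bits).
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    # The public key is the hash of every secret: 256 * 2 * 32 B = 16 KiB.
    pk = [(H(s0), H(s1)) for s0, s1 in sk]
    return sk, pk

def _digest_bits(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(msg: bytes, sk):
    # Reveal one secret per digest bit: 256 * 32 B = 8 KiB per signature.
    return [sk[i][bit] for i, bit in enumerate(_digest_bits(msg))]

def verify(msg: bytes, sig, pk) -> bool:
    # Each revealed secret must hash to the matching public-key entry.
    return all(H(sig[i]) == pk[i][bit]
               for i, bit in enumerate(_digest_bits(msg)))
```

Even this minimal scheme needs an 8 KiB signature and a 16 KiB private key for a single use; XMSS and SPHINCS layer Merkle trees and more compact one-time schemes (such as WOTS+) on top to make keys reusable, which is where their remaining size and performance overhead comes from.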

Support for post-quantum digital signature algorithms in Public Key Infrastructure products could easily be achieved, since many algorithms are implemented in cryptographic libraries. The algorithms that could be recommended for use today were SPHINCS for high-security applications and possibly BLISS-B for lower-security applications requiring higher efficiency. The biggest obstacles to widespread deployment of post-quantum algorithms were deemed to be lack of standardisation and either inefficient operations compared to classical algorithms, uncertain security levels, or both.

Full report (PDF)