Other PQC Families & Broken Schemes
Overview
The NIST Post-Quantum Cryptography standardization process selected algorithms from two primary mathematical families: lattice-based cryptography (ML-KEM, ML-DSA, FN-DSA) and hash-based signatures (SLH-DSA), with code-based cryptography (HQC) added as a second KEM. But the full landscape of post-quantum cryptography is far richer — and far more treacherous — than the winners alone suggest. Multivariate polynomial schemes, isogeny-based protocols, symmetric-key constructions leveraging zero-knowledge proofs, and various hybrid approaches all competed for standardization. Some showed extraordinary promise. Some were spectacularly broken.
This page examines the PQC families that did not make it into the primary standards, the ones still under consideration in NIST’s additional signatures round, and the schemes whose catastrophic failures taught the cryptographic community hard lessons about the gap between theoretical elegance and cryptanalytic reality. For background on the NIST standardization process itself, see NIST PQC Standardization Process. For the lattice-based foundations underlying the winning algorithms, see Mathematical Foundations.
1. Multivariate Cryptography
1.1 The Multivariate Quadratic (MQ) Problem
Multivariate cryptography derives its security from the difficulty of solving systems of multivariate quadratic polynomial equations over finite fields. The core hard problem — known as the MQ problem — is stated as follows:
Given a system of m quadratic polynomials p₁, p₂, …, pₘ in n variables x₁, x₂, …, xₙ over a finite field F_q, find a vector (x₁, x₂, …, xₙ) ∈ F_qⁿ such that p₁(x) = p₂(x) = … = pₘ(x) = 0.
The MQ problem is NP-hard in the general case, and — critically — there is no known quantum speedup that fundamentally changes its complexity class. Grover’s algorithm provides at most a quadratic speedup for brute-force search, but the structured nature of the MQ problem means that classical algebraic attacks (Gröbner basis methods like F4/F5, XL algorithm, and their variants) are typically the relevant threat, not quantum algorithms.
This makes multivariate cryptography genuinely “post-quantum” in the strongest sense: the hardness does not depend on problems that Shor’s algorithm can solve. Even a universal fault-tolerant quantum computer would gain only Grover’s quadratic speedup against MQ, which is easily countered by modestly increasing parameters.
However, “NP-hard in the general case” does not mean that every specific instance is hard. The challenge in multivariate cryptography has always been constructing specific families of polynomial systems that are both (a) hard to solve without the trapdoor and (b) efficiently invertible with the trapdoor. Random MQ instances are hard, but a cryptographic scheme cannot use truly random systems — it needs structure, and structure can be exploited.
The general form of a multivariate quadratic polynomial over F_q is:
p_k(x₁, ..., xₙ) = Σᵢ≤ⱼ αᵢⱼ⁽ᵏ⁾ xᵢxⱼ + Σᵢ βᵢ⁽ᵏ⁾ xᵢ + γ⁽ᵏ⁾
A public key in a multivariate scheme is a set of such polynomials. The challenge is constructing a system where:
- The public polynomials look random (and are therefore hard to solve)
- The designer knows a secret structure (a trapdoor) that allows efficient inversion
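To make the search hardness concrete, the toy sketch below (hypothetical parameters, chosen only for illustration) generates a random quadratic system over GF(2) and solves it the only generic way: by trying all 2ⁿ assignments. Real schemes put n far beyond the reach of this search.

```python
import itertools
import random

random.seed(1)
n, m = 8, 8  # variables, equations (toy sizes; real schemes use n > 100)

def random_quadratic(n):
    # A random quadratic polynomial over GF(2): quadratic coefficients
    # a[(i, j)] for i <= j, linear coefficients b[i], constant c
    a = {(i, j): random.randint(0, 1) for i in range(n) for j in range(i, n)}
    b = [random.randint(0, 1) for _ in range(n)]
    c = random.randint(0, 1)
    return a, b, c

def evaluate(poly, x):
    a, b, c = poly
    val = c
    for (i, j), coef in a.items():
        val ^= coef & x[i] & x[j]   # addition over GF(2) is XOR
    for i in range(n):
        val ^= b[i] & x[i]
    return val

system = [random_quadratic(n) for _ in range(m)]

# Exhaustive search over all 2^n assignments: exponential in n,
# which is exactly why real parameters make this infeasible.
solutions = [x for x in itertools.product((0, 1), repeat=n)
             if all(evaluate(p, x) == 0 for p in system)]
print(f"{len(solutions)} solution(s) among 2^{n} = {2**n} candidates")
```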
1.2 Oil and Vinegar: The Core Construction
The Oil and Vinegar scheme, proposed by Patarin in 1997, introduced one of the most influential trapdoor constructions in multivariate cryptography.
The concept: Divide the n variables into two sets:
- Vinegar variables (v₁, v₂, …, vᵥ): These can interact with everything — they appear in all cross-terms
- Oil variables (o₁, o₂, …, oₒ): These are “immiscible” — they never multiply each other (no oᵢoⱼ terms appear)
The central quadratic map has the form:
f_k(o, v) = Σᵢ,ⱼ αᵢⱼ⁽ᵏ⁾ oᵢvⱼ + Σᵢ≤ⱼ βᵢⱼ⁽ᵏ⁾ vᵢvⱼ + linear terms
Notice what is missing: there are no oᵢoⱼ terms. This structural restriction is the trapdoor. When you fix the vinegar variables to random values, the system becomes linear in the oil variables and can be solved efficiently by Gaussian elimination.
Signing works as follows:
- Given a message hash (y₁, …, yₘ), choose random values for the vinegar variables
- The system is now linear in the oil variables — solve it
- Apply the secret invertible linear maps to recover the signature
Verification is simple and fast: plug the signature vector into the public polynomials and check that the output matches the message hash. This asymmetry — signing requires knowledge of the secret structure, verification only requires evaluating public polynomials — is characteristic of multivariate schemes and gives them excellent verification performance.
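The signing flow above can be sketched concretely. The toy Python below (illustrative parameters only; real UOV works over larger fields and hides the central map behind a secret affine transformation, which this sketch omits) implements just the central map: fixing the vinegar variables makes every equation linear in the oil variables, solvable by Gaussian elimination mod q.

```python
import random

random.seed(7)
q, o, v = 31, 3, 6   # toy sizes; real UOV uses far larger parameters

# Central map only: oil*vinegar and vinegar*vinegar terms, NO oil*oil terms.
A = [[[random.randrange(q) for _ in range(v)] for _ in range(o)] for _ in range(o)]
B = [[[random.randrange(q) for _ in range(v)] for _ in range(v)] for _ in range(o)]

def central_map(oil, vin):
    out = []
    for k in range(o):
        s = sum(A[k][i][j] * oil[i] * vin[j] for i in range(o) for j in range(v))
        s += sum(B[k][i][j] * vin[i] * vin[j] for i in range(v) for j in range(v))
        out.append(s % q)
    return out

def solve_mod_q(M, y):
    # Gauss-Jordan elimination over GF(q); returns None if M is singular
    n = len(M)
    M = [row[:] + [y[i]] for i, row in enumerate(M)]
    for col in range(n):
        piv = next((r for r in range(col, n) if M[r][col] % q), None)
        if piv is None:
            return None
        M[col], M[piv] = M[piv], M[col]
        inv = pow(M[col][col], q - 2, q)      # inverse via Fermat's little theorem
        M[col] = [x * inv % q for x in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(x - f * M[col][c]) % q for c, x in enumerate(M[r])]
    return [M[i][n] for i in range(n)]

def sign(target):
    while True:
        vin = [random.randrange(q) for _ in range(v)]   # random vinegar values
        # With vinegar fixed, each equation is LINEAR in the oil variables
        M = [[sum(A[k][i][j] * vin[j] for j in range(v)) % q for i in range(o)]
             for k in range(o)]
        const = [sum(B[k][i][j] * vin[i] * vin[j]
                     for i in range(v) for j in range(v)) % q for k in range(o)]
        oil = solve_mod_q(M, [(target[k] - const[k]) % q for k in range(o)])
        if oil is not None:                   # singular system: retry vinegar
            return oil, vin

digest = [5, 17, 22]          # stand-in for a hashed message
oil, vin = sign(digest)
assert central_map(oil, vin) == digest
print("signature verifies:", oil, vin)
```

Verification here is just re-evaluating the quadratic map, mirroring the asymmetry described above: only the signer needs the oil/vinegar partition.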
The original “balanced” Oil and Vinegar (equal numbers of oil and vinegar variables) was broken by Kipnis and Shamir in 1998 via an attack that could separate the oil and vinegar variables. The fix was straightforward: use significantly more vinegar variables than oil variables.
1.3 Unbalanced Oil and Vinegar (UOV)
Unbalanced Oil and Vinegar (UOV), proposed by Kipnis, Patarin, and Goubin in 1999, sets v >> o (typically v ≈ 2o or more). This imbalance defeats the Kipnis-Shamir separation attack because the attacker cannot efficiently distinguish oil subspaces when vinegar variables dominate.
UOV’s security rests on:
- The difficulty of the MQ problem for the public key polynomials
- The inability to recover the oil-vinegar structure from the public key
UOV has survived over 25 years of cryptanalysis — an exceptional track record in post-quantum cryptography and one of the longest unbroken runs of any post-quantum scheme aside from McEliece (1978). Its main drawback has always been large public key sizes. For 128-bit security, a UOV public key is typically in the range of 66–150 KB, depending on parameter choices. This is orders of magnitude larger than lattice-based alternatives (ML-DSA-44 public keys are 1.3 KB) and represents a genuine deployment barrier for bandwidth-constrained applications.
Despite this, UOV’s cryptanalytic maturity has given it renewed relevance. It is a candidate in NIST’s additional digital signatures round, and several optimized variants (MAYO, which compresses UOV keys using a “whipping” technique, and VOX) are under active evaluation.
1.4 Rainbow: Rise and Fall
Rainbow, designed by Jintai Ding and Dieter Schmidt in 2005, was a multi-layered generalization of UOV. Instead of a single oil-vinegar partition, Rainbow used a layered structure with multiple “levels” of variables:
Variables: V₁ ⊂ V₂ ⊂ ... ⊂ Vₗ₊₁
Layer i uses:
- Vinegar variables from V_i
- Oil variables from V_{i+1} \ V_i
This layered approach allowed Rainbow to achieve dramatically smaller signatures and faster operations compared to plain UOV, at the cost of additional algebraic structure in the public key. For the parameter sets submitted to NIST:
| Parameter | Rainbow I (SL 1) | Rainbow III (SL 3) | Rainbow V (SL 5) |
|---|---|---|---|
| Public key | 161.6 KB | 861.4 KB | 1,885.4 KB |
| Signature | 66 bytes | 164 bytes | 204 bytes |
| Signing time | Fast | Fast | Fast |
Those signature sizes were extraordinary — comparable to classical ECDSA. Rainbow advanced to the Round 3 finalists in the NIST competition, and there was genuine excitement about having a non-lattice-based signature standard with compact signatures.
The Beullens Attack (February 2022)
In February 2022, Ward Beullens published “Breaking Rainbow Takes a Weekend on a Laptop,” and Rainbow’s world collapsed.
Beullens’ attack combined two key insights:
- Rectangular MinRank attack: The layered structure of Rainbow creates a specific algebraic relationship that can be exploited through the MinRank problem — finding a low-rank matrix in a linear span of matrices. Beullens showed that the intersection of the oil spaces across Rainbow’s layers leaked exploitable information.
- Improved solving techniques: By combining the structural weakness with refined Gröbner basis techniques and the “simple attack” (which directly targets the rectangular structure), Beullens demonstrated a practical key recovery.
The results were devastating:
| Security Level | Claimed Security | Beullens Attack Cost | Time (estimated) |
|---|---|---|---|
| Rainbow I (SL 1) | 2¹²⁸ | 2⁵³ | ~1 weekend on a laptop |
| Rainbow III (SL 3) | 2¹⁹² | 2⁷⁵ | Feasible for well-resourced attacker |
| Rainbow V (SL 5) | 2²⁵⁶ | 2⁹⁷ | Reachable with moderate compute cluster |
Rainbow was eliminated from the NIST competition in July 2022. The scheme went from finalist to dead in five months.
What Went Wrong with Rainbow
The fundamental lesson is that Rainbow’s layered structure — the very thing that made its signatures small — introduced exploitable algebraic relationships that plain UOV does not have. The multi-layer construction created dependencies between the layers’ oil spaces that could be detected and leveraged.
This is a recurring pattern in cryptographic design: optimizations that add structure can create attack surfaces. UOV’s “boring” single-layer approach, despite its larger keys, has no such vulnerability.
The Rainbow break also illustrates a subtlety about security proofs. Rainbow had a security proof — but the proof reduced to the hardness of the MQ problem for the public key polynomials as a whole. It did not account for the possibility that the layered structure might leak information about the secret key independently of solving the public system. The MinRank attack exploited structure that the security proof did not model. This is why cryptographers distinguish between “provable security” (which is always relative to a model) and “actual security” (which must withstand attacks the model did not anticipate).
1.5 GeMSS: A Different Multivariate Approach
GeMSS (Great Multivariate Short Signature) was a NIST Round 2 candidate that took a fundamentally different approach within the multivariate family. Rather than Oil-and-Vinegar, GeMSS was based on the HFE (Hidden Field Equations) construction:
- Define a univariate polynomial of bounded degree over an extension field F_{qⁿ}
- Express this polynomial as a multivariate system over the base field F_q
- Apply secret affine transformations to obscure the structure
GeMSS achieved remarkably small signatures (33 bytes at the 128-bit security level) but suffered from crippling drawbacks:
- Enormous public keys: 352 KB for SL 1, scaling to several megabytes at higher security levels
- Extremely slow signing: GeMSS signing required solving a system of polynomial equations via exhaustive search, taking hundreds of milliseconds to seconds
- Fragile security margins: The HFE family has a long history of attacks (Kipnis-Shamir, Faugère-Joux direct algebraic attacks), and GeMSS’s security margins were considered too thin
GeMSS was eliminated after Round 2. The HFE family, despite decades of research, has proven difficult to instantiate with both practical parameters and comfortable security margins.
The contrast between UOV and GeMSS is instructive for security architects: both are multivariate schemes, but UOV’s construction is algebraically simpler, with fewer structural features for an attacker to exploit. GeMSS’s use of the HFE central map introduced a richer algebraic structure that enabled more powerful attacks. In cryptographic design, simplicity is a security feature.
1.6 Current Status of Multivariate Schemes
The multivariate landscape in 2026:
- UOV: Alive and respected. A candidate in the NIST additional signatures round. Its 25+ years of cryptanalytic resistance provide strong confidence.
- MAYO: A compressed variant of UOV designed by Ward Beullens (the same cryptanalyst who broke Rainbow) that reduces public key sizes significantly (from ~66 KB to ~1.2 KB for SL 1) by deriving the public map from a smaller “seed” matrix using a technique called “whipping.” The fact that the person who broke Rainbow then designed an improved multivariate scheme speaks to the iterative nature of cryptographic progress. MAYO is a NIST additional signatures candidate.
- Rainbow: Dead. Broken beyond repair.
- GeMSS / HFE variants: Eliminated. Insufficient security margins.
2. Isogeny-Based Cryptography
2.1 Elliptic Curve Isogenies: The Concept
An isogeny is a structure-preserving map (a group homomorphism that is also a morphism of algebraic varieties) between two elliptic curves. Informally, an isogeny is a “bridge” that connects one elliptic curve to another while preserving the group structure.
Given two elliptic curves E₁ and E₂ over a finite field F_p, an isogeny φ: E₁ → E₂ is a rational map that:
- Sends the identity element of E₁ to the identity of E₂
- Preserves the group operation: φ(P + Q) = φ(P) + φ(Q)
Every isogeny has a kernel — a subgroup of E₁ that maps to the identity on E₂. Critically, the isogeny is uniquely determined by its kernel (up to isomorphism of the target curve). This means that specifying a subgroup of an elliptic curve implicitly defines a map to another curve.
The degree of an isogeny corresponds to the size of its kernel. An isogeny of degree ℓ can be computed in time O(ℓ) using Vélu’s formulas (or the improved √élu algorithm for large ℓ).
Isogeny-based cryptography exploits the computational difficulty of problems like:
- Path-finding in isogeny graphs: Given two curves E₁ and E₂, find an isogeny connecting them
- Endomorphism ring computation: Determine the full endomorphism ring of a supersingular elliptic curve
The isogeny graph of supersingular curves over F_{p²} forms an expander graph (a Ramanujan graph, in fact), meaning it is highly connected with good mixing properties. Walking randomly through this graph quickly reaches a “random-looking” curve, which is the basis for constructing cryptographic protocols.
The number of supersingular curves over F_{p²} is approximately p/12, so for cryptographic-size primes (256+ bits), the graph is enormous. Finding a path between two specific nodes in this graph — the isogeny path-finding problem — is believed to be computationally hard, even for quantum computers (no polynomial-time quantum algorithm is known for the general case, unlike the discrete logarithm problem which Shor’s algorithm solves efficiently).
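Supersingularity can be checked by brute force at toy sizes: a curve E over F_p (p > 3) is supersingular exactly when its trace of Frobenius is zero, i.e. #E(F_p) = p + 1. A minimal sketch (the prime and curve below are chosen for convenience; y² = x³ + x is a classical example that is supersingular whenever p ≡ 3 mod 4):

```python
# Brute-force point count to verify the trace-zero (supersingular) condition
p = 103
assert p % 4 == 3            # guarantees y^2 = x^3 + x is supersingular

def count_points(a, b, p):
    # Count solutions of y^2 = x^3 + a*x + b over F_p, plus infinity
    squares = {}
    for y in range(p):
        squares[y * y % p] = squares.get(y * y % p, 0) + 1
    total = 1                # the point at infinity
    for x in range(p):
        rhs = (x * x * x + a * x + b) % p
        total += squares.get(rhs, 0)
    return total

n = count_points(1, 0, p)    # the curve y^2 = x^3 + x
print(n, p + 1)
assert n == p + 1            # trace zero => supersingular
```

Cryptographic primes make this brute-force count hopeless, of course; real implementations rely on the structure of the isogeny graph rather than point-by-point enumeration.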
```mermaid
graph TD
    subgraph "Supersingular Isogeny Graph (simplified)"
        E0["E₀"] -->|"ℓ-isogeny"| E1["E₁"]
        E0 -->|"ℓ-isogeny"| E2["E₂"]
        E1 -->|"ℓ-isogeny"| E3["E₃"]
        E1 -->|"ℓ-isogeny"| E4["E₄"]
        E2 -->|"ℓ-isogeny"| E5["E₅"]
        E2 -->|"ℓ-isogeny"| E3
        E3 -->|"ℓ-isogeny"| E6["E₆"]
        E4 -->|"ℓ-isogeny"| E6
        E5 -->|"ℓ-isogeny"| E6
        E6 -->|"ℓ-isogeny"| E0
    end
    style E0 fill:#1a1a2e,stroke:#e94560,color:#eee
    style E1 fill:#1a1a2e,stroke:#16213e,color:#eee
    style E2 fill:#1a1a2e,stroke:#16213e,color:#eee
    style E3 fill:#1a1a2e,stroke:#0f3460,color:#eee
    style E4 fill:#1a1a2e,stroke:#0f3460,color:#eee
    style E5 fill:#1a1a2e,stroke:#0f3460,color:#eee
    style E6 fill:#1a1a2e,stroke:#e94560,color:#eee
```
2.2 SIDH/SIKE: The Promise and the Catastrophe
Supersingular Isogeny Diffie-Hellman (SIDH), proposed by De Feo and Jao in 2011, was an elegant key exchange protocol that used walks in supersingular isogeny graphs. SIKE (Supersingular Isogeny Key Encapsulation) was the KEM variant submitted to the NIST competition.
How SIDH Worked
The protocol used a supersingular curve E₀ over F_{p²} where p = 2ᵃ · 3ᵇ · f ± 1 for a small cofactor f:
1. Alice chooses a random subgroup ⟨R_A⟩ of the 2ᵃ-torsion of E₀ and computes the isogeny φ_A: E₀ → E_A = E₀/⟨R_A⟩
2. Bob chooses a random subgroup ⟨R_B⟩ of the 3ᵇ-torsion of E₀ and computes the isogeny φ_B: E₀ → E_B = E₀/⟨R_B⟩
3. Alice sends E_A and the images of Bob’s torsion basis points under φ_A: specifically φ_A(P_B) and φ_A(Q_B)
4. Bob sends E_B and the images of Alice’s torsion basis points under φ_B: specifically φ_B(P_A) and φ_B(Q_A)
5. Alice uses Bob’s auxiliary points to compute E_AB = E_B/⟨φ_B(R_A)⟩
6. Bob uses Alice’s auxiliary points to compute E_BA = E_A/⟨φ_A(R_B)⟩
7. The shared secret is the j-invariant of E_AB = E_BA (they arrive at isomorphic curves)
SIKE’s compelling properties:
- Tiny keys: Public keys were only 330–564 bytes, the smallest of any NIST candidate
- Small ciphertexts: Comparable to classical key exchange
- Conceptual elegance: A beautiful generalization of Diffie-Hellman to isogeny graphs
SIKE advanced through Rounds 1, 2, and 3 of the NIST competition and was selected as one of four Round 4 candidates for continued evaluation. Its small key sizes made it particularly attractive for bandwidth-constrained applications. Microsoft Research invested heavily in SIKE, producing optimized implementations and even deploying experimental SIKE-based key exchange in test environments. The scheme was widely regarded as the most promising non-lattice KEM candidate.
The Castryck-Decru Attack (July 2022)
On July 30, 2022, Wouter Castryck and Thomas Decru posted a preprint titled “An Efficient Key Recovery Attack on SIDH.” The paper described a complete break of SIKE at all proposed security levels.
The attack exploited a fundamental structural weakness: SIDH publishes the images of torsion basis points.
Recall step 3 above: Alice sends φ_A(P_B) and φ_A(Q_B) — the images of known points under her secret isogeny. This auxiliary information, while necessary for the protocol to work, provides an attacker with evaluations of the secret isogeny at known points. Castryck and Decru showed that this information, combined with the theory of Kani’s theorem on products of elliptic curves (specifically, a 2003 result about reducibility of abelian surfaces), could be used to recover Alice’s secret isogeny entirely.
The attack works by:
- Constructing a product abelian surface (E₀ × E’) where E’ is a carefully chosen auxiliary curve
- Using the auxiliary torsion point information to identify a specific endomorphism of this surface
- Exploiting the Richelot isogeny structure to decompose this endomorphism and extract the secret isogeny
The computational cost was staggeringly low:
| SIKE Parameter | Claimed Security | Attack Time |
|---|---|---|
| SIKEp434 | NIST Level 1 (2¹²⁸) | ~10 minutes on a single core |
| SIKEp503 | NIST Level 2 (2¹⁴⁶) | ~1 hour on a single core |
| SIKEp610 | NIST Level 3 (2¹⁷⁴) | A few hours on a single core |
| SIKEp751 | NIST Level 5 (2²¹⁷) | Under a day on a single core |
The initial implementation by Castryck and Decru ran in about 1 hour on a single laptop for the SIKEp434 parameter set using Magma (a computer algebra system). Subsequent optimizations by Maino, Martindale, Panny, Pope, and Wesolowski reduced this to minutes and extended the attack to all parameter sets with improved efficiency. Robert, in a follow-up paper, simplified the attack further and showed it could be implemented in SageMath.
The attack was entirely classical — no quantum computer was needed. A scheme designed to resist quantum adversaries was broken by a classical algorithm running on commodity hardware. The irony was not lost on the cryptographic community.
Why the SIDH Break Was So Devastating
The SIKE break stands as one of the most dramatic events in modern cryptographic history for several reasons:
- Speed of the break: A scheme claiming 128+ bits of security was broken in minutes to hours on consumer hardware. This is not a theoretical break requiring astronomical resources — it is a practical, implementable attack.
- Timing: SIKE was broken during the NIST competition, after years of scrutiny. It had survived three full rounds of evaluation. The break came just weeks after NIST had announced SIKE’s advancement to Round 4.
- The auxiliary point problem was known: Cryptographers had long recognized that publishing torsion point images was a potential weakness. There was an active subfield studying “auxiliary point” attacks. But the community underestimated how badly this information leaked.
- The mathematics was “old”: Kani’s theorem, the key tool in the attack, dates to 1997/2003. The theoretical machinery needed to break SIKE had been available for nearly two decades. No new mathematical breakthroughs were required — just a novel application of existing theory. This is perhaps the most unsettling aspect — the ingredients for the attack were sitting in the algebraic geometry literature for years, waiting for someone to connect them to SIDH.
- Complete and irrecoverable: Unlike some attacks that can be mitigated by increasing parameters, the Castryck-Decru attack targets a fundamental design choice (publishing auxiliary points). There is no parameter tweak that saves SIDH. The protocol is structurally broken.
2.3 CSIDH: A Different Isogeny Approach
CSIDH (pronounced “seaside”), proposed by Castryck, Lange, Martindale, Panny, and Renes in 2018, takes a fundamentally different approach to isogeny-based cryptography that avoids the vulnerability that killed SIDH.
The Key Difference: Class Group Actions
CSIDH works with supersingular elliptic curves over the prime field F_p (not F_{p²}, as in SIDH) and uses the action of the ideal class group of an imaginary quadratic order on the set of curves whose F_p-rational endomorphism ring is that order. (Its predecessor, the Couveignes–Rostovtsev–Stolbunov scheme, used ordinary curves, which made it impractically slow.)
The critical distinction from SIDH: CSIDH does not reveal auxiliary torsion point images. The protocol is commutative by construction:
- Public parameters: A supersingular curve E₀ over F_p and a set of small primes ℓ₁, …, ℓₙ
- Alice’s secret key: A vector of small integers (e₁, …, eₙ)
- Alice’s public key: The curve E_A = [ℓ₁^{e₁} · ℓ₂^{e₂} · … · ℓₙ^{eₙ}] ★ E₀, i.e., the class group element determined by the secret exponents applied to E₀
- Shared secret: Both parties apply their secret class group elements to the other’s public curve
Because the class group is commutative (it is an abelian group), Alice and Bob arrive at the same curve without needing to exchange any auxiliary point information. This eliminates the attack vector that destroyed SIDH.
The analogy to classical Diffie-Hellman is precise:
| | Classical DH | CSIDH |
|---|---|---|
| Group | (Z/pZ)* | Ideal class group Cl(O) |
| Set acted upon | Elements of (Z/pZ)* | Elliptic curves with endomorphism ring O |
| Secret key | Exponent a | Class group element [a] |
| Public key | g^a | [a] * E₀ |
| Shared secret | g^(ab) | [a] * [b] * E₀ = [b] * [a] * E₀ |
This structural parallel is why CSIDH is sometimes called “isogeny-based Diffie-Hellman” — it is the closest post-quantum analogue to the classical protocol that underpins most of modern key exchange.
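That parallel can be mimicked with a deliberately insecure stand-in. In the sketch below (all parameters hypothetical, no real security), the integers act on a cyclic subgroup of (Z/pZ)* by translation, reproducing the "apply my secret to your public value" flow of a commutative group action without any of CSIDH's actual mathematics:

```python
# Toy commutative group action (INSECURE, purely illustrative): the group Z
# acts on the subgroup <g> of (Z/pZ)* by translation, a * x = x * g^a mod p.
p, g = 2_147_483_647, 7          # illustrative parameters only
x0 = 1234567                     # public starting point (analogue of E0)

def act(a, x):
    return x * pow(g, a, p) % p

alice_secret, bob_secret = 99991, 12345
alice_public = act(alice_secret, x0)    # analogue of [a] * E0
bob_public = act(bob_secret, x0)        # analogue of [b] * E0

# Commutativity of the action gives both parties the same shared value
shared_alice = act(alice_secret, bob_public)
shared_bob = act(bob_secret, alice_public)
assert shared_alice == shared_bob
print("shared:", shared_alice)
```

Note that this toy action is trivially breakable (it is an additive hidden shift in disguise); the point is only the shape of the protocol, in which no auxiliary points ever change hands.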
CSIDH’s Current Status
CSIDH’s security picture is more nuanced than SIDH’s was:
- Classical security: Depends on the difficulty of inverting the class group action (the isogeny path-finding problem). The best known classical attack is a meet-in-the-middle search over the class group, with cost roughly the square root of the class number; computing the class group structure itself is subexponential via index calculus, but that alone does not recover secret keys.
- Quantum security: CSIDH is vulnerable to Kuperberg’s algorithm for the hidden shift problem, which runs in subexponential quantum time. The exact cost is debated — estimates range from 2⁴⁰ to 2⁷⁰ quantum operations for proposed parameters, depending on assumptions about quantum memory availability.
- Parameter selection: The uncertainty in quantum attack costs makes it difficult to set parameters confidently. Conservative choices lead to slow performance.
CSIDH is not a NIST candidate but remains an active research topic. Its elegant mathematical structure (a genuine group action, enabling non-interactive key exchange) makes it attractive for specific protocols, but the quantum security concerns have prevented standardization momentum.
Why CSIDH Matters Despite Its Limitations
CSIDH occupies a unique niche in post-quantum cryptography because it supports non-interactive key exchange (NIKE) — a protocol where two parties can derive a shared secret using only each other’s static public keys, without any communication round trips. This is the same property that classical Diffie-Hellman provides, and it is essential for protocols like:
- Asynchronous messaging: Signal’s X3DH protocol relies on NIKE for initial key agreement when the recipient is offline
- Broadcast encryption: Efficiently distributing keys to large groups
- Credential systems: Anonymous credentials and oblivious PRFs
No lattice-based scheme supports efficient NIKE. If CSIDH’s quantum security can be established with sufficient confidence (or if Kuperberg’s algorithm turns out to require more quantum resources than feared), CSIDH or its successors could fill a genuine capability gap in the post-quantum toolkit.
2.4 SQISign: Compact Isogeny-Based Signatures
SQISign (Short Quaternion and Isogeny Signature), proposed by De Feo, Kohel, Leroux, Petit, and Wesolowski in 2020, is a signature scheme based on the Deuring correspondence — a deep connection between supersingular elliptic curves and quaternion algebras.
The Deuring Correspondence
The Deuring correspondence is one of the deepest results connecting algebraic geometry and algebra. It establishes a bijection between:
- Supersingular elliptic curves (up to isomorphism) over F_{p²}
- Maximal orders in the quaternion algebra B_{p,∞} ramified at p and infinity
This means that every problem about supersingular curves and isogenies between them can be translated into a problem about ideals in quaternion algebras — and vice versa. SQISign exploits this translation: operations that are hard in the geometric world (finding isogenies) become tractable in the algebraic world (computing ideals) when you know the secret key.
How SQISign Works
SQISign uses the Deuring correspondence directly. The scheme operates as follows:
- Key generation: Choose a secret isogeny τ: E₀ → E_A. The public key is the curve E_A. The secret key is the corresponding ideal in the quaternion algebra.
- Signing: Given a message, derive a challenge curve E₁ via hashing. Use the Deuring correspondence to translate the isogeny path-finding problem (find an isogeny from E_A to E₁) into an ideal computation in the quaternion algebra. Solve it there (which is efficient given the secret key) and translate back.
- Verification: Check that the provided isogeny is a valid connection between the public key curve and the challenge curve.
SQISign’s remarkable property is its signature compactness:
| Scheme | Security Level | Public Key | Signature |
|---|---|---|---|
| SQISign | NIST Level 1 | 64 bytes | 177 bytes |
| ML-DSA-44 | NIST Level 2 | 1,312 bytes | 2,420 bytes |
| FN-DSA-512 | NIST Level 1 | 897 bytes | ~666 bytes |
| SLH-DSA-128s | NIST Level 1 | 32 bytes | 7,856 bytes |
SQISign achieves the most compact combined (public key + signature) size of any post-quantum signature scheme, making it extraordinarily attractive for certificate chains and bandwidth-constrained applications.
SQISign’s Challenges
- Signing speed: SQISign signing is extremely slow — approximately 1–5 seconds depending on implementation and parameter set. This is orders of magnitude slower than ML-DSA or FN-DSA.
- Implementation complexity: The scheme requires arithmetic in quaternion algebras, computation of ideal lattice bases, and complex isogeny computations. Implementing it correctly (and in constant time) is a significant engineering challenge.
- Cryptanalytic maturity: SQISign is a relatively young scheme. While it avoids SIDH’s specific vulnerability (no auxiliary point leakage), the security relies on the hardness of the endomorphism ring problem, which has received less cryptanalytic attention than lattice problems.
- Verification speed: While faster than signing, verification is still slower than lattice-based alternatives.
SQISign is a candidate in the NIST additional digital signatures round. Its extraordinary compactness ensures continued interest, even if its performance characteristics limit it to specific use cases (e.g., certificate chains where signatures are computed once and verified many times, and where bandwidth is the primary constraint).
3. Other Post-Quantum Approaches
3.1 Symmetric-Key Based: Picnic
Picnic, designed by Chase, Derler, Goldfeder, Katz, Kolesnikov, Orlandi, Ramacher, Rechberger, Slamanig, and others, took a radically different approach to post-quantum signatures: instead of relying on any structured mathematical hard problem, it built signatures from symmetric-key primitives using zero-knowledge proofs.
The Construction
Picnic’s core idea:
- The secret key is a random key k for a block cipher (specifically, a variant of LowMC, chosen for its low multiplicative complexity)
- The public key is a plaintext-ciphertext pair (p, c) where c = Enc(k, p)
- Signing proves, in zero knowledge, that the signer knows k such that c = Enc(k, p) — without revealing k
The zero-knowledge proof is constructed using the MPC-in-the-head paradigm (Ishai, Kushilevitz, Ostrovsky, Sahai, 2007):
- Simulate a multi-party computation of the block cipher where the secret key is secret-shared
- Use the Fiat-Shamir heuristic to make the proof non-interactive
- The verifier “opens” a random subset of the MPC parties’ views to check consistency
Picnic’s security assumption is remarkably minimal: it essentially requires only that the block cipher is secure (specifically, that it behaves like a pseudorandom permutation). If AES is secure, then a version of Picnic using AES would be secure. This is the most conservative security assumption of any post-quantum signature scheme.
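A heavily simplified sketch of the share/commit/open pattern follows (illustration only, not the real Picnic proof system): the "cipher" is a one-byte XOR pad standing in for LowMC, and the challenge set is fixed rather than derived via Fiat-Shamir.

```python
import hashlib
import secrets

# Toy MPC-in-the-head sketch: Enc(k, p) = k ^ p is a stand-in "block cipher"
# with no real security. It shows the share / commit / open pattern only.
k = 0x5A                              # prover's secret key
p = 0x31
c = k ^ p                             # public key: the pair (p, c)

# 1. Secret-share k among 3 simulated parties (XOR sharing)
s0, s1 = secrets.randbelow(256), secrets.randbelow(256)
shares = [s0, s1, k ^ s0 ^ s1]

# 2. Run the "MPC": output shares are broadcast, key shares are committed.
#    Party 0 absorbs the public plaintext into its output share.
out_shares = [sh ^ (p if i == 0 else 0) for i, sh in enumerate(shares)]
commitments = [hashlib.sha256(bytes([sh])).hexdigest() for sh in shares]

# 3. Verifier checks the broadcast output shares reconstruct the ciphertext
xor_all = 0
for o in out_shares:
    xor_all ^= o
assert xor_all == c

# 4. Verifier opens a subset of parties (here 0 and 2; Fiat-Shamir would
#    pick this from a hash) and checks each opened view against its
#    commitment. The one unopened party is what keeps k hidden.
for i in (0, 2):
    assert hashlib.sha256(bytes([shares[i]])).hexdigest() == commitments[i]
    assert out_shares[i] == shares[i] ^ (p if i == 0 else 0)
print("transcript consistent with public (p, c)")
```

The real scheme repeats this many times in parallel to drive down the cheating probability, and replaces the XOR pad with an actual block-cipher circuit evaluated share-by-share.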
Picnic’s Fate
Picnic was a NIST Round 3 alternate candidate but was not selected for standardization. Its drawbacks:
- Large signatures: Picnic3 signatures at NIST Level 1 were approximately 12–14 KB, smaller than SLH-DSA’s fast variant (~17 KB) but larger than its size-optimized variant (~7.9 KB), and still substantial
- Slow verification: The ZK proof verification is computationally expensive
- LowMC concerns: The custom block cipher LowMC, chosen for its ZK-friendly structure, attracted algebraic attacks that forced parameter adjustments during the competition
- Complexity: The scheme is difficult to implement, audit, and optimize compared to lattice-based alternatives
Picnic demonstrated that minimal-assumption post-quantum signatures are possible, but the practical costs were too high for general standardization. The MPC-in-the-head paradigm continues to influence research, however, and modern descendants like FAEST (which replaces LowMC with AES, eliminating the custom cipher concern) are being studied.
Picnic’s legacy is conceptual rather than practical: it proved that post-quantum signatures can be built from nothing more than a block cipher, establishing a theoretical floor for what assumptions are truly necessary. Every other PQC signature scheme assumes the hardness of some structured mathematical problem; Picnic showed this assumption is optional — you just pay for it in performance.
3.2 Zero-Knowledge Based Approaches
Beyond Picnic, several post-quantum schemes leverage zero-knowledge proof systems:
FAEST (Fast AES-based Signature from Threshold computation) is a spiritual successor to Picnic that uses standard AES instead of LowMC. By employing the VOLE-in-the-head (Vector Oblivious Linear Evaluation) paradigm instead of traditional MPC-in-the-head, FAEST achieves significantly better performance:
- Signatures of ~5–6 KB at NIST Level 1
- Faster signing and verification than Picnic
- Security assumption: AES is a secure block cipher (no custom primitives)
FAEST is a candidate in the NIST additional signatures round and represents the state-of-the-art in symmetric-key-based post-quantum signatures.
SDitH (Syndrome Decoding in the Head) applies the MPC-in-the-head paradigm to the syndrome decoding problem from code-based cryptography:
- The secret key is an error vector e
- The public key is the syndrome s = He for a random parity-check matrix H
- Signing proves knowledge of e in zero knowledge
SDitH bridges zero-knowledge techniques with code-based assumptions and is also a NIST additional signatures candidate.
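The public relation is just a matrix-vector product over GF(2). The toy sketch below (hypothetical sizes, far smaller than real SDitH parameters) computes a syndrome and shows why a forger without the error vector faces a combinatorial search:

```python
import itertools
import random

random.seed(3)
# Toy syndrome computation over GF(2): public parity-check matrix H and
# syndrome s = H e; the low-weight error vector e is the secret.
n, k, w = 16, 8, 2          # code length, syndrome length, error weight

H = [[random.randint(0, 1) for _ in range(n)] for _ in range(k)]
e = [0] * n
for pos in random.sample(range(n), w):
    e[pos] = 1              # secret error vector of Hamming weight w

def syndrome(H, e):
    # Matrix-vector product over GF(2): AND for multiply, XOR (mod 2) for sum
    return [sum(hij & ej for hij, ej in zip(row, e)) % 2 for row in H]

s = syndrome(H, e)          # published along with H; e stays secret
print("syndrome:", s)

# Without e, a forger faces syndrome decoding: find ANY weight-w vector with
# this syndrome. Generic attacks are combinatorial; here, C(16, 2) = 120 trials.
hits = [c for c in itertools.combinations(range(n), w)
        if syndrome(H, [1 if i in c else 0 for i in range(n)]) == s]
assert tuple(i for i in range(n) if e[i]) in hits
```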
These ZK-based approaches share a common philosophical stance: trust the symmetric primitives. If you believe AES and SHA-3 are secure (and we have decades of evidence supporting that belief), then schemes built purely from these primitives inherit that confidence. The trade-off is performance and signature size — ZK proofs are inherently more expensive than direct algebraic operations.
3.3 Group-Action Based Cryptography
Group-action-based cryptography is an emerging theoretical framework that generalizes several PQC approaches. The core abstraction:
A group G acts on a set X via a map ★: G × X → X satisfying:
- Identity: e ★ x = x for the identity element e ∈ G
- Compatibility: (g · h) ★ x = g ★ (h ★ x)
The hard problem: given x and g ★ x, find g. This is a generalization of the discrete logarithm problem (where G = Z acts on a cyclic group by exponentiation).
CSIDH is the prime example: the ideal class group acts on the set of supersingular curves with a given endomorphism ring. But the framework extends further:
- Lattice-based group actions: Constructions where lattice-based groups act on sets of lattices or ideal lattices
- Code-based group actions: Actions derived from permutation groups on error-correcting codes
- Isogeny-based group actions: Generalizations of CSIDH to higher-dimensional abelian varieties
The group-action framework is attractive because it enables cryptographic protocols that are impossible or awkward to build from bare KEMs and signatures — threshold cryptography, non-interactive key exchange, oblivious pseudorandom functions, and verifiable random functions. These are protocols where the algebraic structure of a group action (particularly commutativity) is essential.
Current research challenges:
- Efficiency: Most group-action instantiations are significantly slower than lattice-based alternatives
- Quantum security: Kuperberg’s subexponential algorithm for the hidden shift problem applies to commutative group actions such as CSIDH, and the concrete quantum security of proposed parameters remains debated
- Maturity: The framework is relatively new, and concrete instantiations have limited cryptanalytic history
- Instantiation gap: The theoretical framework is elegant, but finding concrete groups and sets where the group action is both efficient to compute and hard to invert remains challenging
The group-action framework represents the frontier of PQC research — it is where new cryptographic capabilities may emerge, but also where the risk of unexpected breaks is highest. Security professionals should track this space for future capabilities while relying on standardized algorithms for current deployments.
3.4 Lattice-Isogeny Hybrids
A natural response to the SIKE break has been research into hybrid constructions that combine assumptions from different mathematical families:
- Lattice + isogeny: Use a lattice-based KEM for the primary key exchange, with an isogeny-based component providing additional structure (e.g., for key blinding or rerandomization)
- Lattice + code: Combine MLWE-based and syndrome-decoding-based components so that the scheme remains secure if either assumption holds. The combined security guarantee is “AND” — an attacker must break both to compromise the system.
- Multiple lattice assumptions: Use both structured (MLWE) and unstructured (LWE) lattice problems in the same scheme, hedging against the possibility that the ring/module structure of MLWE introduces exploitable weaknesses not present in plain LWE
The motivation is hedging: if one assumption falls, the other may survive. The cost is typically increased key/ciphertext sizes and computational overhead.
Concrete hybrid examples:
- X25519 + ML-KEM-768: Used for hybrid key exchange in TLS 1.3 (specified by the IETF TLS working group as the X25519MLKEM768 group and deployed by Chrome, Firefox, and Cloudflare). The combined shared secret is derived from both key exchanges, so an attacker must break both X25519 (classical ECDH) and ML-KEM (lattice-based) to recover the session key.
- ECDSA + ML-DSA composite certificates: Proposed in IETF drafts for X.509 certificates containing both a classical and post-quantum signature, ensuring backward compatibility while providing quantum resistance.
- Signal’s PQXDH: Combines X25519 with ML-KEM-768 for the initial key agreement in the Signal Protocol, protecting stored messages against future quantum decryption.
As of 2026, no single hybrid KEM or signature algorithm has been standardized by NIST as a monolithic primitive, but hybrid deployment strategies at the protocol level are already standard practice. For details on hybrid deployment, see Migration Strategies & Hybrid Approaches.
4. Lessons from Broken Schemes
4.1 The Casualty List
The NIST PQC competition was not just a selection process — it was a crucible that tested cryptographic designs against the full force of the global research community. The broken schemes taught lessons that are arguably more valuable than the winners themselves.
| Scheme | Family | Year Broken | Attack | Time to Break | Key Lesson |
|---|---|---|---|---|---|
| SIKE | Isogeny | 2022 | Castryck-Decru (torsion point attack via Kani’s theorem) | ~1 hour, single laptop | Auxiliary information leaks can be catastrophic |
| Rainbow | Multivariate | 2022 | Beullens (rectangular MinRank) | ~1 weekend, single laptop | Multi-layer optimization introduces exploitable structure |
| SABER | Lattice (MLWR) | Not broken, eliminated | — | — | Similarity to Kyber made it redundant, not insecure |
| NTRU Prime | Lattice | Not broken, eliminated | — | — | Eliminated for redundancy, strong design |
| GeMSS | Multivariate (HFE) | 2020–2021 | Improved MinRank attacks (support minors) | — | Thin security margins collapse under sustained analysis |
| BIKE | Code (QC-MDPC) | Not broken | — | — | Advanced to Round 4, decoding failure rate analysis ongoing |
| Classic McEliece | Code (Goppa) | Not broken | — | — | Round 4; massive keys (~260 KB) limit adoption |
| Picnic | Symmetric/ZK | Not broken | — | — | LowMC concerns; performance insufficient for standardization |
| LUOV | Multivariate (UOV) | 2020 | Ding et al. (lifted UOV attack exploiting field lifting) | — | Lifting optimization was unsound |
| DAGS | Code (alternant) | 2018 | Barelli-Couvreur (structural weakness in algebraic geometry codes) | — | Novel code families need deep cryptanalytic vetting |
| RQC | Code (rank metric) | Weakened | Improved decoding attacks on rank metric codes | — | Rank metric cryptography still maturing |
| pqsigRM | Code (Reed-Muller) | 2019 | Multiple structural attacks on the underlying code | — | Using structured codes with known automorphisms is risky |
| Titanium | Lattice (MP-LWE) | 2018 | Parameter issues; insufficient security margin demonstrated | — | Aggressive parameter choices collapse under scrutiny |
4.2 Why “Novel Math” Can Be a Red Flag
A pattern emerges from the broken schemes:
Schemes based on newer mathematical structures failed at higher rates than those based on well-studied problems. This is not a coincidence — it is a consequence of how cryptanalysis works.
- SIKE was based on supersingular isogenies — a topic that had been studied in the cryptographic context for only about a decade before the attack
- Rainbow’s multi-layer UOV structure was a relatively recent optimization of a well-studied primitive
- DAGS used algebraic geometry codes with specific structural properties that had not been subjected to sufficient cryptanalysis
- LUOV’s “lifting” technique was a novel optimization that introduced a fatal weakness
By contrast, the survivors rest on problems with decades of cryptanalytic history:
- ML-KEM/ML-DSA: Module-LWE, studied since the 2000s, with LWE itself dating to Regev (2005) and lattice problems to the 1990s
- SLH-DSA: Hash function security, the most conservative assumption in cryptography
- HQC: Syndrome decoding, studied since McEliece (1978)
- UOV: The Oil-and-Vinegar construction, studied since 1997
This does not mean novel mathematics should be avoided — it means that cryptanalytic maturity is a form of security evidence that cannot be shortcut. A scheme with a beautiful security proof against known attacks can still fall to an attack technique that nobody has thought of yet. Time and adversarial attention are irreplaceable.
The analogy to software security is instructive: a new codebase, no matter how carefully designed, will have more undiscovered vulnerabilities than a mature codebase that has been audited and penetration-tested for decades. The same principle applies to mathematical constructions. “Hardening through adversarial exposure” is not unique to code — it applies equally to cryptographic assumptions.
4.3 How the Community Responded
The breaks of SIKE and Rainbow in 2022 triggered several important responses:
- NIST adjusted its timeline: The additional signatures round was partly motivated by the recognition that diversity requires more candidates, and the existing candidate pool had been thinned by breaks.
- Renewed emphasis on conservative designs: UOV’s revival as a serious candidate is a direct consequence of Rainbow’s fall. The community pivoted back to the “boring but unbroken” original construction.
- Hybrid deployment became non-negotiable: The demonstration that heavily studied PQC schemes could fall overnight made hybrid deployments (combining classical and post-quantum algorithms) the default recommendation for any production system. See Migration Strategies & Hybrid Approaches.
- Cryptanalytic bounty programs expanded: Several organizations increased funding for cryptanalysis of PQC candidates, recognizing that breaking schemes during evaluation is vastly preferable to breaking them after deployment.
- Structured auxiliary information became a known hazard: The SIDH break specifically taught that publishing evaluations of a secret map at known points is dangerous. This lesson has influenced the design of subsequent isogeny-based and group-action-based protocols.
- Open science proved its value: Both the Beullens and Castryck-Decru attacks were published as preprints and rapidly verified by the community. The open nature of the NIST process meant that these breaks happened during evaluation, not after deployment. A closed-door standardization process might have produced standards based on SIKE or Rainbow — the consequences of that alternative timeline are sobering to contemplate.
5. The Diversity Imperative
5.1 Why One Family Is Not Enough
As of 2026, the NIST-standardized PQC algorithms are heavily weighted toward lattice-based cryptography: ML-KEM (lattice KEM), ML-DSA (lattice signature), and FN-DSA (lattice signature) all rest on variants of the Learning With Errors problem. SLH-DSA (hash-based) and HQC (code-based) provide some diversity, but if a breakthrough attack on structured lattice problems were discovered, three of the five standards would fall simultaneously.
This is not a hypothetical concern. Consider the historical precedent:
- In the 1990s, nearly all public-key cryptography was RSA-based. The discovery of Shor’s algorithm (1994) put the entire infrastructure on a deprecation timeline.
- If a “Shor’s algorithm for lattices” were discovered — an algorithm solving MLWE in polynomial time on a quantum computer — the impact would be equally devastating.
No such algorithm is known or believed likely, but the lesson of SIKE is that our beliefs about hardness can be wrong. The cryptographic community has been wrong before, and the consequences of being wrong about lattice problems would be catastrophic.
5.2 The Portfolio Approach
Sound post-quantum security strategy requires a portfolio of algorithms from independent mathematical families:
```mermaid
graph TD
    PQC["Post-Quantum<br/>Cryptography"] --> L["Lattice-Based"]
    PQC --> C["Code-Based"]
    PQC --> H["Hash-Based"]
    PQC --> M["Multivariate"]
    PQC --> I["Isogeny-Based"]
    PQC --> S["Symmetric/ZK"]
    L --> L1["ML-KEM<br/>(FIPS 203)"]
    L --> L2["ML-DSA<br/>(FIPS 204)"]
    L --> L3["FN-DSA<br/>(Draft)"]
    C --> C1["HQC<br/>(Selected 2025)"]
    C --> C2["Classic McEliece<br/>(Round 4)"]
    H --> H1["SLH-DSA<br/>(FIPS 205)"]
    M --> M1["UOV / MAYO<br/>(Additional Sigs)"]
    I --> I1["SQISign<br/>(Additional Sigs)"]
    S --> S1["FAEST / SDitH<br/>(Additional Sigs)"]
    style PQC fill:#1a1a2e,stroke:#e94560,color:#eee
    style L fill:#1a1a2e,stroke:#e94560,color:#eee
    style C fill:#1a1a2e,stroke:#16213e,color:#eee
    style H fill:#1a1a2e,stroke:#0f3460,color:#eee
    style M fill:#1a1a2e,stroke:#533483,color:#eee
    style I fill:#1a1a2e,stroke:#533483,color:#eee
    style S fill:#1a1a2e,stroke:#533483,color:#eee
    style L1 fill:#16213e,stroke:#e94560,color:#eee
    style L2 fill:#16213e,stroke:#e94560,color:#eee
    style L3 fill:#16213e,stroke:#e94560,color:#eee
    style C1 fill:#16213e,stroke:#16213e,color:#eee
    style C2 fill:#16213e,stroke:#16213e,color:#eee
    style H1 fill:#16213e,stroke:#0f3460,color:#eee
    style M1 fill:#16213e,stroke:#533483,color:#eee
    style I1 fill:#16213e,stroke:#533483,color:#eee
    style S1 fill:#16213e,stroke:#533483,color:#eee
```
This taxonomy illustrates that the post-quantum landscape is not a monolith — it is a diverse ecosystem of independent mathematical assumptions. Each family provides a hedge against breakthroughs in the others:
- If lattice problems fall: code-based (HQC, McEliece), hash-based (SLH-DSA), and multivariate (UOV) schemes survive
- If code-based problems weaken: lattice and hash-based schemes are unaffected
- If a hash function collision attack emerges: lattice and code-based schemes are independent
The additional signatures round — evaluating UOV, MAYO, SQISign, FAEST, SDitH, HAWK, and others — is explicitly motivated by this diversity argument. NIST has stated that it is primarily interested in general-purpose signature schemes not based on structured lattices, an acknowledgment of the concentration risk in the current standards portfolio.
For organizations building crypto-agility into their architectures (the ability to swap algorithms without redesigning protocols), maintaining awareness of all these families is not academic — it is operational preparedness.
5.3 Timeline of Breaks and Advances
```mermaid
graph LR
    A["1997<br/>Oil & Vinegar<br/>proposed"] --> B["1998<br/>Balanced OV<br/>broken"]
    B --> C["1999<br/>UOV proposed<br/>(unbalanced fix)"]
    C --> D["2005<br/>Rainbow<br/>proposed"]
    D --> E["2011<br/>SIDH<br/>proposed"]
    E --> F["2017<br/>NIST Round 1<br/>82 submissions"]
    F --> G["2020<br/>LUOV broken<br/>(lifted OV attack)"]
    G --> H["Feb 2022<br/>Rainbow broken<br/>(Beullens)"]
    H --> I["Jul 2022<br/>SIKE broken<br/>(Castryck-Decru)"]
    I --> J["2023-2025<br/>UOV revival<br/>SQISign advances"]
    style A fill:#1a1a2e,stroke:#16213e,color:#eee
    style B fill:#1a1a2e,stroke:#e94560,color:#eee
    style C fill:#1a1a2e,stroke:#0f3460,color:#eee
    style D fill:#1a1a2e,stroke:#16213e,color:#eee
    style E fill:#1a1a2e,stroke:#16213e,color:#eee
    style F fill:#1a1a2e,stroke:#0f3460,color:#eee
    style G fill:#1a1a2e,stroke:#e94560,color:#eee
    style H fill:#1a1a2e,stroke:#e94560,color:#eee
    style I fill:#1a1a2e,stroke:#e94560,color:#eee
    style J fill:#1a1a2e,stroke:#0f3460,color:#eee
```
6. Practical Implications for Security Professionals
6.1 Algorithm Selection Guidance
When evaluating post-quantum algorithms beyond the NIST primary standards:
For high-confidence production deployments (2026):
- Use ML-KEM + ML-DSA as the primary post-quantum algorithms — these have the strongest combination of cryptanalytic maturity, performance, and standardization status
- Deploy in hybrid mode with classical algorithms (X25519, ECDSA) as a hedge against both undiscovered PQC weaknesses and implementation bugs in new code
- Use SLH-DSA where lattice-independent signatures are required (e.g., firmware signing and root CA certificates and other trust anchors with 20+ year lifetimes)
- Ensure your TLS stack supports ML-KEM hybrid key exchange — all major browsers and cloud providers have deployed this as of late 2025
For bandwidth-constrained applications:
- Monitor SQISign’s progress in the NIST additional signatures round
- Its 177-byte signatures are compelling for certificate chains in IoT and embedded systems
- Do not deploy SQISign in production until it achieves standardization — it is too young
For defense-in-depth / algorithmic diversity:
- Plan to incorporate a multivariate signature (UOV or MAYO) when standardized — this provides a hedge against lattice breakthroughs that would affect ML-DSA and FN-DSA simultaneously
- Consider HQC as an alternative KEM to ML-KEM for systems requiring code-based hedging; HQC’s selection by NIST in March 2025 provides standardization confidence
- Classic McEliece offers the most conservative KEM security assumption (nearly five decades of cryptanalytic study since 1978) but requires managing 260 KB+ public keys, which is impractical for most TLS deployments but acceptable for long-term key storage and specialized protocols
- Build crypto-agility into your architecture now — the ability to swap algorithms at the configuration level rather than requiring code changes will pay dividends as the standards portfolio evolves
6.2 Risk Assessment Matrix
When evaluating non-standard PQC algorithms for potential future deployment, security teams should consider the following risk dimensions:
| Factor | Low Risk | Medium Risk | High Risk |
|---|---|---|---|
| Cryptanalytic age | 20+ years (UOV, McEliece) | 10–20 years (lattice schemes) | <10 years (SQISign, CSIDH) |
| Standardization status | NIST FIPS published | NIST round candidate | Academic proposal only |
| Implementation maturity | Multiple audited libraries | Reference implementations available | Proof-of-concept only |
| Side-channel resistance | Constant-time implementations proven | Constant-time possible but unaudited | Inherent timing leaks |
| Mathematical diversity | Independent assumption family | Related to standardized schemes | Same assumption as existing standard |
Schemes in the “high risk” column are not unsuitable for all purposes — research deployments, experimental protocols, and academic study all benefit from exploring the frontier. But production systems protecting sensitive data should remain on the “low risk” end until standardization is complete.
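One way to operationalize the matrix is worst-dimension aggregation: a scheme is only as deployable as its riskiest factor, so a single "high" rating dominates regardless of other scores. A minimal sketch (the profile values below are illustrative assumptions in the spirit of the table, not an official assessment of any scheme):

```python
RISK = {"low": 0, "medium": 1, "high": 2}
DIMENSIONS = ("cryptanalytic_age", "standardization", "implementation",
              "side_channels", "diversity")

def overall_risk(profile: dict) -> str:
    """Worst-dimension aggregation: one high-risk factor is enough to keep
    a scheme out of production, whatever its other ratings."""
    assert set(profile) == set(DIMENSIONS)
    return max(profile.values(), key=RISK.__getitem__)

# Hypothetical profile for a young isogeny scheme (illustrative only).
young_isogeny_scheme = {
    "cryptanalytic_age": "high",    # <10 years of adversarial scrutiny
    "standardization": "medium",    # NIST round candidate
    "implementation": "high",       # proof-of-concept code only
    "side_channels": "medium",      # constant-time possible, unaudited
    "diversity": "low",             # independent assumption family
}
print(overall_risk(young_isogeny_scheme))  # -> high
```

A weighted-sum aggregation would be gentler, but the max rule better matches how auditors actually reason: the weakest link sets the risk tier.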
6.3 What to Watch
The post-quantum landscape continues to evolve rapidly. Key developments to monitor:
- NIST additional signatures round outcome (expected 2026–2027): Will determine whether UOV, MAYO, SQISign, FAEST, or others join the standards portfolio
- Lattice cryptanalysis advances: Any improvement in solving MLWE or NTRU would have systemic implications. Watch for papers at CRYPTO, EUROCRYPT, and ASIACRYPT.
- Quantum hardware milestones: Progress toward fault-tolerant quantum computers affects the urgency of migration but not the direction. See Quantum Computing Fundamentals.
- Side-channel attacks on PQC implementations: Practical attacks on deployed PQC implementations (power analysis, timing attacks, fault injection) are a growing concern as PQC moves from theory to practice
- Multivariate key compression: Research into reducing UOV and MAYO public key sizes continues. Breakthroughs here could make multivariate schemes viable for constrained environments where they currently cannot compete with lattice-based alternatives.
- Post-SIKE isogeny constructions: The death of SIDH has not killed isogeny-based cryptography — it has redirected it. Constructions that avoid revealing auxiliary torsion point information (SQISign, CSIDH, and newer proposals like SCALLOP) continue to advance. The isogeny community has absorbed the SIDH lesson and is designing around it.
- Group-action standardization: If efficient, quantum-resistant group-action instantiations emerge, they could enable post-quantum protocols (NIKE, threshold signatures, VRFs) that lattice-based schemes cannot efficiently support. This would fill a genuine capability gap in the post-quantum toolkit.
6.4 Comparison: Non-Standard PQC Signature Candidates
For security architects evaluating the full landscape of post-quantum signature options beyond ML-DSA and SLH-DSA, the following comparison captures the key trade-offs among candidates in NIST’s additional signatures round and other notable schemes:
| Scheme | Family | PK Size | Sig Size | Sign Speed | Verify Speed | Key Advantage | Key Concern |
|---|---|---|---|---|---|---|---|
| UOV | Multivariate | ~67 KB | 96 B | Fast | Fast | 25+ years of cryptanalysis | Large public keys |
| MAYO | Multivariate | ~1.2 KB | ~321 B | Fast | Fast | Compressed UOV keys | Newer compression technique |
| SQISign | Isogeny | 64 B | 177 B | Very slow | Slow | Smallest combined size | Young scheme, slow signing |
| FAEST | Symmetric/ZK | ~32 B | ~5.6 KB | Moderate | Moderate | AES-based (minimal assumption) | Larger signatures |
| SDitH | Code/ZK | ~35 B | ~9 KB | Moderate | Moderate | Code-based diversity | Larger signatures |
| HAWK | Lattice | ~1.0 KB | ~555 B | Fast | Fast | Lattice diversity (not Fiat-Shamir) | Same lattice family concern |
No single scheme dominates across all dimensions. Algorithm selection must be driven by the specific constraints of the deployment environment — bandwidth limits, computational budgets, key storage capacity, and the organization’s risk tolerance for newer mathematical assumptions.
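The trade-offs in the table can be encoded directly, which makes constraint-driven shortlisting mechanical. A sketch using the approximate sizes above (the figures are rough, per the table, and the selection logic is illustrative):

```python
# Approximate sizes in bytes, transcribed from the comparison table above.
CANDIDATES = {
    "UOV":     {"family": "multivariate", "pk": 67_000, "sig": 96},
    "MAYO":    {"family": "multivariate", "pk": 1_200,  "sig": 321},
    "SQISign": {"family": "isogeny",      "pk": 64,     "sig": 177},
    "FAEST":   {"family": "symmetric/zk", "pk": 32,     "sig": 5_600},
    "SDitH":   {"family": "code/zk",      "pk": 35,     "sig": 9_000},
    "HAWK":    {"family": "lattice",      "pk": 1_000,  "sig": 555},
}

def shortlist(max_pk: int, max_sig: int, exclude_families=()):
    """Return candidates fitting a size budget, optionally excluding
    assumption families (e.g. to hedge against lattice breakthroughs)."""
    return sorted(
        name for name, c in CANDIDATES.items()
        if c["pk"] <= max_pk
        and c["sig"] <= max_sig
        and c["family"] not in exclude_families
    )

# Example: tight certificate-chain budget, avoiding lattice assumptions.
print(shortlist(max_pk=2_000, max_sig=1_000, exclude_families=("lattice",)))
# -> ['MAYO', 'SQISign']
```

In practice a real selection would also weigh signing/verification speed and the maturity dimensions from the risk matrix, not size alone.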
Summary
The post-quantum cryptographic landscape extends far beyond the NIST primary standards. Multivariate schemes like UOV have demonstrated remarkable resilience over decades, while their optimized variant Rainbow fell to a devastating structural attack. Isogeny-based cryptography produced both the most spectacular failure in modern cryptographic history (SIKE, broken in an hour on a laptop) and one of the most promising compact signature schemes (SQISign). Symmetric-key approaches like Picnic and FAEST push the boundary of minimal-assumption cryptography.
The central lesson is that cryptanalytic maturity cannot be manufactured — it can only be earned through time and adversarial scrutiny. Schemes that survive decades of attack (UOV, McEliece, hash-based signatures) carry a form of security evidence that no proof can replicate. Schemes built on novel mathematics, however elegant, carry elevated risk until they have endured comparable scrutiny.
For security professionals, the imperative is clear: deploy the standardized algorithms, deploy them in hybrid mode, and maintain algorithmic agility to incorporate new standards as the additional signature round concludes. The breaks of 2022 demonstrated that cryptographic assumptions can fail suddenly and completely. A diversified portfolio — spanning lattice, code, hash, multivariate, and eventually isogeny families — is the only responsible strategy.
The PQC families explored on this page — multivariate, isogeny-based, symmetric/ZK, group-action, and hybrid constructions — are not merely academic curiosities. They represent the next generation of standards and the safety net against catastrophic failure of the current ones. Understanding their strengths, weaknesses, and the lessons embedded in their broken predecessors is essential for any organization building a long-term post-quantum security posture.
For the NIST standardization timeline and the algorithms that were selected, see NIST PQC Standardization Process. For implementation guidance on the currently standardized algorithms, see ML-KEM Deep Dive and ML-DSA & SLH-DSA Deep Dive.