NIST selects HQC as fifth algorithm for post-quantum encryption

HQC selection and role

  • Commenters welcome HQC as a complementary KEM to Kyber/ML-KEM: it rests on a different hardness assumption (code-based rather than lattice-based) while keeping key sizes far more practical than Classic McEliece's.
  • Clarification: HQC is the fifth PQC algorithm NIST has selected for standardization, but only the second KEM; the other three are signature schemes.

Using multiple PQ algorithms / hybrids

  • Several people advocate hybrid designs: combine classical (X25519/Ed25519, RSA) with PQ (Kyber, HQC, Dilithium, SPHINCS+) so attackers must break both.
  • Some designs use multiple KEMs to protect one symmetric session key, combining their shared secrets via a KDF/combiner.
  • Signal’s PQXDH is repeatedly cited as an example of classical+PQC in parallel.
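The KDF-combiner idea above can be sketched in a few lines. This is a simplified, illustrative construction (the function name and salt are made up for the example); real protocols such as PQXDH bind far more transcript data into the derivation and follow a full HKDF specification.

```python
import hmac, hashlib, secrets

def combine_shared_secrets(ss_classical: bytes, ss_pq: bytes, context: bytes) -> bytes:
    """Toy KDF combiner: derive one session key from two independent
    shared secrets, so an attacker must recover BOTH to learn the key.
    (Simplified HKDF-style extract-then-expand; not a real protocol.)"""
    # Extract: compress both secrets into one pseudorandom key.
    prk = hmac.new(b"example-combiner-salt", ss_classical + ss_pq, hashlib.sha256).digest()
    # Expand: bind the protocol context into the output key.
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()

# Stand-ins for real KEM outputs (e.g., one from X25519, one from ML-KEM or HQC).
ss_x25519 = secrets.token_bytes(32)
ss_mlkem = secrets.token_bytes(32)
session_key = combine_shared_secrets(ss_x25519, ss_mlkem, b"example-protocol-v1")
```

Because the output depends on both inputs through HMAC, breaking only the classical or only the PQ scheme leaves the attacker with a key they cannot reconstruct.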

Urgency of PQ transition and “store-now-decrypt-later”

  • Consensus: KEM/key exchange should be upgraded “ASAP” because past traffic can be recorded now and decrypted later.
  • Signatures are less urgent but still important due to long rollout times.
  • Practical concern: PQ signatures are far larger than Ed25519's 64 bytes (roughly 2.4 KB for Dilithium2 and on the order of 8–17 KB for common SPHINCS+ parameter sets), making them hard to fit into constrained embedded environments.

Layering encryption and combiners

  • Debate whether “layering” multiple encryptions is safe.
    • One camp: naïve bespoke layering can introduce side channels or cross-protocol interactions; it is safer to encrypt different key shares with different public-key schemes and combine them via a KDF.
    • Another camp: protocol-level layering (TLS-over-TLS, SSH-over-SSH) is fine and even mandated in high-security settings for defense in depth.
  • Some note historical cross-protocol attacks (e.g., DROWN) as cautionary tales.
  • Hash-function “layering” is flagged as particularly tricky; for example, concatenating two iterated hashes adds less collision resistance than intuition suggests (Joux’s multicollision attacks).
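The "different key shares under different schemes" suggestion can be read as simple XOR secret sharing: split the session key so that each share is encapsulated under an independent public-key scheme, and either share alone is uniformly random. A minimal sketch (the KEM steps themselves are elided; only the share arithmetic is shown):

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Split a session key into two shares. In a real design, each share would
# be sent under a DIFFERENT scheme (e.g., one lattice-based, one code-based).
session_key = secrets.token_bytes(32)
share_a = secrets.token_bytes(32)          # uniformly random mask
share_b = xor_bytes(session_key, share_a)  # key XOR mask

# The recipient decapsulates both shares and recombines them.
recovered = xor_bytes(share_a, share_b)
```

An attacker who breaks only one of the two schemes learns one share, which by itself carries no information about the session key.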

Quantum computing timeline and priorities

  • Wide disagreement on when (or whether) QC will threaten current crypto:
    • Skeptics put the odds of a cryptographically relevant quantum computer by 2050 at “basically zero”, view QC risk as hype, and argue resources should focus on perennial bugs (buffer overflows, XSS, injection).
    • Others stress infrastructure inertia and long-term secrecy needs (national security, 20–30 year lifetimes), arguing even a small probability justifies action now.
    • Some point to government and intelligence community behavior as evidence they take QC seriously, though “urgent” is contested.
  • There’s back-and-forth on whether funding PQC meaningfully diverts resources from fixing today’s vulns; some argue budgets are large enough to do both, others insist trade-offs are real.

Trust in NIST and NSA influence

  • Some commenters worry NIST recommendations could be biased by NSA knowledge of secret attacks, and suggest running non-NIST algorithms alongside NIST ones, or distrusting NIST entirely.
  • The thread does not resolve this; the actual extent of NSA influence remains unclear.

Understanding PQ math and preferred schemes

  • Several admit that lattice- and code-based schemes are harder to build intuition for than RSA/EC; commenters share learning resources and note that AI tools helped them work through toy versions.
  • Hash-based schemes like SPHINCS+ and Lamport signatures are praised for conceptual simplicity and strong assumptions, though large signatures are a downside.
  • Individual “favorites” vary: lattices (Kyber, Dilithium, NTRU Prime), code-based (Classic McEliece, HQC), and hash-based (SPHINCS+) all have supporters, with many favoring hybrids to hedge against future breaks.
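The conceptual simplicity of hash-based signatures is easy to demonstrate. Below is a toy Lamport one-time signature over the 256 bits of a SHA-256 digest, using only stdlib hashing; it also makes the size downside concrete, since a single signature here is 256 × 32 bytes = 8 KB. (This is a teaching sketch, not a production scheme like SPHINCS+.)

```python
import hashlib, secrets

H = lambda b: hashlib.sha256(b).digest()
N = 256  # one key pair per bit of the SHA-256 message digest

def keygen():
    # Secret key: two random 32-byte preimages per message bit.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(N)]
    # Public key: their hashes.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(N)]

def sign(sk, msg: bytes):
    # Reveal one preimage per digest bit. ONE-TIME: reusing sk leaks it.
    return [pair[b] for pair, b in zip(sk, bits(msg))]

def verify(pk, msg: bytes, sig):
    # Each revealed preimage must hash to the matching public-key value.
    return all(H(s) == pair[b] for s, pair, b in zip(sig, pk, bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"hello")
```

Security rests only on the preimage resistance of the hash, which is why commenters call the assumptions "strong"; the cost is the 8 KB signature and one-time keys (schemes like SPHINCS+ layer structure on top to get many-time signing).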

Standards, naming, and politics

  • Some mention US timelines (e.g., CNSA 2.0) pushing PQ adoption by ~2030, especially for long‑lived secrets.
  • There’s mild concern that political changes or budget cuts could disrupt NIST’s standardization role, but details of potential impact remain unclear.