More on whether useful quantum computing is “imminent”
Factoring Benchmarks and Scaling Shor’s Algorithm
- Several comments note that even factoring 21 with “real” Shor on fault‑tolerant qubits is beyond current capabilities, so factoring is a poor benchmark of present‑day progress.
- One side argues we’re still at the “make it work at all” stage; once small numbers can be reliably factored on logical qubits, scaling to large keys could be relatively quick (Moore’s‑law‑like).
- Others respond that in quantum systems, difficulty scales badly with circuit depth and qubit count, so input size is absolutely part of the challenge.
- Some stress we haven’t yet run Shor “properly” even for 15; existing demos use shortcuts that don’t test real‑world scalability.
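A back‑of‑the‑envelope sketch of why input size matters here (the point made in the bullets above): textbook‑style Shor circuits need on the order of 2n logical qubits and very roughly n³ gates for an n‑bit modulus, so circuit width and depth both grow quickly with key size. The 2n+3 figure follows Beauregard‑style circuits; the gate‑count constant below is an assumption for illustration, not a measured value.

```python
# Hedged back-of-the-envelope, not a measured benchmark: approximate
# *logical* resources for Shor's algorithm on an n-bit modulus.
# Assumptions: 2n + 3 qubits (Beauregard-style circuit) and ~n^3 gates
# (a common rule-of-thumb scaling, with the constant taken as 1).
def shor_logical_resources(n_bits: int) -> dict:
    return {
        "modulus_bits": n_bits,
        "logical_qubits": 2 * n_bits + 3,
        "approx_gate_count": n_bits ** 3,
    }

# 15 and 21 are 4- and 5-bit numbers; RSA-2048 shown for contrast.
for n in (4, 5, 2048):
    print(shor_logical_resources(n))
```

Even on these crude numbers, going from toy demos to 2048‑bit keys means a few hundred times more logical qubits and roughly eight orders of magnitude more gates, which is the scaling point the skeptics are making.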
Error Correction, Noise, and Physical Constraints
- Discussion distinguishes physical vs logical qubits: theory assumes near‑perfect logical qubits built from many noisy physical ones.
- One view: logical error rates can drop exponentially with code size, so for 1000 logical qubits you might “only” need a large but constant physical overhead (1000×).
- Others argue SNR and gate precision are not magically fixed by error correction, especially for the fine rotations in quantum Fourier transforms.
- A technical comment estimates that realistic constraints (nearest‑neighbor connectivity, SWAP overhead, decoherence while qubits wait, limited control lines) push required error rates down and physical qubit counts up sharply, to hundreds of thousands of physical qubits or more even for modest key sizes.
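To make the two views above concrete, here is a minimal sketch using the standard surface‑code rule of thumb p_L ≈ A·(p/p_th)^((d+1)/2), with assumed values A ≈ 0.1, p_th ≈ 1%, and an arbitrary target logical error rate; only the shape of the conclusion matters, not the exact constants.

```python
# Hedged sketch of the surface-code overhead argument. Logical error per
# round is suppressed roughly as p_L ~ A * (p / p_th)^((d+1)/2), while the
# physical cost grows only polynomially (~2 * d^2 qubits per logical qubit).
# A, P_TH, and the target below are illustrative assumptions.
A = 0.1        # assumed prefactor
P_TH = 1e-2    # assumed threshold error rate (~1%)

def physical_qubits_per_logical(p_phys: float, target: float) -> tuple[int, int]:
    """Smallest odd code distance d meeting the target, and ~physical qubits needed."""
    d = 3
    while A * (p_phys / P_TH) ** ((d + 1) / 2) > target:
        d += 2
    return d, 2 * d * d  # ~d^2 data qubits + ~d^2 measurement qubits

for p in (1e-3, 1e-4):  # plausible physical error rates
    d, q = physical_qubits_per_logical(p, target=1e-12)
    print(f"p={p:.0e}: distance {d}, ~{q} physical per logical, "
          f"~{1000 * q:,} physical qubits for 1000 logical qubits")
```

The suppression really is exponential in the code distance, but the resulting overhead (hundreds to roughly a thousand physical qubits per logical one, before routing and control overhead) is exactly what pushes totals into the hundreds of thousands mentioned in the last bullet.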
Imminence and Historical Analogies
- Some researchers in the thread say “it will happen” but not that it is “imminent” in the everyday sense; they compare today’s state of the art to the early transistor and mechanical‑computer eras, maybe even to the pre‑computer 1920s.
- Others say this is more like nuclear fusion: each advance triggers “5 years away!” narratives without delivering usefulness.
- A minority view holds that scaling may hit unknown physical limits (e.g., concern about computing with amplitudes ~2^-256), though others push back that such limits are not supported by current theory.
Applications Beyond Cryptography
- Widely cited “known” applications:
- simulation of quantum physics/chemistry,
- breaking much current public‑key crypto,
- modest speedups in optimization/ML and related tasks (see the sketch after this list).
- Some links/remarks suggest quantum advantage for chemistry may be narrower than initially hoped, because classical methods improved.
- “Quantum compression” claims (100–1000× data compression) are strongly challenged as misunderstanding both compression and quantum algorithms.
- Many expect any realistic deployment to be hybrid: quantum as a specialized accelerator, not a standalone replacement.
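On the “modest speedups” bullet: Grover‑style unstructured search is quadratically faster than classical brute force, not exponentially faster, so once per‑operation overheads are factored in the advantage can shrink or vanish. A rough illustration (the numbers are assumptions for illustration, not from the thread):

```python
import math

# Hedged illustration of why a quadratic (Grover-style) speedup is "modest":
# classical brute force over N items takes ~N/2 queries on average, while
# Grover takes ~(pi/4) * sqrt(N) iterations. Per-operation overheads
# (error correction, slow gate times) are ignored here, which flatters
# the quantum side.
for n_bits in (20, 40, 64):
    N = 2 ** n_bits
    classical = N / 2
    grover = (math.pi / 4) * math.sqrt(N)
    print(f"{n_bits}-bit search space: classical ~{classical:.1e} queries, "
          f"Grover ~{grover:.1e} iterations")
```

A square‑root factor is valuable, but it is a very different story from the exponential advantage Shor offers over the best known classical factoring algorithms, which is why the optimization/ML claims are described as modest.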
Security, Secrecy, and Post‑Quantum Cryptography
- The blog’s nuclear‑fission analogy and warning about estimates “going dark” are read by some as a serious signal to migrate to post‑quantum crypto; others see it as a general precaution rather than a hint that “RSA is secretly broken.”
- Commenters note intelligence agencies are actively pushing post‑quantum schemes, partly due to “harvest now, decrypt later” risk.
- Skeptics point out that no quantum factorization of anything beyond trivial, toy‑sized numbers has been published, suggesting we are far from breaking 2048‑bit keys.
- Practical “signals” of real progress that people suggest watching for:
- sudden funding spikes or classified projects,
- unexplained draining of old Bitcoin/ECC‑vulnerable addresses (though any visible large‑scale attack would damage the asset’s value).
Hype, Grift, and Research Ecosystem
- Several comments complain about “refrigerator companies” and snake‑oil: firms overselling one‑off or non‑reproducible results to secure funding.
- A working researcher laments that there are too few rigorous groups, and points to poor methodology, a lack of openness, and fragmented directions (multiple architectures, digital vs photonic) as factors slowing progress.
- At the same time, many argue the field is still young and deserves continued funding, even if useful general‑purpose quantum computing is likely decades away and not guaranteed.