#2226: When Quantum Breaks Everything

Quantum computers will shatter RSA and elliptic-curve encryption—but the real danger is data being stolen and stored right now, waiting to be decrypted later.

Episode Details
Episode ID
MWP-2384
Published
Duration
23:30
Audio
Direct link
Pipeline
V5
TTS Engine
chatterbox-regular
Script Writing Agent
claude-sonnet-4-6

AI-Generated Content: This podcast is created using AI personas. Please verify any important information independently.

When Quantum Breaks Everything: Post-Quantum Cryptography and the Internet's Race Against Time

The threat from quantum computing is often framed as distant and theoretical. But there's a problem happening right now that makes it urgent: nation-state adversaries are almost certainly recording encrypted internet traffic today, betting they'll be able to decrypt it in fifteen years when quantum computers mature. This "harvest-now-decrypt-later" attack is the real reason the cryptographic infrastructure of the internet is being overhauled.

How Quantum Computers Break Current Encryption

RSA and elliptic-curve cryptography—the algorithms that secure HTTPS, TLS, SSH, and certificate authorities—rely on mathematical problems that are believed to be computationally intractable. Factoring a 2,048-bit number would take classical computers longer than the age of the universe. But in 1994, Peter Shor published an algorithm that, running on a sufficiently powerful quantum computer, collapses that problem to hours or less.
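The quantum speedup in Shor's algorithm comes entirely from one step: finding the multiplicative order of a number modulo N. Everything after that is classical number theory. A toy sketch of that classical reduction (with the order found by brute force, which is exactly the exponential step a quantum computer replaces):

```python
# Toy illustration of the number theory behind Shor's algorithm.
# The quantum part finds the multiplicative order r of a mod n quickly;
# the classical post-processing below turns r into a factor of n.
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r % n == 1 (brute force -- the hard part)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_postprocess(n, a):
    """Classical reduction: order of a mod n -> nontrivial factor of n."""
    if gcd(a, n) != 1:
        return gcd(a, n)          # lucky guess already shares a factor
    r = order(a, n)               # quantum computers do this step fast
    if r % 2 != 0:
        return None               # need an even order; retry with a new a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root; retry
    return gcd(y - 1, n)

print(shor_postprocess(15, 7))    # -> 3, a factor of 15
```

At toy scale the brute-force order search is instant; at 2,048-bit scale it is hopeless classically, which is the whole point.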

The same vulnerability exists in elliptic-curve cryptography, which relies on the discrete logarithm problem over elliptic curves. Shor's algorithm breaks both with equal efficiency, and together they underpin essentially all of the internet's public-key infrastructure.

The timeline, however, is uncertain. Current quantum systems like IBM's Heron and Google's Willow operate with hundreds to low thousands of physical qubits. Breaking RSA would require millions of high-fidelity physical qubits—most serious estimates place cryptographically relevant quantum computing at ten to twenty years out. But that timeline doesn't matter for data being encrypted today. Once it's stored, it can wait.

NIST's Post-Quantum Standards

Recognizing this urgency, NIST launched a post-quantum standardization process in 2016. They received eighty-two algorithm submissions from research teams worldwide and spent eight years evaluating them through mathematical analysis, cryptanalysis attempts, and performance benchmarking. In August 2024, they finalized the first three standards:

  • ML-KEM (FIPS 203): A key encapsulation mechanism based on the CRYSTALS-Kyber scheme
  • ML-DSA (FIPS 204): A digital signature algorithm based on CRYSTALS-Dilithium
  • SLH-DSA (FIPS 205): A hash-based signature algorithm, standardized as a backup

The first two are lattice-based, representing the field's consensus on what can survive quantum attacks.

Lattice Cryptography and the Learning With Errors Problem

Lattice-based cryptography operates on a deceptively simple premise: a lattice is a regular grid of points in high-dimensional space—imagine an infinitely extending checkerboard in five hundred dimensions. The hard problem at its core is the Learning With Errors (LWE) problem: given a secret vector multiplied by a public matrix plus a small random error term, recovering the secret is believed to be computationally hard—even for quantum computers. No known quantum algorithm provides a meaningful speedup against LWE.
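A toy LWE instance can be sketched in a few lines. The dimensions and error distribution here are illustrative only; real schemes use dimensions in the hundreds with carefully chosen parameters:

```python
# Toy Learning With Errors instance (illustrative parameters only).
import random

q = 97            # modulus
n = 4             # secret dimension (real schemes use n ~ 512-1024)
m = 8             # number of public samples

random.seed(0)
s = [random.randrange(q) for _ in range(n)]                      # secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]  # public matrix
e = [random.choice([-1, 0, 1]) for _ in range(m)]                # small errors

# Public LWE samples: b = A*s + e (mod q). Without e, Gaussian
# elimination recovers s instantly; the small noise is what makes
# recovering s believed-hard, even for quantum computers.
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

print(b)  # looks uniformly random to anyone who doesn't know s
```

The asymmetry is the point: generating samples is trivial, but stripping the noise back out to recover the secret is the hard problem.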

This is important to emphasize: we don't have a proof that LWE is hard. We have decades of cryptanalysis, connections to well-studied worst-case lattice problems, and strong evidence. But like RSA, it's a computational assumption, not a theorem. The NIST process mitigated this risk by standardizing algorithms from different mathematical families and including multiple rounds of cryptanalysis.

The Migration Challenge

Replacing the internet's cryptographic infrastructure is a genuinely large undertaking. It's often compared to Y2K, but that undersells it. Y2K was a finite bug with a known deadline. This is a heterogeneous infrastructure replacement—web browsers, certificate authorities, VPN systems, embedded devices in industrial control systems, hardware security modules with ten-year lifespans. Some can be patched with software. Others require physical replacement.

During the transition, systems must support both classical and post-quantum algorithms simultaneously in a "hybrid approach." Google has been running post-quantum key exchange experiments in Chrome since 2023, using a hybrid of X25519 and CRYSTALS-Kyber. Cloudflare has tested similar schemes. Apple announced post-quantum protections in iMessage with PQ3. The US government issued directives requiring federal agencies to inventory cryptographic assets and begin migration planning. But most enterprise systems and legacy infrastructure haven't started.
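The hybrid idea itself is simple: derive the session key from both shared secrets, so the connection stays secure as long as either algorithm holds. The sketch below uses stand-in byte values and a bare hash rather than the actual TLS key-schedule construction:

```python
# Sketch of the hybrid key-exchange idea: the session key depends on BOTH
# the classical (e.g. X25519) and post-quantum (e.g. ML-KEM) shared
# secrets, so an attacker must break both to recover it. The inputs here
# are stand-in bytes, not real key-exchange outputs; real protocols use a
# proper KDF (e.g. HKDF) with context labels, not a bare hash.
import hashlib

def hybrid_session_key(classical_secret: bytes, pq_secret: bytes) -> bytes:
    # Hypothetical label; shown only to illustrate domain separation.
    return hashlib.sha256(b"hybrid-kdf-v1" + classical_secret + pq_secret).digest()

k = hybrid_session_key(b"\x01" * 32, b"\x02" * 32)
print(k.hex())
```

If the post-quantum scheme later turns out to be weak, the classical secret still protects the key, and vice versa; that is why hybrids are the standard transition strategy.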

Beyond Defense: Homomorphic Encryption and Zero-Knowledge Proofs

Post-quantum standards are about defending what exists. But the frontier includes cryptographic tools that enable things previously impossible.

Homomorphic encryption allows computations on encrypted data without decryption. A server can process ciphertext, and the decrypted result is correct—as if computation happened in the clear. The server never sees underlying values.

The catch: it's slow. Fully homomorphic encryption carries overhead factors of ten thousand to one hundred thousand compared to plaintext computation, depending on the scheme. But the use cases justify it: hospitals can run machine learning on encrypted patient records without exposing data. Banks can jointly compute on encrypted transaction data. Genomics research can operate on extraordinarily sensitive genome data.

The field has advanced dramatically since Craig Gentry's 2009 proof of concept. The CKKS scheme (2017) handles approximate arithmetic for machine learning much more efficiently. Hardware accelerators are being designed specifically for homomorphic encryption.
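A minimal illustration of the core idea (computing on ciphertext) can use textbook RSA, which happens to be multiplicatively homomorphic. This is not fully homomorphic encryption, and textbook RSA is insecure in practice; it only demonstrates the principle that operating on ciphertexts yields a valid encrypted result:

```python
# Toy demonstration of the homomorphic idea using textbook RSA, which is
# multiplicatively homomorphic: Enc(a) * Enc(b) mod n = Enc(a*b).
# NOT fully homomorphic encryption, and textbook RSA is insecure --
# this only shows computation on ciphertext producing a correct result.
n, e_pub, d_priv = 3233, 17, 2753   # tiny textbook keypair (p=61, q=53)

def enc(m): return pow(m, e_pub, n)
def dec(c): return pow(c, d_priv, n)

c1, c2 = enc(6), enc(7)
c_product = (c1 * c2) % n           # "server" multiplies ciphertexts only
print(dec(c_product))               # -> 42, computed without ever decrypting
```

Real FHE schemes support both addition and multiplication on ciphertexts (and hence arbitrary circuits), which is what makes them so much harder to build efficiently.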

Zero-knowledge proofs are philosophically stranger: proving a statement is true without revealing any information about why. You can prove you know a Sudoku solution without showing it. You can prove you know a password without transmitting it. You can prove your age is over eighteen without revealing your birthdate. These are already deployed in privacy-preserving cryptocurrencies and emerging in enterprise systems.
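A toy Schnorr-style proof of knowledge of a discrete logarithm, made non-interactive with the Fiat-Shamir heuristic, shows the mechanics. The bare modular-arithmetic group here is illustrative only; deployed systems use elliptic curves and vetted parameters:

```python
# Toy Schnorr-style zero-knowledge proof of knowledge of a discrete log,
# made non-interactive via Fiat-Shamir. Illustrative parameters only.
import hashlib, random

p = 2**127 - 1        # a Mersenne prime; toy group (real systems use curves)
g = 3

x = random.randrange(2, p - 1)       # prover's secret
y = pow(g, x, p)                     # public: prover claims to know log_g(y)

def fiat_shamir_challenge(t):
    return int.from_bytes(hashlib.sha256(str(t).encode()).digest(), "big") % (p - 1)

def prove(x):
    k = random.randrange(2, p - 1)
    t = pow(g, k, p)                 # commitment
    c = fiat_shamir_challenge(t)     # challenge derived from commitment
    s = (k + c * x) % (p - 1)        # response; x stays hidden behind random k
    return t, s

def verify(y, t, s):
    c = fiat_shamir_challenge(t)
    return pow(g, s, p) == (t * pow(y, c, p)) % p   # g^s == t * y^c (mod p)

t, s = prove(x)
print(verify(y, t, s))   # True -- yet (t, s) reveals nothing about x
```

The verifier checks a single equation and learns only that the prover knows x, not what x is; the same pattern, over much richer statements, is what powers zk-SNARKs.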

The Deeper Vulnerability

The Voynich manuscript stumped cryptanalysts because they lacked conceptual tools. Modern cryptography is built on the assumption that certain mathematical problems are hard. The threat from quantum computing isn't that someone got smarter—it's that the computational substrate changed. That's a different kind of vulnerability, one that purely mathematical analysis can't catch. The race now is to rebuild the foundation before the machine arrives.


#2226: When Quantum Breaks Everything

Corn
So Daniel sent us this one as a follow-up to the Voynich manuscript episode — and honestly, the through-line is elegant. We spent that episode sitting with this humbling idea: that even brilliant cryptanalysts, working at the absolute edge of what their era could conceive, can stare at a cipher for centuries and get nowhere. And Daniel wants to use that as a springboard into modern cryptography. Specifically: quantum computers and the threat they pose to RSA and elliptic-curve encryption, NIST's finalization of post-quantum cryptography standards last year, and what all of this means for the field right now. Lattice-based cryptography, homomorphic encryption, zero-knowledge proofs — the whole frontier.
Herman
And it's a genuinely interesting moment to be asking this. The Voynich thing is a good frame because the failure mode there was conceptual — the right tools didn't exist yet. What's happening now is almost the inverse. We have tools that are about to exist, and we're scrambling to restructure the entire cryptographic foundation of the internet before they arrive.
Corn
Before they arrive. That's the part that keeps me awake. Well, not really — I sleep constantly — but hypothetically.
Herman
Right, so let me back up slightly on the threat model, because I think a lot of coverage oversimplifies it. The thing people say is "quantum computers will break RSA." That's true but incomplete. The mechanism is Shor's algorithm, which Peter Shor published in 1994. It runs on a quantum computer and can factor large integers in polynomial time. RSA's entire security assumption rests on the fact that factoring a two-thousand-and-forty-eight-bit number is computationally intractable for classical machines — we're talking longer than the age of the universe. Shor's algorithm, on a sufficiently powerful quantum computer, collapses that to hours or less.
Corn
And elliptic-curve encryption has the same problem?
Herman
Same problem, different mathematical trapdoor. Elliptic-curve cryptography relies on the discrete logarithm problem over elliptic curves — also solvable by Shor's algorithm on a quantum machine. So both RSA and elliptic-curve, which together underpin basically all public-key infrastructure on the internet — TLS, HTTPS, SSH, certificate authorities — are vulnerable to the same class of attack.
Corn
So the question is just: when does a quantum computer powerful enough to run Shor's algorithm actually exist?
Herman
That's the crux of it, and there's genuine uncertainty here. Right now, the most advanced quantum systems — IBM's Heron processors, Google's Willow chip announced late last year — are operating in ranges of hundreds to low thousands of physical qubits. But to run Shor's algorithm against a two-thousand-and-forty-eight-bit RSA key, you need something in the range of four thousand logical qubits. And logical qubits are not the same as physical qubits. Because of error rates, you need somewhere between a thousand and ten thousand physical qubits per logical qubit depending on the error correction scheme. So we're talking millions of high-fidelity physical qubits. That's not a near-term problem. Most serious estimates put cryptographically relevant quantum computing at ten to twenty years out.
Corn
Which sounds like a lot of runway until you think about the harvest-now-decrypt-later problem.
Herman
Which is the thing that makes this urgent right now. The attack model isn't "wait until quantum computers exist, then break encryption." It's "record encrypted traffic today, store it, decrypt it when the hardware catches up." Nation-state adversaries are almost certainly doing this already. Anything classified, anything sensitive that's transmitted over the internet today could potentially be decrypted in fifteen years. So the urgency isn't about the quantum computer existing — it's about the data that's being collected right now.
Corn
And that's why NIST didn't wait. They started the post-quantum standardization process back in 2016.
Herman
2016, right. They put out a call for candidate algorithms, received eighty-two submissions from research teams around the world, and spent the next eight years in evaluation rounds — mathematical analysis, cryptanalysis attempts, performance benchmarking. And then in August 2024, they finalized the first three post-quantum cryptography standards. FIPS two-oh-three, FIPS two-oh-four, and FIPS two-oh-five. The primary ones being ML-KEM, which is a key encapsulation mechanism based on the CRYSTALS-Kyber scheme, and ML-DSA, a digital signature algorithm based on CRYSTALS-Dilithium. Both are lattice-based.
Corn
Okay, so lattice-based cryptography. This is the heart of where the field has moved. What actually is a lattice in this context?
Herman
So a lattice is a regular grid of points in high-dimensional space — think of it like an infinitely extending checkerboard, but in five hundred dimensions instead of two. The hard problem at the core of lattice cryptography is called the Learning With Errors problem, or LWE. The basic idea: you take a secret vector, you multiply it by a public matrix, and you add a small random error term. Given the result and the public matrix, recovering the secret vector is believed to be computationally hard — even for quantum computers. There's no known quantum algorithm that gives a meaningful speedup on LWE. That's the key property.
Corn
And "believed to be hard" is doing a lot of work in that sentence.
Herman
It is, and I want to be honest about that. We don't have a proof that LWE is hard. We have strong evidence — decades of cryptanalysis, connections to worst-case lattice problems that are well-studied — but it's a computational assumption, not a theorem. The same is technically true of RSA, though. We don't have a proof that factoring is hard. We just have a few centuries of people failing to do it efficiently.
Corn
The Voynich problem again. Absence of a solution isn't a proof of impossibility.
Herman
Which is why the NIST process included multiple rounds of cryptanalysis, and why they standardized algorithms from different mathematical families as backups. The third standard, FIPS two-oh-five, is SLH-DSA, which is based on hash functions rather than lattices. If lattice-based cryptography turns out to have a weakness we haven't found yet, hash-based signatures are a fallback.
Corn
How painful is the migration going to be? Because "the entire public-key infrastructure of the internet needs to be replaced" is not a small sentence.
Herman
It's a genuinely large undertaking. The comparison people reach for is Y2K, but I think that undersells it. Y2K was a finite bug with a known deadline. This is a cryptographic infrastructure replacement where the timeline is probabilistic and the systems involved are extraordinarily heterogeneous. You've got web browsers, certificate authorities, VPN systems, embedded devices in industrial control systems, hardware security modules with ten-year lifespans. Some of those can be updated with a software patch. Some of them require physical hardware replacement. And there are interoperability issues — during the transition period, systems need to support both classical and post-quantum algorithms simultaneously, which is what's called a hybrid approach.
Corn
How are the major players actually moving on this? Is it happening?
Herman
It's starting. Google has been running experiments with post-quantum key exchange in Chrome since around 2023 — they were using a hybrid of X25519 and CRYSTALS-Kyber before the standards were even finalized. Cloudflare has been testing similar hybrid schemes. Apple announced post-quantum protections in iMessage with their PQ3 protocol. The US government issued a directive requiring federal agencies to inventory their cryptographic assets and begin migration planning. But we're still in the early stages. Most enterprise systems haven't started. Most legacy infrastructure hasn't been touched.
Corn
And there's a particular irony here that I keep thinking about. The Voynich manuscript stumped cryptanalysts because they lacked the conceptual tools. Modern cryptography is built on the assumption that certain mathematical problems are hard. And the thing that might break it isn't a cleverer algorithm — it's a different kind of machine.
Herman
That's a nice way to put it. The threat isn't that someone got smarter. It's that the computational substrate changed. And that's a different kind of vulnerability — one that purely mathematical analysis can't catch.
Corn
Alright, let's pivot because I want to get into homomorphic encryption and zero-knowledge proofs, because these feel like a different category of thing. Less "defend what we have" and more "enable things that were previously impossible."
Herman
Very different category. So homomorphic encryption — the core idea is that you can perform computations on encrypted data without ever decrypting it. The server doing the computation sees only ciphertext. It never sees the underlying values. And the result, when you decrypt it, is the correct answer as if you'd done the computation in the clear.
Corn
Which sounds like magic. And probably is magic, so what's the catch?
Herman
The catch is performance. Fully homomorphic encryption, which allows arbitrary computations on encrypted data, is still extremely slow. We're talking overhead factors of ten thousand to one hundred thousand compared to plaintext computation, depending on the operation and the scheme. IBM has a homomorphic encryption library, Microsoft has SEAL, there are others — but the practical applications are still limited to scenarios where you can tolerate that overhead or where the privacy guarantee is worth the cost.
Corn
What are the actual use cases that are compelling enough to absorb that cost?
Herman
Medical data is the canonical one. Imagine a hospital wants to run a machine learning model on patient records to predict disease risk. They could send encrypted patient data to an external compute provider, the provider runs the model on the ciphertext, sends back an encrypted result, and the hospital decrypts it. The compute provider never sees a single patient record. That's genuinely transformative for healthcare data sharing. Financial fraud detection across institutions is another — banks could jointly compute on encrypted transaction data without exposing their customers' information to each other. Genomics is a big one. Genome data is extraordinarily sensitive and extraordinarily useful for research simultaneously.
Corn
And the performance problem is improving?
Herman
Meaningfully. The field has advanced dramatically since Craig Gentry's original fully homomorphic encryption construction in 2009, which was a proof of concept that was completely impractical. The CKKS scheme, developed around 2017, handles approximate arithmetic — which is what you need for machine learning — much more efficiently. There are hardware accelerators being designed specifically for homomorphic encryption operations. I'm not going to claim it's around the corner for general use, but the trajectory is real.
Corn
Zero-knowledge proofs feel philosophically weirder to me. Like, the concept itself is strange.
Herman
The concept is genuinely strange. A zero-knowledge proof lets you convince someone that a statement is true without revealing any information about why it's true. The classic illustration is proving you know the solution to a Sudoku puzzle without showing them the solution. In a cryptographic context — proving you know a password without transmitting the password. Proving your age is over eighteen without revealing your actual birthdate. Proving you have sufficient funds for a transaction without revealing your account balance.
Corn
And these are actually deployed in real systems?
Herman
They are. The most mature deployment is in privacy-preserving cryptocurrencies — Zcash uses zk-SNARKs, which stands for zero-knowledge Succinct Non-interactive Arguments of Knowledge. But the applications are expanding well beyond crypto. Ethereum has been building out rollup systems that use zero-knowledge proofs to compress transaction verification — you can verify thousands of transactions with a single compact proof, which is a scalability solution as much as a privacy one. And there's real momentum around using ZK proofs for identity and credential verification. The idea that you could prove to a website that you're a licensed driver in a given jurisdiction without handing over your actual driver's license data — that's technically achievable now.
Corn
The privacy implications of that are enormous. Like, the entire data-broker economy assumes that you have to reveal information to prove things about yourself.
Herman
And ZK proofs crack that assumption open. The question is whether the infrastructure and the user experience can get to the point where it's actually adopted. The cryptography works. The deployment problem is the hard part. And there's a quantum consideration here too — most current ZK proof systems rely on elliptic-curve cryptography under the hood, which means they have the same post-quantum vulnerability as everything else. There's active research on post-quantum ZK proofs, but they're less mature.
Corn
By the way — today's script is courtesy of Claude Sonnet 4.6, which I find appropriate given that we're discussing cryptographic infrastructure. An AI writing about the systems that keep AI communications secure.
Herman
There's a recursive quality to that, yes. Although I'm not sure Claude has strong opinions about lattice problems.
Corn
Probably just learning with errors of its own. Okay, I want to zoom out for a second because I think there's a bigger picture here about what this inflection point means for the field of cryptography as a profession and as a discipline.
Herman
It's actually a remarkable moment. Cryptography for most of its history was a niche specialty — mathematicians and intelligence agencies. The public-key revolution in the late seventies, Diffie-Hellman and RSA, democratized it somewhat, but it was still largely invisible infrastructure. What's happening now is that cryptography is becoming a policy concern, a geopolitical concern, a boardroom concern. The NIST standardization process involved governments and companies from around the world. The migration decisions being made in the next five years will shape the security posture of the global internet for decades.
Corn
And there's a geopolitical dimension that doesn't get discussed enough. The countries that achieve cryptographically relevant quantum computing first — or that have already harvested encrypted data in anticipation of it — have a significant intelligence advantage.
Herman
Significant is an understatement. If a nation-state can decrypt fifteen years of diplomatic communications, financial transactions, military intelligence traffic — that's a transformative intelligence capability. And the harvest-now-decrypt-later problem means this isn't hypothetical. The data is being collected. The only question is whether the decryption capability arrives.
Corn
Which is also why the NIST process being open and international matters. The alternative is fragmented standards — different post-quantum schemes adopted by different countries or blocs, which creates interoperability problems and potentially introduces backdoors.
Herman
The backdoor concern is real and has history. The DUAL EC DRBG situation — the random number generator that NIST standardized in 2006 that turned out to have what appeared to be an NSA-inserted backdoor — that's a scar on the standardization process. The post-quantum process was designed with much more transparency, multiple evaluation rounds, public comment periods, international participation. Whether that's sufficient reassurance is a judgment call.
Corn
One thing I find genuinely underappreciated in the public conversation is the distinction between the algorithms being quantum-resistant and the implementations being secure. You can have a mathematically sound post-quantum algorithm and still have a catastrophically vulnerable system if the implementation has side-channel leaks.
Herman
This is so important. Side-channel attacks — timing attacks, power analysis, cache-timing — these exploit the physical implementation of a cryptographic algorithm rather than the mathematics. There were papers during the NIST evaluation process showing that early implementations of CRYSTALS-Kyber were vulnerable to timing side-channels. The algorithm itself was fine; the code was leaking information through execution timing. Post-quantum migration isn't just "swap in the new algorithm." It requires careful, audited implementations, hardware support, constant-time coding practices.
Corn
And this is where the field of cryptographic engineering becomes as important as the mathematics. It's not enough to prove the algorithm is hard to break. You have to build things that are hard to break in practice.
Herman
The gap between theoretical security and deployed security is where most real-world cryptographic failures live. Not Shor's algorithm breaking RSA in the abstract — it's a misconfigured TLS certificate, or a key stored in plaintext in a config file, or a timing leak in a library that nobody audited. The post-quantum transition is an opportunity to revisit a lot of that accumulated technical debt, but it's also an opportunity to introduce new mistakes at scale.
Corn
Let's get practical for a minute. For someone listening who's in a technical role — developer, security engineer, IT decision-maker — what does "paying attention to this" actually look like?
Herman
A few concrete things. First, cryptographic inventory. If you're responsible for a system that uses encryption, you need to know where RSA and elliptic-curve keys are being used. Certificate lifetimes, key exchange mechanisms, signature schemes — map it out. Most organizations don't have this visibility, and you can't migrate what you haven't inventoried.
Corn
That sounds tedious and important.
Herman
Extremely both. Second, watch the library updates. OpenSSL, BoringSSL, LibreSSL — these are the cryptographic libraries that underpin most software. OpenSSL has been adding post-quantum algorithm support. When your dependencies update to include these, that's the implementation layer starting to move. Third, for anything with a long operational lifespan — embedded systems, hardware security modules, infrastructure with a ten-year replacement cycle — the procurement decisions being made now need to account for post-quantum requirements. If you're buying a hardware security module today that will still be in service in 2035, it needs to support post-quantum algorithms.
Corn
And for the more exotic stuff — homomorphic encryption, ZK proofs — is there anything actionable yet?
Herman
Zero-knowledge proofs, yes, in specific domains. If you're building identity verification systems, credential systems, or working in the blockchain space, ZK proof libraries are mature enough to be worth serious evaluation. The zkSync and StarkNet ecosystems have production ZK proof infrastructure. For homomorphic encryption, the honest answer is that unless you're in a domain with specific high-value privacy requirements — genomics, healthcare data sharing, multi-party financial computation — you're probably watching and waiting. The tooling is improving fast enough that checking back in eighteen months is reasonable.
Corn
There's a version of this story that's kind of hopeful. Like — the field saw the threat coming, ran an open international process to develop defenses, standardized them, and is now in the early stages of deploying them. That's actually a pretty functional response compared to how these things usually go.
Herman
Compared to, say, the Y2K analogy — where the deadline was fixed and the response was mostly reactive — the post-quantum transition has had a decade of proactive lead time. That's genuinely unusual. The concern is whether the deployment pace will match the threat timeline. Ten to twenty years sounds comfortable until you consider the inertia of global infrastructure. The internet still has systems running cryptographic standards from the nineties. Not because the standards are good, but because nobody updated them.
Corn
The internet is basically the Voynich manuscript of infrastructure. Layers of things nobody fully understands, accumulated over decades, that somehow still function.
Herman
I'm going to use that. The difference is we actually need the internet to keep working while we translate it.
Corn
And unlike the Voynich manuscript, we at least know what language it's supposed to be in. Alright — what's the open question you're sitting with on this?
Herman
The one I keep coming back to is the assumption that lattice problems are hard. We've built the post-quantum future on that assumption. NIST has built it. The cryptographic community has largely converged on it. And I think it's probably right — the evidence is strong. But I remember that we also thought RSA was going to be fine forever, and the threat to it came from a direction nobody expected: not a better classical algorithm, but a different model of computation entirely. Is there a different model of computation we're not thinking about that makes lattice problems tractable? I genuinely don't know. Nobody does. And that uncertainty is the thing I can't fully resolve.
Corn
That's the honest version of "we're probably fine." Which is the best version of it, actually.
Herman
It's all we have. Cryptography is ultimately a bet on the limits of adversarial ingenuity. You're saying: here's a problem, we believe it's hard, we're staking security on that belief. The Voynich manuscript reminds us that beliefs about what's solvable can be wrong in both directions.
Corn
Well, that's a suitably unsettling place to leave it. Thanks to Hilbert Flumingtop for producing — as always, the show exists because he makes it happen. And Modal is keeping our pipeline running, which at this point feels like its own form of infrastructure we'd rather not think too hard about. If you want to dig into the full back catalog — two thousand one hundred and fifty episodes now, which is either impressive or alarming depending on your perspective — everything's at myweirdprompts.com. This has been My Weird Prompts. Leave us a review if you're enjoying the show — it genuinely helps.
Herman
See you next time.

This episode was generated with AI assistance. Hosts Herman and Corn are AI personalities.