On the face of it, even relatively "point-target" goals of this kind could take many decades, if they are reached at all; GaN blue LEDs come to mind as an example of a field that was stuck for a generation -- until it wasn't.
As OP said elsewhere[0, 1]:
> Once you understand quantum fault-tolerance, asking “so when are you going to factor 35 with Shor’s algorithm?” becomes sort of like asking the Manhattan Project physicists in 1943, “so when are you going to produce at least a small nuclear explosion?”
In other words (IIUC): some problems (here: scaling fault tolerance) have to be solved before the flashy end-result demo becomes possible at all.
[0]: https://scottaaronson.blog/?p=9665#comment-2029013
[1]: See also https://news.ycombinator.com/item?id=47959531 for a very similar quote.
I thought it was a typo at first, but Wikipedia explained:
> The Sword of Damocles is an ancient Greek moral anecdote, an allusion to the imminent and ever-present peril faced by those in positions of power.
Shor's algorithm is a quantum algorithm for finding the prime factors of an integer
SSH keys, on the other hand, are used for authentication and would require an online quantum computer to break, so we have more time. Authentication is also (usually) more complicated, so there are still disagreements about what to do with the Web PKI, for example. To give you a concrete target, Google, Microsoft, and Cloudflare have self-imposed deadlines of 2029 for their PQC migrations.
In practice, PQC migration means updating your software, bugging your vendors to ensure they have this on their roadmaps, and making sure your own code is flexible with respect to the algorithms used.
On a separate note, I had definitely been hearing worried murmurs about "harvest and decrypt" attacks, along with post-quantum TEEs, slightly before the GCP paper, and given the rate at which it keeps coming up within my network, I definitely think a couple of nation states appear to be on track for a "quantum leap" by 2030.
The worry about "harvest and decrypt" in a 5 year timeframe is primarily from a nation state/natsec perspective.
If you are being targeted by a nation state as a line level engineer, harvest and decrypt is the least of your worries.
...8 years previously.[1]
Long, long ago in a datacenter far away, breaking 3DES used to be the province of expensive bespoke hardware owned by only the elite nation states. Today it is so trivial that the gpu in your second hand laptop can do it "at scale".
5 years ago ChatGPT was a wet dream.
We should be very conservative in our planning where future security is concerned. The only thing we can be sure of is that Murphy's Law is looking for every chance to make us look foolish.
If you have any link about trivially cracking it on a second-hand laptop and doing it at scale, I'd be very interested.
So we know that quantum computers hold a real risk of being able to break a lot of encryption. We also know that changing cyphers is hard (because reasons)
But what I don't see is what I can practically do now, as either someone who is a CTO/Big Cheese™ or a lowly engineer?
Migrate! The major TLS implementations and OpenSSH already support PQC, for example.
1. Make sure you have the required dependencies (e.g., OpenSSL 3.5+ is when a lot of PQC algorithms got support; a quick version check, sketched after this list, can confirm what you're actually linking against).
2. Make sure the client/server software is up to date (this might be all that's needed; e.g., OpenSSH 10.0+ enables PQC key exchange by default, and so does Chrome 131+).
3. Enable PQC support in the configuration (e.g., "ssl_ecdh_curve X25519MLKEM768;" in Nginx).
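As a minimal sanity check for step 1, something like the following reports which OpenSSL your Python runtime is actually linked against (note this can differ from the `openssl` CLI on the same machine; the 3.5 threshold mirrors the version mentioned above):

```python
# Minimal sketch: report the OpenSSL that Python is linked against and
# whether it is new enough (3.5+) to offer the ML-KEM hybrid groups.
import ssl

print(ssl.OPENSSL_VERSION)              # e.g. "OpenSSL 3.5.0 ..."
major, minor, *_ = ssl.OPENSSL_VERSION_INFO
if (major, minor) >= (3, 5):
    print("OpenSSL 3.5+: PQC hybrid key exchange should be available")
else:
    print("Older OpenSSL: PQC key exchange is likely missing here")
```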
If you are the developer of anything that's explicitly using RSA or ECC (or god forbid Diffie-Hellman), you can also migrate your own software, or at least make the algorithm selectable at initialization time instead of hardcoded. If you have vendors, ask them for their PQC migration roadmaps.
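For the "selectable at initialization time" point, here is a hedged sketch of what algorithm agility can look like. The Ed25519 class is the real pyca/cryptography API; the ML-DSA entry is a placeholder for whatever PQC backend you eventually adopt:

```python
# Hedged sketch of algorithm agility: the signature scheme is chosen when the
# object is constructed, not hardcoded throughout the codebase.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

SIGNERS = {
    "ed25519": Ed25519PrivateKey.generate,
    # "ml-dsa-65": pqc_backend.generate_keypair,  # placeholder: wire up your PQC library here
}

class Signer:
    def __init__(self, algorithm: str = "ed25519"):
        self.algorithm = algorithm
        self._key = SIGNERS[algorithm]()   # swapping algorithms becomes a one-line config change

    def sign(self, message: bytes) -> bytes:
        return self._key.sign(message)

# Usage: Signer("ed25519").sign(b"hello") today; Signer("ml-dsa-65") once the PQC backend exists.
```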
Note that with encrypted data you want to protect yourself against attackers that are capturing data today and waiting to break it in the future (Harvest-Now, Decrypt-Later). So migrating encryption is more urgent than migrating authentication.
If you're transmitting credit card info that changes every few years and can be changed on demand, that's no big deal. If you're transmitting information that will remain sensitive for decades, the time to look for methods that would stand up to quantum computing was years ago. However, today is still better than years in the future. At the very least, you can choose what to send in encrypted form over public networks and what not to send.
There are people who will scoff at the notion of quantum computing ever developing to the point where it can make an impact. There are people who scoff at the effort and expense of QKD or good ol' spooks carrying briefcases full of one-time pads. You might be right to listen to them. You might not be. It's a risk. Whether you, or your organization, can tolerate that risk is entirely dependent on you and yours.
1. Make sure your dependencies are up to date. Move to a recent version of your crypto libraries.
2. Make sure your server can install multiple certificates: you'll need that unless you control all your clients.
3. Automate certificate issuance as far as possible.
Also, what you can do now is to run the following wargame: assume the CRQC arrived. What's the business impact?
For the migration itself I see three parallel streams.
1. The main push of straightforward cases (TLS, etc.). This might need to wait a bit for software support.
2. Hard cases: crypto baked into hardware, custom protocols, keys in tight spaces (JWTs in URLs), etc. You need to bubble those up soon to make decisions on how to fix them (a rough inventory sketch follows this list).
3. External dependencies. Barely any vendor has a PQ roadmap, so asking now is probably early, but you can figure out what to do if they don't get their stuff ready in time.
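For stream 2, even a crude codebase inventory helps surface the hard cases early. A hedged sketch, where the keyword list and file extensions are purely illustrative and not exhaustive:

```python
# Crude inventory pass: grep a source tree for hardcoded classical-crypto
# identifiers so the "hard cases" surface early. Tune PATTERNS to your stack.
import pathlib
import re

PATTERNS = re.compile(r"\b(RSA|ECDSA|secp256r1|P-256|X25519|DH|3DES)\b")

def find_crypto_mentions(root: str):
    for path in pathlib.Path(root).rglob("*.*"):
        if path.suffix not in {".py", ".go", ".java", ".c", ".ts", ".tf", ".yaml"}:
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for lineno, line in enumerate(text.splitlines(), 1):
            if PATTERNS.search(line):
                yield path, lineno, line.strip()

if __name__ == "__main__":
    for path, lineno, line in find_crypto_mentions("."):
        print(f"{path}:{lineno}: {line}")
```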
What is the biggest number factored using Shor's algorithm?
Last time I looked it was very unimpressive.
Edit: It's gotten worse. 21, from 2012. "Replication of Quantum Factorisation Records with an 8-bit Home Computer, an Abacus, and a Dog" says the factorization of 35 in 2019 actually failed.
> Sometimes these days, I'll survey the spectacular recent progress in fault-tolerance, 2-qubit gate fidelities, programmable hundred-qubit systems, etc., only to be answered with a sneer: "What's the biggest number that Shor's algorithm has factored? Still 15 after all these years? Haha, apparently the emperor has no clothes!" I've commented that this is sort of like dismissing the Manhattan Project as hopelessly stalled in 1944, on the ground that so far it hasn't produced even a tiny nuclear explosion... If there's a reason why you think it can't work beyond a certain scale, say so. But don't fixate on one external benchmark and ignore everything happening under the hood, if the experts are telling you that under the hood is where all the action now is, and your preferred benchmark is only relevant later.
I'm not saying it can't work. Just that in 14 years no one has managed to factor a number larger than 21. The focus seems to have shifted to other quantum factoring approaches that don't offer a speedup over conventional computing.
I'm not the one implying that Shor's algorithm will be breaking encryption "a few years from now".
(The analogy with the Manhattan Project is apt: an adversary learning about it would have been wise to adjust their planning around the possibility of it succeeding, even if they judged that success was not a given.)
Small correction: no one has PUBLICLY managed to factor a number larger than 21.
There could be advances (foreign and domestic) that just don't get published because they represent an upper hand with regard to cryptography. So, from a game-theory perspective, not making waves is in the interest of nation states. They'll even try to be dismissive about concerns.
Then again, there are enough examples of failed projects. Why should this one be comparable to the Manhattan Project? In 1944 it was only two years underway, whereas Shor's algorithm is over 30 years old. Tons of articles have been published on quantum computing, while the A-bomb was kept as secret as possible, making learning from other countries, sometimes even from colleagues, impossible. In 1942 an atomic explosion was still hypothetical, whereas quantum computing had its first commercial service 7 years ago. Etc.
So, while in principle lack of progress doesn't guarantee failure, a comparison to the Manhattan Project is stylistic bullshit.
I'm not particularly invested either way in whether quantum computing turns out to be a major breakthrough, but this is starting to look like yet another area of computing research, like crypto and LLMs, that in recent years has been flooded by people on a hype train.
See https://algassert.com/post/2500 for details.
I get that there's a lot of R&D going on to make larger quantum computers a thing, and that there's been very definite progress, but factoring 21 is apparently still too much to expect for now. That also pushes the date when pre-quantum cryptography gets broken further into the future. If we still struggle to factor one of the smaller 5-bit numbers, the 256-bit elliptic-curve discrete logs and 2048-bit RSA moduli that deployed cryptography relies on seem quite far away.
https://waymo.com/research/safety-performance-of-the-waymo-r...
> Waymo’s rider-only ride-hailing operations reached its first one million rider-only miles on January 21, 2023
I talked to another guy with the same degree in the same field and he was concerned.
Honest question.
How can a layperson track the real-world progress of quantum computers?
The problem is that we're not trying to predict the exact future, we're hedging against possible developments. If there's a 50/50 chance of quantum computers being widely deployed for cryptoanalysis, then there's a 50% chance of this migration being useless. But you don't want to bet your security on a coin toss! So, we migrate.
That's the unfortunate truth of security: sometimes the protections are never triggered. But you still need them.
There's a good consensus that for key exchange/encryption (TLS, SSH, age, etc.) the way forward is ML-KEM-768 together with some classical algorithm, like X25519. The public keys are larger (about 1 KB), but that's usually okay unless you're working on very small microcontrollers. And you should migrate quickly because of harvest-now-decrypt-later attacks.
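To make the hybrid construction concrete, here is a rough sketch of the X25519MLKEM768 shape: a classical X25519 exchange plus a post-quantum KEM secret, both fed into one KDF so a break of either scheme alone doesn't reveal the session key. The X25519 and HKDF parts use the real pyca/cryptography API; the ML-KEM part is shown commented out in a liboqs-python style, which you should verify against whatever PQC library you actually adopt:

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# --- classical half: plain X25519, as deployed today ---
client = X25519PrivateKey.generate()
server = X25519PrivateKey.generate()
classical_secret = client.exchange(server.public_key())

# --- post-quantum half (liboqs-python-style interface; verify against your library) ---
# import oqs
# kem = oqs.KeyEncapsulation("ML-KEM-768")
# server_pk = kem.generate_keypair()
# ciphertext, pq_secret = oqs.KeyEncapsulation("ML-KEM-768").encap_secret(server_pk)
pq_secret = b"\x00" * 32  # stand-in so the sketch runs without a PQC library installed

# --- combine: both secrets go into the key derivation ---
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-kex-sketch",
).derive(classical_secret + pq_secret)
print(session_key.hex())
```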
For signatures, things are harder because there are tradeoffs. Some algorithms have large signatures (10+ KB), others require keeping state and have catastrophic consequences if subkeys are reused. And the systems around it are also more complicated: in a certificate, should you put a classical and a PQC signature together? Or should the PQC signature go in an extension? Should the extension be marked as critical and fail loudly on old clients, or should new clients have a special case to always check it if PQC signature validation is available? Or should we abandon the certificate chains and move to Merkle Tree Certificates[1]?
So signatures/authentication are still up for debate. Unless your team is on the bleeding edge of either crypto research or security risks, there's not much to do other than wait for a better consensus to form.
[1] https://postquantum.com/security-pqc/googles-merkle-tree-mtc...
> And you should migrate quickly because of harvest-now-decrypt-later attacks.
...
> So signatures/authentication are still up for debate. Unless your team is on the bleeding edge of either crypto research or security risks, there's not much to do other than wait for a better consensus to form.
I'm trying, as a layman, to find some not-too-insane middle ground between those contradictions.
If I send you a document encrypted with classical crypto today, an attacker could grab a copy, wait a few years, then decrypt with a quantum computer (Harvest-Now-Decrypt-Later). The contents of the document I sent today are exposed in the future.
For documents/transmissions that must remain confidential for 10 years, assuming a quantum computer available in 2030, you should have been encrypting them with PQC since 2020! And if deploying PQC for your clients and servers takes two years, you should have started migrating in 2018!
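The arithmetic above is essentially Mosca's inequality: if the data's shelf life plus your migration time exceeds the time until a cryptographically relevant quantum computer (CRQC), you are already late. A toy version, where every number is an assumption taken from the example above:

```python
# Toy version of the timeline arithmetic; all numbers are assumptions.
current_year = 2025      # assumed "today"
crqc_year = 2030         # assumed CRQC arrival, as in the example above
shelf_life = 10          # years the data must stay confidential
migration_time = 2       # years to roll out PQC across clients and servers

years_until_crqc = crqc_year - current_year
if shelf_life + migration_time > years_until_crqc:
    print("Classically encrypted data sent today will still be sensitive",
          "when it becomes decryptable: the migration should already be underway.")
```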
But if I send you a signed document, it's safe because you're verifying the signature today and there are no quantum computers available today to forge a new signature. The same goes for SSH authentication and web certificates, for example. They're safe right until the moment quantum computers arrive (and by then you better have a good solution!).
That's why so many open-source projects already support ML-KEM for key exchange/encryption, but signatures are still under discussion. The former is more urgent.
Quantum AI harvesting antimatter
I have been hearing about one more technical hurdle to solve before quantum algorithms become feasible since before I graduated. That was in 1996.
At the same time, moving to more secure encryption really isn't difficult. How many times have algorithms been deprecated over the past 20 or so years? It's time to do it again.
Let's just make sure that the NSA hasn't worked in any backdoors. At the latest since Snowden, anything they work on is suspect.
Hybrid encryption is as simple as running one encryption and then the other. The problem is mostly that post-quantum keys are large.
If Algo-A and Algo-B both rely on "factoring big numbers is hard!" then once the Quantumpocalypse occurs, breaking Algo-B(Algo-A(plaintext)) is no harder than asking ChatGPT 99.5 to add an extra step in your vibe coded cracking engine's frontend, such that it now does B_breaker < cyphertext | A_breaker >> plaintext.lol or whatever the equivalent is for the fashionable language of that future day.
Duke Nukem Forever was released fifteen years ago. Some things never happen until they suddenly do.
The wolf really does eat the boy at the end of The Boy Who Cried Wolf.
We are still not factoring 21, let alone 35, let alone numbers with thousands of digits.
A threshold that might be beyond what the physical properties of our universe allow. It is still unclear.
I find it weird how bleeding edge research, at the very edges of both physics and engineering, is treated as though it's a market development about to drop. Possibly a consequence of pure R&D having all but died? Getting funded requires pretending there's a business plan for what you're working on?
But, as happens in real-life politics too, the people who had just been proven wrong kept blaming the boy.
The story is told from the point of view of villagers trying to hide their culpability by blaming the victim.
What happened before that in the story?
Show the data, the charts, let people decide for themselves.
One needs to read OP's blog post in the context of his other posts from the last couple months (many of which have been discussed here on HN in one way or another), where he does discuss the science.
I'd really like to know what his current work on the subject entails, but when I try googling his stuff all I find are years-old papers, more recent meta discussion, and him making a few comments about other peoples' work.
I was sure that by now he'd have at least collaborated on some avant-garde PQ algo that was as different from the NSA-approved stuff as chacha20-poly1305 was from AES. I was hoping for a PQ-NaCl that folks would be using soon, not the libpqcrypto that seems to lack traction among devs (for reasons I do not understand). I am disappoint.
(It's probably all tucked away in some corner of the web that a layman like me will never find. Sigh.)
Edit: Hah! I gave up on looking for papers or repos and decided to just read his blog instead. Well would'ya look at that! It's non-stop PQ ranting of the kind we've come to love and cherish from DJB. No new repos or code with his imprimatur that I can see so far but better than I was expecting. Looks like I've got some reading to do....
I should have subscribed to his rss feed years ago. And his "microblog" too! https://microblog.cr.yp.to/
> if quantum computers start breaking cryptography a few years from now, don’t you dare come to this blog and tell me that I failed to warn you. This post is your warning.