If that’s the case, would the time eventually be basically irrelevant with enough compute? For instance, if what’s now a data center is able to fit in the palm of your hand (comparing early computers that took up rooms to phones nowadays). So if compute is (somehow) eventually able to be incredibly well optimized or if we use something new, like how microprocessors were the next big thing, would that then be a quantum threat to 128-bit symmetric keys?
None of those are remotely practical, even imagining quantum computers that become as fast (and small! and long-term coherent!) as classical computers.
Compute has seen somewhere in the ballpark of a 5-10 order-of-magnitude increase in instructions per second over the last 40 years. We would need an additional 20-30 orders of magnitude to make it even close to achievable with brute force in a reasonable time frame. That isn't happening with how we make computers today.
Keep in mind here that computers today have features approaching the size of a single atom, switching frequencies at which the time to cross a single chip from one end to the other is becoming multiple cycles, and power densities that require us to operate at the physical limits of heat transfer for matter at ambient conditions.
We can squeeze it quite a bit further, sure. But anything like 20-30 orders of magnitude is just laughable even with an infinite supply of unobtanium and fairy dust.
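To put rough numbers on that: here is a back-of-envelope sketch of brute-forcing a 128-bit key. The 10^18 guesses/second rate is my own assumption (roughly an exascale machine doing one guess per operation, which is already wildly generous), not a figure from the thread.

```python
# Back-of-envelope: time to brute-force a 128-bit key at an assumed
# exascale-class rate of 10^18 guesses per second.
keyspace = 2**128                  # ~3.4e38 possible keys
rate = 10**18                      # guesses per second (assumption)
seconds = keyspace / rate
years = seconds / (3600 * 24 * 365)
print(f"{years:.2e} years")        # ~1.1e13 years, roughly 800x the age of the universe
```

Even before any of the physical limits above kick in, the exponent does all the work: each extra key bit doubles the time.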
If an attacker can break the symmetric encryption in a reasonable amount of time, they can capture the output and break it later.
In addition, how are you doing the key rotation? You have to have some way of authenticating with the rotation service, and what is to stop them from breaking THAT key, and getting their own new certificate? Or breaking the trusted root authority and giving themselves a key?
I agree. The point I am trying to make is that even for asymmetric encryption (which is far more vulnerable), there are still plausible ways to make a quantum break more difficult.
The only thing that could compromise this scheme, aside from breaking the signing keys, would be to have TLS broken to the extent that viewing real-time traffic is possible. Any TLS break delayed by more than 15 minutes would be worthless.
It sounds like you’re talking about breaking TLS’s key exchange? Why would this not have the usual issue of being able to decrypt recorded traffic at any time in the future?
Edit: If it’s because the plaintext isn’t useful, as knorker got at in a sibling comment… I sure hope we aren’t still using classical TLS by the time requiring it to be broken in 1 minute instead of 15 is considered a mitigation. Post-quantum TLS already exists and is being deployed…
What makes you say that? This is the store now decrypt later attack, and it's anything but worthless.
Oh, worthless for your oauth? Uh… but how do you bootstrap the trust? Sounds to me like you need post quantum to carry the whole thing anyway.
Or you mean one key signs the next? OK, so your bet is that within the rotation window of an RSA key, RSA can't be cracked?
Why in the world would anyone want to depend on that? Surely you will also pair it with PQ?
There are enough order-of-magnitude breakthroughs needed between today and scalable quantum error correction that it makes no sense to try to guess exactly the order of magnitude of the attacks that will be feasible.
Either you believe they won't happen, in which case you can keep using long-term ECDSA keys, or you believe they will happen, in which case they are likely to overshoot your rotation period.
I don't know what the quantum future holds, but if quantum actually happens then I have low faith in your plan.
I think there are too many unknowns to bet it all on one horse.
So, if we have to change all of our infrastructure due to a supposed quantum computing threat, I'd go with HybridPQ for asymmetric encryption.
What is going on?
I think an analogy would be: imagine you are driving across North America in a car, but your engine is broken. The mechanic is nearby, so you put it in neutral and push it.
If someone said, "well, it took you half an hour to push it to the mechanic, so it will take the rest of your life to get it across North America" - that would be the wrong takeaway. If the mechanic actually fixes the engine, you'll go quite fast quite quickly. On the other hand, maybe it's just broken and can't be fixed. Either way, how fast you can push it has no bearing on how fast the mechanic can fix it or how fast it will go after it's fixed.
Maybe people will figure out quantum computers, maybe they won't, but the timeline of "factoring" 15 is pretty unrelated.
In the context of cryptography, keep in mind it's hard to change algorithms, and cryptographers have to plan for the future. They are interested in questions like: is there a > 1% chance that a quantum computer will break real crypto in the next 15 years? I think the vibe has shifted to that sounding plausible. It doesn't necessarily mean it will happen; it's just become prudent to plan for that eventuality, and now is when you would have to start.
I’ve seen so much change so fast my assumption is someone did it already and preprints are making the rounds.
https://bas.westerbaan.name/notes/2026/04/02/factoring.html
It doesn't say much by itself, but it has four very good links on the subject. One of these has a picture of the smallest known factor-21 circuit, which is vastly larger than the factor-15 circuit and comparable to those for much larger numbers. Another is Scott Aaronson's article comparing a demand to factor small numbers with a demand for a "small nuclear explosion": if it's 1940 and you can't yet make a small nuclear explosion, that doesn't mean you're much farther away from a big one.
To get useful results, a quantum computer needs all of its qubits to stay entangled with each other until the entire group collapses into the result. With current technology, it is very difficult for a reasonably sized group of qubits to stay coherently entangled, so quantum computers can only solve problems that are also relatively easy to solve on classical computers.
If someone today were to figure out how to keep large numbers of qubits entangled, then quantum computing would instantly be able to break any encryption that isn't quantum safe. It's not something that we are slowly working toward; it's a breakthrough, and we can't predict when, or even if, it will happen.
Shor's and Grover's are still algorithms that require a massive number of steps...
WPA3 moved from symmetric AES to ECDH, which is vulnerable to quantum attacks. Gonna be a tonne of IoT inverter waste.
They say the 's' in IoT stands for secure, and in my experience that is true. Pretty much nothing is getting thrown out just because it isn't secure.
...but even if they had, what realistically could they have done about it? ML-KEM was only standardized in 2024 [1].
also, the addition of ECDH in WPA3 was to address an existing, very real, not-theoretical attack [2]:
> WPA and WPA2 do not provide forward secrecy, meaning that once an adverse person discovers the pre-shared key, they can potentially decrypt all packets encrypted using that PSK transmitted in the future and even past, which could be passively and silently collected by the attacker. This also means an attacker can silently capture and decrypt others' packets if a WPA-protected access point is provided free of charge at a public place, because its password is usually shared to anyone in that place.
0: https://en.wikipedia.org/wiki/Wi-Fi_Protected_Access#WPA3
1: https://en.wikipedia.org/wiki/ML-KEM
2: https://en.wikipedia.org/wiki/Wi-Fi_Protected_Access#Lack_of...
why do you have to assume that?
you're at Acme Coffeeshop. their wifi password is "greatcoffee" and it's printed next to the cash register where all customers can see it.
with WPA2 you have to consider N possible adversaries - Acme Coffee themselves, as well as every single other person at the coffeeshop.
...and also anyone else within signal range of their AP. maybe I live in an apartment above the coffeeshop, and think "lol it'd be fun to collect all that traffic and see if any of it is unencrypted".
with WPA3 you only have to consider the single possible adversary, the coffeeshop themselves.
And for ECC: I know many are using 2^255 - 19 (Curve25519) because it's unlikely to be backdoored, but it's only ~256 bits... Can't we find, say, 2^2047 - 19 (just making that one up) and be safe for a while too?
Basically: for RSA and ECC, is there anything preventing us from using keys 10x bigger?
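On the 2^255 - 19 / 2^2047 - 19 part of the question: candidate primes of that shape are at least cheap to check mechanically. A minimal sketch (the function name, witness count, and the choice of Miller-Rabin are my own, not from the thread):

```python
import random

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test (sketch)."""
    if n < 2:
        return False
    # trial division by a few small primes first
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # write n - 1 as d * 2^s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a witnesses that n is composite
    return True

print(is_probable_prime(2**255 - 19))   # → True (the Curve25519 prime)
print(is_probable_prime(2**2047 - 19))  # the made-up candidate; no promises
```

Finding a big prime is the easy part, though; designing a safe curve over it (twist security, rigidity, constant-time-friendly arithmetic) is where the real work is, and none of it changes the quantum picture.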
That's correct. The quantum computer needs to be "sufficiently larger" than your RSA key.
> Basically: for RSA and ECC, is there anything preventing us from using keys 10x bigger?
For RSA things get very unwieldy (but not technically infeasible) beyond 8192 bits. For ECC there are different challenges, some of which have nothing to do with the underlying cryptography itself: one good example is how the OpenSSH team still haven't bothered supporting Ed448, because they consider it unnecessary.
you can run benchmarks yourself: openssl speed rsa1024 rsa2048
also, this (slightly dated) javamex writeup covers it well: https://www.javamex.com/tutorials/cryptography/rsa_key_lengt...
tl;dr: the trade-off is between performance and how many years the data needs to be assumed confidential
the time to run the algorithm has cubic scaling, so 10x bigger keys mean ~1000x more time required.
but it remains exponentially faster than classical factoring: 1 minute becomes under a day, 1 day becomes about 3 years. still "easily" broken
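The cubic-scaling figures above are trivial arithmetic to check (the 10x key-size factor comes from the parent question):

```python
# Shor-type algorithms scale roughly cubically in key size,
# so a 10x larger key costs about 10^3 = 1000x more time.
factor = 10**3
minute = 60                          # seconds
day = 86400                          # seconds
print(minute * factor / 3600)        # 1 minute -> ~16.7 hours
print(day * factor / (86400 * 365))  # 1 day -> ~2.7 years
```

So scaling up keys buys some margin against a quantum attacker, but only a polynomial amount, unlike the exponential margin it buys against classical factoring.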
Grover attacks are very blatantly impractical. When someone describes Grover-type attacks in the same breath as Shor-type attacks, without caveats, that's a red flag.
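To see why: Grover only halves the effective key length, and its iterations must run serially, so they can't be parallelized away. A rough sketch (the 10^9 quantum-operations-per-second rate is my own very generous assumption):

```python
# Grover's search on AES-128 needs ~2^64 sequential iterations;
# assume an (extremely optimistic) 10^9 iterations per second.
iterations = 2**64
rate = 10**9                         # iterations per second (assumption)
years = iterations / rate / (3600 * 24 * 365)
print(f"{years:.0f} years")          # centuries for ONE key, on one machine
```

And that ignores error correction overhead entirely, which makes the real cost far worse.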
every encryption scheme has at least one way to be decrypted.
fidelity of information is one use of encryption: if you apply the solution and get garbage, you know something is wrong somewhere.
occultation of information is another use, one that is commonly abused by extending undue trust. under the proviso that encryption will eventually be broken, you can't trust encryption to keep a secret forever, but you can keep it secret long enough that it is no longer applicable to an attack or slightly askew use case. thus aggressive rotation of keys becomes desirable