> To my knowledge, no one has cheated at factoring in this way before. Given the shenanigans pulled by past factoring experiments, that’s remarkable.
[1] https://sigbovik.org/2025/; standalone paper is also available in the code repository https://github.com/strilanc/falling-with-style
[2] Who has previous experience in cheating at quantum factoring: see "Factoring the largest number ever with a quantum computer", posted on April Fools' Day 2020 at https://algassert.com/post/2000
I really hope he eventually gets the recognition he deserves, outside of just experts in the field.
The paper's formatting clearly went wrong here, as it should have read p = 2^n - 1 and q = 2^m + 1.
The "Proposed Quantum Factorisation Evaluation Criteria" are excellent, but for measuring progress, the required minimum factor size of 64 bits is too large. A good milestone would be a quantum circuit that can factor the product of any pair of 5-bit primes {17,19,23,29,31}.
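To make the milestone concrete, here's a quick sketch (plain Python, my own enumeration, not from the paper) of the challenge set that criterion implies — every product of two distinct 5-bit primes:

```python
from itertools import combinations

# The five 5-bit primes named above.
primes = [17, 19, 23, 29, 31]

# All semiprimes a milestone circuit would need to handle.
# (Distinct pairs only; whether p = q should also count is my guess.)
targets = sorted(p * q for p, q in combinations(primes, 2))
print(targets)
# 10 targets, from 17*19 = 323 up to 29*31 = 899 — all fit in 10 bits.
```

Ten targets, none of them known in advance to the circuit: that's what would make it a real milestone rather than a staged demo.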
It starts here: https://www.metzdowd.com/pipermail/cryptography/2025-Februar...
This part is from farther down thread:
"Just as a thought experiment, what's the most gutless device that could perform this "factorisation"? There's an isqrt() implementation that uses three temporaries so you could possibly do the square root part on a ZX81, but with 1k of RAM I don't think you can do the verification of the guess unless you can maybe swap the values out to tape and load new code for the multiply part. A VIC20 with 4k RAM should be able to do it... is there a programmable calculator that does arbitrary-precision maths? A quick google just turns up a lot of apps that do it but not much on physical devices.
Peter."
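For anyone who hasn't read the thread: the "factorisation" Gutmann is mocking works because the staged numbers are products of two nearly adjacent primes, so a square root plus a one-step Fermat search recovers the factors. A sketch (my reconstruction of the trick, not his code — the `isqrt` is the classic Newton iteration with two working variables, which is why he thinks a ZX81 could manage that part):

```python
def isqrt(n):
    # Integer square root by Newton's method; x and y are the only
    # working temporaries, hence the "gutless device" speculation.
    x = n
    y = (x + 1) // 2
    while y < x:
        x = y
        y = (x + n // x) // 2
    return x

def factor_close(N):
    # Fermat's method: for N = p*q with p, q close together,
    # a = (p+q)/2 sits just above isqrt(N), so this loop exits
    # almost immediately on the staged challenge numbers.
    a = isqrt(N)
    while True:
        a += 1
        b2 = a * a - N
        b = isqrt(b2)
        if b * b == b2:
            return a - b, a + b
```

E.g. `factor_close(899)` returns `(29, 31)` after a single iteration. (Modern Python has `math.isqrt`, but the point of the hand-rolled version is the tiny memory footprint.)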
All those other applications, no matter how neat, strike me as quite niche. Like "simulate pairs of electrons in the Ising model". Cool. Is that a multi-billion-dollar industry, though?
Or as another example, I'm currently at a conference listening to a PhD student's research on biomolecular structure prediction (for protein design).
It's a device that makes and analyzes at the same time; check out this primer:
https://warwick.ac.uk/fac/sci/chemistry/research/oconnor/oco...
I've always heard quantum algorithms for chemistry compared to classical methods, though. Why do you think chemists are using CCSD and similar methods rather than FT-ICR mass spectrometry?
It's of interest to governments, for national security reasons. Quantum computing is an arms race.
PQC is as much a tool to reduce funding for QC as it is a tool against an actual eventual quantum computer.
I'm not sure that is true in the way it is intended. The NMOS transistors used in the 6502 were quite large and worked on the basis of electrostatic charge ... as opposed to bipolar transistors, which are inherently quantum in operation.
Of course it is now understood that everything that does anything is at some level dependent on quantum effects. That would include the dog...
Forming a conductive channel in silicon in any FET and semiconductivity in general is an inherently quantum effect too, right?
So only a quantum effect to the extent all effects are at some level quantum.
(Beware of typo pointed out by tromp here.)
The dog is funny, but the fix it implies is simple: pick genuinely random numbers from a bigger range than the staged phony numbers these quantum factorisation demos use.
Brilliant.
>Similarly, we refer to an abacus as “an abacus” rather than a digital computer, despite the fact that it relies on digital manipulation to effect its computations.