People building these things are trying to oversell their achievements while carefully avoiding making them easy to check, reproduce, or objectively compare to others. They're hard to evaluate objectively even for people who work in the field but haven't worked on the exact technology platform being reported on. Metrics are tailored to marketing goals: for example, IBM made up a performance metric called "quantum volume", only to basically stop using it when it seemed to no longer favour them.
That being said, it's also undeniable that quantum computing is making significant progress, error correction being a major milestone. What this ends up being actually used for, if anything, remains to be seen (I'm rather sure we'll find something).
The latter has released quantum computers with thousands of qubits, but these qubits are not comparable with the physical qubits in a gate-model computer (and especially not with logical qubits from one).
Since no one has many qubits yet, comparisons are typically made between physical qubits rather than logical qubits (the error-corrected ones).
The other key figures of merit are the 1-qubit and 2-qubit gate fidelities (basically the success rates). The 2-qubit gate is typically more difficult and has lower fidelity, so people often compare qubits by looking only at the 2-qubit gate fidelity. Each 9 added to the 2-qubit gate fidelity (e.g. going from 99% to 99.9%) is expected to reduce the ratio of physical to logical qubits by roughly an order of magnitude.
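To make that concrete, here's a back-of-the-envelope sketch. All the constants are generic textbook ballpark assumptions (a ~1% threshold, a 1e-12 logical error budget, surface-code-style scaling), not figures for any particular machine:

```python
# Toy model of error-correction overhead vs. 2-qubit gate fidelity.
# Assumptions (not vendor figures): logical error ~ (p/p_th)^((d+1)/2),
# threshold p_th ~ 1%, and ~2*d^2 physical qubits per logical qubit
# at surface-code distance d.

P_TH = 1e-2     # assumed threshold error rate
TARGET = 1e-12  # assumed per-gate logical error budget

def overhead(two_qubit_fidelity: float) -> tuple[int, int]:
    """Return (code distance, ~physical qubits per logical qubit)."""
    p = 1.0 - two_qubit_fidelity
    if p >= P_TH:
        raise ValueError("at/above threshold: error correction won't converge")
    d = 3  # surface-code distances are odd
    while (p / P_TH) ** ((d + 1) / 2) > TARGET:
        d += 2
    return d, 2 * d * d

for f in (0.999, 0.9999, 0.99999):
    d, per_logical = overhead(f)
    print(f"2q fidelity {f}: distance {d}, ~{per_logical} physical per logical")
```

With these made-up constants the overhead lands in the ~1,000-physical-per-logical range at 99.9% fidelity and falls steeply with each added 9, though the exact factor per 9 depends on the code and the target error rate, so treat the order-of-magnitude rule above as loose.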
In architectures where qubits are fixed in place and can only talk to their nearest neighbours, moving information around requires swap gates, which are built out of the elementary 1- and 2-qubit gates. Some architectures have mobile qubits and all-to-all connectivity, so their proponents hope to avoid swap gates, considerably reducing the number of 2-qubit gates required to run an algorithm and thus leaving fewer errors to deal with.
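To see why limited connectivity is costly in gate counts: one SWAP decomposes into three CNOTs, a standard textbook identity that a few lines of numpy can verify (nothing here is specific to any vendor's hardware):

```python
# SWAP = CNOT(0->1) · CNOT(1->0) · CNOT(0->1): every nearest-neighbour
# hop costs three 2-qubit gates. Basis order is |q0 q1>: 00, 01, 10, 11.
import numpy as np

CNOT_01 = np.array([[1, 0, 0, 0],   # control qubit 0, target qubit 1
                    [0, 1, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0]])
CNOT_10 = np.array([[1, 0, 0, 0],   # control qubit 1, target qubit 0
                    [0, 0, 0, 1],
                    [0, 0, 1, 0],
                    [0, 1, 0, 0]])
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

assert np.array_equal(CNOT_01 @ CNOT_10 @ CNOT_01, SWAP)
print("SWAP == three CNOTs")
```

So moving a qubit k sites across a nearest-neighbour grid costs roughly 3k extra 2-qubit gates, which is exactly the overhead the all-to-all architectures hope to avoid.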
Some companies, particularly ones on younger architectures that perhaps have much better gate fidelities, argue that their scheme is better by virtue of being more "scalable" (having more potential in the future).
It is expected that in the future the overall clock speed of the quantum computer will matter, since the circuits we ultimately want to run are expected to be extremely deep. But since we're far from the point where this matters, clock speed is rarely brought up.
In general, different architectures have different advantages. With different proponents having different beliefs about what matters, it was once described to me as each architecture having its own religion.
TL;DR: the two key stats are number of qubits and 2-qubit gate fidelity.
Are there any real-world applications yet? Or is the real-world application quantum-state experiments?
I think we are pretty far from using it as a general-purpose computer, or even for specialized (disruptive) use cases like factorization. So who could use it with benefit?
If I had to bet on what (impactful) application might come first, I'd guess simulation of chemical/physical properties used for drug development and materials science.
But they offer it for rent. Who would be a buyer for the quantum part of the hybrid?
"Research institutions" but for what kind of research?
Or is this rather wishful thinking/PR "we bring quantum computing to the market (just nobody uses it)"?
Quantum computing research. I'd guess a big chunk of revenue will come from universities and research institutes. Some companies might also pay for it, e.g. quantum computing startups in need of anything they can show before they have hardware, or startups that aren't even planning to build their own hardware.
There are people working on finding useful problems that these devices might help with, on how to best make use of them, and on how to build "infrastructure" around them. It's useful for them to have something to play with. Also, many organizations want to be (seen as) at the forefront of quantum computing, know the current capabilities, strengths, and weaknesses of the various platforms, train and educate people about quantum computing and quantum technology in general, etc.
What would be required to factor a 1024 bit integer key?
Read as: I've heard for nearly 30 years that quantum is just around the corner, and we need post quantum cryptography.
Or, as Reverend Sharpton said: "All hell's gunna break loose; and you're gunna need a Bitcoin!"
When you compare it to the historical development of classical computers it's proceeding at a decent rate. Imagine if we'd needed hundreds of thousands of transistors before being able to demonstrate actually useful work by a classical computer. They likely never would have been developed in the first place.
Cryptography-wise, I'd expect dire warnings about any theoretical attack that's reasonably plausible. Better to react immediately than to sit around waiting for it to materialize. It took over 15 years after the warnings for SHA-1 to be broken in practice, and I don't necessarily expect that SHA-2 ever will be, but we've moved on to SHA-3 nonetheless.
That was 80 years ago, for the military. So plotting that out: the first actual PC came 25 years after ENIAC was decommissioned, the IBM PC 5150, with 29,000 transistors in its 8088. Twelve years later, the 586 had 3.1 million transistors; the P4 had 42 million; ten years later (2003) the P4EE had 169 million (but a year earlier there were only 65 million in the P4). Haswell, ten years later, had 1.4 billion transistors. In 2023, AMD's Ryzen 7 7800X3D had 6.5 billion transistors.
Here's a graph I threw together to see what the trendline was: https://i.imgur.com/4ofV7Xr.png
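If you want a number instead of eyeballing the image, a minimal least-squares fit over the data points quoted above gives the doubling time (the launch years for the 586 and P4 are my assumptions; the transistor counts are the figures from the comment):

```python
# Fit log2(transistor count) against year to estimate the doubling time.
import math

points = [
    (1981, 29_000),         # IBM PC 5150 (8088)
    (1993, 3_100_000),      # 586-era Pentium (assumed year)
    (2000, 42_000_000),     # Pentium 4 (assumed year)
    (2003, 169_000_000),    # Pentium 4 Extreme Edition
    (2013, 1_400_000_000),  # Haswell
    (2023, 6_500_000_000),  # Ryzen 7 7800X3D
]

xs = [year for year, _ in points]
ys = [math.log2(count) for _, count in points]
n = len(points)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))

print(f"{slope:.2f} doublings/year, doubling time ~{1 / slope:.1f} years")
```

For these points it works out to a doubling time of roughly 2.3 years, close to the classic Moore's-law cadence.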
That is a very broad range of possibilities, so allow me to narrow it to cryptography. I am by no means an expert on this, but I spent the weekend reading about the quantum motivations for changing the cryptographic algorithms society uses, and as far as I can tell, nobody knows what the hard lower bound is for breaking the hardness assumptions in classical cryptography. The best guess is that it is many orders of magnitude beyond what current machines can do.
We are so far from machines capable of this that it is unclear whether one will be built in our lifetimes, despite optimism/fear/hope (depending on who you are) to the contrary.
> What would be required to factor a 1024 bit integer key?
I assume you mean a 1024-bit RSA key (if you mean a 1024-bit ECC key, I want to know what curve). You can crack 1024-bit RSA with 2051 logical qubits according to:
https://arxiv.org/abs/quant-ph/0205095
For it to actually work, it is believed that you will need between 1,000 and 10,000 physical qubits for every logical qubit, so it could take up to roughly 2051 × 10,000 ≈ 20 million physical qubits.
Coincidentally, the following paper claims that cracking a 2048-bit RSA key can be done in 8 hours with 20 million physical qubits:
https://arxiv.org/abs/1905.09749
That sounds like it should not make sense given the previous upper estimate of 20 million physical qubits for a 1024-bit RSA key. As far as I can tell, there are different ways of implementing Shor's algorithm, and some use more qubits while others use fewer. The biggest factor in the number of physical qubits is the error correction: if you can do better error correction, you can use fewer physical qubits per logical qubit.
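For a rough sanity check of how these figures relate, here is the arithmetic in Python. The 2n + 3 logical-qubit count is Beauregard's (the first paper above); the 1,000–10,000 physical-per-logical range is the estimate quoted earlier, not a measured value:

```python
# Reproduce the qubit estimates above for n-bit RSA using
# Beauregard's 2n + 3 logical-qubit circuit (quant-ph/0205095)
# and an assumed physical-per-logical overhead range.

def shor_qubits(n_bits: int, phys_per_logical: int) -> tuple[int, int]:
    logical = 2 * n_bits + 3
    return logical, logical * phys_per_logical

for n in (1024, 2048):
    for ratio in (1_000, 10_000):
        logical, physical = shor_qubits(n, ratio)
        print(f"RSA-{n}: {logical} logical, ~{physical:,} physical (ratio {ratio:,})")
```

At the optimistic end of the range, RSA-2048 comes out near 4 million physical qubits with this circuit, while Gidney and Ekerå arrive at 20 million with their own layout and error-correction assumptions, which is exactly why headline qubit counts from different papers aren't directly comparable.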
There might be applications other than factoring that can be addressed with the noisy qubits we can actually create.
https://www.dwavequantum.com/company/newsroom/press-release/...
The news is light on technical details. Beyond that, I have no clue about useful applications.