https://www.youtube.com/watch?v=lZ3bPUKo5zc&list=PLUl4u3cNGP...
It's long, and the subject matter is intimidating at times, but watch, re-watch, then go deep by finding papers on subjects like superposition and entanglement, which are the key quantum phenomena that unlock quantum computing.
It also helps to understand a bit about how the various qubit modalities are physically operated and affected by their control systems (e.g. how does a program turn into qubit rotations, readouts, and other instruction executions). Some are superconducting chips driven by electromagnetic (microwave) pulses, some trap an ion/atom and use lasers to manipulate its state, and some are photonic chips routing light through gates - among a handful of other modalities across industry and academia.
IBM's Qiskit platform may still have tooling, simulators, and visualizers that help you write a program and step through the operations on the qubit(s) managed by the program.
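For instance, here is a minimal sketch of that "step through the program" idea (assuming a reasonably recent Qiskit install; the Bell-state circuit is just an example):

    # Build a tiny circuit and print the simulated state after each instruction,
    # which is roughly what the visual step-through tools show.
    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    qc = QuantumCircuit(2)
    qc.h(0)        # put qubit 0 into superposition
    qc.cx(0, 1)    # entangle qubit 1 with qubit 0

    state = Statevector.from_label("00")
    for inst in qc.data:
        qargs = [qc.find_bit(q).index for q in inst.qubits]
        state = state.evolve(inst.operation, qargs=qargs)
        print(inst.operation.name, state.probabilities_dict())

After the h you should see a 50/50 split between two outcomes, and after the cx the familiar Bell-state split over 00 and 11.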
> how does a program turn into qubit rotations, readouts, and other instruction executions
What is actually involved in the "instruction set" for a quantum computer? How do you "compile" to it? If I treat everything below a "logical qubit" (https://en.wikipedia.org/wiki/Physical_and_logical_qubits) as a black box, since from a programming point of view it does not(?) matter, can I think of it using classical computation models?
This is analogous to how one does not need to know semiconductor physics (which is quantum physics) or electronic component physics to understand the Boolean logic framework built on top of them, which is then synthesized into an instruction set to program against.
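To make the analogy concrete, here is roughly the kind of "compile" step I have in mind, sketched with Qiskit's transpiler (the basis-gate set below is just an illustrative choice, not any real device's):

    # A sketch of "compiling" a circuit down to a hardware-style gate set.
    from qiskit import QuantumCircuit, transpile

    qc = QuantumCircuit(2, 2)
    qc.h(0)
    qc.cx(0, 1)
    qc.measure([0, 1], [0, 1])

    # Rewrite the circuit using only "native" gates, much like lowering a
    # high-level language to a CPU's instruction set.
    compiled = transpile(qc, basis_gates=["rz", "sx", "x", "cx"], optimization_level=1)
    print(compiled.count_ops())

Everything below that (how each native gate becomes pulses on the device) is what I'd like to treat as the black box.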
It's worth noting: the book assumes a fair bit of mathematical background, especially in linear algebra. If you don't have the equivalent of an undergrad CS/math/physics degree (with some linear algebra), it may be better to start with gentler sources.
One such gentler source is the free online text I wrote with Andy Matuschak -- https://quantum.country. I'm sure there are others which are very good, but perhaps that's helpful!
Both books focus on foundations of the field, and don't cover recent innovations -- the book with Ike Chuang is 26 years old! Still, many of the foundations have remained quite stable.
Given your experience in this domain, I would appreciate your take on quantum computing hype vs. reality. There is a lot of contradictory information out there, for example The Case Against Quantum Computing - https://spectrum.ieee.org/the-case-against-quantum-computing
Do you think quantum computing will ever become mainstream? Will the "common folk" be able to program and use it with the same ease with which we do classical computers by using layers of abstractions?
Quantum Mechanics and Quantum Computation by Umesh Vazirani (UC Berkeley course) - https://youtube.com/playlist?list=PL74Rel4IAsETUwZS_Se_P-fSE...
It's old, but really good.
Another nice one is:
Introduction to Classical and Quantum Computation by Wong - https://www.thomaswong.net/introduction-to-classical-and-qua... [PDF]
These are really nice.
My favorite QM book is the one by Eisberg and Resnick. I recommend it to other people.
There are some nice recommendations in this thread:
- Nielsen, Chuang
- quantum.country by Nielsen
- The IBM Qiskit ecosystem, community, platform, etc. are active and welcoming
Manning Publications has some books on this theme. It's worth searching through them.
For a sampler, I just watched the qubit ones and they are excellent.
2) The physics/architecture/organization depends heavily on the type of computer being discussed. In classical computing, one "type" of computer has won the arms race. This is not yet the case for quantum computers; there are several different physical platforms through which people are trying to realize computation: trapped ions, superconducting qubits, photonics, quantum dots, neutral atoms, etc.
3) There are several ways you can simulate quantum computation on classical hardware. Perhaps the most common is through something like IBM's Qiskit, where you keep track of the degrees of freedom of the quantum computer throughout the computation and apply quantum logic gates in circuits (a bare-bones sketch of this idea appears after this list). Another, more involved method is tensor network simulation, which gives efficient classical simulators of a restricted subset of quantum states.
4) In terms of research, one particularly interesting (although I'm biased by working in the field) application is quantum algorithms for nuclear/high energy physics. Classical methods (lattice QCD) suffer from extreme computational drawbacks (factorial scaling in the number of quarks, NP-hard Monte Carlo sign problems), and one potential way around this is using quantum computers instead of classical ones to simulate nuclear systems ("The best model of a cat is another cat; the best model of a quantum system is another quantum system").
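To make point 3 concrete, here is a bare-bones version of the statevector idea in plain NumPy (an illustrative sketch of the bookkeeping, not how any particular library is implemented):

    # Track all 2^n amplitudes explicitly and apply gates as matrices.
    # This prepares a Bell state: H on the first qubit, then CNOT.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])   # control = first qubit

    state = np.array([1, 0, 0, 0], dtype=complex)   # |00>
    state = np.kron(H, I) @ state                   # Hadamard on the first qubit
    state = CNOT @ state                            # entangle the two qubits
    print(np.abs(state) ** 2)   # probabilities for |00>, |01>, |10>, |11>

The exponential cost is visible right away: n qubits need a vector of 2^n complex amplitudes, which is why restricted but efficient simulators like tensor networks matter.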
If you're interested in learning more about QC, I would highly recommend looking at Nielsen and Chuang's "Quantum Computation and Quantum Information", it's essentially the standard primer on the world of quantum computation.
The Nielsen/Chuang book is what I see recommended everywhere, so I am definitely going to get it. What others would you recommend?
I had recently asked a similar question about books on "Modern Physics" (essentially Quantum Physics + Relativity) here https://news.ycombinator.com/item?id=46473352 so given your profile, what would be your recommendations?
PS: You might want to add your website url to your HN profile since your Physics Notes might be helpful to a lot of other folks too. :-)
As for Modern Physics, if you have the math prerequisites and you want a broad overview, the series of textbooks by Landau and Lifshitz would be my go-to. However, the problems are quite challenging and the text is relatively terse. I think the only other textbook that I've used personally would be Halliday, Resnick, and Krane. I didn't read a great deal of the textbook, but I do recall finding it relatively well-written.
For computation models, the circuit model and measurement-based computation cover most real work. Aaronson’s Quantum Computing Since Democritus and Nielsen & Chuang explain why quantum differs from classical (interference, amplitudes, complexity limits).
For computers/architecture, think of qubits as noisy analog components and error correction as how digital reliability is built on top. Preskill’s NISQ notes are very clear here.
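A toy illustration of that "digital reliability from noisy parts" idea is the 3-qubit bit-flip repetition code (a sketch assuming Qiskit; real codes such as the surface code are far more involved):

    # Encode one logical bit redundantly, inject a single bit-flip error,
    # then decode by majority vote.
    from qiskit import QuantumCircuit

    qc = QuantumCircuit(3, 1)
    # Encode: copy the basis-state value of qubit 0 onto qubits 1 and 2
    qc.cx(0, 1)
    qc.cx(0, 2)
    # A single bit-flip error on any one qubit could happen here
    qc.x(1)
    # Decode: majority vote via CNOTs plus a Toffoli correction
    qc.cx(0, 1)
    qc.cx(0, 2)
    qc.ccx(1, 2, 0)
    qc.measure(0, 0)   # qubit 0 carries the original logical value again

Real quantum error correction also has to handle phase errors and do all of this without measuring the encoded state directly, which is where stabilizer codes come in.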
For programming, most work is circuit construction and simulation on classical hardware (Qiskit, Cirq). That’s normal and expected.
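That day-to-day work looks something like this (a minimal sketch using Cirq as one example; Qiskit code is very similar, and the circuit itself is arbitrary):

    # Build a small circuit, then sample it on a classical simulator.
    import cirq

    q0, q1 = cirq.LineQubit.range(2)
    circuit = cirq.Circuit(
        cirq.H(q0),                # superposition on one qubit
        cirq.CNOT(q0, q1),         # entangle the pair
        cirq.measure(q0, q1, key="m"),
    )

    result = cirq.Simulator().run(circuit, repetitions=1000)
    print(result.histogram(key="m"))   # roughly half 0 (00) and half 3 (11)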
Beyond Shor, look at Grover, phase estimation, and variational algorithms—they show how quantum advantage might appear, even if it’s limited today.
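For a feel of why Grover works, a 2-qubit toy version that marks |11> fits in a few lines (a sketch assuming Qiskit; for two qubits a single Grover iteration already finds the marked state with certainty):

    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    qc = QuantumCircuit(2)
    qc.h([0, 1])       # uniform superposition over the four basis states

    # Oracle: flip the phase of the marked state |11>
    qc.cz(0, 1)

    # Diffusion operator: reflect all amplitudes about their mean
    qc.h([0, 1])
    qc.x([0, 1])
    qc.cz(0, 1)
    qc.x([0, 1])
    qc.h([0, 1])

    print(Statevector(qc).probabilities_dict())   # essentially all weight on '11'

Interference is doing the work here: amplitudes on the unmarked states cancel while the marked one is amplified.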
Yep, that is how I framed my question; glad to see it validated.
Thanks for the pointer to Preskill's NISQ notes.
1/ Digital and analog - where "digital" means qubits and "analog" means photonics, diamonds, or a range of other bit replacements.
2/ Qubits and gates are the building blocks and operations in digital. Photons, diamonds, electrons, and so on are the "bits" in analog; you can encode information into any of these in various ways.
3/ Strawberry Fields for analog QC, and IBM's Qiskit for digital.
I work on photonic integrated circuits, adapting them to remove physical limitations on capacity such as heat and information loss.
What are some good resources that you would recommend to study and understand the above?
Also do you think QC will ever become mainstream like classical computing?
The online tutorial [2] is a good followup, especially if you want to understand Clifford gates / stabilizer states, which are important for quantum error correction.
If you have a more theoretical bent, you may enjoy learning about the ZX-calculus [3] - I found this useful for understanding how measurement-based quantum computing is supposed to work.
[1] https://cs.uwaterloo.ca/~watrous/QC-notes/QC-notes.pdf [2] https://qubit.guide/ [3] https://zxcalculus.com/
hershkumar pointed to Watrous' book, so the notes you point to might be a good introduction to the book itself.
I didn't know of the "ZX-calculus", so that goes from my unknown-unknowns to known-unknowns, and there's a bunch of reading to be done there too.
Goes through qubits, state vectors, and Grover's algorithm in a highly visual and intuitive fashion. It doesn't discuss the underlying quantum mechanics in depth, but it does mention and link out to resources for the interested viewer to delve deeper.
More mathy: A. Yu. Kitaev, A. H. Shen, M. N. Vyalyi, "Classical and Quantum Computation"
A killer app: Peter Shor, "Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer"
Some course notes: https://math.mit.edu/~shor/435-LN/
However, how approachable is the "Classical and Quantum Computation" book? Mathematics is fine as long as it is accessible. Also, how good is its explanation of the analogies/comparisons between concepts from "Classical Computation" and "Quantum Computation"? I believe this is the best way to learn the subject, and hence am quite interested to know how the book handles it.