14 points by mgh2, 5 hours ago | 5 comments
  • pjmlp, 24 minutes ago
    If you think programming a GPU is hard, try learning how to compute a factorial on one of those quantum emulators.

    Here is Microsoft's:

    https://learn.microsoft.com/en-us/azure/quantum/qdk-main-ove...

  • petra, 5 hours ago
    Nvidia has more money than God. Worst case they'll buy the competition.
  • RoyTyrell, 5 hours ago
    *yawn* Maybe D-Wave should put up or shut up. QC companies and bro-advocates have been saying this for years, and there's been very little use outside of pure R&D labs.

    I don't believe that QC is going to match the ease of use, time to deployment, and relatively low cost that GPUs offer any time soon - if ever.

    • cwillu, 3 hours ago
      QC could have all of those things and it would still not be a threat. Using a quantum computer for general computation is like using a front-end loader to go grocery shopping: it's a spectacular improvement for the task it's designed for, and utterly useless for the vast majority of other tasks.
  • Melatonic, 3 hours ago
    ...do quantum computers and GPUs have a lot of overlap in the types of tasks they compute? I was under the impression they solve quite different problems.
    • duskwuff, 3 hours ago
      Do present-day quantum computers compute any nontrivial tasks (i.e. beyond factoring the number 15)?
    • cwillu, 3 hours ago
      Correct, there's almost no overlap.
    • hank808, 3 hours ago
      No.
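    For context on the "factoring the number 15" quip above: 15 is the canonical demo target for Shor's algorithm. The quantum hardware's only job is to find the period r of a^x mod N; the factors then fall out of classical post-processing. Here is a minimal sketch of that classical half (assuming the base a = 7, and brute-forcing the period in place of the quantum subroutine):

    ```python
    from math import gcd

    def shor_classical_postprocess(N, a):
        """Classical half of Shor's algorithm: given the period r of
        a^x mod N, recover two nontrivial factors of N."""
        # Brute-force the period here; on real hardware this step is
        # the quantum subroutine, and is the only source of speedup.
        r = 1
        while pow(a, r, N) != 1:
            r += 1
        if r % 2 == 1:
            return None  # need an even period; retry with another a
        x = pow(a, r // 2, N)
        if x == N - 1:
            return None  # trivial square root; retry with another a
        return gcd(x - 1, N), gcd(x + 1, N)

    print(shor_classical_postprocess(15, 7))  # -> (3, 5)
    ```

    For N = 15 and a = 7 the period is r = 4, so the factors come from gcd(7^2 ± 1, 15) = 3 and 5 — which is roughly the extent of what present-day machines have demonstrated.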
  • mgh2, 2 hours ago
    Sounds like another hype cycle coming...