• tallwookie@lemmy.world

    Yes… but quantum computers don't perform the same sorts of tasks as traditional computers (desktop/laptop/phone) do. So, yeah, the new generation is absolutely, ridiculously fast, but in a very limited/niche way.

    • TheBeege@lemmy.world

      I’m not so familiar with quantum computers. Can you describe how they accomplish different tasks?

      My understanding was that the key difference was bits versus qubits, basically translating to individual operations calculating more data. So a bit is x^2, but a qubit is x^8, if I remember correctly. Under the hood, it's all still math, but with a different base number system. Everything would have to be rebuilt at the lowest layer, but abstractions over bitwise operations should remain the same, I thought. Maybe my base understanding of quantum computers is wrong? I'm curious.

      • NewNewAccount@lemmy.world

        Not an expert by any means but I did study a related field some years ago. As I understand it, traditional computers are good at solving problems that can be represented by deterministic finite automata. Quantum computers, on the other hand, can solve problems that are more readily represented by nondeterministic finite automata.

        Basically, traditional computers are one (complex) machine that can do a single thing at a time (very quickly). Quantum computers, with their qubits, are like dozens of machines all computing simultaneously and solving all possible inputs and outcomes at once.

        Take this with a grain of salt because I’ve been told by people much smarter than me that my understanding is flawed in a way I couldn’t quite grasp.

        Here’s some more info: https://www.geeksforgeeks.org/difference-between-dfa-and-nfa/
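
        To make the DFA/NFA distinction from that link a bit more concrete, here is a minimal sketch of simulating an NFA on an ordinary computer by tracking every state it could be in at once. The automaton itself is a made-up toy (it accepts bit strings ending in "01"), and this is only an analogy for the "many branches at once" intuition, not how a quantum computer actually works.

        ```python
        # Minimal sketch: simulating a nondeterministic finite automaton (NFA)
        # classically by tracking the *set* of states it could currently be in.
        def nfa_accepts(transitions, start, accepting, string):
            current = {start}                      # every state we might be in
            for symbol in string:
                nxt = set()
                for state in current:
                    nxt |= transitions.get((state, symbol), set())
                current = nxt                      # follow all branches at once
            return bool(current & accepting)       # accept if any branch accepts

        # Toy NFA over {0,1} that accepts strings ending in "01":
        # q0 = start, q1 = just saw a 0, q2 = just saw "01" (accepting)
        transitions = {
            ("q0", "0"): {"q0", "q1"},
            ("q0", "1"): {"q0"},
            ("q1", "1"): {"q2"},
        }

        print(nfa_accepts(transitions, "q0", {"q2"}, "1101"))  # True
        print(nfa_accepts(transitions, "q0", {"q2"}, "110"))   # False
        ```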

      • f1g4@feddit.it

        As said, it's about superposition. Normal bits can be set to 0 or 1. Qubits can be in a superposition of the states 0 and 1, so the state of a qubit can be written as a weighted sum of the two states. Now you can do traditional math with this, always could. It's just a difference at the physical level of the system. There are many bullshit things with quantum computing though. They are probabilistic in nature and have infinite memory of the past, for example.
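
        Written out in the usual notation (standard textbook form, nothing hardware-specific), that weighted sum looks like this:

        ```latex
        % A qubit state as a superposition (weighted sum) of the basis states
        % |0> and |1>, with complex weights (amplitudes) alpha and beta:
        \[
          |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
          \qquad |\alpha|^2 + |\beta|^2 = 1
        \]
        % Reading the qubit out collapses it: you get 0 with probability
        % |alpha|^2 and 1 with probability |beta|^2, which is where the
        % probabilistic behaviour mentioned above comes from.
        ```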

      • dbilitated@aussie.zone

        I think the trouble is you can assume your superposition calculated all the possibilities, but you have no way of picking the right one; it just calculated all the possibilities at the same time. However, there are some really clever ways, which I don't fully understand, of figuring out what the result is: if the result is periodic you can see some kind of interference, or it can be represented as a lowest energy state the system can fall into. Honestly, it's all really confusing. But it has to be a specific algorithm that's been identified for a quantum computer; if you try to run it like a regular computer it won't work at all. Here's a cool article on the algorithms: https://www.amarchenkova.com/posts/5-quantum-algorithms-that-could-change-the-world

        I think Shor's algorithm is the most interesting because it breaks encryption and ends the world.
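
        For a sense of how Shor's fits together: the quantum hardware is only used to find the period r of f(x) = a^x mod N, and everything else is ordinary number theory on a classical machine. Here is a rough sketch of that classical part, with the period found by brute force purely as a stand-in for the quantum step (which of course throws away the whole speedup):

        ```python
        # Sketch of the classical half of Shor's algorithm for factoring N.
        # The quantum computer's only job is the order-finding step; the brute
        # force loop below stands in for it, just to show the surrounding logic.
        from math import gcd

        def find_period(a, N):
            # placeholder for the quantum order-finding step:
            # smallest r > 0 with a^r = 1 (mod N)
            r, value = 1, a % N
            while value != 1:
                value = (value * a) % N
                r += 1
            return r

        def factor_with_period(N, a):
            g = gcd(a, N)
            if g != 1:
                return g, N // g                   # lucky guess, already a factor
            r = find_period(a, N)
            if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
                return None                        # bad choice of a, retry with another
            p = gcd(pow(a, r // 2) - 1, N)
            q = gcd(pow(a, r // 2) + 1, N)
            return p, q

        print(factor_with_period(15, 7))  # (3, 5)
        ```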

        • TheBeege@lemmy.world

          That is a great article. I only got through maybe two thirds of it, but I'll go through the rest when I have better focus.

          So… it sounds like practical use will be something like a QPU, quantum processing unit. When you need highly parallelized, probability-based, shallow computation, it would be good to use a QPU. Data preparation or more step-wise operations would go to a traditional CPU/GPU. I can see how this would be useful.
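
          From what I've seen, that is roughly how the existing toolkits are structured: the CPU prepares the problem and crunches the measurement statistics, and only a small sampling kernel would run on the quantum side. A hand-wavy sketch of that division of labour, where run_quantum_kernel is a purely hypothetical placeholder that just returns random bitstrings (since nothing quantum is actually happening here):

          ```python
          # Hand-wavy sketch of the hybrid CPU/"QPU" pattern described above.
          # run_quantum_kernel() is a hypothetical stand-in that returns random
          # bitstrings; a real QPU would return measurement counts shaped by
          # the circuit you submitted.
          import random
          from collections import Counter

          def run_quantum_kernel(num_qubits, shots):
              return [
                  "".join(random.choice("01") for _ in range(num_qubits))
                  for _ in range(shots)
              ]

          def solve(problem_size):
              # 1. classical preprocessing on the CPU (encode the problem)
              num_qubits = problem_size

              # 2. shallow, probabilistic, highly parallel step on the "QPU"
              samples = run_quantum_kernel(num_qubits, shots=1000)

              # 3. classical post-processing: turn noisy samples into an answer
              counts = Counter(samples)
              best, _ = counts.most_common(1)[0]
              return best

          print(solve(4))
          ```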

          • dbilitated@aussie.zone

            Yeah! If something can be expressed in an algorithm that can be implemented using qubits, it can run there. 99% of computing will be done on your regular CPU, but for certain problems that a regular CPU would take years to solve, you can run them through the multiverse for a quick answer.

    • ADON15@lemmy.world

      Can't quantum computers in theory calculate everything a normal computer can? It's just much more of a challenge to build one.

  • f1g4@feddit.it

    Quantum computing is just a very weird, niche, and very expensive way to do a very limited range of things. Some scientists are skeptical they'll ever get to a usable point… because you need a ton of qubits for anything interesting, there's no way to initialize / return to a blank state (qubits have infinite memory of the past), and to read the solution you must collapse the wave function to something. So even the output is probabilistic. Stuff is weird.
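
    To make the "even the output is probabilistic" point concrete, here is a tiny classical simulation of reading out a single qubit; the amplitudes are just example numbers, nothing more:

    ```python
    # Tiny classical simulation of measuring one qubit in the state
    # 0.6|0> + 0.8|1>. The amplitudes are arbitrary example values
    # satisfying |alpha|^2 + |beta|^2 = 1.
    import random
    from collections import Counter

    alpha, beta = 0.6, 0.8

    def measure():
        # collapse: 0 with probability |alpha|^2 = 0.36, else 1
        return 0 if random.random() < abs(alpha) ** 2 else 1

    # rerunning the "same computation" gives a distribution, not a single answer
    print(Counter(measure() for _ in range(10_000)))  # roughly {1: 6400, 0: 3600}
    ```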

    • mlk6450@lemmy.world

      The problems being calculated, such as finding the prime factors of an integer, are not known to be solvable in polynomial time on a classical computer. But problems in NP (nondeterministic polynomial time), as opposed to NP-hard problems, can by definition have their answers verified in polynomial (P) time on a classical computer. Therefore, we can easily confirm that the answer is correct using classical computers.

      As an aside, I used the example of prime factorization because it is one of the best-known problems that can be accelerated via quantum computing, using Shor's algorithm. With Shor's algorithm on a quantum computer, an integer can be factored in polynomial time, whereas the best known classical algorithms take super-polynomial time.
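
      The asymmetry is easy to see in code: checking a proposed factorization is a single multiplication, while finding one (here by naive trial division, as a stand-in for "the slow classical way") blows up with the size of the number:

      ```python
      # Verifying a factorization is cheap; finding one is the hard part.
      def verify(n, p, q):
          # polynomial time: one multiplication and a comparison
          return p > 1 and q > 1 and p * q == n

      def find_factors(n):
          # naive trial division: work grows like sqrt(n), i.e. exponentially
          # in the number of *bits* of n
          d = 2
          while d * d <= n:
              if n % d == 0:
                  return d, n // d
              d += 1
          return None

      print(verify(3233, 53, 61))   # True, instant even for huge numbers
      print(find_factors(3233))     # (53, 61), but hopeless at real RSA sizes
      ```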

      • mlk6450@lemmy.world

        Also, note that this acceleration provided by Shor's algorithm is what people are talking about when they say "quantum breaks encryption". I don't like when people say that, though, because quantum computers don't break all encryption schemes. The susceptible ones are the public-key schemes whose security rests on factoring or discrete logarithms, which means RSA but also Diffie-Hellman and elliptic-curve cryptography; symmetric ciphers are largely unaffected. Don't get me wrong, if RSA is compromised that would compromise a LOT of legacy systems. But we already have post-quantum public-key schemes, such as the lattice-based ones being standardized by NIST, ready to replace RSA and friends once quantum computers become large enough to actually implement the attack.
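
        To see why factoring in particular is the weak point for RSA, here is a toy-sized example (real keys use primes hundreds of digits long, and this skips padding and every other practical detail):

        ```python
        # Toy RSA with tiny primes, just to show that factoring the public
        # modulus n hands an attacker the private key.
        p, q = 61, 53                  # secret primes (absurdly small here)
        n = p * q                      # 3233, the public modulus
        phi = (p - 1) * (q - 1)        # 3120
        e = 17                         # public exponent
        d = pow(e, -1, phi)            # private exponent (2753)

        message = 1234
        ciphertext = pow(message, e, n)          # encrypt with the public key
        assert pow(ciphertext, d, n) == message  # decrypt with the private key

        # An attacker who can factor n (what Shor's algorithm would provide)
        # recomputes phi and d and can decrypt everything:
        d_attacker = pow(e, -1, (p - 1) * (q - 1))
        print(pow(ciphertext, d_attacker, n))    # 1234
        ```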