Falk's Profile

7 Points · 0 Questions · 3 Answers

  • Asked on July 7, 2020 in Quantum Computing.

    Hi Ross. I have some thoughts on this. Just so you know my background: I have a PhD in quantum physics. I did not mainly study quantum computing; I worked more in the other direction, which is doing simulations of quantum systems on ordinary computers. Still, I have a decent background in quantum physics in general and can read and judge the literature to some extent. I have recently written an article on the limitations of quantum computing and why I find the technology overrated, which you might find interesting: https://medium.com/twodigits/3-reasons-quantum-computing-is-overrated-9d87d11aa248

    Now to your questions:

    Quantum computing has gained media attention for exponential increases in computational speed, but only when dealing with specific problems that are amenable to a quantum computational approach. Do you agree with this statement, and if so, do you think this limit can be broken with future advances of the technology?

    I very much agree that the media attention is there 🙂 And yes, the potential to solve very specific problems is also there. Whether this potential can be turned into actual practical applications is not clear yet; it might well happen. The limitation that a problem needs to be “amenable to quantum computing” is a very important one, and it appears to be heavily underrepresented in media coverage.

    Exactly which problems can be solved is a topic of active research. One crucial question in this regard is: “Can quantum computers efficiently solve so-called NP-Hard problems?” This class of NP-Hard problems is so important because many practical tasks, most optimization problems among them, fall into it. This is genuinely an open question, for both quantum and ordinary computers, but most researchers believe that neither can solve NP-Hard problems efficiently. This is not a technical issue that can be overcome by advances of the technology; it looks like a fundamental limitation. (That is not to say such problems can never be solved by any technology, but it does not seem to be in scope for quantum computing.) That being said, there might still be the occasional surprise: problems that did not seem so relevant until we actually found a technology that can solve them. These would be problems that are not NP-Hard, but are still believed to be intractable on ordinary computers; for a list of candidates see https://cstheory.stackexchange.com/questions/79/problems-between-p-and-npc
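
    To pin down which part is proven and which is conjecture, in standard complexity-theory notation (a sketch, with BQP denoting the class of problems a quantum computer can solve efficiently):

        % Proven containments:
        \[
          \mathrm{P} \;\subseteq\; \mathrm{BQP} \;\subseteq\; \mathrm{PSPACE}
        \]
        % Widely believed, but unproven:
        %   NP \not\subseteq BQP   (quantum computers cannot solve NP-Hard problems efficiently)
        %   P \neq NP              (ordinary computers cannot either)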

    Current machine learning strategies involve processing copious amounts of data to reach an output. This is a common bottleneck with classical computing and discourages more ambitious research in machine learning. Do you believe that Quantum Computing can/will be able to alleviate this bottleneck in machine learning? If not, does Quantum Computing provide alternative approaches for machine learning?

    My honest opinion is that I do not see quantum computers playing a role in machine learning, at least not any time soon, meaning the next 20 to 30 years. If quantum computing becomes feasible at all, this is not likely to be a territory it will support early on. I have the impression that this connection between quantum computing and machine learning mostly came up, and sticks around, because it sounds so fancy, not because it is close at hand. What I expect instead is that machine learning will profit from more and more parallel computing on ordinary computers; the methods used usually parallelize very well. Ordinary computers are actually very well suited to handling large amounts of data. For quantum computers, large data sets are quite a challenge, as you somehow have to get the data into the quantum state of the computer in the first place.
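
    To illustrate the data-loading point: in so-called amplitude encoding, a vector of N classical numbers is stored in the amplitudes of about log2(N) qubits, but preparing such an arbitrary state on hardware generally requires on the order of N elementary operations. A minimal sketch of the bookkeeping in plain NumPy (illustrative only, not an actual device):

        import numpy as np

        def amplitude_encode(data):
            """Map a classical vector onto the amplitudes of a (simulated) quantum state."""
            n_qubits = int(np.ceil(np.log2(len(data))))
            # Pad to a power of two and normalize, since quantum states must have norm 1.
            padded = np.zeros(2 ** n_qubits)
            padded[:len(data)] = data
            return n_qubits, padded / np.linalg.norm(padded)

        # One million data points fit into the state of only 20 qubits ...
        n_qubits, state = amplitude_encode(np.random.rand(1_000_000))
        print(n_qubits)  # 20
        # ... but preparing that state still takes on the order of 10^6 operations,
        # and a single measurement only ever reveals one of the 2**20 basis states.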

    The problem of decoherence is regarded as one of the bigger obstacles to the development (and application) of Quantum Computing. To what extent does decoherence (in your opinion) present a drawback when applying quantum computing to other fields? If so, do you believe it will be possible to reduce this drawback to a negligible level in the future?

    I would not call decoherence a “drawback”. It is rather a problem that needs to be handled in order to make quantum computers work in the first place, and it is a topic of very active research. It is not at all clear whether large-scale quantum computers can be built at all, and one of the main obstacles is indeed decoherence.

    This problem might make it much more difficult to scale up quantum computers than one might intuitively think. For ordinary computers it turned out that adding more and more transistors actually became easier in some ways, so that the number of transistors on a chip could be increased exponentially over time for decades (Moore’s Law). For quantum computers it might be the other way around. Intuitively one might think: “If you created a computer with 50 qubits, how hard can it be to double that to 100 qubits?” But the decoherence problem implies that one not only needs to duplicate the hardware, but also to reduce external disturbances considerably, and going from 100 to 150 qubits will once again require reducing external noise further. There might be fundamental limitations on how far one can push this. There are ideas on how to overcome it (so-called “error-corrected qubits”), but to me it is by no means clear whether this will really work.
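
    A rough back-of-the-envelope calculation (with made-up but plausible numbers) shows why qubit count and noise are coupled: if every elementary gate fails independently with probability p, a circuit of G gates runs error-free with probability about (1 - p)^G, so keeping that probability constant while circuits grow means p must shrink accordingly:

        # Toy noise model: each gate fails independently with probability p,
        # so a circuit of n_gates succeeds with probability (1 - p) ** n_gates.
        def success_probability(p, n_gates):
            return (1 - p) ** n_gates

        p = 1e-3  # an optimistic per-gate error rate, assumed for illustration
        for n_gates in (100, 1_000, 10_000, 100_000):
            print(f"{n_gates:>7} gates: P(no error) = {success_probability(p, n_gates):.3f}")
        # Output (approximately):
        #     100 gates: P(no error) = 0.905
        #    1000 gates: P(no error) = 0.368
        #   10000 gates: P(no error) = 0.000
        #  100000 gates: P(no error) = 0.000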

    All that being said, quantum computing is certainly an interesting research topic and might some day turn out to solve problems we cannot solve today. In fact, for some fields like quantum simulation there are already relevant applications. Those are just different applications than what you usually find in the media (i.e. quantum computers will probably not do much for optimization, and maybe nothing for machine learning). At some point researchers might also find applications that we did not even expect, while there is much less progress on the things many people expected. That is just how fundamental research works.

    • 903 views
    • 1 answer
    • 0 votes
  • Asked on July 7, 2020 in Optimization.

    D-Wave is probably the closest fit for what you are searching for. However, many real-world optimization problems will not fit their hardware; they can only handle very specific classes of optimization problems. As stated here: https://medium.com/twodigits/3-reasons-quantum-computing-is-overrated-9d87d11aa248 :

    “””

    Reports from D-Wave [3] focus on practical devices for optimization tasks. It is very misleading when they claim that they handle so-called NP-Hard problems. What their devices can do is find approximate solutions to those problems (a much easier task, which can also be done by ordinary computers), but a convincing proof that they can guarantee finding the global optimum has not yet been delivered, and it seems likely that they cannot [4, 5].

    “””

    I also agree with the point made in rcooper’s answer that a large amount of input data is probably an indication that a problem will not be a good fit for quantum computing any time soon. A sketch of the kind of classical approximation mentioned in the quote follows below.
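
    To make the quoted point concrete, here is a minimal sketch of finding an approximate solution to a toy QUBO problem (the problem class D-Wave machines target) with simulated annealing on an ordinary computer. The instance is made up for illustration, and this is the generic textbook algorithm, not D-Wave’s method:

        import math
        import random

        # Toy QUBO instance: minimize x^T Q x over binary vectors x.
        Q = [[-2.0,  1.0,  0.0],
             [ 1.0, -3.0,  2.0],
             [ 0.0,  2.0, -1.0]]

        def energy(x):
            n = len(x)
            return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

        def simulated_annealing(n, steps=10_000, t_start=2.0, t_end=0.01):
            x = [random.randint(0, 1) for _ in range(n)]
            e = energy(x)
            for step in range(steps):
                t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
                i = random.randrange(n)
                x[i] ^= 1                      # flip one bit
                e_new = energy(x)
                # Accept downhill moves always, uphill moves with Boltzmann probability.
                if e_new <= e or random.random() < math.exp((e - e_new) / t):
                    e = e_new
                else:
                    x[i] ^= 1                  # revert the flip
            return x, e

        x, e = simulated_annealing(len(Q))
        print(x, e)  # a good approximate solution; global optimality is not guaranteed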

    • 1256 views
    • 3 answers
    • 0 votes
  • Asked on July 7, 2020 in Quantum Computing.

    The picture that a quantum computer runs many calculations in parallel is not completely wrong, but it is still somewhat misleading.

    Based on this simple picture, people often assume that quantum computers can run pretty much any computation efficiently. However, this is not true: the measurement at the end of a quantum computation only ever yields a single outcome, so the “parallel branches” cannot simply all be read out. In fact the class of problems quantum computers can solve efficiently is rather limited. Probably the most relevant one is breaking asymmetric encryption. Many other problems, for example most optimization problems, can actually not be solved efficiently on a quantum computer.
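
    To make this concrete: the step a quantum computer speeds up in Shor’s factoring algorithm (the basis for breaking asymmetric encryption such as RSA) is finding the period r of a^x mod N; the surrounding number theory is classical. Here is a sketch in which the period is found by brute force, which is exactly the exponentially expensive part the quantum subroutine would replace:

        from math import gcd

        def find_period(a, N):
            # Smallest r > 0 with a^r = 1 (mod N). Brute force here; this is
            # the step Shor's quantum subroutine performs efficiently.
            r, value = 1, a % N
            while value != 1:
                value = (value * a) % N
                r += 1
            return r

        def shor_classical_part(N, a):
            if gcd(a, N) != 1:
                return gcd(a, N)          # lucky guess: a already shares a factor with N
            r = find_period(a, N)
            if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
                return None               # unlucky choice of a, try another one
            return gcd(pow(a, r // 2, N) - 1, N)

        print(shor_classical_part(15, 7))  # 3, since 15 = 3 * 5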

    If you want to know more, have a look at this article:
    https://medium.com/twodigits/3-reasons-quantum-computing-is-overrated-9d87d11aa248

    • 1101 views
    • 2 answers
    • 0 votes