Specialized Quantum Computing

On August 16-17 we organized (the first[1]) workshop on Semi-Quantum Computing. It was a chance to discuss different approaches towards the first demonstration of a real quantum computational advantage. The approaches are seemingly very different, but many of the open challenges turn out to be similar.

Attendees of the Semi-Quantum Computing workshop in 2016

Quantum computers are expected to significantly outperform classical computers. They are, however, very hard to build. At the moment, state-of-the-art technologies allow us to manipulate only a few qubits with high precision, while a universal machine must include error correction mechanisms that require a huge overhead in the number of physical qubits to be manipulated with high precision. A short-term demonstration of so-called quantum supremacy will probably require a different route: that of a non-universal or semi-quantum computer (sometimes also called a specialized quantum computer; the acronym SQC works for both).

As far as I can tell, the first such semi-quantum model was DQC1 (deterministic quantum computation with one qubit), introduced by Knill and Laflamme in 1998. This model was designed to convince critics of liquid-state NMR quantum computing that quantum algorithms designed with the limitations of this platform in mind could provide an exponential advantage over classical algorithms. Unlike a universal quantum computing model, DQC1 does not require the qubits in the system to be initialized in something like an all-zero state; in fact, only one qubit needs to be prepared in a state that is not completely random. The model can be used to solve a (seemingly) computationally hard problem, and it was the first proof of concept that at least one of the technologically demanding requirements for quantum computers (the DiVincenzo criterion of initialization to a simple fiducial state) could be relaxed.
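To make the model concrete, here is a minimal NumPy sketch of DQC1 trace estimation (the function name and structure are my own illustration, not a reference implementation): one clean control qubit, an n-qubit register in the maximally mixed state, a Hadamard, and a controlled-U. The control qubit's ⟨X⟩ and ⟨Y⟩ expectation values then encode the normalized trace of U, a quantity with no known efficient classical estimator.

```python
import numpy as np

def dqc1_trace(U):
    """Simulate the DQC1 circuit and return an estimate of Tr(U)/2**n.

    One pure control qubit starts in |0>; the n-qubit register starts
    maximally mixed (I/2**n).  After a Hadamard on the control and a
    controlled-U, the control's <X> and <Y> expectation values give the
    real and imaginary parts of the normalized trace of U (the sign
    convention for <Y> varies across references).
    """
    dim = U.shape[0]

    # Initial state: |0><0| (control) tensor I/dim (register)
    rho = np.kron(np.array([[1, 0], [0, 0]]), np.eye(dim) / dim)

    # Hadamard on the control qubit
    H = np.kron(np.array([[1, 1], [1, -1]]) / np.sqrt(2), np.eye(dim))
    rho = H @ rho @ H.conj().T

    # Controlled-U, with the control as the first qubit
    CU = np.block([[np.eye(dim), np.zeros((dim, dim))],
                   [np.zeros((dim, dim)), U]])
    rho = CU @ rho @ CU.conj().T

    # Measure <X> and <Y> on the control qubit
    X = np.array([[0, 1], [1, 0]])
    Y = np.array([[0, -1j], [1j, 0]])
    exp_x = np.trace(np.kron(X, np.eye(dim)) @ rho).real
    exp_y = np.trace(np.kron(Y, np.eye(dim)) @ rho).real
    return exp_x + 1j * exp_y

# Sanity check against the trace computed directly
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))
U, _ = np.linalg.qr(A)  # a random 3-qubit unitary
print(dqc1_trace(U), np.trace(U) / 8)  # the two agree
```

On a real device, ⟨X⟩ and ⟨Y⟩ would be estimated from repeated single-qubit measurements, so the precision improves only as one over the square root of the number of runs.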

In the past decade, a few other SQC models were introduced for various reasons, each with its own conceptual or technological limitations. In August, we organized the first workshop dedicated to this topic, bringing experts on the different models to IQC. Despite the variety of models, it was not surprising to find that many conceptual and practical challenges are relevant to all of them.

Verification

Universal quantum computers are expected to solve hard problems whose solutions are easy to verify, like factoring. If I claim I have a universal quantum computer, it is possible to test my machine by asking me to factor a very large number; it is easy to verify that my solution is correct by multiplying the prime factors. Unfortunately, current SQC models are only known to solve problems whose solutions are not easy to verify. For example, the Boson Sampling problem, solved by the linear-optics model, is (as far as anyone can tell) outside NP. This poses a serious problem both for verifying that a machine works as expected and for demonstrating quantum supremacy.
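The asymmetry for factoring is easy to see in code: checking a claimed factorization is a single multiplication, while finding the factors is super-polynomial in the number of digits with the best known classical algorithms (the specific numbers below are just an illustration).

```python
def verify_factors(n, factors):
    """Verification is cheap: multiply and compare."""
    product = 1
    for f in factors:
        product *= f
    return product == n

# Verifying is essentially instant, even for very large numbers...
p = 2**127 - 1  # a Mersenne prime
q = 2**61 - 1   # another Mersenne prime
print(verify_factors(p * q, [p, q]))  # True

# ...while recovering p and q from p * q is believed to be classically
# intractable as the numbers grow.  No problem solved by a current SQC
# is known to have this easy-to-check property.
```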

There are various ideas about what would constitute a reasonable verification process for SQCs. Most of the directions seem to assume a reasonably adversarial setting, i.e., someone claims their machine can solve a hard problem, but you assume they are trying to trick you. Many of the ideas boil down to demonstrating something on a roughly 40-qubit quantum machine and verifying the result by running a classical simulation that consumes a great deal of resources.
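Back-of-the-envelope arithmetic shows why around 40 qubits is the natural benchmark: merely storing the state vector for a brute-force classical simulation crosses from laptop territory into supercomputer territory right around there.

```python
# Memory for a brute-force state-vector simulation: 2**n complex
# amplitudes at 16 bytes each (double-precision real + imaginary).
for n in (30, 40, 50):
    tib = 2**n * 16 / 2**40  # bytes -> TiB
    print(f"{n} qubits: {tib:10.3f} TiB")

# 30 qubits:      0.016 TiB  (~16 GiB, a laptop)
# 40 qubits:     16.000 TiB  (a large cluster's worth of RAM)
# 50 qubits:  16384.000 TiB  (out of reach for direct storage)
```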

There is no doubt that such a demonstration would be very impressive.

Dealing with errors

Since error correction seems to require a huge resource overhead and sophisticated quantum control, SQCs are not usually designed with error correction in mind. In DQC1, for example, the fundamental assumption is that the number of pure qubits is very limited, so error correction seems to be impossible (I hope to eat my words on this one day). This is a very serious issue, since it is unclear whether an SQC can get anything even close to the right solution in any realistic (noisy) scenario. So far, Boson Sampling looks like the only model where the effect of errors has been evaluated at some level. Meanwhile, it looks like Daniel Lidar et al. managed to implement some kind of classical error correction on the D-Wave machine, which is an extremely limited device.
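A toy calculation (my own, not from the workshop) shows why running without error correction is so punishing. If depolarizing noise hits the DQC1 control qubit with probability p at each step, its coherence, which carries the trace signal, shrinks by a factor (1 - p) per step, so the signal decays exponentially with circuit depth and the number of runs needed to resolve it above shot noise blows up accordingly.

```python
# Toy noise model: depolarizing noise with probability p per circuit
# step on the DQC1 control qubit.  Each step multiplies the coherence
# (and hence the measured trace signal) by (1 - p).
def surviving_signal(p, depth):
    return (1 - p) ** depth

for depth in (10, 100, 1000):
    s = surviving_signal(0.01, depth)  # 1% error per step
    # Resolving a signal of size s above shot noise takes ~1/s**2 runs.
    print(f"depth {depth:4d}: signal x {s:.2e}, ~{1 / s**2:.1e} runs")
```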

Implementation and scaling

On the implementation side, D-Wave is ahead of everyone, but remains very controversial. The only other SQC with any real progress in this direction is Boson Sampling, with achievements both in understanding how errors affect the system and in designing on-chip optical systems that can deal with some of the difficulties. However, it looks like it will be very hard to scale up: as the number of input ports grows, the length of the circuit (i.e., the physical length of the waveguides) grows, and the system becomes almost completely opaque.
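A rough estimate makes the opacity problem vivid (the loss figure and the scaling below are illustrative assumptions of mine, not numbers from the talks): waveguide transmission falls exponentially with propagation length, and an n-photon event requires every photon to survive, so the useful event rate collapses as the interferometer grows.

```python
# Transmission through a waveguide of length L at loss alpha (dB/cm)
# is 10**(-alpha * L / 10); an n-photon coincidence needs all n
# photons to survive, so its rate is that transmission to the n-th power.
def n_photon_rate(n, length_cm, alpha_db_per_cm=0.1):
    single_photon = 10 ** (-alpha_db_per_cm * length_cm / 10)
    return single_photon ** n

for m in (10, 20, 40):       # number of optical modes (input ports)
    length = 0.1 * m**2      # illustrative: path length growing ~m**2
    n = m // 3               # photon number kept below the mode count
    print(f"{m} modes: {n_photon_rate(n, length):.1e} of events survive")
```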

Usefulness

The million (billion? trillion?) dollar question is, what are these SQCs useful for? So far that remains to be seen. One argument is that the focus should be on a proof of principle, i.e., demonstrating quantum supremacy. Once that is established we can proceed to the more mundane issues of making useful things (as long as we don't admire them). For now, the sampling problems seem to be furthest from anything useful, while DQC1 at least solves some problems that have scientific value.

Some researchers are trying to find interesting tasks that can be solved by the known SQCs, while others focus on the conceptual gains. There are many open questions about quantum computing that can be answered using SQC models. These questions don't require a model that can solve a 'useful' problem, and for now they don't even require a model that can be implemented easily. SQCs offer theoretically simple models that can be used to provide intuition about difficult problems.

The odd guys out

The topics of the workshop were diverse, but two stood out as particularly different from the rest (at least in my mind). The first was concordant computation. A computation is concordant if, at each time step, the state has zero discord (i.e., no quantum correlations, not even the kind that can exist without entanglement). If one follows the idea that concordant states are classical, then a concordant computation should be classical in some sense.

However, as it turns out, only a subset of concordant computations can be simulated classically with a known algorithm. Anything outside this subset is in the realm of the unknown, and the problem of finding an efficient simulation seems to be very difficult, even when intuition tells us an algorithm should exist.

The second odd talk was on the topic of small quantum computers. As it turns out, a small quantum computer could be used together with a powerful classical computer to simulate non-trivial systems much faster than a classical computer alone. In terms of usefulness, it seems like a very strong direction to follow. It may take a while to build large universal quantum computers, but either small quantum computers or larger semi-quantum computers are likely to make a huge impact in the not too distant future.


[1] Hopefully the first of many
