Not so fast …

Well, after nearly a decade of hoopla over the realization of a quantum computer, an interesting study found that the machine is not actually faster than a conventional computer.

When one considers that research into quantum computers is driven primarily by the promise of hugely accelerated computation, another conclusion arrived at by the researchers is particularly significant: D-Wave is not faster than a traditional computer.

There are a few important elements to this … the machine tested uses one-fifth the number of qubits of the newer-generation 512-qubit machine.
But it wasn't, as earlier reported, thousands of times faster.

“Not only have we demonstrated that a traditional computer is faster than the 108-qubit version of D-Wave”, Troyer responds. “We also used a traditional computer to solve the same problems that can be solved by the new 512-qubit version or hypothetically even higher-performing machines.”

When these findings are compared with those of the researcher from Amherst College, it becomes clear that D-Wave is consistently slower than a traditional computer for the tests performed. According to Troyer, the problem with the Amherst study is that it compared fast algorithms for D-Wave with slower algorithms for traditional computers. “We developed optimized algorithms for traditional computers. This allows us to match even the current 512-qubit version of D-Wave”, explains Troyer. “Nobody knows at present whether a future quantum system like D-Wave with more qubits will offer any advantages over traditional systems. This is an important question, and we are currently using experiments on the 512-qubit machine to find the answer.”

Way back in the day, when working on benchmarking big machines and comparing performance, one of the major criteria was using identical (or as near to identical as possible) algorithms to assess machine speed, compiler quality, etc. If we use a bubble sort on one machine, and then claim that another machine running a heap sort on the same problem is much faster than the first, we have failed, in the most fundamental way, in our benchmarking efforts.
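To make the bubble-sort-versus-heap-sort point concrete, here's a minimal sketch (my own illustration, not from the study): time both algorithms on the same input, on the same machine. The algorithm mismatch alone produces a huge "speedup" that says nothing about the hardware.

```python
import heapq
import random
import time

def bubble_sort(items):
    """O(n^2) comparison sort -- deliberately naive."""
    a = list(items)
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def heap_sort(items):
    """O(n log n) sort built on the stdlib heap."""
    heap = list(items)
    heapq.heapify(heap)
    return [heapq.heappop(heap) for _ in range(len(heap))]

random.seed(0)
data = [random.random() for _ in range(5000)]

# Same machine, same input -- only the algorithm differs.
for name, fn in [("bubble", bubble_sort), ("heap", heap_sort)]:
    start = time.perf_counter()
    result = fn(data)
    elapsed = time.perf_counter() - start
    assert result == sorted(data)
    print(f"{name:>6} sort: {elapsed:.4f} s")
```

If you attributed that timing gap to the second "machine" rather than to the algorithm, you'd be making exactly the error described above.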
Real benchmarking is a science. It involves a careful study of a system, as well as an understanding of what you are measuring. In physics, we used to talk of “understanding your detector”. If you don’t understand the underlying physics of how your measuring device … a detector … a program … works, you could wind up generating garbage results … very … very quickly.
This gets to another point, that I’ve started to call “system physics”. It’s the underlying mechanisms of operation of a system … the fundamental rules it follows. These system physics features dictate the speed of computing and data flow in a system. So much so that if you understand how your program uses them, you can, to a pretty good degree, model the performance of your system.
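A minimal sketch of what such a "system physics" performance model can look like, assuming a simple roofline-style bound (the peak-throughput and bandwidth numbers below are hypothetical placeholders, not measurements of any real machine):

```python
def predicted_seconds(flops, bytes_moved,
                      peak_flops=1e11, bandwidth=5e10):
    """Roofline-style lower bound on runtime: a kernel can go no
    faster than its compute limit or its memory-traffic limit,
    whichever binds.  Peak rates here are hypothetical."""
    compute_time = flops / peak_flops
    memory_time = bytes_moved / bandwidth
    return max(compute_time, memory_time)

# Hypothetical kernel: dense matrix-vector product, n x n doubles.
n = 10_000
flops = 2 * n * n          # one multiply + one add per matrix element
bytes_moved = 8 * n * n    # stream the matrix through memory once

print(f"predicted: {predicted_seconds(flops, bytes_moved):.4f} s")
```

Even a two-parameter model like this tells you whether a kernel is compute-bound or memory-bound, which is exactly the kind of "fundamental rule" that dictates where optimization effort pays off.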
And at a much higher level, you can optimize your program or your system design, by exploiting what might be the moral equivalent of a “principle of least action”, or more precisely, a variational principle, which can be used to find extremal solutions.
FWIW: many of these same concepts were used in the design and build of the day job’s systems. Which is why we are so bloody fast.
The system physics around D-Wave is all about adiabatic cooling. I am still not quite grasping how one can have high throughput on such a machine, which factors into the economics, as you have to maintain some portion of it near liquid-helium temperatures. The programming and cooling steps have to take time.
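The throughput worry can be made concrete with a back-of-the-envelope model. This is my own sketch, and every timing parameter in it is a hypothetical placeholder, not a published D-Wave specification: one job is programming the couplings, then many anneal/readout cycles, plus whatever settling time the cryogenics impose.

```python
def jobs_per_hour(program_s, anneal_us, reads, readout_us, settle_s=0.0):
    """Crude throughput model for an annealer-style machine.
    One job = program the couplings, run `reads` anneal/readout
    cycles, then wait out any re-thermalization/settling time.
    All timing parameters are hypothetical placeholders."""
    per_job_s = (program_s
                 + reads * (anneal_us + readout_us) * 1e-6
                 + settle_s)
    return 3600.0 / per_job_s

# Hypothetical: 10 ms programming, 20 us anneal, 1000 reads,
# 300 us readout per read, no extra settling time.
print(f"{jobs_per_hour(0.010, 20, 1000, 300):.0f} jobs/hour")
```

The point of the exercise: per-read anneal time is tiny, so throughput ends up dominated by programming, readout, and thermal overheads … exactly the terms the cryogenics make hard to shrink.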
I’ll admit I’ve always been at least a little skeptical of the system. I thought it wasn’t a general-purpose quantum computer (it can’t execute Shor’s algorithm, so we can pretend that factoring remains hard). The discussion of what it is and isn’t confirms those thoughts.
But it might find a niche into which it can fit.
This said, until we can avoid using cryogenics and hard-vacuum systems with powerful lasers to “program” and “run” our quantum computing systems, I don’t think they will be all that useful or tremendously widespread …