Not so fast ...
By joe
Well, after nearly a decade of hoopla over a realization of a quantum computer, an interesting study found that it was not delivering anything like the speedups that had been reported.
There are a few important elements of this … it uses 1/5th the number of qubits that the newer generation machine used. But it wasn’t, as earlier reported, thousands of times faster.
Way back in the day, when I was working on benchmarking big machines and comparing performance, one of the major criteria was using identical (or as near to identical as possible) algorithms to assess machine speed, compiler quality, etc. If we use a bubble sort on one machine, and then claim that another machine running a heap sort on the same problem is much faster than the first, we have failed, in the most fundamental way, in our benchmarking efforts (a toy example of exactly this confound is sketched at the end of this post). Real benchmarking is a science. It involves a careful study of a system, as well as an understanding of what you are measuring. In physics, we used to talk of “understanding your detector”. If you don’t understand the underlying physics of how your measuring device … a detector … a program … works, you could wind up generating garbage results … very, very quickly.

This gets to another point, something I’ve started to call “system physics”. It’s the underlying mechanisms of operation of a system … the fundamental rules it follows. These system physics features dictate the speed of computing and data flow in a system. So much so that if you understand how your program uses them, you can, to a pretty good degree, model the performance of your system (there is a back-of-the-envelope sketch of this below). And at a much higher level, you can optimize your program or your system design by exploiting what might be the moral equivalent of a “principle of least action”, or more precisely, a variational principle, which can be used to find extremal solutions. FWIW: many of these same concepts were used in the design and build of the day job’s systems. Which is why we are so bloody fast.

The system physics around D-Wave is all about adiabatic cooling (a classical toy of the annealing-style optimization it targets also appears at the end of this post). I still don’t quite grasp how one can get high throughput on such a machine, which factors into the economics, as you have to maintain some portion of it near LHe temperatures. The programming and cooling aspects have to take time.

I’ll admit I’ve always been at least a little skeptical of the system. I thought it wasn’t a general quantum computer (it can’t execute Shor’s algorithm, so we can pretend that factoring remains hard). The discussion of what it is and isn’t confirms those thoughts. But it might find a niche into which it can fit. That said, until we can avoid using cryogenics and hard vacuum systems with high-power lasers to “program” and “run” our quantum computing systems, I don’t think they will be all that useful or tremendously widespread …
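To make the bubble-sort-versus-heap-sort point concrete, here is a minimal toy sketch (my own illustration, not from any benchmark mentioned above): the same data run through two different algorithms on the same machine. The gap you see comes from the algorithm, which is exactly why a cross-machine comparison that doesn’t hold the algorithm fixed tells you nothing about the hardware.

```python
import random
import time

def bubble_sort(a):
    """O(n^2) comparison sort -- deliberately the 'slow' algorithm."""
    a = list(a)
    n = len(a)
    for i in range(n):
        for j in range(n - i - 1):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def timed(fn, data):
    """Wall-clock time for one call of fn(data)."""
    t0 = time.perf_counter()
    fn(data)
    return time.perf_counter() - t0

data = [random.random() for _ in range(5000)]

# Same data, same machine, different algorithms: the difference below
# reflects algorithmic complexity, not hardware speed.
print("bubble sort :", timed(bubble_sort, data), "s")
print("sorted()    :", timed(sorted, data), "s")
```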
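On the “system physics” point about modelling performance from how a program uses compute and data flow: one simple way to cast that idea in code is a roofline-style estimate, where runtime is bounded by whichever resource the program exhausts first. This is a sketch under my own assumptions; the kernel and the machine numbers below are placeholders, not measurements from the post.

```python
def roofline_time(flops, bytes_moved, peak_flops, peak_bw):
    """Lower-bound runtime: the slower of the compute-bound and
    memory-bound estimates dominates."""
    return max(flops / peak_flops, bytes_moved / peak_bw)

# Hypothetical kernel: a dot product of two length-n float64 vectors.
n = 10**8
flops = 2 * n              # one multiply + one add per element
bytes_moved = 2 * n * 8    # two input streams of 8-byte doubles

# Placeholder machine numbers -- substitute measured values for a real model.
peak_flops = 100e9         # 100 GFLOP/s
peak_bw = 20e9             # 20 GB/s

t = roofline_time(flops, bytes_moved, peak_flops, peak_bw)
bound = "bandwidth" if bytes_moved / peak_bw > flops / peak_flops else "compute"
print(f"estimated time: {t:.3f} s ({bound}-bound)")
```

With these placeholder numbers the kernel is bandwidth-bound, which is the usual story for streaming kernels: knowing the data-flow limits of the system tells you most of what you need to predict its speed.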
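Finally, purely as an illustration of the kind of optimization an annealer is aimed at (and emphatically not a model of D-Wave’s hardware or of adiabatic evolution), here is a classical simulated-annealing loop on a small random Ising-style instance. Every parameter here is arbitrary and my own.

```python
import math
import random

random.seed(1)

# Random +/-1 couplings J[i][j] for a tiny, fully connected Ising instance.
n = 12
J = [[random.choice([-1.0, 1.0]) if i < j else 0.0 for j in range(n)]
     for i in range(n)]

def energy(s):
    """Ising energy E(s) = -sum_{i<j} J_ij * s_i * s_j, spins s_i = +/-1."""
    return -sum(J[i][j] * s[i] * s[j]
                for i in range(n) for j in range(i + 1, n))

# Classical simulated annealing: accept uphill moves with Boltzmann
# probability at a temperature that is slowly lowered.
s = [random.choice([-1, 1]) for _ in range(n)]
e = energy(s)
T = 2.0
for step in range(20000):
    i = random.randrange(n)
    s[i] = -s[i]                  # propose a single-spin flip
    e_new = energy(s)
    if e_new <= e or random.random() < math.exp(-(e_new - e) / T):
        e = e_new                 # accept the move
    else:
        s[i] = -s[i]              # reject, undo the flip
    T = max(0.01, T * 0.9997)     # geometric cooling schedule

print("lowest energy found:", e)
```

This is the sort of ground-state search that the comparisons in the study revolve around: a plain classical loop on commodity hardware, no cryogenics required, which is why the benchmarking methodology matters so much.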