Guest essay by Eric Worrall
The Register has published a fascinating video about quantum computing: an interview with D-Wave, a company that manufactures what it claims are quantum computing systems.
According to The Register:
It turns out that there are three broad categories of problem where your best bet is a quantum computer. The first is a Monte Carlo type simulation, the second is machine learning, and the third is optimization problems that would drive a regular computer nuts – or, at least, take a long time for it to process.
An example of this type of optimization problem is this: Consider the approximately 2,000 professional hockey players in North America. Your task is to select the very best starting line-up from that roster of guys.
There are a lot of variables to consider. First there’s all the individual stats, like how well they score, pass, and defend. But since hockey is a team sport, you also have to consider how well they work when combined with other specific players. When you start adding variables like this, the problem gets exponentially more difficult to solve.
But it’s right up the alley of a quantum computer. A D-Wave system would consider all of the possible solutions at the same time, then collapse down to the optimal set of players. It’s more complicated than I’m making out, of course, but it’s a good layman-like example.
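To see why this kind of line-up problem blows up classically, here is a minimal brute-force sketch. All the player ratings and "chemistry" numbers below are invented purely for illustration; an annealer like D-Wave's would attack the same objective by encoding it as an optimization problem rather than by enumeration, which is the point of the example.

```python
# Brute-force line-up selection with pairwise "chemistry" terms.
# All numbers are made up for illustration.
import math
import random
from itertools import combinations

random.seed(42)

N_PLAYERS = 12   # tiny roster so enumeration stays feasible
LINEUP = 6       # players in a starting line-up

# Individual skill ratings, plus a chemistry bonus/penalty for each pair.
skill = [random.uniform(0, 10) for _ in range(N_PLAYERS)]
chemistry = [[random.uniform(-2, 2) for _ in range(N_PLAYERS)]
             for _ in range(N_PLAYERS)]

def lineup_score(players):
    """Sum of individual skill plus every pairwise chemistry term."""
    s = sum(skill[p] for p in players)
    s += sum(chemistry[i][j] for i, j in combinations(players, 2))
    return s

best = max(combinations(range(N_PLAYERS), LINEUP), key=lineup_score)
print("best line-up:", best, "score:", round(lineup_score(best), 2))

# Why brute force dies at NHL scale: the number of 6-man line-ups
# from a 2,000-player pool.
print("line-ups from 2000 players:", math.comb(2000, 6))
```

With 12 players there are only 924 candidate line-ups; with 2,000 players there are roughly 9 × 10^16, and the pairwise chemistry terms prevent any simple greedy shortcut, which is what makes the problem attractive for annealing approaches.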
Read more: http://www.theregister.co.uk/2016/04/18/d_wave_demystifies_quantum_computing/
The following is the video of the interview:
A note of caution: quantum computing is still in its infancy. Substantial skepticism has been expressed in some quarters about what is happening inside the box, and about whether the D-Wave system offers any performance advantage over conventional computers.
For example:
A team of quantum-computing experts in the US and Switzerland has published a paper in Science that casts doubt over the ability of the D-Wave Two quantum processor to perform certain computational tasks. The paper, which first appeared as a preprint earlier this year, concludes that the processor – built by the controversial Canadian firm D-Wave Systems – offers no advantage over a conventional computer when it is used to solve a benchmark computing problem.
While the researchers say that their results do not rule out the possibility that the processor can outperform conventional computers when solving other classes of problems, their work does suggest that evaluating the performance of a quantum computer could be a much trickier task than previously thought. D-Wave has responded by saying that the wrong benchmark problem was used to evaluate its processor, while the US–Swiss team now intends to do more experiments using different benchmarks.
The abstract of the paper:
The development of small-scale quantum devices raises the question of how to fairly assess and detect quantum speedup. Here, we show how to define and measure quantum speedup and how to avoid pitfalls that might mask or fake such a speedup. We illustrate our discussion with data from tests run on a D-Wave Two device with up to 503 qubits. By using random spin glass instances as a benchmark, we found no evidence of quantum speedup when the entire data set is considered and obtained inconclusive results when comparing subsets of instances on an instance-by-instance basis. Our results do not rule out the possibility of speedup for other classes of problems and illustrate the subtle nature of the quantum speedup question.
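The abstract’s central point – that “speedup” is about how time-to-solution scales with problem size, not a runtime ratio at one fixed size – can be sketched with a toy comparison. The runtime numbers below are invented purely to illustrate the pitfall the paper warns about:

```python
# Toy illustration of scaling-based speedup testing.
# Runtimes are synthetic; the real study measured time-to-solution
# on random spin-glass instances up to 503 qubits.
import math

sizes = [100, 200, 300, 400, 500]                    # problem sizes
t_classical = [0.01 * 2 ** (n / 50) for n in sizes]  # exponential scaling
t_annealer  = [0.50 * 2 ** (n / 50) for n in sizes]  # same exponent, bigger constant

def fitted_slope(xs, ys):
    """Least-squares slope of log(y) against x: the scaling exponent."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(math.log(y) for y in ys) / n
    num = sum((x - mx) * (math.log(y) - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# A constant-factor difference at one size is not evidence either way;
# only a smaller scaling exponent counts as a speedup. Here the
# exponents match, so there is no asymptotic speedup at all.
print("classical exponent:", round(fitted_slope(sizes, t_classical), 4))
print("annealer  exponent:", round(fitted_slope(sizes, t_annealer), 4))
```

In this toy example the annealer is 50× slower at every size, yet a naive ratio taken at small sizes (or with the constants swapped, a 50× win) would say nothing about which machine wins as problems grow – exactly the masking effect the paper describes.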
Read more: http://science.sciencemag.org/content/345/6195/420
Quantum computing, in my opinion, is a goal worth pursuing. Even if the D-Wave system does not fulfil its promise, this will hardly be the end of the quantum computing effort. The goal of harnessing almost unimaginable computational power, of solving problems which simply can’t be tackled with conventional computers, is too attractive to abandon.

I’ll believe in quantum computing when there’s a fusion-powered space elevator taking us to LEO.
It’s 1890. Wild-eyed inventors working in garages in France, the US, the UK and Germany are predicting that their radical new machines could provide motive power for personal carriages and wagons – without steam. The real wackos were predicting that such machines might enable men to fly through the air like birds.
Tom Watson, founder of IBM, famously predicted that world demand for electronic computers would be five, max. Ken Olsen, founder of Digital Equipment, famously threw Steve Jobs out of his office for claiming that ordinary people would want to have computers in their homes.
We ignore possible futures at our own risk.
The interesting question is – what if? For this, we need science fiction.
On another thread it was pointed out that our greatest peril is the arrival of an errant bolus from space. The classic exploration of this possibility is Larry Niven and Jerry Pournelle’s “Lucifer’s Hammer”, in which a comet rated a near miss breaks up rounding the sun, and the pieces disrupt our comfortable life here on terra (was) firma.
For quantum computing, the current classic is Matthew Mather’s “The Complete Atopia Chronicles.” One of the first problems the D-Wave folks point to is the sports-team problem. Imagine if Google and Facebook were to combine all of their info on each of us (our history, likes and dislikes, locations, activities and friends) and feed it all into a QC – Oh wait, who are the investors in D-Wave?
As the book describes, one result is Pfuture News – the news before it happens, to a high statistical probability. “With great powers, they said, came strange responsibilities.”
The good news (depending on your view of the future), is that quantum computing is developing in a highly competitive entrepreneurial environment. It will survive based on its ability to solve real problems in the real world, not by government fiat or personal assertion (are you listening MM?).
I can’t resist another quote, about the advantages of turning control of your physical body over to an AI bot (proxxi) so that you can do the really “important” stuff, like virtual sex, etc.
“Imagine performing more at work while being there less. Want to get in shape? Your new proxxi can take you for a run while you relax by the pool!” she exclaimed, stopping her walk to look directly into each viewer’s eyes. “Look how you want, when you want, where you want, and live longer doing it. Create the reality you need right now with Atopian pssionics. Sign up soon for zero cost!”
The D-Wave prototype is not a universal quantum computer. It is not digital, nor error-correcting, nor fault tolerant. It is a purely analog machine designed to solve a particular optimization problem. It is unclear if it qualifies as a quantum device.
link https://www.quantamagazine.org/20150122-quantum-computing-without-qubits/
Maybe another $50 million will produce a bit of an answer.
The great quantum computing scam. Sound familiar? This is just another fraud by “scientists” to raise money from a generous and long-suffering public by waving big words around. If a thing violates all common sense, then it’s the thing that’s invariably at fault, not common sense. Into the hocus pocus bucket with this.
There are only two options, really. One is that D-Wave is wrong in stating that their machine is a quantum computer. The other is that quantum theory is wrong. Pick one and then file a proposal for grant funding.
This topic was broached a couple of years ago on a forum I participate in:
https://channel9.msdn.com/Forums/Coffeehouse/flops-Super-computers-and-Quantum-Computing
Some of my comments back then:
Jan 20, 2014 at 7:25PM
“While D-Wave’s hardware is better at dealing with structured code, it runs neck-and-neck with the ‘fake’ system when tackling random problems.”
http://www.engadget.com/2014/01/20/google-tests-the-performace-limits-of-d-wave-quantum-computers/
– – –
Dec 27, 2013 at 4:10PM
A company near me sold a quantum computer to Lockheed Martin.
http://www.theglobeandmail.com/news/british-columbia/bc-firms-superfast-quantum-computer-upgrade-attracts-lockheed-martin/article10825838/
It’s hard to determine what is meant by speed compared to a conventional computer. Is it compared to a single transistor? A quantum state means it has all possible values, so how do you program a quantum computer?
– – –
Energy consumption is an important factor in supercomputer usage. It’s an additional cost that must be considered for each program run.
There are two main obstacles that need to be overcome for future improvements :
Memory: http://www.theregister.co.uk/2013/09/14/intel_exascale_update_deepdive/
And interconnect: … http://insidehpc.com/2013/11/15/doe-awards-25-4-million-exascale-interconnect-design/
I remember IBM bought the Inmos transputer in the ’80s, and I saw it demonstrated at IBM’s development lab in Winchester, UK. I was impressed. It was claimed to be the future of computing. Although these days we have symmetric multiprocessing (SMP), multiple cores, 64-bit architecture and oodles of RAM even on laptops, such as the one I am using right now, it is nothing like what the transputer was. I don’t think the transputer ever got into production outside the lab.
The transputer made it into limited production, and was not all that impressive: 4 cores, not all that fast. The i7 quad-core is getting a bit old, but could blow the doors off anything prior.
From memory, the demo unit had 8 “cores” with a 9th to control communications between all the others.
Ya gotta love it.
Only the computer they’re describing does nothing new in computing for these problems. Sheer hardware miniaturization is not automatically more intelligent computing, just faster.
Whenever one looks at how nature solves these problems now, one finds that neural cells grow and interconnect to better assimilate and process new data and problems.
What is also likely, though I’ve never heard of any such discovery, is that nature has surely developed or implemented quantum processing somewhere, somehow. As with so many other mechanical problems, nature has had the time, the materials and nearly infinite opportunity to develop it. Consider the little bugs, germs, planaria, insects and even birds and squirrels that seem to be much smarter than their brain sizes would suggest.
Look on the web for methods to keep squirrels out of bird feeders. There are some hilarious videos out there where squirrels run the gauntlet and still get the bird seed.
I bet those squirrels drank Carling Black Label beer.
Maybe this will lead to computers by the name of Orac?
https://en.wikipedia.org/wiki/Orac_(Blake%27s_7)
No one imagined a device that combined a Star Trek communicator and tricorder into one handy gadget that fits in your shirt pocket. No one imagined that such a device would become ubiquitous across all of society. Not even Steve Jobs imagined this when he was watching Star Trek in his youth, or even when working on his NeXT computer. Jobs failed at packaging and marketing technology with the NeXT computer, but rode a new wave of technology advancements and became wildly successful in packaging and marketing those advancements as a consumer product.
Technological advance and its application are chaotic and unpredictable (kind of like weather and the climate). Quantum computing could change how civilization works, as cell phones have, or be a niche product, or a big zero, like the vacations we once imagined we could take at lunar-colony resorts by the year 2000.
It all depends on the actual technology breakthroughs.
Mobile technology depends on ultra high transistor packing density, lithium ion batteries, LCD screens and gigahertz radio wave technology.
A vacation on the moon depends on massive access to cheap energy, or some kind of quantum tunnelling teleportation.
With a space elevator on both the earth and the moon, you could dramatically reduce the amount of energy required.
Of course that would require developing the technology to build space elevators. Sigh.
Dick Tracy began using his I-watch in 1946.
The main problems with Climate Science are poor models and grossly inadequate data coverage. Quantum Computers are not going to help with this.
Au Contraire. QC may be able to get blatantly incorrect answers in orders of magnitude less time. We’ll have 10000 models that don’t make correct predictions. 10000 is 100 times better than 100, right?
More computer horsepower would enable a reduction in grid size.
Smaller grids would mean some of the things that currently have to be parameterized could be modeled instead.
Of course the answers would still be wrong, but they may not be quite as wrong as before.
(Smaller grid sizes would be a big improvement for weather forecasting though.)
I remember back in the 1980’s, fuzzy logic was supposed to take over the world of computing.
You mean the arm waving by Seth Lloyd hasn’t produced anything yet?
Reading through the commentary here, I propose a variation of Godwin’s Law: If an internet conversation about computers goes on long enough, it will inevitably devolve into reminiscing about card reader / punch card programming. 🙂
rip
At least nobody has talked about the really old computers that had to be programmed with toggle switches.
Well, now that you mention it, I did in fact see one of those… I had a cousin who worked at a bank, very early 1970s.
But it was an advanced version: you entered enough code with the switches (a “boot loader”) so that the computer could then read in an operating system from a mag tape to “boot” the machine. The term comes from “bootstrapping”.
My cousin was good at it; he could enter about 100 lines of code with the toggle switches in about 5 minutes. One word of caution: don’t tap him on the shoulder in the middle of it and ask, “What’s that you’re doing?”…
As non-volatile memory advanced, the “boot loader” became the BIOS: Basic Input/Output System. It is still in use today; the first thing that happens after power-up (tablet, iPhone, laptop, mainframe) is that the processor starts executing the BIOS code, which adds support for floppy disks, hard drives, com ports, etc.
Don’t be fergettin the “plugboard” computers … such as the Univac 1004.
The UNIVAC 1004 was a plug-board programmed punched-card data processing system, introduced in 1962. https://en.wikipedia.org/wiki/UNIVAC
ripshin: Oh please. I used punch cards for a few years and paper tape as well. A giant pain. Don’t miss them in the slightest. I remember when disk drives were measured in kilobytes (DEC DF-32 anyone?) and the primary I/O was a KSR-33 Teletype. UGH! My ancient desktop has 2.5 GByte memory, 250 GByte hard drive, a nice lcd monitor, with 5 in. optical disk for backup. 15 years old. Newer stuff is far better. (Well, maybe not software …) As for the past, good riddance.
Mark W: Fuzzy logic has its place. The circular file comes to mind.
Mostly I agree. However the first remotely modern computer I worked on (the previous ones had vacuum tubes) was the CDC1604. It used a modified IBM Selectric typewriter for a console. The film ribbons were a nuisance, but the keyboard was far superior to anything I’ve ever seen hooked to a modern PC.
Do you remember the old dishwasher sized IBM disk drives, the one with removable platters?
> Do I remember the dishwasher sized IBM disk drives with the removable platters?
All too well. BTW, if you put one of those packs that someone had dropped on the floor (thereby bending a couple of platters) into the disk drive and pressed LOAD, it made a most interesting and quite expensive noise when the heads crashed into the bent platters.
3380’s used to do the same thing after a cold start.
It is quite possible the brain also performs quantum computing at the molecular level. The usual objection – that this environment is too hot (i.e. body temperature) to maintain quantum coherence for any length of time – does not hold water, because the system is as far from thermodynamic equilibrium as it can be. Subsystems of pumped non-equilibrium systems can stay at an arbitrarily cold virtual temperature for extended periods, as demonstrated by laser cooling.
If the power supply of neural tissue is turned off – that is, if either its incoming sugar or oxygen stream is severed – the brain suffers instant, irreversible structural damage. Quite odd behavior from an organ which is meant to last for many decades. It shows Nature tried to solve a nearly unsolvable problem here (due to some irresistible evolutionary pressure), rather successfully, I must add. What that problem might be is anyone’s guess, but it may well be quantum computing or something even better. Who knows?
It is quite possible the brain also performs quantum computing at the molecular level.
Is this not just one of those models proponents of quantum computing use to assert its merits, but with little evidence? Researchers can actually map the signals in a small number of neurons. What is clear is that the signal is amplitude “modulated”, in that the intensity of the signal influences the output. In short, neurons work in analog, not digital. This means a single neuron connection can have, in theory, an infinite number of states (but only one state at a time – not quantum). But the power really comes when you add the parallel nature of how the brain processes data. Millions and millions of neurons can act together (and remember, each connection can transfer information from a possibly infinite number of states) in different parts of the brain. Work on simpler creatures, where the cells have been doped with a photo-chemical marker, shows how a neural system works as a system.
The interruption or loss of the sugar or oxygen stream does not always cause the human brain to suffer instant, irreversible structural or functional damage. Now, I specifically noted the human brain because there are a few animal species whose bodies and brains will freeze solid when the weather turns cold … and will remain frozen until the springtime warm temperatures return and defrost their bodies … and they will continue doing what they had been doing when they were frozen.
The brains of higher-level animals, especially the human brain, are biological self-programming supercomputers capable of increasing their memory capacity (growth of new neurons and their respective synaptic connections) if and when it is needed.
The architecture of a biological computer is unlike that of an electronic computer in that the latter utilizes a physical storage-addressing scheme for storing and recalling info or data, whereas the former (the brain) utilizes the data or info itself for the storing of new data or the recalling of previously stored data.
You might find this commentary of mine an interesting read, to wit:
Biology of a cell: Genetic memory versus Environmental memory
Well, the paramecium is a single-celled creature, so it has no nervous system at all, for it has exactly zero neurons. Still, it exhibits quite complex behavior; it can even learn. How?
https://www.youtube.com/watch?v=7vrXaNIdwXQ