Quantum Computing Begins to Emerge

Since the invention of the transistor in the late 1940s, the increasing density of manufacturable circuits has always led to faster and cheaper machines. According to Moore's Law, computers double in speed roughly every 18 months, while the cost of a given level of performance is cut in half.
At every point where this trend appeared ready to stall, we've found new ways forward. For example, while clock speeds have stalled at around 4 GHz, microprocessors have added multiple cores running at those speeds. However, quantum mechanics will converge with the complexities of parallel processing to create a practical limit to the power of traditional computers.
But ironically, the same quantum mechanical principles that will eventually block the road toward ever more powerful conventional computers will also offer the opportunity to create specialized computers. As we've discussed in prior issues, quantum computing represents an entirely new way of computing aimed at solving problems that are impractical or impossible for traditional computers.
The D-Wave One quantum computer is shown with its inventor, Geordie Rose. It occupies over 1,000 cubic feet. Its size, and its specialized and possibly dead-end reliance on "quantum annealing," make this quantum computer roughly analogous to ENIAC, the first digital computer. That is, by bringing quantum computing out of the lab for the first time, the D-Wave One may, like ENIAC, prepare the way for a whole new industry.
In conventional electronic computers, a bit of data is represented as either a 0 or a 1. Data is stored according to this rule of binary logic. In the weird world of quantum mechanics, a quantum bit, or "qubit," can not only register a 0 or a 1, it can register both simultaneously, a phenomenon known as "superposition." Although it is difficult to grasp how this can be, the precise laws of quantum mechanics enable predictions of what quantum computers will do.
A small computer of only a few tens of thousands of qubits would be enormously powerful, since qubits in superposition work together to handle exponentially more data. In fact, for some applications, they'd be more powerful than all the computers that have ever been built combined. But getting to tens of thousands of qubits is still an enormous hurdle.
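To get a feel for that exponential scaling, consider a rough classical sketch (our own illustration, not anything drawn from D-Wave): simulating an n-qubit register on an ordinary computer means tracking 2^n numbers, one for every bit string the register could hold, so each added qubit doubles the work.

```python
# Illustrative only: why simulating qubits classically gets hard fast.
import numpy as np

def equal_superposition(n_qubits):
    """State vector of n qubits, each in an equal superposition of 0 and 1."""
    dim = 2 ** n_qubits                       # one amplitude per classical bit string
    return np.full(dim, 1.0 / np.sqrt(dim))   # every bit string equally weighted

for n in (1, 10, 30, 50):
    print(f"{n} qubits -> {2 ** n:,} amplitudes to track on a classical machine")

state = equal_superposition(3)
print(state)               # 8 amplitudes: the register "holds" 000 through 111 at once
print(np.sum(state ** 2))  # the probabilities still sum to 1
```

At 50 qubits the table of amplitudes already has more than a quadrillion entries, which is the sense in which a modest quantum machine could outpace any conventional one on the right problem.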
However, a major milestone in quantum computing has recently been reached: the sale of the first "quantum computer." It was purchased by Lockheed Martin for $10 million.1
Called the D-Wave One, this computer features 128 qubits. The qubits are formed by loops of niobium metal, a material that becomes a superconductor at very low temperatures. Niobium is commonly used in MRI scanners. Couplers, also made of niobium, link the qubits and control how the magnetic fields, which represent the qubits, affect each other.2
How does it work? First, magnetic fields are used to input the data and instructions by setting the states of the qubits and couplers. Then, after a short time, the qubits undergo a series of quantum mechanical changes, and the final answer is read out, again as magnetic fields.
As Geordie Rose, D-Wave's founder, describes it, "You stuff the problem into the hardware and it acts as a physical proxy for what you're trying to solve. All physical systems want to sink to the lowest energy level, with the most entropy, and ours sinks to a state that represents the solution."
This process of sinking to the lowest energy level that Rose refers to is called "annealing." The D-Wave One processor calculates solutions by piggybacking a user's problem onto "quantum annealing" to reveal the solution.
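The hardware does this with quantum effects, but the classical cousin of the idea, "simulated annealing," conveys the intuition. The sketch below is purely illustrative (a classical toy on a made-up energy function, not D-Wave's programming model): candidate answers are scored by an energy, and the system is allowed to wander, gradually "cooling" until it settles into a low-energy configuration that represents a solution.

```python
# Classical analogue of "sinking to the lowest energy" (toy example only;
# the D-Wave hardware reaches low-energy states physically, not by this loop).
import math
import random

def energy(bits):
    # Hypothetical cost: neighboring bits "want" to disagree, and the first bit "wants" to be 1.
    e = -bits[0]
    for a, b in zip(bits, bits[1:]):
        e += 1 if a == b else -1
    return e

def anneal(n_bits=8, steps=5000):
    bits = [random.randint(0, 1) for _ in range(n_bits)]
    temperature = 2.0
    for _ in range(steps):
        candidate = bits[:]
        candidate[random.randrange(n_bits)] ^= 1   # flip one randomly chosen bit
        delta = energy(candidate) - energy(bits)
        # Always accept downhill moves; accept uphill moves with shrinking probability.
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            bits = candidate
        temperature *= 0.999                       # slowly "cool" the system
    return bits, energy(bits)

print(anneal())   # settles into a low-energy configuration such as [1, 0, 1, 0, 1, 0, 1, 0]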
The D-Wave One performs a single type of mathematical operation called "discrete optimization." Using this approach, the computer provides approximate answers to problems that could otherwise only be solved by trying every possible solution. Allan Snavely at the San Diego Supercomputer Center describes these problems as the type "where you know the right answer when you see it, but finding it among the exponential space of possibilities is difficult."
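Snavely's description can be made concrete with a small toy problem of our own (number partitioning, not something taken from D-Wave's documentation): checking whether a proposed split of a list into two equal-sum halves is correct takes a moment, but a naive search has to wade through 2^n possible splits, a space that doubles with every item added.

```python
# Toy discrete optimization: easy to verify a candidate, exponentially many candidates.
from itertools import product

weights = [4, 7, 1, 9, 5, 2]   # hypothetical values; total is 28

def is_balanced(assignment):
    """Cheap check: do the two groups sum to the same value?"""
    left = sum(w for w, side in zip(weights, assignment) if side == 0)
    return left * 2 == sum(weights)

# Exhaustive search over all 2**n ways to assign each item to group 0 or group 1.
solutions = [a for a in product((0, 1), repeat=len(weights)) if is_balanced(a)]
print(f"checked {2 ** len(weights)} candidates, found {len(solutions)} balanced splits")
print(solutions[0] if solutions else "none")
```

With six items the search is trivial; with a few hundred, exhaustive search is hopeless on any conventional machine, which is exactly the regime quantum annealing is aimed at.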
Reminiscent of the time when a computer filled a whole room, the D-Wave One occupies over 1,000 cubic feet. Its size, and its specialized and possibly dead-end reliance on "quantum annealing," make this quantum computer roughly analogous to ENIAC, the first digital computer. ENIAC was incredibly large, slow, and unreliable, with fewer capabilities than a 1975 HP programmable calculator. However, it is universally acknowledged as the forerunner of every current digital computer, ranging from the iPad to the fastest supercomputer. By bringing quantum computing out of the lab for the first time, the D-Wave One may, in a similar way, prepare the way for a whole new industry.
Lockheed Martin certainly views this computer as a usable tool, not a lab curiosity. Operating the D-Wave One in conjunction with a conventional computer, the combined system learns from past data and makes predictions about future events. Lockheed needs this capability to identify unforeseen technical problems in products that are complex combinations of software and hardware. These types of glitches contribute to cost overruns, such as the one the company is experiencing with the F-35 strike fighter, which is 20 percent over budget.
Other applications are looking equally promising:
- Google is testing the D-Wave computer to see if it can speed up software for the interpretation of photos.
- Faster database searches might also be possible with quantum computers.
- Simulations of physical systems are also highly likely applications; for example, analyzing the complex properties of solid-state, chemical, and high-energy systems.
In light of this trend, we provide the following three forecasts for your consideration.
First, we won't have quantum computers sitting on our desks anytime soon, if ever.
At the current rate of development, the best guess for the arrival of personal quantum computers is around 2050. But even then, many tasks will still be performed better by conventional, silicon-based computers. Where quantum computers are faster, they are exponentially faster; but for the kind of everyday computing consumers will require, conventional computers will meet the need as they continue to improve dramatically over the next couple of decades. Another large hurdle for quantum computers going "mainstream" is their inherent instability: a quantum superposition is very fragile, and major technological breakthroughs will be needed before this technology can withstand the typical environment of a laptop computer.
Second, by 2025, quantum computers will make complex applications commonplace using techniques that are currently hard to imagine.
Quantum computing will excel at solving problems that involve millions, if not billions, of factors. For example, weather forecasting could become more accurate. Quantum computers could even enable us to predict natural disasters such as earthquakes. Tracking the motions of planets and stars will improve, as will the monitoring of the thousands of known asteroids that could potentially veer into Earth's path. Also, tasks such as sequencing an entire genome might take mere minutes to complete. It could then become possible and affordable for millions of individuals to have their DNA checked for preventable diseases or disorders that may be lurking undetected.
Third, as soon as 2015, quantum computing could make today's network security technology totally obsolete.
Even small quantum computers will possess the kind of computing power needed to break all known codes in use today. This would be a major disruption to the way we interact on the Internet, and as a result, we will be forced to rethink security. Not surprisingly, this issue has captured the attention of the National Security Agency. Fortunately, so-called quantum encryption techniques are already under development, using quantum entanglement to create security schemes that even quantum computers can't break. While this will enable law-abiding citizens to maintain their privacy, it will also enable terrorists and criminal organizations to do the same.
References List:
1. Tom Simonite, "Tapping Quantum Effects for Software that Learns," MIT Technology Review, June 1, 2011. © Copyright 2011 by MIT Technology Review. All rights reserved. http://www.technologyreview.com
2. For more information about the D-Wave One computer, visit the D-Wave Systems website at: http://www.dwavesys.com