Quantum Computer
The basic principle of quantum computation is that quantum properties can
be used to represent and structure data, and that quantum mechanisms can be
devised and built to perform operations on this data.
Research in both theoretical and practical areas continues at a frantic pace, and
many national government and military funding agencies support quantum
computing research to develop quantum computers for both civilian and national
security purposes, such as cryptanalysis.
If large-scale quantum computers can be built, they will be able to solve certain
problems, such as integer factorization via Shor's algorithm, exponentially faster
than any current classical computer. Quantum computers also differ fundamentally
from other computing paradigms such as DNA computers and traditional
transistor-based computers.
In particular, most popular public-key ciphers, including RSA, rely on the
difficulty of factoring large integers. These ciphers protect secure Web pages,
encrypted email, and many other kinds of data, so breaking them would have
significant ramifications for electronic privacy and security. The only way to
shore up an algorithm like RSA against such an attack would be to increase the
key size and hope that no adversary has the resources to build and use a
sufficiently powerful quantum computer.
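RSA's dependence on factoring can be seen in a toy example. The primes and exponent below are illustrative textbook values, far too small for real use:

```python
# Toy RSA sketch with tiny primes (illustration only -- real keys use
# 2048-bit or larger moduli; these numbers are trivially factorable).
p, q = 61, 53          # secret primes
n = p * q              # public modulus: 3233
phi = (p - 1) * (q - 1)
e = 17                 # public exponent
d = pow(e, -1, phi)    # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = pow(msg, e, n)    # encrypt with the public key (e, n)
plain = pow(cipher, d, n)  # decrypt with the private key (d, n)
assert plain == msg

# An attacker who can factor n back into p and q recovers phi and d --
# which is exactly the step a large quantum computer would make feasible.
```

The whole private key falls out of knowing p and q, which is why key size is the only remaining dial to turn.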
Artificial Intelligence
A primary application for quantum computing is artificial intelligence
(AI). AI is based on the principle of learning from experience,
becoming more accurate as feedback is given, until the computer
program appears to exhibit “intelligence.”
Google has already made forays into this field by simulating the energy
of hydrogen molecules. The implication is more efficient
products, from solar cells to pharmaceutical drugs, and especially
fertilizer production; since fertilizer accounts for 2 percent of global
energy usage, the consequences for energy and the
environment would be profound.
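The learn-from-feedback loop described above can be sketched with a deliberately tiny model; the data, learning rate, and one-weight model here are arbitrary illustrative choices:

```python
# Minimal sketch of "learning from experience": a one-weight model is
# nudged toward the truth each time it sees its own prediction error.
w = 0.0                      # initial guess for the slope of y = 2x
data = [(1, 2), (2, 4), (3, 6), (4, 8)]
for _ in range(50):          # repeated experience
    for x, y in data:
        error = y - w * x    # feedback: how wrong was the prediction?
        w += 0.05 * error * x  # adjust in proportion to the error
print(round(w, 3))           # converges near 2.0
```

Each pass shrinks the error, which is the "becoming more accurate as feedback is given" behavior in its simplest possible form.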
Cryptography
Most online security currently depends on the difficulty of factoring
large numbers into primes. While this can presently be accomplished
by using digital computers to search through every possible factor, the
immense time required makes “cracking the code” expensive and
impractical.
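That brute-force search can be sketched with plain trial division (one simple classical approach; real attacks use faster sieves, but the exponential-in-digits scaling is the same):

```python
import math

def trial_factor(n):
    """Classical factoring by trial division: O(sqrt(n)) divisions,
    i.e. exponential in the number of digits of n."""
    for f in range(2, math.isqrt(n) + 1):
        if n % f == 0:
            return f, n // f
    return n, 1  # n is prime

# An 11-digit semiprime takes ~100,000 divisions this way; a 600-digit
# RSA modulus would take longer than the age of the universe.
print(trial_factor(104723 * 104729))
```

Doubling the number of digits roughly squares the work, which is exactly the "expensive and impractical" barrier current security relies on.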
Financial Modeling
Modern markets are some of the most complicated systems in
existence. While we have developed increasingly scientific and
mathematical tools to address this, the field still suffers from one major
difference from other sciences: there is no controlled setting
in which to run experiments.
Weather Forecasting
NOAA Chief Economist Rodney F. Weiher claims that nearly 30 percent of
US GDP ($6 trillion) is directly or indirectly affected by weather,
impacting food production, transportation, and retail trade, among
others. The ability to better predict the weather would be of enormous
benefit to many fields, not to mention providing more time to take
cover from disasters.
While this has long been a goal of scientists, the equations governing
such processes contain many, many variables, making classical
simulation lengthy. As quantum researcher Seth Lloyd pointed out,
“Using a classical computer to perform such analysis might take
longer than it takes the actual weather to evolve!” This motivated
Lloyd and colleagues at MIT to show that the equations governing the
weather possess a hidden wave nature that is amenable to
solution by a quantum computer.
The United Kingdom’s national weather service, the Met Office, has
already begun investing in such innovation to meet the power and
scalability demands it will face in the 2020s, and it has released a
report on its own requirements for exascale computing.
Particle Physics
Coming full circle, a final application of this exciting new physics might
be… studying exciting new physics. Models of particle physics are
often extraordinarily complex, confounding pen-and-paper solutions
and requiring vast amounts of computing time for numerical
simulation. This makes them ideal for quantum computation, and
researchers have already been taking advantage of this.
1. Machine Learning
Machine learning is a hot area right now because significant deployments are
appearing at the consumer level across many different platforms. We encounter
aspects of it every day in voice, image, and handwriting recognition, to name just
a few examples. But it is also a difficult and computationally expensive task,
particularly if you want to achieve good accuracy. Because of the potential payoff,
there is a lot of ongoing research based upon sampling of Boltzmann distributions.
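As a rough illustration of the core primitive, sampling from a Boltzmann distribution over a handful of state energies can be sketched in a few lines (the energies and temperature below are arbitrary):

```python
import math, random

def boltzmann_sample(energies, T, rng=random):
    """Draw a state index with probability proportional to exp(-E/T),
    the Boltzmann distribution referenced in quantum ML research."""
    weights = [math.exp(-e / T) for e in energies]
    r = rng.uniform(0, sum(weights))
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(energies) - 1  # guard against float rounding

# Lower-energy states dominate at low temperature:
random.seed(0)
counts = [0, 0, 0]
for _ in range(10_000):
    counts[boltzmann_sample([0.0, 1.0, 2.0], T=0.5)] += 1
print(counts)  # state 0 drawn far more often than state 2
```

Training models such as Boltzmann machines requires enormous numbers of these samples, which is where quantum hardware is hoped to help.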
2. Computational Chemistry
There are many problems in materials science where a huge payoff awaits if we
can just find the right catalyst or process to develop a new material, or to produce
an existing material more efficiently. There is already a significant effort to
simulate chemical interactions on classical computers, but in many cases the
problems become classically intractable. Hence the original idea presented by
Richard Feynman: why not use a quantum computer to simulate the quantum
mechanical processes directly? Here are just a few examples of significant
problems that could see large payoffs if we can solve them.
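A back-of-the-envelope sketch shows why these simulations become classically intractable: the memory needed to write down an n-particle quantum state doubles with every particle added.

```python
# The quantum state of n interacting two-level particles (electron
# spins, say) needs a vector of 2**n complex amplitudes to represent.
for n in (10, 20, 30, 50):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * 16   # 16 bytes per complex double
    print(f"n={n:2d}: {amplitudes:.3e} amplitudes, {bytes_needed:.3e} bytes")
# n=50 already needs ~1.8e16 bytes (tens of petabytes) of memory,
# while a quantum computer holds the same state in 50 qubits.
```

This doubling is Feynman's argument in miniature: only a quantum system stores a quantum state at native cost.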
3. Financial Portfolio Optimization
Finding the optimum mix for a basket of investments based upon projected returns,
risk assessments, and other factors is a daily task within the finance industry.
Monte Carlo simulations are constantly being run on classical computers and
consume an enormous amount of computer time. By utilizing quantum technology to
perform these calculations, one could improve both the quality of the solutions and
the time to develop them. Because money managers handle billions of dollars, even
a 1% improvement in return is worth a lot of money. A web site called Quantum for
Quants is devoted to this subject if you want to learn more.
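A minimal sketch of the kind of Monte Carlo run described above, with hypothetical asset parameters (real desk models add correlations, fat tails, and multi-period paths):

```python
import random, statistics

def simulate_portfolio(weights, means, vols, n_paths=20_000, seed=0):
    """Monte Carlo sketch: draw one-period returns for each asset from
    a normal distribution and average the weighted portfolio return."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_paths):
        r = sum(w * rng.gauss(m, v)
                for w, m, v in zip(weights, means, vols))
        outcomes.append(r)
    return statistics.mean(outcomes), statistics.stdev(outcomes)

# Hypothetical two-asset mix: 60% stocks (8% mean, 15% vol),
# 40% bonds (3% mean, 5% vol).
mean, risk = simulate_portfolio([0.6, 0.4], [0.08, 0.03], [0.15, 0.05])
print(f"expected return {mean:.1%}, risk {risk:.1%}")
```

Production runs repeat this across thousands of assets and time steps, which is where the enormous classical compute bill comes from.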
4. Logistics and Scheduling
Many common optimizations used in industry can be classified under logistics and
scheduling. Think of the airline logistics manager who needs to figure out how to
stage his airplanes for the best service at the lowest cost. Or the factory manager
with an ever-changing mix of machines, inventory, production orders, and people,
who needs to minimize cost and throughput times while maximizing output. Or the
pricing manager at an automobile company who needs to figure out the optimum
prices for the dozens of car options to maximize customer satisfaction and profit.
Although classical computing is used heavily for these tasks, some of them may be
too complicated for a classical solution, whereas a quantum approach may be able
to handle them.
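The combinatorial explosion behind such problems shows up even in a toy brute-force scheduler (assigning jobs to machines to minimize the makespan, i.e. the finish time of the busiest machine):

```python
from itertools import product

def best_assignment(job_times, n_machines):
    """Brute-force scheduling: try every assignment of jobs to machines
    and keep the one with the smallest makespan. The search space is
    n_machines ** n_jobs -- the explosion that makes large instances
    intractable for exact classical solution."""
    best, best_plan = float("inf"), None
    for plan in product(range(n_machines), repeat=len(job_times)):
        loads = [0] * n_machines
        for job, machine in zip(job_times, plan):
            loads[machine] += job
        if max(loads) < best:
            best, best_plan = max(loads), plan
    return best, best_plan

makespan, plan = best_assignment([3, 5, 2, 7, 4], n_machines=2)
print(makespan, plan)  # only 2**5 = 32 candidate plans for 5 jobs
```

Five jobs mean 32 candidates; fifty jobs on four machines mean 4**50 candidates, which is why real schedulers settle for approximate answers.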
5. Drug Design
Although drug design is really a problem in computational chemistry, I have put it
into its own classification because of its importance to the pharmaceutical industry.
Many drugs are still developed through trial and error. This is very expensive, and
more effective ways of simulating how a drug will react would save a great deal of
money and time.
6. Cyber Security
Cyber security is becoming a larger issue every day as threat actors around the
world increase their capabilities and we become more vulnerable through our
growing dependence upon digital systems. Various techniques to combat cyber
security threats can be developed using some of the quantum machine learning
approaches mentioned above to recognize threats earlier and mitigate the damage
they may do.
7. Codebreaking
You may wonder why I have put codebreaking so far down the list, given all the
attention paid to Shor’s algorithm and its ability to factor large numbers and break
RSA encryption. The reason is that I believe this will be only a temporary
application until the world converts to a class of “post-quantum” cryptographic
techniques that will not be vulnerable to attack by a quantum computer. There is an
increasing amount of research in post-quantum cryptography. So although we will
probably have quantum computers able to factor very large numbers ten years from
now, it is not clear whether we will still have a use for that capability by then.
8. Circuit, Software, and System Fault Simulation
When one develops large software programs with millions of lines of code, or large
ASIC chips with billions of transistors, it can get awfully difficult and expensive to
verify them for correctness. There can be billions or trillions of different states, and
it is impossible for a classical computer to check every single one in simulation. Not
only does one want to understand what will happen when the system is operating
normally; one also wants to understand what happens if there is a hardware or other
error. Will the system detect it, and does it have a recovery mechanism to mitigate
the problem? The cost of an error can be very high, because some of these systems
are used where lives or millions of dollars depend on their being error-free. By
using quantum computing to help with these simulations, one could potentially
achieve much better coverage in greatly reduced time.
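A toy version of exhaustive verification shows both the idea and why it stops scaling: checking a 4-bit adder against its arithmetic spec takes 256 cases, but a 64-bit datapath would need 2**128.

```python
# Exhaustive hardware verification in miniature: compare a bit-level
# ripple-carry adder against its spec for every possible input pair.
def ripple_add(a, b, bits=4):
    carry, out = 0, 0
    for i in range(bits):
        x, y = (a >> i) & 1, (b >> i) & 1
        s = x ^ y ^ carry                      # sum bit
        carry = (x & y) | (carry & (x ^ y))    # carry bit
        out |= s << i
    return out

for a in range(16):
    for b in range(16):
        assert ripple_add(a, b) == (a + b) % 16, (a, b)
print("all 256 input pairs verified")
```

Full coverage is trivial here and hopeless at production scale, which is exactly the gap better simulation, classical or quantum-assisted, is meant to close.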