
Quantum computing is the use of quantum-mechanical phenomena such as superposition and
entanglement to perform computation. Computers that perform quantum computation are known as
quantum computers.[1]:I-5 Quantum computers are believed to be able to solve certain computational
problems, such as integer factorization (which underlies RSA encryption), significantly faster than
classical computers. The study of quantum computing is a subfield of quantum information science.

Quantum computing began in the early 1980s, when physicist Paul Benioff proposed a quantum
mechanical model of the Turing machine.[2] Richard Feynman and Yuri Manin later suggested that a
quantum computer had the potential to simulate things that a classical computer could not.[3][4] In
1994, Peter Shor developed a quantum algorithm for factoring integers that had the potential to decrypt
RSA-encrypted communications.[5] Despite ongoing experimental progress since the late 1990s, most
researchers believe that "fault-tolerant quantum computing [is] still a rather distant dream".[6] In recent
years, investment in quantum computing research has increased in both the public and private
sectors.[7][8] On 23 October 2019, Google AI, in partnership with the U.S. National Aeronautics and Space
Administration (NASA), published a paper in which they claimed to have achieved quantum supremacy.
[9] While some have disputed this claim, it is still a significant milestone in the history of quantum
computing.[10]

Quantum computing is modeled by quantum circuits. Quantum circuits are based on the quantum bit, or
"qubit", which is somewhat analogous to the bit in classical computation. Qubits can be in a 1 or 0
quantum state, or they can be in a superposition of the 1 and 0 states. However, when qubits are
measured the result is always either a 0 or a 1; the probabilities of these two outcomes depend on the
quantum state that they were in immediately prior to the measurement. Computation is performed by
manipulating qubits with quantum logic gates, which are somewhat analogous to classical logic gates.
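The amplitude-to-probability rule described above can be sketched in a few lines of plain Python (an illustrative example, not drawn from the source; the variable names are arbitrary):

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
ket0 = (1 + 0j, 0 + 0j)   # the |0> state
ket1 = (0 + 0j, 1 + 0j)   # the |1> state

# Equal superposition (|0> + |1>) / sqrt(2).
s = 1 / math.sqrt(2)
psi = (s * ket0[0] + s * ket1[0], s * ket0[1] + s * ket1[1])

# On measurement, each outcome's probability is the squared magnitude
# of the corresponding amplitude.
p0 = abs(psi[0]) ** 2
p1 = abs(psi[1]) ** 2
print(p0, p1)  # both ~0.5
```

Here the measurement always yields 0 or 1, each with probability one half, exactly as the text describes for an equal superposition.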

Quantum computers are subject to the limitations set out by the strong Church–Turing thesis. Many
hypothetical machines have been proposed; however, not all of them can be constructed in reality. Because
the thesis concerns machines that are physically realizable, such unbuildable hypothetical machines
cannot represent a quantum computer that satisfies the strong Church–Turing thesis. The thesis
originally provided the framework for the hypothetical quantum computers used to realize quantum
algorithms such as Shor's algorithms for factorization and discrete logarithms. Specifically, Shor's
algorithms use the quantum gate array model to simulate a quantum computer with an arbitrary
Hamiltonian. There are many other quantum models, including the adiabatic quantum computer and the
one-way quantum computer. When performing a calculation, the model of best fit is selected.[11]

Computational ability is limited by the space and time requirements of a process, such as an algorithm.
While classical computation is limited only by space and time, quantum computation introduces a new
limiting variable: precision. Precision arises inherently from the superposition of states in quantum
mechanics and is a measure of the accuracy of a computation. It is governed by the Heisenberg
uncertainty principle, which similarly governs measurements in quantum mechanics. The introduction of
precision into quantum computations is thought to enable quantum computers to solve, in polynomial
time, problems whose solutions on a classical computer require nondeterministic polynomial time or
worse.[11]

There are currently two main approaches to physically implementing a quantum computer: analog and
digital. Analog approaches are further divided into quantum simulation, quantum annealing, and
adiabatic quantum computation. Digital quantum computers use quantum logic gates to do
computation. Both approaches use quantum bits or qubits.[1]:2–13

Any computational problem that can be solved by a classical computer can also, in principle, be solved by
a quantum computer. Conversely, quantum computers obey the Church–Turing thesis; that is, any
computational problem that can be solved by a quantum computer can also be solved by a classical
computer. While this means that quantum computers provide no additional power over classical
computers in terms of computability, they do in theory provide additional power when it comes to the
time complexity of solving certain problems. Notably, quantum computers are believed to be able to
quickly solve certain problems that no classical computer could solve in any feasible amount of time—a
feat known as "quantum supremacy". The study of the computational complexity of problems with
respect to quantum computers is known as quantum complexity theory.


The prevailing model of quantum computation describes the computation in terms of a network of
quantum logic gates.[12]

A memory consisting of n bits of information has 2^n possible states. A vector representing all memory
states thus has 2^n entries (one for each state). This vector is viewed as a probability vector and
represents the fact that the memory is to be found in a particular state.
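This state counting can be sketched in plain Python (an illustrative example; the chosen values are arbitrary):

```python
# A memory of n bits has 2**n possible states; its probability
# vector has one entry per state.
n = 3
dim = 2 ** n              # 8 states for 3 bits

# Classical certainty: all probability weight on a single state,
# e.g. the memory holding the bits 101.
p = [0.0] * dim
p[0b101] = 1.0

print(len(p), sum(p))     # 8 entries, total probability 1.0
```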

In the classical view, one entry would have a value of 1 (i.e. a 100% probability of being in this state) and
all other entries would be zero. In quantum mechanics, probability vectors are generalized to density
operators. This is the technically rigorous mathematical foundation for quantum logic gates, but the
intermediate quantum state vector formalism is usually introduced first because it is conceptually
simpler. This article focuses on the quantum state vector formalism for simplicity.

We begin by considering a simple memory consisting of only one bit. This memory may be found in one
of two states: the zero state or the one state. We may represent the state of this memory using Dirac
notation so that

|0⟩ := (1, 0)ᵀ;   |1⟩ := (0, 1)ᵀ
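Continuing the sketch, these column vectors can be acted on by gate matrices; for example, the Pauli-X ("quantum NOT") gate swaps the two basis states. A minimal plain-Python illustration (the helper `apply` is introduced here for demonstration only):

```python
# Basis states |0> and |1> as 2-entry vectors, per the definition above.
ket0 = [1 + 0j, 0 + 0j]
ket1 = [0 + 0j, 1 + 0j]

# Pauli-X, the quantum analogue of the classical NOT gate.
X = [[0, 1],
     [1, 0]]

def apply(gate, state):
    """Matrix-vector product: the action of a 2x2 gate on a state vector."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

print(apply(X, ket0) == ket1)  # True: X|0> = |1>
```

Larger circuits chain such gate applications, which is the "network of quantum logic gates" picture described above.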
