Physical Principles Underpinning Quantum Computing


January 5, 2024 Filippo Di Giovanni


Quantum computing harnesses the power of three critical quantum principles: superposition,
entanglement and interference
Today’s computers are ubiquitous and extremely powerful, accomplishing a wide range of tasks
in science, education, economics and everyday life. They are accessible to anyone who can
afford a laptop or a mobile phone. Despite the phenomenal progress in processing power spurred
by advances in microelectronics, however, computer architecture has remained essentially
unchanged since the Hungarian physicist and mathematician John von Neumann proposed the
eponymous architecture based on stored programs and inspired by British mathematician Alan
Turing, who laid the logico-mathematical foundations of computation and modern computer
science.
A typical digital computer system has four basic elements: input/output ports, main memory,
control unit and arithmetic-logic unit (ALU). The limitations of that conventional computing
structure in tackling the most difficult tasks, current supercomputers notwithstanding, are
pushing researchers toward quantum computer development. Together with artificial intelligence,
quantum computing represents one of the main technical and scientific challenges of the near
future.
The first mechanical calculating machines were invented by Blaise Pascal of France and
Gottfried Wilhelm Leibniz of Germany in the 17th century, but American scientist John V.
Atanasoff is credited with building the first electronic digital computer, which he constructed
between 1939 and 1942 with the assistance of his graduate student Clifford Berry. In 1946, J. Presper Eckert and
John W. Mauchly, both of the University of Pennsylvania, built the ENIAC (electronic numerical
integrator and computer), which was derived from Atanasoff’s machine; both computers used
vacuum tubes in place of relays as active logic blocks, a characteristic that enabled a significant
increase in processing speed. The digital computer roadmap is dotted with other important
innovations, from the transistor to the integrated circuit and, finally, to microprocessors and
VLSI circuits in the 1980s. These improvements have supported the empirical Moore’s Law,
which predicts a doubling of a chip’s processing capability (or transistor density) roughly every
18 months.
In 1959, American physicist and Nobel laureate Richard Feynman, known for proposing a new
formulation of quantum electrodynamics (QED) that describes light-matter interaction, argued
that as electronic components approach microscopic dimensions, strange effects predicted by
quantum physics appear. Feynman believed those effects could be exploited in the design of
more powerful computers, and in the early 1980s he made the idea concrete by proposing that
only a computer built on quantum principles could efficiently simulate quantum systems. Such
arcane phenomena, occurring only at the scale of atoms and subatomic particles, are the basis
of quantum computing hardware.
Quantum computing harnesses the power of three critical quantum principles: superposition,
entanglement and interference. These concepts play a pivotal role in the capabilities of quantum
computers, which differ greatly from conventional computers.
Superposition: In the world of quantum mechanics, objects such as particles do not necessarily
possess clearly defined states, as demonstrated by the famous double-slit experiment. In this
configuration, a single photon of light passing through a screen with two small slits will produce
an interference pattern on a photosensitive screen akin to that generated by light waves; this can
be visualized as a superposition of all available paths. If a detector is used to determine which of
the two slits the photon has crossed, the interference pattern vanishes. The interpretation of this
strange outcome is that a quantum system “exists” in all possible states until a measurement,
introducing an unavoidable perturbation, collapses it into a single definite state; the loss of
quantum behavior through such interactions with the surroundings is called decoherence.
Reproducing this phenomenon in a computer promises to expand computational power
exponentially.
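
To see concretely how adding amplitudes differs from adding probabilities, the following sketch models the two cases numerically (plain Python with NumPy; the wavelength, slit separation, and screen distance are illustrative values, not data from any real experiment):

```python
import numpy as np

# Illustrative far-field model of the double slit: a photon reaching
# position x on the screen via slit 1 or slit 2 picks up a phase
# proportional to the path-length difference, roughly d * x / L.
wavelength = 500e-9   # 500 nm light (illustrative)
d = 10e-6             # slit separation
L = 1.0               # slit-to-screen distance
x = np.linspace(-0.1, 0.1, 1001)  # positions on the screen

k = 2 * np.pi / wavelength
phase = k * d * x / L             # relative phase between the two paths

amp1 = np.exp(1j * 0)             # amplitude via slit 1 (reference)
amp2 = np.exp(1j * phase)         # amplitude via slit 2

# No which-path information: add the amplitudes, then square -> fringes.
intensity_quantum = np.abs(amp1 + amp2) ** 2

# Which-path detector present: add the probabilities -> fringes vanish.
intensity_classical = np.abs(amp1) ** 2 + np.abs(amp2) ** 2

print(intensity_quantum.min(), intensity_quantum.max())      # ~0 ... ~4
print(intensity_classical.min(), intensity_classical.max())  # flat 2 everywhere
```

Summing the complex amplitudes reproduces the fringe pattern; summing the intensities, as happens once a detector reveals the photon’s path, washes the fringes out completely.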
A traditional digital computer employs binary digits, or bits, that can be in one of two states,
represented as 0 and 1; thus, for example, a 4-bit computer register can hold any one of 16 (2⁴)
possible numbers. In contrast, a quantum bit (qubit) exists in a wavelike superposition of 0 and 1
values; thus, for example, a 4-qubit computer register can handle 16 different numbers
simultaneously. In theory, a quantum computer can therefore operate on a much larger number of
values in parallel, so that a 30-qubit quantum computer would be comparable to a digital
computer capable of performing 10 trillion (10 × 10¹²) floating-point operations per second
(10 TFLOPS), a speed matched only by a very fast digital supercomputer.
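
The bookkeeping difference can be made explicit with a short sketch (plain Python/NumPy; this models only the state description, not real quantum hardware): a classical 4-bit register is one integer, while a 4-qubit register is a normalized vector of 2⁴ = 16 complex amplitudes.

```python
import numpy as np

# A classical 4-bit register: exactly one of 16 values at a time.
classical_register = 0b1011  # the single number 11

# A 4-qubit register: a normalized vector of 2**4 = 16 complex
# amplitudes, one per basis state |0000> ... |1111> (big-endian labels).
n_qubits = 4
state = np.zeros(2 ** n_qubits, dtype=complex)
state[0b1011] = 1.0          # start in the definite state |1011>

# A uniform superposition assigns equal weight 1/sqrt(16) to all 16
# basis states; a single gate sequence then acts on all of them at once.
uniform = np.full(2 ** n_qubits, 1 / np.sqrt(2 ** n_qubits), dtype=complex)
print(np.sum(np.abs(uniform) ** 2))  # 1.0 -- normalization check
```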
Today’s fastest digital computer is the Frontier. Installed at the Department of Energy’s (DOE’s)
Oak Ridge National Laboratory in Tennessee, it reached a speed of 1.1 exaFLOPS (1.1 × 10¹⁸
FLOPS). Supercomputers built on classical architecture are very complex and heavy machines,
requiring an incredibly high number of parallel components and processors. The Frontier
contains 9,408 CPUs, 37,632 GPUs and 8,730,112 cores, all linked together by 145 kilometers of
cables. The supercomputer stretches over an area of 372 square meters (4,004 sq ft) and
consumes 21 MW of power, reaching 40 MW at peak.
Entanglement: Whereas superposition is the ability of a qubit to exist in multiple states
simultaneously, i.e., in a state of 0, 1 or any combination of both, entanglement is the quantum
phenomenon in which two or more qubits become correlated. In other words, the state of one
qubit cannot be described independently of the state of its companion(s). Measuring one
entangled qubit instantly determines the correlated outcome for its partner, however distant the
two may be, although this correlation cannot be exploited to transmit information faster than
light. Einstein dubbed this phenomenon “spooky action at a distance” to underscore his
aversion to the nondeterministic and non-local nature of quantum mechanics. Entanglement is
the backbone of many quantum algorithms, leading to faster and more efficient problem-solving.
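
A minimal numerical illustration of these correlations, assuming the standard Bell state (|00> + |11>)/√2 and simple Born-rule sampling (a sketch, not production simulator code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>)/sqrt(2): amplitudes over |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2            # [0.5, 0, 0, 0.5]

# Simulate measuring both qubits repeatedly.
outcomes = rng.choice(4, size=10, p=probs)
for o in outcomes:
    q0, q1 = (o >> 1) & 1, o & 1     # split the index into two bits
    print(q0, q1)                    # always 0 0 or 1 1, never 0 1 / 1 0
```

Each qubit on its own reads 0 or 1 completely at random, yet the pair always agrees; this is exactly why entanglement yields correlation without usable faster-than-light signaling.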
Interference occurs when two or more quantum states are combined to create a new state,
resulting in either constructive or destructive interference. Constructive interference amplifies
the probability of obtaining the correct output, while destructive interference reduces the
probability of incorrect outputs. By manipulating interference patterns, quantum computers can
quickly parse potential solutions, converging on the correct answer much faster than classical
computers.
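
The cancellation can be seen with two applications of the Hadamard gate (introduced below): the first creates an equal superposition, and the second brings the qubit back to |0> because the two paths leading to |1> carry opposite amplitudes. A minimal NumPy sketch:

```python
import numpy as np

# Hadamard gate: sends |0> to (|0>+|1>)/sqrt(2) and |1> to (|0>-|1>)/sqrt(2).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

ket0 = np.array([1, 0], dtype=complex)

after_one = H @ ket0          # equal superposition
after_two = H @ after_one     # back to |0>

# The |1> amplitude after two gates is (+1/2) + (-1/2): those two
# computational paths interfere destructively and cancel exactly,
# while the |0> paths add constructively.
print(np.round(after_one, 3))  # [0.707+0.j 0.707+0.j]
print(np.round(after_two, 3))  # [1.+0.j 0.+0.j]
```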
But how can a qubit be built?
Consider a single electron and its intrinsic angular momentum, known as spin. Being quantized,
the spin measured along any axis is either up or down. By defining 0 as the spin-up state and 1
as the spin-down state, an electron can be used as a qubit. It is useful here to adopt the bra-ket
notation introduced by the theoretical physicist Paul Dirac: quantum states are represented by
“kets,” which are essentially column vectors, so the two states can be written as |0> and |1>. The
spin therefore plays the same role for a qubit that a transistor does for a bit in standard Boolean
logic.
The superposition principle states that, unlike a classical bit, a qubit can be represented by a
superposition of both 0 and 1 simultaneously. In mathematical notation, if |ψ> identifies the state
of a qubit, this can be expressed as:
|ψ> = W₀|0> + W₁|1>
where W₀ and W₁ are two numbers representing the relative weight of |0> and |1> in the
superposition. More formally, these numbers are the complex probability amplitudes of the qubit
and determine the probability of getting a 0 or a 1 when measuring the state of the qubit. Of
course, they must obey the normalization condition |W₀|² + |W₁|² = 1. When W₀ = 1 and W₁ = 0,
the qubit is in its |0> state, corresponding to the off state of a transistor. If W₀ = 0 and W₁ = 1,
the qubit state corresponds to the on state of the transistor. For any other values of W₀ and W₁, it
is as if the transistor, in classical terms, were neither “off” nor “on” but simultaneously both “on”
and “off,” much as the cat can be both dead and alive in the famous thought experiment
conceived by Erwin Schrödinger, one of the founders of quantum mechanics.
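
A short sketch of this state description and the Born rule (plain NumPy; the particular weights W₀ = 0.6 and W₁ = 0.8i are arbitrary illustrative choices satisfying the normalization condition):

```python
import numpy as np

rng = np.random.default_rng(1)

# An arbitrary qubit state |psi> = W0|0> + W1|1>; the weights below are
# illustrative complex amplitudes satisfying |W0|^2 + |W1|^2 = 1.
W0 = np.sqrt(0.36)            # probability 0.36 of reading 0
W1 = np.sqrt(0.64) * 1j       # a relative phase does not change the odds
psi = np.array([W0, W1])

assert np.isclose(np.abs(W0) ** 2 + np.abs(W1) ** 2, 1.0)

# Born rule: a measurement collapses |psi> to 0 or 1 with these probabilities.
probs = np.abs(psi) ** 2
samples = rng.choice([0, 1], size=10000, p=probs)
print(samples.mean())         # close to 0.64, the probability of reading 1
```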
Superposition lets one qubit perform two calculations at once, and if two qubits are linked
through entanglement, they can help perform 2², or four, calculations simultaneously:
|ψ> = W₀₀|00> + W₀₁|01> + W₁₀|10> + W₁₁|11>
Three qubits can handle 2³, or eight, calculations in parallel, and so on.
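
Numerically, the joint state of independent qubits is the Kronecker (tensor) product of the single-qubit states, which is why the amplitude count doubles with each added qubit. A minimal sketch in NumPy:

```python
import numpy as np

# A single qubit in equal superposition of |0> and |1>.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

# The joint state of independent qubits is their Kronecker product:
# 2 qubits -> 4 amplitudes (W00, W01, W10, W11), 3 qubits -> 8, etc.
two_qubits = np.kron(plus, plus)
three_qubits = np.kron(two_qubits, plus)

print(two_qubits)         # four equal amplitudes of 1/2
print(three_qubits.size)  # 8 = 2**3 amplitudes
```

An entangled state such as the Bell state cannot be factored into such a product of single-qubit vectors, which is the mathematical content of the statement that entangled qubits cannot be described independently.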
In principle, a quantum computer with 300 qubits could track more states simultaneously (2³⁰⁰,
roughly 10⁹⁰) than there are atoms in the visible universe. A quantum computer with this many
qubits already exists. It is IBM’s 433-qubit Osprey, which hosts the most powerful quantum
processor to date and is accessible as an exploratory technical demonstration on IBM Cloud.
Once data have been encoded in qubits, it is necessary to manipulate the states of the
qubits. In a digital computer, this is done by means of basic operations performed by logic gates
such as AND, NAND and NOR. The corresponding operations in quantum computers are
implemented by quantum gates, which can be classified depending on the number of qubits
involved. As opposed to classical gates, quantum gates can create and manipulate entanglement
and superposition, which are essential for the increased computational power of quantum
computers.
Some of the quantum gates performing operations on qubits through a set of quantum logic
operations are Pauli-X, Pauli-Y, Pauli-Z, Hadamard and CNOT (controlled NOT). Pauli-X, for
instance, is the quantum analog of the classical NOT gate. The Hadamard gate transforms a
single qubit into a perfectly balanced superposition of the |0> and |1> states, such that a
measurement of a qubit that has been “transformed” by this gate will produce either |0> or
|1> with equal probability: W₀ = W₁ = 1/√2. Indeed, (1/√2)² + (1/√2)² = 1.
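
These gates have standard matrix representations, and their action can be checked directly (a NumPy sketch; the big-endian qubit ordering and the control-on-first-qubit CNOT convention are choices made here for illustration):

```python
import numpy as np

# Standard single-qubit gate matrices.
X = np.array([[0, 1],
              [1, 0]], dtype=complex)                 # Pauli-X: quantum NOT
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard

# Two-qubit CNOT: flips the target qubit when the control qubit is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

ket0 = np.array([1, 0], dtype=complex)

print(X @ ket0)                  # |1>: the NOT of |0>
print(np.round(H @ ket0, 3))     # [0.707, 0.707]: balanced superposition

# Hadamard on the first qubit, then CNOT, turns |00> into the
# entangled Bell state (|00> + |11>)/sqrt(2).
ket00 = np.kron(ket0, ket0)
I = np.eye(2, dtype=complex)
bell = CNOT @ np.kron(H, I) @ ket00
print(np.round(bell, 3))         # [0.707, 0, 0, 0.707]
```

The last three lines show how a Hadamard followed by a CNOT turns the unentangled |00> into the Bell state, tying the gate set back to the entanglement discussed earlier.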

Quantum processing unit


The core component of quantum computing hardware, a quantum processing unit (QPU),
executes quantum algorithms by processing qubits through a series of quantum gates. Whereas
conventional processors like CPUs, GPUs and DPUs (data processing units extensively used in
data centers) exploit principles of classical physics, QPUs handle qubits, enabling quantum
computers to perform certain classes of calculations exponentially faster than their classical
counterparts.
QPUs can vary in their underlying technology, such as nuclear magnetic resonance, trapped ions,
superconducting qubits and photonic chips, with each approach offering unique advantages and
challenges. Because of the different implementations and architectures, it is not straightforward
to compare QPUs by just looking at the number of qubits they handle.
Quantum computing is emerging as potentially one of the most transformative technologies in
the world, but the constraints are stringent. A quantum computer must maintain the coherence of
its qubits, and the entanglement among them, long enough to run a complete algorithm. Because
interactions with the environment are nearly inevitable, decoherence is a constant threat; therefore,
robust methods of detecting and correcting errors need to be worked out. Finally, because the act
of measuring a quantum system disturbs its state, reliable methods of extracting information
must be devised.
It is certain that we will enjoy another revolution in computational science. Many of today’s
intractable problems could be addressed with the new machines. At the same time, the
impressive capabilities of quantum computing have started a fierce competition among
companies to achieve “quantum supremacy.”
