
Quantum Machine Learning

Syed Bilal Hyder Shah


LUMS, School of Science and Engineering
(Dated: September 9, 2023)

ABSTRACT

The term machine learning was coined in 1959 by Arthur Samuel, an employee of IBM and a pioneer in computer gaming. The main idea behind machine learning is to make machines self-learn and improve with experience and data. In particular, machine learning comprises a set of methods that can automatically detect patterns in data. Those patterns are then used to predict future data or, even better, to make real-life decisions. Back in the mid 20th century this was not a big deal, as machines were slow, data was limited, and most of it was easy to handle manually. In contrast, machine learning is of much more value now that we have entered the era of big data. Internet users generate about 2.5 quintillion bytes of data each day, the total size of the internet stood at 40 zettabytes in 2020, and 500 hours of video are uploaded to YouTube every minute! [3] Handling this much data is not possible if we stick to the old methods of data management; there must be some way to automate the processing of all that data. For these reasons, machine learning has resurfaced and has become a topic of interest for many big firms and investors. What is more, at the rate at which data is increasing, it is expected that the classical machines we use today will not be able to keep up with data-handling tasks in the near future. This is where the power of quantum computers will be much needed.

INTRODUCTION

For this report, I assume no background in Machine Learning on the readers' side. Hence, the first part of the report is an introduction to Machine Learning fundamentals, the term classifier, and an overview of some commonly known classifiers. The second part deals with how quantum computers fit into the picture of machine learning and with what quantum classifiers are. Finally, I conclude the report with an outline of my work for the project, providing the readers with the final version of a quantum machine learning algorithm that runs on an actual quantum computer and performs an actual machine-learning task.

MACHINE LEARNING

Machines as human helpers

Machines have reshaped the lives of humans ever since the industrial revolution of the 18th and 19th centuries. Economic dominance changed hands from agrarian and handicraft production to industrial processes and machine manufacturing. Machines eased the burden of mechanical work on humans and took on the load of physically hard labor. Following the introduction of digital computers, machines have taken a lot of intellectual burden off humans as well: graphing software, calculators for complex computations, and much more have made our lives easier.

Making a machine train on data

One typical machine learning example is to give the computer some images of cats and dogs and to train it to recognize the difference between the two categories by providing it with the label associated with each image. After the training is complete, we give the computer new images and ask it to predict their labels. But how is all of this achieved? How does a computer see an image? How can it do something that we only associate with humans?
Classifier

Computers do not have eyes, or ears, or any other characteristics of living beings. All that (classical) computers can distinguish between are highs and lows in electrical signals, i.e. bits. To make a computer understand an image, we first need to convert the image to bits. Each image is made up of hundreds, thousands, or even millions of small colored regions known as pixels. Nowadays, computers use 16-bit colors, which means that encoding each pixel takes a sequence of 16 0s and 1s, for a total of 2^16 possible values per pixel. This adds up to an enormous amount of data once we consider the millions of pixels in a single image. To learn to distinguish between two labels, the computer takes the data from all of the pixels of all of the images and finds patterns in the sequences of pixels. Based on these patterns, the computer predicts the label of a new image. The algorithm or program that utilizes the patterns to predict the output label is known as a classifier. Two output labels are not a restriction on classifiers; there can be multi-class classifiers as well, e.g. a classifier that predicts the mood of a person from a picture, with options such as happy, sad, excited, neutral, and scared.
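To make this concrete, here is a minimal sketch of training and querying a classifier in Python with scikit-learn; the digits dataset stands in for the cats-and-dogs example, and the choice of a support vector classifier is my own illustration rather than anything prescribed above.

```python
# A minimal classical classifier: train on labeled digit images,
# then predict labels for images the model has never seen.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

digits = load_digits()              # 8x8 grey-scale digit images
X, y = digits.data, digits.target   # each image flattened to 64 pixel values

# Hold out some images whose labels the classifier must predict.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = SVC()                      # a support vector classifier (illustrative)
clf.fit(X_train, y_train)        # "training": find patterns in the pixels
print(clf.predict(X_test[:5]))   # predicted labels for unseen images
print(clf.score(X_test, y_test)) # fraction of correct predictions
```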
[Figure: training data with known labels is fed in to build a classifier]

[Figure: the trained classifier takes data without labels and outputs label predictions]
Why need Quantum Computers when it's all working so fine?

Classical computers do machine learning just fine unless the parameters to train on become very large in number. Parameters are features or properties of the object of interest based on which you categorize objects and assign them labels. The color of a pixel is one such property; some specific sequence of pixels might make up another parameter. Increasing the number of parameters often increases the accuracy of classification, but at the expense of training time, as the computer has to map more properties of the object to build the classifier. This becomes a problem for classical computers because they go over each point individually to get the information. Quantum computers, on the other hand, are good at such tasks because they harness the quantum mechanical properties of particles, i.e. superposition and entanglement. So, in quantum machine learning we let the quantum computer do the heavy work of creating the classifier, which we then run on a classical computer to make predictions.
[Figure: constructing a quantum classifier from training data with known labels]

Constructing a quantum classifier

To make a quantum classifier we need to encode classical data into a quantum state. We do this by passing the data through a parameterized quantum circuit, also known as a variational quantum circuit. This is a special kind of quantum circuit whose rotation parameters vary depending on the iteration and the data being encoded: for different data points, the parameters change accordingly to map each point into a quantum state. After this basic encoding of classical data into quantum states, the qubits are entangled with one another in the next step. In the end the qubits are measured, giving us the mapping of the data for a classifier. Many data points are passed through the circuit, and the measured data provides us with the raw data for making a classifier.

All of this work of encoding the data into a quantum state can be conveniently wrapped into the feature-mapping functions provided in qiskit.circuit.library. There are currently three versions of these feature maps available in this library (a short sketch of their use follows this list):

• ZFeatureMap: the simplest of the three; it uses only Hadamard and Pauli-Z gates and does no entanglement, so it provides no quantum advantage.

• ZZFeatureMap: uses two-qubit controlled Pauli-Z gates to do the required task and incorporates entanglement as well.

• PauliFeatureMap: does all the work of ZZFeatureMap and makes use of all the Pauli gates.

ZZFeatureMap and PauliFeatureMap also provide two options for the kind of entanglement: linear or circular. After all the points have been mapped, they can be used to make a quantum classifier of the user's choice.
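The following is a minimal sketch of building these feature maps, assuming a Qiskit version that exposes them under qiskit.circuit.library as named above; the feature dimension and data values are illustrative only.

```python
# Sketch: the three data-encoding feature maps in qiskit.circuit.library.
from qiskit.circuit.library import ZFeatureMap, ZZFeatureMap, PauliFeatureMap

n = 5  # one qubit per feature of the (downscaled) data point

z_map = ZFeatureMap(feature_dimension=n, reps=2)         # no entanglement
zz_map = ZZFeatureMap(feature_dimension=n, reps=2,
                      entanglement="linear")             # entangling map
pauli_map = PauliFeatureMap(feature_dimension=n, reps=2,
                            paulis=["Z", "ZZ"],
                            entanglement="circular")

# Encoding one classical data point = binding it to the circuit parameters.
x = [0.1, 0.2, 0.3, 0.4, 0.5]
bound = zz_map.assign_parameters(x)
print(bound.decompose().draw())
```

Binding a different data point produces a different circuit, which is exactly the parameterized behavior described above.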
MY PROJECT

For my project, I chose to make a quantum classifier able to distinguish between handwritten 0s and 1s using just 64-pixel images, with each pixel encoded in 4-bit grey-scale. The data for the 0s and 1s was obtained from the digits dataset in sklearn.datasets.
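As a sketch of what this data looks like (load_digits is scikit-learn's actual loader; filtering down to the two classes is my illustration):

```python
# Sketch: load the 8x8 (64-pixel) digit images and keep only the 0s and 1s.
import numpy as np
from sklearn.datasets import load_digits

digits = load_digits()
mask = np.isin(digits.target, [0, 1])          # keep only the two classes
X, y = digits.data[mask], digits.target[mask]

print(X.shape)  # (n_samples, 64): one value per pixel
print(X.max())  # 16.0: intensities run 0-16, roughly 4-bit grey-scale
```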
The digits dataset contains data for all 10 numerical digits. I chose 200 samples of 0s and 1s at random from this dataset and, after some normalization, downscaled the data to 5 dimensions (the maximum number of qubits available for public use on IBM's quantum computers). After this I mapped the data using ZZFeatureMap and built a kernel matrix from it for the quantum support vector machine classifier; the support vector machine utilizes the kernel matrix for classification. To compute the kernel matrix, we calculate the transition amplitude of each point with every other point in the training data. The transition amplitude between two points is estimated by running the feature-map circuit for one point followed by the inverse of the circuit for the other and recording the proportion of all-zero counts in the measurement results. We repeat the process, up to the mapping step, for another 40 pictures of 0s and 1s chosen at random from the same dataset. Using the mapping of these 40 pictures, we build a testing kernel matrix by calculating transition amplitudes against the training data. These two matrices are then fed into the classical computer, which uses them to classify the test data with the labels 0 or 1. A sketch of this pipeline appears below.
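Here is a minimal end-to-end sketch of that pipeline, assuming the qiskit-machine-learning package's FidelityQuantumKernel (the report does not name the kernel class; older releases called it QuantumKernel and required an explicit backend). The seed and preprocessing details are my own illustrative choices.

```python
# Sketch of the pipeline: digits -> 5 features -> quantum kernel -> SVM.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel  # assumed class

# Load the digit images and keep only the 0s and 1s.
digits = load_digits()
mask = np.isin(digits.target, [0, 1])
X, y = digits.data[mask], digits.target[mask]

# Normalize, then downscale the 64 pixels to 5 dimensions (one per qubit).
X = MinMaxScaler().fit_transform(X)
X = PCA(n_components=5).fit_transform(X)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=200, test_size=40, random_state=0)

# Map each point with ZZFeatureMap and estimate the transition amplitudes.
feature_map = ZZFeatureMap(feature_dimension=5, entanglement="linear")
kernel = FidelityQuantumKernel(feature_map=feature_map)
K_train = kernel.evaluate(x_vec=X_train)               # train vs. train
K_test = kernel.evaluate(x_vec=X_test, y_vec=X_train)  # test vs. train

# The classical computer uses the two kernel matrices for classification.
svm = SVC(kernel="precomputed")
svm.fit(K_train, y_train)
print("test accuracy:", svm.score(K_test, y_test))
```

SVC(kernel="precomputed") is what lets the classical SVM consume the quantum-computed kernel matrices directly.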
To set a benchmark, I first ran the data on an entirely classical system: I built a classical SVM and used the same train and test data. The prediction accuracy of this classical algorithm was 80%. After this, I ran the quantum variant of the algorithm with 8000 shots on a quantum simulator, which gave an accuracy of 75%. Finally, I ran the model on IBM's ibm_manila, a 5-qubit quantum computer. This quantum computer gave an impressive accuracy score of 72.5%, despite all the noise in the quantum computers that are currently available for public use.

CONCLUSION

Quantum machine learning is still in its infancy, and there is much room for research in the field. Although the accuracy of our model was lower than the classical result, and this particular task was not a tough one for modern classical computers, the result still shines as a beacon of hope for quantum machine learning. Researchers are working tirelessly to increase the number of qubits on quantum computers, which means that in the near future we will be able to make quantum classifiers for more complex data: we will not need to downscale the dimensionality of the input parameters, resulting in better-trained models and improved accuracy.

REFERENCES

[1] Andriy Burkov, The Hundred-Page Machine Learning Book.

[2] Kevin P. Murphy, Machine Learning: A Probabilistic Perspective.

[3] "27+ big data statistics - how big it actually is in 2021?" TechJury (2021, December 6). Retrieved December 15, 2021, from https://techjury.net/blog/big-data-statistics/gref

[4] "57 fascinating and incredible YouTube statistics." Brandwatch (n.d.). Retrieved December 15, 2021, from https://www.brandwatch.com

[5] Encyclopædia Britannica, Inc. (n.d.). Industrial Revolution. https://www.britannica.com/event/Industrial-Revolution

[6] All illustrations used are from Freepik.com.
