
Proc. of the International Conference on Electrical, Computer and Energy Technologies (ICECET)
9-10 December 2021, Cape Town, South Africa
DOI: 10.1109/ICECET52533.2021.9698509

Design and Implementation of Efficient Quantum Support Vector Machine

Mhlambululi Mafu
Department of Physics and Astronomy
Botswana International University of Science and Technology
Palapye, Botswana
mafum@biust.ac.bw

Makhamisa Senekane
Institute for Intelligent Systems
University of Johannesburg
Johannesburg, South Africa
smakhamisa@uj.ac.za

Abstract—Machine Learning (ML) is arguably the most advanced sub-field of Artificial Intelligence (AI). It concerns the study of computer algorithms that can improve automatically by learning from data (experience) without being explicitly pre-programmed. On the other hand, the quantum version of ML, Quantum Machine Learning (QML), forms one of the most crucial and recent quantum computing applications. It uses quantum mechanical principles such as superposition, interference, and tunneling to enable quantum computers to learn from data. Some examples of QML algorithms include Quantum Principal Component Analysis (QPCA) and Quantum Support Vector Machine (QSVM). The QSVM is a robust supervised machine learning algorithm used for classification and regression. Therefore, this paper discusses the design and implementation of an efficient QSVM model. The model's efficiency is achieved by applying Principal Component Analysis (PCA) to the classical data and then loading the reduced data into a quantum computer. Experimental results demonstrate the significance and application of the QSVM model reported in this paper.

Index Terms—Artificial Intelligence, Machine Learning, Quantum Support Vector Machine, quantum computing

(This work is supported by the Institute for Intelligent Systems, University of Johannesburg; the National Institute for Theoretical and Computational Sciences; and the National University of Lesotho.)

I. INTRODUCTION

Machine Learning allows computing machines to learn from data through experience [1, 2, 3]. Machine Learning is concerned with designing computational agents (software tools) that can make accurate decisions on their own without being explicitly pre-programmed. It is an area of research at the intersection of statistics, artificial intelligence and computer science, and, more recently, engineering, as researchers translate theoretical models into products for use and commercialization. Notably, it has become one of the key drivers of digital transformation, and its applications have become seamless in the commercial sector and everyday life. For instance, it has found applications in business, finance, advertising, sports, transportation, manufacturing and the environment. In scientific research, machine learning has been applied to understanding stars, DNA sequence analysis and cancer treatment. The main goal of machine learning is to optimize the performance and functionality of entire sectors. There are broadly three categories of Machine Learning [4, 5]:

• Supervised machine learning deals with a labeled dataset {(x_i, y_i)}_{i=1}^N, where x_i is the input or feature vector and the label y_i is the ground truth from which the learning algorithm builds its knowledge. Among other things, a supervised learning algorithm's primary goal is to analyze the dataset and produce a model that, given a new input or feature vector x, predicts the corresponding value of y. Essentially, all data is labeled, and the algorithm predicts the output from the input data. Examples include linear regression, random forests, and support vector machines (a brief classical sketch is given after this list).
• Unsupervised machine learning deals with a dataset that is a collection of unlabeled vectors {x_i}_{i=1}^N. An unsupervised learning algorithm takes these input vectors and extracts useful properties of individual samples or of the dataset's overall distribution. In other words, all data is unlabeled, and the algorithm learns the inherent structure of the data. Examples include clustering and principal component analysis.
• Reinforcement machine learning assumes the machine "lives" in an environment and can probe the state of that environment as a feature vector. The machine can perform various actions in different states, resulting in different rewards and punishments. The goal of a reinforcement learning algorithm is to learn a strategy through trial and error, using feedback from its actions and experiences. Notably, this category finds use where long-term objectives and critical decision-making are involved.
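To make the supervised-learning setting concrete, here is a minimal classical sketch (not from the paper) that fits a support vector machine on a labeled dataset with scikit-learn; the quantum counterpart of this workflow is developed in Section III.

# Minimal illustration (not from the paper): supervised learning on a labeled
# dataset {(x_i, y_i)} using a classical support vector machine.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)          # feature vectors and labels
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = SVC(kernel="rbf").fit(X_train, y_train)      # learn from labeled examples
print("test accuracy:", model.score(X_test, y_test)) # predict y for unseen x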



As already mentioned, the quantum counterpart of conventional ML is an emerging field called QML [6, 7, 8]. Like conventional ML, its goal is to develop quantum algorithms and optimize their performance so as to achieve the required results in today's industries and applications. Notably, QML is based on the theory of quantum physics and combines conventional ML and quantum mechanics. QML uses quantum mechanical principles like superposition, interference, and entanglement to enable computational agents to learn from data in a manner that offers some advantage over conventional ML [9]. More specifically, it aims to generate improved statistical interpretations and accuracy by manipulating quantum systems to process information, and to gain computational speed by using quantum systems to process information. Because experiments in quantum physics generate ever-increasing amounts of data, QML would be an invaluable resource to perform analysis, make predictions, and, where possible, control the experiment. Moreover, it allows one to perform powerful data visualization techniques. Thus, it will be critical for theoreticians and experimentalists to better understand the structure of complex manifolds or to predict theoretical models [4].

To date, various QML schemes have been proposed [6, 7, 8, 10]. Examples of QML schemes include:
• Quantum Support Vector Machine (QSVM),
• Variational Quantum Eigensolver (VQE),
• Quantum Boltzmann Machine (QBM), and
• Quantum Generative Adversarial Networks (QGAN).

The QSVM algorithm finds applications in classification problems that require a feature map for which computing the kernel (i.e., the collection of inner products) is classically inefficient [11]. The QSVM is an example of supervised learning, consisting of a training phase and a test or classification phase. The Variational Quantum Eigensolver (VQE) is a hybrid quantum-classical algorithm that can be used to find eigenvalues of an (often large) matrix H [12, 13]. Variational quantum eigensolvers are demonstrating promise for solving hard problems in quantum chemistry and for performing data-classification tasks [14, 15]. Quantum Boltzmann machines are neural networks used for generative machine learning [16]. Finally, Quantum Generative Adversarial Networks (QGANs) are the quantum counterpart of Generative Adversarial Networks (GANs), which have become the most widely adopted semi-supervised and unsupervised ML methods applied in high-definition image, video, and audio generation [17].
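As an illustration of such a quantum kernel (not part of the original paper; a minimal sketch assuming a recent qiskit and qiskit-machine-learning installation, which may differ from the versions the authors used), the snippet below evaluates the kernel matrix induced by a ZZ feature map for a few two-dimensional samples.

# Minimal sketch (assumed environment: qiskit + qiskit-machine-learning >= 0.5).
# It evaluates the quantum kernel K_ij = |<phi(x_i)|phi(x_j)>|^2 induced by a
# ZZ feature map, which is the quantity a QSVM feeds to a classical SVM solver.
import numpy as np
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel

X = np.array([[0.1, 0.9], [0.8, 0.2], [0.4, 0.5]])   # toy 2-dimensional samples

feature_map = ZZFeatureMap(feature_dimension=2, reps=2)  # encodes x into a circuit
kernel = FidelityQuantumKernel(feature_map=feature_map)  # overlap-based kernel

K = kernel.evaluate(x_vec=X)    # 3x3 Gram matrix of pairwise state overlaps
print(np.round(K, 3))           # diagonal entries are 1 (a state overlaps itself)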
The fusion of machine learning and quantum mechanics can drive new quantum technologies and algorithms, and it opens novel ways to speed up computations, improve existing models, and devise new quantum learning schemes. This fusion can be realized using four approaches [7]. One of these approaches involves processing a classical dataset using a quantum computer. The challenge this approach poses is that it requires an appropriate quantum-classical interface in order to efficiently load a classical dataset onto a quantum computer [7, 18]. Fortunately, this challenge was addressed in Ref. [18], where an efficient data loading (quantum state preparation) scheme based on a divide-and-conquer algorithm was presented. Therefore, in this paper, we propose and implement an efficient QSVM algorithm. The QSVM proposed in this paper follows the classical-dataset-on-a-quantum-computer approach. Furthermore, it consists of the following stages:

a) Principal Component Analysis, in order to improve computational efficiency, and
b) Quantum Support Vector Machine.

Apart from this introduction, the rest of the paper is structured as follows. The next section covers the theory of quantum computing and Quantum Machine Learning. Section III provides a detailed explanation of the efficient QSVM scheme proposed in this paper. Results are provided and discussed in Section IV. Finally, Section V concludes the paper.

II. THEORY

As already mentioned, this section provides background information on quantum computing and QML. It is divided into two subsections; the first covers quantum computing, while the second covers Quantum Machine Learning.

A. Quantum Computing

Quantum computing uses quantum mechanical principles such as superposition, interference, entanglement and tunneling to perform computation in a manner that can outperform conventional computing [10, 19, 20]. The devices that perform quantum computations are called quantum computers. Notably, quantum computers can solve certain computational problems significantly faster than conventional computers. In contrast to a conventional computer, which uses a binary digit (bit) as its fundamental unit of information, a quantum computer uses a quantum bit (qubit). The principle of superposition allows computational speed-ups through the simultaneous manipulation of superposed quantum states. Moreover, quantum states can interact with each other through entanglement, regardless of how far apart they are. Thus, the proven advantages of using quantum computers for optimization problems suggest that these systems may also solve classification problems by improving artificial intelligence models.

A qubit can be represented mathematically as a superposition of two states as follows [6, 19]:

|ψ⟩ = α|0⟩ + β|1⟩,   (1)

where α and β are probability amplitudes that satisfy the normalization condition

|α|² + |β|² = 1.   (2)

Geometrically, a qubit is represented as a point on the surface of a Bloch sphere [19, 20, 21], as shown in Fig. 1. On a Bloch sphere, orthogonal quantum states are antipodal; that is, they are diametrically opposite each other.

[Fig. 1. A qubit representation on a Bloch sphere.]

Quantum computing can be implemented using various models of computation. One notable model is the quantum circuit model, which is analogous to the conventional model of computing. This model performs computation through the use of quantum gates [19]. A collection of quantum gates, in turn, forms a quantum circuit, which can be used to implement quantum algorithms such as QML (see the short sketch below).
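The following minimal Qiskit sketch (added for illustration; it is not from the original paper) builds such circuits: it prepares the equal superposition of Eq. (1) with α = β = 1/√2 and verifies Eq. (2), then entangles two qubits with a Hadamard and a CNOT gate.

# Minimal sketch (assumed: a standard Qiskit installation). Demonstrates
# superposition (H gate), entanglement (CNOT gate), and the normalization
# condition |alpha|^2 + |beta|^2 = 1 of Eq. (2).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Single qubit in superposition: |psi> = (|0> + |1>)/sqrt(2)
single = QuantumCircuit(1)
single.h(0)
alpha, beta = Statevector(single).data      # the two probability amplitudes
print(abs(alpha)**2 + abs(beta)**2)         # -> 1.0, i.e. Eq. (2)

# Two qubits: H followed by CNOT produces the entangled Bell state
bell = QuantumCircuit(2)
bell.h(0)
bell.cx(0, 1)
print(Statevector(bell))                    # (|00> + |11>)/sqrt(2)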

B. Quantum Machine Learning

As already stated, QML fuses the fields of ML and quantum computing. The implementation of ML, both conventional and quantum, can be realized using four approaches [7, 8]. These approaches are:

• using a classical dataset on a conventional computer,
• using a classical dataset on a quantum computer,
• using a quantum dataset on a classical computer, and
• using a quantum dataset on a quantum computer.

The efficient QSVM algorithm proposed in this paper is based on using a classical dataset on a quantum computer (a minimal data-loading illustration is given below). The design and implementation of this algorithm are covered in the next section.
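As a minimal illustration of this approach (not the divide-and-conquer loader of Ref. [18], and not code from the paper), the sketch below amplitude-encodes a single classical feature vector into a two-qubit state using Qiskit's generic initialize routine.

# Minimal sketch (assumed: standard Qiskit). Illustrates the "classical dataset
# on a quantum computer" approach by amplitude-encoding one classical feature
# vector into two qubits. Uses the generic initialize routine, not the
# divide-and-conquer loader of Ref. [18].
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

x = np.array([0.2, 0.5, 0.1, 0.8])       # a 4-dimensional classical feature vector
amplitudes = x / np.linalg.norm(x)        # amplitudes must satisfy sum |a_i|^2 = 1

qc = QuantumCircuit(2)                    # 2 qubits hold 2^2 = 4 amplitudes
qc.initialize(amplitudes, [0, 1])         # prepare |x> = sum_i a_i |i>

print(Statevector(qc))                    # the encoded state mirrors the input vector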
III. AN EFFICIENT QSVM

The algorithm proposed in this paper is implemented in the Python programming language, using the following two Python packages:

• scikit-learn (sklearn), for the Principal Component Analysis, and
• IBM's Qiskit, for the implementation of the Quantum Support Vector Machine.

The proposed algorithm can be summarized as follows. First, the Breast Cancer dataset is loaded from sklearn.datasets. This is followed by applying Principal Component Analysis, which reduces the dimensionality of the dataset from 30 dimensions to 2. It is worth noting that this step is responsible for the efficiency of the QSVM algorithm presented in this paper. The next step involves loading the resulting 2-dimensional dataset into a quantum computer. This, in turn, is followed by the implementation of the QSVM algorithm using Qiskit. Finally, the accuracy values for different training and test sizes are recorded. A sketch of such a pipeline is given below.
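The following sketch shows how such a pipeline might look; it is not the authors' code, and it assumes a recent qiskit-machine-learning release (with QSVC, FidelityQuantumKernel and ZZFeatureMap), whereas the original 2021 implementation may have used the older Qiskit Aqua QSVM interface. The feature map, scaling and random seed are illustrative choices.

# Minimal sketch of the pipeline described above (not the authors' exact code).
# Assumes scikit-learn and qiskit-machine-learning >= 0.5.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel
from qiskit_machine_learning.algorithms import QSVC

# 1. Load the 30-feature Breast Cancer dataset.
X, y = load_breast_cancer(return_X_y=True)

# 2. Reduce to 2 principal components (the efficiency step), then rescale the
#    components into the angular range expected by the feature map.
X2 = PCA(n_components=2).fit_transform(X)
X2 = MinMaxScaler(feature_range=(0, np.pi)).fit_transform(X2)

# 3. Use small training/test sizes, as in Table I (here: 25 train, 5 test).
X_train, X_test, y_train, y_test = train_test_split(
    X2, y, train_size=25, test_size=5, random_state=42)

# 4. QSVM: a classical SVM driven by a quantum kernel built from a ZZ feature map.
feature_map = ZZFeatureMap(feature_dimension=2, reps=2)
qsvc = QSVC(quantum_kernel=FidelityQuantumKernel(feature_map=feature_map))
qsvc.fit(X_train, y_train)

# 5. Record the accuracy for this training/test split.
print("accuracy:", qsvc.score(X_test, y_test))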
IV. RESULTS AND DISCUSSION

The accuracy results for various sizes of the training and test data are provided in Table I.

TABLE I
PREDICTION ACCURACY OF THE QSVM FOR VARIOUS TRAINING AND TEST SIZES.

Training Size   Test Size   Accuracy (%)
100             20          50
50              10          60
25              5           70
20              4           62

As can be observed from Table I, the reasonably high accuracy values justify the utility of this efficient QSVM algorithm. Lastly, it can be observed from Table I that the best results are obtained when the training size is 25 and the test size is 5. Therefore, the optimal training size for this algorithm is 25, while the optimal test size is 5.

V. CONCLUSION

In this paper, we have reported the implementation of an efficient QSVM. This efficiency is achieved by using PCA on the classical dataset, loading the reduced data into the quantum computer, and performing the QSVM on a quantum computer. The accuracy results obtained from this work underline the utility of the efficient QSVM algorithm proposed in this paper. Future work will focus on incorporating the efficient data loading scheme proposed in Ref. [18] to further improve the efficiency of the QSVM.

ACKNOWLEDGMENT

Makhamisa Senekane thanks the University of Johannesburg, through the Institute for Intelligent Systems, for the support provided during the research work reported in this paper. This work is funded by the National Institute for Theoretical and Computational Sciences, South Africa.

REFERENCES

[1] D. L. Poole and A. K. Mackworth, Artificial Intelligence: Foundations of Computational Agents, 2nd ed. Cambridge University Press, 2017.
[2] M. Mohri, A. Rostamizadeh, and A. Talwalkar, Foundations of Machine Learning, 2nd ed. MIT Press, 2018.
[3] S. Russell and P. Norvig, Artificial Intelligence: A Modern Approach, 4th ed. Pearson, 2021.
[4] S. Marsland, Machine Learning: An Algorithmic Perspective, 2nd ed. CRC Press, 2015.
[5] R. S. Sutton and A. G. Barto, Reinforcement Learning: An Introduction, 2nd ed. MIT Press, 2018.
[6] P. Wittek, Quantum Machine Learning: What Quantum Computing Means to Data Mining. Academic Press, 2014.
[7] M. Schuld and F. Petruccione, Supervised Learning with Quantum Computers. Springer, 2018.
[8] M. Senekane, M. Maseli, and M. B. Taele, "Noisy, intermediate-scale quantum computing and industrial revolution 4.0," in The Disruptive Fourth Industrial Revolution. Springer, 2020, pp. 205–225.
[9] L. Buffoni and F. Caruso, "New trends in quantum machine learning (a)," EPL (Europhysics Letters), vol. 132, no. 6, p. 60004, 2021.

[10] A. Asfaw, A. Corcoles, L. Bello, Y. Ben-Haim,
M. Bozzo-Rey, S. Bravyi, N. Bronn, L. Capelluto, A. C.
Vazquez, J. Ceroni, R. Chen, A. Frisch, J. Gambetta,
S. Garion, L. Gil, S. D. L. P. Gonzalez, F. Harkins,
T. Imamichi, H. Kang, A. H. Karamlou, R. Loredo,
D. McKay, A. Mezzacapo, Z. Minev, R. Movassagh,
G. Nannicini, P. Nation, A. Phan, M. Pistoia, A. Rattew,
J. Schaefer, J. Shabani, J. Smolin, J. Stenger, K. Temme,
M. Tod, S. Wood, and J. Wootton. (2020) Learn
quantum computation using qiskit. [Online]. Available:
http://community.qiskit.org/textbook
[11] P. Rebentrost, M. Mohseni, and S. Lloyd, “Quantum sup-
port vector machine for big data classification,” Physical
review letters, vol. 113, no. 13, p. 130503, 2014.
[12] J. R. McClean, J. Romero, R. Babbush, and A. Aspuru-
Guzik, “The theory of variational hybrid quantum-
classical algorithms,” New Journal of Physics, vol. 18,
no. 2, p. 023023, 2016.
[13] A. Peruzzo, J. McClean, P. Shadbolt, M.-H. Yung, X.-Q.
Zhou, P. J. Love, A. Aspuru-Guzik, and J. L. O'Brien,
“A variational eigenvalue solver on a photonic quantum
processor,” Nature communications, vol. 5, no. 1, pp. 1–
7, 2014.
[14] Y. Cao, J. Romero, J. P. Olson, M. Degroote, P. D.
Johnson, M. Kieferová, I. D. Kivlichan, T. Menke,
B. Peropadre, N. P. Sawaya et al., “Quantum chemistry
in the age of quantum computing,” Chemical reviews,
vol. 119, no. 19, pp. 10856–10915, 2019.
[15] V. Havlíček, A. D. Córcoles, K. Temme, A. W. Harrow,
A. Kandala, J. M. Chow, and J. M. Gambetta, “Super-
vised learning with quantum-enhanced feature spaces,”
Nature, vol. 567, no. 7747, pp. 209–212, 2019.
[16] M. H. Amin, E. Andriyash, J. Rolfe, B. Kulchytskyy,
and R. Melko, “Quantum boltzmann machine,” Physical
Review X, vol. 8, no. 2, p. 021050, 2018.
[17] M. Y. Niu, A. Zlokapa, M. Broughton, S. Boixo,
M. Mohseni, V. Smelyanskiy, and H. Neven, "Entangling
quantum generative adversarial networks,” arXiv preprint
arXiv:2105.00080, 2021.
[18] I. F. Araujo, D. K. Park, F. Petruccione, and A. J.
da Silva, “A divide-and-conquer algorithm for quantum
state preparation,” Scientific Reports, vol. 11, no. 1, pp.
1–12, 2021.
[19] M. A. Nielsen and I. Chuang, Quantum computation and
quantum information, 10th ed. Cambridge University
Press, 2010.
[20] M. M. Wilde, Quantum information theory, 2nd ed.
Cambridge University Press, 2017.
[21] M. Mafu, “Security in quantum key distribution proto-
cols.” Ph.D. dissertation, 2013.

