
Fingerprint and Face Authentication Portable Digital Electronic Voting Machine

Brandon Sage Cagape
College of Engineering Education, Electronics Engineering Program
Davao City, Philippines
b.cagape.513587@umindanao.edu.ph

John Carlo Lisondato
College of Engineering Education, Electronics Engineering Program
Davao City, Philippines
j.lisondato.488253@umindanao.edu.ph

Princess Diane Maboloc
College of Engineering Education, Electronics Engineering Program
Davao City, Philippines
p.maboloc.499584@umindanao.edu.ph

Abstract— The paper presents the use of Raspberry Pi as the brain of an Electronic Voting Machine (EVM) to facilitate digital elections. The EVM is designed to perform various tasks, such as integrating fingerprint, face authentication, and anti-spoofing techniques that require high computational power. The Adafruit fingerprint sensor, with its built-in firmware, is used to read or store fingerprint templates. The face authentication and anti-spoofing techniques use models such as Local Binary Pattern Histograms and Haar-Cascade Classifier models to authenticate faces and eye-blink counters for anti-spoofing. Both authentications are combined and incorporated into the voting GUI, which is programmed through the Python language along with its libraries that create the EVM. The accuracy test resulted in an overall accuracy of 80% for fingerprint authentication and 95.56% for face authentication. The anti-spoofing technique resulted in an overall accuracy of 100%. For the overall performance of the EVM, used to conduct mock elections for ECE students in the University of Mindanao with 20 registered students, 80% of the students were successfully authenticated using fingerprint authentication. Those who failed the fingerprint authentication proceeded to face authentication, resulting in the authentication of all voters. The database results match the receipts of the voters, demonstrating the device's overall efficiency of 100%. In conclusion, the paper proposes an efficient and secure EVM that ensures fair and transparent digital elections.

Keywords—Electronic Voting Machine, Local Binary Pattern Histogram (LBPH), Haar-Cascade Classifier, Anti-Spoofing, Raspbian OS, FirebaseDB

I. INTRODUCTION

Voting is one of the fundamental rights that an individual possesses as an essence of democracy. It gives them the freedom to choose the candidates they think are the most suited for certain positions to represent the people and their interests. Among the different voting systems in the world, the Philippines has adopted the Paper Ballot system, wherein the voters shade the circle beside the candidate's name on the ballots provided to them, which are then fed into a machine that counts their votes. This is a good system to some extent; however, there are disadvantages. One is the possibility of ghost voting and the production and use of defective ballots, which is marked as wastage. Despite the number of faulty ballots being a small percentage of the overall, this percentage is still significant, especially in small-scale voting [1]. This wastage can be avoided by investing in Electronic Voting Machines (EVM). The other disadvantage of the Paper Ballot system is that it is susceptible to fraud. There was an issue in the recent Philippine Presidential Elections in which most voters were advised to voluntarily surrender their ballots to the poll watchers while the machine that was supposed to read their ballots was under repair. Even if there are no reported fraudulent activities, there is still a possibility that they will occur; therefore, it is advisable to switch to fully automated voting to prevent fraud [2]. The Paper Ballot system can also hinder the voting ability of the marginalized population, such as the illiterate, pregnant women, Persons with Disabilities (PWDs), and senior citizens, since the interpretation and validity of votes is left at the discretion of election officers. Electronic Voting Machines will prevent this, ensuring that their votes are correctly counted [3]. Aside from significantly reducing electoral costs, tackling fraudulent activities, and ensuring the participation of the marginalized population, using Electronic Voting Machines will also make vote counting quicker, since the results are declared within 2-3 hours compared to the paper ballot system, which takes an average of 30-40 hours of vote-counting time [4].

The study in [5] uses a PIC16F877A as the microcontroller in an Electronic Voting Machine, making the voting process faster, more reliable, and efficient. However, there is a problem with the fingerprint data since the database images have high resolution, requiring more memory to be allocated. There is also an Arduino-based EVM that is particularly used in school elections. This EVM is workable but not 100% efficient since it can only accommodate three (3) candidates and has limited memory [6]. A face recognition feature is commonly used in an Electronic Voting Machine to authenticate the user registered on the server. To avoid the case of twin voters, a two-fold authentication called biometric authentication is used [7]. In the biometric testing of the study [8], the parameters False Acceptance Rate (FAR) and False Reject Rate (FRR) are applied. The FAR is the percentage of unauthorized users that are unexpectedly accepted, whereas the FRR is the percentage of registered users that are rejected or unmatched. Based on the calculations, there is a FAR of 2%, an FRR of 10%, and an overall accuracy of 94%, indicating that the biometric is reliable for user authentication. The OpenCV library is integrated and then executed through the Python language and utilizes the Haar-Cascade and Local Binary Pattern Histogram models for face detection and recognition authentication [9]. An EVM having a face recognition feature that also makes use of Haar-Cascade has a 91% accuracy in recognizing its subjects [10]. One of the disadvantages of face recognition features is that it is prone to identity fraud because of facial spoofing. Facial spoofing is the act of using a person's face through a photo, a video, or a hyper-realistic mask to steal their identity [11]. This can be prevented by different anti-spoofing techniques, such as using sensors or dedicated hardware for detecting facial features, challenge-response methods that engage the user in real time, and recognition algorithms (specular feature projections, depth feature fusion, image quality assessment, deep learning) [12].



The study focuses on an Electronic Voting Machine (EVM) with two authentication methods for an authentic, transparent, and credible voting system. Specifically, the EVM features biometric authentication combined with face recognition authentication that will ensure the credibility of the voting system [14]. To completely authenticate the user, an anti-spoofing technique will be implemented together with face recognition to prevent identity theft through facial spoofing. The technique that will be used is the challenge-response technique, wherein the user is requested to interact with the system in a specific way, such as doing facial expressions or head movements [14]. The EVM is in the form of a portable standalone tablet kiosk for convenience in voting. A receipt is printed after voting for the voters to ensure that their votes were read correctly. The receipt will be put in a drop box, which the committee keeps for manual recounting in case a candidate files a petition for it [13].

This study aims to develop a standalone Electronic Voting Machine (EVM) with two authentication systems. To be more specific, this electronic voting machine should be able to recognize and authenticate the fingerprints and faces, supported by the anti-spoofing technique, of the registered users. This study also aims to perform an effective voting procedure and to accurately count and tally the votes from the registered users. Furthermore, the votes counted by the system should match the manual vote count from the receipts printed after the voting process.

This study would be of great importance as it will make voting more convenient with the aid of a device that will make the flow of the electoral process smoother. The Electronic Voting Machine (EVM) will be a standalone computer kiosk to make the voting process ideal and hassle-free for the voters. The monitor screen used for the voting process will have bigger fonts to make the on-screen instructions very clear and readable, especially for senior citizens and those with poor eyesight. With this, they will be able to vote with little to no effort. The EVM will use fingerprint and face authentication features to ensure the authenticity and credibility of the voting process. An anti-spoofing technique will be implemented together with the face recognition feature to prevent identity theft through facial spoofing. This will also prevent fraud, as the only one who can access a person's registered account is the voter themself. Moreover, since the voter will be the one to decide which fingerprint (left or right thumb, index, middle, ring, pinky) they are to register with, they will also be the only ones, aside from the proponents, who will know which one to use for their authentication. This will add to the security of the EVM system.

This study proposes to create an Electronic Voting Machine that uses two authentication methods, namely fingerprint and face recognition, to prevent fraud and the possibility of ghost voting. Producing an EVM will also remove the need to print out ballots, where there is unavoidable production of defective ones. The EVM will be set up in a kiosk style similar to enclosed Automated Teller Machine (ATM) booths, making it easy to navigate for first-time voters while giving them enough privacy when voting. The only difference is that the EVM kiosk will be lower and table-like compared to ATMs in order to cater to PWDs, especially those in a wheelchair. This will also make the voters, especially the senior citizens, more comfortable when voting. As a limitation, blind people are the only PWDs excluded from being catered to, because including them would require additional features that would cause additional expenses and complicated intricacies in the Electronic Voting Machine. Compared to the usual voting system of the Philippines, the use of indelible ink after voting will no longer be needed because, aside from the implementation being small-scale, the system can check whether an individual has already voted. Additionally, voter information and vote counts are saved on a cloud database in case there is a sudden and unprecedented data loss. A receipt will be printed after voting for transparency and for the voters to ensure that their votes are read correctly. The receipts generated will then be dropped in a drop box and kept by the committee for manual recounting, especially when there are anomalies or when a candidate requests it. This is also another way of preserving data in case of data corruption or loss. This study aims to implement small-scale voting, such as school body elections and vote-based contests, with the use of this Electronic Voting Machine (EVM). To be specific, the implementation will take place in our university, the University of Mindanao.

II. MATERIALS AND METHODS

A. Conceptual Framework
The study will use the Raspberry Pi as the main processing unit of the system, which runs the Raspbian operating system. The component is compatible with the fingerprint scanner through its USB port. Also, the Python language will be used for programming because of its compatibility with the libraries to be used. The Open Computer Vision Library, or OpenCV, is widely used to perform image processing and computer vision projects. Additionally, the library is utilized to build the Haar-Cascade model for face detection/anti-spoofing and the LBPH (Local Binary Pattern Histogram) model for face recognition [9-10]. Aside from this, the PyQt5 library is used to design the graphical user interface of the system.

Fig. 1 shows the conceptual framework of the study. The system will require the fingerprint or face acquisition to validate the voter's identity for authentication. This will be the system's input. The fingerprint acquired will be preprocessed to optimize the sample and extract significant features of the sample to find the matched fingerprint from the database [15-16]. If the system fails to authenticate the registered voter's fingerprint, it will proceed to face authentication. The system will first perform face detection, which will increase the accuracy of the face recognizer since it marks the voter's face before proceeding to recognition. After which, the face recognition algorithm will match the extracted voter's face against the database. Moreover, for the anti-spoofing procedure, the same face detection algorithm will be used. It detects the voter's eyes and then requires them to blink twice to prevent identity theft. There are two authentication methods, but the voter is only required to pass one of the biometric authentications before proceeding to the voting proper, wherein the voter will vote for their preferred candidates among the selection and finalize their votes. And this comprises the electronic voting machine.

Figure 1. Conceptual Framework
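The eye-blink requirement described above is implemented in the voting code (Appendix F) with dlib's 68-point facial landmark model and the eye aspect ratio (EAR). The sketch below condenses that idea; it is illustrative only, and the EAR threshold, frame handling, and helper names are assumptions rather than the exact values used by the system.

import numpy as np
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("model_landmarks/shape_predictor_68_face_landmarks.dat")

def eye_aspect_ratio(eye):
    # EAR = mean vertical eye opening over horizontal eye width; it drops sharply during a blink
    a = np.linalg.norm(eye[1] - eye[5])
    b = np.linalg.norm(eye[2] - eye[4])
    c = np.linalg.norm(eye[0] - eye[3])
    return (a + b) / (2.0 * c)

def count_blinks(gray_frames, ear_threshold=0.2, required_blinks=2):
    """Return True once the live subject has blinked `required_blinks` times."""
    blinks, eyes_closed = 0, False
    for gray in gray_frames:                        # iterate over grayscale camera frames
        for face in detector(gray):
            shape = predictor(gray, face)
            pts = np.array([[p.x, p.y] for p in shape.parts()])
            ear = (eye_aspect_ratio(pts[36:42]) + eye_aspect_ratio(pts[42:48])) / 2.0
            if ear < ear_threshold:                 # eyes currently closed
                eyes_closed = True
            elif eyes_closed:                       # eyes reopened -> one full blink counted
                blinks += 1
                eyes_closed = False
            if blinks >= required_blinks:
                return True
    return False

A photo or a static mask cannot produce the close-then-open EAR pattern on request, which is why the challenge of blinking twice serves as the anti-spoofing check.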
B. Face Detection and Recognition Model
Fig. 2 shows the overall concept of the Haar-Cascade classifier model. It uses Haar features to determine the lines or edges of the samples wherein there are changes in the pixel intensity. It also uses the integral images technique for the summation of pixels over an image sub-area [10]. Aside from this, pre-trained files are available to detect specific body parts, such as the face for face detection and the eyes for eye detection, which is used to require the voter to blink their eyes for anti-spoofing.

Figure 2. Haar-Cascade Classifier

Fig. 3 illustrates the Local Binary Pattern Histogram, or LBPH, model, one of the simplest algorithms used in facial recognition. The model locates the image structure and compares each pixel to its adjacent pixels. The adjacent values are then converted to binary based on a threshold, where values above the threshold are represented by 1s, while values below the threshold are represented by 0s. The accuracy of the algorithm is proportional to the number of samples in the dataset, and a minimum of 100 samples is required to achieve optimal accuracy [9]. In this study, 200 image samples will be collected from each voter.

Figure 3. LBPH (Local Binary Pattern Histogram)
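The detection-plus-recognition pipeline described in this section can be sketched with OpenCV's built-in implementations of both models. This is a minimal illustration only, assuming the opencv-contrib-python package (for cv2.face), the pre-trained haarcascade_frontalface_default.xml file, and a small set of equally sized grayscale face crops with integer labels; it is not the exact training script used by the proponents, whose registration code appears in Appendix F.

import cv2
import numpy as np

# Haar-Cascade face detector (pre-trained XML shipped with OpenCV)
detector = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')

def detect_faces(gray_frame):
    # Returns a list of (x, y, w, h) boxes where the Haar features fired
    return detector.detectMultiScale(gray_frame, scaleFactor=1.2, minNeighbors=5)

# LBPH recognizer: one integer label per registered voter (assumed mapping)
recognizer = cv2.face.LBPHFaceRecognizer_create()

def train(samples, labels):
    # samples: list of grayscale face crops, labels: list of ints
    recognizer.train(samples, np.array(labels))
    recognizer.write('lbph_model.yml')            # persist the trained histograms

def predict(gray_frame):
    # Detect, crop, and ask the LBPH model for the closest stored histogram
    for (x, y, w, h) in detect_faces(gray_frame):
        roi = gray_frame[y:y + h, x:x + w]
        label, distance = recognizer.predict(roi)  # lower distance = closer match
        yield label, distance

A lower LBPH distance means a closer match; the study instead reports a confidence percentage with a 97% acceptance threshold (Appendix C), so any mapping from distance to percentage in this sketch would be an assumption.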


C. Voter's Registration
Before the election day, people eligible to vote will be registered ahead of time with their personal information, which includes their names and samples for fingerprint and face recognition, the two methods of authentication for the EVM. Furthermore, they are given informed consent regarding the study.

As shown in Fig. 4, voters are required to touch the fingerprint scanner with their desired finger for authentication. The images are processed with a highly sensitive pixel amplifier, and the extracted templates are stored in the database under a unique ID.

Figure 4. Fingerprint Registration Block Diagram

In Fig. 5, using a camera, 200 face images will be collected from each voter. The images will then be pre-processed to provide good-quality images rather than false ones. Next, feature extraction will be implemented to extract the face components. Lastly, the images will be trained by converting the extracted features of each image to binary, with the center point taken as the decimal equivalent. These significant features are the face encodings, which will be stored in pickle files to be used in face recognition.

Figure 5. Face Registration Block Diagram
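As a companion to the registration block diagrams, the face-registration step can be sketched as below. This follows the approach of the registration code in Appendix F, which stores face encodings from the face_recognition library in a pickle file; the 200-image count and the one-folder-per-voter layout come from the text, while the helper names and paths here are illustrative.

import os
import pickle
import cv2
import face_recognition

IMAGE_DIR = "images"            # one sub-folder per voter, e.g. images/juan delacruz/
ENCODING_FILE = "faceencode.pkl"

def capture_voter_images(full_name, count=200, cam_index=0):
    """Grab `count` frames from the webcam and save them under the voter's folder."""
    folder = os.path.join(IMAGE_DIR, full_name)
    os.makedirs(folder, exist_ok=True)
    cap = cv2.VideoCapture(cam_index)
    saved = 0
    while saved < count:
        ok, frame = cap.read()
        if not ok:
            continue
        cv2.imwrite(os.path.join(folder, f"{saved + 1}.png"), frame)
        saved += 1
    cap.release()

def train_encodings():
    """Encode every saved image and pickle (name, encoding) pairs for the EVM."""
    pairs = []
    for root, _dirs, files in os.walk(IMAGE_DIR):
        label = os.path.basename(root).lower()
        for file in files:
            image = face_recognition.load_image_file(os.path.join(root, file))
            encodings = face_recognition.face_encodings(image)
            if encodings:                           # skip frames where no face was found
                pairs.append([label, encodings[0]])
    with open(ENCODING_FILE, "wb") as fh:
        pickle.dump(pairs, fh)

The pickle file produced this way matches the (name, encoding) pair format that the voting code in Appendix F reads back at startup.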
D. Voter's Recognition
To be able to vote, one must pass either the fingerprint or the face authentication. The voter authenticates first through the fingerprint authentication method. However, if the voter's fingerprint biometric is not recognized, they will be redirected to the face authentication instead. If the voter fails to authenticate themselves through both methods, they cannot proceed to the voting proper.

Fig. 6 shows the fingerprint recognition block diagram. The voter touches the fingerprint scanner for authentication. Scanned images will be enhanced through the HSP amplifier and will then be matched to the stored fingerprint data. Once a match is found, the voter information will be fetched from the database for authentication, and the recognized voter can proceed to the voting display.

Figure 6. Fingerprint Recognition Block Diagram

Fig. 7 shows the face recognition process. The voter's images will be scanned, and their facial features will be extracted. The system will then compare the sample image with the stored face encodings to determine the best match. If the sample image matches the stored encodings, the name of the voter is selected, and their information is fetched from the database. The system will then proceed to the voting display.

Figure 7. Face Recognition Block Diagram
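The fingerprint-matching step above maps directly onto the pyfingerprint calls used in the voting code in Appendix F. A condensed sketch of that lookup is shown below; the baud rate, sensor calls, and the SQLite voters table mirror the appendix code, while the serial port and the wrapper function are illustrative.

import sqlite3
from pyfingerprint.pyfingerprint import PyFingerprint

def identify_voter(port='/dev/ttyUSB0'):
    """Wait for a finger, search the sensor's template store, and return the voter name."""
    sensor = PyFingerprint(port, 57600, 0xFFFFFFFF, 0x00000000)
    if not sensor.verifyPassword():
        raise ValueError('Fingerprint sensor password is wrong')

    while not sensor.readImage():        # block until a finger is placed
        pass
    sensor.convertImage(0x01)            # image -> characteristics in charbuffer 1

    position, accuracy = sensor.searchTemplate()
    if position == -1:
        return None                      # not enrolled: fall back to face authentication

    # The template position is the key stored during registration (Appendix F)
    conn = sqlite3.connect('voting_database.db')
    row = conn.execute('SELECT name FROM voters WHERE position = ?',
                       (str(position),)).fetchone()
    conn.close()
    return row[0] if row else None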
E. System Integration
The EVM and the registration setup are separate, as shown in Fig. 8. A laptop will be used to collect the voter data, which are the voter's name and their fingerprint and face data. The gathered data will be sorted and transferred to the storage of the Raspberry Pi. The transferred features will then be used to authenticate the voters. Additionally, an Internet of Things database will be used: the users who have already voted, together with their votes, are stored in the Firebase database to prevent data loss in instances where there is storage corruption of the Raspberry Pi. This cloud database will only be contacted when the EVM validates a voter's already-voted status or appends the most recent voter and their vote selection.

Figure 8. System Integration
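The cloud side of Fig. 8 is handled in the code by a small pyFire module (imported in Appendix F) whose internals are not listed in the paper. A plausible minimal sketch of the two operations the text describes, checking the already-voted flag and appending a new vote, is given below using the firebase_admin SDK; the service-account path, database URL, and node names are assumptions, not the authors' actual schema.

import firebase_admin
from firebase_admin import credentials, db

# Assumed credentials and Realtime Database URL (placeholders, not the authors' values)
cred = credentials.Certificate("serviceAccount.json")
firebase_admin.initialize_app(cred, {"databaseURL": "https://example-evm.firebaseio.com/"})

def check_already_voted(position):
    """True if the fingerprint-template position is already flagged in the cloud."""
    return db.reference("voted").child(str(position)).get() is not None

def append_vote(position, name, selections):
    """Mark the voter as done and store their vote selection for recovery purposes."""
    db.reference("voted").child(str(position)).set({
        "name": name,
        "votes": selections,   # e.g. {"pres": "...", "sec": "..."}
    })

Keeping the cloud write limited to these two calls matches the text's design choice of contacting Firebase only for the already-voted check and the final vote append, so the kiosk keeps working locally even with an intermittent connection.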

F. Hardware Development
Fig. 9 illustrates the prototype design of the Electronic Voting Machine. The size and dimensions of the prototype are based on the needs and capacity of the potential categories of voters. There is a presumption that one of the potential categories of voters who will use the Electronic Voting Machine is Persons with Disabilities (PWDs), with the exclusion of blind voters. The Electronic Voting Machine will be placed on a table so that it reaches a height of 48" (~1.2 m), which is the ideal height of an EVM according to the Kiosk ADA Accessibility compliance guidelines, for it to be accessible to PWDs, especially people in a wheelchair [13].

Figure 9. EVM Prototype Design

Fig. 10 illustrates the hardware design for the Electronic Voting Machine, which consists of several devices connected to a processing unit. The Raspberry Pi executes the Python code and is the brain of the EVM system. It will be powered by a power supply to turn on the connected external devices. The external peripherals, which are the monitor screen, mouse, camera, and thermal printer, are connected to the RPi. The fingerprint scanner and the camera are the main devices for the authentication of voters. Both devices collect data for processing and matching against the existing data that were uploaded to the system. Furthermore, the overall device is the portable EVM, made for convenience.

Figure 10. EVM Hardware Design

Fig. 11 illustrates the separate hardware design for registering the voters. It is composed of a camera, a fingerprint scanner, and a laptop. Before the election day, assigned personnel will facilitate the registration of the voters, including the capture of the fingerprint and face samples of the voters, before uploading them to the EVM system or the RPi database.

Figure 11. Registration Hardware Design

Fig. 12 illustrates the system flow in vote casting. It has two methods of authentication. The voter's data are stored and matched (fingerprint and/or face) before proceeding to the voting display. The system is on standby, displaying a prompt to place a finger on the scanner. If the fingerprint scanner detects a fingerprint, it will start the authentication process, and when it matches a registered record, the system will proceed to the voting display. However, if the fingerprint authentication fails, the voter will be redirected to face authentication. Once the voter passes the face recognition, the system will then require the voter to blink twice to pass anti-spoofing. Next, the system will check the database to confirm whether they have already voted. If not, the voter will proceed to the voting display and can select their desired candidates; if they have already voted, an error display is shown indicating that they have already voted and cannot cast more votes. Once the voter is done selecting their candidates, they are directed to a page showing the candidates they have chosen to confirm their selection. After this, their votes are counted in the system. Additionally, already-voted voters and vote counts are saved on a cloud database in case there is a sudden and unprecedented data loss. Lastly, a receipt is printed showing the exact candidates selected by the voter for transparency and for the voters to ensure that their votes are read correctly. The receipts generated will then be dropped in a drop box and kept by the committee for manual recounting, especially when there are anomalies or when a candidate requests it. This is also another way of preserving data in case of data corruption or loss.

Figure 12. Hardware System Flow
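The flow in Fig. 12 can be summarized as a short control loop. The sketch below is only an outline of that sequence under assumed helper names (authenticate_fingerprint, authenticate_face, passed_blink_check, and so on); the working PyQt implementation is in Appendix F.

def cast_vote_session():
    """One voter session following the Fig. 12 flow: authenticate, vote, confirm, print."""
    voter = authenticate_fingerprint()           # primary method
    if voter is None:
        voter = authenticate_face()              # fallback method
        if voter is None or not passed_blink_check():   # anti-spoofing: blink twice
            show_error("Authentication failed")
            return

    if already_voted(voter):                     # local SQLite and/or Firebase check
        show_error("This voter has already cast a vote")
        return

    selections = show_voting_display(voter)      # voter picks candidates per position
    if confirm_selection(voter, selections):     # confirmation page
        record_votes(voter, selections)          # tally locally and mirror to the cloud
        print_receipt(voter, selections)         # thermal-printer receipt for the drop box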
G. Testing Procedure
Fig. 13 illustrates the testing procedure cycle of the study. The requirements definition identifies the problems of the study and comes up with solutions that are feasible for the issue. The prototyping stage is carried out before building the actual device. The design of the actual device involves hardware components and peripherals, while the software design covers the graphical user interface, fingerprint and face authentication, and the Internet of Things. The next phase is the implementation of the actual device, testing its functionality, and debugging errors. Thorough testing of the device will be done until it gives good and fairly accurate results. The final step of the testing will be the full operation of the completed device, while also maintaining its function, for the registration of voters and the election proper at the University of Mindanao.

Figure 13. Testing Procedure Cycle

H. Statistical Analysis
This study will employ two authentication methods to verify the legitimacy of the voting process by conducting trials and calculating the overall accuracy of fingerprint and face authentication using a confusion matrix. Furthermore, precision, which measures the accuracy of positive predictions; recall, which measures the ability of the model to identify positive cases; and the F1 score, which provides a balanced measure of both metrics, will be computed. Additionally, the False Reject Rate (FRR), which is the rate of registered voters that are not authenticated, will be compared to the precision, while the False Acceptance Rate (FAR), which is the rate of unregistered voters that are authenticated with another person's identity, will be compared to the recall. To verify efficiency, the vote selections in the database will be compared with the vote selections on the printed receipts.

III. RESULTS AND DISCUSSIONS

A. Electronic Voting Machine
Fig. 14 shows the actual device used to conduct the mock election, with the following components: the monitor screen, mouse, camera, and thermal printer, all of which are connected to the Raspberry Pi 4 to perform the voting procedure.

Figure 14. Actual Device

B. Accuracy Testing Results
The accuracy test results for the fingerprint authentication are shown in Table I for fifteen (15) samples, with three trials conducted for each sample. The overall accuracy is calculated by taking the average percentage of the three trials.

TABLE I. FINGERPRINT AUTHENTICATION ACCURACY TEST RESULTS

Required samples for fingerprint accuracy testing: 15
                      Trial 1    Trial 2    Trial 3
Matched samples        9/15      13/15      14/15
Accuracy              60.00%     86.67%     93.33%
Overall Accuracy             80.00%

The accuracy test results for face authentication are shown in Table II, presented as a confusion matrix with two classifications: registered and unregistered voters. Thirty (30) samples were used, with three trials conducted for each sample.

TABLE II. CONFUSION MATRIX FOR FACE AUTHENTICATION ACCURACY TEST RESULTS

n = (30 samples) x (3 trials) = 90; registered voters = 45, unregistered voters = 45
                                          Predicted
Actual   Registered voters      45 (True Positive)     0 (False Positive)
         Unregistered voters     4 (False Negative)   41 (True Negative)
Of the total trials for the registered voters, forty-five (45) were authenticated (True Positive) and none (0) were rejected. On the other hand, of the total trials for the unregistered voters, four (4) were falsely identified as other people (False Negative) while forty-one (41) were classified as unknown (True Negative).

The overall accuracy of the face authentication method is 95.56%, using equation (1). The precision and recall are 100% and 91.84%, using equations (2) and (3), respectively. The F1-score resulting from equation (4) is 95.75%. Lastly, the False Rejection Rate (FRR) resulted in 0% using equation (5), while the False Acceptance Rate (FAR) resulted in 8.16% using equation (6).

Overall Accuracy = (TP + TN) / (TP + FP + TN + FN) * 100%   (1)
Precision = TP / (TP + FP) * 100%   (2)
Recall = TP / (TP + FN) * 100%   (3)
F1 Score = (2 * Precision * Recall) / (Precision + Recall)   (4)
FRR = 100% - Precision   (5)
FAR = 100% - Recall   (6)
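For reproducibility, equations (1)-(6) can be checked directly against the counts in Table II. The short script below is a worked example only; it uses the TP/FP/FN/TN values reported above rather than any library.

# Counts from Table II (face authentication, 30 samples x 3 trials)
TP, FP, FN, TN = 45, 0, 4, 41

accuracy  = (TP + TN) / (TP + FP + TN + FN) * 100          # eq. (1) -> 95.56%
precision = TP / (TP + FP) * 100                           # eq. (2) -> 100.00%
recall    = TP / (TP + FN) * 100                           # eq. (3) -> 91.84%
f1_score  = 2 * precision * recall / (precision + recall)  # eq. (4)
frr       = 100 - precision                                # eq. (5) -> 0.00%
far       = 100 - recall                                   # eq. (6) -> 8.16%

print(f"Accuracy {accuracy:.2f}%  Precision {precision:.2f}%  Recall {recall:.2f}%")
print(f"F1 {f1_score:.2f}%  FRR {frr:.2f}%  FAR {far:.2f}%")

Note that rounding the recall to 91.84% before applying equation (4), as in the text, gives the reported 95.75%, while carrying full precision through the calculation gives 95.74%.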
Table III shows the overall accuracy of the anti-spoofing technique (blinking twice), which is applied after a successful face authentication. The overall accuracy is computed as the success count over the total number of trials, which results in 100%.

TABLE III. ANTI-SPOOFING TEST RESULTS

n = (30 samples) x (3 trials) = 90
Success: 90        Failed: 0
Overall Accuracy: 100%

C. Mock Election Results
The results in Table IV show the type of authentication with which each voter successfully verified. A method is marked "Successful" if the voter authenticated with it, while "Error" is marked when the voter failed to do so. In the case of the face authentication method, a voter is marked "Passed" when they have already authenticated successfully with their fingerprint and therefore no longer need to authenticate with their face.

Based on the results, all voters were able to successfully authenticate themselves: 16 out of 20 voters were able to authenticate with their fingerprints, while 4 out of 20 voters had to go through face recognition to authenticate themselves.

The overall accuracy is computed from the combined success rate of the fingerprint and face authentication methods. Given this, fingerprint and face authentication contributed 80% and 20%, respectively, which resulted in an overall accuracy of 100%.

TABLE IV. FINGERPRINT AND FACE AUTHENTICATION RESULTS

Voter    Fingerprint    Face
1        Successful     Passed
2        Successful     Passed
3        Successful     Passed
4        Error          Successful
5        Successful     Passed
6        Error          Successful
7        Successful     Passed
8        Successful     Passed
9        Successful     Passed
10       Successful     Passed
11       Successful     Passed
12       Successful     Passed
13       Successful     Passed
14       Successful     Passed
15       Successful     Passed
16       Successful     Passed
17       Error          Successful
18       Error          Successful
19       Successful     Passed
20       Successful     Passed
Accuracy       80%          20%
Overall Accuracy       100%

Based on the results in Table IV, all twenty (20) voters were able to successfully authenticate and proceed to the voting proper. The candidates chosen by each voter are stored in the EVM system, and the printed receipts, which contain the voter's name and selected candidates, were dropped in the drop box after the voters confirmed their selections. The cast votes stored in the database were compared to the receipts and, as presented in Table V, all votes matched each other, resulting in an accuracy of 100%.

TABLE V. DATABASE VOTES AND RECEIPT VOTES COMPARISON

n = 20     Matched: 20     Mismatched: 0
Accuracy: 100%
IV. CONCLUSIONS AND FUTURE WORKS

A Raspberry Pi-based IoT portable electronic voting machine system that uses fingerprint and face authentication methods to verify a voter's identity has been successfully designed. In verifying the efficiency of each authentication method, the use of the Adafruit fingerprint sensor for fingerprint authentication resulted in an 80% overall accuracy. The cause of this lower accuracy is the failed fingerprint detection on the first of the three trials; most of the voters failed in the first trial due to wet or dirty fingerprints. The face authentication, with the use of the Haar-Cascade Classifier and Local Binary Pattern Histogram (LBPH) models, on the other hand, resulted in a 95.56% overall accuracy. Moreover, the anti-spoofing technique resulted in a 100% accuracy. In line with this, the face authentication method authenticated all the registered voters; however, a few unregistered voters were recognized by the system, resulting in a FAR of 8.16%. In the mock election with 20 registered voters, 80% were authenticated using the fingerprint authentication method, while the 20% who failed proceeded to the face authentication, in which they were successful, resulting in a 100% overall authentication accuracy. The vote results from the database and the voters' receipts matched upon comparison, resulting in a 100% match for all 20 authenticated voters.

For future study, the researchers suggest that the user interface be displayed on a touchscreen LCD to lessen the bulkiness of the device and allow easier navigation of the GUI. Furthermore, the use of a more sensitive fingerprint sensor that has a high tolerance for wet or dirty fingerprints is also recommended.

REFERENCES

[1] G. Ombay, "COMELEC: 106K defective ballots within 'margin of error'," GMA News Online, 22-Mar-2022. [Online]. Available: https://www.gmanetwork.com/news/topstories/nation/825913/comelec-106k-defective-ballots-within-margin-of-error/story/. [Accessed: 02-Jun-2022]
[2] D. E. W. Sanjay Kumar, "Analysis of electronic voting system in various countries," Int. J. Comput. Sci. Eng., 2011.
[2] "Malfunctioning VCMs: Voters told to leave their ballots or wait for repair," Yahoo! News. [Online]. Available: https://ph.news.yahoo.com/elections-2022-almost-2000-vc-ms-reported-malfunctioning-on-election-day-080109477.html. [Accessed: 02-Jun-2022]
[3] S. Ravi, "How electronic voting machines have improved India's democracy," Brookings, 06-Dec-2019. [Online]. Available: https://www.brookings.edu/blog/techtank/2019/12/06/how-electronic-voting-machines-have-improved-indias-democracy/. [Accessed: 26-Aug-2022]
[4] "What are the advantages in using EVMs?," Systematic Voters' Education and Electoral Participation, 2022. [Online]. Available: https://ecisveep.nic.in/
[5] Abeesh A. I., Amal Prakash P., Arun R. Pillai, Ashams H. S., Dhanya M., and Seena R., "Electronic Voting Machine Authentication using Biometric Information," International Journal of Engineering Research & Technology (IJERT), NCETET - 2017, vol. 5, no. 16, 2017.
[6] Vinayachandra, K. G. Poornima, M. Rajeshwari, and K. K. Prasad, "Arduino Based Authenticated Voting Machine (AVM) using RFID and Fingerprint for the Student Elections," 2020, doi: 10.1088/1742-6596/1712/1/012004.
[7] J. I. Prince and G. Udhayakumar, "Electronic voting machine with facial recognition and fingerprint sensors," Ijarnd.com. [Online]. Available: https://www.ijarnd.com/manuscripts/v3i3/V3I3-1216.pdf. [Accessed: 09-Jul-2022].
[8] B. U. Umar, O. M. Olaniyi, L. A. Ajao, D. Maliki, and I. C. Okeke, "Development of A Fingerprint Biometric Authentication System For Secure Electronic Voting Machines," Kinet. Game Technol. Inf. Syst. Comput. Network, Comput. Electron. Control, 2019, doi: 10.22219/kinetik.v4i2.734.
[9] M. Bansal, "Face recognition implementation on Raspberry Pi using OpenCV and Python," Int. J. Comput. Eng. Technol., 2019, doi: 10.34218/ijcet.10.3.2019.016.
[10] "Facial spoofing: Meaning and how to prevent it," 06-May-2022. [Online]. Available: https://www.electronicid.eu/en/blog/post/facial-spoofing-what-it-is-how-to-prevent-it-and-spoofing-detection-solutions/en. [Accessed: 24-Jun-2022]
[11] S. Najam, A. Z. Shaikh, and S. Naqvi, "A Novel Hybrid Biometric Electronic Voting System: Integrating Finger Print and Face Recognition," Mehran Univ. Res. J. Eng. Technol., 2018, doi: 10.22581/muet1982.1801.05.
[12] A. T. Dang, "Facial recognition: Types of attacks and anti-spoofing techniques," Towards Data Science, 10-Oct-2020. [Online]. Available: https://towardsdatascience.com/facial-recognition-types-of-attacks-and-anti-spoofing-techniques-9d732080f91e. [Accessed: 23-Jun-2022].
[13] K. M. Ada, "ADA kiosk," Kiosk Industry, 11-Aug-2020. [Online]. Available: https://kioskindustry.org/standards/ada-kiosk/. [Accessed: 28-Jul-2022].
[14] D. Fernandez, "800 voters per precinct a 'safe' ratio in 2022 polls," Inquirer.net, 04-Dec-2021. [Online]. Available: https://newsinfo.inquirer.net/1523666/800-voters-per-precinct-a-safe-ratio-in-2022-polls-Comelec
[15] Y. Lee, S. Park, M. Mambo, S. Kim, and D. Won, "Towards trustworthy e-voting using paper receipts," Comput. Stand. Interfaces, 2017, doi: 10.1016/j.csi.2010.03.001.
[16] V. Malathy, N. Shilpa, M. Anand, and R. Elavarasi, "Radio frequency identification based electronic voting machine using fingerprint module," 2020, doi: 10.1088/1757-899X/981/3/032018.
APPENDIX A
TRADE - OFF ANALYSIS
The researchers conducted a trade-off analysis to compare and evaluate the proposed designs. Trade-off analysis is a statistical technique that gives insight, based on calculated values, into the pros and cons of each design. Moreover, it allowed the researchers to choose the optimal design while considering various factors. The outcomes of this analysis serve as a guide for the researchers' selections.

Researchers' Constraints
The researchers decided on various factors for the study. The following parameters were selected to obtain the most favorable results and serve as the basis for the design of the study.

Complexity. The material functions are difficult to integrate into each of the proposed designs due to their complex nature. The researchers encountered challenges in seamlessly integrating the material functions into each design variant.

Performance. Performance evaluation was vital in assessing how well the designs met objectives. Metrics like speed,
accuracy, reliability, durability, and energy consumption were used. Rigorous testing and statistical analysis compared and
identified strengths and weaknesses.

Attractive. This criterion focused on the attractiveness of the designs. Aesthetics played a significant role in the study, as they influenced user perception and acceptance.

Economical. This criterion emphasized the importance of economic considerations in the designs. Cost-effectiveness and efficiency were key factors in the study. The researchers employed strategies to optimize resource utilization and minimize expenses without compromising quality.

Maintenance. Recognizing the significance of long-term usability and sustainability, the researchers incorporated features that simplify maintenance and upkeep.

Availability. The researchers took into account potential challenges related to limited stocks and the availability of components for the designs.

Manufacturability. By aligning the designs with existing methods, the researchers minimized the need for significant modifications or investments in new equipment.

Proposed Design
The proponents proposed two designs of the Electronic Voting Machine for the study. Both designs have pros and cons that are evaluated through trade-off analysis using the Pugh matrix.

Fig. 1 EVM Design 1

The proposed design shown in Fig. 1 is a composition of the hardware that makes up the electronic voting machine. It uses a touchscreen LCD in a table-like setup with covers on both sides for voting privacy. The camera, fingerprint sensor, and thermal printer are also shown.
Fig. 2 EVM Design 2

The second proposed design, shown in Fig. 2, is more portable but uses a monitor and a mouse to navigate the voting process. It likewise includes a camera, a fingerprint sensor, and a thermal printer, and it also has covers at the sides for the voter's privacy while voting.

Pugh Matrix

Selection Criteria Design 1 Design 2 Total


Complexity - + 0
Performance + + 2
Attractive + - 0
Economical - + 0
Maintenance - + 0
Availability - + 0
Manufacturability - + 0
Social + - 0
Total -2 4
Rank 2nd 1st
Table 1: Pugh Matrix Table

As shown in Table 1, the Pugh matrix is used to assess the proposed designs by evaluating the parameters, which are rated with "+", "0", and "-". A "+" (plus) rating signifies that the alternative performs better, offers more benefits, or meets the criteria to a higher degree. A "0" (zero) rating means there is neither a significant advantage nor a disadvantage compared to the baseline. A "-" (minus) rating suggests that the alternative performs worse, offers fewer benefits, or fails to meet the criteria as effectively. For the two designs, Design 2 ranks 1st with the highest rating of 4, while Design 1's rating is -2. Design 2 is considered more advantageous than Design 1 due to its advantages in complexity, cost-effectiveness, maintenance, availability, and manufacturability.
APPENDIX B
Fingerprint Authentication Accuracy Test Raw Data

Fingerprint Authentication
Registered Voters      Trial 1      Trial 2      Trial 3
1 ✗ ✗ ✓
2 ✓ ✓ ✓
3 ✓ ✓ ✓
4 ✓ ✓ ✓
5 ✓ ✓ ✓
6 ✓ ✓ ✓
7 ✓ ✓ ✓
8 ✓ ✓ ✗
9 ✗ ✓ ✓
10 ✓ ✓ ✓
11 ✓ ✓ ✓
12 ✗ ✓ ✓
13 ✗ ✗ ✓
14 ✗ ✓ ✓
15 ✗ ✓ ✓
APPENDIX C
Face Authentication Accuracy Test Raw Data

Face Authentication
Name Trial 1 Trial 2 Trial 3
Registered Face Confidence A-Spoof Face Confidence A-Spoof Face Confidence A-Spoof
1 Voter 1 ✓ 98.43 ✓ ✓ 98.64 ✓ ✓ 98.09 ✓
2 Voter 2 ✓ 98.88 ✓ ✓ 99.30 ✓ ✓ 99.47 ✓
3 Voter 3 ✓ 98.72 ✓ ✓ 99.82 ✓ ✓ 98.80 ✓
4 Voter 4 ✓ 98.67 ✓ ✓ 98.97 ✓ ✓ 99.48 ✓
5 Voter 5 ✓ 98.37 ✓ ✓ 98.81 ✓ ✓ 98.54 ✓
6 Voter 6 ✓ 99.76 ✓ ✓ 99.38 ✓ ✓ 99.74 ✓
7 Voter 7 ✓ 99.16 ✓ ✓ 99.79 ✓ ✓ 99.49 ✓
8 Voter 8 ✓ 99.62 ✓ ✓ 99.16 ✓ ✓ 99.52 ✓
9 Voter 9 ✓ 99.24 ✓ ✓ 98.96 ✓ ✓ 99.07 ✓
10 Voter 10 ✓ 99.09 ✓ ✓ 99.53 ✓ ✓ 98.83 ✓
11 Voter 11 ✓ 99.25 ✓ ✓ 99.45 ✓ ✓ 99.79 ✓
12 Voter 12 ✓ 99.38 ✓ ✓ 99.20 ✓ ✓ 98.78 ✓
13 Voter 13 ✓ 99.77 ✓ ✓ 99.30 ✓ ✓ 99.24 ✓
14 Voter 14 ✓ 99.60 ✓ ✓ 99.19 ✓ ✓ 98.68 ✓
15 Voter 15 ✓ 99.40 ✓ ✓ 99.28 ✓ ✓ 99.03 ✓
Unregistered Face Confidence A-Spoof Face Confidence A-Spoof Face Confidence A-Spoof
16 Voter 16 ✓ 95.19 ✓ ✓ 95.75 ✓ ✓ 95.83 ✓
17 Voter 17 ✓ 93.09 ✓ ✓ 94.64 ✓ ✓ 93.78 ✓
18 Voter 18 ✓ 95.79 ✓ ✓ 95.47 ✓ ✓ 95.63 ✓
19 Voter 19 ✓ 95.49 ✓ ✓ 94.10 ✓ ✓ 93.70 ✓
20 Voter 20 ✓ 89.98 ✓ ✓ 90.22 ✓ ✓ 90.58 ✓
21 Voter 21 ✓ 93.91 ✓ ✓ 92.09 ✓ ✓ 92.63 ✓
22 Voter 22 ✓ 93.16 ✓ ✓ 94.16 ✓ ✓ 94.86 ✓
23 Voter 23 ✓ 94.26 ✓ ✓ 91.47 ✓ ✓ 94.08 ✓
24 Voter 24 ✓ 92.62 ✓ ✓ 93.14 ✓ ✓ 92.93 ✓
25 Voter 25 ✗ 97.29 ✓ ✓ 94.57 ✓ ✓ 92.64 ✓
26 Voter 26 ✗ 95.03 ✓ ✗ 97.98 ✓ ✗ 97.47 ✓
27 Voter 27 ✓ 95.39 ✓ ✓ 96.08 ✓ ✓ 96.27 ✓
28 Voter 28 ✓ 93.09 ✓ ✓ 95.92 ✓ ✓ 95.79 ✓
29 Voter 29 ✓ 95.34 ✓ ✓ 94.09 ✓ ✓ 92.99 ✓
30 Voter 30 ✓ 91.76 ✓ ✓ 93.83 ✓ ✓ 90.31 ✓
Tabulated Table for Face Authentication Accuracy Test

Required samples for face recognition testing: 30 (15 registered, 15 unregistered) per trial

Trial 1 (n = 30)
                                           Predicted
Actual   Registered voters      15 (True Positive)     0 (False Positive)
         Unregistered voters     2 (False Negative, "Fake")   13 (True Negative, "Unknown")

Trial 2 (n = 30)
                                           Predicted
Actual   Registered voters      15 (True Positive)     0 (False Positive)
         Unregistered voters     1 (False Negative, "Fake")   14 (True Negative, "Unknown")

Trial 3 (n = 30)
                                           Predicted
Actual   Registered voters      15 (True Positive)     0 (False Positive)
         Unregistered voters     1 (False Negative, "Fake")   14 (True Negative, "Unknown")

Face Authentication Threshold Range

Confidence Percentage: Threshold Value = 97%


Minimum Maximum Range
Registered Voters 98.09 99.82 98.09 - 99.82
Unregistered Voters 89.98 97.98 89.98 - 97.98
APPENDIX D
Mock Election Extracted Result from Database

name pres internal external sec treasurer auditor


MABOLOC, CAIN,
PRINCESS DE CASTRO, BALDOS, KIM FRANCHESKA CARTAGENA, ETHAN
Voter 1 DIANE MARK JOY PEREZ, JERIMOTH JOSEPH LOUISE RIGEL
AYUSTE, VINCENT LISONDATO, JOHN BALDOS, KIM ALBUTRA, JUAN AQUINO, GENREI
Voter 2 ABSTAIN JUPITER CARLO JOSEPH CARLOS PAUL
CAIN,
GENTILES, DE CASTRO, MARK LISONDATO, JOHN BALDOS, KIM FRANCHESKA AQUINO, GENREI
Voter 3 HARRY JOY CARLO JOSEPH LOUISE PAUL
GENABE, DE CASTRO, MARK GENABE, CHRIST ALBUTRA, JUAN AQUINO, GENREI
Voter 4 CHRISTOPHER JOY LAI, SHEN WA JOHN CARLOS PAUL
CAIN,
GENABE, DE CASTRO, MARK LISONDATO, JOHN BALDOS, KIM FRANCHESKA CARTAGENA, ETHAN
Voter 5 CHRISTOPHER JOY CARLO JOSEPH LOUISE RIGEL
MABOLOC,
PRINCESS AYUSTE, VINCENT LISONDATO, JOHN CAGAPE, HERNANDEZ, CARTAGENA, ETHAN
Voter 6 DIANE JUPITER CARLO BRANDON SAGE RODANTE JR. RIGEL
GENABE, AYUSTE, VINCENT LISONDATO, JOHN GENABE, CHRIST HERNANDEZ, AQUINO, GENREI
Voter 7 CHRISTOPHER JUPITER CARLO JOHN RODANTE JR. PAUL
GENABE, AYUSTE, VINCENT LISONDATO, JOHN BALDOS, KIM HERNANDEZ, CONDEZ, JOHNWELL
Voter 8 CHRISTOPHER JUPITER CARLO JOSEPH RODANTE JR. CAESAR
GENABE, AYUSTE, VINCENT BALDOS, KIM HERNANDEZ, AQUINO, GENREI
Voter 9 CHRISTOPHER JUPITER LAI, SHEN WA JOSEPH RODANTE JR. PAUL
GENABE, AYUSTE, VINCENT LISONDATO, JOHN CAGAPE, ALBUTRA, JUAN CARTAGENA, ETHAN
Voter 10 CHRISTOPHER JUPITER CARLO BRANDON SAGE CARLOS RIGEL
CAIN,
GENTILES, AYUSTE, VINCENT GENABE, CHRIST FRANCHESKA CARTAGENA, ETHAN
Voter 11 HARRY JUPITER PEREZ, JERIMOTH JOHN LOUISE RIGEL
LISONDATO, JOHN BALDOS, KIM ALBUTRA, JUAN CONDEZ, JOHNWELL
Voter 12 ABSTAIN OSIBA, GARY CARLO JOSEPH CARLOS CAESAR
CAIN,
DE CASTRO, MARK BALDOS, KIM FRANCHESKA CARTAGENA, ETHAN
Voter 13 ABSTAIN JOY PEREZ, JERIMOTH JOSEPH LOUISE RIGEL
MABOLOC,
PRINCESS AYUSTE, VINCENT LISONDATO, JOHN GENABE, CHRIST HERNANDEZ, AQUINO, GENREI
Voter 14 DIANE JUPITER CARLO JOHN RODANTE JR. PAUL
GENABE, DE CASTRO, MARK CAGAPE, CARTAGENA, ETHAN
Voter 15 CHRISTOPHER JOY PEREZ, JERIMOTH BRANDON SAGE ABSTAIN RIGEL
GENABE, DE CASTRO, MARK GENABE, CHRIST HERNANDEZ, AQUINO, GENREI
Voter 16 CHRISTOPHER JOY PEREZ, JERIMOTH JOHN RODANTE JR. PAUL
GENTILES, LISONDATO, JOHN BALDOS, KIM ALBUTRA, JUAN CARTAGENA, ETHAN
Voter 17 HARRY OSIBA, GARY CARLO JOSEPH CARLOS RIGEL
MABOLOC, CAIN,
PRINCESS DE CASTRO, MARK BALDOS, KIM FRANCHESKA AQUINO, GENREI
Voter 18 DIANE JOY LAI, SHEN WA JOSEPH LOUISE PAUL
GENTILES, AYUSTE, VINCENT GENABE, CHRIST ALBUTRA, JUAN CARTAGENA, ETHAN
Voter 19 HARRY JUPITER PEREZ, JERIMOTH JOHN CARLOS RIGEL
GENABE, LISONDATO, JOHN BALDOS, KIM ALBUTRA, JUAN CARTAGENA, ETHAN
Voter 20 CHRISTOPHER OSIBA, GARY CARLO JOSEPH CARLOS RIGEL
APPENDIX E
Registration GUI

Voting GUI
APPENDIX F
Software Source Code

For Registration Code


import shutil
import sys
from PyQt5.QtGui import *
from PyQt5.QtWidgets import *
from PyQt5.QtCore import *
import cv2
from PyQt5 import QtCore
import os.path, os
from PIL import Image
import numpy as np
import pickle
import re
from PyQt5.uic import loadUi
import sqlite3
import time
from pyfingerprint.pyfingerprint import PyFingerprint
import hashlib
import face_recognition
from pyFire import votecountsPrintCSV, updateFiretoSql, voteCounts

BASE_DIR = os.path.dirname(os.path.abspath(__file__))
image_dir = os.path.join(BASE_DIR, "images")

conn = sqlite3.connect('voting_database.db')
curs = conn.cursor()

class Registration(QMainWindow):
known_face_encodings = []
known_face_names = []

def __init__(self):
super(Registration, self).__init__()
loadUi('Registrationtest.ui', self)
self.image=None
self.screenshot = False
self._image_counter = 1
self.start_webcam()
self.fingerCapture.clicked.connect(self.fingercapture)
self.pushCapture.clicked.connect(self.nameFirstLast)
self.deleteButton.clicked.connect(self.deleteRow)
self.saveButton.clicked.connect(self.save)
self.trainButton.clicked.connect(self.trainbutton)
self.csvexp.clicked.connect(self.csvexport)
self.printtally.clicked.connect(self.votecounts)
self.face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')
self.firstNameInput.moveCursor(QTextCursor.Start)
self.firstNameInput.setTabChangesFocus(True)
self.lastNameInput.setTabChangesFocus(True)

def votecounts(self):
#updateFiretoSql()
print("done")
voteCounts()
self.warningText.setText("Done Printing")
def csvexport(self):
votecountsPrintCSV()
self.warningText.setText("Done Exporting")
def trainbutton(self):
faceencodePKL = open('faceencode.pkl', 'wb')
count = 0
for root, dirs, files in os.walk(image_dir):
count += (len(files))
actualcount = 0
for root, dirs, files in os.walk(image_dir):
for file in files:
actualcount += 1
print(f"{actualcount}/{count}")
print(root)
self.warningText.setText(f"{actualcount}/{count}")
path = os.path.join(root, file)
label = os.path.basename(root).replace("-", " ").lower()
face_image = face_recognition.load_image_file(path)
face_encoding = face_recognition.face_encodings(face_image)[0]
self.known_face_encodings.append(face_encoding)
self.known_face_names.append(label)
faceencode = []
for name, encoded in zip(self.known_face_names, self.known_face_encodings):
faceencode.append([name, encoded])
pickle.dump(faceencode, faceencodePKL)
self.warningText.setText("TRAINING DONE!!!")
faceencodePKL.close()

def camOtherButtons(self, y):


self.fingerCapture.setEnabled(y)
self.deleteButton.setEnabled(y)
self.saveButton.setEnabled(y)
self.trainButton.setEnabled(y)

@QtCore.pyqtSlot()
def start_webcam(self):
self.capture=cv2.VideoCapture(0)
self.capture.set(cv2.CAP_PROP_FRAME_HEIGHT,480)
self.capture.set(cv2.CAP_PROP_FRAME_WIDTH,640)
self.timer=QTimer(self)
self.timer.timeout.connect(self.update_frame)
self.timer.start(5)

@QtCore.pyqtSlot()
def update_frame(self):
ret, self.image=self.capture.read()
if self.screenshot == True:
if self._image_counter != 100: #number of images
firstName1 = self.firstNameInput.toPlainText().replace(" ", "").lower()
lastName1 = self.lastNameInput.toPlainText().replace(" ", "").lower()
fullName1 = firstName1 + ' ' + lastName1
path = os.path.join(image_dir, fullName1)
name = "{}.png".format(self._image_counter+1)
print(os.path.join(path, name))
cv2.imwrite(os.path.join(path, name), self.image)
self._image_counter += 1
cv2.imwrite(os.path.join(path, name), self.image)
self.warningText.setText('Please wait! Capturing Images\n'+str(self._image_counter))
self.camOtherButtons(False)
else:
self.screenshot = False
self.saveButton.setEnabled(True)
else:
self.screenshot = False
self._image_counter = 0
self.image=cv2.flip(self.image,1)
self.displayImage(self.image, 1)
detected_image=self.faceDetection(self.image)
self.displayImage(detected_image,1)

@QtCore.pyqtSlot()
def faceDetection(self,img):
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
faces = self.face_cascade.detectMultiScale(gray,1.2,minNeighbors=5)
for (x, y, w, h) in faces:
cv2.rectangle(img, (x, y), (x + w, y + h), (255, 0, 0), 2)
return img

def displayImage(self,img,window=1):
qformat=QImage.Format_Indexed8
if len(img.shape)==3:
if img.shape[2]==4:
qformat=QImage.Format_RGBA8888
else:
qformat=QImage.Format_RGB888
outImage=QImage(img,img.shape[1],img.shape[0],img.strides[0],qformat)
outImage=outImage.rgbSwapped()

if window==1:
self.displayVideo.setPixmap(QPixmap.fromImage(outImage))
self.displayVideo.setScaledContents(True)

@QtCore.pyqtSlot()
def nameFirstLast(self):
firstName = self.firstNameInput.toPlainText().replace(" ", "").lower()
lastName = self.lastNameInput.toPlainText().replace(" ", "").lower()
fullName = firstName + ' ' + lastName

special_char = re.compile('[@_!#$%^&*()<>?/\|}{~:]')
if ((any(map(str.isdigit, firstName)) == True) or (special_char.search(firstName) != None)) or ((any(map(str.isdigit,
lastName)) == True) or (special_char.search(lastName) != None)):
self.warningText.setText("please don't use number/special characters")
else:
if os.path.exists(fullName) == False:
os.chdir("images")
os.makedirs(fullName)
self.screenshot=True
else:
self.warningText.setText("Name Already Taken")

@QtCore.pyqtSlot()
def save(self):
self.firstNameInput.setReadOnly(False)
self.lastNameInput.setReadOnly(False)
self.firstNameInput.clear()
self.lastNameInput.clear()
self.camOtherButtons(True)
os.chdir(os.path.dirname(os.path.abspath(__file__)))

def fingercapture(self):

firstName = self.firstNameInput.toPlainText().replace(" ", "").lower()


lastName = self.lastNameInput.toPlainText().replace(" ", "").lower()
fullName = firstName + ' ' + lastName

special_char = re.compile('[@_!#$%^&*()<>?/\|}{~:]')
if ((any(map(str.isdigit, firstName)) == True) or (special_char.search(firstName) != None)) or (
(any(map(str.isdigit, lastName)) == True) or (special_char.search(lastName) != None)):
self.warningText.setText("please don't use number/special characters")
return None
else:
curs.execute("SELECT name FROM voters ")

nameList=[]
for column in curs.fetchall():
nameList.append(column)
for i in range(len(nameList)):
if fullName in nameList[i]:
self.warningText.setText('Name Already taken')
return None

try:
f = PyFingerprint('COM3', 57600, 0xFFFFFFFF, 0x00000000)

if (f.verifyPassword() == False):
raise ValueError('The given fingerprint sensor password is wrong!')

except Exception as e:
self.warningText.setText('The fingerprint sensor could not be initialized!\n Exception message: ' + str(e))
return None

## Gets some sensor information


#print('Currently used templates: ' + str(f.getTemplateCount()) + '/' + str(f.getStorageCapacity()))

## Tries to enroll new finger


try:
self.warningText.setText('Waiting for finger...')

## Wait that finger is read


while (f.readImage() == False):
pass
self.warningText.setText('Waiting for finger...')
## Converts read image to characteristics and stores it in charbuffer 1
f.convertImage(0x01)

## Checks if finger is already enrolled


result = f.searchTemplate()
positionNumber = result[0]
if (positionNumber >= 0):
self.warningText.setText('Template already exists at position #' + str(positionNumber))
return None

self.warningText.setText('Remove finger')
time.sleep(2)
self.warningText.setText('Waiting for same finger again...')

## Wait that finger is read again


while (f.readImage() == False):
pass

## Converts read image to characteristics and stores it in charbuffer 2


f.convertImage(0x02)

## Compares the charbuffers


if (f.compareCharacteristics() == 0):
raise Exception('\nFingers do not match')

## Creates a template
f.createTemplate()

## Saves template at new position number


positionNumber = f.storeTemplate()
#r = input("Enter the name = ")

## Loads the found template to charbuffer 1


f.loadTemplate(positionNumber, 0x01)

## Downloads the characteristics of template loaded in charbuffer 1


characterics = str(f.downloadCharacteristics(0x01)).encode('utf-8')

## Hashes characteristics of template


cre_hash = hashlib.sha256(characterics).hexdigest()
#curs = conn.cursor()
curs.execute(
'INSERT INTO voters(name,hashval,position) values(?, ?,?)',
(fullName, cre_hash, positionNumber))
self.warningText.setText('Finger enrolled successfully! \n New template position #' + str(positionNumber))
self.firstNameInput.setReadOnly(True)
self.lastNameInput.setReadOnly(True)
conn.commit()
## conn.close()

except Exception as e:
self.warningText.setText('Operation failed!Exception message: ' + str(e))
return None

def deleteRow(self):
self.warningText.setText("")
if self.firstNameInput.toPlainText() != "" or self.lastNameInput.toPlainText() != "" :
try:
f = PyFingerprint('COM3', 57600, 0xFFFFFFFF, 0x00000000)

if (f.verifyPassword() == False):
raise ValueError('The given fingerprint sensor password is wrong!')
except Exception as e:
self.warningText.setText('The fingerprint sensor could not be initialized!\nException message: ' + str(e))
return None

## Gets some sensor information


# print('Currently used templates: ' + str(f.getTemplateCount()) + '/' + str(f.getStorageCapacity()))

## Tries to delete the template of the finger


try:
firstName = self.firstNameInput.toPlainText().replace(" ", "").lower()
lastName = self.lastNameInput.toPlainText().replace(" ", "").lower()
fullName = firstName + ' ' + lastName

curs.execute("SELECT name, position FROM voters ")


nameList = []
print("im here")

path = os.path.join(image_dir, fullName)


if os.path.exists(path):
path = os.path.join(image_dir, fullName)
#os.rmdir(path)
shutil.rmtree(path)
for columns in curs.fetchall():
nameList.append(columns)
for nameSql, pos in nameList:
if fullName in nameSql:
print(nameSql)
print(pos)
#curs.execute('DELETE FROM voters WHERE position=? ', pos) ## delete the selected row using name
curs.execute('''DELETE FROM voters WHERE position=? ''', (pos,)) ## delete the selected row using name

print("wew")
if (f.deleteTemplate(int(pos)) == True):
self.warningText.setText(nameSql + "\nfingerprint and pictures are deleted")
conn.commit()
return None

self.warningText.setText('Name not found!')


return None
except Exception as e:
print('Operation failed!')
print('Exception message: ' + str(e))
exit(1)
else:
self.warningText.setText('Please put a Name')

app=QApplication(sys.argv)
window = Registration()
window.show()
try:
sys.exit(app.exec_())
conn.close()
except:
print('exiting')
conn.close()
For Voting Code

import time
from PyQt5.QtGui import QImage, QPixmap, QMovie
from PyQt5.QtCore import pyqtSlot, QTimer, Qt, QThread, pyqtSignal
import face_recognition
import os, sys
import cv2
import numpy as np
import math
import pickle
from PyQt5.QtWidgets import *
from PyQt5.uic import loadUi
from pyfingerprint.pyfingerprint import PyFingerprint
from PyQt5 import QtWidgets
from pyFire import *
import sqlite3
import dlib
from printertest import printcandidates
from config import *
#################################################################################################
##Initialize
conn = sqlite3.connect('voting_database.db')
curs = conn.cursor()

BASE_DIR = os.path.dirname(os.path.abspath(__file__))
image_dir = os.path.join(BASE_DIR, "images")

def face_confidence(face_distance, face_match_threshold=0.6):


range = (1.0 - face_match_threshold)
linear_val = (1.0 - face_distance) / (range * 2.0)

if face_distance > face_match_threshold:


return str(round(linear_val * 100, 2)) + '%'
else:
value = (linear_val + ((1.0 - linear_val) * math.pow((linear_val - 0.5) * 2, 0.2))) * 100
return str(round(value, 2)) + '%'

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("model_landmarks/shape_predictor_68_face_landmarks.dat")

# define eye aspect ratio function


def eye_aspect_ratio(eye):
# compute distances between vertical eye landmarks
A = np.linalg.norm(eye[1] - eye[5])
B = np.linalg.norm(eye[2] - eye[4])
# compute distance between horizontal eye landmarks
C = np.linalg.norm(eye[0] - eye[3])
# compute eye aspect ratio
ear = (A + B) / (2.0 * C)
return ear
############################################################################
class Voting(QMainWindow):
def __init__(self, parent = None):
super(Voting, self).__init__(parent)
loadUi('votersGui1.ui', self)
self.addimage()
updateFiretoSql() #Internet
self.initFinger.clicked.connect(self.putFinger)

def addimage(self):
qpixmap = QPixmap('background.png')
self.background.setPixmap(qpixmap)

def putFinger(self):
f = PyFingerprint(fingerprint(), 57600, 0xFFFFFFFF, 0x00000000)
try:
#self.startwords()
# print('Waiting for finger...')
## Wait that finger is read
while (f.readImage() == False):
pass

## Converts read image to characteristics and stores it in charbuffer 1


f.convertImage(0x01)
## Searchs template
result = f.searchTemplate()
positionNumber = result[0]

if (positionNumber == -1):
#widget.setCurrentIndex(widget.currentIndex() + 1)
print("here")
#self.showFinger()
# print("im here")
#return None
#print('No match found!\n Try Again!')

else:
##print('The accuracy score is: ' + str(accuracyScore))
pass

if checkAlreadyVoted(str(positionNumber))==True:#FIREBASE CHECK IF ALREADY VOTED has internet


print(positionNumber)
#if fingerNoInternet(str(positionNumber)) == True: ### no internet
widget.setCurrentIndex(widget.currentIndex()+2)
return None
else:
pass

## Loads the found template to charbuffer 1


f.loadTemplate(positionNumber, 0x01)

curs.execute("SELECT name,position FROM voters")


nameList = []
posList = []
for name, position in curs.fetchall():
nameList.append(name)
posList.append(position)
for i in range(len(posList)):
if str(positionNumber) in posList[i]:
nameShow = nameList[i]
posShow = posList[i]
# self.off = Recognition()
# self.off.captureOff()
capcheck.append("yes")
self.openSecondWindow(nameShow,posShow)
print('Found Match')
widget.setCurrentIndex(widget.currentIndex() + 4)

except Exception as e:
widget.setCurrentIndex(widget.currentIndex() + 1)
print('Operation failed!')
print('Exception message: ' + str(e))

def openSecondWindow(self,name,pos):
self.window7 = selectedResult()
self.window7.namePOS(name,pos)
print(name+pos)
#widget.setCurrentIndex(widget.currentIndex()+1)

class error1(QMainWindow):
def __init__(self):
super(error1, self).__init__()
loadUi('votersGui11.ui', self)
qpixmap = QPixmap('errorsym.png')
self.background.setPixmap(qpixmap)
self.initFinger.clicked.connect(self.home)
self.initFinger_2.clicked.connect(self.face)

def home(self):
widget.setCurrentIndex(widget.currentIndex() - 1)

def face(self):
widget.setCurrentIndex(widget.currentIndex() + 2)

class error2(QMainWindow):
def __init__(self):
super(error2, self).__init__()
loadUi('votersGui12.ui', self)
qpixmap = QPixmap('errorsym.png')
self.background.setPixmap(qpixmap)
self.initFinger.clicked.connect(self.home)

def home(self):
widget.setCurrentIndex(widget.currentIndex() - 2)

############################################################################################

capcheck = []
nameget = []
facecheck = []
class Recognition(QMainWindow):
def __init__(self):
super(Recognition, self).__init__()
loadUi("faceRECOGNITIONWINDOW.ui", self)
self.encoded()
self.webcam.clicked.connect(self.faceREC)
self.exit.clicked.connect(self.quit)

face_locations = []
face_encodings = []
face_names = []
known_face_encodings = []
known_face_names = []

def recclear(self):
self.image.clear()
self.namePrev.clear()
self.camStatus.clear()
self.image.setText(">Press FACE RECOGNITION to Start Authentication\n\n >Press EXIT when you want to Quit")
def quit(self):
self.recclear()
widget.setCurrentIndex(widget.currentIndex() - 3)
@pyqtSlot()
def encoded(self):
faceencodePKL = open('faceencode.pkl', 'rb')
faceencodeloop = pickle.load(faceencodePKL)
for label, face_encoding in faceencodeloop:
self.known_face_encodings.append(face_encoding)
self.known_face_names.append(label)

def faceREC(self): ##### TO FIX UNKNOWN PART##########


video_capture = cv2.VideoCapture(camera())
self.face_names = []
self.face_names.clear()
facetoeyes = True
eyes_closed = False
blink_msg = "Blink twice"
font = cv2.FONT_HERSHEY_SIMPLEX
blink_count = 0
if not video_capture.isOpened():
sys.exit('Video source not found...')
while True:
ret, frame = video_capture.read()

if facetoeyes == True:
small_frame = cv2.resize(frame, (0, 0), fx=0.25, fy=0.25)
rgb_small_frame = small_frame[:, :, ::-1]
self.face_locations = face_recognition.face_locations(rgb_small_frame)
self.face_encodings = face_recognition.face_encodings(rgb_small_frame, self.face_locations)

for face_encoding in self.face_encodings:


matches = face_recognition.compare_faces(self.known_face_encodings, face_encoding)

face_distances = face_recognition.face_distance(self.known_face_encodings, face_encoding)


best_match_index = np.argmin(face_distances)

if matches[best_match_index]:
confidence = face_confidence(face_distances[best_match_index])
numconfidence = float(confidence.strip('%'))
if numconfidence >= 97:
name = self.known_face_names[best_match_index]
else:
name = "Unknown"
self.face_names.append(f'{name}')
if len(self.face_names) > 1:
for i in range(len(self.face_names)-1):
#print("im here")
if self.face_names[i] == self.face_names[i+1]:
#print(self.face_names)
if len(self.face_names) == 3:
print(self.face_names)
self.namePrev.setText(name)
if checkAlreadyVotedName(name) == True: # FIREBASE CHECK IF ALREADY VOTED
#if sqlNoInternet(name) == True: ## NO internet
video_capture.release()
cv2.destroyAllWindows()
return self.exit.setEnabled(True), self.camStatus.setText("already voted")
elif self.face_names[0] == "Unknown":
self.namePrev.setText(name)
video_capture.release()
cv2.destroyAllWindows()
return self.image.setText("Unknown Voter: \n Press Face Recognition to try again or \n Press Exit to Quit"), self.camStatus.setText("Not Registered"), self.exit.setEnabled(True)
else:
facetoeyes = False
else:
self.face_names.clear()

for (top, right, bottom, left), name in zip(self.face_locations, self.face_names):


top *= 4
right *= 4
bottom *= 4
left *= 4

cv2.rectangle(frame, (left, top), (right, bottom), (0, 0, 255), 2)


cv2.rectangle(frame, (left, bottom - 35), (right, bottom), (0, 0, 255), cv2.FILLED)
cv2.putText(frame, name, (left + 6, bottom - 6), cv2.FONT_HERSHEY_DUPLEX, 0.8, (255, 255, 255), 1)

else:

# convert frame to grayscale


gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# detect faces in grayscale frame


faces = detector(gray, 0)

# iterate over detected faces


for face in faces:
# detect landmarks in face region
landmarks = predictor(gray, face)

# extract left and right eye landmarks


left_eye = np.array([[landmarks.part(36).x, landmarks.part(36).y],
[landmarks.part(37).x, landmarks.part(37).y],
[landmarks.part(38).x, landmarks.part(38).y],
[landmarks.part(39).x, landmarks.part(39).y],
[landmarks.part(40).x, landmarks.part(40).y],
[landmarks.part(41).x, landmarks.part(41).y]], dtype=np.float32)
right_eye = np.array([[landmarks.part(42).x, landmarks.part(42).y],
[landmarks.part(43).x, landmarks.part(43).y],
[landmarks.part(44).x, landmarks.part(44).y],
[landmarks.part(45).x, landmarks.part(45).y],
[landmarks.part(46).x, landmarks.part(46).y],
[landmarks.part(47).x, landmarks.part(47).y]], dtype=np.float32)

# compute eye aspect ratio for left and right eyes


ear_left = eye_aspect_ratio(left_eye)
ear_right = eye_aspect_ratio(right_eye)
ear_avg = (ear_left + ear_right) / 2.0

# determine if eyes are closed


if ear_avg < 0.2:
eyes_closed = True
else:
# eyes are open
if eyes_closed:
# increment blink count and reset flag
blink_count += 1
eyes_closed = False
if blink_count == 2:
video_capture.release()
cv2.destroyAllWindows()
print(facecheck)
self.namePrev.setText(name)
return self.camStatus.setText("Found Match"), self.exit.setEnabled(False),self.proceedVote()

#draw bounding box around face


cv2.rectangle(frame, (face.left(), face.top()), (face.right(), face.bottom()), (0, 255, 0), 2)

# display blink count above frame


cv2.putText(frame, "Blink count: {}".format(blink_count), (20, 60), font, 1, (0, 255, 0), 2)

# display "Blink twice" message above frame


cv2.putText(frame, blink_msg, (20, 30), font, 1, (0, 255, 0), 2)

cv2.imshow('Face Recognition', frame)


if cv2.waitKey(1) == ord('q'):
break

def proceedVote(self):
curs.execute("SELECT name,position FROM voters")
nameList = []
posList = []

for name, position in curs.fetchall():


nameList.append(name)
posList.append(position)
for i in range(len(nameList)):
if self.namePrev.text() == nameList[i]:
nameShow = nameList[i]
posShow = posList[i]
print(nameShow)
print(posShow)
print('Found Match')
self.nextwindow(nameShow, posShow)

def nextwindow(self, name, pos):


self.window7 = selectedResult()
self.window7.namePOS(name, pos)
self.namePrev.clear()
self.camStatus.clear()
self.exit.setEnabled(True)
self.image.setText(">Press FACE RECOGNITION to Start Authentication\n\n >Press EXIT when you want to Quit")
widget.setCurrentIndex(widget.currentIndex() + 1)

############################################################################################
getResult = []
namE = []
poS = []

class instructionVoting(QMainWindow):
def __init__(self):
super(instructionVoting, self).__init__()
loadUi('votersGui2.ui', self)
self.nextButton.clicked.connect(self.presvote)

def presvote(self):
widget.setCurrentIndex(widget.currentIndex() + 1)

class presVote(QMainWindow):

def __init__(self):
super(presVote, self).__init__()
loadUi('votersPres.ui', self)
self.pres1.setCheckable(True)
self.pres2.setCheckable(True)
self.pres3.setCheckable(True)
self.pres4.setCheckable(True)
self.pres1.clicked.connect(self.togglepres1)
self.pres2.clicked.connect(self.togglepres2)
self.pres3.clicked.connect(self.togglepres3)
self.pres4.clicked.connect(self.togglepres4)
self.nextPres.clicked.connect(self.nextButPres)

def untoggle(self):
for statusBut in [self.pres1,self.pres2,self.pres3,self.pres4]:
if statusBut.isChecked() == True:
statusBut.toggle()

def togglepres1(self):
if self.pres1.isChecked():
for statusBut in [self.pres2,self.pres3,self.pres4]:
if statusBut.isChecked() == True:
statusBut.toggle()

def togglepres2(self):
if self.pres2.isChecked():
for statusBut in [self.pres1, self.pres3, self.pres4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def togglepres3(self):
if self.pres3.isChecked():
for statusBut in [self.pres1, self.pres2, self.pres4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def togglepres4(self):
if self.pres4.isChecked():
for statusBut in [self.pres1, self.pres2, self.pres3]:
if statusBut.isChecked() == True:
statusBut.toggle()

def nextButPres(self):
for statusBut in [self.pres1, self.pres2, self.pres3, self.pres4]:
if statusBut.isChecked() == True:
#print(statusBut.text())
self.showVote(statusBut.text())
self.untoggle()
widget.setCurrentIndex(widget.currentIndex() + 1)

def showVote(self, getVote):


window7 = selectedResult()
window7.selected(getVote)

class internalVP(QMainWindow):

def __init__(self):
super(internalVP, self).__init__()
loadUi('votersInternalVice.ui', self)
self.vice1.setCheckable(True)
self.vice2.setCheckable(True)
self.vice3.setCheckable(True)
self.vice4.setCheckable(True)
self.vice1.clicked.connect(self.togglevice1)
self.vice2.clicked.connect(self.togglevice2)
self.vice3.clicked.connect(self.togglevice3)
self.vice4.clicked.connect(self.togglevice4)
self.backVice.clicked.connect(self.backButVice)
self.nextVice.clicked.connect(self.nextButVice)

def untoggle(self):
for statusBut in [self.vice1, self.vice2, self.vice3, self.vice4]:
if statusBut.isChecked() == True:
statusBut.toggle()

def togglevice1(self):
if self.vice1.isChecked():
for statusBut in [self.vice2,self.vice3,self.vice4]:
if statusBut.isChecked() == True:
statusBut.toggle()

def togglevice2(self):
if self.vice2.isChecked():
for statusBut in [self.vice1, self.vice3, self.vice4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def togglevice3(self):
if self.vice3.isChecked():
for statusBut in [self.vice1, self.vice2, self.vice4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def togglevice4(self):
if self.vice4.isChecked():
for statusBut in [self.vice1, self.vice2, self.vice3]:
if statusBut.isChecked() == True:
statusBut.toggle()

def nextButVice(self):
for statusBut in [self.vice1, self.vice2, self.vice3, self.vice4]:
if statusBut.isChecked() == True:
self.showVote(statusBut.text())
self.untoggle()
widget.setCurrentIndex(widget.currentIndex() + 1)

def showVote(self, getVote):


window7 = selectedResult()
window7.selected(getVote)

def backButVice(self):
getResult.pop()
self.untoggle()
widget.setCurrentIndex(widget.currentIndex() - 1)

class externalVP(QMainWindow):

def __init__(self):
super(externalVP, self).__init__()
loadUi('votersExternalVice.ui', self)
self.vice1.setCheckable(True)
self.vice2.setCheckable(True)
self.vice3.setCheckable(True)
self.vice4.setCheckable(True)
self.vice1.clicked.connect(self.togglevice1)
self.vice2.clicked.connect(self.togglevice2)
self.vice3.clicked.connect(self.togglevice3)
self.vice4.clicked.connect(self.togglevice4)
self.backVice.clicked.connect(self.backButVice)
self.nextVice.clicked.connect(self.nextButVice)

def untoggle(self):
for statusBut in [self.vice1, self.vice2, self.vice3, self.vice4]:
if statusBut.isChecked() == True:
statusBut.toggle()

def togglevice1(self):
if self.vice1.isChecked():
for statusBut in [self.vice2,self.vice3,self.vice4]:
if statusBut.isChecked() == True:
statusBut.toggle()

def togglevice2(self):
if self.vice2.isChecked():
for statusBut in [self.vice1, self.vice3, self.vice4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def togglevice3(self):
if self.vice3.isChecked():
for statusBut in [self.vice1, self.vice2, self.vice4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def togglevice4(self):
if self.vice4.isChecked():
for statusBut in [self.vice1, self.vice2, self.vice3]:
if statusBut.isChecked() == True:
statusBut.toggle()

def nextButVice(self):
for statusBut in [self.vice1, self.vice2, self.vice3, self.vice4]:
if statusBut.isChecked() == True:
self.showVote(statusBut.text())
self.untoggle()
widget.setCurrentIndex(widget.currentIndex() + 1)

def showVote(self, getVote):


window7 = selectedResult()
window7.selected(getVote)

def backButVice(self):
getResult.pop()
self.untoggle()
widget.setCurrentIndex(widget.currentIndex() - 1)

class secVote(QMainWindow):

def __init__(self):
super(secVote, self).__init__()
loadUi('votersSec.ui', self)
self.sec1.setCheckable(True)
self.sec2.setCheckable(True)
self.sec3.setCheckable(True)
self.sec4.setCheckable(True)
self.sec1.clicked.connect(self.togglesec1)
self.sec2.clicked.connect(self.togglesec2)
self.sec3.clicked.connect(self.togglesec3)
self.sec4.clicked.connect(self.togglesec4)
self.backSec.clicked.connect(self.backButSec)
self.nextSec.clicked.connect(self.nextButSec)

def untoggle(self):
for statusBut in [self.sec1, self.sec2, self.sec3, self.sec4]:
if statusBut.isChecked() == True:
statusBut.toggle()

def togglesec1(self):
if self.sec1.isChecked():
for statusBut in [self.sec2,self.sec3,self.sec4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def togglesec2(self):
if self.sec2.isChecked():
for statusBut in [self.sec1, self.sec3, self.sec4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def togglesec3(self):
if self.sec3.isChecked():
for statusBut in [self.sec1, self.sec2, self.sec4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def togglesec4(self):
if self.sec4.isChecked():
for statusBut in [self.sec1, self.sec2, self.sec3]:
if statusBut.isChecked() == True:
statusBut.toggle()

def nextButSec(self):
for statusBut in [self.sec1, self.sec2, self.sec3, self.sec4]:
if statusBut.isChecked() == True:
#print(statusBut.text())
self.showVote(statusBut.text())
self.untoggle()
widget.setCurrentIndex(widget.currentIndex() + 1)

def showVote(self, getVote):


window7 = selectedResult()
window7.selected(getVote)

def backButSec(self):
getResult.pop()
self.untoggle()
widget.setCurrentIndex(widget.currentIndex() - 1)

class treaVote(QMainWindow):

def __init__(self):
super(treaVote, self).__init__()
loadUi('votersTrea.ui', self)
self.trea1.setCheckable(True)
self.trea2.setCheckable(True)
self.trea3.setCheckable(True)
self.trea4.setCheckable(True)
self.trea1.clicked.connect(self.toggletrea1)
self.trea2.clicked.connect(self.toggletrea2)
self.trea3.clicked.connect(self.toggletrea3)
self.trea4.clicked.connect(self.toggletrea4)
self.backTrea.clicked.connect(self.backButTrea)
self.nextTrea.clicked.connect(self.nextButTrea)

def untoggle(self):
for statusBut in [self.trea1, self.trea2, self.trea3, self.trea4]:
if statusBut.isChecked() == True:
statusBut.toggle()

def toggletrea1(self):
if self.trea1.isChecked():
for statusBut in [self.trea2,self.trea3,self.trea4]:
if statusBut.isChecked() == True:
statusBut.toggle()

def toggletrea2(self):
if self.trea2.isChecked():
for statusBut in [self.trea1, self.trea3, self.trea4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def toggletrea3(self):
if self.trea3.isChecked():
for statusBut in [self.trea1, self.trea2, self.trea4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def toggletrea4(self):
if self.trea4.isChecked():
for statusBut in [self.trea1, self.trea2, self.trea3]:
if statusBut.isChecked() == True:
statusBut.toggle()

def nextButTrea(self):
for statusBut in [self.trea1, self.trea2, self.trea3, self.trea4]:
if statusBut.isChecked() == True:
self.showVote(statusBut.text())
self.untoggle()
#print("Im here")
widget.setCurrentIndex(widget.currentIndex() + 1)

def showVote(self,getVote):
window7 = selectedResult()
window7.selected(getVote)

def backButTrea(self):
getResult.pop()
self.untoggle()
widget.setCurrentIndex(widget.currentIndex() - 1)

class auditor(QMainWindow):

def __init__(self):
super(auditor, self).__init__()
loadUi('votersAuditor.ui', self)
self.trea1.setCheckable(True)
self.trea2.setCheckable(True)
self.trea3.setCheckable(True)
self.trea4.setCheckable(True)
self.trea1.clicked.connect(self.toggletrea1)
self.trea2.clicked.connect(self.toggletrea2)
self.trea3.clicked.connect(self.toggletrea3)
self.trea4.clicked.connect(self.toggletrea4)
self.backTrea.clicked.connect(self.backButTrea)
self.nextTrea.clicked.connect(self.nextButTrea)

def untoggle(self):
for statusBut in [self.trea1, self.trea2, self.trea3, self.trea4]:
if statusBut.isChecked() == True:
statusBut.toggle()

def toggletrea1(self):
if self.trea1.isChecked():
for statusBut in [self.trea2,self.trea3,self.trea4]:
if statusBut.isChecked() == True:
statusBut.toggle()

def toggletrea2(self):
if self.trea2.isChecked():
for statusBut in [self.trea1, self.trea3, self.trea4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def toggletrea3(self):
if self.trea3.isChecked():
for statusBut in [self.trea1, self.trea2, self.trea4]:
if statusBut.isChecked() == True:
statusBut.toggle()
def toggletrea4(self):
if self.trea4.isChecked():
for statusBut in [self.trea1, self.trea2, self.trea3]:
if statusBut.isChecked() == True:
statusBut.toggle()

def nextButTrea(self):
for statusBut in [self.trea1, self.trea2, self.trea3, self.trea4]:
if statusBut.isChecked() == True:
self.showVote(statusBut.text())
window7.result()
self.untoggle()
widget.setCurrentIndex(widget.currentIndex() + 1)

def showVote(self,getVote):
window7 = selectedResult()
window7.selected(getVote)

def backButTrea(self):
getResult.pop()
widget.setCurrentIndex(widget.currentIndex() - 1)

class selectedResult(QMainWindow):

def __init__(self):
super(selectedResult, self).__init__()
loadUi('votersPreview.ui', self)
self.backSelect.clicked.connect(self.backButSelect)
self.finalizeSelect.clicked.connect(self.finalizeVote)

def selected(self,candidates):
getResult.append(candidates)

def result(self):
self.presSelect.setText(getResult[0])
self.internalSelect.setText(getResult[1])
self.externalSelect.setText(getResult[2])
self.secSelect.setText(getResult[3])
self.treaSelect.setText(getResult[4])
self.auditorSelect.setText(getResult[5])

def backButSelect(self):
getResult.pop()
widget.setCurrentIndex(widget.currentIndex() - 1)

def namePOS(self, name, pos):


namE.append(name)
poS.append(pos)
print(namE)
print(poS)

def finalizeVote(self):
for pos in poS:
poss = pos
for name in namE:
namee = name
print(getResult)
finalizeFire(namee, poss ,getResult[0],getResult[1],getResult[2],getResult[3],getResult[4],getResult[5]) # internet
#finalizeNoInternet(namee, poss ,getResult[0],getResult[1],getResult[2],getResult[3],getResult[4],getResult[5]) #NO internet
printcandidates(getResult[0],getResult[1],getResult[2],getResult[3],getResult[4],getResult[5],namee)
updateFiretoSql() ####### internet

poS.clear()
getResult.clear()
namE.clear()
capcheck.clear()
nameget.clear()
widget.setCurrentIndex(widget.currentIndex() + 1)

class thankyou(QMainWindow):

def __init__(self):
super(thankyou, self).__init__()
loadUi('thankyou.ui', self)
self.done.clicked.connect(self.done1)

def done1(self):
widget.setCurrentIndex(0)

app=QApplication(sys.argv)
window = Voting()
window1 = error1()
window12 = error2()
windowRec = Recognition()
window2 = instructionVoting()
window3 = presVote()
window4 = internalVP()
window44 = externalVP()
window5 = secVote()
window6 = treaVote()
window67 = auditor()
window7 = selectedResult()
window8 = thankyou()

widget = QtWidgets.QStackedWidget()
widget.addWidget(window)
widget.addWidget(window1)
widget.addWidget(window12)
widget.addWidget(windowRec)
widget.addWidget(window2)
widget.addWidget(window3)
widget.addWidget(window4)
widget.addWidget(window44)
widget.addWidget(window5)
widget.addWidget(window6)
widget.addWidget(window67)
widget.addWidget(window7)
widget.addWidget(window8)
window.setFixedWidth(720)
window.setFixedHeight(480)
# windowRec.setFixedWidth(800)
# windowRec.setFixedHeight(600)
#widget.showFullScreen()
widget.show()

try:
sys.exit(app.exec_())
conn.close()
except:
print('exiting')
conn.close()
For Cloud Database Code
import sqlite3
import pyrebase
import csv
import pandas as pd
from urllib import request
from printertest import totalVotes
config = {
"apiKey": "AIzaSyCeHmYk0JWMGrqaY7zE12TagjEklxdH8ME",
"authDomain": "evmthesis.firebaseapp.com",
"databaseURL": "https://evmthesis-default-rtdb.firebaseio.com",
"projectId": "evmthesis",
"storageBucket": "evmthesis.appspot.com",
"messagingSenderId": "762023483827",
"appId": "1:762023483827:web:eef8e7b36420e575fc53c6",
"measurementId": "G-0941KERDG6"
}

firebase=pyrebase.initialize_app(config)
db = firebase.database()

conn = sqlite3.connect('voting_database.db')
curs = conn.cursor()

def checkAlreadyVoted(posnumber):
for result in db.child("thesis").get().each():
for pos in db.child("thesis").child(result.key()).get().each():
if pos.val() == posnumber:
return True

def checkAlreadyVotedName(name):
for result in db.child("thesis").get().each():
if result.key() == name:
return True

def finalizeFire(name,pos,pres,internal,external,sec,trea,auditor):
posdata = {"pos": pos}
data = {"pres": pres, "internalVP": internal, "exteralVP": external, "secretary": sec, "trea": trea, "auditor": auditor}
db.child("thesis").child(name).set(posdata)
db.child("thesis").child(name).child('selection').set(data)
print('DONE!')
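
finalizeFire writes one node per voter under the "thesis" key: the fingerprint template position is stored directly on the node, and the six selections go under a "selection" child, which is the structure that checkAlreadyVoted and getDataFiretoSQL traverse. Under this scheme the realtime database takes roughly the following shape (voter and candidate names below are hypothetical):

# thesis/
#   Juan Dela Cruz/
#     pos: "5"
#     selection/
#       pres: "Candidate A"
#       internalVP: "Candidate B"
#       exteralVP: "Candidate C"   # key spelling follows the dictionary above
#       secretary: "Candidate D"
#       trea: "Candidate E"
#       auditor: "Candidate F"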

def getDataFiretoSQL():
data = []
for result in db.child("thesis").get():
if result.key() != "zero zero":
dataParent = []
for pos in db.child("thesis").child(result.key()).get():
if pos.key() == "pos":
dataParent.append(result.key())
dataParent.append(pos.val())

for selection in db.child("thesis").child(result.key()).child("selection").get():


dataParent.append(selection.val())
data.append(dataParent)
return data

def updateFiretoSql():
data = getDataFiretoSQL()
curs.execute("DELETE FROM syncvotes;")
sql = """INSERT INTO syncvotes(name,pos,pres,internal,external,sec,trea,auditor) VALUES(?,?,?,?,?,?,?,?);"""
curs.executemany(sql, data)
conn.commit()
print("Firebase data updated to SQL syncvotes") # sync firebase and sql

def votecountsPrintCSV():
sqlquery = "SELECT * FROM syncvotes"
df = pd.read_sql_query(sqlquery, conn)
df.to_csv('votes.csv', index=False)  # CSV file that can be opened in Microsoft Excel

def voteCounts():
presA = []
inviceB = []
exviceB = []
secC = []
treaD = []
audE = []
curs.execute("SELECT pres,internal,external,sec,trea,auditor FROM syncvotes")

for pres,internal,external,sec,trea,auditor in curs.fetchall():


presA.append(pres)
inviceB.append(internal)
exviceB.append(external)
secC.append(sec)
treaD.append(trea)
audE.append(auditor)
print("done1")
A = {x: presA.count(x) for x in presA}
B = {x: inviceB.count(x) for x in inviceB}
BB = {x: exviceB.count(x) for x in exviceB}
C = {x: secC.count(x) for x in secC}
D = {x: treaD.count(x) for x in treaD}
E = {x: audE.count(x) for x in audE}
totalVotes(A,B,BB,C,D,E) ## to print the tally
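
The dictionary comprehensions above tally the votes per position by counting how many times each candidate string appears in the corresponding column of syncvotes. A minimal, hypothetical illustration of the counting step:

presA = ["Candidate A", "Candidate B", "Candidate A"]   # hypothetical president column
A = {x: presA.count(x) for x in presA}
print(A)   # {'Candidate A': 2, 'Candidate B': 1}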

############# NO INTERNET #################################


def sqlNoInternet(names): #no internet check voted already
curs.execute("SELECT name FROM syncvotes")
checknames = []
for name in curs.fetchall():
checknames.append(name)
for i in range(len(checknames)):
if names in checknames[i]:
conn.commit()
return True

def finalizeNoInternet(namee,poss ,result1,result2,result3,result4,result5,result6):


print("here")
curs.execute("INSERT INTO syncvotes(name,pos,pres,internal,external,sec,trea,auditor)
VALUES(?,?,?,?,?,?,?,?);",(namee,poss ,result1,result2,result3,result4,result5,result6))
print("here2")
conn.commit()

def fingerNoInternet(poss):
curs.execute("SELECT pos FROM syncvotes")
checkpos = []
for pos in curs.fetchall():
checkpos.append(pos)
for i in range(len(checkpos)):
if poss in checkpos[i]:
conn.commit()
print("yes")
return True
For Thermal Printer Code
import adafruit_thermal_printer
import serial
from config import *

uart = serial.Serial(printer(), baudrate=9600, timeout=3000)

ThermalPrinter = adafruit_thermal_printer.get_printer_class(2.69)
printer = ThermalPrinter(uart, auto_warm_up=True)
printer.warm_up(120)

def printcandidates(pres,internal,external,sec,trea,auditor,candidatename):
printer.bold = True
printer.underline = adafruit_thermal_printer.UNDERLINE_THICK
printer.size = adafruit_thermal_printer.SIZE_MEDIUM
printer.justify = adafruit_thermal_printer.JUSTIFY_CENTER
printer.bold = True
printer.print('CANDIDATES SELECTED')
printer.bold = True
printer.print(candidatename)
printer.feed(1)
printer.underline = None
printer.size = adafruit_thermal_printer.SIZE_SMALL
printer.justify = adafruit_thermal_printer.JUSTIFY_LEFT
printer.bold = True
printer.print('President: ')
printer.print(pres)
printer.bold = True
printer.print('Internal Vice-President: ')
printer.print(internal)
printer.bold = True
printer.print('External Vice-President: ')
printer.print(external)
printer.bold = True
printer.print('Secretary: ')
printer.print(sec)
printer.bold = True
printer.print('Treasurer: ')
printer.print(trea)
printer.bold = True
printer.print('Auditor: ')
printer.print(auditor)
printer.bold = True
printer.feed(3)
printer.bold = False
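
For reference, printcandidates produces the voter's receipt once the ballot is finalized: a centered "CANDIDATES SELECTED" header, the voter's name, and then each position label followed by the selected candidate on the next line, ending with a paper feed. The layout is roughly as sketched below (voter and candidate names are hypothetical):

# CANDIDATES SELECTED
# Juan Dela Cruz
#
# President:
# Candidate A
# (the five remaining positions follow the same label/candidate pattern)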

def totalVotes(pres,internal,external,sec,trea,auditor):
printer.bold = True
printer.underline = adafruit_thermal_printer.UNDERLINE_THICK
printer.size = adafruit_thermal_printer.SIZE_MEDIUM
printer.justify = adafruit_thermal_printer.JUSTIFY_CENTER
printer.print('Vote Counts')
printer.feed(1)
printer.underline = None
printer.size = adafruit_thermal_printer.SIZE_SMALL
printer.justify = adafruit_thermal_printer.JUSTIFY_LEFT
printer.print('President:')
for Akey, Aval in pres.items():
printer.print(f"{Akey} = {Aval}")
printer.print('\nInternal VP:')
for Bkey, Bval in internal.items():
printer.print(f"{Bkey} = {Bval}")
printer.print('\nExternal VP:')
for BBkey, BBval in external.items():
printer.print(f"{BBkey} = {BBval}")
printer.print('\nSecretary:')
for Ckey, Cval in sec.items():
printer.print(f"{Ckey} = {Cval}")
printer.print('\nTreasurer:')
for Dkey, Dval in trea.items():
printer.print(f"{Dkey} = {Dval}")
printer.print('\nAuditor:')
for Ekey, Eval in auditor.items():
printer.print(f"{Ekey} = {Eval}")
printer.feed(4)
printer.bold = False
For Configuration Code
def fingerprint():
return str("COM14")

def printer():
return str("COM9")

def camera():
return int(0)
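
The configuration module isolates the serial-port and camera assignments so the rest of the code stays unchanged between machines; the COM ports above correspond to a Windows setup. A minimal sketch of what a Raspberry Pi variant might look like, assuming the fingerprint sensor and thermal printer are attached through USB-to-serial adapters (the device paths are assumptions and depend on the actual wiring):

# Hypothetical Raspberry Pi configuration; device paths are assumptions, not from the study
def fingerprint():
    return "/dev/ttyUSB0"   # Adafruit fingerprint sensor

def printer():
    return "/dev/ttyUSB1"   # thermal printer

def camera():
    return 0                # index of the first attached camera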
APPENDIX F
Informed Consent

Title of Research: Fingerprint and Face Authentication Portable Digital Electronic Voting Machine
Investigators, Affiliations, and Contact Information:

John Carlo S. Lisondato
University of Mindanao
College of Engineering Education
Electronics Engineering Program
Davao City, Philippines
j.lisondato.488253@umindanao.edu.ph

Princess Diane L. Maboloc
University of Mindanao
College of Engineering Education
Electronics Engineering Program
Davao City, Philippines
p.maboloc.499584@umindanao.edu.ph

Brandon Sage M. Cagape
University of Mindanao
College of Engineering Education
Electronics Engineering Program
Davao City, Philippines
b.cagape.513587@umindanao.edu.ph

Introduction and Purpose of the Study


Our study proposes to create an Electronic Voting Machine (EVM) that uses two
authentication methods, fingerprint and face recognition, to prevent fraud and the possibility
of ghost voting. Producing an EVM will also remove the need to print paper ballots, a process
in which the production of defective ballots is unavoidable.

Description of the Research


By agreeing to participate in our study, you will be asked to register basic personal
information such as your name and contact details. You will then be asked to register your
fingerprint and face biometrics in our database. After the registration process, you will be
asked to participate in our accuracy testing, in which we will check over several trials
whether the registered voter is recognized by our authentication methods (fingerprint and
face). Finally, you will be asked to attend a mock election to test the functionality of our
device.

Subject Participation
Twenty (20) participants are needed for this study. Each must be a bona fide student of the
University of Mindanao for the second semester of the academic year 2022-2023 and enrolled
in the BS Electronics Engineering program. Participants will be involved in three meetings in
total, each approximately 30 minutes to 1 hour in length.

Potential Risks and Discomforts


There are no known risks associated with participation in this study.

Confidentiality
All information collected from the study will be coded to protect each subject's name.
Although your information will be stored in our database because it is needed for the
authentication processes, no names or other identifying information will be used when
discussing or reporting data. The researchers will keep all data gathered in the said database
secure, and only the researchers can access it. Once the data has been fully analyzed, it will be
destroyed.
Authorization

By signing this form, you authorize the use and disclosure of the following information
for this research:

I authorize the use of the data collected from me during this study for the sole purpose of the
said research.

Voluntary Participation and Authorization


Your decision to participate in this study is completely voluntary. If you decide not to participate
in this study, it will not affect you in any way.

Withdrawal from the Study and/or Withdrawal of Authorization


If you decide to participate in this study, you may withdraw from your participation at any
time without penalty by informing us, the researchers. However, any data collected prior to
the withdrawal will still be included in the study.

Cost
There is no cost for participating in this study.

I voluntarily agree to participate in this research study.


□ Yes
□ No
I understand that I will be given a copy of this signed Consent Form.

Name of Participant

Signature: Date Signed:

Name of Witness

Signature: Date Signed:

Person Obtaining Consent

Signature: Date Signed:


APPENDIX G
JOHN CARLO LISONDATO
Address: Northcrest Subd., Cabantian, Davao City
Contact Number: +639491383815
Email Address: j.lisondato.488253@umindanao.edu.ph

Objective
• I am eager to find a job that aligns with my educational background, allowing me to gain
valuable experience and enhance my professional growth.

Education
• Bachelor of Science in Electronics Engineering
University of Mindanao, Matina, Davao City
October 2018 – Present

Affiliations
• Society of Electronics Engineering Students (SECES)
Member
2019 - Present
• Junior Institute of Electronics Engineers of the Philippines - Southern Mindanao
Sector
Member
2018 – Present
PRINCESS DIANE MABOLOC
Address: 37-A Andaya St., Daliao, Toril, Davao City
Contact Number: +639971104688
Email Address: p.maboloc.499584@umindanao.edu.ph

Objective
• To seize job opportunities that will nurture my capabilities and skills as a future professional
electronics engineer.

Education
• Bachelor of Science in Electronics Engineering
University of Mindanao, Matina, Davao City
October 2018 – Present

Affiliations
• Society of Electronics Engineering Students (SECES)
Member
2020 - Present

• Junior Institute of Electronics Engineers of the Philippines - Southern Mindanao


Sector
Member
2018 – Present
BRANDON SAGE M. CAGAPE
Address: Catalunan Pequeño, Davao City
Contact Number: +639954940385
Email Address: b.cagape.513587@umindanao.edu.ph

Objective
• To broaden and enhance my skill set in engineering in order to become a competitive
professional electronics engineer.

Education
• Bachelor of Science in Electronics Engineering
University of Mindanao, Matina, Davao City
June 2019 – Present

Affiliations
• Society of Electronics Engineering Students (SECES)
Member
2020 - Present

• Junior Institute of Electronics Engineers of the Philippines - Southern Mindanao


Sector
Member
2019 – Present
