Android-Based Specialized Answer Sheet Checker For Objective Type Questions Using Optical Character Recognition (February 2020)
Abstract— An Android-based automated scorer of answer sheets was developed to address the problem of scoring test papers.
The application is designed to check tests with objective-type answers written on a specialized answer sheet using vision-based
optical character recognition. The application can also save the results to portable document format (PDF) and Excel files. The
study was split into three (3) parts: designing the specialized answer sheet, developing the application, and testing the built
system. The researchers designed a customized, standard answer sheet for objective-type questionnaires. Android Studio was
used as the platform for programming the application. Application development comprised five (5) parts: main application
design, OCR system development, data and file storage system development, results-to-PDF writer development, and
results-to-Excel writer development. After the development of the mobile application, two types of test were conducted:
(i) accuracy and (ii) acceptability. The tests were carried out to determine the highest average accuracy rate of the application,
together with a rating-scale survey to evaluate its acceptability. In the accuracy test, three (3) camera specifications (8 MP,
13 MP, and 16 MP) and a sample size of sixty-one (61) students were considered. Results show that the application attained a
highest average accuracy rate of 61.3115% on the given samples and an overall rating of 4.5 out of 5 from the survey
respondents. The researchers expect future work to innovate on the application and improve its functionality.
Index Terms— Android, Automated scoring, Causal mode filter, Objective-type test, Optical Character Recognition, PDF writer,
Shared Preferences object storage
I. INTRODUCTION
Examination plays a vital part in a school's education system. Not only does it test students' knowledge, but it is also used to
determine the actions a teacher or institution must take to improve students' competence. Today, there are many ways to
conduct an examination, differing in the strategies and tasks used to assess students' skills and knowledge [1]. However, the
burden of evaluating and checking test papers is usually carried by the teachers, especially when evaluating the test papers of
a large class. This is why automated scoring of answer sheets is helpful in addressing this problem, with promising accuracy
in evaluating examination papers. Automatic answer sheet checkers usually evaluate answer sheets using machines with a
scanner. Most existing answer sheet scanners use bubble-sheet multiple-choice tests, because these can be easily recognized
by machines through Optical Mark Recognition (OMR), the analysis of "human-marked documents" and their translation into
machine-readable data [2]. Scantron Corporation manufactures OMR scanners, the OpScan® Series, used for checking
multiple-choice tests. These scanners are currently used as grading machines for the Scholastic Aptitude Test (SAT) in 98%
of institutions in the United States [3].
Due to the continuous improvement of technology, automated answer sheet checking is already possible through mobile
phones. Using the same OMR approach, checking of multiple-choice answer sheets can now be done with an Android grading
app that uses the phone's camera to scan the document instead of a bulky scanner. Such a feature offers portability, since
mobile phones can be carried anywhere, and cost-friendliness, given its potential as an alternative to expensive OMR
scanners [4].
These existing answer-sheet-checking systems are limited to multiple-choice questions using a bubble sheet and an OMR
system for data collection and analysis. Other types of test, such as objective-type questions, cannot use the OMR system,
since handwritten characters vary across examinees and lack the regular patterns that OMR recognizes easily. The present
study focused on developing an automated answer sheet checker for exams with objective-type questions using a vision-based
character recognition technology called Optical Character Recognition (OCR). It is also an Android-based system, thus
ensuring portability and cost effectiveness. The whole system involves converting an input image of the specialized answer
sheet into text, automatically checking the specialized answer sheet, and storing the results in the phone's storage. With this
system, the user, specifically the teacher or proctor, can check examinations without consuming so much time, especially
objective-type exams that require a lot of effort and time.
BISCAST College of Engineering and Architecture – ECE Department
II. METHODOLOGY
A. Designing the Specialized Answer Sheet
The researchers designed their own customized, standard answer sheet for objective-type questionnaires. Using a computer,
the researchers created a template document in Microsoft Word, the customized answer sheet, designed so that it can be
easily recognized by the system's image acquisition component. The student name and subject code are used to identify the
owner of the paper and categorize it based on the class record stored in the application's data storage. The answer sheet can
accommodate 30 items and has two perforated sections designed to allow folding, hiding the item numbers and the student's
identification section once the user starts scanning the answer sheet. Answers should be placed inside the box beside the
corresponding item number.
The researchers constructed a data and file storage protocol for saving app data using different data storage options.
SharedPreferences was used to store the RecyclerView lists in the Home Screen UI, since these lists do not require structured
storage. It was also used to store the administrator's username and password, the Key-to-Corrections items, and the lists of
Class Records and Class Profiles. In creating the PDF file for printing results, the researchers used shared external file
storage as the save destination of the PDF file, as this data storage option is easily accessible to the user. To save files in
external storage, the researchers declared the WRITE_EXTERNAL_STORAGE permission in order to request permission
from the phone, enabling the application to access and add files to the external storage.
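For reference, the permission named above is declared in the app manifest; the fragment below shows the standard Android declaration (an illustrative sketch, not taken from the project's actual manifest; on Android 6.0 and above the app must also request this permission at runtime):

```xml
<!-- AndroidManifest.xml fragment: legacy external-storage write permission,
     required on Android 9 and below for saving files to shared storage -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
```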
TABLE I
SMARTPHONE SPECIFICATIONS

Phone   | Operating System | Back Camera Specification | Resolution
Phone A | Android 7.0      | 8 Megapixels              | 720 x 1280
Phone B | Android 8.1      | 13 Megapixels             | 720 x 1520
Phone C | Android 9.0      | 16 Megapixels             | 720 x 1544
The evaluation of the project was divided into two parts: an accuracy test and an acceptability test. In the accuracy test, the
researchers collected handwriting samples from a number of selected students, while the acceptability test was conducted by
preparing a feedback or evaluation form on which the respondents could assess the application's overall performance and
features.
For the accuracy test, the researchers considered the population of Grade 8 students of Concepcion Pequeña National High
School as the respondents of the study. A sample size of 61 students was subjected to experimentation. The teachers gave a
short examination to be answered by the students using the answer sheet provided by the researchers. Instructions were
given to the students, such as writing the answers in uppercase letters and numbers, writing them cleanly and legibly on the
answer sheet, and writing the word "none" instead of leaving an item blank. Filled-up answer sheets were then run through
the application to determine its accuracy in recognizing handwritten text. The researchers also prepared a set-up to find out
whether the camera specification of the phone had a significant impact on the results. The percent error formula was used to
find the relative accuracy with respect to the correct recognition of each sample. The percent error formula (in this case) is
given by:
%error = ((T − M) / T) × 100%

where T is the total number of items and M is the number of correctly identified items. Subtracting the percent error from
100% gives the percent accuracy of the sample. The researchers computed the average percent accuracy across the
respondents to determine the overall percent accuracy of the application on the given handwriting samples. Statistical
testing was done by determining the significance of the mean differences between the set-ups. Another test was conducted in
which manual evaluation and evaluation using the application served as the experimental set-ups, undertaken to determine
whether there is a notable difference between the two methods of checking.
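The percent error and averaging steps described above can be sketched in a short script (a minimal illustration; the item counts below are hypothetical, not the study's data):

```python
def percent_error(total_items, correct_items):
    # %error = (T - M) / T * 100, where T is the total number of items
    # and M is the number of correctly identified items
    return (total_items - correct_items) / total_items * 100.0

def percent_accuracy(total_items, correct_items):
    # Percent accuracy is the complement of percent error
    return 100.0 - percent_error(total_items, correct_items)

# Hypothetical recognition results (correct items per sheet) for a 10-item test
correct_per_sheet = [8, 9, 6, 10]
accuracies = [percent_accuracy(10, m) for m in correct_per_sheet]

# The overall figure reported is the mean of the per-sheet accuracies
average_accuracy = sum(accuracies) / len(accuracies)
print(average_accuracy)  # prints 82.5
```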
Next is the acceptability test, done by conducting a satisfaction survey among the target users of the app. The target
respondents of the survey were the English teachers of Concepcion Pequeña National High School, the teachers of the
students subjected to the accuracy test. The researchers prepared a demonstration to guide the respondents in using the
application. The respondents then used the application to check filled-up answer sheets. After using the application, the
respondents evaluated the system through an evaluation form covering aspects of the application such as performance,
directions and instructions, aesthetics, file exportation, customization, security, compatibility, and time management. The
form uses a rating scale from one (1), the minimum, to five (5), the maximum, with a description provided for each scale
point in every category.
III. RESULTS AND DISCUSSION
The evaluation was conducted right after the installation of the application. Table II shows the highest, lowest, and average
percent accuracy for each camera specification over three (3) trials. It also presents the range of error, in percent, for each
trial of the three (3) specifications.
TABLE II
FUNCTIONALITY TESTING OF THE APPLICATION USING THREE (3) CAMERA SPECIFICATIONS

Camera Specification | Trial | No. of Samples | Highest Percent Accuracy | Lowest Percent Accuracy | Range of Error in Percent | Average Percent Accuracy
8 MP  | 1 | 61 | 90  | 0  | 90 | 44.2623
8 MP  | 2 | 61 | 90  | 10 | 80 | 44.7213
8 MP  | 3 | 61 | 90  | 0  | 90 | 46.8852
13 MP | 1 | 61 | 100 | 20 | 80 | 60.1639
13 MP | 2 | 61 | 100 | 20 | 80 | 60
13 MP | 3 | 61 | 100 | 20 | 80 | 61.3115
16 MP | 1 | 61 | 100 | 20 | 80 | 58.8525
16 MP | 2 | 61 | 100 | 20 | 80 | 57.54098
16 MP | 3 | 61 | 100 | 20 | 80 | 57.8689
For 8 MP, the application obtained average percent accuracies of 44.2623%, 44.7213%, and 46.8852% for the three (3) trials
respectively, with differences between the highest and lowest percent accuracy of 90%, 80%, and 90% correspondingly. For
13 MP, the application attained average percent accuracies of 60.1639%, 60%, and 61.3115% for trials 1, 2, and 3
respectively, and a range of error of 80% in all three (3) trials. For 16 MP, average percent accuracies of 58.8525%,
57.54098%, and 57.8689% were reached by the application, with a range of error of 80% in all trials. Using one-way
ANOVA at a significance level of 0.05, it was found that at least one population mean accuracy among the three (3) set-ups
is different, implying that the system's accuracy varies across camera specifications. Using the least significant difference
(LSD) test at a significance level of 0.05, the researchers were able to determine which camera specifications have
approximately similar accuracy in recognizing handwritten characters.
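The one-way ANOVA and LSD steps above can be sketched with standard-library Python; the group samples below are hypothetical placeholders, not the study's raw scores, and in practice the critical F and t values would be read from statistical tables at the 0.05 level:

```python
import math

def one_way_anova(groups):
    """Return (F statistic, within-group mean square) for k independent groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of samples around their own group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within, ms_within

def lsd_threshold(ms_within, n_i, n_j, t_crit):
    # Fisher's least significant difference for one pair of group means:
    # the pair differs significantly if |mean_i - mean_j| exceeds this value
    return t_crit * math.sqrt(ms_within * (1.0 / n_i + 1.0 / n_j))

# Hypothetical accuracy samples for the 8 MP, 13 MP, and 16 MP set-ups
groups = [[44, 45, 47], [60, 60, 61], [59, 58, 58]]
f_stat, ms_within = one_way_anova(groups)
# Reject "all means equal" when f_stat exceeds the F critical value with
# (k - 1, n - k) degrees of freedom, then compare each pair of group means
# against lsd_threshold(...) to find which specifications behave alike
```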
TABLE III
COMPARISON AMONG PAIRS OF MEANS

Group Pair (G-G) | Mean Difference | Comments
Table III shows which pairs have a significant difference from each other. The pairs with a significant difference are 1-2,
1-3, 2-1, and 3-1, while pairs 2-3 and 3-2 show no significant difference. This implies that the possible grouping is (1) and
(2, 3). Based on this result, the 13 MP and 16 MP camera specifications have approximately similar accuracy in recognizing
written characters, in contrast to 8 MP.
The researchers conducted another experiment to test whether erasures and letter corrections affect the text recognized by
the application. Answer sheets with erasures, with letter corrections, and with both erasures and letter corrections were
separated. The researchers also collected sample answer sheets with legible handwriting to be subjected to the application.
Four (4) set-ups, each composed of three (3) trials, were prepared, all using the 13 MP camera specification.
TABLE IV
HIGHEST, LOWEST AND AVERAGE PERCENT ACCURACY OF 10 SAMPLES WITH ERASURES ONLY

                         | Trial 1 | Trial 2 | Trial 3
Highest Percent Accuracy | 100     | 100     | 100
Lowest Percent Accuracy  | 0       | 0       | 0
Average Percent Accuracy | 82.5    | 79.167  | 77.5
Based on Table IV, the average percent accuracies for trials 1, 2, and 3 were 82.5%, 79.167%, and 77.5% respectively for
the 10 samples with erasures only. In all three (3) trials, the highest percent accuracy was 100% while the lowest was 0%.
TABLE V
HIGHEST, LOWEST AND AVERAGE PERCENT ACCURACY OF 9 SAMPLES WITH LETTER CORRECTIONS ONLY

                         | Trial 1  | Trial 2  | Trial 3
Highest Percent Accuracy | 100      | 100      | 100
Lowest Percent Accuracy  | 0        | 0        | 0
Average Percent Accuracy | 60.18556 | 49.07444 | 52.77778
As seen in Table V, the researchers gathered 9 samples for this set-up. The highest percent accuracy for trials 1, 2, and 3
was 100% whereas the lowest was 0%. The average percent accuracies for the three (3) trials were 60.18556%, 49.07444%,
and 52.77778% correspondingly. The same camera specification as in the first set-up was used in this experiment.
TABLE VI
HIGHEST, LOWEST AND AVERAGE PERCENT ACCURACY OF 14 SAMPLES WITH BOTH ERASURES AND LETTER CORRECTIONS

                         | Trial 1  | Trial 2  | Trial 3
Highest Percent Accuracy | 100      | 100      | 100
Lowest Percent Accuracy  | 0        | 14.29    | 25
Average Percent Accuracy | 57.61889 | 61.95811 | 54.84111
Another set-up was considered for samples with both erasures and letter corrections. Fourteen (14) samples were collected
and subjected to the application. The highest, lowest, and average percent accuracy for the three trials are shown in
Table VI. The highest percent accuracy for the three trials was 100%, while the lowest was 0%, 14.29%, and 25%
respectively. The average percent accuracies were 57.61889%, 61.95811%, and 54.84111% for the three (3) trials
correspondingly.
TABLE VII
HIGHEST, LOWEST, AND AVERAGE PERCENT ACCURACY FOR 5 SAMPLES WITH LEGIBLE HANDWRITING

                         | Trial 1  | Trial 2  | Trial 3
Highest Percent Accuracy | 100      | 100      | 100
Lowest Percent Accuracy  | 0        | 14.29    | 25
Average Percent Accuracy | 57.61889 | 61.95811 | 54.84111
The last set-up was conducted to test the accuracy of the application in checking answer sheets with legible handwriting.
Five (5) samples were collected, as exhibited in Table VII. The highest and lowest percent accuracies show only a small
difference compared to Tables IV, V, and VI. The average percent accuracy reached 90% in trials 1 and 2 and 88% in
trial 3.
TABLE VIII
MEAN SCORE OF MANUAL CHECKING VS APPLICATION WITH 33 SAMPLES

           | Manual Checking | Application
Mean Score | 6.76            | 6.67
Table VIII shows the mean score obtained by manual checking and the mean score attained using the application.
Thirty-three (33) samples were collected and subjected to both manual checking and the application. Manual checking of the
answer sheets yielded a mean score of 6.76, while use of the application yielded a mean score of 6.67.
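The study does not name the statistical test used to compare the two checking methods; one common choice for such paired scores is a paired t test, sketched below with hypothetical scores (not the study's raw data):

```python
import math

def paired_t_statistic(scores_a, scores_b):
    # t = mean(differences) / standard error of the differences
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the paired differences (n - 1 denominator)
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical scores for the same sheets checked manually vs. by the app
manual_scores = [7, 6, 8, 5, 9]
app_scores = [7, 5, 8, 6, 9]
t_stat = paired_t_statistic(manual_scores, app_scores)
# Compare |t_stat| with the critical t value at n - 1 degrees of freedom;
# a small |t_stat| suggests no notable difference between the two methods
```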
For the acceptability test, a total of twelve (12) teachers tested the performance of the application and evaluated it based
on the factors enumerated on the feedback form.
TABLE IX
MEAN SCORES OF THE FACTORS OF THE APPLICATION LISTED ON THE EVALUATION FORM
FACTORS MEAN SCORE
Performance 4.5
User-friendly 4.833
Appeal 4.333
File exportation 4.667
Customization 4.417
Security 4.667
Compatibility 4.333
Time management 4.5
As seen in Table IX, the factors that affect the functionality and efficiency of the application were listed. The researchers
computed the mean score of each factor from the twelve (12) evaluation forms used by the respondents in appraising the
application. Factors such as performance, user-friendliness, file exportation, security, and time management achieved mean
scores of 4.5, 4.833, 4.667, 4.667, and 4.5 respectively. On the other hand, appeal, customization, and compatibility attained
mean scores of 4.333, 4.417, and 4.333 respectively.
REFERENCES
[1] P. Field, "The importance of exams in today's society," 7 June 2018. [Online]. Available:
https://thecentraltrend.com/43742/opinion/the-importance-of-exams-in-todays-society/.
[2] A. Watson, "Optical Mark Recognition: How it works? What are its pros & cons?," 5 March 2019. [Online].
Available: https://medium.com/@annywatson/optical-mark-recognition-how-it-works-what-are-its-pros-cons-
20dd9dd26cfb.
[3] A. Rosebrock, "Bubble sheet multiple choice scanner and test grader using OMR, Python and OpenCV," 3 October
2016. [Online]. Available: https://www.pyimagesearch.com/2016/10/03/bubble-sheet-multiple-choice-scanner-and-
test-grader-using-omr-python-and-opencv/.
[4] H. Tjahyadi, Y. G. Budijono, S. Lukas and D. Krisnadi, "Android Based Automated Scoring of Multiple-Choice
Test," International Journal of Machine Learning and Computing, pp. 110-113, 2017.
Mark Jason S. Baduria was born on December 31, 1998 in Calabanga, Camarines Sur. He is currently an Electronics
Engineering student at Bicol State College of Applied Sciences and Technology.

Nikka Abegail F. Ebrada was born in Naga City on June 16, 1997. She finished her primary education at Haring Elementary
School and her secondary education at Barcelonita Fisheries School. She is currently a 5th Year BSECE student at Bicol
State College of Applied Sciences and Technology.

Bryan O. Franco was born on February 17, 1999. He is a high school graduate of Camarines Sur National High School. He
is currently an Electronics Engineering student at Bicol State College of Applied Sciences and Technology.