ATTENDANCE SYSTEM BASED ON FACE DETECTION
FACE RECOGNITION
A PROJECT REPORT
Submitted to
Certificate
This is to certify that the Project entitled “STUDENT ATTENDANCE
SYSTEM USING FACE RECOGNITION” is a bonafide work carried out by
N. Gopi (15KN1A05B7), M. Tarun (15KN1A05A1), N. Sowmya (15KN1A0576),
M. Sundar Kishore (15KN1A0594) and M. Jayanth (15KN1A0588) in partial fulfilment for the
award of the degree of Bachelor of Technology in Computer Science &
Engineering of Jawaharlal Nehru Technological University Kakinada,
Kakinada during the year 2018-2019.
EXTERNAL EXAMINER
ACKNOWLEDGEMENT
I take this opportunity to thank all who have rendered their full support to my work.
The pleasure, the achievement, the glory, the satisfaction, the reward and the appreciation
that accompanied the completion of my project cannot be expressed in a few words.
I am grateful to my Project Guide Dr. D. Ratna Kishore, Professor, for rendering
valuable suggestions and extending his support to complete the project successfully.
I express my heartfelt thanks to Dr. K.V. Sambasiva Rao garu, Head of the
Department, for his continuous guidance in the completion of my Project work.
I am thankful to Dr. C. Naga Bhaskar garu, Principal, for his encouragement to
complete the Project work.
I extend my sincere and honest thanks to the Chairman, Dr. R. Venkata Rao garu, and
the Secretary, Sri K. Sridhar garu, for their continuous support in completing the
Project work.
15KN1A05B7(N. GOPI)
15KN1A05A1(M. TARUN)
15KN1A0576(K. SOWMYA)
15KN1A0594(M. SUNDAR KISHORE)
15KN1A0588(M. JAYANTH)
INDEX
I. List of Figures
II. List of Abbreviations
1. Introduction
1.1. Introduction to Project
1.2. Problem Definition
1.3. Objectives
1.4. Scope of the Project
1.5. Process Diagram
2. Literature Review
2.1. Digital Image Processing
2.2. Applications
3. Methodology
4. System Analysis
4.1. Existing System
4.2. Proposed System
4.3. Modules
5. Introduction to MATLAB
5.1. What is MATLAB
6. Conclusion
7. System Requirement Specification
7.1. Functional Requirements
7.2. Non-Functional Requirements
7.3. System Requirements
7.3.1. Software Requirements
7.3.2. Hardware Requirements
8. System Design
8.1. UML Modeling
8.1.1. Importance of UML in Modeling
8.2. Class Diagram
8.3. Use-Case Diagram
8.4. Sequence Diagram
8.5. Component Diagram
9. Coding
9.1. Sample Code
10. Testing
11. Screen Shots
12. Future Enhancement
13. Bibliography
Abstract:
Face detection plays an important role in applications such as human-computer
interfaces, face recognition, video surveillance and face image database management. In
human face detection applications, faces most frequently form an inconsequential part of the
images. Consequently, preliminary segmentation of images into regions that contain "non-
face" objects and regions that may contain "face" candidates can greatly accelerate the process
of human face detection. Most existing face detection approaches rest on assumptions that
make them applicable only under specific conditions. Existing techniques for face
detection in colour images are plagued by poor performance in the presence of scale variation,
variation in illumination, variation in skin colours, complex backgrounds, etc. In this work
we make a humble attempt to propose an algorithm for face detection in colour
images in the presence of varying lighting conditions, for varied skin colours as well as with
complex backgrounds. Based on a novel tangible skin component extraction procedure
and detection of valid face candidates, our method detects skin regions over the entire image
and generates face candidates based on the signatures of the detected skin patches.
List of Figures
1. Process Diagram
2. Class Diagram
3. Use Case Diagram
4. Activity Diagram
5. Sequence Diagram
6. Screenshots
List of Abbreviations
CHAPTER – 1
INTRODUCTION
Automated student attendance management is one such application of face recognition.
Maintenance and monitoring of attendance records plays a vital role in the analysis of the
performance of any organization. The purpose of developing this system is to automate attendance
marking and analysis with reduced human intervention. The prevalent techniques and
methodologies for detecting and recognizing faces fail to overcome issues such as scaling, pose,
illumination variations, rotation, and occlusions. The proposed system aims to overcome the
pitfalls of the existing systems and provides features such as detection of faces, extraction of
the features, detection of extracted features, and analysis of students' attendance. The system
integrates techniques such as image contrasts, integral images, colour features and a cascading
classifier for feature detection. The system provides increased accuracy due to the use of a large
number of features (Shape, Colour, LBP, Wavelet, Auto-Correlation) of the face. Faces are
recognized using Euclidean distance and k-nearest-neighbour algorithms. Better accuracy is
attained in the results as the system takes into account the changes that occur in the face over a
period of time and employs suitable learning algorithms. The system is tested for various use
cases. We consider a specific area such as classroom attendance for the purpose of testing the
accuracy of the system. The metric considered is the percentage of recognized faces per
total number of tested faces of the same person. The system is tested under varying lighting
conditions, various facial expressions, presence of partial faces (in densely populated
classrooms) and presence or absence of beard and spectacles. Increased accuracy is attained
under these conditions.
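As a hedged illustration of the matching step described above, the following MATLAB sketch scores a probe feature vector against a gallery of labelled feature vectors using Euclidean distance and a k-nearest-neighbour vote. The variable names and the tiny hand-made gallery are hypothetical, not taken from the project code.

```matlab
% Hypothetical gallery: each row is a feature vector, with one label per row.
gallery = [0.10 0.20 0.30;   % student 1
           0.12 0.19 0.31;   % student 1
           0.90 0.80 0.70;   % student 2
           0.88 0.82 0.69];  % student 2
labels  = [1; 1; 2; 2];

probe = [0.11 0.21 0.29];    % feature vector of the face to identify
k = 3;

% Euclidean distance from the probe to every gallery vector
% (implicit expansion; on MATLAB releases before R2016b use bsxfun).
d = sqrt(sum((gallery - probe).^2, 2));

% Take the k nearest neighbours and vote on their labels.
[~, order] = sort(d);
nearest = labels(order(1:k));
identity = mode(nearest);    % majority label among the k nearest

fprintf('Predicted identity: student %d\n', identity);
```

On this toy gallery the probe's three nearest neighbours are the two student-1 vectors and one student-2 vector, so the majority vote returns student 1.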
1.2 PROBLEM DEFINITION
The traditional manual methods of monitoring student attendance in lectures are tedious, as the
signed attendance sheets have to be manually logged into a computer system for analysis. This
is time consuming and prone to inaccuracies, as some students in the department often
sign for their absent colleagues, rendering the method ineffective in tracking students' class
attendance. Use of the face detection and recognition system in lieu of the traditional methods
will provide a fast and effective way of capturing student attendance accurately, while
offering secure, stable and robust storage of the system records; upon authorization, they can
be accessed for purposes of administration, or by parents or even the students themselves.
1.3 OBJECTIVES
1. Detection of unique face images amidst other natural components such as walls,
backgrounds, etc.
2. Detection of faces amongst other facial features such as beard, spectacles, etc.
1.4 SCOPE OF THE PROJECT
This module is a desktop application that performs face recognition on the captured images (faces)
in the file, marks the students' register and then stores the results in a database for future
analysis.
1.5 Process Diagram
CHAPTER – 2
LITERATURE REVIEW
2 DIGITAL IMAGE PROCESSING
2.1 BACKGROUND:
Digital image processing is an area characterized by the need for extensive experimental
work to establish the viability of proposed solutions to a given problem. An important
characteristic underlying the design of image processing systems is the significant level of
testing & experimentation that normally is required before arriving at an acceptable solution.
This characteristic implies that the ability to formulate approaches & quickly prototype
candidate solutions generally plays a major role in reducing the cost & time required to arrive
at a viable system implementation.
An image may be defined as a two-dimensional function f(x, y), where x & y are spatial
coordinates, & the amplitude of f at any pair of coordinates (x, y) is called the intensity or
gray level of the image at that point. When x, y & the amplitude values of f are all finite
discrete quantities, we call the image a digital image. The field of DIP refers to processing
digital images by means of a digital computer. A digital image is composed of a finite number
of elements, each of which has a particular location & value. These elements are called pixels.
Vision is the most advanced of our senses, so it is not surprising that images play the
single most important role in human perception. However, unlike humans, who are limited to
the visual band of the EM spectrum, imaging machines cover almost the entire EM spectrum,
ranging from gamma to radio waves. They can also operate on images generated by sources
that humans are not accustomed to associating with images. There is no general agreement
among authors regarding where image processing stops & other related areas, such as image
analysis & computer vision, start. Sometimes a distinction is made by defining image
processing as a discipline in which both the input & output of a process are images. This is
a limiting & somewhat artificial boundary. The area of image analysis (also called image
understanding) is in between image processing & computer vision.
There are no clear-cut boundaries in the continuum from image processing at one end to
complete vision at the other. However, one useful paradigm is to consider three types of
computerized processes in this continuum: low-, mid-, & high-level processes. A low-level
process involves primitive operations such as image preprocessing to reduce noise, contrast
enhancement & image sharpening. A low-level process is characterized by the fact that both
its inputs & outputs are images. A mid-level process on images involves tasks such as
segmentation (partitioning an image into regions or objects), description of those objects to
reduce them to a form suitable for computer processing, & classification (recognition) of
individual objects. A mid-level process is characterized by the fact that its inputs generally
are images but its outputs are attributes extracted from those images (e.g., edges, contours &
the identity of individual objects). Finally, a higher-level process involves "making sense" of
an ensemble of recognized objects, as in image analysis, & at the far end of the continuum,
performing the cognitive functions normally associated with vision.
Digital image processing, as already defined, is used successfully in a broad range of areas
of exceptional social and economic value.
An image is represented as a two-dimensional function f(x, y), where x and y are spatial
co-ordinates and the amplitude of ‘f’ at any pair of coordinates (x, y) is called the intensity of
the image at that point.
A grayscale image is a function I(x, y) of the two spatial coordinates of the image
plane.
I(x, y) is the intensity of the image at the point (x, y) on the image plane.
I(x, y) takes non-negative values; we assume the image is bounded by a rectangle
[0, a] × [0, b], so I: [0, a] × [0, b] → [0, ∞). A colour image is defined similarly by three such
functions, one each for red, green and blue.
An image may be continuous with respect to the x and y coordinates and also
in amplitude. Converting such an image to digital form requires that the coordinates as well as
the amplitude be digitized. Digitizing the coordinate values is called sampling; digitizing
the amplitude values is called quantization.
The result of sampling and quantization is a matrix of real numbers. We use two principal
ways to represent digital images. Assume that an image f(x, y) is sampled so that the resulting
image has M rows and N columns. We say that the image is of size M X N. The values of the
coordinates (x, y) are discrete quantities. For notational clarity and convenience, we use
integer values for these discrete coordinates. In many image processing books, the image origin
is defined to be at (x, y) = (0, 0). The next coordinate values along the first row of the image
are (x, y) = (0, 1). It is important to keep in mind that the notation (0, 1) is used to signify the
second sample along the first row. It does not mean that these are the actual values of physical
coordinates when the image was sampled. The following figure shows the coordinate convention.
Note that x ranges from 0 to M-1 and y from 0 to N-1 in integer increments.
The coordinate convention used in the toolbox to denote arrays is different from the
preceding paragraph in two minor ways. First, instead of using (x, y), the toolbox uses the
notation (r, c) to indicate rows and columns. Note, however, that the order of coordinates is
the same as the order discussed in the previous paragraph, in the sense that the first element of
a coordinate tuple, (a, b), refers to a row and the second to a column. The other difference is
that the origin of the coordinate system is at (r, c) = (1, 1); thus, r ranges from 1 to M and c
from 1 to N in integer increments. IPT documentation refers to these as pixel coordinates. Less
frequently the toolbox also employs another coordinate convention, called spatial coordinates,
which uses x to refer to columns and y to refer to rows. This is the opposite of our use of the
variables x and y.
The preceding discussion leads to the following representation for a digitized image
function:
f(x, y) = [ f(0, 0)      f(0, 1)      …  f(0, N-1)
            f(1, 0)      f(1, 1)      …  f(1, N-1)
              ⋮            ⋮                ⋮
            f(M-1, 0)    f(M-1, 1)    …  f(M-1, N-1) ]
The right side of this equation is a digital image by definition. Each element of this array
is called an image element, picture element, pixel or pel. The terms image and pixel are used
throughout the rest of our discussion to denote a digital image and its elements.
A digital image can be represented naturally as a MATLAB matrix, with f(1, 1)
corresponding to f(0, 0) (a monospace font denotes MATLAB quantities). Clearly the two
representations are identical, except for the shift in origin. The notation f(p, q) denotes the
element located in row p and column q. For example, f(6, 2) is the element in the sixth row and
second column of the matrix f. Typically we use the letters M and N, respectively, to denote
the number of rows and columns in a matrix. A 1×N matrix is called a row vector, whereas an
M×1 matrix is called a column vector. A 1×1 matrix is a scalar.
Matrices in MATLAB are stored in variables with names such as A, a, RGB, real_array
and so on. Variable names must begin with a letter and contain only letters, numerals and
underscores. As noted in the previous paragraph, all MATLAB quantities are written using
monospace characters. We use conventional Roman italic notation, such as f(x, y), for
mathematical expressions.
Images are read into the MATLAB environment using function imread, whose syntax is
imread(‘filename’)
Here filename is a string containing the complete name of the image file (including any applicable
extension). For example, the command line
>> f = imread(‘chestxray.jpg’);
reads the JPEG image chestxray (see the table of formats above) into image array f. Note the
use of single quotes (‘) to delimit the string filename. The semicolon at the end of a command
line is used by MATLAB for suppressing output. If a semicolon is not included, MATLAB displays
the results of the operation(s) specified in that line. The prompt symbol (>>) designates the
beginning of a command line as it appears in the MATLAB Command Window.
When, as in the preceding command line, no path is included in filename, imread reads the
file from the current directory and, if that fails, it tries to find the file in the MATLAB search
path. The simplest way to read an image from a specified directory is to include a full or relative
path to that directory in filename. For example,
>> f = imread(‘D:\myimages\chestxray.jpg’);
reads the image from a folder called myimages on the D: drive, whereas
>> f = imread(‘.\myimages\chestxray.jpg’);
reads the image from the myimages subdirectory of the current working
directory. The Current Directory window on the MATLAB desktop toolbar displays
MATLAB’s current working directory and provides a simple, manual way to change it.
The table above lists some of the most popular image/graphics formats supported by imread
and imwrite.
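As a small, hedged illustration of the read/write functions just described, the sketch below creates a tiny grayscale image, writes it to disk with imwrite and reads it back with imread; the file name test_gradient.png is made up for the example.

```matlab
% Build a small 8-bit grayscale gradient image (64x64, values 0..252).
f = uint8(repmat(0:4:252, 64, 1));

% Write it to the current directory; the format is inferred from the extension.
imwrite(f, 'test_gradient.png');

% Read it back; g is a uint8 array of the same size as f.
g = imread('test_gradient.png');

% Confirm the round trip preserved size and class (PNG is lossless).
disp(size(g));    % 64 64
disp(class(g));   % uint8
```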
Although we work with integer coordinates, the values of the pixels themselves are not
restricted to be integers in MATLAB. The table above lists the various data classes supported
by MATLAB and IPT for representing pixel values. The first eight entries in the table are
referred to as numeric data classes. The ninth entry is the char class and, as shown, the last
entry is the logical data class.
All numeric computations in MATLAB are done using double quantities, so this is also a
frequent data class encountered in image processing applications. Class uint8 also is encountered
frequently, especially when reading data from storage devices, as 8-bit images are the most
common representation found in practice. These two data classes, class logical, and, to a
lesser degree, class uint16 constitute the primary data classes on which we focus. Many IPT
functions, however, support all the data classes listed in the table. Data class double requires 8
bytes to represent a number, uint8 and int8 require one byte each, and uint16 and int16 require
2 bytes each. The toolbox supports four types of images:
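A hedged sketch of how these data classes interact in practice: converting a uint8 image to double for arithmetic and back, using the direct cast from base MATLAB and the rescaling conversions im2double/im2uint8 from the Image Processing Toolbox.

```matlab
% An example 2x2 uint8 image.
A = uint8([0 64; 128 255]);

% Direct cast keeps the numeric values (0..255) but changes the class.
B = double(A);

% im2double (Image Processing Toolbox) also rescales to [0, 1].
C = im2double(A);

% Scaled doubles convert back to uint8 with im2uint8.
D = im2uint8(C);           % recovers the original uint8 values

disp(class(B));            % double
disp(max(C(:)));           % 1
isequal(A, D)              % logical 1
```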
1. Intensity images
2. Binary images
3. Indexed images
4. RGB images
Most monochrome image processing operations are carried out using binary or intensity
images, so our initial focus is on these two image types. Indexed and RGB colour images are
discussed below.
An intensity image is a data matrix whose values have been scaled to represent intensities.
When the elements of an intensity image are of class uint8 or class uint16, they have integer
values in the range [0, 255] or [0, 65535], respectively. If the image is of class double, the
values are floating-point numbers. Values of scaled, double intensity images are in the range
[0, 1] by convention.
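A hedged sketch of the class-dependent value ranges just described, using mat2gray (Image Processing Toolbox) to produce a scaled double intensity image:

```matlab
% uint8 intensity image: integer values in [0, 255].
I8 = uint8([0 128; 192 255]);

% mat2gray scales any numeric matrix to a double image in [0, 1].
Id = mat2gray(double(I8));

disp(class(Id));    % double
disp(min(Id(:)));   % 0
disp(max(Id(:)));   % 1
```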
Binary images have a very specific meaning in MATLAB. A binary image is a logical
array of 0s and 1s. Thus, an array of 0s and 1s whose values are of a numeric data class, say
uint8, is not considered a binary image in MATLAB. A numeric array is converted to binary
using function logical. Thus, if A is a numeric array consisting of 0s and 1s, we create a logical
array B using the statement
B = logical(A)
If A contains elements other than 0s and 1s, use of the logical function converts all
nonzero quantities to logical 1s and all entries with value 0 to logical 0s. To test whether an
array is logical we use the function
islogical(c)
If c is a logical array, this function returns a 1; otherwise it returns a 0. Logical arrays can be
converted to numeric arrays using the data class conversion functions.
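A brief sketch of the logical conversion just described; the array values are invented for illustration.

```matlab
% Numeric array containing values other than 0 and 1.
A = [0 2; 3 0];

% logical() maps every nonzero entry to 1 and every zero to 0.
B = logical(A);       % B = [0 1; 1 0] as a logical array

disp(islogical(A));   % 0 - a numeric array is not logical
disp(islogical(B));   % 1 - B is a valid MATLAB binary image

% Converting back to a numeric class:
C = double(B);        % C = [0 1; 1 0] of class double
```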
An indexed image has two components: a data matrix of integers, X, and a colormap
matrix, map. Matrix map is an m×3 array of class double containing floating-point values in
the range [0, 1]. The length m of the map is equal to the number of colors it defines. Each row
of map specifies the red, green and blue components of a single color. An indexed image uses
“direct mapping” of pixel intensity values to colormap values. The color of each pixel is
determined by using the corresponding value of the integer matrix X as a pointer into map. If X
is of class double, then all of its components with values less than or equal to 1 point to the first
row in map, all components with value 2 point to the second row and so on. If X is of class
uint8 or uint16, then all components with value 0 point to the first row in map, all components
with value 1 point to the second row, and so on.
An RGB color image is an M×N×3 array of color pixels, where each color pixel is a triplet
corresponding to the red, green and blue components of an RGB image at a specific spatial
location. An RGB image may be viewed as a “stack” of three grayscale images that, when fed
into the red, green and blue inputs of a color monitor, produce a color image on the screen. By
convention, the three images forming an RGB color image are referred to as the red, green and
blue component images. The data class of the component images determines their range of
values. If an RGB image is of class double, the range of values is [0, 1]. Similarly, the range of
values is [0, 255] or [0, 65535] for RGB images of class uint8 or uint16, respectively. The
number of bits used to represent the pixel values of the component images determines the bit
depth of an RGB image. For example, if each component image is an 8-bit image, the
corresponding RGB image is said to be 24 bits deep.
Generally, the number of bits in all component images is the same. In this case the number
of possible colors in an RGB image is (2^b)^3, where b is the number of bits in each component
image.
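The "stack of three grayscale images" view can be sketched in MATLAB as follows; the tiny component planes are invented for illustration.

```matlab
% Three 2x2 uint8 component images (invented values).
R = uint8([255   0;   0 255]);
G = uint8([  0 255;   0 255]);
B = uint8([  0   0; 255 255]);

% Stacking them along the third dimension yields an M x N x 3 RGB image.
rgb = cat(3, R, G, B);

disp(size(rgb));          % 2 2 3
disp(squeeze(rgb(1, 1, :)));  % the triplet (255; 0; 0): a red pixel

% A component image can be recovered by indexing the third dimension.
red_plane = rgb(:, :, 1);
isequal(red_plane, R)     % logical 1
```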
2.2 APPLICATIONS
1. The system can be used in places that require security, like banks, the military, etc.
2. It can also be used in houses and societies to recognize outsiders and save their identity.
3. The software can be used to mark attendance based on face recognition in organizations.
CHAPTER – 3
METHODOLOGY
In the proposed system, the process is initiated from a mobile device. Once triggered, the
system starts processing the image for which we want to mark attendance. The image-capturing
phase is the one in which we capture the image; this is the basic phase with which we start
initializing our system. We capture an image from a camera, which is predominantly checked for
certain constraints like lighting, spacing, density and facial expressions. The captured image is
checked against our requirements; once it passes, we make sure it is in PNG or JPEG format, else
it is converted. We take each individual's different frontal postures so that accuracy can be attained
to the maximum extent. This forms the training database, in which every individual has been
classified based on labels. From the captured image we detect only frontal faces using the
Viola-Jones algorithm, which detects the frontal face posture of every individual in the captured
image. This detects only faces and discards every other part, since we are exploring the features
of faces alone. The detected faces are stored in the test database for further enquiry. Features are
extracted in the extraction phase: the detected bounding boxes are further queried for feature
extraction, and the extracted features are stored in a matrix. This feature extraction is done for
every detected face. The features we look at here are Shape, Edge, Color, Wavelet,
Auto-Correlation and LBP. A face is recognized once we complete extracting features: the
features already trained for every individual are compared with the detected face's features, and
if both match, the face is recognised. Once recognised, the result is updated in the student
attendance database. Once the process is completed, the testing images get deleted, since we are
designing the system for both accuracy and storage efficiency.
Face recognition is achieved using machine learning, and the basic pipeline used for it
is as follows:
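The detection step of the pipeline above can be sketched with the Computer Vision Toolbox's Viola-Jones detector. The file name class_photo.jpg is hypothetical, and the feature-extraction and matching stages are only indicated by comments, since the report does not list the exact project code.

```matlab
% Viola-Jones frontal-face detector (Computer Vision Toolbox).
detector = vision.CascadeObjectDetector();   % defaults to frontal faces

img = imread('class_photo.jpg');             % hypothetical captured image
bboxes = step(detector, img);                % one [x y w h] row per detected face

% Crop each detected face for the feature-extraction stage.
for i = 1:size(bboxes, 1)
    face = imcrop(img, bboxes(i, :));
    % ... extract Shape/Edge/Color/Wavelet/Auto-Correlation/LBP features
    % ... compare against the trained features and mark attendance
end

% Visualize the detections.
annotated = insertShape(img, 'Rectangle', bboxes);
imshow(annotated);
```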
4.3 MODULES
User
CHAPTER – 5
INTRODUCTION TO MATLAB
MATLAB is a high-performance language for technical computing. It integrates computation,
visualization, and programming in an easy-to-use environment where problems and solutions
are expressed in familiar mathematical notation. Typical uses include:
Algorithm development
Data acquisition
MATLAB is an interactive system whose basic data element is an array that does not
require dimensioning. This allows you to solve many technical computing problems, especially
those with matrix and vector formulations, in a fraction of the time it would take to write a
program in a scalar non-interactive language such as C or Fortran.
The name MATLAB stands for matrix laboratory. MATLAB was originally written to
provide easy access to matrix software developed by the LINPACK and EISPACK projects.
Today, MATLAB engines incorporate the LAPACK and BLAS libraries, embedding the state
of the art in software for matrix computation.
MATLAB has evolved over a period of years with input from many users. In university
environments, it is the standard instructional tool for introductory and advanced courses in
mathematics, engineering, and science. In industry, MATLAB is the tool of choice for high-
productivity research, development, and analysis.
Very important to most users of MATLAB, toolboxes allow you to learn and apply specialized
technology. Toolboxes are comprehensive collections of MATLAB functions (M-files) that
extend the MATLAB environment to solve particular classes of problems. Areas in which
toolboxes are available include signal processing, control systems, neural networks, fuzzy
logic, wavelets, simulation, and many others.
Development Environment:
This is the set of tools and facilities that help you use MATLAB functions and files.
Many of these tools are graphical user interfaces. It includes the MATLAB desktop and
Command Window, a command history, an editor and debugger, and browsers for viewing
help, the workspace, files, and the search path.
The MATLAB Mathematical Function Library:
This is a vast collection of computational algorithms ranging from elementary functions like
sum, sine, cosine, and complex arithmetic, to more sophisticated functions like matrix inverse,
matrix eigenvalues, Bessel functions, and fast Fourier transforms.
The MATLAB Language:
This is a high-level matrix/array language with control flow statements, functions, data
structures, input/output, and object-oriented programming features. It allows both
"programming in the small" to rapidly create quick and dirty throw-away programs, and
"programming in the large" to create complete, large and complex application programs.
Graphics:
MATLAB has extensive facilities for displaying vectors and matrices as graphs, as well
as annotating and printing these graphs. It includes high-level functions for two-dimensional
and three-dimensional data visualization, image processing, animation, and presentation
graphics. It also includes low-level functions that allow you to fully customize the appearance
of graphics, as well as to build complete graphical user interfaces on your MATLAB
applications.
The MATLAB Application Program Interface (API):
This is a library that allows you to write C and Fortran programs that interact with MATLAB.
It includes facilities for calling routines from MATLAB (dynamic linking), calling MATLAB
as a computational engine, and for reading and writing MAT-files.
MATLAB Desktop:
The MATLAB Desktop is the main MATLAB application window. The desktop contains five
sub-windows: the Command Window, the Workspace Browser, the Current Directory window,
the Command History window, and one or more Figure windows, which are shown only when
the user displays a graphic.
The command window is where the user types MATLAB commands and expressions at
the prompt (>>) and where the output of those commands is displayed. MATLAB defines the
workspace as the set of variables that the user creates in a work session. The workspace browser
shows these variables and some information about them. Double clicking on a variable in the
workspace browser launches the Array Editor, which can be used to obtain information and
edit certain properties of the variable.
The Current Directory tab above the Workspace tab shows the contents of the current
directory, whose path is shown in the Current Directory window. For example, in the Windows
operating system the path might be as follows: C:\MATLAB\Work, indicating that the directory
"Work" is a subdirectory of the main directory "MATLAB," which is installed in drive C.
Clicking on the arrow in the Current Directory window shows a list of recently used
paths. Clicking on the button to the right of the window allows the user to change the current
directory.
MATLAB uses a search path to find M-files and other MATLAB-related files, which
are organized in directories in the computer file system. Any file run in MATLAB must reside
in the current directory or in a directory that is on the search path. By default, the files supplied
with MATLAB and MathWorks toolboxes are included in the search path. The easiest way to
see which directories are on the search path, or to add or modify the search path, is to select
Set Path from the File menu in the desktop, and then use the Set Path dialog box. It is good
practice to add any commonly used directories to the search path to avoid repeatedly having
to change the current directory.
The Command History window contains a record of the commands a user has entered in
the Command Window, including both current and previous MATLAB sessions. Previously
entered MATLAB commands can be selected and re-executed from the Command History
window by right-clicking on a command or sequence of commands. This is useful for selecting
various options in addition to executing the commands.
The MATLAB editor is both a text editor specialized for creating M-files and a graphical
MATLAB debugger. The editor can appear in a window by itself, or it can be a sub-window in
the desktop. M-files are denoted by the extension .m, as in pixelup.m. The MATLAB editor
window has numerous pull-down menus for tasks such as saving, viewing, and debugging files.
Because it performs some simple checks and also uses color to differentiate between various
elements of code, this text editor is recommended as the tool of choice for writing and editing
M-functions. Typing edit filename at the prompt opens the M-file filename.m in an
editor window, ready for editing. As noted earlier, the file must be in the current directory, or
in a directory in the search path.
Image Representation and Format: An image is a rectangular array of values (pixels). Each
pixel represents the measurement of some property of a scene measured over a finite area. The
property could be many things, but we usually measure either the average brightness (one
value) or the brightnesses of the image filtered through red, green and blue filters (three values).
The values are normally represented by an eight bit integer, giving a range of 256 levels of
brightness. We talk about the resolution of an image: this is defined by the number of pixels
and number of brightness values. A raw image will take up a lot of storage space. Methods
have been defined to compress the image by coding redundant data in a more efficient fashion,
or by discarding the perceptually less significant information. MATLAB supports reading all
of the common image formats. Image coding is not addressed in this course unit.
Image Loading, Displaying and Saving: An image is loaded into working memory using the
command
>> f = imread(‘filename’);
The semicolon at the end of the command suppresses MATLAB output. Without it, MATLAB
will execute the command and echo the results to the screen. We assign the image to the array
f. If no path is specified, MATLAB will look for the image file in the current directory. The
image is displayed using
>> imshow(f, G)
where
f is the image to be displayed, G defines the range of intensity levels used to display it. If it is
omitted, the default value 256 is used. If the syntax [low, high] is used instead of G, values less
than low are displayed as black, and ones greater than high are displayed as white. Finally, if
low and high are left out, i.e. use [ ], low is set to the minimum value in the image and high to
the maximum one, which is useful for automatically fixing the range of the image if it is very
small or very large. Images are usually displayed in a figure window. If a second image is
displayed it will overwrite the first, unless the figure function is used:
>> figure, imshow(f)
will generate a new figure window and display the image in it. Note that multiple functions
may be called in a single line, provided they are separated by commas. An image array may be
written to a file using
>> imwrite(f, ‘filename’)
The format of the file can be inferred from the file extension, or can be specified by a third
argument. Certain file formats have additional arguments. Image Information: Information
about an image file can be obtained with the imfinfo function.
Quantisation, Grey Level Ranges: Images are normally captured with pixels in each
channel being represented by eight-bit integers. (This is partly for historical reasons: it has the
convenience of being a basic memory unit, it allows for a suitable range of values to be
represented, and many cameras could not capture data to any greater accuracy. Further, most
displays are limited to eight bits per red, green and blue channel.) But there is no reason why
pixels should be so limited; indeed, there are devices and applications that deliver and require
greater precision, or more channels of illumination (other wavelength ranges and more of them
are possible). MATLAB provides functions for changing images from one type to another. The
syntax is
>> B = data_class_name(A)
where data_class_name is one of the data types in the above table, e.g.
>> B = uint8(A)
Number of Pixels: Images come in all sizes, but are (almost) always rectangular. MATLAB
gives several methods of accessing the elements of an array, i.e. the pixels of an image. An
element can be accessed directly: typing the array name at the prompt will return all the array
elements (which could take a while); typing the array name followed by element indices in
round brackets will return that value. Ranges of array elements can be accessed using colons.
>> A(first:last)
will return the first to last elements, inclusive, of the one-dimensional array A. Note that the
indices start at 1.
>> A(first:step:last)
will return every step-th element starting from first and finishing when last is reached or
exceeded. Step could be negative, in which case you’d have to ensure that first was greater than
last. Naturally, this notation can be extended to access portions of an image. An image, f, could
be flipped using
>> fp = f(end:-1:1, :);
The keyword end is used to signify the last index. Using the colon alone implies that all index
values are traversed. This also indicates how multi-dimensional arrays are accessed. Or a
portion of an image could be extracted using
>> fc = f(top:bottom, left:right);
A note on colour images: if the input image is colour, these operations will return greyscale
results. A colour image has three values per pixel, which are accessed using a third index.
>> A(x, y, 1:3)
would return all three colour values of the pixel at (x, y). A colour plane could be extracted
using
>> R = A(:, :, 1);
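A hedged sketch of the colon-indexing operations above on a tiny invented matrix:

```matlab
% A small 3x4 "image" with recognisable values.
f = [ 1  2  3  4;
      5  6  7  8;
      9 10 11 12];

row = f(2, :);             % second row: [5 6 7 8]
sub = f(1:2, 2:3);         % 2x2 sub-image: [2 3; 6 7]
flipped = f(end:-1:1, :);  % rows reversed top-to-bottom

disp(flipped(1, :));       % 9 10 11 12 - the original bottom row
isequal(row, [5 6 7 8])    % logical 1
```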
Point Processing: Point processing operations manipulate individual pixel values, without
regard to any neighbouring values. Two types of transform can be identified: those manipulating
a pixel's value and those manipulating its position.
Value Manipulation: The fundamental value of a pixel is its brightness (in a monochrome
image). The brightness range can be remapped using
>> R = imadjust(A, [low_in, high_in], [low_out, high_out], gamma);
This takes the input range of values as specified and maps them to the output range that’s
specified. Values outside of the input range are clamped to the extremes of the output range
(values below low_in are all mapped to low_out). The range values are expected to be in the
interval [0, 1]; the function scales them to values appropriate to the image type before applying
the scaling operation. Whilst low_in is expected to be less than high_in, the same is not true for
low_out and high_out, so the image can therefore be inverted. The value of gamma specifies the
shape of the mapping curve. Gamma = 1 gives a linear scaling, a smaller gamma gives a
mapping that expands the scale at lower values, a larger gamma expands the upper range of the
scale. This can make the contrast between darker or brighter tones more visible, respectively.
Omitting any of the parameters results in default values being assumed; an empty range, [ ],
selects the full range.
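A hedged sketch of imadjust (Image Processing Toolbox) on a small invented image, showing inversion and a gamma curve:

```matlab
% A small grayscale test image of class uint8.
A = uint8([0 64; 128 255]);

% Invert the image: the output range [1, 0] is reversed.
neg = imadjust(A, [0 1], [1 0]);

% Nonlinear gamma mapping (gamma < 1 brightens mid-tones).
bright = imadjust(A, [0 1], [0 1], 0.5);

disp(neg(1, 1));     % 255 - black has become white
disp(neg(2, 2));     % 0   - white has become black
```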
CHAPTER – 6
CONCLUSION
There may be various types of lighting conditions, seating arrangements and environments in
the various classrooms. Most of these conditions have been tested on the system, and the system
has shown 100% accuracy for most of the cases. There may also exist students portraying various
facial expressions, varying hair styles, beards, spectacles, etc. All of these cases are considered
and tested to obtain a high level of accuracy and efficiency. Thus, it can be concluded from
the above discussion that a reliable, secure, fast and efficient system has been developed,
replacing a manual and unreliable one. This system can be implemented for better results
regarding the management of attendance and leaves. The system will save time, reduce the
amount of work the administration has to do, replace stationery material with electronic
apparatus and reduce the amount of human resources required for the purpose. Hence a system
with the expected results has been developed, but there is still some room for improvement.
CHAPTER - 7
System Requirement Specification
7.1 Functional Requirements
System functional requirements describe the activities and services that the system must provide.
1. Accuracy and Precision: the system should perform its processing with accuracy and precision to avoid problems.
2. Modifiability: the system should be easy to modify, and any error should be easy to correct.
3. Security: the system should be secure and should preserve students' privacy.
4. Usability: the system should be easy to deal with and simple to understand.
5. Maintainability: the maintenance group should be able to fix any problem that occurs suddenly.
6. Speed and Responsiveness: Execution of operations should be fast.
2. MATLAB
1. Processor – i3
2. Hard Disk – 5 GB
if nargout
[varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT
% --- Outputs from this function are returned to the command line.
function varargout = main_OutputFcn(hObject, eventdata, handles)
% varargout cell array for returning output args (see VARARGOUT);
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function AD_NW_IMAGE_Callback(hObject, eventdata, handles)
% hObject handle to AD_NW_IMAGE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function DE_LETE_Callback(hObject, eventdata, handles)
% hObject handle to DE_LETE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function TRAIN_ING_Callback(hObject, eventdata, handles)
% hObject handle to TRAIN_ING (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function STA_RT_Callback(hObject, eventdata, handles)
% hObject handle to STA_RT (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function DATA_BASE_Callback(hObject, eventdata, handles)
% hObject handle to DATA_BASE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function RESET_ALL_Callback(hObject, eventdata, handles)
% hObject handle to RESET_ALL (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function EXI_T_Callback(hObject, eventdata, handles)
% hObject handle to EXI_T (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function HE_LP_Callback(hObject, eventdata, handles)
% hObject handle to HE_LP (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function READ_ME_Callback(hObject, eventdata, handles)
% hObject handle to READ_ME (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
%winopen('help.pdf')
% --------------------------------------------------------------------
function PRE_CAP_Callback(hObject, eventdata, handles)
% hObject handle to PRE_CAP (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
if exist('features.mat','file') == 0
msgbox('FIRST TRAIN YOUR DATABASE','INFO...!!!','MODAL')
return
end
ff = dir('database');
if length(ff) == 2
h = waitbar(0,'Plz wait Matlab is scanning ur database...','name','SCANNING IS IN PROGRESS');
for k = 1:100
waitbar(k/100)
pause(0.03)
end
close(h)
msgbox({'NO IMAGE FOUND IN DATABASE';'FIRST LOAD YOUR DATABASE';'USE ''ADD NEW IMAGE'' MENU'},'WARNING....!!!','WARN','MODAL')
return
end
fd = vision.CascadeObjectDetector();
[f,p] = uigetfile('*.jpg','PLEASE SELECT A FACIAL IMAGE');
if f == 0
return
end
p1 = fullfile(p,f);
im = imread(p1);
bbox = step(fd, im);
vo = insertObjectAnnotation(im,'rectangle',bbox,'FACE');
r = size(bbox,1);
if isempty(bbox)
axes(handles.axes1)
imshow(vo);
msgbox({'NO FACE IN THIS PIC';'PLEASE SELECT SINGLE FACE IMAGE'},'WARNING...!!!','warn','modal')
uiwait
cla(handles.axes1); reset(handles.axes1);
set(handles.axes1,'box','on','xtick',[],'ytick',[])
return
elseif r > 1
axes(handles.axes1)
imshow(vo);
msgbox({'TOO MANY FACES IN THIS PIC';'PLEASE SELECT SINGLE FACE IMAGE'},'WARNING...!!!','warn','modal')
uiwait
cla(handles.axes1); reset(handles.axes1);
set(handles.axes1,'box','on','xtick',[],'ytick',[])
return
end
axes(handles.axes1)
image(vo);
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
bx = questdlg({'CORRECT IMAGE IS SELECTED';'SELECT OPTION FOR FACE EXTRACTION'},'SELECT AN OPTION','MANUALLY','AUTO','CC');
if strcmp(bx,'MANUALLY') == 1
while 1
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imc = imcrop(im);
bbox1 = step(fd, imc);
if size(bbox1,1) ~= 1
msgbox({'YOU HAVENT CROPED A FACE';'CROP AGAIN'},'BAD ACTION','warn','modal')
uiwait
else
close gcf
break
end
close gcf
end
imc = imresize(imc,[300 300]);
image(imc)
text(20,20,'\bfUr Precaptured image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
end
if strcmp(bx,'AUTO') == 1
imc = imcrop(im,[bbox(1)-50 bbox(2)-250 bbox(3)+100 bbox(4)+400]);
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imshow(imc)
qx = questdlg({'ARE YOU SATISFIED WITH THE RESULTS?';' ';'IF YES THEN PROCEED';' ';'IF NOT BETTER DO MANUAL CROPING'},'SELECT','PROCEED','MANUAL','CC');
if strcmpi(qx,'proceed') == 1
close gcf
imc = imresize(imc,[300 300]);
axes(handles.axes1)
image(imc)
text(20,20,'\bfUr Precaptured image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
elseif strcmpi(qx,'manual') == 1
while 1
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imc = imcrop(im);
bbox1 = step(fd, imc);
if size(bbox1,1) ~= 1
msgbox({'YOU HAVENT CROPED A FACE';'CROP AGAIN'},'BAD ACTION','warn','modal')
uiwait
else
break
end
close gcf
end
close gcf
imc = imresize(imc,[300 300]);
axes(handles.axes1)
image(imc)
text(20,20,'\bfUr Precaptured image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
else
end
end
immxx = getimage(handles.axes1);
zz = findsimilar(immxx);
zz = strtrim(zz);
fxz = imread(['database/' zz]);
q1= ehd(immxx,0.1);
q2 = ehd(fxz,0.1);
q3 = pdist([q1 ; q2]);
disp(q3)
if q3 < 0.5
axes(handles.axes2)
image(fxz)
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
text(20,20,'\bfUr Database Entered Image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes2,'xtick',[],'ytick',[],'box','on')
xs = load('info.mat');
xs1 = xs.z2;
for k = 1:length(xs1)
st = xs1{k};
stx = st{1};
if strcmp(stx,zz) == 1
str = st{2};
break
end
end
fid = fopen('attendence_sheet.txt','a');
fprintf(fid,'%s        %s        %s        %s\r\n\n','Name','Date','Time','Attendence');
c = clock;
if c(4) > 12
s = [num2str(c(4)-12) ,':',num2str(c(5)), ':', num2str(round(c(6)))];
else
s = [num2str(c(4)) ,':',num2str(c(5)), ':', num2str(round(c(6))) ];
end
fprintf(fid,'%s        %s        %s        %s\r\n\n', str, date, s, 'Present');
fclose(fid);
set(handles.text5,'string',['Hello ' str ' ,Your attendence has been Marked.'])
try
s = serial('com22');
fopen(s);
fwrite(s,'A');
pause(1)
fclose(s);
clear s
catch
% msgbox({'PLZ CONNECT CABLE OR';'INVALID COM PORT SELECTED'},'WARNING','WARN','MODAL')
uiwait
delete(s)
clear s
end
else
msgbox('YOU ARE NOT A VALID PERSON', 'WARNING','WARN','MODAL')
cla(handles.axes1)
reset(handles.axes1)
cla(handles.axes2)
reset(handles.axes2)
set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5);
set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
end
% --------------------------------------------------------------------
function LIVE_CAM_Callback(hObject, eventdata, handles)
% hObject handle to LIVE_CAM (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
global co
if exist('features.mat','file') == 0
msgbox('FIRST TRAIN YOUR DATABASE','INFO...!!!','MODAL')
return
end
ff = dir('database');
if length(ff) == 2
h = waitbar(0,'Plz wait Matlab is scanning ur database...','name','SCANNING IS IN PROGRESS');
for k = 1:100
waitbar(k/100)
pause(0.03)
end
close(h)
msgbox({'NO IMAGE FOUND IN DATABASE';'FIRST LOAD YOUR DATABASE';'USE ''ADD NEW IMAGE'' MENU'},'WARNING....!!!','WARN','MODAL')
return
end
if isfield(handles,'vdx')
vid = handles.vdx;
stoppreview(vid)
delete(vid)
handles = rmfield(handles,'vdx');
guidata(hObject,handles)
cla(handles.axes1)
reset(handles.axes1)
set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
cla(handles.axes2)
reset(handles.axes2)
set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
end
info = imaqhwinfo('winvideo');
did = info.DeviceIDs;
if isempty(did)
msgbox({'YOUR SYSTEM DOES NOT HAVE A WEBCAM';' ';'CONNECT ONE'},'WARNING....!!!!','warn','modal')
return
end
fd = vision.CascadeObjectDetector();
did = cell2mat(did);
for k = 1:length(did)
devinfo = imaqhwinfo('winvideo',k);
na(1,k) = {devinfo.DeviceName};
sr(1,k) = {devinfo.SupportedFormats};
end
[a,b] = listdlg('promptstring','SELECT A WEB CAM DEVICE','liststring',na,'ListSize', [125, 75],'SelectionMode','single');
if b == 0
return
end
if b ~= 0
frmt = sr{1,a};
[a1,b1] = listdlg('promptstring','SELECT RESOLUTION','liststring',frmt,'ListSize', [150, 100],'SelectionMode','single');
if b1 == 0
return
end
end
frmt = frmt{a1};
l = find(frmt == '_');
res = frmt(l+1 : end);
l = find(res == 'x');
res1 = str2double(res(1: l-1));
res2 = str2double(res(l+1 : end));
axes(handles.axes1)
vid = videoinput('winvideo', a);
vr = [res1 res2];
nbands = get(vid,'NumberofBands');
h2im = image(zeros([vr(2) vr(1) nbands] , 'uint8'));
preview(vid,h2im);
handles.vdx = vid;
guidata(hObject,handles)
tx = msgbox('PLZ STAND IN FRONT OF CAMERA STILL','INFO......!!!');
pause(1)
delete(tx)
kx = 0;
while 1
im = getframe(handles.axes1);
im = im.cdata;
bbox = step(fd, im);
vo = insertObjectAnnotation(im,'rectangle',bbox,'FACE');
axes(handles.axes2)
imshow(vo)
if size(bbox,1) > 1
msgbox({'TOO MANY FACES IN FRAME';' ';'ONLY ONE FACE IS ACCEPTED'},'WARNING.....!!!','warn','modal')
uiwait
stoppreview(vid)
delete(vid)
handles = rmfield(handles,'vdx');
guidata(hObject,handles)
cla(handles.axes1)
reset(handles.axes1)
set(handles.axes1,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)
cla(handles.axes2)
reset(handles.axes2)
set(handles.axes2,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)
return
end
kx = kx + 1;
if kx > 10 && ~isempty(bbox)
break
end
end
imc = imcrop(im,[bbox(1)+3 bbox(2)-35 bbox(3)-10 bbox(4)+70]);
imx = imresize(imc,[300 300]);
axes(handles.axes1)
image(imx)
text(20,20,'\bfUr Current image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
immxx = imx;
zz = findsimilar(immxx);
zz = strtrim(zz);
fxz = imread(['database/' zz]);
q1= ehd(immxx,0.1);
q2 = ehd(fxz,0.1);
q3 = pdist([q1 ; q2]);
disp(q3)
if q3 < 0.5
axes(handles.axes2)
image(fxz)
set(handles.axes1,'xtick',[],'ytick',[],'box','on')
text(20,20,'\bfUr Database Entered Image.','fontsize',12,'color','y','fontname','comic sans ms')
set(handles.axes2,'xtick',[],'ytick',[],'box','on')
xs = load('info.mat');
xs1 = xs.z2;
for k = 1:length(xs1)
st = xs1{k};
stx = st{1};
if strcmp(stx,zz) == 1
str = st{2};
break
end
end
fid = fopen('attendence_sheet.txt','a');
fprintf(fid,'%s        %s        %s        %s\r\n\n','Name','Date','Time','Attendence');
c = clock;
if c(4) > 12
s = [num2str(c(4)-12) ,':',num2str(c(5)), ':', num2str(round(c(6)))];
else
s = [num2str(c(4)) ,':',num2str(c(5)), ':', num2str(round(c(6))) ];
end
fprintf(fid,'%s        %s        %s        %s\r\n\n', str, date, s, 'Present');
fclose(fid);
set(handles.text5,'string',['Hello ' str ' ,Your attendence has been Marked.'])
try
s = serial('com22');
fopen(s);
fwrite(s,'A');
pause(1)
fclose(s);
clear s
catch
msgbox({'PLZ CONNECT CABLE OR';'INVALID COM PORT SELECTED'},'WARNING','WARN','MODAL')
uiwait
delete(s)
clear s
end
else
msgbox('YOU ARE NOT A VALID PERSON', 'WARNING','WARN','MODAL')
cla(handles.axes1)
reset(handles.axes1)
cla(handles.axes2)
reset(handles.axes2)
set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5);
set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
end
% --------------------------------------------------------------------
function SINGL_PIC_Callback(hObject, eventdata, handles)
% hObject handle to SINGL_PIC (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
flist = dir('database');
if length(flist) == 2
msgbox('NOTHING TO DELETE','INFO','modal');
return
end
cd('database')
[f,p] = uigetfile('*.jpg','SELECT A PIC TO DELETE IT');
if f == 0
cd ..
return
end
p1 = fullfile(p,f);
delete(p1)
flist = dir(pwd);
if length(flist) == 2
cd ..
return
end
for k = 3:length(flist)
z = flist(k).name;
z(strfind(z,'.') : end) = [];
nlist(k-2) = str2double(z);
end
nlist = sort(nlist);
h = waitbar(0,'PLZ WAIT, WHILE MATLAB IS RENAMING','name','PROGRESS...');
for k = 1:length(nlist)
if k ~= nlist(k)
p = nlist(k);
movefile([num2str(p) '.jpg'] , [num2str(k) '.jpg'])
waitbar((k-2)/length(flist),h,sprintf('RENAMED %s to %s',[num2str(p) '.jpg'],[num2str(k) '.jpg']))
end
pause(.5)
end
close(h)
cd ..
% --------------------------------------------------------------------
function MULTI_PIC_Callback(hObject, eventdata, handles)
% hObject handle to MULTI_PIC (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
flist = dir('database');
if length(flist) == 2
msgbox('NOTHING TO DELETE','INFO','modal');
return
end
for k = 3:length(flist)
na1(k-2,1) = {flist(k).name};
end
[a,b] = listdlg('promptstring','SELECT FILE/FILES TO DELETE','liststring',na1,'listsize',[125 100]);
if b == 0
return
end
cd ('database')
for k = 1:length(a)
str = na1{a(k)};  % index into the selected files, not the first k entries
delete(str)
end
cd ..
flist = dir('database');
if length(flist) == 2
msgbox({'NOTHING TO RENAME';'ALL DELETED'},'INFO','modal');
return
end
cd('database')
flist = dir(pwd);
for k = 3:length(flist)
z = flist(k).name;
z(strfind(z,'.') : end) = [];
nlist(k-2) = str2double(z);
end
nlist = sort(nlist);
h = waitbar(0,'PLZ WAIT, WHILE MATLAB IS RENAMING','name','PROGRESS...');
for k = 1:length(nlist)
if k ~= nlist(k)
p = nlist(k);
movefile([num2str(p) '.jpg'] , [num2str(k) '.jpg'])
waitbar((k-2)/length(flist),h,sprintf('RENAMED %s to %s',[num2str(p) '.jpg'],[num2str(k) '.jpg']))
end
pause(.5)
end
close(h)
cd ..
% --------------------------------------------------------------------
function BR_OWSE_Callback(hObject, eventdata, handles)
% hObject handle to BR_OWSE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
[f,p] = uigetfile('*.jpg','PLEASE SELECT A FACIAL IMAGE');
if f == 0
return
end
p1 = fullfile(p,f);
im = imread(p1);
fd = vision.CascadeObjectDetector();
bbox = step(fd, im);
vo = insertObjectAnnotation(im,'rectangle',bbox,'FACE');
r = size(bbox,1);
if isempty(bbox)
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imshow(vo);
msgbox({'WHAT HAVE U CHOOSEN?';'NO FACE FOUND IN THIS PIC,';'SELECT SINGLE FACE IMAGE.'},'WARNING...!!!','warn','modal')
uiwait
delete(fhx)
return
elseif r > 1
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imshow(vo);
msgbox({'TOO MANY FACES IN THIS PIC';'PLEASE SELECT SINGLE FACE IMAGE'},'WARNING...!!!','warn','modal')
uiwait
delete(fhx)
return
end
bx = questdlg({'CORRECT IMAGE IS SELECTED';'SELECT OPTION FOR FACE EXTRACTION'},'SELECT AN OPTION','MANUALLY','AUTO','CC');
if strcmp(bx,'MANUALLY') == 1
while 1
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imc = imcrop(im);
bbox1 = step(fd, imc);
if size(bbox1,1) ~= 1
msgbox({'YOU HAVENT CROPED A FACE';'CROP AGAIN'},'BAD ACTION','warn','modal')
uiwait
else
break
end
close gcf
end
close gcf
imc = imresize(imc,[300 300]);
cd ('database');
l = length(dir(pwd));
n = [int2str(l-1) '.jpg'];
imwrite(imc,n);
cd ..
while 1
qq = inputdlg('WHAT IS UR NAME?','FILL');
if isempty(qq)
msgbox({'YOU HAVE TO ENTER A NAME';' ';'YOU CANT CLICK CANCEL'},'INFO','HELP','MODAL')
uiwait
else
break
end
end
qq = qq{1};
if exist('info.mat','file') == 2
load ('info.mat')
r = size(z2,1);
z2{r+1,1} = {n , qq};
save('info.mat','z2')
else
z2{1,1} = {n,qq};
save('info.mat','z2')
end
end
if strcmp(bx,'AUTO') == 1
imc = imcrop(im,[bbox(1)-50 bbox(2)-250 bbox(3)+100 bbox(4)+400]);
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imshow(imc)
qx = questdlg({'ARE YOU SATISFIED WITH THE RESULTS?';' ';'IF YES THEN PROCEED';' ';'IF NOT BETTER DO MANUAL CROPING'},'SELECT','PROCEED','MANUAL','CC');
if strcmpi(qx,'proceed') == 1
imc = imresize(imc,[300 300]);
cd ('database');
l = length(dir(pwd));
n = [int2str(l-1) '.jpg'];
imwrite(imc,n);
cd ..
while 1
qq = inputdlg('WHAT IS UR NAME?','FILL');
if isempty(qq)
msgbox({'YOU HAVE TO ENTER A NAME';' ';'YOU CANT CLICK CANCEL'},'INFO','HELP','MODAL')
uiwait
else
break
end
end
qq = qq{1};
if exist('info.mat','file') == 2
load ('info.mat')
r = size(z2,1);
z2{r+1,1} = {n , qq};
save('info.mat','z2')
else
z2{1,1} = {n,qq};
save('info.mat','z2')
end
close gcf
elseif strcmpi(qx,'manual') == 1
while 1
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imc = imcrop(im);
bbox1 = step(fd, imc);
if size(bbox1,1) ~= 1
msgbox({'YOU HAVENT CROPED A FACE';'CROP AGAIN'},'BAD ACTION','warn','modal')
uiwait
else
break
end
close gcf
end
close gcf
imc = imresize(imc,[300 300]);
cd ('database');
l = length(dir(pwd));
n = [int2str(l-1) '.jpg'];
imwrite(imc,n);
cd ..
while 1
qq = inputdlg('WHAT IS UR NAME?','FILL');
if isempty(qq)
msgbox({'YOU HAVE TO ENTER A NAME';' ';'YOU CANT CLICK CANCEL'},'INFO','HELP','MODAL')
uiwait
else
break
end
end
qq = qq{1};
if exist('info.mat','file') == 2
load ('info.mat')
r = size(z2,1);
z2{r+1,1} = {n , qq};
save('info.mat','z2')
else
z2{1,1} = {n,qq};
save('info.mat','z2')
end
else
return
end
end
% --------------------------------------------------------------------
function FRM_CAM_Callback(hObject, eventdata, handles)
% hObject handle to FRM_CAM (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
global co
if isfield(handles,'vdx')
vid = handles.vdx;
stoppreview(vid)
delete(vid)
handles = rmfield(handles,'vdx');
guidata(hObject,handles)
cla(handles.axes1)
reset(handles.axes1)
set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
cla(handles.axes2)
reset(handles.axes2)
set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
end
fd = vision.CascadeObjectDetector();
info = imaqhwinfo('winvideo');
did = info.DeviceIDs;
if isempty(did)
msgbox({'YOUR SYSTEM DOES NOT HAVE A WEBCAM';' ';'CONNECT ONE'},'WARNING....!!!!','warn','modal')
return
end
did = cell2mat(did);
for k = 1:length(did)
devinfo = imaqhwinfo('winvideo',k);
na(1,k) = {devinfo.DeviceName};
sr(1,k) = {devinfo.SupportedFormats};
end
[a,b] = listdlg('promptstring','SELECT A WEB CAM DEVICE','liststring',na,'ListSize', [125, 75],'SelectionMode','single');
if b == 0
return
end
if b ~= 0
frmt = sr{1,a};
[a1,b1] = listdlg('promptstring','SELECT RESOLUTION','liststring',frmt,'ListSize', [150, 100],'SelectionMode','single');
if b1 == 0
return
end
end
frmt = frmt{a1};
l = find(frmt == '_');
res = frmt(l+1 : end);
l = find(res == 'x');
res1 = str2double(res(1: l-1));
res2 = str2double(res(l+1 : end));
axes(handles.axes1)
vid = videoinput('winvideo', a);
vr = [res1 res2];
nbands = get(vid,'NumberofBands');
h2im = image(zeros([vr(2) vr(1) nbands] , 'uint8'));
preview(vid,h2im);
handles.vdx = vid;
guidata(hObject,handles)
tx = msgbox('PLZ STAND IN FRONT OF CAMERA STILL','INFO......!!!');
pause(1)
delete(tx)
kx = 0;
while 1
im = getframe(handles.axes1);
im = im.cdata;
bbox = step(fd, im);
vo = insertObjectAnnotation(im,'rectangle',bbox,'FACE');
axes(handles.axes2)
imshow(vo)
if size(bbox,1) > 1
msgbox({'TOO MANY FACES IN FRAME';' ';'ONLY ONE FACE IS ACCEPTED'},'WARNING.....!!!','warn','modal')
uiwait
stoppreview(vid)
delete(vid)
handles = rmfield(handles,'vdx');
guidata(hObject,handles)
cla(handles.axes1)
reset(handles.axes1)
set(handles.axes1,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)
cla(handles.axes2)
reset(handles.axes2)
set(handles.axes2,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)
return
end
kx = kx + 1;
if kx > 10 && ~isempty(bbox)
break
end
end
imc = imcrop(im,[bbox(1)+3 bbox(2)-35 bbox(3)-10 bbox(4)+70]);
imx = imresize(imc,[300 300]);
fhx = figure(2);
set(fhx,'menubar','none','numbertitle','off','name','PREVIEW')
imshow(imx)
cd ('database');
l = length(dir(pwd));
n = [int2str(l-1) '.jpg'];
imwrite(imx,n);
cd ..
while 1
qq = inputdlg('WHAT IS UR NAME?','FILL');
if isempty(qq)
msgbox({'YOU HAVE TO ENTER A NAME';' ';'YOU CANT CLICK CANCEL'},'INFO','HELP','MODAL')
uiwait
else
break
end
end
qq = qq{1};
if exist('info.mat','file') == 2
load ('info.mat')
r = size(z2,1);
z2{r+1,1} = {n , qq};
save('info.mat','z2')
else
z2{1,1} = {n,qq};
save('info.mat','z2')
end
close gcf
stoppreview(vid)
delete(vid)
handles = rmfield(handles,'vdx');
guidata(hObject,handles)
cla(handles.axes1)
reset(handles.axes1)
set(handles.axes1,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)
cla(handles.axes2)
reset(handles.axes2)
set(handles.axes2,'box','on','xtick',[],'ytick',[],'xcolor',[1 1 1],'ycolor',[1 1 1],'color',co,'linewidth',1.5)
% --- Executes on key press with focus on edit1 and none of its controls.
function edit1_KeyPressFcn(hObject, eventdata, handles)
% hObject handle to edit1 (see GCBO)
% eventdata structure with the following fields (see UICONTROL)
% Key: name of the key that was pressed, in lower case
% Character: character interpretation of the key(s) that was pressed
% Modifier: name(s) of the modifier key(s) (i.e., control, shift) pressed
% handles structure with handles and user data (see GUIDATA)
pass = get(handles.edit1,'UserData');
v = double(get(handles.figure1,'CurrentCharacter'));
if v == 8
pass = pass(1:end-1);
set(handles.edit1,'string',pass)
elseif any(v == 65:90) || any(v == 97:122) || any(v == 48:57)
pass = [pass char(v)];
elseif v == 13
p = get(handles.edit1,'UserData');
if strcmp(p,'123') == true
delete(hObject);
delete(handles.pushbutton2)
delete(handles.pushbutton1);
delete(handles.text2);
delete(handles.text3);
% delete(handles.text1);
delete(handles.text4);
msgbox('WHY DONT U READ HELP BEFORE STARTING','HELP....!!!','help','modal')
set(handles.AD_NW_IMAGE,'enable','on')
set(handles.DE_LETE,'enable','on')
set(handles.TRAIN_ING,'enable','on')
set(handles.STA_RT,'enable','on')
set(handles.RESET_ALL,'enable','on')
set(handles.EXI_T,'enable','on')
set(handles.HE_LP,'enable','on')
set(handles.DATA_BASE,'enable','on')
set(handles.text5,'visible','on')
return
else
beep
msgbox('INVALID PASSWORD FRIEND... XX','WARNING....!!!','warn','modal')
uiwait;
set(handles.edit1,'string','')
return
end
else
msgbox({'Invalid Password Character';'Can''t use Special Character'},'warn','modal')
uiwait;
set(handles.edit1,'string','')
return
end
set(handles.edit1,'UserData',pass)
set(handles.edit1,'String',char('*'*sign(pass)))
% --------------------------------------------------------------------
function VI_EW_Callback(hObject, eventdata, handles)
% hObject handle to VI_EW (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
f = dir('database');
if length(f) == 2
msgbox('YOUR DATA BASE HAS NO IMAGE TO DISPLAY','SORRY','modal')
return
end
l = length(f)-2;
while 1
a = factor(l);
if length(a) >= 4
break
end
l = l+1;
end
d = a(1: ceil(length(a)/2));
d = prod(d);
d1 = a(ceil(length(a)/2)+1 : end);
d1 = prod(d1);
zx = sort([d d1]);
figure('menubar','none','numbertitle','off','name','Images of Database','color',[0.0431 0.5176 0.7804],'position',[300 200 600 500])
for k = 3:length(f)
im = imread(fullfile('database', f(k).name));  % dir() returns bare names, so prepend the folder
subplot(zx(1),zx(2),k-2)
imshow(im)
title(f(k).name,'fontsize',10,'color','w')
end
% --------------------------------------------------------------------
function Start_Training_Callback(hObject, eventdata, handles)
% hObject handle to Start_Training (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
ff = dir('database');
if length(ff) == 2
h = waitbar(0,'Plz wait Matlab is scanning ur database...','name','SCANNING IS IN PROGRESS');
for k = 1:100
waitbar(k/100)
pause(0.03)
end
close(h)
msgbox({'NO IMAGE FOUND IN DATABASE';'FIRST LOAD YOUR DATABASE';'USE ''ADD NEW IMAGE'' MENU'},'WARNING....!!!','WARN','MODAL')
return
end
if exist('features.mat','file') == 2
bx = questdlg({'TRAINING HAS ALREDY BEEN DONE';' ';'WANT TO TRAIN DATABASE AGAIN?'},'SELECT','YES','NO','CC');
if strcmpi(bx,'yes') == 1
builddatabase
msgbox('TRAINING DONE....PRESS OK TO CONTINUE','OK','modal')
return
else
return
end
else
builddatabase
msgbox('TRAINING DONE....PRESS OK TO CONTINUE','OK','modal')
return
end
% --------------------------------------------------------------------
function BYE_Callback(hObject, eventdata, handles)
% hObject handle to BYE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
close gcf
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%end%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% --------------------------------------------------------------------
function ATTENDENCE_Callback(hObject, eventdata, handles)
% hObject handle to ATTENDENCE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
if exist('attendence_sheet.txt','file') == 2
winopen('attendence_sheet.txt')
else
msgbox('NO ATTENDENCE SHEET TO DISPLAY','INFO...!!!','HELP','MODAL')
end
% --------------------------------------------------------------------
function DEL_ATTENDENCE_Callback(hObject, eventdata, handles)
% hObject handle to DEL_ATTENDENCE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
if exist('attendence_sheet.txt','file') == 2
delete('attendence_sheet.txt')
msgbox('ATTENDENCE DELETED','INFO...!!!','MODAL')
else
msgbox('NO ATTENDENCE SHEET TO DELETE','INFO...!!!','HELP','MODAL')
end
% --------------------------------------------------------------------
function Untitled_1_Callback(hObject, eventdata, handles)
% hObject handle to Untitled_1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
x = questdlg({'Resetting will Clear the followings: ';'1. Attendence_sheet';'2. Database';'3. features.mat';'4. Info.mat';'Do u want to continue?'},'Please select...!!');
if strcmpi(x,'yes') == 1
delete('attendence_sheet.txt')
delete('features.mat')
delete('info.mat')
cd ([pwd, '\database'])
f = dir(pwd);
for k = 1:length(f)
delete(f(k).name)
end
cd ..
cla(handles.axes1);
reset(handles.axes1);
set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
cla(handles.axes2);
reset(handles.axes2);
set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
set(handles.text5,'string','')
beep
msgbox('All Reset','Info','modal')
end
% --------------------------------------------------------------------
function Untitled_2_Callback(hObject, eventdata, handles)
% hObject handle to Untitled_2 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
cla(handles.axes1);
reset(handles.axes1);
set(handles.axes1,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
cla(handles.axes2);
reset(handles.axes2);
set(handles.axes2,'box','on','xcolor','w','ycolor','w','xtick',[],'ytick',[],'color',[0.0431 0.5176 0.7804],'linewidth',1.5)
set(handles.text5,'string','')
% --------------------------------------------------------------------
function Untitled_3_Callback(hObject, eventdata, handles)
% hObject handle to Untitled_3 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function Untitled_4_Callback(hObject, eventdata, handles)
% hObject handle to Untitled_4 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% --------------------------------------------------------------------
function Untitled_5_Callback(hObject, eventdata, handles)
% hObject handle to Untitled_5 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
CHAPTER – 10
TESTING
In general, software engineers distinguish software faults from software failures. In the case of a failure, the software does not do what the user expects. A fault is a programming error that may or may not actually manifest as a failure; it can also be described as an error in the semantics of a computer program. A fault becomes a failure only when the exact computation conditions are met, one of them being that the faulty portion of the software
executes on the CPU. A fault can also turn into a failure when the software is ported to a
different hardware platform or a different compiler, or when the software gets extended.
Software testing is the technical investigation of the product under test to provide stakeholders
with quality related information.
Software testing may be viewed as a sub-field of Software Quality Assurance but
typically exists independently (and there may be no SQA areas in some companies). In SQA,
software process specialists and auditors take a broader view on software and its development.
They examine and change the software engineering process itself to reduce the amount of faults
that end up in the code or deliver faster.
Regardless of the methods used or the level of formality involved, the desired result of testing is a level of confidence in the software, so that the organization is confident the software has an acceptable defect rate. What constitutes an acceptable defect rate depends on the nature of the software: an arcade video game designed to simulate flying an airplane would presumably have a much higher tolerance for defects than software used to control an actual airliner.
A problem with software testing is that the number of defects in a software product can be very large, and the number of configurations of the product larger still. Bugs that occur infrequently are difficult to find in testing. A rule of thumb is that a system expected to function without faults for a certain length of time must already have been tested for at least that length of time. This has severe consequences for projects that aim to write long-lived, reliable software.
A common practice is for software testing to be performed by an independent group of testers after the functionality is developed but before it is shipped to the customer. This practice often results in the testing phase being used as a project buffer to compensate for project delays. Another practice is to start software testing at the same moment the project starts and to continue it as an ongoing process until the project finishes.
Another common practice is for test suites to be developed during technical-support escalation procedures. Such tests are then maintained in regression-testing suites to ensure that future updates to the software do not repeat any of the known mistakes.
Software testing is the process used to help identify the correctness, completeness, security, and quality of developed computer software. Testing is a process of technical investigation, performed on behalf of stakeholders, that is intended to reveal quality-related information about the product with respect to the context in which it is intended to operate. This includes, but is not limited to, the process of executing a program or application with the intent of finding errors. Quality is not an absolute; it is value to some person. With that in mind, testing can never completely establish the correctness of arbitrary computer software; it furnishes a criticism or comparison of the state and behavior of the product against a specification. An important point is that software testing should be distinguished from the separate discipline of Software Quality Assurance (SQA), which encompasses all business process areas, not just testing.
There are many approaches to software testing, but effective testing of complex products is essentially a process of investigation, not merely a matter of creating and following a routine procedure. One definition of testing is "the process of questioning a product in order to evaluate it", where the "questions" are operations the tester attempts to execute with the product, and the product answers with its behavior in reaction to the tester's probing. Although most of the intellectual processes of testing are nearly identical to those of review or inspection, the word testing connotes the dynamic analysis of the product: putting the product through its paces. Some common quality attributes include capability, reliability, efficiency, portability, maintainability, compatibility, and usability. A good test is sometimes described as one that reveals an error; however, more recent thinking suggests that a good test is one that reveals information of interest to someone who matters within the project community.
Testing Methodologies:
Black Box Testing
It is the testing process in which the tester tests an application without any knowledge of its internal structure. Usually, test engineers are involved in black box testing.
White Box Testing
It is the testing process in which the tester tests an application with knowledge of its internal structure. Usually, the developers are involved in white box testing.
Gray Box Testing
It is the process in which a combination of black box and white box techniques is used.
Types of Testing
1. Regression Testing.
2. Re-Testing.
3. Static Testing.
4. Dynamic Testing.
5. Alpha Testing.
6. Beta Testing.
7. Monkey Testing
8. Compatibility Testing.
9. Installation Testing.
1. Regression Testing: It is one of the most important types of testing. Regression testing is the process in which functionality that has already been tested is tested once again whenever a new change is added, in order to check whether the existing functionality remains the same.
2. Re-Testing: It is the process in which testing is performed on functionality that has already been tested, to make sure that reported defects are reproducible and to rule out environment issues if any defects are found.
3. Static Testing: It is the testing performed on an application when it is not being executed.
Ex: GUI, document testing.
4. Dynamic Testing: It is the testing performed on an application while it is being executed.
Ex: Functional testing.
5. Alpha Testing: It is a type of user acceptance testing that is conducted on an application just before it is released to the customer.
6. Beta Testing: It is a type of UAT that is conducted on an application after it is released to the customer, when it is deployed in the real-time environment and accessed by real users.
7. Monkey Testing: It is the process in which the application is tested with random inputs and random navigation, without predefined test cases, in order to check its robustness.
8. Compatibility Testing: It is the testing process in which the product is tested in environments with different combinations of databases, application servers, browsers, etc., in order to check how far the product is compatible with all these environment and platform combinations.
9. Installation Testing: It is the process in which the tester tries to install or deploy the module into the corresponding environment by following the guidelines given in the deployment document, and checks whether the installation is successful or not.
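As an illustration of the regression testing described above, a small MATLAB harness can re-run the same inputs after every code change and compare the outputs with previously recorded results. This is only a sketch: `recognizeFace`, the image file names, and the expected IDs are hypothetical placeholders, not part of the project's actual code.

```matlab
% Regression-test sketch (hypothetical function and file names).
% expectedIDs holds the results recorded before the code change; after any
% modification, the same inputs are run again and compared with the record.
testImages  = {'student1.jpg', 'student2.jpg', 'unknown.jpg'};
expectedIDs = {'15KN1A05B7',   '15KN1A05A1',   'UNKNOWN'};

for k = 1:numel(testImages)
    img = imread(testImages{k});
    actualID = recognizeFace(img);   % hypothetical matcher under test
    if strcmp(actualID, expectedIDs{k})
        fprintf('PASS: %s -> %s\n', testImages{k}, actualID);
    else
        fprintf('FAIL: %s expected %s, got %s\n', ...
                testImages{k}, expectedIDs{k}, actualID);
    end
end
```

Running this script after each change gives a quick pass/fail summary showing whether previously working recognition results still hold.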
Test Cases:
1. Matching: the captured face image is compared with the images stored in the database.
2. If the image is matched: the student is recognized and attendance is marked automatically.
3. If the image is not matched: the face is reported as not recognized and no attendance is marked.
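The two outcomes above can be sketched as a single MATLAB check; `recognizeFace` and `markAttendance` are hypothetical names standing in for the project's matching and attendance-marking routines.

```matlab
% Sketch of the matched / not-matched test cases (hypothetical names).
img = imread('capture.jpg');        % face captured from the camera
studentID = recognizeFace(img);     % hypothetical matcher

if ~strcmp(studentID, 'UNKNOWN')
    % Test case "image is matched": attendance should be marked.
    markAttendance(studentID);      % hypothetical attendance writer
    msgbox(['Attendance marked for ' studentID], 'Matched', 'modal');
else
    % Test case "image is not matched": no attendance entry is made.
    msgbox('Face not recognized', 'Not Matched', 'modal');
end
```

Executing the check once with a known face and once with an unseen face exercises both test cases.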
1. Currently, the system has reached an accuracy level of up to 80% for partial and dense classroom images.
2. Further, two or more IP cameras can be employed and each image can be processed separately. The results can then be merged to obtain better results and accuracy in denser classrooms.
CHAPTER – 13
BIBLIOGRAPHY
http://ieeexplore.ieee.org/xpl/articleDetails.jsp?tp=&arnumber=4153394&queryText%3Dface+detection
http://ieeexplore.ieee.org/xpl/articleDetails.jsp?tp=&arnumber=5274642&queryText%3Dface+detection
Advantages:
The software can be used for security purposes in organizations and in secured zones.
The software stores the faces that are detected and automatically marks attendance.
Disadvantages:
The system does not recognize faces properly in poor lighting, so it may give false results.