
TECHNOLOGICAL INSTITUTE OF THE PHILIPPINES

938 Aurora Boulevard, Cubao, Quezon City

COLLEGE OF ENGINEERING AND ARCHITECTURE


Electronics Engineering Department

Image Processing Using MATLAB

In Partial Fulfilment of the Requirements


Needed for the Completion of the Course
Signals, Spectra and Signal Processing (ECE401)

Peña, Judy-Ann S.

Sarmiento, John Michael A.

Soriano, Jessica T.

Soriano, Regine N.

Vitor, Jerome Kennedy B.

Engr. Sheryl G. Velasquez


Instructor
2019
Table of Contents
Abstract................................................................................................................................................................. 3
List of Figures ....................................................................................................................................................... 4
I. Introduction .................................................................................................................................................. 5
Background of the study.................................................................................................................................. 5
Objectives of the study .................................................................................................................................... 5
Significance of the study ................................................................................................................................. 5
Scope and Delimitations.................................................................................................................................. 6
II. Theoretical Framework ................................................................................................................................ 6
Review of Related Literatures ......................................................................................................................... 6
Concept of the Study ....................................................................................................................................... 8
Definition of Terms .......................................................................................................................................... 9
Line Edge ....................................................................................................................................................... 14
Interface Design............................................................................................................................................. 18
Codes Used ................................................................................................................................................... 18
III. Results and Discussion ............................................................................................................................. 24
Evaluation of Results ..................................................................................................................................... 24
Verification of Studies .................................................................................................................................... 31
IV. Summary, Conclusion, And Recommendations .................................................................................. 31
Summary ........................................................................................................................................................ 31
Conclusion ..................................................................................................................................................... 32
Recommendation........................................................................................................................................... 32
V. References ................................................................................................................................................. 32
VI. Appendices ............................................................................................................................................ 33
VII. Curriculum Vitae .................................................................................................................................... 34

Abstract

The researchers created a MATLAB application that modifies the colours of a picture and converts it into images of different types and scales. The application can isolate the red, green and blue components of the original picture, and its highlight is the ability to transform the image into several imaging types through push buttons on a graphical user interface. Using a single test picture, the main goal of this project is to ensure that every operation is executed properly.

List of Figures
Figure 1: Interface Used .................................................................................................................................... 18
Figure 2: Retrieve Image ................................................................................................................................... 24
Figure 3: Red Image .......................................................................................................................................... 25
Figure 4: Green Image ....................................................................................................................................... 25
Figure 5: Blue Image .......................................................................................................................................... 26
Figure 6: Grayscale Image ................................................................................................................................ 26
Figure 7: Binary Image....................................................................................................................................... 27
Figure 8: Indexed Image .................................................................................................................................... 27
Figure 9: Intensity of the Image ......................................................................................................................... 28
Figure 10: Morphological Opening .................................................................................................................... 28
Figure 11: Subtracted Background.................................................................................................................... 29
Figure 12: Adjusted Contrast ............................................................................................................................. 29
Figure 13: Threshold of the Image .................................................................................................................... 30
Figure 14: Sobel Gradient.................................................................................................................................. 30
Figure 15: Edge Detection ................................................................................................................................. 31
Figure 16: Reference codes in Image Processing Using MATLAB ................................................................. 33

I. Introduction

Background of the study

MATLAB is a high-level language and interactive environment that enables one to perform computationally intensive tasks faster than with traditional programming languages such as C, C++ and Fortran. The Image Processing Toolbox is a collection of functions that extend the capabilities of MATLAB's numeric computing environment. The toolbox supports a wide range of image processing operations, including geometric operations, transforms, linear filtering and filter design. This study covers an application that deals with linear filtering and filter design, such as edge detection.
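As a brief sketch of how the toolbox is used, the lines below read an image, convert it to greyscale and run the toolbox's built-in Sobel edge detector. The file name 'sample.jpg' is only a placeholder for an RGB image on the MATLAB path.

img  = imread('sample.jpg');      % placeholder file name; use any RGB image on the path
gray = rgb2gray(img);             % convert the colour image to greyscale
bw   = edge(gray, 'sobel');       % built-in Sobel edge detection from the toolbox
imshowpair(gray, bw, 'montage');  % show the greyscale image and its edge map side by side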

Objectives of the study

In this project, we are expected to:

• Design programs based on a written description
• Demonstrate skill in devising, programming, executing, running, and troubleshooting programs
• Design a project related to digital signal processing

Significance of the study

By presenting this study, we hope that this project will be beneficial to students who will also take this course, helping them learn how MATLAB is used and what the advantages of the tool are for image processing. We would also like this project to serve as a reference for other researchers who encounter a similar project.

Scope and Delimitations

This study is focused on making an image processing application using MATLAB.

The result of this study is delimited to and will mainly benefit the following:

Students. The project would serve as a reference for future ECE and non-ECE students in making their own application.

Researchers. The project would provide ideas and methods for making an application that deals with image processing, which could serve as a reference.

School. The project would provide benefit in terms of assessing and evaluating the students' perception and understanding of the output attained.

II. Theoretical Framework

Review of Related Literatures

In computer science, digital image processing is the use of computer algorithms to perform image processing on digital images. As a subcategory or field of digital signal processing, digital image processing has many advantages over analog image processing. It allows a much wider range of algorithms to be applied to the input data and can avoid problems such as the build-up of noise and signal distortion during processing. Since images are defined over two dimensions (perhaps more), digital image processing may be modeled in the form of multidimensional systems. (Wikipedia, 2019)

An image filter is a technique through which size, colors, shading and other characteristics of an image are altered. An image filter is used to transform the image using different graphical editing techniques. Image filters are usually done through graphic design and editing software.

Image filters are primarily used to edit an image using computer software. An image filter generally changes the image at the pixel level, meaning each pixel is affected individually. It can be applied to 2-D and 3-D images. Typically, the image filter process includes options such as:

• Editing the color scheme/theme/contrast of the image
• Adjusting image brightness
• Adding effects to the image
• Changing the texture

The term image filter may also refer to the process of filtering (excluding) images from data, folders or Web searches. (Techopedia, n.d.)
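As an illustration of the pixel-level editing operations listed above, the following MATLAB sketch adjusts the brightness, contrast and texture of an image. The file name 'photo.jpg' and the adjustment values are placeholders chosen only for illustration.

img       = imread('photo.jpg');                    % placeholder input image
brighter  = img + 40;                               % brightness: uint8 arithmetic saturates at 255
stretched = imadjust(rgb2gray(img));                % contrast: stretch the greyscale range
softened  = imfilter(img, fspecial('average', 5));  % texture: smooth with a 5x5 averaging filter
figure; imshow(brighter);  title('Brighter');
figure; imshow(stretched); title('Contrast stretched');
figure; imshow(softened);  title('Softened');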

Filtering in image processing is a process that cleans up appearances and allows for selective highlighting of specific information. A number of techniques are available, and the best options can depend on the image and how it will be used. Both analog and digital image processing may require filtering to yield a usable and attractive end result. This can be a routine part of the editing process used to prepare images for distribution.

In the case of film photography, when a photographer develops prints, it may be necessary to use filtering to get the desired effects. Filters can be mounted in the enlarger to improve image quality, or for activities like developing black and white prints from color negatives. The photographer may perform tests with several filters to find the most appropriate.

Film photographers can use filtering in image processing for activities like sharpening up contrast. The filter can change the wavelength of the light as it passes through the enlarger, altering the resulting exposure and developed image. Kits of common filters for enlargers and cameras are widely available commercially.

Digital filtering offers a number of advanced photo manipulation options beyond the basic filters used in photo development. One common use of filtering in image processing is to remove blur. Images may be blurry because of file degradation, moving objects in the frame when the photo was taken, and other issues. The photographer can use a filtration algorithm to selectively target pixels and smooth the image out. More complex filters may be able to reconstruct partially damaged images through averaging, using available data to estimate missing contents in an image.
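A simple way to see this smoothing-by-averaging idea in MATLAB is to corrupt a greyscale image with impulse noise and then filter it. The file name and noise level below are assumptions made only for illustration.

gray     = rgb2gray(imread('photo.jpg'));            % placeholder input image
noisy    = imnoise(gray, 'salt & pepper', 0.02);     % simulate impulse noise
averaged = imfilter(noisy, fspecial('average', 3));  % 3x3 neighbourhood averaging
cleaned  = medfilt2(noisy, [3 3]);                   % median filtering handles outlying pixels better
imshowpair(averaged, cleaned, 'montage');            % compare the two smoothed results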

Another use for filtering in image processing is in the handling of images where technicians want to highlight specific objects of interest in the picture. For example, astronomers might pass an image through filters to selectively restrict data from certain wavelengths. This can allow other information in the image to pop into relief. Filters can also remove noise like haze from images to make them cleaner and clearer, even if they are not specifically blurred.

Software programs allow for very complex filtering in image processing. Many come with presets that people can use for basic tasks like adding soft filters to portraits or sharpening up contrast in dim images. Users can also develop their own filters, coding in specific parameters to make a custom version for a particular need or project. This may require advanced programming skills, as well as a thorough knowledge of how photography works to yield the best results. (Wisegeek, 2019)

Concept of the Study

This study focuses on how the graphical user interface is used. It covers image-processing operations such as editing the color scheme/theme/contrast of the image, adjusting image brightness, adding effects to the image and changing the texture. The researchers used MATLAB as the software for this project.

Definition of Terms

Average Greyscale

The shade of grey produced by adding the brightnesses of the RGB components of a color pixel and

dividing by three.
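For instance, the definition above corresponds to the following computation, assuming a colour image stored in a uint8 array (the file name is a placeholder):

rgb     = double(imread('photo.jpg'));   % placeholder input; convert to double to avoid uint8 overflow
avgGray = uint8((rgb(:,:,1) + rgb(:,:,2) + rgb(:,:,3)) / 3);  % add the R, G and B planes and divide by three
imshow(avgGray);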

Bitmap Image

An image composed of black and white pixels.

Bounding Box

The smallest rectangle that encloses a shape so that each of the four sides touches an extremity of

the shape.

Brightness

Determines the intensity of the color presented by a pixel in a color image, or the shade of grey

presented by a pixel in a greyscale image.

Brightness Transformation

A point process that maps input brightnesses onto output brightnesses with a linear or non-linear

mathematical function.

Closing

A morphological operation produced by following a dilation by an erosion. Often used for filling holes

in bitmap images.

Color Model

Determines how the color in a digital image is represented numerically. Examples include the RGB

and HSB color models.

Composition

A point process that overlays the pixels of a foreground input image onto the pixels of a background

input image.

Contrast

The difference between the lightest and darkest regions of an image.

Contrast Expansion

An image-processing technique that re-distributes the brightness in an image to eliminate regions

that are either too dark or too light. Examples include basic and ends-in contrast expansion.

Convolution

A method of calculating the new value of a central pixel in a neighborhood by multiplying each pixel

in the neighborhood by a corresponding weight; the new value of the central pixel is the sum of the

multiplications.
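As a sketch, the built-in conv2 function performs exactly this weighted-sum calculation. Here a 3x3 kernel of equal weights (an averaging kernel) is applied to a greyscale image; the file name is assumed for illustration.

gray   = double(rgb2gray(imread('photo.jpg')));  % placeholder input, converted to double for conv2
kernel = ones(3) / 9;                            % 3x3 kernel: every neighbourhood weight is 1/9
out    = conv2(gray, kernel, 'same');            % each output pixel is the weighted sum of its neighbourhood
imshow(uint8(out));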

Diagonal Axis

The line that runs from the top-left corner of an image to the bottom-right corner, or from the top-right

corner to the bottom-left corner.

Digital Image

An image captured by an imaging device and represented in a computer as a rectangular grid of

pixels.

Dilation

A morphological operation that increases the size of objects in an image by adding a layer of

foreground pixels around each object.

Edge

Edges mark the boundaries between the objects in a scene. A large change in pixel brightness over

a small number of pixels often indicates the presence of an edge.

Edge Detector

An image-processing routine that flags the large changes in pixel brightness that indicate potential

edges. Edge detectors often visualize their results in edge maps. Examples include the Sobel, Prewitt, Kirsch

and Laplacian edge detectors.

Edge Direction

The angle that specifies the direction of an edge. The angle is perpendicular to the direction of the

large change in brightness that indicates the edge.

Edge Magnitude

A number that represents how confident an edge detector is that it has found an edge in an image.

Edge Map

A greyscale output image that visualizes the magnitude of the edge found at each pixel in an input

image; the greater the magnitude, the brighter the corresponding edge-map pixel. Thresholding an edge map

highlights the strongest edges.

Edge Mask

A set of convolution weights that highlight the size and direction of the edges in an image.

Erosion

A morphological operation that decreases the size of objects in an image by removing a layer of

foreground pixels around each object. Often used for removing projections and blobs in bitmap images.

Frame Averaging

A point process that removes noise from a series of input images taken of the same subject. Each

output-image pixel value is the average of the corresponding input-image pixel values.

Greyscale Image

An image composed of pixels that present shades of grey.

High-key Image

An image that represents a naturally light subject.

High-contrast Image

An image with large numbers of pixels in the shadows and highlights.

Horizontal Axis

The line that runs through the center of an image from the left of the image to the right.

HSB

A color model that represents each color with three numbers that specify the hue (H), the saturation

(S) and the brightness (B) of the color.

Hue

The color in the HSB color model.

Image

An image records a visual snapshot of the world around us.

Image Processing

The field of computer science that develops techniques for enhancing digital images to make them

more enjoyable to look at, and easier to analyze by computers as well as humans.

Input Image

The image transformed by an image-processing routine.

Inversion

A point process that produces an effect similar to photographic negatives: dark pixels become light

and light pixels become dark.

Kernel

A rectangular grid of convolution weights.

Line Edge

A line chain of pixels that separates a region of light pixels from a region of dark pixels.

Low-contrast Image

An image that uses only a small range of the available brightness. Low-contrast images are mostly

dark, mostly dull or mostly light.

Low-key Image

An image that represents a naturally dark subject.

Morphological Operation

A category of image-processing techniques that operate on the structure of the objects in an image.

Noise

Unwanted changes to the values of the pixels in an image, often introduced by the imaging device

during capture. Examples include impulse noise and Gaussian noise.

Non-primary Color

A color created by mixing the red, green and blue primary colors of the RGB color model.

NTSC Greyscale

A shade of grey produced by multiplying the brightnesses of the RGB components of a color pixel

by a set of weights that emphasize the green component. Named after the committee that oversees US

television.
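The weighting can be written out explicitly; the sketch below uses the usual NTSC/ITU-R coefficients, which is also what rgb2gray applies internally (the file name is a placeholder).

rgb  = double(imread('photo.jpg'));   % placeholder colour image
ntsc = uint8(0.299*rgb(:,:,1) + 0.587*rgb(:,:,2) + 0.114*rgb(:,:,3));  % green is weighted most heavily
imshow(ntsc);                         % compare with rgb2gray(imread('photo.jpg'))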

Opening

A morphological operation produced by following an erosion by a dilation. Often used for removing small projections and specks of noise in bitmap images.

Outlying Pixel

A pixel with an extreme brightness that is much higher or lower than the brightnesses of the other

pixels in the image.

Output Image

An image that contains the results of applying an image-processing routine to an image.

Photo Restoration

The application of a series of image-processing routines to enhance a damaged photograph.

Pixel

A square unit of visual information that represents a tiny part of a digital image.

Pixel Depth

The number of colors or shades of grey a pixel can present. Bitmap pixels have depth two, typical

greyscale pixels have depth 256, and typical color pixels have depth 16,777,216.

Pixel Neighborhood

A region of pixels processed by an area process. Typical neighborhood dimensions are 3x3 pixels

and 5x5 pixels.

Point Processes

A category of image-processing techniques that calculate the value of each output-image pixel from

the value of the corresponding input-image pixel. Examples include inversion and pseudo-color.

Potential Edge

Edge detectors flag all large changes in pixel brightness over a small number of pixels as a potential

edge. An edge-analysis system then decides whether the change in brightness represents the border of an

object—a real edge—or some other feature of the object, such as its texture.

Primary Colors

The colors red, green and blue from which all other colors in the RGB color model are mixed.

Quantization

The calculation that maps the fractional measurements made by imaging devices onto proportional

integer pixel brightnesses.

Ramp Edge

A region of pixels that separates a region of light pixels from a region of dark pixels. The pixels in the

region change gradually from light to dark.

Raw Color

The color of the pixels in an image captured by a color CCD before the two unknown RGB-component

brightnesses of each pixel have been interpolated from the known brightnesses of the corresponding

components of neighboring pixels.

Resolution

The number of pixels available to represent the details of the subject of a digital image.

RGB

A color model that represents each color with three numbers that specify the amounts of red (R),

green (G) and blue (B) that produce the color.

RGB Color Cube

Visualizes the amounts of red, green and blue required to produce each color in the RGB color model

as a point in a cube at co-ordinates (x, y, z).

Sobel Operator

The Sobel operator, sometimes called the Sobel–Feldman operator or Sobel filter, is used in image

processing and computer vision, particularly within edge detection algorithms where it creates an image

emphasising edges.
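A compact way to apply the operator is to convolve the image with the two 3x3 Sobel kernels and combine the results into a gradient magnitude, as sketched below with an assumed input file name. This is the same idea implemented with explicit loops in the SOBEL_Callback code later in this report.

gray = double(rgb2gray(imread('photo.jpg')));  % placeholder input image
sx   = [-1 0 1; -2 0 2; -1 0 1];               % Sobel kernel for the x direction
sy   = sx';                                    % Sobel kernel for the y direction
gx   = conv2(gray, sx, 'same');                % horizontal gradient component
gy   = conv2(gray, sy, 'same');                % vertical gradient component
mag  = sqrt(gx.^2 + gy.^2);                    % gradient magnitude emphasises edges
imshow(mag, []);                               % rescale for display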

Thresholding

A point process that produces a bitmap version of a greyscale image. Black bitmap pixels represent

greyscale pixels darker than a threshold brightness; white bitmap pixels represent greyscale pixels lighter

than the threshold.


Vertical Axis

The line that runs through the center of an image from the top of the image to the bottom. (Morgan,

n.d.)

Interface Design

Figure 1: Interface Used

Codes Used

function varargout = GUI(varargin)


% GUI MATLAB code for GUI.fig
% GUI, by itself, creates a new GUI or raises the existing
% singleton*.
%
% H = GUI returns the handle to a new GUI or the handle to
% the existing singleton*.
%
% GUI('CALLBACK',hObject,eventData,handles,...) calls the local
% function named CALLBACK in GUI.M with the given input arguments.
%
% GUI('Property','Value',...) creates a new GUI or raises the
% existing singleton*. Starting from the left, property value pairs are
% applied to the GUI before GUI_OpeningFcn gets called. An
% unrecognized property name or invalid value makes property application

% stop. All inputs are passed to GUI_OpeningFcn via varargin.
%
% *See GUI Options on GUIDE's Tools menu. Choose "GUI allows only one
% instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help GUI

% Last Modified by GUIDE v2.5 21-Oct-2019 01:40:37

% Begin initialization code - DO NOT EDIT


gui_Singleton = 1;
gui_State = struct('gui_Name', mfilename, ...
'gui_Singleton', gui_Singleton, ...
'gui_OpeningFcn', @GUI_OpeningFcn, ...
'gui_OutputFcn', @GUI_OutputFcn, ...
'gui_LayoutFcn', [] , ...
'gui_Callback', []);
if nargin && ischar(varargin{1})
gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
[varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT

% --- Executes just before GUI is made visible.


function GUI_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% varargin command line arguments to GUI (see VARARGIN)

% Choose default command line output for GUI


handles.output = hObject;

% Update handles structure


guidata(hObject, handles);

% UIWAIT makes GUI wait for user response (see UIRESUME)


% uiwait(handles.figure1);

% --- Outputs from this function are returned to the command line.
function varargout = GUI_OutputFcn(hObject, eventdata, handles)
% varargout cell array for returning output args (see VARARGOUT);
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;

% --- Executes on button press in pushbutton1.


function pushbutton1_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
axes(handles.axes1);
imshow(mycolorimage);

% --- Executes on button press in RGB.


function RGB_Callback(hObject, eventdata, handles)
% hObject handle to RGB (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --- Executes on button press in GRAYSCALE.


function GRAYSCALE_Callback(hObject, eventdata, handles)
% hObject handle to GRAYSCALE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
mygrayimage=rgb2gray(mycolorimage);
axes(handles.axes2);
imshow(mygrayimage); title('Grayscale');

% --- Executes on button press in BINARY.


function BINARY_Callback(hObject, eventdata, handles)
% hObject handle to BINARY (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
mybinimage=im2bw(mycolorimage);
axes(handles.axes2);
imshow(mybinimage); title('Binary Image');

% --- Executes on button press in INTENSITY.


function INTENSITY_Callback(hObject, eventdata, handles)
% hObject handle to INTENSITY (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
mygrayimage=rgb2gray(mycolorimage);
axes(handles.axes2);
improfile(mygrayimage,[10,50],[45,100]);

ylabel('Pixel Value');
xlabel('Distance')
title('Intensity profile of the Gray image');
% --- Executes on button press in MORPH.
function MORPH_Callback(hObject, eventdata, handles)
% hObject handle to MORPH (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
mygrayimage=rgb2gray(mycolorimage);
background = imopen(mygrayimage,strel('disk',50));
axes(handles.axes2);
imshow(background); title('Morphological Opening');

% --- Executes on button press in SUB.


function SUB_Callback(hObject, eventdata, handles)
% hObject handle to SUB (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
mygrayimage=rgb2gray(mycolorimage);
background = imopen(mygrayimage,strel('disk',45));
I2 = imsubtract(mygrayimage,background);
axes(handles.axes2);
imshow(I2); title('Subtracted Background');

% --- Executes on button press in ADJCONTR.


function ADJCONTR_Callback(hObject, eventdata, handles)
% hObject handle to ADJCONTR (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
mygrayimage=rgb2gray(mycolorimage);
background = imopen(mygrayimage,strel('disk',40));
I2 = imsubtract(mygrayimage,background);
I3 = imadjust(I2);
axes(handles.axes2);
imshow(I3); title('Adjusted Contrast');

function edit1_Callback(hObject, eventdata, handles)


% hObject handle to edit1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit1 as text


% str2double(get(hObject,'String')) returns contents of edit1 as a double

% --- Executes during object creation, after setting all properties.


function edit1_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit1 (see GCBO)

% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in RED.


function RED_Callback(hObject, eventdata, handles)
% hObject handle to RED (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
red = mycolorimage(:,:,1);
a = zeros(size(mycolorimage, 1), size(mycolorimage, 2));
just_red = cat(3, red, a, a);
imshow(just_red, 'Parent', handles.axes2);

% --- Executes on button press in GREEN.


function GREEN_Callback(hObject, eventdata, handles)
% hObject handle to GREEN (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
green = mycolorimage(:,:,2);
a = zeros(size(mycolorimage, 1), size(mycolorimage, 2));
just_green = cat(3, a, green, a);
imshow(just_green, 'Parent', handles.axes2);

% --- Executes on button press in BLUE.


function BLUE_Callback(hObject, eventdata, handles)
% hObject handle to BLUE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
blue = mycolorimage(:,:,3);
a = zeros(size(mycolorimage, 1), size(mycolorimage, 2));
just_blue = cat(3, a, a, blue);
imshow(just_blue, 'Parent', handles.axes2);

% --- Executes on button press in THRESH.


function THRESH_Callback(hObject, eventdata, handles)
% hObject handle to THRESH (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
im=rgb2gray(mycolorimage);
level=graythresh(im);
a=im2bw(im, level);

axes(handles.axes2);
imshow(a); title('Threshold of the image');

% --- Executes on button press in INDEX.


function INDEX_Callback(hObject, eventdata, handles)
% hObject handle to INDEX (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mycolorimage=imread('dogi.jpg');
[IND, map] =rgb2ind(mycolorimage, 32);
imshow(IND, 'Parent', handles.axes2); title('Index');

% --- Executes on button press in SOBEL GRADIENT.


function SOBEL_Callback (hObject, eventdata, handles)
img=imread('dogi.jpg');
B=rgb2gray(img);
pause(2)
I=double(B);

for i=1:size(I,1)-2
    for j=1:size(I,2)-2
        % Sobel mask for x-direction:
        mx=((2*I(i+2,j+1)+I(i+2,j)+I(i+2,j+2))-(2*I(i,j+1)+I(i,j)+I(i,j+2)));
        % Sobel mask for y-direction:
        my=((2*I(i+1,j+2)+I(i,j+2)+I(i+2,j+2))-(2*I(i+1,j)+I(i,j)+I(i+2,j)));

        B(i,j)=sqrt(mx.^2+my.^2);
    end
end
pause(2)
axes(handles.axes2);
imshow(B); title('Sobel gradient');

% --- Executes on button press in EDGE.


function EDGE_Callback (hObject, eventdata, handles)
img=imread('dogi.jpg');
B=rgb2gray(img);
pause(2)
I=double(B);

for i=1:size(I,1)-2
    for j=1:size(I,2)-2
        % Sobel mask for x-direction:
        mx=((2*I(i+2,j+1)+I(i+2,j)+I(i+2,j+2))-(2*I(i,j+1)+I(i,j)+I(i,j+2)));
        % Sobel mask for y-direction:
        my=((2*I(i+1,j+2)+I(i,j+2)+I(i+2,j+2))-(2*I(i+1,j)+I(i,j)+I(i+2,j)));

        B(i,j)=sqrt(mx.^2+my.^2);
    end
end

%Define a threshold value


Thresh=100;

B=max(B,Thresh);
B(B==round(Thresh))=0;
B=uint8(B);
imshow(~B);title('Edge detected Image');

III. Results and Discussion

Evaluation of Results

Figure 2: Retrieve Image

Figure 3: Red Image

Figure 4: Green Image

Figure 5: Blue Image

Figure 6: Grayscale Image

Figure 7: Binary Image

Figure 8: Indexed Image

Figure 9: Intensity of the Image

Figure 10: Morphological Opening

Figure 11: Subtracted Background

Figure 12: Adjusted Contrast

Figure 13: Threshold of the Image

Figure 14: Sobel Gradient

Figure 15: Edge Detection

Verification of Studies

As shown in the figures above, the researchers successfully obtained the desired outcome for each parameter.

IV. Summary, Conclusion, And Recommendations

Summary

The researchers used MATLAB in executing this study, a study on how to incorporate a graphical user interface into an application. The image-processing operations were verified by testing and running the system. This study covered the following image types only: binary image, indexed image, intensity image, and RGB (red, green, and blue) image. The selected type of filter is applied and displayed using push buttons and sliders. Other features include morphological opening to estimate the background, subtraction of the background image from the original image, image contrast adjustment, thresholding of the image, Sobel gradient and edge detection.
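For reference, a minimal script that chains these steps together is sketched below, following the button callbacks in the Codes Used section. Note that the GUI callbacks apply several of these steps independently to the greyscale image rather than as one chain, and the structuring-element radius varies between 40 and 50 across callbacks; the value 45 and the use of the toolbox edge function here are illustrative choices.

rgb   = imread('dogi.jpg');                % image file used by the GUI callbacks
gray  = rgb2gray(rgb);                     % greyscale version of the image
bg    = imopen(gray, strel('disk', 45));   % morphological opening estimates the background
nobg  = imsubtract(gray, bg);              % subtract the background from the original image
adj   = imadjust(nobg);                    % adjust (stretch) the remaining contrast
bw    = im2bw(adj, graythresh(adj));       % threshold of the image (Otsu's method)
edges = edge(gray, 'sobel');               % Sobel gradient / edge detection via the toolbox
figure; imshow(adj);   title('Adjusted Contrast');
figure; imshow(bw);    title('Threshold of the Image');
figure; imshow(edges); title('Edge Detection');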

Conclusion

The researchers concluded:

• that the program code presented above is working and ready to use;
• that the skills in devising, programming, executing, running and troubleshooting the program have been attained;
• that an application using a graphical user interface that deals with the digital signal processing of an image has been designed properly.

Recommendation

The researchers recommend seeking advice from the instructor beforehand, becoming familiar with the MATLAB environment, studying the codes and learning how to troubleshoot.

V. References

Morgan, J. (n.d.). A Glossary of Image Processing Terms. Usability Etc. Retrieved from https://usabilityetc.com/articles/image-processing-glossary/

Techopedia. (n.d.). Image Filter. Retrieved from https://www.techopedia.com/definition/7687/image-filter

Wikipedia. (2019, October 19). Digital image processing. Retrieved from https://en.wikipedia.org/wiki/Digital_image_processing

Wisegeek. (2019). What Is Filtering in Image Processing? Retrieved from https://www.wisegeek.com/what-is-filtering-in-image-processing.htm

VI. Appendices

Figure 16: Reference codes in Image Processing Using MATLAB

VII. Curriculum Vitae

