INTRODUCTION
1.1 Overview
Image compression is the process of efficiently coding digital images to reduce the
number of bits required to represent an image. Recent advances in digital technology have
led to communication media in which visual information plays the key role. Emerging
applications include high-definition TV, videoconferencing, video telephony, medical
imaging, virtual reality, wireless video transmission, and video servers.
Raw biometric digital image and video signals usually contain a huge amount of
information and therefore require a large channel or storage capacity when the number of
users is large. In spite of advances in communication channels and storage capacity,
implementation cost often puts a constraint on capacity. Generally, transmission and storage
costs increase with increasing bandwidth requirements. To meet channel or storage
capacity requirements, it is necessary to employ compression techniques, which reduce the
data rate while maintaining the subjective quality of the decoded image or video signal.
Image compression techniques achieve compression by exploiting statistical redundancies in
the data and by eliminating or reducing data to which the human eye is less sensitive. Two
types of redundancy occur in still images: spatial and spectral redundancy.
Spatial redundancy arises from the correlation between neighboring pixels, while
spectral redundancy is due to the correlation between different color planes [1]. In
compression theory, both spatial and spectral redundancies can be reduced using subband
coding or transform coding (e.g., the Discrete Cosine Transform).
There is another type of redundancy, called temporal redundancy, which is due to the
correlation between successive frames in a sequence of images, such as in videoconferencing
applications or broadcast video. Temporal redundancies are removed by an interframe coding
technique called Motion Compensated Predictive Coding [2].
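As a small numerical illustration of the spatial redundancy described above (a Python sketch with a synthetic image, not part of the project's MATLAB code), the correlation between horizontally neighboring pixels of a smooth image is close to 1:

```python
import numpy as np

# Measure spatial redundancy as the correlation between each pixel and its
# right-hand neighbor in a smooth synthetic 64x64 "image" with mild noise.
x = np.linspace(0, 1, 64)
image = np.outer(x, x) * 255.0
image += np.random.default_rng(0).normal(0, 2, image.shape)

left = image[:, :-1].ravel()           # each pixel
right = image[:, 1:].ravel()           # its right-hand neighbor
corr = np.corrcoef(left, right)[0, 1]  # Pearson correlation coefficient
print(round(corr, 3))                  # close to 1 => highly redundant
```

A transform coder exploits exactly this correlation: after transformation, most of the signal energy concentrates in a few coefficients.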
Several compression algorithms can be used to compress video and images.
Some of these algorithms have been adopted as compression standards, such as JPEG, H.261,
and H.263 [3]. These are block-based compression algorithms and are computationally
expensive when run on a standard processor.
The three most computationally demanding functions in these algorithms are the
DCT, DFT and WHT, which are widely used in image processing applications. These
linear image transforms are chosen because of their flexibility, energy compaction, and
robustness: they effectively extract edges and provide energy compaction in state-of-the-art
methods. Among these transforms, the WHT is particularly attractive because of its
simplicity and computational efficiency.
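The simplicity of the WHT can be seen in its transform matrix, whose entries are all +1 or -1, so the forward and inverse transforms need no multiplications, unlike the DCT or DFT. A minimal sketch (in Python with NumPy/SciPy, assumed available; the project itself uses MATLAB):

```python
import numpy as np
from scipy.linalg import hadamard

# The 8x8 Hadamard matrix contains only +1/-1 entries, so applying it
# requires only additions and subtractions.
N = 8
H = hadamard(N)                                      # entries are +1 or -1
block = np.arange(N * N, dtype=float).reshape(N, N)  # toy 8x8 image block

coeffs = H @ block @ H / N           # 2-D forward WHT (with 1/N scaling)
restored = H @ coeffs @ H / N        # the same matrix inverts the transform
print(np.allclose(restored, block))  # True: the transform itself is lossless
```

Compression becomes lossy only when coefficients are quantized or truncated after the transform.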
Image compression plays an essential role in the effective transmission and
storage of images. Requirements for the storage, management, and transfer of digitized
biometric images have grown explosively, especially for 2D biometric data and video signals.
Stored images can be very large and consume a great deal of memory on the storage device.
For example, a 512x512 grayscale image has more than 260,000 pixel values to store; a
typical 640x480 color image has nearly a million elements. Transferring such records from a
biometric sensor to the Internet servers running the classification process is very
time-consuming.
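The storage figures quoted above follow from simple arithmetic, sketched here in Python for concreteness:

```python
# Back-of-envelope storage figures for the examples in the text,
# assuming one byte per sample (8-bit depth).
gray = 512 * 512        # grayscale: one sample per pixel
color = 640 * 480 * 3   # RGB color: three samples per pixel
print(gray)             # 262144 samples (~256 KB uncompressed)
print(color)            # 921600 samples (~0.9 MB), close to a million
```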
Biometrics is the measurement and statistical analysis of people's physical
characteristics. The technology is mainly used for identification and access control, or for
identifying individuals under surveillance. The basic premise of biometric
authentication is that everyone is unique and that an individual can be identified by his or her
intrinsic physical traits. For example, a government may launch a program to collect
biometric identifying features, specifically iris, fingerprint, and facial patterns, for its
residents. The storage required by such a system is very large to manage, whether the
database is transferred over the Internet or a portable sensor device is designed to carry it.
This motivates us to investigate compression algorithms for several types of biometric
images based on WHT compression techniques.
It is also necessary to measure the compression performance of a compressed biometric
image using several measurement standards. In general, images occupy a vital portion of the
communication bandwidth; therefore, the development of efficient image compression
techniques has become essential. The fundamental aim of image compression is to
remove redundancy and omit irrelevancy: redundancy removal eliminates duplication in the
signal source, while irrelevancy reduction omits pixel values that are not noticeable to the
human eye [3].
For various image processing applications, assessment of image quality is
important. The examination of image quality is very similar to the examination of image
similarity, in which quality is judged by the differences (or similarities) between the original
image and the compressed image. Two classes of methods can help us assess image quality.
When assessing image quality subjectively, the human eye is considered the best
instrument. Subjective assessment is based on the perception of an individual, i.e., the way he
or she sees the object. This assessment is most of the time slow and costly, and it is very
difficult to repeat or verify. Hence, in recent years, objective methods of assessment have
gained popularity. An objective quality assessment based on a mathematical model produces
results that closely match those obtained through subjective measures of image quality. The
basic aim of this method is to obtain quantitative measurements that can then be used to
assess the observed quality of the image [4]. It has many uses, such as controlling image
quality in quality control systems, creating standards for image processing systems, and
tuning algorithms and their parameters.
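As a concrete illustration of such objective measures, the standard MSE and PSNR formulas can be sketched as follows (a Python sketch with illustrative values, not the project's evaluation code):

```python
import numpy as np

# MSE averages the squared pixel differences between the original and the
# compressed image; PSNR re-expresses it on a logarithmic (dB) scale.
def psnr(original, compressed, peak=255.0):
    mse = np.mean((original.astype(float) - compressed.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

a = np.full((4, 4), 100, dtype=np.uint8)
b = a.copy()
b[0, 0] = 110                    # one of 16 pixels differs by 10
print(round(psnr(a, b), 2))      # MSE = 100/16 = 6.25 -> PSNR ~ 40.17 dB
```

Higher PSNR indicates a reconstruction closer to the original; identical images give infinite PSNR.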
A proper compression method reduces both the data storage size and the transmission time.
One compression technique based on a frequency transform that is able to compress
biometric images is the Walsh-Hadamard Transform (WHT). This transform has unique
characteristics that allow for the creation of an efficient image compression scheme. The
principal advantage of image transformation using the WHT is the removal of redundancy
between neighboring pixels. The efficiency of a transformation scheme can be directly
gauged by its ability to pack the input data into as few coefficients as possible [5].
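This coefficient-packing idea can be sketched as follows (a hypothetical Python example, not the project's MATLAB implementation): transform an 8x8 block with the WHT, keep only the largest-magnitude coefficients, and invert; the reconstruction error stays small because most of the energy is packed into few coefficients.

```python
import numpy as np
from scipy.linalg import hadamard

# Lossy compression sketch: keep 16 of 64 WHT coefficients (4:1 truncation).
rng = np.random.default_rng(1)
N = 8
H = hadamard(N)
block = np.cumsum(rng.normal(size=(N, N)), axis=1)  # correlated toy block

coeffs = H @ block @ H / N                 # forward 2-D WHT
k = 16                                     # number of coefficients to keep
thresh = np.sort(np.abs(coeffs), axis=None)[-k]
kept = np.where(np.abs(coeffs) >= thresh, coeffs, 0.0)

approx = H @ kept @ H / N                  # inverse WHT of truncated coeffs
mse = np.mean((block - approx) ** 2)       # small: energy is compacted
print(mse < np.mean(block ** 2))           # error well below signal energy
```

The quality level in such a scheme is controlled by `k`, the number of coefficients retained per block.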
Image compression can be implemented in both software and hardware. However, hardware
implementations can achieve faster processing than software implementations, because
highly parallel algorithms can be mapped onto specialized hardware such as DSP processors.
In designing any compression algorithm, the two most conflicting requirements are reducing
the data rate and keeping the processing time low, while preserving image quality. In this
project, the WHT compression algorithm is implemented in the MATLAB environment to
increase the storage savings while preserving image quality.
The aim of this project is to investigate and design a compression algorithm for biometric
images using a lossy compression technique implemented in the MATLAB environment. The
main objectives of the research are:
1.6 Brief Methodology
Phase 1: The first part is the design phase. In this phase, a set of algorithms is designed to
compress a number of biometric images using MATLAB; the following steps are involved in
the WHT algorithm.
According to our survey, the project starts from collecting biometric images and developing
compression algorithms based on the WHT using MATLAB. The algorithms are
implemented on a general-purpose processor. It can be concluded that the project is limited
to the use of a general-purpose processor for evaluating the system in terms of quality
metrics at different quality levels (i.e., different numbers of truncated coefficients).
1.8 Organization of the Project
This project is organized into five chapters, and the contents of each chapter are as follows:
Chapter 1: This chapter presents the problem statement, background of the topic, motivation
for this research, objectives, and a brief research methodology; it also covers the
organization of this project.
Chapter 2: In this chapter, an introduction to image compression is given, covering data
redundancies, the steps involved in compressing an image, techniques of image compression,
and image quality assessment methods: the need for quality measures, measurement of Mean
Square Error (MSE), measurement of Peak Signal to Noise Ratio (PSNR), correlation,
compression measures, and the theory and measurement of the Structural Similarity Index
Metric (SSIM).
Chapter 6: This chapter presents experimental results of compressing several biometric
(iris, face, palm, and fingerprint) images, along with the conclusion and future work.