
Introduction to Vector Quantization

Dr. Waquar Ahmad

National Institute of Technology Calicut


waquar@nitc.ac.in

November 15, 2021



Introduction

Earlier, we looked at quantization of single samples. In vector quantization, we look at ways of quantizing blocks of samples, which are called vectors.
Looking at blocks of samples rather than individual samples increases our flexibility and allows us to take the dependence between samples into account.
This flexibility comes at the cost of increased complexity.



Vector Quantization

Figure: The Vector Quantization Procedure

In vector quantization we group the source output into blocks or vectors. For example, we can treat L consecutive samples as the components of an L-dimensional vector.
This vector of source outputs forms the input to the vector quantizer.
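As a minimal sketch (Python/NumPy; the sample values and the choice L = 2 are illustrative, not from the slides), the blocking step can be written as:

    import numpy as np

    L = 2                                    # samples per block (vector dimension)
    samples = np.array([44, 41, 59, 117, 60, 110, 72, 180])  # illustrative source output

    # Group each run of L consecutive samples into one L-dimensional vector (one row).
    vectors = samples.reshape(-1, L)
    print(vectors)   # -> four 2-D vectors: [44 41], [59 117], [60 110], [72 180]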
Vector Quantization

At both the encoder and decoder of the vector quantizer, we have a set of L-dimensional vectors called the codebook.
The vectors in this codebook, known as code-vectors, are selected to be representative of the vectors we generate from the source output.
Each code-vector is assigned a binary index.
At the encoder, the input vector is compared to each code-vector in order to find the code-vector closest to the input vector.
In order to inform the decoder which code-vector was found to be closest to the input vector, we transmit or store the binary index of that code-vector. Because the decoder has exactly the same codebook, it can retrieve the code-vector given its binary index.
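A small sketch of the encoder's nearest-code-vector search and the decoder's table lookup (Python/NumPy; the codebook entries and the input vector are illustrative, and squared Euclidean distance is assumed as the distortion measure):

    import numpy as np

    codebook = np.array([[45, 50], [80, 180], [45, 117]])   # one code-vector per row

    def encode(x, codebook):
        # Find the index of the code-vector closest to the input vector x.
        distances = np.sum((codebook - x) ** 2, axis=1)
        return int(np.argmin(distances))

    def decode(index, codebook):
        # Decoding is just a table lookup into the same codebook.
        return codebook[index]

    x = np.array([60, 110])
    idx = encode(x, codebook)            # -> 2, the index of code-vector (45, 117)
    print(idx, decode(idx, codebook))

Only the index needs to be transmitted or stored; with a codebook of K code-vectors this takes about log2(K) bits per input vector.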



Vector Quantization

Although the encoder may have to perform a considerable amount of computation in order to find the reproduction vector closest to the vector of source outputs, the decoding consists of a simple table lookup.
This makes vector quantization a very attractive encoding scheme for applications in which the resources available for decoding are considerably less than the resources available for encoding.
The most important task in vector quantization is to design the codebook.
The codebook can be designed using the LBG (Linde-Buzo-Gray) algorithm.
We will understand it by working through an example; a code sketch of the overall iteration follows.
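As a rough sketch (Python/NumPy, assuming squared-error distortion; the function name and the stopping threshold eps are my own choices, not from the slides), the LBG iteration of assigning training vectors, measuring distortion, and re-centering the code-vectors looks like this:

    import numpy as np

    def lbg(training, codebook, eps=1e-3, max_iter=100):
        training = np.asarray(training, dtype=float)
        codebook = np.asarray(codebook, dtype=float).copy()
        prev_distortion = np.inf
        for _ in range(max_iter):
            # 1. Assign each training vector to its nearest code-vector.
            dists = np.sum((training[:, None, :] - codebook[None, :, :]) ** 2, axis=2)
            nearest = np.argmin(dists, axis=1)

            # 2. Average distortion of this assignment.
            distortion = np.mean(dists[np.arange(len(training)), nearest])

            # 3. Stop when the distortion no longer decreases appreciably
            #    (the slides' criterion D(i) <= D(i+1), softened by eps).
            if distortion >= (1 - eps) * prev_distortion:
                break
            prev_distortion = distortion

            # 4. Replace each code-vector by the centroid of its region.
            for k in range(len(codebook)):
                members = training[nearest == k]
                if len(members) > 0:
                    codebook[k] = members.mean(axis=0)
        return codebook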



LBG

Our goal is to design the codebook. The codebook consists of code-vectors.
We have to design a codebook such that the distortion (error) is minimum.
By using the LBG algorithm we can find such a codebook.
We randomly initialize a codebook; its entries are called the initial code-vectors. We also have a set of training vectors.
Initial set of code-vectors for the codebook:

    Code-vector   Height   Weight
    I             45       50
    II            80       180
    III           45       117



LBG
Training vectors:

    Vector   Height   Weight
    a        44       41
    b        59       117
    c        60       110
    d        72       180
    e        64       180
    f        80       182

Now we assign each training vector to the closest initial code-vector (see the code sketch after the table):

    Vector   Height   Weight   Closest initial vector
    a        44       41       I
    b        59       117      III
    c        60       110      III
    d        72       180      II
    e        64       180      II
    f        80       182      II
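A quick numerical check of this assignment (Python/NumPy, using the slide's numbers; the variable names are my own):

    import numpy as np

    initial = np.array([[45, 50], [80, 180], [45, 117]], dtype=float)    # I, II, III
    training = np.array([[44, 41], [59, 117], [60, 110],
                         [72, 180], [64, 180], [80, 182]], dtype=float)  # a..f

    dists = np.sum((training[:, None, :] - initial[None, :, :]) ** 2, axis=2)
    nearest = np.argmin(dists, axis=1)
    print(nearest)   # [0 2 2 1 1 1]  ->  a: I, b: III, c: III, d: II, e: II, f: II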



LBG
Now we group the training vectors according to their closeness to the initial code-vectors.

    Region   Training vectors
    I        (44, 41)
    II       (72, 180), (64, 180), (80, 182)
    III      (59, 117), (60, 110)
Now we calculate the average distortion of this assignment: we sum the squared Euclidean distance between each training vector and its code-vector, and divide by the number of training vectors.

D^(0) = [ (44 − 45)² + (41 − 50)² + (59 − 45)² + (117 − 117)² + (60 − 45)² + (110 − 117)²
        + (72 − 80)² + (180 − 180)² + (64 − 80)² + (180 − 180)² + (80 − 80)² + (182 − 180)² ] / 6
      = (82 + 196 + 274 + 64 + 256 + 4) / 6 = 876 / 6 = 146
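The same value can be checked numerically (continuing the arrays from the assignment sketch above):

    # dists and nearest were computed in the previous sketch.
    per_vector = dists[np.arange(len(training)), nearest]
    print(per_vector)              # [ 82. 196. 274.  64. 256.   4.]
    D0 = per_vector.mean()
    print(D0)                      # 146.0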
LBG

Now we update our code-vectors (the initial vectors).
We take the average (centroid) of the training vectors lying in each region:

    Region   Updated code-vector
    I        (44, 41)
    II       (72, 180.67)
    III      (59.5, 113.5)

Now we repeat the same process with the updated code-vectors: re-assign the training vectors, recompute the distortion D^(1), and update again.
We stop when D^(i) ≤ D^(i+1), i.e. when the distortion no longer decreases; the code-vectors at that point are the final codebook. A short sketch of one update-and-check step is given below.
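Continuing the earlier sketch, the centroid update and the stopping check for this example might look like this (the approximate values in the comments are my own computation, not from the slides):

    # Replace each code-vector by the centroid of the training vectors in its region.
    codebook = initial.copy()
    for k in range(len(codebook)):
        members = training[nearest == k]
        if len(members) > 0:
            codebook[k] = members.mean(axis=0)
    print(codebook)   # -> roughly [[44, 41], [72, 180.67], [59.5, 113.5]]

    # Re-assign and recompute the distortion; stop once it no longer decreases.
    dists = np.sum((training[:, None, :] - codebook[None, :, :]) ** 2, axis=2)
    D1 = dists[np.arange(len(training)), np.argmin(dists, axis=1)].mean()
    print(D1)         # about 25.9, well below D0 = 146, so we keep iterating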

