
New Methods in Bayer Demosaicking

Algorithms



Nick Dargahi
Vin Deshpande

March 20, 2007

Psychology 221 Applied Vision
& Image Systems Engineering
Table of Contents
Table of Contents .......................................................................................................................................... 2
Table of Figures ............................................................................................................................................ 3
I. Introduction ............................................................................................................................................... 5
II. Demosaicking Algorithms ...................................................................................................................... 6
II. (a) Non-Adaptive Algorithms ............................................................................................................. 6
II. (b) Adaptive Algorithms ..................................................................................................................... 8
II. (c) Bilateral Demosaicking Algorithm ..............................................................................................10
Bilateral Filtering Overview ...............................................................................................................10
Demosaicking .......................................................................................................................................13
Implementation of Bilateral Filter for Bayer Demosaicking ...........................................................15
Demosaicking Error ...............................................................................................................................16
III. Motivation for New Bayer Demosaicking algorithms in the Frequency Domain ............................17
III. a) Alternating Projections Demosaicking Algorithm ..........................................................................18
Correlation of High Frequencies & Nyquist Exploitation of Green Channel's Greater Bandwidth
...............................................................................................................................................................19
III. b) S-CIELAB Error for Alternating Projections & Matlab Code ........................................................25
III. a) Homogeneity Demosaicking Algorithm ..........................................................................................27
III. b) S-CIELAB Homogeneity Method Error & Matlab Code ................................................................30
IV. Adaptive Bilateral Algorithm ...............................................................................................................33
IV. a) S-CIELAB Error Results For Adaptive Bilateral Filter ..................................................................34
V. REPORT SUMMARY AND CONCLUSIONS ....................................................................................40
V. a) Error Comparison Results for Alternating Projections, Homogeneity, Bilateral, and Bilinear
Algorithms: ................................................................................................................................................45
Appendix I. References & Bibliography ....................................................................................................50
Appendix II. Matlab Code ..........................................................................................................................51
Appendix III. Project Responsibilities .......................................................................................................51

Table of Figures
Figure 1: A Bayer Color Filter Array (CFA) requires demosaicking to reconstruct the missing color
pixel components for each color plane. ....................................................................................................... 5
Figure 2: Original Lighthouse and Bayer Image. ........................................................................................... 6
Figure 3: Bayer Mosaic (RGGB) for Bilinear Interpolation & Nearest Neighbor ......................................... 7
Figure 4: Bayer Mosaic for Smooth Hue Transition ...................................................................................... 8
Figure 5: Bayer Mosaic for Adaptive Edges Sensing Algorithm ................................................................... 9
Figure 6: Gaussian Kernel for the Adaptive Bilateral Filter ..........................................................................11
Figure 7: Original photo (left) versus bilateral smoothed image (right). .......................................................12
Figure 8: Bayer CFA .....................................................................................................................................13
Figure 9: Masks for Adaptive Bilateral Demosaicking (Ramanath [11]) ......................................................14
Figure 10 .......................................................................................................................................................15
Figure 11: Bilateral Demosaicking Implementation Images .........................................................................17
Figure 12: Bicubic interpolation of Bayer Mosaic on right introduces errors into the picture (original on
left). ...............................................................................................................................................................18
Figure 13: High frequency substitution from the green channel to the red and blue reduces color artifacts. 19
Figure 14: 2-D Fourier transform of lighthouse image. Notice how the high frequencies are strongly
correlated in each color channel. ...................................................................................................................20
Figure 15: Red color plane minus green color plane leaves only low frequency data. ..........................21
Figure 16: Fourier transform of Original, Red-Green, and Blue-Green color planes. Frequency
spectrum of the difference images, shows that the high frequencies are highly cross correlated in the
red and green frequencies, as well as the blue and green frequencies. ...................................................21
Figure 17: Wavelet Decomposition: High frequency components are stored in the wavelet coefficients
found in the horizontal, vertical and diagonal detail windows on the top right, bottom left, and
bottom right of the decomposition illustration. The low frequency content is found in the top left
window. .........................................................................................................................................................22
Figure 18: Wavelet decomposition into high/low frequencies with vertical/horizontal/diagonal
subbands. ......................................................................................................................................................23
Figure 19: Flow chart showing how the Alternating Projections demosaicking works using wavelets.
.......................................................................................................................................................................24
Figure 20: Iteration allows Alternating Projections to converge to a solution for the interpolated
samples in the green and red channels. .....................................................................................................25
Figure 21: Histogram of errors for Alternating Projections ...........................................................................26
Figure 22: S-CIELAB Errors for Alternating Projections. Green areas are errors greater than 10 in CIELAB
space. Red lines indicate edges, based on the original figure. .......................................................................27
Figure 23: Red & Blue Channel Reconstruction for Homogeneity Algorithm .............................................29
Figure 24: Homogeneity refers to deciding which direction to interpolate the picture, either vertically or
horizontally, based on the shared chromaticity and luminosity values of a pixel region. .............................30
Figure 25: Error Map for Homogeneity Method ...........................................................................................31
Figure 26: Histogram of Errors for Homogeneity Method ............................................................................32
Figure 27: S-CIELAB Edge Error Map. Green areas are where errors are greater than 10, while red lines
indicate edges from original image................................................................................................................33
Figure 28: Original (Left) and Bilateral Filtered (Right). No Bayer demosaicking was implemented for this
image. ............................................................................................................................................................34
Figure 29: S-CIELAB Error Map for Bilateral Filtered Image. ....................................................................35
Figure 30: Histogram of Errors for Bilateral Filtered Image. ........................................................................36
Figure 31: Bilateral Error Edge Map: Green areas are locations where errors are greater than 10. Red lines
indicate edges from original image................................................................................................................37
Figure 32: Original (Left) Versus Bilateral Filtered (w=5 sigma =3 & 0.1).................................................38
Figure 33: Error map for Bilateral Filtered Image: (w=5 sigma =3 & 0.1). ................................................39
Figure 34: Histogram of Errors for Bilateral Filtered Image: (w=5 sigma =3 & 0.1). ................................39
Figure 35: Edge Error Map for Bilateral Filtered Image: (w=5 sigma =3 & 0.1). Green areas indicate
errors greater than 10, while red lines show edges from original image. ......................................................40
Figure 36: S-CIELAB Comparison of Errors for Bilinear, Alternating Projections, and Homogeneity
Demosaicking Algorithms. Green areas are locations where errors are greater than 10. The red lines
indicate original image's edges. ..................................................................................................................41
Figure 37: Homogeneity versus Alternating Projections. ........................................................................42
Figure 38: Bilinear versus Alternating Projections ..................................................................................43
Figure 39: Macbeth Color Chart Comparisons ..............................................................................................44
Figure 40: Frequency Orientation Test ..........................................................................................................44
Figure 41: Slanted Bar Test ...........................................................................................................................45
Figure 42: Comparison of MSE (Data for this table includes data compiled from [3]) ................................46
Figure 43: Computational Complexity (comparison made using Matlab's Tic/Toc time elapsed functions).
.......................................................................................................................................................................47
Figure 44: Computational Complexity (Normalized to Bilinear) ..................................................................48
Figure 45: S-CIELAB ΔE Error ....................................................................................................................49


I. Introduction

Cost-effective digital cameras use a single image sensor, applying alternating patterns of red, green, and
blue color filters at each pixel location. The problem of reconstructing a full three-color representation of
color images by estimating the missing pixel components in each color plane is called demosaicking (see
Figure 1).

In this report we will examine some traditional methods of demosaicking versus some new frequency
domain methods. We will discuss the Bilinear and Adaptive Bilateral algorithms, and then compare them
with two newer frequency domain algorithms: the Homogeneity algorithm and the Alternating Projections
algorithm.


Figure 1: A Bayer Color Filter Array (CFA) requires demosaicking to reconstruct the missing color
pixel components for each color plane.



II. Demosaicking Algorithms

The use of Bayer matrices simplifies the semiconductor and hardware parts of image capture, but adds
to the complexity of image processing. The problem, as can be seen in Figure 2 below, is how to best
reconstruct the original image from the Bayer-sampled image. This section reviews some of the
traditional algorithms used for Bayer demosaicking.

Figure 2: Original Lighthouse and Bayer Image.


II. (a) Non-Adaptive Algorithms

Non-adaptive demosaicking algorithms do not take into account the specific
photometric content of the mosaic being processed. They interpolate the picture's missing
color elements by averaging the neighboring pixels in adjacent regions, generally of the same color.
They are simple to implement and have low computational requirements. Three such algorithms we
look at here are nearest neighbor, bilinear interpolation, and smooth hue transition, though others are
mentioned in [3].

Nearest Neighbor

The simplest algorithm for demosaicking is nearest neighbor. Nearest neighbor fills each missing color
value with the nearest known red, green, or blue pixel value in the same color plane. There is usually some
fixed ordering as to which nearest neighbor to use (left, right, top, or bottom) in a particular implementation.
However, it does not interpolate well, and it creates zig-zag "zipper" color artifacts that distort
the image.

For example, in the Bayer CFA shown in Figure 3, nearest neighbor interpolation might approximate the
green value at B7 by any one of G8, G2, G6, or G12.

Bilinear Interpolation

Bilinear interpolation goes one step further than nearest neighbor interpolation by taking the average
value of all the nearest neighbors. For instance, the Bayer matrix in Figure 3



Figure 3: Bayer Mosaic (RGGB) for Bilinear Interpolation & Nearest Neighbor

will have a green value for B7 (which is mosaicked as a blue pixel) equal to the average of the adjacent
green values, namely G2, G6, G8, and G12.

For interpolation of red/blue pixels at a green position, the average of the two adjacent pixels of the
same color is assigned to the interpolated pixel. Thus, for example: B8 = (B7 + B9)/2 and R8 = (R3 + R13)/2.

Smooth Hue Transition

Smooth hue transition is an elaboration on bilinear interpolation in that it takes hue transitions into
account. One problem with bilinear interpolation is that the hues of adjacent pixels can
change abruptly and unnaturally [3]. Smooth hue transition attempts to deal with this problem by treating
the green channel as a luminosity channel and the red and blue channels as chromaticities. Consequently,
the algorithm demosaicks the three colors differently. The green (luminosity) channel is interpolated
bilinearly. For smoother hue transitions within the red and blue channels, the algorithm defines an
intrinsic hue value for each pixel: the blue or red pixel value divided by the green value at that pixel.
A blue or red value for a pixel can then be found in the following manner:


Nearest Neighbor
G7=G8
B8=B9
R7=R1

Bilinear Interpolation
G7= (G2 + G6 + G8 + G12) /2
R7= (R1 + R3 + R11 + R13)/4
B8= (B7 + B9)/2
R8= (R3 + R13)/2





Figure 4: Bayer Mosaic for Smooth Hue Transition

For example, as shown in Figure 4, the red value at B7 is R7 = G7/4 * (R1/G1 + R3/G3 +
R11/G11 + R13/G13). This method improves on bilinear interpolation in that it smooths the
transition in hue from pixel to pixel.
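The rule above can be written as a small function (a sketch of ours with hypothetical scalar inputs: the four diagonal red neighbors and their interpolated green values):

```python
def smooth_hue_red_at_blue(g_center, reds, greens):
    """R at a blue site: scale the center green by the summed red/green hue
    of the four diagonal neighbors, i.e. R7 = G7/4 * sum(Ri/Gi)."""
    return g_center / 4.0 * sum(r / g for r, g in zip(reds, greens))
```

Note that the interpolated green plane must already exist before this step, which is why smooth hue transition always interpolates green first.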


II. (b) Adaptive Algorithms

Adaptive algorithms try to take into account the spatial information of a specific image. Most of these
algorithms use threshold values so that the algorithm can make intelligent decisions about which
neighboring pixel values to average.

Edge Sensing Interpolation

The edge-sensing adaptive algorithm uses threshold values to determine whether to average the
adjacent pixels to the left and right or the adjacent pixels above and below the pixel being
interpolated. As the name suggests, this algorithm is especially important for demosaicking edges within a
picture. Essentially, the algorithm determines whether the gradient in a particular direction of adjacent
pixels (top-bottom or left-right) exceeds a given threshold value, as shown in Equation 1 below. If it does,
a line or edge most likely exists, so when averaging adjacent pixels for demosaicking it is best not to
smooth in the direction whose gradient exceeds the threshold.

Where this method fails is along diagonal lines, since the gradients are only taken along the horizontal
and vertical directions.



Smooth Hue Transition
Green pixels (same as bilinear, but note that
green pixels must be interpolated first)

Red pixels at blue locations:
R7 = G7/4 * (R1/G1 + R3/G3 + R11/G11 + R13/G13)

Red pixel at a green position when the adjacent
red pixels are above and below:
R8 = G8/2 * (R3/G3 + R13/G13)

Red pixel at a green position when the adjacent
red pixels are left and right:
R12 = G12/2 * (R13/G13 + R11/G11)

Blue pixels are interpolated analogously to red pixels.

\Delta H = |G12 - G14|
\Delta V = |G8 - G18|

Define threshold T = (\Delta H + \Delta V)/2

if \Delta H < T:       G13 = (G12 + G14)/2
else if \Delta V < T:  G13 = (G8 + G18)/2
else:                  G13 = (G8 + G12 + G14 + G18)/4

Eq. 1: Edge Sensing Algorithm



Figure 5: Bayer Mosaic for Adaptive Edges Sensing Algorithm


For example, when trying to determine G13, one must first determine whether |G8 - G18| or |G12 - G14| is
greater than the threshold value. Suppose |G8 - G18| exceeds the threshold. This means there is likely
a horizontal line, so one does not want to average pixels along the y-direction but rather along the
x-direction. For our example, this means G13 = (G12 + G14)/2.
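The decision logic can be sketched in a few lines (a reading of Eq. 1 of ours, with the threshold taken as the mean of the two gradients; function and argument names are hypothetical):

```python
def edge_sensing_green(g_up, g_down, g_left, g_right):
    """Interpolate green along the direction with the smaller gradient;
    fall back to the four-neighbor average when neither direction wins."""
    dH = abs(g_left - g_right)      # horizontal gradient
    dV = abs(g_up - g_down)         # vertical gradient
    T = (dH + dV) / 2.0
    if dH < T:                      # vertical gradient dominates: horizontal edge
        return (g_left + g_right) / 2.0
    elif dV < T:                    # horizontal gradient dominates: vertical edge
        return (g_up + g_down) / 2.0
    return (g_up + g_down + g_left + g_right) / 4.0
```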

Variable Number of Gradients

Another method of adaptive interpolation is the variable number of gradients. The idea behind this
algorithm is to take a weighted average (based on the directional gradient) over 8 directions (N, W, E, S, NW,
SW, NE, SE). It uses the same idea as the edge-sensing algorithm, except that it now encompasses more
directions to accommodate diagonal lines. Whether a direction is used is dependent on whether its gradient
is below a given threshold value. A gradient above the threshold means that there is probably an
edge present, and that direction is avoided for smoothing purposes.


II. (c) Bilateral Demosaicking Algorithm

A relatively new type of adaptive algorithm is known as the bilateral demosaicking algorithm [11].
This demosaicking algorithm was adapted from the bilateral filter introduced by Tomasi in [13], and it has
the attractive property of smoothing a picture while preserving its edges.

Bilateral Filtering Overview

Bilateral filtering tries to take into account both the photometric and the spatial characteristics of an
image. Any type of spatial filtering can be represented by the following convolution equation:

g(m,n) = \sum_{k=-L}^{L} \sum_{l=-L}^{L} g_0(m-k, n-l)\, h(k,l)    Eq. 2

where g_0 is the original image and h is the convolution function.

The convolution function h, in bilateral filtering, is a Gaussian blurring function:

h(k,l) = h(r) = \frac{1}{\sqrt{2\pi}\,\sigma_h} \exp\!\left(-\frac{r^2}{2\sigma_h^2}\right)    Eq. 3

The blur weight decreases exponentially with distance, and \sigma_h denotes the spread. This filter is
known as the range filter and will be explained more thoroughly in the section below.

Bilateral filtering also defines a secondary, photometric filter given by the equation below:

\Omega[g(m,n), g(k,l)] = \frac{1}{\sqrt{2\pi}\,\sigma_s} \exp\!\left(-\frac{\Delta E[g(m,n), g(k,l)]^2}{2\sigma_s^2}\right)    Eq. 4

This filter takes into account color similarity between adjacent pixels. The quantity \Delta E is a similarity
measurement kernel described in the Photometric Filtering section below.

Using these two kernels, the range kernel and the photometric difference kernel, the bilateral filter is
able to take into account both the importance of color similarity and proximity in interpolating RGB
values. The overall filtering is therefore given by the equation:

g(m,n) = \sum_{k=-L}^{L} \sum_{l=-L}^{L} g_0(m-k, n-l)\, b(m,n,k,l)    Eq. 5

where b(m,n,k,l) = h(k,l)\, s(m,n,k,l).

Range Filtering

One of the kernels used in the bilateral filter is the range difference filter, also known as a proximity
filter. The underlying theory behind this filter is that nearby pixels are more likely to resemble the RGB
values of a particular pixel in question than pixels far away. The range filter used by Tomasi
[13] and Ramanath [11] is a Gaussian filter whereby the weight of an adjacent pixel decreases exponentially
with its distance from the pixel being interpolated.

Figure 6: Gaussian Kernel for the Adaptive Bilateral Filter

Reproducing the range filtering equation again, with r written out:

h(k,l) = h(r) = \frac{1}{\sqrt{2\pi}\,\sigma_h} \exp\!\left(-\frac{r^2}{2\sigma_h^2}\right) = \frac{1}{\sqrt{2\pi}\,\sigma_h} \exp\!\left(-\frac{k^2 + l^2}{2\sigma_h^2}\right)    Eq. 6

Here k and l represent the x and y distances between the pixel currently under bilateral
filtering and the adjacent or nearby pixel used in the convolution, so r = \sqrt{k^2 + l^2}. The \sigma_h
term is the spread distance in pixels.


g(m,n) = \sum_{k=-L}^{L} \sum_{l=-L}^{L} g_0(m-k, n-l)\, h(k,l)    Eq. 7

The convolution equation, reproduced above, contains the term g_0(m-k, n-l): the R, G, or B value
(depending on which plane is being processed) of the neighboring pixel. How much that neighbor
contributes to the range-filtered image depends on how far it is from the pixel currently under processing.
The exponentially decaying Gaussian causes pixels far from the central pixel to have a lesser influence in
the averaging process of smoothing or interpolating the image.

The importance of this filter is that it allows for smoothing: pixels close to each other affect each
other's filtered response more strongly.


Figure 7: Original photo (left) versus bilateral smoothed image (right).

The above RGB images show the original image and then the range-filtered, smoothed image. As one can
see, the range-filtered image does not do a good job of preserving the edges and lines seen in the original.
Although the range filter can perform smoothing operations, what it cannot do is identify and preserve
edges and lines. Bilateral filtering therefore employs a photometric filter that identifies color similarity
between adjacent pixels. This kernel is defined by Ramanath [12] as:

\Delta E[g(m,n), g(k,l)] = \sqrt{[g_0(m,n) - g_0(k,l)]^2}    Eq. 8

This kernel identifies the difference in photometric values between the pixel under bilateral filtering,
(m,n), and the adjacent pixel, (k,l). For the CIELAB color space, the following kernel is used, again
finding the photometric difference between nearby pixels in L*a*b* space:

\Delta E[g(m,n), g(k,l)] = \sqrt{[g_0^{L^*}(m,n) - g_0^{L^*}(k,l)]^2 + [g_0^{a^*}(m,n) - g_0^{a^*}(k,l)]^2 + [g_0^{b^*}(m,n) - g_0^{b^*}(k,l)]^2}    Eq. 9

This photometric difference is then used in the following equation, which takes into account the
spread of the function, \sigma_s:

\Omega[g(m,n), g(k,l)] = \frac{1}{\sqrt{2\pi}\,\sigma_s} \exp\!\left(-\frac{\Delta E[g(m,n), g(k,l)]^2}{2\sigma_s^2}\right)    Eq. 10

When convolving this kernel with the original image, given by the following equation,

g(m,n) = \sum_{k=-L}^{L} \sum_{l=-L}^{L} g_0(m-k, n-l)\, \Omega[g(m,n), g(k,l)]    Eq. 11

one can see that the term g_0(m-k, n-l) is scaled by the photometric kernel.
Combining the photometric/range filter with the proximity/domain filter yields the following final
bilateral filter equation:

g(m,n) = \sum_{k=-L}^{L} \sum_{l=-L}^{L} g_0(m-k, n-l)\, b(m,n,k,l)    Eq. 12

where b(m,n,k,l) = h(k,l)\, \Omega[g(m,n), g(k,l)].

The following shows how adding the photometric filter increases the sharpness of the edges and
lines in the image. Note how the window sill's edge and the outlines of the fence are sharper.
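As an illustrative sketch (ours, not the report's MATLAB code), a brute-force single-channel bilateral filter built from the domain kernel of Eq. 6 and the photometric kernel of Eq. 10 might look like this; we normalize by the total weight, as is standard for bilateral filters, and the parameter names are hypothetical:

```python
import numpy as np

def bilateral(img, L=2, sigma_h=1.0, sigma_s=0.1):
    """Brute-force bilateral filter on a 2-D float image."""
    k = np.arange(-L, L + 1)
    kk, ll = np.meshgrid(k, k)
    # Domain/proximity kernel: depends only on pixel offsets (Eq. 6).
    h = np.exp(-(kk**2 + ll**2) / (2 * sigma_h**2))
    pad = np.pad(img, L, mode='edge')
    out = np.empty_like(img)
    H, W = img.shape
    for m in range(H):
        for n in range(W):
            win = pad[m:m + 2 * L + 1, n:n + 2 * L + 1]
            # Photometric kernel: penalizes intensity differences (Eq. 10).
            s = np.exp(-(win - img[m, n])**2 / (2 * sigma_s**2))
            b = h * s
            out[m, n] = (b * win).sum() / b.sum()
    return out
```

On a step edge, the photometric weights across the step are vanishingly small, so smoothing happens within each flat region while the edge itself survives.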


Demosaicking

Bilateral filtering has been used in image processing for many years. However, Ramanath has proposed
using bilateral filtering in the context of Bayer demosaicking [11]. Implementation of a bilateral
demosaicking filter is complicated by the fact that, for the three color planes (R, G, B), data is not present
at all pixel locations. Applying a photometric or range filter directly to a Bayer mosaic therefore introduces
errors, because the unmeasured pixel values in each color channel are zero in the Bayer CFA. One can
avoid this problem by masking out the zero values, so that they are not included in the filter.

To illustrate this concept, let us go back again to a simple Bayer Matrix of the form RGGB:

Figure 8: Bayer CFA

Applying a bilateral filter directly to this matrix as described in the previous two sections would cause
a problem. For example, interpolating the value of G5 should take into account the photometric and
proximity similarities between G5 and G2, G4, G6, and G8. However, the bilateral filter proposed above
would take into account not only G2, G4, G6, and G8, but also the values of G1, G3, G7, and G9. The
latter positions are red pixels and therefore have green values of 0. Interpolating with this data would pull
unwanted zeros into the interpolated color for G5. The same problem exists wherever red or blue values
must be interpolated.

To alleviate this problem, the researchers propose masking. Masking simply excludes adjacent pixels
that carry no appropriate color information from the bilateral filter kernel. Ramanath
[11] presents six different masks that are useful for RGGB Bayer matrices. They are reproduced from his
paper below:

a) G interpolation mask for R/B pixel b) B/R interpolation mask for R/B pixel

c) R interpolation mask for even row G pixel d) B interpolation mask for even G pixel

e) R interpolation mask for odd row G pixel f) B interpolation mask for odd G pixel
Figure 9: Masks for Adaptive Bilateral Demosaicking (Ramanath [11])

Black areas denote where the mask will be applied and consequently where pixels will not be processed
when applying the bilateral filter. For instance a) is a Green interpolation mask for either a Red or Blue
center pixel. This means that to calculate the interpolated value of Green for a Bayer Matrix pixel of Red
or Blue, one must mask out the pixels that are black.

Thus, given a demosaicked image f(m,n,c), where m and n are the pixel location and c is the color
plane (R, G, or B), one decomposes the image using the following parameters:

f(m,n,c) = f_{odd,odd}(m,n,c) + f_{odd,even}(m,n,c) + f_{even,odd}(m,n,c) + f_{even,even}(m,n,c)    Eq. 13

There are four different subsets of the demosaicked image because the Bayer matrix can be broken
down into the odd/even combinations of the (m,n) indexing of the pixels. Ramanath presents the following
mathematical derivation of these four subsets for green interpolation, which we will try to show by example
in this section:

f(m_{odd}, n_{even}, 2) = g(m_{odd}, n_{even})   and   f(m_{even}, n_{odd}, 2) = g(m_{even}, n_{odd})    Eq. 14

In an RGGB Bayer matrix, reproduced once again below,

Figure 10

pixels indexed with (odd, even) values, such as (1,2) or (3,2), or with (even, odd) values, such as (2,1) or
(2,3), correspond to measured green pixels, specifically in our case G2, G8, G4, and G6. Therefore, the
bilateral filter should not perform any demosaicking here and should instead just use the actual image
values, either g(m_{odd}, n_{even}) or g(m_{even}, n_{odd}).
f'(m_{odd}, n_{odd}, 2) = \sum_{k=-L}^{L} \sum_{l=-L}^{L} g(m_{odd}-k, n_{odd}-l)\, b(m,n,k,l)   and

f'(m_{even}, n_{even}, 2) = \sum_{k=-L}^{L} \sum_{l=-L}^{L} g(m_{even}-k, n_{even}-l)\, b(m,n,k,l)    Eq. 15

The above two equations apply at points in an RGGB Bayer matrix where the measured value is not green.
In these cases, where the pixel is indexed by either an (odd, odd) or (even, even) combination, one
must apply the bilateral filter denoted by the function b.

Similarly, one must also develop equations for the blue and red interpolations.
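The parity bookkeeping of Eqs. 13-15 can be sketched as follows (a sketch of ours using 0-based NumPy indices, so measured greens in RGGB sit where row and column have opposite parity; function names are hypothetical):

```python
import numpy as np

def parity_decompose(f):
    """Split an image into the four (row, col) parity subsets of Eq. 13;
    the subsets are disjoint and sum back to the original image."""
    m, n = np.indices(f.shape)
    return {(pm, pn): np.where((m % 2 == pm) & (n % 2 == pn), f, 0)
            for pm in (0, 1) for pn in (0, 1)}

def green_sites(shape):
    """Measured-green locations of an RGGB mosaic: opposite row/col parity."""
    m, n = np.indices(shape)
    return (m + n) % 2 == 1
```

At the green sites the filter copies the measured value through (Eq. 14); at the other two parity subsets the bilateral kernel b must be applied (Eq. 15).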

Implementation of Bilateral Filter for Bayer Demosaicking

Our implementation of the bilateral filter for Bayer demosaicking used an RGGB Bayer matrix
and an implementation of the bilateral filter available from MATLAB's website. Following Ramanath's
paper, we adapted the bilateral filter code to Bayer demosaicking using the set of masks described in
the section above.
The implementation of the bilateral filter consists of a domain filter and a range filter dot-multiplied
together. The following MATLAB code snippet shows this implementation:



The above lines of code simulate the domain filter, a Gaussian filter that takes into
account the X and Y distances between the pixel under processing and the interpolating pixel.
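The snippet itself did not survive the document conversion; a NumPy sketch of such a domain filter (our reconstruction, with hypothetical names for the window half-width w and spread sigma_h) is:

```python
import numpy as np

def domain_filter(w, sigma_h):
    """Gaussian weights over the x/y offsets from the center pixel (Eq. 6),
    up to the constant normalization factor."""
    x, y = np.meshgrid(np.arange(-w, w + 1), np.arange(-w, w + 1))
    return np.exp(-(x**2 + y**2) / (2 * sigma_h**2))
```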



The above lines of code implement the photometric, or range, filter. The photometric filter
looks at changes in color intensity and tries to avoid blurring areas where there is a high color intensity
difference. High color intensity differences usually mean that a line or edge is present.
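Again, the original snippet is missing; a minimal sketch of ours of the photometric filter applied to a window around the center pixel (sigma_s is a hypothetical name for the photometric spread) is:

```python
import numpy as np

def range_filter(window, center, sigma_s):
    """Photometric weights (Eq. 10, unnormalized): neighbors whose intensity
    differs greatly from the center pixel get near-zero weight, which is
    what keeps edges from being blurred."""
    return np.exp(-(window - center)**2 / (2 * sigma_s**2))
```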

The two filters are dot multiplied as follows:



This matrix is then multiplied by the difference in values between the pixel under processing and
the pixel being used for interpolation.

To convert this algorithm into a Bayer demosaicking algorithm, a set of masks as outlined by
Ramanath was added to the code already developed. To apply these masks, our code multiplies
the interpolation matrix, F, by the mask, as seen below:



The above lines of code process a red pixel. Essentially, the code first determines whether it is
on a red pixel, i.e., whether the value of the input image, A, in the red plane is nonzero. If so, three
masks are applied, again following Ramanath's paper, for red, blue, and green processing. Note that red
processing simply assigns the value of the original image to the processed image, because when
one is on a red pixel, the Bayer matrix value of red is the correct value of red. Finally, the masked
bilateral filters are applied to the input image.

Masks for each of the different colors red, green, blue are implemented in a similar fashion.
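The masking step can be sketched for one case, green interpolation at a red site (the mask pattern follows Figure 9a; the function and variable names are ours, not the report's):

```python
import numpy as np

# 3x3 green-interpolation mask centered on a red pixel of an RGGB mosaic:
# only the four axial neighbors carry measured green values.
G_AT_R_MASK = np.array([[0., 1., 0.],
                        [1., 0., 1.],
                        [0., 1., 0.]])

def masked_response(window, F, mask):
    """Zero out unmeasured neighbors before the normalized weighted average,
    so Bayer zeros never leak into the interpolated value."""
    B = F * mask
    return (B * window).sum() / B.sum()
```

With uniform weights F, this reduces to the plain four-neighbor average, which is the intended degenerate case.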

Demosaicking Error

Upon implementation of the code above, the following sets of images were found:


Figure 11: Bilateral Demosaicking Implementation Images (left: original; center: MSE error; right: filtered image).

The three images above show the original image (which was sent through a Bayer RGGB matrix), the MSE error image for our code, and the image processed by our code. The MSE statistics are as follows:

Table 1: Bilateral MSE Error Using Our Demosaicking Algorithm
MSE ERROR
Red 0.3934
Green 0.2633
Blue 0.3243
Overall 0.33
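The per-channel numbers above come from a standard mean-squared-error computation; a minimal Python equivalent, assuming image planes as nested lists of floats in [0, 1], is:

```python
def mse(plane_a, plane_b):
    """Mean squared error between two equal-sized image planes."""
    total, n = 0.0, 0
    for row_a, row_b in zip(plane_a, plane_b):
        for a, b in zip(row_a, row_b):
            total += (a - b) ** 2
            n += 1
    return total / n
```

The overall figure in the table is the error pooled over the three color planes computed this way.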


Ramanath [12] provides the following MSE error for his images using his implementation of the Bayer
Demosaicking filter:
Table 2: MSE Error for different demosaicking algorithms (reprinted from Ramanath [11])


As one can see from the table above, Ramanath's filter achieves errors on the order of 0.196 lower than our implementation, which is better, but not by a wide margin. A possible reason our implementation does not demosaic the image as well is the window size we chose: our masks were limited to a 3 by 3 matrix, which does not allow interpolation from pixels outside that 3 by 3 window.



III. Motivation for New Bayer Demosaicking algorithms in the Frequency Domain

Both non-adaptive and adaptive Bayer demosaicking algorithms still create noticeable color artifacts in
the reconstructed image. With the exception of the Hue Transition method, these algorithms all operate
independently among the three color planes, and therefore have trouble with colored edges at high
frequencies that are aliased into the low frequency picture upon reconstruction. An example of this can be
seen in Figure 12 below which shows the non-adaptive bicubic interpolation introducing a rainbow of
colors along the fence in the famous lighthouse picture from the Kodak Image Database.




Figure 12: Bicubic interpolation of Bayer Mosaic on right introduces errors into the picture (original
on left).

In this particular example, we see that bicubic interpolation works well for grayscale but not as well for color. Other previous interpolation schemes have similar difficulties. This failure to account for the correlations among the color planes has motivated the search for demosaicking algorithms that exploit similarities in the frequency domain. These methods take advantage of the fact that the green channel has twice as many samples as the red and blue, and according to Nyquist sampling theory, the green channel therefore has more spectral content (a higher sampling rate corresponds to greater bandwidth). One can therefore extract more information from the green channel and make inferences about the missing red and blue channels based on the idea that high frequencies are strongly correlated among the three color planes. We will present two algorithms that exploit these spectral properties: the first is called Alternating Projections onto Convex Sets (POCS) [8], and the other is called Adaptive Homogeneity [10].

III. a) Alternating Projections Demosaicking Algorithm


Alternating projections, created by Gunturk, Altunbasak and Mersereau [8], uses wavelet
decomposition to efficiently extract the well preserved high frequency components of the green color plane.
The algorithm also uses an iterative update strategy to reconstruct the degraded high frequency components
of the red and blue color planes. This approach enforces similar high-frequency characteristics for the red,
green, and blue channels, and ensures that the resulting image tracks the observed data in the Bayer CFA.
The algorithm reconstructs the color channels using a projections onto convex sets technique that iterates
through the interpolation by imposing the constraint that the original picture data from the Bayer CFA does
not change for each interpolation iteration. Only the interpolated values between the real data sample points
are allowed to change. Thus, for example, if you have R1, R3, R5 etc on a single line representing your
original red data from your camera, only R2, R4, R6 etc. are allowed to change during each iteration.
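This data-consistency step can be sketched as a simple projection operator; the 1-D row and values below are illustrative stand-ins, not data from the reference implementation:

```python
def project_onto_observations(estimate, observed, sample_mask):
    """POCS-style observation constraint: samples observed by the CFA
    never change, while interpolated positions keep their iterated values."""
    return [obs if known else est
            for est, obs, known in zip(estimate, observed, sample_mask)]

# One illustrative iteration on a row where R1, R3, R5 (indices 0, 2, 4)
# are real camera samples and the rest are interpolated:
observed = [0.9, 0.0, 0.7, 0.0, 0.5]
mask     = [True, False, True, False, True]
estimate = [0.8, 0.8, 0.8, 0.6, 0.6]   # output of some interpolation step
estimate = project_onto_observations(estimate, observed, mask)
# estimate is now [0.9, 0.8, 0.7, 0.6, 0.5]
```

Applying this projection after every interpolation pass is what guarantees the final image tracks the observed Bayer CFA data.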

The algorithm makes use of wavelets to efficiently implement the frequency domain conversion of the
picture, without the use of Fourier transforms. Each channel of the picture is decomposed into subbands
using wavelets, with the high frequency components being swapped out of the green channel into the red
and blue channels for the interpolation of the red and blue channels. This technique is illustrated in Figure
13.

Figure 13: High frequency substitution from the green channel to the red and blue reduces color
artifacts.

Correlation of High Frequencies & Nyquist Exploitation of Green Channel's Greater Bandwidth

It is well known that in natural images, the color channels are highly mutually correlated in high
frequencies [8]. The cross-correlation of red/green and blue/green channels is high in the high frequencies
because all three channels are very likely to have the same texture and edge locations. Demosaicking can
exploit this property by using the fact that luminance (green) is sampled at a higher rate in a CFA than the
chrominance (red and blue) channels. Nyquist theory tells us that because there are twice as many green pixels as red or blue in the Bayer CFA, the spectral bandwidth of the green channel is larger than that of the red or blue channels. Therefore, the green channel is less likely to be aliased, and edge and texture details are better preserved in the green channel than in the red and blue channels. Demosaicking artifacts, which are most severe in high frequency regions, such as edges, are caused primarily by aliasing in the red and blue
channels. Alternating projections uses inter-channel high frequency correlation to retrieve the aliased high
frequency information in the red and blue channels by recreating them from the green frequency spectrum
in the high frequencies. Figure 14 shows the frequency spectrum for the lighthouse image, and illustrates
how closely their high frequency spectra compare to one another.



Figure 14: 2-D Fourier transform of lighthouse image. Notice how the high frequencies are strongly
correlated in each color channel.


Figure 15 shows the red and blue color planes after subtracting out the green color plane. If you examine the Fourier transform of the resulting difference images, as shown in Figure 16, you'll see that only low frequencies remain. This tells you that the channel pairs red & green and blue & green are highly correlated in the high frequencies.
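This low-frequency-only property of the difference image is easy to verify numerically. The sketch below builds synthetic 1-D "red" and "green" signals that share the same edge (high-frequency) component but differ in mean, and compares spectral magnitudes with a plain discrete Fourier transform; the signals are illustrations, not image data from the report:

```python
import cmath

def dft_mag(signal):
    """Magnitudes of the discrete Fourier transform of a real signal."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

n = 16
edge = [1.0 if t < n // 2 else -1.0 for t in range(n)]  # shared edge/texture
red = [0.6 + e for e in edge]    # same high frequencies as green,
green = [0.3 + e for e in edge]  # different mean (low-frequency) content
diff = [r - g for r, g in zip(red, green)]

high = range(n // 4, n // 2)     # a band of high frequencies
red_high = sum(dft_mag(red)[k] for k in high)
diff_high = sum(dft_mag(diff)[k] for k in high)
```

Because the shared edge cancels in the difference, `diff_high` is essentially zero while `red_high` is large, which is exactly the behavior Figure 16 shows for real images.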

Figure 15: Red color plane minus green color plane leaves only low frequency data.



Figure 16: Fourier transform of the Original, Red-minus-Green, and Blue-minus-Green color planes. The frequency spectra of the difference images show that the high frequencies are highly cross-correlated between the red and green channels, as well as between the blue and green channels.


Wavelets
Alternating projections uses wavelets to filter the image into horizontal, diagonal, vertical and
approximation coefficients that reflect the high frequency content of the image. This efficient
implementation avoids using the Fast Fourier Transform altogether in performing a spectrum analysis of a
picture. In Figure 17, a picture is decomposed into wavelet coefficients on the right, by using two short
wavelet analysis filters, one a high frequency filter, and the other a low frequency filter. These filters are
applied both vertically and horizontally to the picture to obtain the horizontal, vertical and diagonal high
frequency spectrum in the detail windows on the top right, bottom left, and bottom right of the illustration.
The low frequency content is found in the top left window.


Figure 17: Wavelet Decomposition: High frequency components are stored in the wavelet coefficients
found in the horizontal, vertical and diagonal detail windows on the top right, bottom left, and
bottom right of the decomposition illustration. The low frequency content is found in the top left
window.

The horizontal, vertical and diagonal spatial characteristics of the picture are preserved in the high frequency wavelet decomposition, as can be seen in Figure 18. This figure also shows the high pass filter h1 being applied both vertically and horizontally for each wavelet subband.


Figure 18: Wavelet decomposition into high/low frequencies with vertical/horizontal/diagonal
subbands.


Reconstruction of the image from wavelet coefficients

Combining the approximation matrix with the horizontal, diagonal, and vertical detail components allows you to reconstruct the entire image once you apply the inverse wavelet transform.
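As a concrete illustration, a one-level 1-D Haar analysis/synthesis pair (the simplest wavelet; the actual filters used in [8] differ) shows both the split into low-frequency approximation and high-frequency detail coefficients and the exact reconstruction. In 2-D the same pair would be applied along rows and then columns:

```python
import math

def haar_decompose(signal):
    """One level of Haar wavelet analysis: approximation (low-pass)
    and detail (high-pass) coefficients. Signal length must be even."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Inverse transform: recombine low- and high-pass contributions."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / s)
        out.append((a - d) / s)
    return out
```

A smooth region yields near-zero detail coefficients while an edge yields a large one, which is why swapping detail subbands between channels transfers edge information; the synthesis step recovers the original signal exactly.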

The complete alternating projections algorithm is presented in Figure 19 and Figure 20.



Figure 19: Flow chart showing how the Alternating Projections demosaicking works using wavelets.




Figure 20: Iteration allows Alternating Projections to converge to a solution for the interpolated
samples in the green and red channels.

III. b) S-CIELAB Error for Alternating Projections & Matlab Code

To examine the differences in the color images and judge them by a quantitative metric grounded in human perceptual measurements, we used the spatial extension to the CIEL*a*b* space known as S-CIELAB. The extension is a spatial pre-processing step that incorporates the pattern sensitivity measurements of Allen Poirson and Brian Wandell (see Matlab code [2]). For all the tests, the lighthouse image from the Kodak Digital Image Library was used. The
alternating projections Matlab code used to generate the reconstructed images can be found in Appendix II,
under [4]. This code will take in a normal picture (tif, bmp, jpg, etc), synthesize a Bayer CFA, and also
generate a bilinear interpolation of the image, in addition to the alternating projections image, for
comparison purposes.


Using the S-CIELAB metric, the alternating projections method of demosaicking produced an error of 2.0495, with a sum of 202 for the error-image values greater than 20. The alternating projections method produced the lowest MSE of all the methods under comparison, as can be seen in Table 3 in the Summary and Conclusions section of this report.

Figure 21 shows a histogram of the errors for the alternating projections method.
(Plot: Histogram of Errors for Alternating Projections Demosaicked Image; x-axis: MSE error, 0 to 70; y-axis: frequency, 0 to 2 x 10^5.)

Figure 21: Histogram of errors for Alternating Projections


In Figure 22, the S-CIELAB error figure shows in green where the errors are greater than 10. The red
lines indicate edges, based on the original lighthouse figure, where the errors are most severe.


Figure 22: S-CIELAB Errors for Alternating Projections. Green areas are errors greater than 10 in
CIELAB space. Red lines indicate edges, based on the original figure.

III. c) Homogeneity Demosaicking Algorithm


Instead of using interpolation based on edge indicators, homogeneity imposes a similarity constraint on the luminance and chrominance values within small neighborhoods. RGB data is first interpolated horizontally and vertically, and the direction that best avoids edges is chosen in CIELAB space. The horizontally or vertically interpolated pixel is selected based on the local homogeneity, or similarity, of nearby pixels as measured by Euclidean distance in CIELAB space. Local homogeneity is measured by the total number of similar luminance and chrominance values of the pixels in the same neighborhood.


The homogeneity algorithm takes advantage of the fact that the red and green channels are strongly correlated in the high frequencies, so the red-minus-green difference plane contains mostly low frequencies. It creates a filter which samples the red channel and steals the high frequencies from the red to insert them into the green channel's interpolated values. A low pass filter is used on the green channel to do a simple interpolation of the existing Bayer CFA samples.

Eq. 16 shows the steps involved in interpolating between the green channel's samples, thereby
reconstructing the entire green channel.

Eq. 16: Homogeneity Algorithm (Green Channel Interpolation)

Write the green channel as the sum of its even and odd samples,

    G(x) = G_0(x) + G_1(x),  where
    G_0(x) = { G(x), x even     and   G_1(x) = { 0,    x even
             { 0,    x odd                    { G(x), x odd.

The goal is to reconstruct the odd-sampled G_1. Find a linear filter h(x) with the following property:

    G(x) = (h * G)(x).

Analyzing in terms of the odd and even samples of h and G,

    G(x) = (h_0 * G_0)(x) + (h_1 * G_0)(x) + (h_0 * G_1)(x) + (h_1 * G_1)(x).

This simplifies to

    G(x) = { G_0(x),                          x even
           { (h_0 * G_0)(x) + (h_1 * G_1)(x), x odd.

Note that h_1 * G_1 is not available from the Bayer CFA. Therefore choose h_1(x) such that

    (h_1 * (G_1 - R_1))(x) ≈ 0,   i.e.,   (h_1 * G_1)(x) ≈ (h_1 * R_1)(x),

where h_1(x) attenuates the sampled difference signal G(x) - R(x). Since we know the green and red channels are correlated in the high frequencies, G - R contains only low frequencies, and h_1(x) is a high pass filter.

Substituting into G(x):

    G(x) = { G_0(x),                          x even
           { (h_0 * G_0)(x) + (h_1 * R_1)(x), x odd.

Note that h_1(x) * R_1(x) adds the red channel's high frequencies back into the green channel, while h_0(x) * G_0(x) is a low pass filter on the green channel's existing Bayer CFA samples.
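A 1-D Python sketch of the Eq. 16 green-channel interpolation, using a deliberately simple two-tap low-pass h0 and a Laplacian-like high-pass h1; these short filters are illustrative stand-ins for the longer, jointly optimized filters in [10]:

```python
def interpolate_green_row(G0, R1):
    """Fill the odd (missing) green samples on a GRGR... row.

    G0: greens at even indices (None at odd positions).
    R1: reds at odd indices (None at even positions).
    Per Eq. 16: low-pass of the known greens plus high-pass of the
    known reds, which restores the shared high frequencies."""
    n = len(G0)
    G = list(G0)
    for x in range(1, n, 2):
        # h0 * G0: simple low-pass average of the two neighboring greens.
        left = G0[x - 1]
        right = G0[x + 1] if x + 1 < n else G0[x - 1]
        lowpass = 0.5 * (left + right)
        # h1 * R1: high-pass of the red samples (zero in flat regions).
        r_prev = R1[x - 2] if x - 2 >= 0 else R1[x]
        r_next = R1[x + 2] if x + 2 < n else R1[x]
        highpass = R1[x] - 0.5 * (r_prev + r_next)
        G[x] = lowpass + highpass
    return G
```

In a flat region the high-pass term vanishes and the result reduces to plain averaging of the greens; near an edge shared by both channels, the red term injects the missing high-frequency detail.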
Figure 23 shows how the red and blue channels are reconstructed from both the Red-Green Bayer samples,
and the interpolated Green pixel image G(x) from the previous step ( Eq. 16).


Figure 23: Red & Blue Channel Reconstruction for Homogeneity Algorithm


It should be noted that the above interpolations of the Green, Red, and Blue channels are done
separately both vertically and horizontally. The problem then becomes finding the direction that is best to
interpolate (vertical or horizontal). Misguidance effects occur when the direction of interpolation is
wrongly selected. To avoid misguidance, a metric is formed based on the homogeneous characteristics of
the luminosity and chromaticity of the pixels in the nearby region where interpolation is occurring. Pixels
in a neighborhood region are selected using a distance function in the CIELAB metric space. Distance is
mapped according to luminosity and chromaticity, as shown in Figure 24.



Figure 24: Homogeneity refers to deciding which direction to interpolate the picture, either vertically
or horizontally, based on the shared chromaticity and luminosity values of a pixel region.


The RGB data is first interpolated horizontally and vertically. There are then two candidates for each
missing color sample, and the decision for which direction to choose for interpolating it is made in the
CIELAB space. The horizontally or vertically interpolated pixel value is chosen based on the local
homogeneity. The neighborhood homogeneity region is measured by the total number of similar luminance
and chrominance values of the pixels that are within a neighborhood of the pixel in question. Two values
are taken as similar when the Euclidean distance between them is less than a threshold.
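The direction decision can be sketched as a neighbor-counting rule. The conversion to CIELAB is omitted here, and the similarity threshold is an illustrative value rather than the one used in [10]:

```python
def homogeneity(lab_image, r, c, threshold=2.3):
    """Count the 8-neighbors whose CIELAB value lies within `threshold`
    (Euclidean distance) of the center pixel's value."""
    center = lab_image[r][c]
    count = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(lab_image) and 0 <= cc < len(lab_image[0]):
                dist = sum((a - b) ** 2
                           for a, b in zip(center, lab_image[rr][cc])) ** 0.5
                if dist < threshold:
                    count += 1
    return count

def choose_direction(lab_h, lab_v, r, c):
    """Pick the horizontally or vertically interpolated candidate with
    the higher local homogeneity at pixel (r, c)."""
    if homogeneity(lab_h, r, c) >= homogeneity(lab_v, r, c):
        return 'horizontal'
    return 'vertical'
```

A candidate whose interpolated value blends smoothly with its neighborhood scores high; a candidate that crossed an edge scores low, so the misguided direction is rejected.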


III. d) S-CIELAB Homogeneity Method Error & Matlab Code

The homogeneity Matlab code used to generate the reconstructed images can be found in Appendix II,
under [5]. This code will take in a normal RGB picture (tif, bmp, jpg, etc), synthesize a Bayer CFA, and
then output the demosaicked image as a file.

For the lighthouse image, the Homogeneity method of demosaicking had an MSE error of 2.0826 using the S-CIELAB metric, with a total error sum of 162 for the errors greater than 20. Although its error was slightly higher than the alternating projections algorithm's, it produced the best-looking picture.






(Image: S-CIELAB error map for Homogeneity Demosaicked Image; color scale from 10 to 60.)

Figure 25: Error Map for Homogeneity Method

The error histogram for the homogeneity method compares favorably with the POCS method. Compared to the bilinear method, the homogeneity histogram is not as spread out, which tells you the homogeneity method has fewer errors with a smaller standard deviation.

(Plot: Histogram of Errors for Homogeneity Demosaicked Image; x-axis: MSE error, 0 to 70; y-axis: frequency, 0 to 2 x 10^5.)

Figure 26: Histogram of Errors for Homogeneity Method




Figure 27: S-CIELAB Edge Error Map. Green areas are where errors are greater than 10, while red
lines indicate edges from original image.


IV. Adaptive Bilateral Algorithm
Though we were not able to implement the adaptive bilateral demosaicking routines in Matlab, we were
able to benchmark the filtering algorithm and use it as an asymptotic 'best case' example of what this
algorithm might do when demosaicking. Figure 28 shows the original lighthouse juxtaposed against the
bilateral filtered image for comparison purposes.



Figure 28: Original (Left) and Bilateral Filtered (Right). No Bayer demosaicking was implemented
for this image.

IV. a) S-CIELAB Error Results For Adaptive Bilateral Filter

Using the Adaptive Bilateral filter, but not performing any Bayer demosaicking, we found the following S-CIELAB RMS error for the lighthouse image: 6.4764, with a sum of 5245 for the errors exceeding 20. Note that we used a standard deviation of 1 for both the range and spatial domain Gaussian filters of the bilateral algorithm, with a filter window width of 5 pixels (i.e., a 5x5 kernel). Figure 29 shows the S-CIELAB error map for the bilateral filtered picture.

Figure 29: S-CIELAB Error Map for Bilateral Filtered Image.
(Plot: Histogram of Errors for Bilateral Filtered Image; x-axis: MSE error, 0 to 70; y-axis: frequency, 0 to 14 x 10^4.)

Figure 30: Histogram of Errors for Bilateral Filtered Image.

Figure 31 shows the edges where the errors were worst for the bilateral filtered image. Green areas show
where errors were greater than 10, while red lines show the edges from the original image.


Figure 31: Bilateral Error Edge Map: Green areas are locations where errors are greater than 10.
Red lines indicate edges from original image.



Second Filter Experiment
Next, we widened the bilateral filter window to 11 pixels (half-width w = 5), increased the domain (spatial) kernel standard deviation to 3, and decreased the range (photometric) standard deviation to 0.1. This preserves the edges better while still smoothing the picture, as can be seen in the following illustration. The goal is to perform the smoothing required for Bayer interpolation while at the same time preserving the edge boundaries.

w = 5; % bilateral filter half-width
Sigma = [3 0.1]; % standard deviations for the domain and range Gaussian kernels



Figure 32: Original (Left) Versus Bilateral Filtered (w=5 sigma =3 & 0.1).

S-CIELAB Error for the Bilateral Filter with Half-Width w = 5 Pixels and sigma = [3 0.1]

The S-CIELAB error for the above bilateral filter parameters was 4.2305, with a sum of 1077 for the errors greater than 20. Thus, with a few modifications, one can bring the error of the bilateral filter down below that of the bilinear filter, though it is still not as good as the Homogeneity or Alternating Projections methods.

(Image: S-CIELAB error map for Bilateral Filtered Image; color scale from 10 to 60.)

Figure 33: Error map for Bilateral Filtered Image: (w=5 sigma =3 & 0.1).


(Plot: Histogram of Errors for Bilateral Filtered Image; x-axis: MSE error, 0 to 70; y-axis: frequency, 0 to 14 x 10^4.)

Figure 34: Histogram of Errors for Bilateral Filtered Image: (w=5 sigma =3 & 0.1).


Figure 35: Edge Error Map for Bilateral Filtered Image: (w=5 sigma =3 & 0.1). Green areas
indicate errors greater than 10, while red lines show edges from original image.


V. REPORT SUMMARY AND CONCLUSIONS

We conclude this report with a side-by-side comparison of the S-CIELAB MSE error results, error maps, and actual photo comparisons for the bilinear, homogeneity, and alternating projections demosaicking algorithms. Figure 36 shows an S-CIELAB plot of the error differences for the three methods.

Figure 36: S-CIELAB Comparison of Errors for Bilinear, Alternating Projections, and Homogeneity
Demosaicking Algorithms. Green areas are locations where errors are greater than 10. The red lines
indicate original image's edges.

The green colored elements in the picture tell you where the errors are greater than 10. Bilinear is by far the worst, and alternating projections has some problems along the fence. Homogeneity is the best, with the fewest patches of green area. S-CIELAB also identifies the edges of the original image, marking them in red.




Figure 37: Homogeneity (left) versus Alternating Projections (right).




Figure 38: Bilinear (left) versus Alternating Projections (right).


To better compare the algorithms' ability to distinguish sharp edges and color boundaries, we used the Macbeth color chart, the frequency orientation test, and the slanted bar test, as shown in the next three figures. The Macbeth color chart, shown in Figure 39, illustrates how well the different algorithms differentiate color boundaries. We see that alternating projections introduces breadcrumb pixels along the borders, and that the homogeneity method has some problems with border separations.
(Slide: Macbeth Color Checker, used to check color boundaries; panels show the Original and the Bilinear Interpolation, Alternating Projections, and Homogeneity demosaicked results. Annotation: the Alternating Projections errors are caused by looking at incorrect neighbor pixels, i.e., choosing the wrong gradient direction.)

Figure 39: Macbeth Color Chart Comparisons

The frequency orientation test, shown in Figure 40, tells you how well high frequency aliasing is removed
from the red and blue channels. As you can see, Alternating Projections and the Homogeneity methods do
a much better job than the Bilinear method.


Figure 40: Frequency Orientation Test

In the slanted bar test, shown in Figure 41, bilinear interpolation has the most trouble accurately rendering the diagonal edge. One can see pronounced aliasing of the red and blue channels as this algorithm fails to interpolate the edge boundary properly. Alternating Projections does the best, while the Homogeneity method introduces a rougher edge, though it does not alias into the red and blue channels like the Bilinear method.


Figure 41: Slanted Bar Test


V. a) Error Comparison Results for Alternating Projections, Homogeneity, Bilateral, and Bilinear
Algorithms:


The next section summarizes this report's quantitative and qualitative test results. Note that when using the S-CIELAB metric, we used a larger field of view (40 degrees) to bring down the number of samples calculated per degree and thereby maximize the measured error. This was necessary to align our results more closely with what is reported in the literature ([7], [2], and [9]).


From Table 3 below, it can be seen that the Homogeneity method compares favorably with the Alternating Projections method, and the Homogeneity algorithm produces fewer visible artifacts.

Table 3: S-CIELAB MSE Errors for the Four Algorithms

Algorithm                  Compare.m Error (MSE)   S-CIELAB MSE        Sum of Error Image > 20
Alternating Projections    2.9829                  2.0495              202
Homogeneity                3.0589                  2.0826              162
Bilateral                  6.1546                  4.2305 to 6.4764*   1077 to 5345*
Bilinear                   9.987                   5.695               3969

*The higher values result from the 5x5 window with sigma = 1 for both kernels; the lower values result from the 11-pixel window (half-width 5) with sigma1 = 3 and sigma2 = 0.1.

Note that in the table above, our own MSE error calculation measured the MSE for all 3 color planes
from the original and interpolated images. The S-CIELAB error was calculated using Brian Wandell's
utility mentioned in the Matlab code section [2].

Based on our observed results above, the Alternating Projections algorithm compares favorably with
the Homogeneity algorithm, and both algorithms will give you a 200 to 300% improvement in your picture
when demosaicking. It is interesting to note that the Homogeneity algorithm produces visibly better results,
although the error is about the same as the Alternating Projections method.

The Bilateral method is also quite good, and if computational complexity is a hardware issue, it is a
good choice, better than the bilinear in that you will get a 100% to 200% improvement in the picture
quality.

The following figure shows a comparison of MSE normalized to the Variable Gradients method (data for Homogeneity, Alternating Projections, and Bilateral Filtering compiled by Nick Dargahi; other data compiled by Ting Chen [3]).



(Bar chart: MSE normalized to the Variable Gradients method for Alternating Projections, Homogeneity, Variable Gradients, Laplacian Color Correction I, Edge Sensing II, Bilateral, Pattern Recognition, Bilinear, Smooth Hue Transition, Smooth Hue Transition Log, Edge Sensing I, and Nearest Neighbor; plotted values range from 1.0 to 8.2.)

Figure 42: Comparison of MSE (Data for this table includes data compiled from [3])

Computational complexity is mapped for each method according to how long the algorithm took to
process a single image. Figure 43 shows this comparison for the best adaptive algorithm mentioned in [3]
versus the bilinear, bilateral, and newer frequency domain methods.
Computational complexity (seconds to process one image):

Bilinear                   3.55
Alternating Projections    6.7
Bilateral                  15.78
Homogeneity                41.37
Variable Gradients         212.29
Figure 43: Computational Complexity (comparison made using Matlab's Tic/Toc time elapsed
functions).

Figure 44 shows the same computational complexity data, now normalized to Bilinear = 1. The times for both these illustrations were measured using Matlab's tic/toc elapsed-time functions. Note that the Variable Gradients method was the best algorithm rated in Ting Chen's paper [3], and that the newer algorithms (Homogeneity, Alternating Projections, and the Bilateral Filter) outperform it in both speed and picture quality.
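Matlab's tic/toc pattern maps directly onto Python's performance counter; the `demosaic_fn` stand-in below is a placeholder for any of the algorithms timed above, not one of their implementations:

```python
import time

def time_algorithm(demosaic_fn, image, repeats=3):
    """Best-of-N wall-clock timing of one call, analogous to wrapping
    a single invocation in Matlab's tic/toc."""
    best = float('inf')
    for _ in range(repeats):
        start = time.perf_counter()   # tic
        demosaic_fn(image)
        best = min(best, time.perf_counter() - start)  # toc
    return best

# Example with a trivial stand-in "algorithm" (a deep row copy):
elapsed = time_algorithm(lambda img: [row[:] for row in img], [[0] * 64] * 64)
```

Taking the minimum over several repeats reduces the influence of background system activity on the measurement.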
















Computational complexity (normalized to Bilinear = 1):

Bilinear                   1.00
Alternating Projections    1.89
Bilateral                  4.45
Homogeneity                11.65
Variable Gradients         59.8



Figure 44: Computational Complexity (Normalized to Bilinear)





Figure 45 shows the S-CIELAB MSE error results for each algorithm. Alternating projections and the
homogeneity method both outperform the bilinear demosaicking method by almost 300%.

























S-CIELAB Delta E error by algorithm:

Alternating Projections    2.05
Homogeneity                2.08
Bilateral                  4.23
Bilinear                   5.70

Figure 45: S-CIELAB Delta E Error



The conclusions reached in this report show that the new frequency domain methods of Bayer demosaicking are far superior to previous adaptive and non-adaptive algorithms. In particular, we make the following assertions:

- Frequency domain methods can be superior to traditional methods in reconstructing Bayer mosaicked pictures, but the computational complexity of performing the reconstruction is greater. If hardware speed is a concern, these algorithms will not perform as quickly as simpler methods.
- Using either the Homogeneity or Alternating Projections algorithm, it is possible to see a 200-300% improvement in the picture over the bilinear or bicubic methods used in many cameras today.
- Both frequency domain methods are 100% better than the best adaptive algorithms.








Appendix I. References & Bibliography


[1] Akuiyibo, E. and Atobatele, T., Demosaicking using Adaptive Bilateral Filters. Stanford Psychology 221 Final Project Report, Winter 2006. http://scien.stanford.edu/class/psych221/projects/06/atobate/

[2] Bayer B.E., Color imaging array, U.S. Patent 3 971 065, July 1976.

[3] Chen, Ting, "A Study of Spatial Color Interpolation Algorithms for Single-Detector Digital Cameras",
Stanford Psychology 221 Final Project Report, Winter 1999.
http://scien.stanford.edu/class/psych221/projects/99/tingchen/

[4] Dubois, E., Frequency-domain methods for demosaicking of Bayer-sampled color images, IEEE
Signal Processing Letters, vol. 12, pp. 847-850, Dec. 2005.

[5] Gonzalez, Rafael C., Woods, Richard E., Eddins, Steven L., Digital Image Processing using Matlab:
Pearson Prentice Hall, 2004.

[6] Gonzalez, Rafael C., Woods, Richard E., Digital Image Processing: Prentice Hall, 2002.

[7] Gunturk, B.K., Glotzbach, J., Altunbasak, Y., Schafer, R.W., and Mersereau, R.M., Demosaicking: color filter array interpolation, IEEE Signal Processing Magazine, vol. 22, no. 1, pp. 44-54, Jan. 2005. http://www.ece.gatech.edu/research/labs/MCCL/pubs/dwnlds/p5.pdf

[8] Gunturk, B.K., Altunbasak Y., and Mersereau R.M. , Color plane interpolation using alternating
projections, IEEE Trans. Image Processing, vol. 11, no. 9, pp. 997-1013, Sept. 2002.
http://www.ece.gatech.edu/research/labs/MCCL/pubs/dwnlds/bahadir_sep2002.pdf.

[9] Gunturk, B. K., Glotzbach, J. , Altunbasak, Y. , Schafer, R. W. , and Mersereau, R. M. ,
Demosaicking: Color filter array interpolation in single chip digital cameras, IEEE Signal Processing
Magazine (Special Issue on Color Image Processing), 2005

[10] Hirakawa K., and Parks, T.W. , Adaptive homogeneity-directed demosaicing algorithm, IEEE
Transactions on Image Processing Vol. 14, No. 3 2005, pp. 360-369.
http://www.accidentalmark.com/research/papers/Hirakawa05MNdemosaicTIP.pdf.

[11] Ramanath, R., and Snyder, W.E., Adaptive demosaicking, J. Electron. Imaging, vol. 12, no. 4, pp. 633-642, Oct. 2003.

[12] Ramanath R., and Snyder, W.E., Bilbro, G.L. , and Sander III, W.A., Demosaicking methods for
Bayer color arrays, J. Electron. Imaging, vol. 11, no. 3, pp. 306-315, July 2002.

[13] Tomasi, C., and Manduchi, R., "Bilateral filtering for gray and color images." Proceedings of the
IEEE International Conference on Computer Vision, pages 839-846, January 1998.

[14] Trussell, H.J., and Hartwig, R.E., Mathematics for demosaicking, IEEE Trans. Image Processing,
vol. 11, no. 4, pp. 485-492, Apr. 2002.

[15] Vandewalle, P., Krichane, K., Alleysson, D., and Süsstrunk, S., Joint Demosaicing and Super-
Resolution Imaging from a Set of Unregistered Aliased Images, Proc. IS&T/SPIE Electronic Imaging:
Digital Photography III, Vol. 6502, 2007.

[16] Vega, M., Molina, R., and Katsaggelos, A.K., "A Bayesian Superresolution Approach to
Demosaicing of Blurred Images," EURASIP Journal on Applied Signal Processing, Special Issue on
Superresolution, vol. 2006, Article ID 25072, 12 pages, 2006.

[17] Wandell, Brian A. Foundations of Vision. Massachusetts: Sinauer Associates, Inc., 1995.

[18] Zhang, X., Silverstein, D. A., Farrell, J.E., and Wandell, B.A., "Color image quality metric S-
CIELAB and its application on halftone texture visibility," in Proc IEEE COMPCON, Feb. 1997, pp. 44-
48.



Appendix II. Matlab Code

[1] Bilateral Filtering Code - Douglas R. Lanman, Brown University, September 2006.
[2] ISET 2.0 (Image Systems Evaluation Toolkit for Matlab) S-CIELAB metric tools. Wandell, Brian.
Stanford University, September 2007. http://white.stanford.edu/~brian/scielab/scielab.html
[3] Bilateral Filtering Code, E. Akuiyibo and T. Atobatele, Demosaicking using Adaptive Bilateral
Filters, March 2007.
[4] Alternating Projections Code, B.K. Gunturk, Y. Altunbasak, and R.M. Mersereau, Color plane
interpolation using alternating projections, March 2007.
http://www.ece.gatech.edu/research/labs/MCCL/pubs/dwnlds/demosaick.zip and main research link page:
http://www.ece.gatech.edu/research/labs/MCCL/research/p_demosaick.html .
[5] Adaptive Homogeneity Code, K. Hirakawa and T.W. Parks, Adaptive homogeneity-directed
demosaicking algorithm, March 2007.
http://www.accidentalmark.com/research/packages/MNdemosaic.zip and main research link page:
http://www.accidentalmark.com/research/



Appendix III. Project Responsibilities

Nick Dargahi: Project Responsibilities.

1). PowerPoint presentation creation.
2). Report write up for Homogeneity, Alternating Projections, and overall organization.
3). Produced graphics.
4). Wrote Matlab code for evaluating FFT, comparison of graphic file outputs, and ran S-CIELAB metric
test routines.
5). Rehearsal of presentation.

Vin Manohar: Project Responsibilities

1). Report write up for Adaptive Bilateral Filter & analysis of previous non-adaptive and adaptive
demosaicking routines.
2). Wrote Matlab code for evaluating Adaptive Bilateral Demosaicking Routine.
3). Rehearsal of presentation.
4). Produced graphics for Adaptive Bilateral Filter.
