Chan 2017
IEEE ICIP
pixels, where iR(x) and hR(x), respectively, denote the image and the PSF of the right pixels, and "∗" denotes convolution.
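To make the convolution model concrete, here is a small numerical sketch. The kernels below are hypothetical stand-ins: the right-pixel PSF is modeled as a displaced copy of the left one, standing in for the mirrored left/right responses of a dual-pixel sensor; the paper's actual PSFs are not reproduced.

```python
import numpy as np

# Hypothetical 1-D scene: a step edge, s(x).
s = np.zeros(64)
s[32:] = 1.0

# Illustrative PSFs (not the paper's): the right-pixel kernel is the
# left one displaced by two samples.
h_left = np.array([0.2, 0.6, 0.2, 0.0, 0.0])
h_right = np.roll(h_left, 2)

i_left = np.convolve(s, h_left, mode="same")    # i_L(x) = s(x) * h_L(x)
i_right = np.convolve(s, h_right, mode="same")  # i_R(x) = s(x) * h_R(x)
```

Because the two kernels differ only by a displacement, the two observed images come out as shifted copies of one another, which is the disparity that the shift estimation exploits.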
Although the above discussion is for one-dimensional images, it can be extended to two-dimensional images by treating each row and each column of the image as a one-dimensional image and processing them separately.
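A minimal sketch of this row/column separation, assuming an arbitrary 1-D operation (the kernel and function names below are placeholders, not the paper's):

```python
import numpy as np

def process_1d(signal):
    # Placeholder for any 1-D operation from the text, here a small
    # hypothetical smoothing kernel applied by convolution.
    kernel = np.array([0.25, 0.5, 0.25])
    return np.convolve(signal, kernel, mode="same")

def process_2d(image):
    # Treat each row, then each column, as an independent 1-D image.
    rows = np.apply_along_axis(process_1d, axis=1, arr=image)
    return np.apply_along_axis(process_1d, axis=0, arr=rows)

img = np.random.rand(8, 8)
out = process_2d(img)
```

The row pass and the column pass are independent, so the same 1-D routine is reused unchanged for both directions.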
Fig. 3. (a) A portion of the image sensor considered in this work, with right pixels marked in yellow and left pixels in blue. The red rectangle represents the basic pixel pattern that repeats over the sensor. (b) Even pixels and odd pixels of an assembled image are not vertically aligned. An assembled left image is shown here as an illustration.

Fig. 4. (a) Impulse responses of left pixels (blue) and right pixels (red) in an example case. (b) The corresponding shift profile plotted against the lens position. The actual in-focus lens position is at 350. Each spatial shift is computed by taking the average of 100 tests.
Δx = arg max_x p_f(x). (6)

3.2. Subpixel Accuracy

Only an integral spatial shift can be obtained from (6), since the index x is an integer. However, subpixel accuracy is required so that the shift has the precision needed in the regular image domain [12]. We adopt the method proposed by Tian et al. [13] to compute the subpixel spatial shift,

direction perpendicular to the edge, and let iL(x) be the observed left image and hL(x) the blur kernel for the left pixels. We have

iL(x) = s(x) ∗ hL(x). (8)

Mathematically, the impulse response jL(x) of the system can be expressed in terms of the step input by

jL(x) = (s(x) − s(x − 1)) ∗ hL(x). (9)
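As an illustration of the shift search, the integer estimate of (6) followed by a subpixel correction can be sketched as below. The actual refinement of Tian et al. [13] is not reproduced here; a standard three-point parabolic fit around the correlation peak is used as a stand-in, and the test signals are synthetic:

```python
import numpy as np

def estimate_shift(i_left, i_right):
    # Integer shift from the argmax of the cross-correlation, as in (6).
    corr = np.correlate(i_left, i_right, mode="full")
    idx = int(np.argmax(corr))
    shift = idx - (len(i_right) - 1)        # map array index to signed lag
    # Subpixel correction: fit a parabola through the peak and its two
    # neighbours and take the vertex (stand-in for the method of [13]).
    if 0 < idx < len(corr) - 1:
        y0, y1, y2 = corr[idx - 1], corr[idx], corr[idx + 1]
        denom = y0 - 2.0 * y1 + y2
        if denom != 0.0:
            shift += 0.5 * (y0 - y2) / denom
    return shift

# Synthetic signals: a Gaussian bump and a copy displaced by 3 samples.
t = np.arange(64, dtype=float)
a = np.exp(-0.5 * ((t - 30.0) / 4.0) ** 2)
b = np.exp(-0.5 * ((t - 33.0) / 4.0) ** 2)
```

Here `estimate_shift(a, b)` recovers the 3-sample displacement (as −3 under this lag convention), and for a non-integer displacement the parabolic vertex supplies the fractional part.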
6. REFERENCES