(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2016/0019699 A1
     Mudge                             (43) Pub. Date: Jan. 21, 2016

(54) EDGE DETECTION IN IMAGES

(71) Applicant: Welch Allyn, Inc., Skaneateles Falls, NY

(72) Inventor: Miguel C. Mudge, Skaneateles, NY

(21) Appl. No.: 14/867,218

(22) Filed: Sep. 28, 2015

Related U.S. Application Data

(63) Continuation of application No. 14/154,494, filed on Jan. 14, 2014, now Pat. No. 9,177,225.

Publication Classification

(51) Int. Cl.:
     G06T 7/00 (2006.01)
     A61B 1/303 (2006.01)
     A61B 3/12 (2006.01)
     G06K 9/46 (2006.01)
     G06T 11/20 (2006.01)

(52) U.S. Cl.:
     CPC: G06T 7/0085 (2013.01); G06K 9/4604 (2013.01); G06T 11/203 (2013.01); G06T 7/0012 (2013.01); A61B 3/12 (2013.01); A61B 1/303 (2013.01); G06T 2207/10004 (2013.01); G06T 2207/10024 (2013.01); G06T 2207/30041 (2013.01)

(57) ABSTRACT

An edge detection engine operates to scan an image to identify edges within the image. An annular aperture is used to locate the edges in the image. An output image is generated by the edge detection engine that identifies the locations of the edges found in the image.
[FIG. 1 (Sheet 1 of 18): schematic diagram of the edge detection system, showing a computing device and an edge detection engine.]
[FIG. 2: block diagram of the edge detection engine 104, comprising an annular aperture generator 110, a line generator 112, an image scanning engine 114, and an output data generator 116.]
[FIG. 3: flow chart of method 120 for generating an annular aperture: operation 122, determine radius of annular aperture; operation 124, determine pixel locations for annular aperture.]
[FIG. 4: schematic diagram of an example annular aperture.]
[FIG. 5: flow chart of method 130: operation 132, determine set of possible linear bisections of annular aperture; operation 134, determine pixel locations for each linear bisection.]
[FIG. 6: example set of linear bisections (Angle 0 through Angle 7) for the annular aperture, with reference numerals 140-202.]
[FIG. 7: flow chart of method 210: operation 212, scan image for edge locations using an annular aperture; operation 214, generate output data identifying edge locations.]
[FIG. 8: flow chart of method 220: operation 222, set starting pixel as analysis point; operation 224, identify pixels surrounding the analysis point using annular aperture; operation 226, bisect annular aperture to group the pixels into two halves; operation 228, determine angle of bisection to maximize difference in intensities between the two halves; operation 230, store bisection angle and difference in intensity values for analysis point; decision 232, more pixels?; operation 236, end.]
[FIG. 9: starting pixel of an input image, with edge pixels and the pixels within the annular aperture indicated.]
[FIG. 10: the annular aperture 126 bisected along bisection line 242 into half 244 and half 246.]
[FIG. 11: example operation determining the angle of the bisection that maximizes the difference in intensities between the two halves.]
[FIG. 12: example angle and magnitude map 250]

PIXEL COORDINATE (252) | ANGLE (254) | MAGNITUDE (256)
(3,3)                  | 0           | 1248
(3,4)                  | 0           | 1297
(3,5)                  | 1           | 1324
[FIG. 13: flow chart of method 260: operation 262, set starting pixel as output point; operation 264, retrieve angle and magnitude for analysis point corresponding to output point; decision 266, magnitude > threshold? (if not, advance to the next output point); operation 268, identify line having appropriate angle; operation 270, shift line position; operation 272, draw line in output image; decision 274, more pixels?; operation 278, end.]
[FIG. 14: example of a line retrieved from the set of linear bisections shown in FIG. 6.]
[FIG. 15: example operation to shift a line 172 to the actual location of an edge in an input image.]
[FIG. 16: example operation to draw a line 290 in the output image.]
[FIG. 17 (Sheet 17 of 18): perspective view of an example instrument 302.]
[FIG. 18 (Sheet 18 of 18): example computing device 102, including a processing device, a network interface, a secondary storage device 322, and memory 312 containing ROM 316, RAM 318, and BIOS 320, with operating system 326, modules 330, application programs 328, and program data 332, plus peripheral devices including a microphone.]
EDGE DETECTION IN IMAGES

CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to U.S. Ser. No. 14/154,494, filed on Jan. 14, 2014, titled EDGE DETECTION IN IMAGES, the disclosure of which is hereby incorporated by reference in its entirety.
BACKGROUND
[0002] The human eyes and brain are very good at detecting points of interest in a visual image. One way that an object is identified in an image is by the identification of edges within the image. The brain can identify edges of the object by finding points in the image where adjacent pixels exhibit a distinct contrast. Numerous edges in the image combine to create an overall shape. The shape in the image is then compared with the shapes of known objects. If the shape is sufficiently similar to a known object, the brain can identify the object in the image.
[0003] Computers cannot process images in the same way as the human brain. Often images lack sufficient detail or contrast for a computer to be able to detect relevant features. Even the fundamental step of identifying the location of edges within an image can be challenging to perform with a computer. Without an adequate identification of the locations of edges in an image, the computer is unable to perform subsequent operations, such as identifying objects or other points of interest within the image.
SUMMARY
[0004] In general terms, this disclosure is directed to edge detection in images. In one possible configuration and by non-limiting example, the edge detection involves scanning the image using an annular aperture. Various aspects are described in this disclosure, which include, but are not limited to, the following aspects.
[0005] One aspect is a method of detecting edges within a digital image, the method comprising: processing at least a portion of the digital image, using a computing device, in a pixel-by-pixel manner including at an analysis point in the digital image, by: identifying pixels surrounding the analysis point; identifying a location of a bisection that divides the pixels surrounding the analysis point into two halves; determining an angle of the bisection that maximizes a difference in intensities of the pixels between the two halves; and determining that an edge is present in the digital image at the angle of the bisection.
[0006] Another aspect is an edge detection system comprising: a computing device comprising: a processing device; and a computer readable storage device storing data instructions that, when executed by the processing device, generate an edge detection engine comprising: an annular aperture generator that operates to generate an annular aperture using a circle drawing algorithm; a line generator that generates lines representative of a set of bisections of the annular aperture; an image scanning engine that utilizes the annular aperture as a mask to scan a digital image and identify edges within the digital image; and an output data generator that utilizes the lines to represent the edges in the output image.
[0007] A further aspect is a medical instrument comprising: an image capture device operable to capture an input image; and a computing device including an edge detection engine, the edge detection engine operable to process the input image to detect edges within the input image by processing the image using an annular aperture mask.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] FIG. 1 is a schematic diagram illustrating an example of an edge detection system.

[0009] FIG. 2 is a schematic block diagram illustrating an example of an edge detection engine of the edge detection system shown in FIG. 1.

[0010] FIG. 3 is a flow chart illustrating an example method of generating an annular aperture.

[0011] FIG. 4 is a schematic diagram illustrating an example of the annular aperture.

[0012] FIG. 5 is a flow chart illustrating an example method of generating a plurality of lines representing a set of possible bisections of the annular aperture shown in FIG. 4.

[0013] FIG. 6 is a schematic diagram illustrating an example set of linear bisections for the example annular aperture shown in FIG. 4.

[0014] FIG. 7 is a flow chart illustrating an example method of scanning an image using the annular aperture shown in FIG. 4.

[0015] FIG. 8 is a flow chart illustrating an example method of scanning an image for edge locations using the annular aperture shown in FIG. 4.

[0016] FIG. 9 is a schematic diagram illustrating an example of a starting pixel of an input image and also showing an example of pixels of an image that are within the annular aperture.

[0017] FIG. 10 is a schematic diagram illustrating an operation in which the annular aperture is bisected along a bisection line to group the pixels within the annular aperture into two halves.

[0018] FIG. 11 is a schematic diagram illustrating an example operation that determines an angle of the bisection that maximizes a difference in intensities between the two halves.

[0019] FIG. 12 illustrates an example of an angle and magnitude map.

[0020] FIG. 13 is a flow chart illustrating an example method of generating an output image identifying the locations of edges in an input image.

[0021] FIG. 14 is a schematic diagram illustrating an example of a line retrieved from the set of linear bisections shown in FIG. 6.

[0022] FIG. 15 is a schematic diagram illustrating an example of an operation to shift a line to an actual location of an edge in an input image.

[0023] FIG. 16 is a schematic diagram illustrating an example operation to draw a line in the output image.

[0024] FIG. 17 is a perspective view of an example instrument in which aspects of the present disclosure can be implemented.

[0025] FIG. 18 illustrates an example of a computing device that can be used to implement aspects of the present disclosure.
DETAILED DESCRIPTION
[0026] Various embodiments will be described in detail with reference to the drawings, wherein like reference numerals represent like parts and assemblies throughout the several views. Reference to various embodiments does not limit the scope of the claims attached hereto. Additionally, any examples set forth in this specification are not intended to be limiting and merely set forth some of the many possible embodiments for the appended claims.
[0027] FIG. 1 is a schematic diagram illustrating an example of an edge detection system 100. In this example, the edge detection system 100 includes a computing device 102 that executes an edge detection engine 104. Also shown are an input image 106 and an output image 108.
[0028] The edge detection system 100 can be implemented in multiple different forms. In one embodiment, for example, the edge detection system 100 is part of an instrument, such as a medical instrument. One example of a medical instrument is an ophthalmoscope, such as shown in FIG. 17. Another example of a medical instrument is a colposcope. In these examples, the computing device 102 can be part of the instrument, for example. In another embodiment, the edge detection system 100 is implemented in a computing device 102 separate and distinct from an instrument. For example, in some embodiments the computing device 102 is a computer or part of a computer.
[0029] The computing device 102 typically includes at least a processing device and a computer-readable storage device. In some embodiments the computer readable storage device stores data instructions, which, when executed by the processing device, cause the processing device to perform one or more of the functions, methods, or operations of the edge detection engine 104 described herein. An example of a computing device 102 is illustrated and described in more detail with reference to FIG. 18.
[0030] The edge detection engine 104 operates to detect edges in an input image 106. In some embodiments the results of the edge detection are output in the form of an output image 108, which contains data identifying the locations of the edges detected in the input image 106.
[0031] In some embodiments, the input image 106 is captured by an instrument, such as a medical instrument. In the example shown in FIG. 1, the input image 106 is an image of an eye captured from an ophthalmoscope. The input image 106 can also come from other sources. Typically the input image 106 is captured by an image capture device, such as a charge-coupled device or a complementary metal-oxide-semiconductor active pixel sensor.
[0032] In some embodiments the input image 106 is stored in the computer readable storage device in the form of an image file. The image can be encoded according to one or more of various image file formats. One example of a suitable image file format is the Joint Photographic Experts Group (JPEG) file format. Other examples of image file formats include exchangeable image file format (EXIF), tagged image file format (TIFF), raw image format (RAW), portable network graphics (PNG) format, graphics interchange format (GIF), bitmap file format (BMP), and portable bitmap (PBM) format. Other embodiments utilize other image file formats. The input data could also be provided in a non-image file format, such as utilizing another data format to convey the image data.
[0033] In some embodiments each pixel of the input image 106 is encoded in multiple color channels, such as red, green, and blue color channels. The color channels include an intensity value that indicates the relative contribution of that color to the pixel color. In other words, each pixel is represented by an intensity value within each color channel. The intensity values typically range from 0 to 255, for example. So, for example, a pixel that is primarily red will have a large intensity value in the red color channel and smaller intensity values in the blue and green color channels. A white pixel will have approximately equal intensities in all three color channels.
[0034] In some embodiments only one color channel of the input image 106 is used by the edge detection engine 104. For example, to evaluate red features (e.g., oxygenated blood) within the eye, the red color channel of the input image 106 can be used. To evaluate blue features (e.g., a vein), the blue color channel of the input image 106 can be used. In other embodiments, two or more of the color channels are used. Further, some embodiments involve a color space transformation. Such a transformation can be used to evaluate other colors, such as cyan, magenta, and/or yellow, for example. Hue, saturation, and/or brightness are used in some embodiments.
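The single-channel selection described above can be sketched in a few lines. This is an illustrative sketch, not part of the original disclosure; the image representation (rows of (R, G, B) tuples) and the function name are assumptions.

```python
# Illustrative sketch (not from the patent): selecting one color channel
# for edge analysis. The image is modeled as a 2-D list of (R, G, B)
# tuples with 8-bit intensities in the range 0-255.

def extract_channel(image, channel):
    """Return a 2-D list of intensities for one color channel.

    channel: 0 = red, 1 = green, 2 = blue.
    """
    return [[pixel[channel] for pixel in row] for row in image]

# A mostly-red pixel has a large red intensity and smaller green/blue ones.
rgb_image = [[(200, 30, 10), (15, 15, 240)],
             [(255, 255, 255), (0, 0, 0)]]
red = extract_channel(rgb_image, 0)   # [[200, 15], [255, 0]]
blue = extract_channel(rgb_image, 2)  # [[10, 240], [255, 0]]
```

A color space transformation (e.g., to hue/saturation/brightness) would simply replace the per-pixel indexing with a conversion function before the channel is extracted.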
[0035] The output image 108 is generated by the edge detection engine 104, and includes data that identifies the locations of edges detected in the input image 106. In some embodiments the pixels in the output image 108 include intensity values. The more distinct the edge is in the input image 106, the larger the intensity value will be at the corresponding point in the output image 108. In some embodiments the output image 108 is also encoded in an image file format, such as the JPEG file format, or another format.
[0036] FIG. 2 is a schematic block diagram illustrating an example of the edge detection engine 104. In this example, the edge detection engine 104 includes an annular aperture generator 110, a line generator 112, an image scanning engine 114, and an output data generator 116.
[0037] The annular aperture generator 110 operates to define an annular aperture. In some embodiments the edge detection engine 104 utilizes the annular aperture to scan the input image 106 to identify edges in the input image, as discussed in further detail below. An example of the annular aperture generator 110 is discussed in further detail herein with reference to FIGS. 3-4.
[0038] The line generator 112 operates to define a set of lines. More specifically, in some embodiments the line generator 112 determines all of the possible ways that the annular aperture (generated by the annular aperture generator 110) can be bisected, and generates a set of lines defining each of the possible bisections. In another possible embodiment, the line generator 112 is operable to generate specific lines as needed. An example of the line generator 112 is discussed in further detail with reference to FIGS. 5-6.
[0039] The image scanning engine 114 operates to scan the input image 106, shown in FIG. 1, to detect edges in the input image 106. An example of the image scanning engine 114 is discussed in further detail with reference to FIGS. 7-12.
[0040] The output data generator 116 operates to generate an output of the edge detection engine 104. In some embodiments the output data generator 116 generates the output image 108, shown in FIG. 1. The output data generator 116 is discussed in further detail with reference to FIGS. 13-16.
[0041] FIGS. 3-4 illustrate examples of the annular aperture generator 110, shown in FIG. 2.
[0042] FIG. 3 is a flow chart illustrating an example method 120 of generating an annular aperture. In some embodiments the method 120 is performed by the annular aperture generator 110, shown in FIG. 2. In this example, the method 120 includes an operation 122 and an operation 124.
[0043] The operation 122 is performed to determine a radius of an annular aperture to be generated. In some embodiments the radius is of a predetermined size. In other embodiments the radius is a selectable parameter. For example, in some embodiments the annular aperture generator 110 prompts a user to enter a desired radius. The optimal radius dimension will typically depend on multiple factors, such as the resolution of the input image 106, the size and complexity of the features of interest in the input image 106, and the level of noise (unimportant details) in the input image 106. As one example, the radius is in a range from about 5 pixels to about 25 pixels. In some embodiments the radius is about 10 pixels.
[0044] Some embodiments utilize other parameters. For example, another possible parameter is the thickness of the annular aperture. In other embodiments, the annular aperture has a predetermined thickness, such as a thickness of one pixel.
[0045] The operation 124 is performed to generate the annular aperture. Because of the grid-like arrangement of pixels in an image, a perfect circular shape cannot be drawn using pixels. Accordingly, in some embodiments the operation 124 determines pixel locations for the annular aperture that approximate a circular shape. An example of operation 124 is illustrated in FIG. 4.
[0046] FIG. 4 is a schematic diagram illustrating an example of an annular aperture 126. A plurality of pixels 128 is also shown. The annular aperture 126 is formed within the plurality of pixels 128, in some embodiments.
[0047] In this example, the desired annular aperture 126 has a radius R and is in the shape of a circle C.
[0048] Because the annular aperture 126 needs to be defined within the plurality of pixels 128, which are arranged in a grid-like configuration, it is not possible for a perfectly circular annular aperture 126 to be generated. As a result, the operation 124 (shown in FIG. 3) is performed to determine pixel locations for the annular aperture that approximate the shape of the circle C.
[0049] In some embodiments, the pixel locations are determined using a circle drawing algorithm. One example of a circle drawing algorithm is the midpoint circle algorithm, also known as Bresenham's circle algorithm. Other embodiments utilize other circle drawing algorithms.
[0050] Using the circle drawing algorithm with a known radius R (e.g., a radius of 7), the annular aperture 126 is generated as represented by the pixels shown in bold lines in FIG. 4. The annular aperture 126 has a shape that approximates the shape of the circle C and has a radius R and a thickness of one pixel.
[0051] The annular aperture 126 generated by the annular aperture generator 110 (FIG. 2) is stored for subsequent use.
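Operations 122 and 124 can be sketched with the midpoint (Bresenham) circle algorithm named above. This is a minimal illustrative sketch, not the patent's code; the function name and the representation of the aperture as a set of (dx, dy) offsets from the center are assumptions.

```python
# Illustrative sketch: generating a one-pixel-thick annular aperture as a
# set of integer offsets from a center pixel, using the midpoint circle
# algorithm. The algorithm computes one octant and mirrors it.

def annular_aperture(radius):
    """Return the set of (dx, dy) offsets approximating a circle of the
    given radius, with a thickness of one pixel."""
    offsets = set()
    x, y = radius, 0
    err = 1 - radius  # midpoint decision variable
    while x >= y:
        # Mirror the computed octant point into all eight octants.
        for dx, dy in ((x, y), (y, x)):
            offsets.update({(dx, dy), (-dx, dy), (dx, -dy), (-dx, -dy)})
        y += 1
        if err < 0:
            err += 2 * y + 1
        else:
            x -= 1
            err += 2 * (y - x) + 1
    return offsets

ring = annular_aperture(3)  # 16 pixels approximating a radius-3 circle
```

Storing the aperture as center-relative offsets means it can be generated once and then re-centered on each analysis point during scanning, matching the "stored for subsequent use" step above.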
[0052] FIG. 5 is a flow chart illustrating an example method 130 of generating a plurality of lines representing the set of possible bisections of the annular aperture shown in FIG. 4. In this example, the method 130 includes operations 132 and 134. In some embodiments the operations 132 and 134 are performed by the line generator 112, shown in FIG. 2.
[0053] The operation 132 is performed to determine a set of possible linear bisections of an annular aperture. An example of the annular aperture is shown in FIG. 4.
[0034] Before searching through the image for possible
das th operat 12 cn e iro Hy he
shapes those edges, nother words, theedge might
Ret erica ine extending from the opt the otto ofthe
snnvlar aperture, or it could he a horizontal line extending
from the left to the right of the aperture. The edge could also
Jan. 21, 2016
be present at some other angle, Because the digital image has
limited number of pixels, the quantity of lines that can be
formed within the annular aperture is limited. In some
‘embodiments, dhe lines are determined by starting at first
pixel of the annular aperture 126 and identifying a Tine that
fan be drawn from that point to the coresponding point
relly opposite that point. The process is then repeated
‘consecutively foreach point around the samolar anette until
all possible angles have been evaluated. Anexample of the ot
‘of postble linear bisection i show in FIG. 6
[0055] The operation 134 is performed to determine pixel locations for each linear bisection. Stated another way, the operation 134 is performed to draw each of the lines between opposing points of the annular aperture 126.
[0056] Because of the grid-like arrangement of the pixels, straight lines can only be drawn vertically and horizontally in the pixels. A straight line having an angle that is not vertical or horizontal cannot be perfectly represented in the pixels. Therefore, in some embodiments the operation 134 involves the use of a line drawing algorithm. One example of a line drawing algorithm is Bresenham's line algorithm. The line drawing algorithm determines a set of pixels that form an approximation to a perfect line extending between two opposing points of the annular aperture.
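The line drawing step of operation 134 can be sketched with the standard Bresenham line algorithm named above. This is an illustrative sketch, not the patent's implementation; the function name is an assumption.

```python
# Illustrative sketch: Bresenham's line algorithm, returning the set of
# pixels that best approximates a straight line between two grid points,
# such as two diametrically opposite pixels of the annular aperture.

def bresenham_line(x0, y0, x1, y1):
    """Return the list of pixels approximating the line (x0,y0)-(x1,y1)."""
    points = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy  # combined error term
    while True:
        points.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:  # step horizontally
            err += dy
            x0 += sx
        if e2 <= dx:  # step vertically
            err += dx
            y0 += sy
    return points
```

For example, the line between opposite aperture pixels at (-3, -1) and (3, 1) is approximated by seven pixels, consistent with the seven-pixel bisections described for the example aperture below.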
[0057] FIG. 6 is a schematic diagram illustrating an example set 138 of linear bisections for the example annular aperture 126 shown in FIG. 4.
[0058] In this example, the set 138 of linear bisections is formed by identifying all straight lines that can bisect the annular aperture 126 (FIG. 4) at various angles. One way to do this is to begin with a starting pixel of the annular aperture 126, such as the pixel 144, draw the linear bisector extending from this pixel to the corresponding pixel on the opposite side of the annular aperture 126, and then consecutively rotate through the adjacent pixels of the annular aperture 126 in the same manner until all possible bisections have been identified. The number of possible bisections varies depending on the pixel size of the annular aperture 126. In this example, the annular aperture has a diameter of seven pixels, and eight possible bisections, as shown.
[0059] Each linear bisection can be identified by an angle of the bisection with respect to a starting location. In this example the angles are identified by a number of pixels around the annular aperture, such that angle 0 is the angle of a linear bisection passing through a first pixel (144) of the annular aperture, angle 1 is the angle of a linear bisection passing through a second pixel (152) of the annular aperture, and so on.
[0060] Although it is possible to convert the angles to degrees, the conversion would require additional processing steps that are unnecessary. As one example, however, the annular aperture can be bisected by eight different lines, such that the angle between each adjacent pixel of the annular aperture is 22.5 degrees (180/8=22.5). Note that the linear bisections from 0 to 180 degrees are the same as the linear bisections from 180 to 360 degrees, such that the computation of one set of the linear bisections is adequate to address all possible linear bisections of the annular aperture.
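The pixel-count-to-degrees arithmetic above (180/8 = 22.5) can be captured in a one-line helper, should a conversion ever be needed. This is an illustrative sketch, not part of the disclosure; the function name is an assumption.

```python
# Illustrative sketch: converting a pixel-count angle index (angle 0,
# angle 1, ...) to degrees. Only 0-180 degrees is needed, because each
# bisection from 180-360 degrees duplicates one from 0-180 degrees.

def bisection_angle_degrees(index, n_bisections):
    """Degrees for the given bisection index, given n distinct bisections."""
    return index * (180.0 / n_bisections)

assert bisection_angle_degrees(1, 8) == 22.5   # angle 1 of 8 bisections
assert bisection_angle_degrees(4, 8) == 90.0   # the horizontal bisection
```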
[0061] ‘The fis linear bissction 140 inthe sot 138, with an
angle 0 (0 degrees), isthe approximation ofa fine 143 extend-
‘ng veically across the annlarapertre. The near bisection
extends from pixel 144 to the coresponding opposite pixel
146. Thelincarbisection 140 includes seven pels fom pixel
144 to pixel 146.US 2016/0019699 AI
[0062] The next linear bisection 148, with an angle 1 (22.5 degrees), is the approximation of a line 150 extending from the next pixel 152 in the clockwise direction from the first pixel 144, to the corresponding opposite pixel 154. In this example it can be seen how the line 150 cannot be perfectly represented in the pixels, and therefore a set of seven pixels extending from pixel 152 to pixel 154 is selected to best approximate the line 150.
[0063] The next linear bisection 156, with an angle 2 (45 degrees), is the approximation of a line 158. The linear bisection includes seven pixels extending from pixel 160 to pixel 162.
[0064] The linear bisection 164 has an angle A3 (67.5 degrees), and is the approximation of a line 166. The linear bisection extends from pixel 168 to pixel 170.
[0065] The linear bisection 172 is an approximation of the horizontal line 174 having an angle A4 (90 degrees), which extends from pixel 176 to pixel 178.
[0066] The next linear bisection 180 has an angle A5 (112.5 degrees), and is the approximation of a line 182. The linear bisection 180 extends from pixel 184 to pixel 186.
[0067] The linear bisection 188 has an angle A6 (135 degrees), and is the approximation of a line 190. The linear bisection 188 extends from pixel 192 to pixel 194.
[0068] At angle A7 (157.5 degrees) is the linear bisection 196 that approximates the line 198. The linear bisection extends from pixel 200 to pixel 202.
[0069] Advancing to the next pixel around the annular aperture arrives at pixel 146, and the linear bisection from pixel 146 is the same as the line 140 at angle 0. Therefore all linear bisections have been identified for the example annular aperture 126, shown in FIG. 4. Larger annular apertures will have a larger quantity of linear bisections, while smaller annular apertures will have a smaller quantity of linear bisections.
[0070] In some embodiments the set 138 of linear bisections is stored in a computer readable storage device for subsequent use.
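The enumeration of diametrical bisections described in FIG. 6 can be sketched as follows. This is an illustrative simplification, not the patent's code: rather than walking the stored aperture pixels, it parameterizes each bisection by an angle index and rasterizes the diameter by rounding, so the exact pixel sets can differ slightly from a Bresenham-drawn line. All names are assumptions.

```python
import math

def bisection_offsets(radius, angle_index, n_bisections):
    """Pixel offsets (dx, dy) approximating one diametrical bisection of an
    aperture of the given radius, for the given angle index.

    Angle index 0 is the vertical bisection; only 0-180 degrees is covered,
    since the remaining bisections are duplicates.
    """
    theta = math.pi * angle_index / n_bisections
    dx, dy = math.sin(theta), -math.cos(theta)
    # Step from one rim pixel through the center to the opposite rim pixel.
    return sorted({(round(t * dx), round(t * dy))
                   for t in range(-radius, radius + 1)})

def bisection_set(radius, n_bisections):
    """The stored set of all bisections, keyed by angle index."""
    return {k: bisection_offsets(radius, k, n_bisections)
            for k in range(n_bisections)}

lines = bisection_set(3, 8)  # 8 bisections for a radius-3 aperture
```

As in the text, the set is computed once up front and stored, so the scanning step can reuse the lines instead of re-deriving them at every pixel.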
[0071] FIG. 7 is a flow chart illustrating an example method 210 of scanning an image using an annular aperture. In this example, the method 210 includes operations 212 and 214. In some embodiments the operations 212 and 214 are performed by an image scanning engine 114, shown in FIG. 2.
[0072] The operation 212 is performed to scan an image 106 (FIG. 1) for edge locations using an annular aperture. An example of operation 212 is illustrated and described in more detail with reference to FIGS. 8-12.
[0073] The operation 214 is performed to generate an output image 108 (FIG. 1) identifying the edge locations. An example of operation 214 is illustrated and described in more detail with reference to FIGS. 13-16.
[0074] FIG. 8 is a flow chart illustrating an example method 220 of scanning an image for edge locations using an annular aperture. FIG. 8 also illustrates an example of the operation 212, shown in FIG. 7. In this example, the method 220 includes operations 222, 224, 226, 228, 230, 232, 234, and 236.
[0075] The method 220 is performed to scan an input image, such as the image 106, shown in FIG. 1, to identify edges within the image 106. As described herein, in some embodiments the method 220 involves scanning only a single color channel of the input image 106. For example, the red color channel can be evaluated. Within the red color channel, each pixel of the image 106 is represented by an intensity value. The intensity value ranges from 0 to 255, for example. The intensity value indicates the brightness of the color associated with the color channel (e.g., red) in the pixel.
[0076] The operation 222 is performed to determine a starting pixel, and to begin the scanning and analysis of the image at that point. For example, the starting pixel can be the upper left pixel of the image.
[0077] A problem with edge or corner pixels, however, is that evaluation of such pixels requires that the annular aperture 126 (FIG. 4) be positioned such that the annular aperture 126 extends outside of the bounds of the image. In such a case, it is desirable to know what the background color is in the image. For example, if it is known that the background is black, the evaluation can proceed by using a default intensity value corresponding with the background color (e.g., an intensity of zero, representing a dark pixel).
[0078] Alternatively, pixels that are less than the radius of the annular aperture 126 (FIG. 4) away from the edge are omitted from processing in method 220. For an annular aperture having a diameter of 7 pixels, for example, the starting pixel can be the pixel that is four pixels down and four pixels to the right of the upper left pixel. An example is shown in FIG. 9. Various other starting points could also be used in other embodiments.
[0079] Once the starting point has been determined and set as the first analysis point in the image 106, the operation 224 is performed to identify pixels surrounding the analysis point using the annular aperture 126 (FIG. 4). To do so, the annular aperture 126 is used as a mask layer to identify only those pixels in the image 106 that are within the annular aperture 126 when the annular aperture 126 is centered on the analysis point. An example is shown in FIG. 9.
[0080] The operation 226 is performed to bisect the annular aperture 126 to group the pixels into two halves. An example of operation 226 is shown in FIG. 10.
[0081] The operation 228 is performed to determine an angle of the bisection that maximizes a difference in intensities between the two halves. To do so, the intensity values for each pixel within a first half of the annular aperture 126 are added together, and the intensity values for each pixel within the second half of the annular aperture 126 are also added together. The combined intensity value of the first half is then compared with the combined intensity value of the second half to determine a difference between the intensity values.
[0082] The same process is repeated for each possible bisection of the annular aperture 126, and the differences between the intensity values are determined for each possible bisection. An example is illustrated in FIG. 11.
[0083] If a large difference in the intensity values is found for a given bisection, the difference indicates the likely presence of an edge within the image 106 at or near the location of the analysis point.
[0084] The operation 228 identifies the bisection angle that results in the greatest difference in the intensity values between the two halves.
[0085] In operation 230, the angle that results in the greatest difference is then stored in a computer readable storage device for the analysis point, along with the intensity value difference. The difference in the intensity values between the two halves is sometimes referred to herein as a magnitude. In some embodiments the angle and magnitude are stored in an angle and intensity map. An example of an angle and magnitude map is shown in FIG. 12.
[0086] Once the angle and the magnitude have been computed and stored for the analysis point, operation 232 determines whether there are additional pixels that need to be analyzed. If so, operation 234 sets the next pixel as the analysis point and repeats operations 224, 226, 228, 230, and 232 accordingly. Otherwise the method 220 ends at operation 236.
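The scanning loop of operations 222-236 can be sketched end to end on a deliberately tiny aperture. This is an illustrative simplification, not the patent's implementation: it uses a hard-coded radius-1 ring of 8 pixels with 4 diametrical splits, omits the corner-pixel handling by skipping a margin (as in the alternative of paragraph [0078]), and all names and the split-index-to-angle mapping are assumptions.

```python
# Illustrative sketch of method 220: for each interior analysis point,
# sum the ring intensities on either side of each bisection, keep the
# split index with the greatest difference (the "magnitude"), and store
# (angle index, magnitude) per pixel, as in the angle and magnitude map.

# Hard-coded radius-1 ring offsets, listed in clockwise order from the top.
RING = [(0, -1), (1, -1), (1, 0), (1, 1),
        (0, 1), (-1, 1), (-1, 0), (-1, -1)]

def halves(k):
    """Split the 8 ring pixels into two arcs of 4 for split index k."""
    left = {RING[(k + i) % 8] for i in range(4)}
    return left, set(RING) - left

def scan(image, margin=1):
    """Return {(x, y): (best_split_index, magnitude)} for interior pixels.

    image is a 2-D list of single-channel intensities (0-255).
    """
    h, w = len(image), len(image[0])
    result = {}
    for y in range(margin, h - margin):
        for x in range(margin, w - margin):
            best = (0, -1)
            for k in range(4):  # 4 distinct diametrical splits of 8 pixels
                a, b = halves(k)
                diff = abs(sum(image[y + dy][x + dx] for dx, dy in a) -
                           sum(image[y + dy][x + dx] for dx, dy in b))
                if diff > best[1]:
                    best = (k, diff)
            result[(x, y)] = best
    return result
```

Centering the ring on a pixel that straddles a vertical intensity step yields a large magnitude there and small magnitudes in flat regions, which is what the threshold test in the output-generation stage (FIG. 13) relies on.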
[0087] FIG. 9 is a schematic diagram illustrating an example of a starting pixel of an input image 106, and also showing an example of the pixels of the image that are within the annular aperture 126. Only an upper left portion of the image 106 is represented in FIG. 9.
[0088] In this example, the annular aperture has a diameter of seven pixels. As a result, any pixels that are located less than the radius (3.5 pixels) of the annular aperture away from the edge of the image are designated as edge pixels. If the annular aperture were centered on an edge pixel, a portion of the annular aperture would extend outside of the bounds of the image. In some embodiments the scanning of the image involves the use of interior pixels that are greater than the radius of the annular aperture 126 away from the bounds of the image.
[0089] In some embodiments each pixel of the image 106 is represented by a coordinate value of (X,Y), where X is the horizontal number of pixels from the left side of the image and Y is the vertical number of pixels from the top of the image. The upper left pixel has a coordinate (0,0).
[0090] In this example, the pixel (3,3) is selected as the starting pixel, and is therefore set as the first analysis point 240.
[0091] The annular aperture 126 is then used to identify a set of pixels surrounding the analysis point that are within the annular aperture 126. In FIG. 9 the pixels within the annular aperture are represented with bold lines.
[0092] FIG. 10 is a schematic diagram illustrating an example of operation 226, shown in FIG. 8, during which the annular aperture 126 is bisected along a bisection line 242 to group the pixels within the annular aperture 126 into two halves 244 and 246.
[0093] The annular aperture 126 is bisected along a bisection line 242. The example shown in FIG. 10 illustrates a vertical bisection line 242. The vertical bisection line 242