Proceedings of the 2005 IEEE/ASME International Conference on Advanced Intelligent Mechatronics
Monterey, California, USA, 24-28 July, 2005 (TC4-01)

IR Sensor Array for a Mobile Robot


Hyunwoong Park, Sungjin Baek and Sooyong Lee

Abstract— This paper describes a new sensor system for improving the accuracy of range information using multiple IR range sensors. Environment and obstacle sensing is the key issue for mobile robot localization and navigation. Laser scanners cover 180◦ and are accurate, but are too expensive. Radial range sensors such as the laser scanner, the IR scanner and the ultrasonic range sensor ring have blind spots, so a small obstacle that is not close to the sensor may easily be missed. It is necessary to develop a low-cost sensor system which covers 360◦ with small blind spots. A sensor system with 12 IR range sensors (each rotating ±37.8◦ so that the scanning areas overlap, reducing the blind spots) is designed and implemented. Iterative estimation of range from the IR sensor information is developed and verified with experiments.

I. INTRODUCTION

The importance of environment sensing for mobile robot navigation and localization motivated the work described in this paper. In most mobile robot applications, the robot enters an unknown environment and, relying solely on range sensor information, builds an environment map that can be used for collision-free navigation and localization. Conventional ultrasonic sensors measure distance using time of flight. The detected object may be located anywhere along the perimeter of the sonar sensor's beam pattern. Therefore, as in [1], the distance information that an ultrasonic range sensor provides is fairly accurate in depth, but not in azimuth. [2], [3], [4], [5] focused on modeling of ultrasonic range sensors to obtain accurate range information. A laser scanner provides high resolution information; however, it incurs a high cost, sometimes more than the mobile base itself. [6] developed a mixed camera-laser based system for mobile robot navigation. [7] also introduced multisensor fusion for SLAM.

A mobile robot builds the environment map for localization and navigation from range sensor information, as in [8], or by combining it with odometry [9]. With reliable range data, the environment map is built using occupancy grids [10]. The accuracy of the map therefore depends on the accuracy of the range sensor information, and so do the localization and navigation.

Unlike the ultrasonic range sensor, the IR range sensor emits a very narrow beam and measures the distance either from the offset of the reflected beam or from the intensity of the reflected light. The IR sensor also has a maximum-distance limit, usually smaller than that of ultrasonic sensors. Unless the IR sensor rotates a very small amount of angle at each step, a small obstacle may be missed.

This work was supported by Grant No. (R01-2003-000-10336-0) from the Basic Research Program of the Korea Science & Engineering Foundation. Hyunwoong Park, Sungjin Baek and Sooyong Lee are with the Department of Mechanical and System Design Engineering, Hongik University, Seoul, #121-791, KOREA. sooyong@hongik.ac.kr

Fig. 1. Multiple Rotating Range Sensors (12 sensors, S#1 to S#12, placed around the robot, each sweeping an angle θ)

Figure 1 shows the multiple rotating range sensors proposed in this paper. There are 12 IR range sensors around the mobile robot. Each sensor rotates from −θ/2 to +θ/2 with a step size of ∆θ, and the scanning area of each sensor overlaps with those of the nearby sensors. This idea has two major advantages. The first is that the time for a 360◦ scan is shorter than with only one sensor. The second, and more important, is that the blind spot becomes much smaller than with only one sensor.

This paper describes the rotational IR range sensor system shown in Figure 1. In the following section, the proposed system is analyzed to show its advantages and to find the design parameters. In Section III, an iterative estimation of the distance from the sensor information is described. Section IV shows the mechanical design and implementation of the sensor system with experimental results, followed by the conclusion.

II. RADIAL SENSOR ARRAYS AND ROTATING SENSOR ARRAYS

In order to scan 360◦, one simple way is to rotate a range sensor. If the sensor ray is very narrow, as with the IR range sensor, a small obstacle located not very close to the sensor may not be sensed. We define a metric, r, as the maximum radius of a circular obstacle that might not be sensed. Figure 2

0-7803-9046-6/05/$20.00 ©2005 IEEE. 928


shows how the metric r, the distance d, and the resolution of the rotation, ∆θ, are related.

Fig. 2. r of the Radial Range Sensor System

Fig. 3. r of the Multiple Rotating Range Sensor System (the (i)th and (i+1)th measurements of the (k)th sensor overlap the (j)th and (j+1)th measurements of the (k+1)th sensor)
As the angle difference between the (i)th measurement and the (i+1)th one is ∆θ, the metric r is expressed as Eq. 1:

    r = d / ( 1/sin(∆θ/2) − 1 )    (1)

As expected, r grows proportionally to the distance d. For instance, r = 0.016 m with ∆θ = 1.8◦ and d = 1 m. However, if there are multiple rotating sensors and there exist overlapping areas, r is bounded, because a circular obstacle whose radius is larger than r will be detected by the (i)th measurement from the (k)th sensor and/or the (j+1)th measurement from the (k+1)th sensor, as shown in Figure 3. The maximum value of r is close to half the distance between two nearby sensors (the (k)th and (k+1)th sensors in Figure 3). A more important feature of the developed sensor system is that the robot can very likely recognize whether there exists open area beyond an obstacle. This is not possible with only one rotating range sensor once the sensor ray is blocked by the obstacle. Figure 4 shows the measurement from a SICK laser scanner. Even though there are empty (unoccupied) regions beyond the obstacle, the radial scanner does not provide that information. One common example is a leg of a table: a cylinder with a relatively small radius, but the area beyond the leg is regarded as occupied if only one rotating range sensor is used, so the robot would not be commanded to navigate around the leg, or an important localization feature would be missed. The laser scanner and the IR scanner belong to this case.

Fig. 4. Laser Scanner Measurement (x, y in mm)

In the overlapping area, an obstacle would be hit by two different sensors, and more accurate measurements of the location and the shape of the obstacle are easily obtained in this area. Therefore, it is important to design the sensor system to maximize this area. Figure 5 shows the overlapping area with several design parameters. Each sensor rotates from −α to +α, and D is the maximum distance the sensor can measure. R is the radius of the circle on which the sensors are placed, and the sensors are spaced Φ apart. If N sensors are used, then Φ = 2π/N.

Fig. 5. Overlapping Area

For given R, D, Φ and α, the angle β is found from the following equation:

    β = cos⁻¹( R·sin(Φ/2) / D ) − π/2 + α − Φ/2    (2)
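As a quick numerical check of Eqs. 1 and 2, the following Python sketch evaluates both; the design values R, D, Φ and α are those listed in Table I. The resulting β (about 19.6◦ with these values) is our computed figure, not one quoted in the text.

```python
import math

def blind_spot_radius(d, dtheta):
    """Eq. 1: maximum radius of a circular obstacle that can slip
    between two adjacent rays dtheta apart at distance d."""
    return d / (1.0 / math.sin(dtheta / 2.0) - 1.0)

def overlap_angle(R, D, Phi, alpha):
    """Eq. 2: angle beta of the overlapping region between two
    neighboring rotating sensors."""
    return (math.acos(R * math.sin(Phi / 2.0) / D)
            - math.pi / 2.0 + alpha - Phi / 2.0)

r = blind_spot_radius(d=1.0, dtheta=math.radians(1.8))
beta = overlap_angle(R=0.15, D=0.686,
                     Phi=math.radians(30.0), alpha=math.radians(37.8))
print(round(r, 3))   # ~0.016 m, matching the value quoted in the text
print(round(math.degrees(beta), 1))
```

Note that r depends only on d and ∆θ, while β folds in all four design parameters, which is why the design search in the next subsection varies R, D, Φ and α together.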

TABLE I
DESIGN PARAMETERS

    Parameter   Value
    R           0.15 m
    D           0.686 m
    Φ           30◦
    α           37.8◦

The overlapping area S follows from Eq. 3:

    S = 2 [ πD²·β/(2π) − (1/2)·D·( R·sin(Φ/2) / cos(π/2 − α + Φ/2) )·sin β ]    (3)

These two equations provide the way to maximize the area by varying the design parameters. Table I shows the values chosen for our system. From Eqs. 2 and 3, the size of the overlapping area S is 0.138 m².

III. SENSOR INFORMATION PROCESSING

The IR sensors used in this work are the SHARP GP2 series. Instead of measuring the amount of reflected light, this type of sensor measures the offset distance of the point on the photo-resistive strip where the reflected light hits. Therefore, it is much less influenced by the color and the reflectivity of the reflecting objects. Most analog-output IR range sensors have the characteristic between the distance and the output analog voltage shown in Figure 6. The sensor output voltage is converted to a digital number using an A/D converter (10 bit, unipolar, 0V-5V).

Fig. 6. IR Range Sensor Characteristics: Normal Projection (distance [cm] vs. A/D converted values)

However, depending on the projection angle (θ in Figure 7), the sensor characteristics vary significantly.

Fig. 7. IR Range Sensor Kinematics (sensor frame X-Y, sensor angle φ, surface angle θ, emitter offset do)

If θ = 0, then the kinematic parameter do, the offset from the center of the IR sensor where the IR light is emitted, is estimated from the following equations:

    y = −do·sin φ + d·cos φ    (4)

    d = −do·tan φ + y/cos φ    (5)

The error at φi is defined as

    εi = di − ( −do·tan φi + yi/cos φi )    (6)

and the sum of the squared errors is

    Σi εi² = Σi ( di − ( −do·tan φi + yi/cos φi ) )²    (7)

To minimize the sum of the squared errors, the following partial derivative is taken:

    ∂(Σi εi²)/∂do = Σi ( 2·di·tan φi + 2·do·tan² φi − 2·yi·tan φi / cos φi )    (8)

By setting ∂(Σi εi²)/∂do = 0,

    do = Σi ( yi·tan φi / cos φi − di·tan φi ) / Σi tan² φi    (9)

The parameter do is estimated experimentally from the above equation with measured di and known yi, φi and θ. Difficulties arise from the fact that θ is not known in application. From Figure 6, the relation between the distance d and the sensor output S can be simply represented as

    d = a·S^b    (10)

where a > 0 and b < 0. However, a and b are not constants but vary as θ does. More importantly, they also depend on d.

Figure 8 shows how much the sensor output varies depending on θ. Because θ is not known a priori, accurate estimation of the distance is not possible without knowing it. In other words, the relation between the distance and the output voltage depends on θ. Therefore, there should be a way of estimating θ as well as the distance from the sensor output. An iterative estimation is developed as shown in Table II. Using the developed iterative estimation, accurate estimation is achieved even with θ ≥ 70◦, which is not possible with the ultrasonic range sensor. Figure 9 shows the experimental data. A convex and a concave corner are accurately recognized. The plus sign in the figures (at x =
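The closed-form estimate of Eq. 9 can be sketched as below; the numbers used in the check (a 20 mm offset and the sample angles and wall intercepts) are illustrative values, not measurements from the paper.

```python
import math

def estimate_do(phis, ys, ds):
    """Least-squares estimate of the emitter offset d_o (Eq. 9),
    minimizing sum_i (d_i - (-d_o*tan(phi_i) + y_i/cos(phi_i)))^2."""
    num = sum(y * math.tan(p) / math.cos(p) - d * math.tan(p)
              for p, y, d in zip(phis, ys, ds))
    den = sum(math.tan(p) ** 2 for p in phis)
    return num / den

# Synthetic check: generate noise-free readings from Eq. 5
# with a known offset, then recover it.
do_true = 0.02                      # hypothetical 20 mm offset
phis = [math.radians(a) for a in (10, 20, 30, 40)]
ys = [0.5, 0.6, 0.7, 0.8]           # arbitrary wall intercepts [m]
ds = [-do_true * math.tan(p) + y / math.cos(p)
      for p, y in zip(phis, ys)]
print(round(estimate_do(phis, ys, ds), 4))  # recovers 0.02
```

With noise-free data the estimator is exact; with real readings it averages the measurement noise over all sweep angles, which is the point of using the whole sweep rather than a single ray.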

Fig. 8. IR Range Sensor Characteristics (distance vs. sensor value and θ for three calibration cases: CASE #1 constant a, b; CASE #2 a, b fixed at θ = 0◦; CASE #3 a(θ), b(θ); with the distance errors (CASE #1)−(CASE #2) and (CASE #1)−(CASE #3))

Fig. 9. Convex and Concave Corners
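The iterative scheme summarized in Table II can be sketched as follows. The calibration lookup a(θ), b(θ) below is a made-up placeholder (the real dependence must be measured, as in Figure 8), and the line fit is one simple way to realize steps 4)–5); treat the whole block as an illustrative sketch, not the paper's implementation.

```python
import math

def calib(theta):
    """Placeholder calibration a(theta), b(theta) for Eq. 10
    (d = a * S**b). Illustrative values only."""
    return 6000.0 * (1.0 + 0.2 * abs(theta)), -1.0

def iterative_estimate(readings, eps=1e-3, max_iter=50):
    """Table II sketch: estimate distance d and relative angle theta
    from readings [(S_i, phi_i), ...] of one sensor sweep."""
    theta = 0.0                                   # 1) assume theta = 0
    for _ in range(max_iter):
        a, b = calib(theta)
        # 2)-3) convert readings to ray endpoints in Cartesian coordinates
        pts = [(a * S**b * math.cos(phi), a * S**b * math.sin(phi))
               for S, phi in readings]
        # 4)-5) fit a line through the endpoints; its slope gives theta_new
        n = len(pts)
        mx = sum(x for x, _ in pts) / n
        my = sum(y for _, y in pts) / n
        sxx = sum((x - mx) ** 2 for x, _ in pts)
        sxy = sum((x - mx) * (y - my) for x, y in pts)
        theta_new = math.atan2(sxy, sxx)
        if abs(theta - theta_new) < eps:          # 6) converged
            return a * readings[0][0] ** b, theta_new
        theta = theta_new
    return a * readings[0][0] ** b, theta
```

Each pass re-reads the calibration at the current angle estimate, so the distance and angle estimates refine each other until the angle stops changing.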

TABLE II
ITERATIVE ESTIMATION

    Input: S (Sensor Reading), φ (Sensor Angle)
    Output: d (Distance), θ (Relative Angle)
    1) assume θ = 0
    2) estimate d from Eq. 10 with a, b corresponding to θ
    3) calculate the position of the endpoint of the sensor ray, xi, yi, in Cartesian coordinates from d, θ and φ
    4) group xi, yi into X, Y to form a line
    5) calculate θnew from X and Y
    6) repeat from 2) until |θ − θnew| < ε

0, y = 0) shows the location of the rotating IR range sensor, and the circle represents the maximum of the sensor range.

IV. EXPERIMENT

A sensor system is developed by placing 12 rotating IR sensors on a perimeter, 30◦ apart. As shown in Figure 10, a stepper motor rotates all the sensors synchronously with a timing belt. The stepper motor rotates with a step size of 1.8◦, and each sensor rotates with a step size of 1.08◦ due to the gear reduction. Figures 11 and 12 are photos of the sensor system.

Fig. 10. Design of the Multiple Rotating Sensor System

The ATMega8535 microcontroller is used for the stepper motor control and for communication with a PC via RS232C. Six MCP3202 A/D converters (12 bits) are interfaced to the microcontroller via SPI. Each A/D converter has two channels, so a total of 12 IR sensor readings are converted to digital numbers.

A T-shaped environment is made for the experiments (Figure 14). Its dimensions are shown in Figure 15.

As the IR sensor limit (D) is set to 624 mm, it is not possible to detect all the edges of the environment even from the center of the environment. Instead, the map is built from several measurements at different locations. The environment is partitioned into same-sized cells (10 mm × 10 mm). Initially, each cell's occupancy probability is set to 1. A white cell means that it is not occupied, i.e. there is no obstacle in that cell. Black means there is an obstacle or the cell has not been sensed yet. As each sensor ray gives distance information, the sensor ray is transformed into a line in Cartesian coordinates and the probability of the cells the ray passed through is changed to zero. In order to account for the uncertainty of the sensor information, a Gaussian distribution is used. Gray cells show intermediate probability. A single scan of the developed sensor system (Figure 16) clearly shows the advantage of the rotating sensor measurement. The red circle in the figure shows where the center of the sensor system is located. The occupancy grids show that there exists a vertical edge on the right side, and another horizontal
Fig. 11. Multiple Rotating Sensors: Top View

Fig. 12. Multiple Rotating Sensors: Side View

Fig. 13. Controller (stepper motor and motor driver; ATMega8535 microcontroller; six MCP3202 A/D converters on the SPI bus, each serving two of the 12 sensors)

Fig. 14. Experiment

edge at the bottom. It also detected a convex corner at the upper left side. An arc of a circle represents the sensor range maximum, not a real edge. We can tell that the left side and the top side are open as far as the sensor limit. Note that there exists a cylinder in the lower right corner. The edge behind the cylinder is detected thanks to the rotating sensors, which is not possible with radial sensors. The following two figures show the progressive environment map building. The first one is without an obstacle in the environment (Figure 17). The first scan builds the map of the lower right corner. The second one adds the lower center, followed by the lower left and the center top. Two small cylinders and one large cylinder are detected in the environment in Figure 18. Similar to the first case, four scans complete the map of the environment including the obstacles.

V. CONCLUSION

We have presented a novel rotating IR range sensor system for a mobile robot. Instead of placing sensors along the radial direction, each sensor rotates so that substantial information about the environment can be obtained from the overlapping of the sensor rays. An iterative estimation algorithm is developed to enhance the sensor accuracy. The experiments show that the sensor system can accurately build a map of the environment with obstacles.

REFERENCES

[1] H. Choset, K. Nagatani and N. A. Lazar, "The Arc-Transversal Median Algorithm: A Geometric Approach to Increasing Ultrasonic Sensor Azimuth Accuracy", IEEE Transactions on Robotics and Automation, vol. 19, no. 3, pp. 513-522, June, 2003
[2] J. Borenstein and Y. Koren, "Error eliminating rapid ultrasonic firing for mobile robot obstacle avoidance", IEEE Transactions on Robotics and Automation, vol. 11, pp. 132-138, February, 1995
[3] D. Bank, "A Novel Ultrasonic Sensing System for Autonomous Mobile Systems", IEEE Sensors Journal, vol. 2, no. 6, pp. 597-606, December, 2002
[4] J. J. Leonard and H. F. Durrant-Whyte, Directed Sonar Sensing for Mobile Robot Navigation. Norwell, MA: Kluwer, 1992
[5] O. Wijk and H. Christensen, "Triangulation-based fusion of sonar data with application in robot pose tracking", IEEE Transactions on Robotics and Automation, vol. 16, pp. 740-752, December, 2000
[6] D. Dedieu, V. Cadenat and P. Soueres, "Mixed camera-laser-based control for mobile robot navigation", Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1081-1086, 2000
[7] J. A. Castellanos, J. Neira and D. Tardos, "Multisensor fusion for simultaneous localization and map building", IEEE Transactions on Robotics and Automation, vol. 17, pp. 908-914, December, 2001
[8] L. Kleeman and R. Kuc, "Mobile robot map building from an advanced sonar array and accurate odometry", IEEE Transactions on Robotics and Automation, vol. 13, pp. 3-19, February, 1997
[9] K. S. Chong and L. Kleeman, "Mobile robot map building from an advanced sonar array and accurate odometry", International Journal of Robotics Research, vol. 18, pp. 20-36, January, 1999
[10] A. Elfes, "Using occupancy grids for mobile robot perception and navigation", Computer Magazine, vol. 22, no. 6, pp. 46-57, June, 1989

Fig. 15. Dimensions of the Environment (T-shaped: 2420 mm overall, with 900 mm and 760 mm segment widths)

Fig. 16. Processed Range Data from the Rotating Range Sensors with an Obstacle in the Lower Right Corner

Fig. 17. Progressive Environment Map Building with Four Measurements

Fig. 18. Progressive Environment Map Building with Four Measurements: Three Circular Obstacles