Localization of Wheeled Mobile Robot Based On Extended Kalman Filtering
ABSTRACT: A mobile robot localization method which combines relative positioning with absolute orientation is presented. An encoder and a gyroscope are used for relative positioning, and a laser radar is used to detect absolute orientation. In this paper, we establish the environmental map, the multi-sensor information fusion model, and the sensor and robot motion models. Extended Kalman Filtering (EKF) is adopted as the multi-sensor data fusion technology to realize precise localization of the wheeled mobile robot.
With the development of technology, mobile robots have made tremendous progress. However, people now place higher demands on mobile robots, in both function and performance, because of increasingly complex environments and tasks; therefore, traditional robots are evolving into intelligent robots. Intelligentization poses new challenges to robot navigation. As the foundation of robot navigation, localization of mobile robots has received great attention. Because of its simple structure, high efficiency and controllability, the wheeled mobile robot is widely used and researched as an important branch of mobile robotics. Wheeled robot localization technology has become a hotspot in robotics research.

No single sensor can guarantee the accuracy, reliability and abundance of information, because of its limited resources; therefore, a robot cannot locate itself precisely based on only one sensor. Multi-sensor inputs can solve the problem of insufficient data by providing redundant and supplemental information. By properly fusing the information from multiple sensors, the robot can obtain accurate geographic information. This technology has become a new research direction in sensor systems.

In this paper, we put forward a positioning method based on EKF with a feature extraction function. An odometer, a gyro and a laser radar are adopted as the main sensors, combining relative positioning with absolute orientation. A robot motion model is proposed by fusing odometer and gyro data. We establish the robot location observation model with environmental features obtained from the laser radar. By combining the location observation model with the motion model, and then tracking environmental features with the EKF, we realize precise localization of the wheeled mobile robot.

Figure 1 is the diagram of the robot positioning system with EKF. The diagram shows that it is a recursive procedure. Location prediction, or motion update, is the first stage: we obtain odometer and gyro data by applying a Gaussian error motion model directly to their measured values, and by fusing these data we generate the predicted location. We then search the environmental map based on the predicted value to find the predicted observations that match this information, that is, to predict the environmental features and their locations as seen by the laser radar. During this procedure, the robot compares the predicted values with the real values from the laser radar to find the best match. Finally, we fuse the information from the best match with the EKF to update the prediction confidence, thus obtaining the best estimate of the robot location.

In this paper, we experiment on a differential-drive wheeled robot. Encoders installed on the wheels serve as the odometer; together with the gyro, they are used to estimate the robot location. Location errors increase over time because of the measurement errors of these two sensors. The external sensor, a laser radar installed on the robot, is the key element in eliminating these errors: it continually matches external conditions against the environmental map to obtain absolute location information. With this information, the robot can correct its location errors and overcome the growing cumulative error, thus realizing long-term precise localization.

3 SENSOR MODEL

3.1 Odometer Model

As an effective relative positioning sensor, the odometer has been widely used in the field of wheeled mobile robots. It estimates the changes of attitude by detecting
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits
unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
[Figure 1. Robot positioning system with EKF: the gyroscope and odometer estimates are fused into a predicted value, which is matched against the features extracted from the laser radar's observed values; the matched result is fused into the position estimation.]
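The recursive procedure of Figure 1 (predict from odometer and gyro data, match against laser-radar features, fuse with EKF) can be sketched as a single update step. This is a minimal sketch only: the model functions, `Q`, and `R` below are hypothetical placeholders for the models developed in the following sections.

```python
import numpy as np

def ekf_localization_step(p_est, P_est, u, z_obs, motion_model, obs_model, Q, R):
    """One recursive step of the EKF positioning loop (illustrative sketch)."""
    # 1. Location prediction (motion update) from odometer + gyro data u.
    p_pred, F = motion_model(p_est, u)   # predicted pose and its Jacobian
    P_pred = F @ P_est @ F.T + Q         # propagate prediction confidence

    # 2. Predicted observation: expected environmental feature from the map.
    z_pred, H = obs_model(p_pred)        # expected feature and its Jacobian

    # 3. Matching: innovation between the laser-radar value and the prediction.
    v = z_obs - z_pred
    S = H @ P_pred @ H.T + R             # innovation covariance

    # 4. Estimation fusion: the Kalman gain updates the predicted location.
    K = P_pred @ H.T @ np.linalg.inv(S)
    p_new = p_pred + K @ v
    P_new = (np.eye(len(p_new)) - K @ H) @ P_pred
    return p_new, P_new
```

Iterating this step implements the recursion in the diagram: each cycle's fused estimate becomes the next cycle's prior.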
Δy = Δs · sin(θ + Δθ/2)    (3)

Δθ = (Δs_r − Δs_l) / b    (4)

Δs = (Δs_r + Δs_l) / 2    (5)

Where Δs_r and Δs_l represent the moving distances of the right and left wheels respectively, and b represents the distance between the two driving wheels. Therefore, the updated position p′ = (x′, y′, θ′) is:

p′ = ( x + Δs · cos(θ + Δθ/2),
       y + Δs · sin(θ + Δθ/2),
       θ + Δθ )    (6)

Figure 2. Motion model of the differential driving robot (robot pose P(x, y, θ) in the global frame (X, Y), with local frame (X_R, Y_R))
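A minimal sketch of the pose update of expressions (3)–(6), together with the error propagation of expression (7) discussed in the next section. The Jacobians follow from differentiating (6); the diagonal increment covariance with constants `kr`, `kl` is an illustrative assumption, not a value taken from the paper.

```python
import math
import numpy as np

def odometry_update(x, y, theta, ds_r, ds_l, b):
    """Pose update of expressions (3)-(6) for a differential-drive robot."""
    ds = (ds_r + ds_l) / 2.0       # expression (5): mean travelled distance
    dth = (ds_r - ds_l) / b        # expression (4): heading change
    a = theta + dth / 2.0          # mid-arc heading
    return x + ds * math.cos(a), y + ds * math.sin(a), theta + dth  # (6)

def propagate_covariance(P, theta, ds_r, ds_l, b, kr=1e-3, kl=1e-3):
    """Error propagation of expression (7): Fp P Fp^T + Fd Sd Fd^T.

    Sd = diag(kr*|ds_r|, kl*|ds_l|) is a commonly assumed wheel-increment
    covariance; kr, kl are illustrative error-growth constants.
    """
    ds = (ds_r + ds_l) / 2.0
    dth = (ds_r - ds_l) / b
    a = theta + dth / 2.0
    s, c = math.sin(a), math.cos(a)
    # Jacobian of the motion model w.r.t. the pose p = (x, y, theta)
    Fp = np.array([[1.0, 0.0, -ds * s],
                   [0.0, 1.0,  ds * c],
                   [0.0, 0.0,  1.0]])
    # Jacobian of the motion model w.r.t. the increments (ds_r, ds_l)
    Fd = np.array([[0.5 * c - ds / (2 * b) * s, 0.5 * c + ds / (2 * b) * s],
                   [0.5 * s + ds / (2 * b) * c, 0.5 * s - ds / (2 * b) * c],
                   [1.0 / b,                   -1.0 / b]])
    Sd = np.diag([kr * abs(ds_r), kl * abs(ds_l)])
    return Fp @ P @ Fp.T + Fd @ Sd @ Fd.T
```

For a straight run (ds_r = ds_l), the heading term Δθ vanishes and the pose advances along the current heading, as expression (6) prescribes.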
ICETA 2015
Expression (6) is the basic equation for updating the position value from the odometer. In the increment vector (Δs_r, Δs_l) there exist errors introduced by uncertain integration errors and by the approximate motion model. For this reason, we must establish an error model of the position p′. The covariance matrix Σ_p′ of the odometer's estimated position is determined by the following expression:

Σ_p′ = ∇_p f · Σ_p · ∇_p fᵀ + ∇_Δrl f · Σ_Δ · ∇_Δrl fᵀ    (7)

We assume that the initial covariance matrix Σ_p is known, and then the covariance matrix of the motion increment (Δs_r, Δs_l) is determined.

3.2 Gyro Model

θ(t) = θ_0 + ∫₀ᵗ ω_i dτ    (11)

Where θ_0 represents the initial attitude angle and ω_i represents the angular velocity output of the gyro.

There are two types of gyro error: scale error and offset error. Scale error is introduced by the proportional relationship between the input and output of the gyro, and belongs to regular drift error. Offset error is introduced by the external environment: the gyro generates a small nonzero output even with no input. This type of error belongs to random drift error, and its main causes are large-scale temperature variations and noise. In this paper, we assume that the measured values of the gyro obey a Gaussian white-noise distribution. We determine the model parameters by establishing a fitting error model with zero input, and then determine the variance Q(k).

The measured distance and scanning angle information are subject to a Gaussian probability density curve; the average value is the measured value and the variance is the independent
MATEC Web of Conferences
constant σ_ρ and σ_θ. From these points, we can estimate an optimal line. Any given measuring point (ρ, θ) can be converted to Euclidean coordinates through the two expressions x = ρ cos θ and y = ρ sin θ. A given straight line can be expressed by the following equation:

ρ cos θ cos α + ρ sin θ sin α = ρ cos(θ − α) = r    (12)

The orthogonal distance between the line and a specific point x_i = (ρ_i, θ_i) is:

ρ_i cos(θ_i − α) − r = d_i    (13)

Then we classify the data which belong to the same line into one category, and use least-squares fitting to generate a straight line. We calculate the fitted-line parameters α and r by the following expressions:

α = (1/2) · arctan[ ( Σ_i ρ_i² sin 2θ_i − (2/n) Σ_i Σ_j ρ_i ρ_j cos θ_i sin θ_j ) / ( Σ_i ρ_i² cos 2θ_i − (1/n) Σ_i Σ_j ρ_i ρ_j cos(θ_i + θ_j) ) ]    (14)

r = (1/n) Σ_i ρ_i cos(θ_i − α)    (15)

Where C_X is a given 2n × 2n input covariance matrix:

C_X = [ C_ρ  0 ; 0  C_θ ] = [ diag(σ_ρi²)  0 ; 0  diag(σ_θi²) ]    (17)

This covariance matrix gives the observation error covariance R_t,i in the observation equation.

4 PREDICTION OF ENVIRONMENTAL FEATURES

In multi-sensor fusion, the most important step is the match between predicted environmental features and observed environmental features. The predicted robot position p̂(k+1|k) generates the expected feature z_t,i. In the environmental map, the stored features are linear features, given as parameters in the environmental coordinate frame, whereas the extracted linear features are given in the robot's local coordinate frame. Therefore, we transform the predicted observation features from the environmental coordinate frame [W] to the robot local coordinate frame [R]. The following expression shows the transformation:

ẑ_i(k+1) = ( α_t,i^W − θ̂^W(k+1|k),
             r_t,i^W − ( x̂^W(k+1|k) cos α_t,i^W + ŷ^W(k+1|k) sin α_t,i^W ) )    (19)

From the predicted position of the robot, we extract environmental features in the robot's local coordinates. According to the EKF, the following observation equation can be obtained:

ẑ_i(k+1) = h_i(z_t, p̂(k+1|k)) + w_i(k)    (21)
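The line extraction of expressions (12)–(15) can be sketched as follows. One assumption is made for robustness: the paper's arctan in expression (14) is replaced by a sign-aware atan2, so that the error-minimizing branch of the half-angle is selected automatically.

```python
import numpy as np

def fit_line_polar(rho, theta):
    """Least-squares (alpha, r) line fit to polar points, expressions (14)-(15)."""
    rho, theta = np.asarray(rho, float), np.asarray(theta, float)
    n = len(rho)
    rc, rs = rho * np.cos(theta), rho * np.sin(theta)
    # Numerator and denominator of expression (14); the double sums over all
    # point pairs are written with outer products.
    num = np.sum(rho**2 * np.sin(2 * theta)) - (2.0 / n) * np.sum(np.outer(rc, rs))
    den = np.sum(rho**2 * np.cos(2 * theta)) - (1.0 / n) * np.sum(
        np.outer(rc, rc) - np.outer(rs, rs))
    alpha = 0.5 * np.arctan2(-num, -den)        # minimizing branch of (14)
    r = np.mean(rho * np.cos(theta - alpha))    # expression (15)
    return alpha, r
```

The residual of expression (13), ρ_i cos(θ_i − α) − r, can then be used to check which points belong to the fitted line.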
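The world-to-robot transformation of expression (19) can be sketched directly; the function name is illustrative.

```python
import math

def predict_feature(alpha_w, r_w, x_hat, y_hat, theta_hat):
    """Expression (19): transform a stored map line (alpha_w, r_w), given in
    world coordinates [W], into the robot frame [R] at the predicted pose
    (x_hat, y_hat, theta_hat) = p_hat(k+1|k)."""
    alpha_r = alpha_w - theta_hat
    r_r = r_w - (x_hat * math.cos(alpha_w) + y_hat * math.sin(alpha_w))
    return alpha_r, r_r
```

The difference between this predicted feature and the feature actually extracted from the laser scan is the innovation fed into the EKF update of the observation equation (21).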
ACKNOWLEDGEMENT
6 CONCLUSION