MATEC Web of Conferences 22, 01061 (2015)

DOI: 10.1051/matecconf/20152201061

© Owned by the authors, published by EDP Sciences, 2015

Localization of Wheeled Mobile Robot Based on Extended Kalman Filtering
Guangxu Li, Dongxing Qin & Hui Ju
Chengdu University of Information Technology, Chengdu, Sichuan, China

ABSTRACT: A mobile robot localization method which combines relative positioning with absolute orientation is presented. The code salver and gyroscope are used for relative positioning, and the laser radar is used to detect absolute orientation. In this paper, we establish the environmental map, the multi-sensor information fusion model, and the sensor and robot motion models. Extended Kalman Filtering (EKF) is adopted as the multi-sensor data fusion technique to realize precise localization of the wheeled mobile robot.

Keywords: Wheeled mobile robot, Positioning, Multi-sensor, EKF.

1 INTRODUCTION

With the development of technology, mobile robot technologies have made tremendous progress. Nowadays, however, people place higher demands on mobile robots in both function and performance because of increasingly complex environments and tasks; traditional robots are therefore evolving into intelligent robots. The intelligentization of robots poses new challenges to robot navigation. As the foundation of robot navigation, localization of mobile robots has received great attention. Because of its simple structure, high efficiency and controllability, the wheeled mobile robot is widely used and researched as an important branch of mobile robotics, and wheeled robot localization has become a hotspot in robotics research.

No single sensor can guarantee the accuracy, reliability and abundance of information because of its limited resources; therefore, a robot cannot locate itself precisely based on only one sensor. Multi-sensor inputs can solve the problem of insufficient data with redundant and supplemental information. By properly fusing the information from multiple sensors, the robot can obtain accurate geographic information. This technology has become a new research direction in sensor systems.

In this paper, we put forward a positioning method based on EKF with a feature extraction function. An odometer, a gyro and a laser radar are adopted as the main sensors to combine relative positioning with absolute orientation. A robot motion model is built by fusing odometer and gyro data. We establish the robot location observation model with environmental features obtained from the laser radar. By combining the observation model with the motion model, and then tracking environmental features with the EKF, we realize precise localization of the wheeled mobile robot.

2 SYSTEM PRINCIPLES

Figure 1 is the diagram of the robot positioning system with EKF. The diagram shows a recursive procedure. Location prediction, or motion update, is the first stage: we obtain odometer and gyro data by applying a Gaussian-error motion model directly to their measured values, and fuse these data to generate the predicted location. We then search the environmental map based on the predicted value to find the predicted observations that match this information, that is, to predict the environmental features and their locations as the laser radar should see them. The robot then compares the predicted values with the real values from the laser radar to find the best match. At last, we fuse the information from the best matching value with the EKF to update the prediction confidence, thus obtaining the best estimate of the robot location.

In this paper, we experiment on a differential-drive wheeled robot. Code salvers installed on the wheels serve as the odometer; together with the gyro they are used to estimate the robot location. Location errors increase over time because of the measurement errors of these two types of sensors. The external sensor, a laser radar installed on the robot, is the key element that eliminates these errors: it continually matches external conditions against the environmental map to obtain absolute location information. With this information, the robot can correct its location errors and overcome the growing cumulative error, thus realizing long-term precise localization.

3 SENSOR MODEL

3.1 Odometer Model

As an effective relative positioning sensor, the odometer has been widely used in the field of wheeled mobile robots. It estimates the changes of attitude by detecting

This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits
unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Article available at http://www.matec-conferences.org or http://dx.doi.org/10.1051/matecconf/20152201061



[Figure 1: block diagram of the positioning system. The odometer and gyroscope feed the position estimation fusion; the environmental map (database) supplies the observation prediction; the predicted value is matched against the observed value, i.e. the features extracted from the laser radar.]

Figure 1. System diagram
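The recursive procedure summarized in Figure 1 can be sketched as one generic EKF cycle. This is a simplified sketch with a linear motion model and an already-matched observation; the function and variable names are illustrative, not the authors' code:

```python
import numpy as np

def ekf_step(x, P, u, Q, z, h, H, R):
    """One cycle of the recursive loop of Figure 1 (linear motion model
    assumed for brevity): predict from the odometer/gyro input u, then
    update with a matched laser-radar observation z."""
    # 1. Location prediction (motion update) with control input u
    F = np.eye(len(x))
    x_pred = F @ x + u
    P_pred = F @ P @ F.T + Q
    # 2. Observation prediction and innovation against the matched feature
    y = z - h(x_pred)                     # innovation
    S = H @ P_pred @ H.T + R              # innovation covariance
    # 3. Fusion: Kalman gain and confidence (covariance) update
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

A single call with a consistent observation shrinks the covariance, which is exactly the "update prediction confidence" step of the diagram.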


radian changes over a certain time interval via the code salvers installed on the driving wheels. Generally speaking, the attitude of a mobile robot can be represented by the following vector:

p = [x, y, θ]ᵀ (1)

Figure 2 shows the motion model of the differential-drive robot. In a short sampling period Δt, the robot moves from position p to p′, and the change of attitude can be estimated by integrating the return values of the code salvers. The mobile robot path can be approximated by straight-line segments; therefore, the variation of position (Δx, Δy, Δθ) from p to p′ is:

Δx = Δs cos(θ + Δθ/2) (2)

Δy = Δs sin(θ + Δθ/2) (3)

Δθ = (Δsr − Δsl) / b (4)

Δs = (Δsr + Δsl) / 2 (5)

where Δsr and Δsl represent the moving distances of the right and left wheels respectively, and b represents the distance between the two driving wheels. Therefore, the updated position p′ is:

p′ = [x′, y′, θ′]ᵀ = p + [Δs cos(θ + Δθ/2), Δs sin(θ + Δθ/2), Δθ]ᵀ (6)

[Figure 2: robot moving from P(x, y, θ) to P′ in the world frame X-Y, with the robot frame XR-YR and path increment Δs.]

Figure 2. Motion model of the differential driving robot
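The pose update of expressions (2)-(6) can be sketched as follows (an illustrative sketch, not the authors' implementation):

```python
import math

def odometry_update(x, y, theta, dsr, dsl, b):
    """Update the pose (x, y, theta) from the wheel displacements dsr, dsl
    (right/left code salver readings) and wheelbase b, per (2)-(6)."""
    ds = (dsr + dsl) / 2.0                     # (5) mean travelled distance
    dtheta = (dsr - dsl) / b                   # (4) heading change
    x += ds * math.cos(theta + dtheta / 2.0)   # (2)
    y += ds * math.sin(theta + dtheta / 2.0)   # (3)
    theta += dtheta                            # (6), third component
    return x, y, theta
```

With equal wheel displacements the heading is unchanged and the robot advances along its current heading; with opposite displacements it turns in place.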

ICETA 2015

Expression (6) is the basic equation for updating the position value from the odometer. The increment vector (Δsr, Δsl) contains errors introduced by uncertain integration errors and by the approximate motion model. For this reason, we must establish an error model of the position p′. The covariance matrix Σp′ of the odometer's estimated position is determined by the following expression:

Σp′ = ∇p f · Σp · ∇p fᵀ + ∇Δrl f · ΣΔ · ∇Δrl fᵀ (7)

We assume that the initial covariance matrix Σp is known; the covariance matrix ΣΔ of the motion increment (Δsr, Δsl) is then:

ΣΔ = [ kr|Δsr|  0 ;  0  kl|Δsl| ] (8)

where kr and kl are error constants representing the uncertain parameters between the driving motors, the wheels and the ground. The specific values of kr and kl should be determined by experiments.

Two Jacobian matrices can be calculated from expression (6):

Fp = ∇p f = [ ∂f/∂x  ∂f/∂y  ∂f/∂θ ] = [ 1  0  −Δs sin(θ + Δθ/2) ;  0  1  Δs cos(θ + Δθ/2) ;  0  0  1 ] (9)

FΔrl = ∇Δrl f =
[ (1/2)cos(θ + Δθ/2) − (Δs/2b)sin(θ + Δθ/2)   (1/2)cos(θ + Δθ/2) + (Δs/2b)sin(θ + Δθ/2) ;
  (1/2)sin(θ + Δθ/2) + (Δs/2b)cos(θ + Δθ/2)   (1/2)sin(θ + Δθ/2) − (Δs/2b)cos(θ + Δθ/2) ;
  1/b   −1/b ] (10)
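Expressions (7)-(10) can be combined into a single covariance-propagation routine. This is a sketch assuming NumPy; the function name and arguments are illustrative, not the authors' code:

```python
import math
import numpy as np

def odometry_covariance(theta, dsr, dsl, b, Sigma_p, kr, kl):
    """Propagate the pose covariance through the odometry update,
    following expressions (7)-(10); kr, kl are the wheel error constants."""
    ds = (dsr + dsl) / 2.0
    dtheta = (dsr - dsl) / b
    a = theta + dtheta / 2.0
    # (9) Jacobian of the motion update w.r.t. the pose
    Fp = np.array([[1.0, 0.0, -ds * math.sin(a)],
                   [0.0, 1.0,  ds * math.cos(a)],
                   [0.0, 0.0,  1.0]])
    # (10) Jacobian w.r.t. the wheel increments (dsr, dsl)
    Frl = np.array([
        [0.5 * math.cos(a) - ds / (2 * b) * math.sin(a),
         0.5 * math.cos(a) + ds / (2 * b) * math.sin(a)],
        [0.5 * math.sin(a) + ds / (2 * b) * math.cos(a),
         0.5 * math.sin(a) - ds / (2 * b) * math.cos(a)],
        [1.0 / b, -1.0 / b]])
    # (8) covariance of the wheel increments, proportional to |ds|
    Sigma_d = np.diag([kr * abs(dsr), kl * abs(dsl)])
    # (7) propagated pose covariance
    return Fp @ Sigma_p @ Fp.T + Frl @ Sigma_d @ Frl.T
```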

After analyzing the procedure and the error model of the odometer, we can use them as the robot motion model. Together with a given control input u(k), we can predict the robot location and its uncertainty. The two parameters p′ and Σp′ represent the confidence status, and they can be assumed to obey a Gaussian distribution.

3.2 Gyro Model

The gyro is a kind of inertial device used to measure its carrier's angular velocity and rotation angle. The variation of the attitude angle is calculated by integrating the gyro's output, so a reference direction is needed for the gyro to accumulate its angular variation. At a certain time TS, the following expression gives the updated attitude angle:

θ = θ₀ + ∫₀^TS ωᵢ dt (11)

where θ₀ represents the initial attitude angle and ωᵢ represents the angular velocity output of the gyro.

There are two types of gyro error: scale error and offset error. Scale error is introduced by the proportional relationship between the input and output of the gyro, and belongs to regular drift error. Offset error is introduced by the external environment, which means that the gyro generates a limited nonzero output even with no input. This type of error belongs to random drift error, and its main causes are large-scale temperature variations and noise. In this paper, we assume that the measured values of the gyro obey Gaussian white noise. We determine the model parameters by establishing a fitting error model with zero input, and then determine the variance Q(k).

3.3 Laser Radar Model

Laser radar, also called a laser ranger, has the same principles as an ultrasonic ranger and belongs to the class of active ranging instruments. It scans the surrounding environment on its scanning plane at a given resolution, obtaining the distance information ρ and the scanning angle information θ of the measured points in the environment. These points can reflect the basic features of the environment. The features extracted from the laser radar are linear features, and the number of measurements in a single scan is greater than the number of linear feature parameters to be estimated. Due to the measuring errors of the sensors, feature extraction needs an optimizing algorithm to minimize the differences between the estimated and measured values.

[Figure 3: a line measured as polar points (ρᵢ, θᵢ), with the robot frame [R] at P(x, y, θ) and the world frame [W] at origin O.]

Figure 3. The least-squares estimated line, and the transformation from environment coordinate [W] to robot local coordinate [R]

Figure 3 shows n measuring points xᵢ = (ρᵢ, θᵢ) in the polar coordinates of the robot sensors. We assume that the measuring errors of the distance information ρ and the scanning angle information θ obey Gaussian probability densities; the average value is the measured value and the variances are the independent


constants σρ and σθ. From these points we can estimate an optimal line. Any measuring point (ρ, θ) can be converted to Euclidean coordinates through x = ρ cos θ and y = ρ sin θ. A given straight line can be expressed by the following equation:

ρ cos θ cos α + ρ sin θ sin α = r, i.e. ρ cos(θ − α) − r = 0 (12)

The orthogonal distance between the line and a specific point xᵢ = (ρᵢ, θᵢ) is:

dᵢ = ρᵢ cos(θᵢ − α) − r (13)

We then classify the data belonging to the same line into one category, and use least-squares fitting to generate a straight line. The fitted line parameters α and r are calculated by the following expressions:

α = (1/2) arctan[ ( Σᵢ ρᵢ² sin 2θᵢ − (2/n) Σᵢ Σⱼ ρᵢ ρⱼ cos θᵢ sin θⱼ ) / ( Σᵢ ρᵢ² cos 2θᵢ − (1/n) Σᵢ Σⱼ ρᵢ ρⱼ cos(θᵢ + θⱼ) ) ] (14)

r = (1/n) Σᵢ ρᵢ cos(θᵢ − α) (15)

The uncertainty of the laser radar measurements amplifies the uncertainty of the extracted line. Let the random variables A and R represent the outputs α and r; the covariance matrix of the system output is:

C_AR = [ σ²_A  σ_AR ;  σ_AR  σ²_R ] = F_ρθ C_X F_ρθᵀ (16)
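Expressions (12)-(15) admit the following closed-form sketch. `atan2` is used instead of `arctan` to pick the minimizing branch of (14), the double sums are factored into products of single sums, and all names are illustrative rather than the authors' code:

```python
import math

def fit_line_polar(rho, theta):
    """Least-squares line (alpha, r) through polar points (rho_i, theta_i),
    following expressions (14)-(15); the line satisfies
    rho * cos(theta - alpha) = r as in (12)."""
    n = len(rho)
    sc = sum(r * math.cos(t) for r, t in zip(rho, theta))       # sum rho_i cos(theta_i)
    ss = sum(r * math.sin(t) for r, t in zip(rho, theta))       # sum rho_i sin(theta_i)
    s2 = sum(r * r * math.sin(2 * t) for r, t in zip(rho, theta))
    c2 = sum(r * r * math.cos(2 * t) for r, t in zip(rho, theta))
    # (14): the double sums factor as sc*ss and sc^2 - ss^2;
    # atan2 selects the branch that minimizes the squared distances (13)
    num = (2.0 / n) * sc * ss - s2
    den = (1.0 / n) * (sc * sc - ss * ss) - c2
    alpha = 0.5 * math.atan2(num, den)
    # (15): mean orthogonal offset of the points from the origin
    r_line = sum(ri * math.cos(ti - alpha) for ri, ti in zip(rho, theta)) / n
    return alpha, r_line
```

For three points on the vertical line x = 1, the fit returns α = 0 and r = 1, matching the line model of (12).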
where C_X is the given 2n × 2n input covariance matrix:

C_X = [ C_ρ  0 ;  0  C_θ ] = [ diag(σ²_ρᵢ)  0 ;  0  diag(σ²_θᵢ) ] (17)

and F_ρθ is the Jacobian matrix:

F_ρθ = [ ∂α/∂ρ₁ … ∂α/∂ρₙ  ∂α/∂θ₁ … ∂α/∂θₙ ;  ∂r/∂ρ₁ … ∂r/∂ρₙ  ∂r/∂θ₁ … ∂r/∂θₙ ] (18)

Therefore, considering the laser radar data errors, we establish the sensor model according to the environmental features of the extracted lines. For every line extracted from the sensor data, there exists a pair of corresponding parameters (α, r) and an error covariance matrix C_AR. This covariance matrix is the observation error covariance Σ_R,i in the observation equation.

4 PREDICTION OF ENVIRONMENTAL FEATURES

In multi-sensor fusion, the most important step is the match between predicted and observed environmental features. The predicted robot position p̂(k+1|k) generates the expected feature z_t,i. In the environmental map, the stored features are linear features given as environmental (world) coordinate parameters, whereas the extracted linear features are given in robot local coordinates. Therefore, we transform the predicted observation features from the environmental coordinate frame [W] to the robot local coordinate frame [R]. The following expression shows the transformation:

ẑᵢ(k+1) = [ Wα_t,i − Wθ̂(k+1|k) ;  Wr_t,i − ( Wx̂(k+1|k) cos(Wα_t,i) + Wŷ(k+1|k) sin(Wα_t,i) ) ] (19)

The Jacobian matrix ∇hᵢ is given by the following expression:

∇hᵢ = [ ∂α_t,i/∂x̂  ∂α_t,i/∂ŷ  ∂α_t,i/∂θ̂ ;  ∂r_t,i/∂x̂  ∂r_t,i/∂ŷ  ∂r_t,i/∂θ̂ ] = [ 0  0  −1 ;  −cos(Wα_t,i)  −sin(Wα_t,i)  0 ] (20)

From the predicted position of the robot, we extract the environmental features in robot local coordinates. According to the EKF, the following observation equation can be obtained:

ẑᵢ(k+1) = hᵢ(z_t, p̂(k+1|k)) + wᵢ(k) (21)

where wᵢ(k) represents the observation errors of the sensors.

5 EXPERIMENT

The experimental platform is an autonomous mobile robot based on the 32-bit ARM Cortex-M3 microcontroller LM3S8962 and a DSP coprocessor. According to our model, the robot is fitted with internal sensors (a photoelectric encoder and a gyro) and an external environmental sensor (a laser radar). The laser radar is a SICK LMS200, whose scanning scope ranges from 0° to 180°; it returns the distance and angle information of the measured points.
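The world-to-robot line transformation of expression (19), together with the Mahalanobis gating used to pair predicted and observed features, can be sketched as follows (illustrative helper names and a conventional chi-square-style gate threshold, not the authors' code):

```python
import math
import numpy as np

def predict_line_observation(x, y, theta, alpha_w, r_w):
    """Transform a map line (alpha_w, r_w) from the world frame [W]
    into the robot frame [R], following expression (19)."""
    alpha_r = alpha_w - theta
    r_r = r_w - (x * math.cos(alpha_w) + y * math.sin(alpha_w))
    return alpha_r, r_r

def mahalanobis_gate(z_obs, z_pred, S, gate=9.0):
    """Accept an observed/predicted feature pair when the squared
    Mahalanobis distance of the innovation is below the gate threshold."""
    v = np.asarray(z_obs) - np.asarray(z_pred)
    d2 = float(v @ np.linalg.inv(S) @ v)
    return d2 <= gate, d2
```

With the robot at the world origin and zero heading, the transformation is the identity, and a small innovation passes the gate.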


[Figure 4: sketch of the planned robot path from A through B to C in world coordinates X-Y, with the robot frame XR-YR shown along the path.]

Figure 4. Robot path

As shown in Figure 4, the experiment is conducted in a corridor. Point O is the origin of the environmental map; the robot is planned to move from the original position A to the terminal point C, approximately 20 meters away. The robot's speed is 1 m/s, the sampling time is 0.5 s, and the estimated starting point A(x, y, θ) is (0, 0, 0). In practice, point A deviates from the origin to some extent in order to verify the convergence of the algorithm. The robot uses the EKF to locate itself during the procedure. At point B, we give the predicted line (thin line), the measured line (thick line) and the updated estimated line (thick dashed line) after data fusion. The prediction, measurement and latest estimate of the robot location are uncertain, so the location of the robot is drawn as an ellipse. The predicted and observed features are judged and matched through the Mahalanobis distance.

The robot uses a linear control algorithm for path tracking. Figure 5 shows the whole route. The robot reaches the destination as required; the mean-square deviation of every location is about 2 centimeters, and the heading angle error is less than 0.5°. The experimental results show that the sensor fusion system has improved robot navigation.

[Figure 5: actual robot path recorded in the experiment, from A to C in world coordinates X-Y.]

Figure 5. Actual robot path in experiment

6 CONCLUSION

In this paper, we overcome the disadvantages of the system error expression methods of previous single-sensor or multi-sensor systems by adopting the Extended Kalman Filter as the mobile robot localization method. When applied to actual robot navigation and location, the results show that the method can improve the precision and reliability of robot localization.

ACKNOWLEDGEMENT

This paper is supported by the Chengdu University of Information Engineering School Fund (379116).

REFERENCES

[1] Kwon, S. & Yang, K.W. 2006. An Effective Kalman Filter Localization Method for Mobile Robots. Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, October 9-15, pp. 1524-1529.
[2] Jetto, L., et al. 1999. Development and Experimental Validation of an Adaptive Extended Kalman Filter for the Localization of Mobile Robots. IEEE Transactions on Robotics and Automation, 15(2): 219-229.
[3] Lin, Y.Z., Huang, Y.M. & Shi, E.X. 2004. Application of a data fusion algorithm based on Kalman filter in mobile robot position measuring systems. Proceedings of the 5th World Congress on Intelligent Control and Automation. New York: IEEE Press, pp. 4956-4959.
[4] Borenstein, J., Everett, H.R. & Feng, L. 1996. Navigating Mobile Robots: Systems and Techniques. Natick, MA: A.K. Peters, Ltd.
[5] Borenstein, J., Everett, H.R. & Feng, L. Where Am I? Sensors and Methods for Mobile Robot Positioning. Ann Arbor: University of Michigan.
[6] Arras, K.O. & Tomatis, N. 1999. Improving Robustness and Precision in Mobile Robot Localization by Using Laser Range Finding and Monocular Vision. Proceedings of the 3rd European Workshop on Advanced Mobile Robots (Eurobot '99), Zurich, September 6-9.
[7] Arras, K.O. & Tomatis, N. 1997. Feature Extraction and Scene Interpretation for Map-based Navigation and Map Building. Proceedings of SPIE, Mobile Robotics XII, 3210: 42-53.
[8] Kanayama, Y. 1996. A stable tracking control method for an autonomous mobile robot. Proceedings of the IEEE International Conference on Robotics and Automation. 1912: 47-61.
