
PROCEEDINGS

IEEE Asia-Pacific World Congress on Computer Science and Engineering 2014
(APWC on CSE 2014)

4-5 November 2014, Plantation Island, Fiji

http://i-lab.org.au/conference
Vision Based Autonomous Path Tracking of a
Mobile Robot using Fuzzy Logic

Edwin Vans
Dept. of Electrical & Electronic Eng.
Fiji National University
Derrick Campus, Suva, Fiji Islands
Email: vans.edw@gmail.com

Gancho Vachkov
School of Engineering & Physics
University of the South Pacific
Laucala Campus, Suva, Fiji Islands
Email: gancho.vachkov@gmail.com

Alok Sharma
School of Engineering & Physics
University of the South Pacific
Laucala Campus, Suva, Fiji Islands
Email: alok.fj@gmail.com

Abstract—In this paper we present an algorithm for autonomous path tracking of a mobile robot to track straight and curved paths traced in the environment. The algorithm uses a fuzzy logic based approach for path tracking so that human driving behavior can be emulated in the mobile robot. The method combines a fuzzy steering controller, which controls the steering angle of the mobile robot for path tracking, and a fuzzy velocity controller, which controls the forward linear velocity of the mobile robot for safe path tracking. The inputs to the fuzzy system are given by the vision system of the mobile robot. A camera is used to capture images of the path ahead of the mobile robot, and the vision system determines the lateral offset, heading error and the curvature of the path. We perform experiments using a mobile robot platform. In the first experiment the mobile robot is able to successfully track a straight path. This shows the effectiveness of the fuzzy steering controller. We also perform experiments on paths containing curved sections. The fuzzy velocity controller was able to command an appropriate speed for safe tracking of the path ahead of the robot. The effectiveness of the fuzzy velocity controller is shown in this experiment.

I. INTRODUCTION

In the area of mobile robot navigation, there exist many problems that robotics researchers have addressed using different techniques. One such problem is the path tracking problem, where the mobile robot navigates by following a path traced on the ground in the robot environment. The path tracking problem is most relevant to mobile robots in industrial and manufacturing environments as well as to mobile robots in agricultural applications [1].

Several papers have addressed the problem of path tracking for nonholonomic mobile robots, e.g., [2], in which the authors have attempted the trajectory tracking problem by designing an adaptive tracking controller, and [3], in which the authors have successfully used the backstepping technique. Another interesting approach is the sliding mode control approach to tracking [4], in which the authors have represented the kinematics of a nonholonomic mobile robot in polar coordinates. Recently the mobile robot path tracking problem has also been addressed using vision based approaches [5]–[9], that is, using a vision sensor. Vision sensors can provide extra information about the characteristics of the path ahead of the mobile robot, instead of just determining whether the mobile robot is to the left or right of the path as with infrared sensors. In [7], the mobile robot approaches a target point whereby the coordinate of the point is extracted from the image of the path.

Mobile robot path tracking can be compared to driving a car along a road. The road consists of straight and curved segments or bends. The driver has to control two parameters of the vehicle in order to drive safely along the road. One of these parameters is the speed of the vehicle, which can be controlled by using the accelerator and the brake. The other parameter is the heading of the vehicle, which can be controlled using the steering wheel. The experienced driver (expert) knows how to appropriately control the speed and steering of the vehicle based on the road ahead of the vehicle. If the car approaches a bend of some constant curvature, then a constant amount of steering must be applied to track the bend. The authors in [10] referred to this as exception handling. On a straight road, however, the driver may have to make fine adjustments to the steering to keep the car on track, and may increase the speed of the car. This was referred to as compensation control in [10].

An expert driver may make decisions based on a vague set of rules such as "if aggressive steering is required then reduce speed", "if gentle steering is required, then increase speed" or "if the bend is narrow then reduce speed". To handle such vagueness in information, perhaps the most appropriate solution to the path tracking problem is a fuzzy logic based approach. In fact, research in autonomous highway driving has also considered fuzzy logic based approaches. The authors in [10] have used a fuzzy logic controller to control the steering of a car for autonomous highway driving. The dynamic model of the mobile robot is highly non-linear and it is further affected by undesired effects such as wheel slippage, gear backlash and sensor noise. A fuzzy logic based controller is well suited for non-linear control applications as the dynamic model of the plant is not required. For this reason, and because a fuzzy logic based controller is easier to design and implement, many authors, e.g., in [9], have used this approach for path tracking of mobile robots. The authors in [11] used a fuzzy inference system (FIS) to control the advancing velocity of the mobile robot to successfully track a path.

This paper focuses on the mobile robot navigation problem, in particular solving the problem of autonomously tracking a continuous path traced on the ground in the mobile robot environment. We explore a fuzzy logic based approach mainly because human driving behavior can be emulated using simple fuzzy control rules and because no explicit dynamics model of the mobile robot is required in this approach. In our approach the images of the path ahead of the robot are captured using a camera. The characteristics of the path ahead of the robot are determined by the vision system of the robot. The proposed method allows the mobile robot to accurately and safely track the path ahead of it by controlling the steering angle and forward linear speed of the robot.

Fig. 1. Top view sketch of the differential drive robot

Fig. 2. Sketch showing the parameters of the path used in the proposed algorithm


II. MODELING

A. Mobile Robot Model

The proposed method is developed for a differential drive mobile robot. Fig. 1 shows a sketch of the top view of the mobile robot and all the relevant parameters of the robot. The mobile robot has two active wheels at the front, powered independently, and a free wheeling swivel caster to support the mobile platform at the rear of the robot. The mobile robot is able to steer towards the left or the right by differentially varying the left and the right wheel velocities. If the left and the right wheels turn in tandem, then the mobile robot moves in a straight line.

The kinematic equations describing the model are given in (1). We assume that the masses and the inertia of the wheels are negligible. The position (x, y) of the mobile robot in the world cartesian coordinate system is taken from the middle of the axis connecting the centers of the two front wheels. We assume also that this is the center of mass of the mobile robot. v is the magnitude of the forward linear velocity of the mobile robot at the center of mass, θ is the orientation of the mobile robot with respect to the positive horizontal axis of the world coordinate system and ω is the angular velocity of the mobile robot about a point on the axis connecting the centers of the two front wheels, e.g., point P.

ẋ = v cos(θ)
ẏ = v sin(θ)
θ̇ = ω                                             (1)

The mobile robot is subject to a nonholonomic constraint. As described by the equation in (2), the mobile robot is unable to move laterally along the axis connecting the centers of the two front wheels of the mobile robot.

ẋ sin(θ) − ẏ cos(θ) = 0                            (2)

The motion of the mobile robot can be controlled by the parameters v and ω, which are the linear and angular velocities of the mobile robot respectively. However, for mobile robot navigation problems such as path tracking, a more appropriate variable to control is the steering angle instead of the angular velocity. Thus, we define the steering angle, ϕ, as the angle the rear caster makes with respect to the lateral center of the mobile robot, a distance l away from the axis of the center of mass, as shown in Fig. 1. The steering angle and angular velocity of the robot are related through the equation in (3), where ϕ is subject to the kinematic limit −π/2 < ϕ < π/2.

ω = (v / l) tan(ϕ)                                 (3)

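To make (1) and (3) concrete, the following minimal Python sketch converts a steering angle into the corresponding angular velocity and integrates the kinematic model with a simple Euler step. The function names, the choice of integrator and the numerical example are ours, not part of the paper.

```python
import math

def steering_to_angular_velocity(v, phi, l):
    """Eq. (3): angular velocity implied by steering angle phi (rad),
    forward speed v (m/s) and caster distance l (m)."""
    return (v / l) * math.tan(phi)

def step_kinematics(x, y, theta, v, omega, dt):
    """Eq. (1) integrated with a simple Euler step (our choice of integrator)."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Example: drive for 1 s at 0.2 m/s with a constant 10-degree steering angle.
x, y, theta = 0.0, 0.0, 0.0
v, l, dt = 0.2, 0.30, 0.05          # l = 0.30 m and v = 0.2 m/s as reported in Section IV
phi = math.radians(10.0)
for _ in range(int(1.0 / dt)):
    omega = steering_to_angular_velocity(v, phi, l)
    x, y, theta = step_kinematics(x, y, theta, v, omega, dt)
print(x, y, theta)
```
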
B. Vision Dynamics

Our proposed path tracking method is for both straight and curved paths of unknown curvature traced on the ground in the mobile robot environment. The top view of the mobile robot and the path is shown in Fig. 2, where y is the lateral position of the center of the path with respect to the mobile robot and x is the distance ahead of the mobile robot in the direction the mobile robot is heading. The thick line represents the path traced on the ground. The vision system of the robot is used to extract the line traced on the ground from images of the path ahead of the robot captured by a camera. From the extracted path, the vision system determines the lateral offset el, which is the distance between the lateral center of the mobile robot and the center of the path at a distance of L ahead of the mobile robot. The heading error eh is the difference between the current heading of the mobile robot and the direction of the tangent line to the path L distance away from the mobile robot. The vision system is also used to extract the curvature, c, of the bend in the path ahead of the mobile robot. The lateral and heading errors and the curvature of the path ahead of the robot are used as inputs to our proposed algorithm.

III. PROPOSED PATH TRACKING METHOD

The proposed method combines a fuzzy steering controller with a fuzzy velocity controller. The steering controller controls the steering angle of the mobile robot, while the velocity controller controls the forward linear velocity of the mobile robot for safe tracking of a continuous path traced on the ground ahead of the robot. A concise block diagram showing the proposed mobile robot path tracking method is given in Fig. 3. A camera captures the images of the path ahead of the robot that is to be tracked. The captured path is relative to the mobile robot since the camera is fixed to the mobile robot. The vision system then determines, using image processing techniques, the lateral error and heading error of the mobile robot. The lateral error is in meters whereas the heading error is in radians. These errors are inputs to the fuzzy steering controller. The vision system of the mobile robot also estimates the curvature of the path ahead of the robot. The estimated curvature is one of the inputs to the fuzzy velocity controller. The outputs of the fuzzy steering controller and fuzzy velocity controller are the steering angle and forward linear velocity respectively. It can be noted that the outputs of the two fuzzy controllers are reference inputs to the low level controller, which regulates the left and the right wheel velocities of the mobile robot to achieve the reference linear velocity and steering angle. The following subsections give details about the design of the fuzzy steering and fuzzy velocity controllers.

Fig. 3. Block diagram representation of the proposed path tracking method

A. Fuzzy Steering Control

The fuzzy steering controller controls the steering angle of the mobile robot to accurately track the path traced on the ground in the mobile robot environment. The inputs to the fuzzy steering controller are the lateral and heading errors defined in the previous section. Both errors are combined in (4) to get the tracking error et of the mobile robot, where ēl is the normalized lateral error and ēh is the normalized heading error. α is a parameter which controls the sensitivity of the two errors.

et(k) = α ēl(k) + (1 − α) ēh(k)                    (4)

Using the tracking error in (4), we define the two linguistic variables for our fuzzy system as error and change of error. To coherently explain our proposed method we will use the notation e1(k) for error and e2(k) for change of error at the k-th discrete time step, given as

e1(k) = et(k)
e2(k) = e1(k) − e1(k − 1)                          (5)

The fuzzy controller has three computational steps, which are fuzzification, inference and defuzzification. In the fuzzification step, the two input variables e1(k) and e2(k) are fuzzified into degrees of membership of the two sets, positive (P) and negative (N). The two linguistic values, negative and positive, denote that the error and change of error can be either negative or positive. The membership function plot, which is identical for error and change of error, is given in Fig. 4. A combination of triangle and trapezoidal membership functions is used for simplicity of controller design and ease of computation of membership values. The parameter M is a design parameter that must be selected carefully.

Fig. 4. Membership function plots for error e1(k) and change in error e2(k)

The degree of membership in the sets positive and negative for each input can be computed as

µP(ej) = 0 for ej < −M;  (ej + M)/(2M) for −M ≤ ej ≤ M;  1 for ej > M      (6)

and

µN(ej) = 1 for ej < −M;  (−ej + M)/(2M) for −M ≤ ej ≤ M;  0 for ej > M     (7)

where j ∈ {1, 2}.

We use the Takagi-Sugeno (TS) type fuzzy controller structure for our steering controller, as it is widely used in control applications and because of its computational advantages. The fuzzy rules for the TS type fuzzy controller are in the IF...THEN form

Ri: IF e1(k) is A1i AND e2(k) is A2i THEN ui(k) = a0i + a1i e1(k) + a2i e2(k)      (8)

where A1i, A2i ∈ {N, P} are the input fuzzy sets and i = 1, 2, ..., N, where N is the number of fuzzy rules. For our fuzzy steering controller, there are two inputs and each input has two fuzzy sets. Thus, the total number of rules is N = 4. The TS fuzzy rules in (8) suffer from a large number of parameters which need to be selected carefully or tuned for good performance.

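As a concrete illustration of the input side of the controller, the tracking error (4)-(5) and the membership functions (6)-(7) translate almost directly into code. This is only a sketch; the function names are our own, and α = 0.85 is the value reported later in Section IV.

```python
def tracking_error(e_l_norm, e_h_norm, alpha=0.85):
    """Eq. (4): combined tracking error from the normalized lateral and
    heading errors."""
    return alpha * e_l_norm + (1.0 - alpha) * e_h_norm

def mu_positive(e, M=1.0):
    """Eq. (6): degree of membership of e in the fuzzy set Positive."""
    if e < -M:
        return 0.0
    if e > M:
        return 1.0
    return (e + M) / (2.0 * M)

def mu_negative(e, M=1.0):
    """Eq. (7): degree of membership in Negative (complement of (6))."""
    return 1.0 - mu_positive(e, M)

# Change of error, Eq. (5), is the difference of consecutive samples.
e1_prev = 0.0
e1 = tracking_error(0.3, -0.1)
e2 = e1 - e1_prev
print(e1, e2, mu_positive(e1), mu_negative(e1))
```
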
We use the simplified version of the TS fuzzy rules [12], where it is assumed that a0i = 0, and the consequent part of the fuzzy rules is simplified as

u1(k) = p1 (a1 e1(k) + a2 e2(k))
u2(k) = p2 u1(k)
u3(k) = p3 u1(k)
u4(k) = p4 u1(k)                                   (9)

This reduces the number of parameters to be selected to six, which are p1, p2, p3, p4, a1 and a2.

In the inference step the antecedent parts of the fuzzy rules are combined to compute the firing strength of each rule. We use the algebraic product for the fuzzy AND operation to compute the firing strength of each rule as

Φi = µA1i(e1(k)) µA2i(e2(k))                       (10)

where Φi is the firing strength of rule i, µA1i is the degree of membership of input e1(k) in set A1i and µA2i is the degree of membership of input e2(k) in set A2i. The crisp steering angle output is computed in the defuzzification step. We use the popular weighted average method given in (11) to compute the crisp value of the steering angle. It can be noted that, due to the structure of the membership functions for error and change of error, the denominator in (11) is always one. The structure of the fuzzy steering controller is similar to the fuzzy PD controller with variable proportional and derivative gains [13].

ϕ(k) = ( Σ_{i=1..N} Φi ui(k) ) / ( Σ_{i=1..N} Φi )                  (11)

The values of the parameters of the fuzzy steering controller are chosen as follows. We choose M = 1 since the error and change of error can be normalized in the range [−1, 1]. The parameters are p1 = p4 = 1 and p2 = p3 = 0. The explanation behind this choice is as follows. The choice of these values resulted in the simplest steering angle control rules. If error and change of error are both positive, then it means that the path is to the left of the robot and the robot is moving further towards the right, away from the path. Therefore, the controller output must approach the positive maximum value of the steering angle to steer the robot towards the left to track the path. Similarly, if error and change of error are both negative, then it means that the path is to the right of the robot and the robot is moving further away from the path towards the left. In this case the controller output must approach the negative maximum value of the steering angle to steer the robot towards the right to track the path. However, if error is positive and change of error is negative, then the controller steering angle output must approach zero as the mobile robot is in the process of moving towards the path. The same is the case if error is negative and change of error is positive. It must be noted that human driving behavior is reflected in the four cases discussed here. The fuzzy rule base given in Table I shows all the N = 4 possible rules of the fuzzy inference system. We set the parameters a1 = 9.00 and a2 = −0.02. This is based on several experiments. The four simple control rules can be visually seen in the three dimensional non-linear control surface in Fig. 5.

TABLE I. TS FUZZY RULE BASE

Rule No. (i)   e1(k)      e2(k)      ui(k)
1              Positive   Positive   u1(k) = a1 e1(k) + a2 e2(k)
2              Positive   Negative   u2(k) = 0
3              Negative   Positive   u3(k) = 0
4              Negative   Negative   u4(k) = a1 e1(k) + a2 e2(k)

Fig. 5. Non-linear control surface of the steering fuzzy controller

The optimal values of the parameters a1 and a2 can be obtained by using an offline optimization algorithm. In such a case, a dynamic model of the mobile robot needs to be made and a suitable method of optimization needs to be selected. There are other factors: the optimization criterion or fitness function needs to be defined, and constraints need to be set, as tuning the fuzzy controller parameters is a constrained optimization problem. In addition, the optimization method itself has several parameters that need to be selected carefully. Optimization of the controller parameters requires experience and time, and is outside the scope of this work. However, we can confirm that the parameters that we selected resulted in a stable controller.

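The complete steering computation of (9)-(11), with the rule base of Table I and the reported parameter values, can be sketched as follows. This is our Python reconstruction, not the authors' MATLAB implementation; the membership functions repeat the earlier sketch.

```python
def mu_pos(e, M=1.0):
    return min(max((e + M) / (2.0 * M), 0.0), 1.0)   # Eq. (6)

def mu_neg(e, M=1.0):
    return 1.0 - mu_pos(e, M)                        # Eq. (7)

def fuzzy_steering(e1, e2, a1=9.00, a2=-0.02, M=1.0):
    """Simplified TS steering controller of Section III-A.
    Only the (Positive, Positive) and (Negative, Negative) rules of Table I
    have a non-zero consequent, since p1 = p4 = 1 and p2 = p3 = 0."""
    u = a1 * e1 + a2 * e2                            # shared consequent of rules 1 and 4, Eq. (9)
    rules = [
        (mu_pos(e1, M) * mu_pos(e2, M), u),          # firing strengths by product, Eq. (10)
        (mu_pos(e1, M) * mu_neg(e2, M), 0.0),
        (mu_neg(e1, M) * mu_pos(e2, M), 0.0),
        (mu_neg(e1, M) * mu_neg(e2, M), u),
    ]
    num = sum(w * ui for w, ui in rules)
    den = sum(w for w, _ in rules)                   # equals one for these memberships
    return num / den                                 # weighted average, Eq. (11)

# A positive error and positive change of error call for a strong corrective steer.
print(fuzzy_steering(0.5, 0.2))
```
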
Fig. 6. Membership function for steering

Fig. 7. Membership function for bend

B. Fuzzy Velocity Control

The fuzzy velocity controller compensates for an increase in the angular velocity above the kinematic limit of the mobile robot by appropriately reducing the forward linear velocity of the mobile robot. There are two inputs to the fuzzy velocity controller, which are steering and bend. The linguistic variable steering refers to the amount of steering angle applied by the fuzzy steering controller. The steering angle of the mobile robot can be used to determine the angular velocity of the robot at the current forward velocity using the equation in (3). A kinematic limit on the angular velocity ωmax of the mobile robot, such that |ω| ≤ ωmax, imposes an upper limit on the steering angle ϕ, such that |ϕ| ≤ ϕmax, to maintain the angular velocity limit of the mobile robot at the current forward linear velocity.

ωmax = (v / l) tan(ϕmax)                           (12)

However, the fuzzy steering controller may command a greater steering angle to track an upcoming bend in the mobile robot path. The linguistic variable steering then refers to the amount by which the commanded steering angle exceeds ϕmax, and is given by ∆ϕ in (13).

∆ϕ = |ϕ| − ϕmax                                    (13)

If the output of the fuzzy steering controller, ϕ, is equal in magnitude to ϕmax, then ∆ϕ = 0. Therefore, positive values of ∆ϕ indicate that aggressive steering is applied by the steering controller and it will result in the vehicle exceeding the angular velocity limit at the current linear velocity v. Therefore, the velocity controller must compensate by reducing the linear velocity to keep the angular velocity of the mobile robot within the safe kinematic limits. The membership function for the input variable steering is shown in Fig. 6. It takes on the linguistic values gentle, appropriate and aggressive to indicate the three levels of steering applied by the fuzzy steering controller.

A second input to the fuzzy velocity controller is an estimate of the normalized curvature of the upcoming bend ahead of the vehicle. The linguistic variable bend refers to the value of the normalized curvature of the upcoming bend, and it can have the linguistic values wide, medium and narrow. The normalized curvature, c̄, is given as

c̄ = |c̃| / cmax                                    (14)

where cmax is the maximum curvature of the bend which can be tracked by the mobile robot, and c̃ is the estimated curvature given by the vision system of the robot. The maximum curvature of the path that the mobile robot can track at its current velocity v is determined by the angular velocity limit ωmax of the mobile robot, given as

cmax = ωmax / v                                    (15)

The membership function of the bend input is given in Fig. 7. A wide bend will have a very small curvature value, approaching zero, while a narrow bend will have a curvature approaching cmax.

We defined the output of the fuzzy velocity controller as the velocity scaling factor vsf, which is in the range [0.2, 1.0] and is used to scale the forward linear velocity of the mobile robot. Three singletons are used as outputs of the velocity controller. The linguistic values representing the output are low, medium and high. The chosen numerical constant values of the three linguistic values are 0.2, 0.5 and 1.0 respectively. The rule base of the fuzzy velocity controller is given in Table II. The rule base was designed to emulate human driving behavior such as "If steering is aggressive and bend is narrow, then speed is low". The fuzzy velocity controller has two inputs with three fuzzy sets each. Thus, there is a total of nine rules. The fuzzy AND product operation is used to combine the membership degrees of the antecedent parts of the rules to obtain the firing strength of each rule. Defuzzification is by the use of the weighted average method to obtain the crisp vsf output from the fuzzy system.

TABLE II. FUZZY RULE BASE OF THE VELOCITY CONTROLLER (output: vsf)

bend (c̄) \ steering (∆ϕ)   Gentle    Appropriate   Aggressive
Wide                        high      high          medium
Medium                      high      medium        low
Narrow                      medium    low           low

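A possible Python sketch of the velocity controller described above follows. The rule base and the output singletons follow Table II and the text; the exact breakpoints of the triangular membership functions in Figs. 6 and 7 are not given numerically in the paper, so the values below are our assumptions.

```python
def tri(x, a, b, c):
    """Triangular membership with peak at b. The breakpoints used below are
    our guesses from Figs. 6 and 7, not values given in the paper."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def velocity_scaling_factor(delta_phi, c_bar):
    """Fuzzy velocity controller of Section III-B (sketch).
    delta_phi: normalized steering excess (Eq. 13); c_bar: normalized curvature (Eq. 14)."""
    steering = {                                   # memberships of the 'steering' input
        "gentle":      tri(delta_phi, -2.0, -1.0, 0.0),
        "appropriate": tri(delta_phi, -1.0,  0.0, 1.0),
        "aggressive":  tri(delta_phi,  0.0,  1.0, 2.0),
    }
    bend = {                                       # memberships of the 'bend' input
        "wide":   tri(c_bar, -0.5, 0.0, 0.5),
        "medium": tri(c_bar,  0.0, 0.5, 1.0),
        "narrow": tri(c_bar,  0.5, 1.0, 1.5),
    }
    singleton = {"low": 0.2, "medium": 0.5, "high": 1.0}   # output singletons from the text
    rule_base = {                                  # Table II
        ("gentle", "wide"): "high",     ("appropriate", "wide"): "high",     ("aggressive", "wide"): "medium",
        ("gentle", "medium"): "high",   ("appropriate", "medium"): "medium", ("aggressive", "medium"): "low",
        ("gentle", "narrow"): "medium", ("appropriate", "narrow"): "low",    ("aggressive", "narrow"): "low",
    }
    num = den = 0.0
    for (s, b), out in rule_base.items():
        w = steering[s] * bend[b]                  # product AND
        num += w * singleton[out]
        den += w
    return num / den if den > 0 else 1.0           # weighted average defuzzification

# Aggressive steering into a narrow bend gives a strong speed reduction.
print(velocity_scaling_factor(0.8, 0.9))
```
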
IV. EXPERIMENTAL RESULTS

A. Setup

1) Mobile Robot Platform: The mobile robot platform that was used to test our path tracking algorithm is shown in Fig. 8. This platform was assembled using various mechanical and electronic parts. The physical parameters of the mobile robot are as follows. With reference to Fig. 1, the wheel base b of the mobile robot is 0.33 m, and l, which is the distance between the rear caster center and the center of the front wheel axis, is 0.30 m. The radius of the front wheels of the mobile robot is 0.06 m. In addition, we set some kinematic limits of the mobile platform. In all our experiments the linear forward velocity of the mobile robot is v = 0.2 m/s. This is set taking into consideration the low sampling rate of the video acquisition system explained in a later section. We set the angular velocity limit of the mobile robot to ωmax = 1.0 rad/s. We set the parameter which controls the sensitivity of the lateral and heading errors to α = 0.85. This is based on several experiments that we conducted with different α values.

The mobile platform contains two modular motor systems used to drive the left and the right wheels of the robot. Each modular motor system consists of a DC motor, gear-head, encoder sensor and controller. The controller of the modular system features a built-in low level PI speed controller running at a 56 kHz sampling rate. The controller speed input is commanded using a PWM signal generated by a 32-bit microcontroller running on an ARM Cortex M3 core. The microcontroller receives reference speed commands for the left and the right wheels from our path tracking algorithm. The algorithm was programmed using MATLAB running on a notebook computer.

Fig. 8. Mobile robot platform

Fig. 9. Original image of the path taken by the on-board camera

2) Vision System: Image data is acquired using a low cost USB camera (webcam) mounted on the mobile robot. The CMOS camera is connected to a notebook computer running MATLAB. Images are acquired using the Image Acquisition Toolbox at a frame rate of 7 Hz. The image size is 640 × 480 pixels. We assume a pinhole camera model for the vision system. In this model we define fc = [fc1, fc2]^T ∈ R^2 as the effective focal lengths of the camera and cc = [cc1, cc2]^T ∈ R^2 as the principal point, both of which are in pixels. A point P in the camera reference frame has the coordinates pc = [xc, yc, zc]^T ∈ R^3. The projection of point P onto the image plane is given by

pn = [xn, yn]^T = [xc/zc, yc/zc]^T                 (16)

which is known as the normalized projection. The same point in pixel coordinates is given by

pp = [xp, yp]^T = [fc1 0; 0 fc2] pn + cc           (17)

Using homogeneous coordinates for pp, (17) can also be expressed as

[xp, yp, 1]^T = K [xn, yn, 1]^T                    (18)

where

K = [fc1 0 cc1; 0 fc2 cc2; 0 0 1]                  (19)

is a 3 × 3 camera matrix which contains the intrinsic parameters of the camera, such as the focal lengths and the principal point. Moreover, a point p = [x, y, z]^T ∈ R^3 in a different reference frame, or world coordinates, can be transformed into image coordinates by the relation

[xp, yp, 1]^T = K [R t] [x, y, z, 1]^T             (20)

where R ∈ R^(3×3) is a rotation matrix from the world coordinate system to the camera reference frame, and t ∈ R^3 is a translation vector connecting the world coordinate system to the camera fixed frame. Both R and t are the extrinsic parameters of the camera.

A MATLAB camera calibration toolbox [14], inspired by [15], was used to find the intrinsic camera parameters. The camera matrix which contains the intrinsic calibration parameters was found to be

K = [728 0 371; 0 728 237; 0 0 1]                  (21)

The values have been rounded off to the nearest integer. The pixel error was sufficiently low, thus the parameters were accepted. To find the extrinsic parameters, an image of a checkerboard pattern was captured by carefully placing the printed checkerboard pattern on the ground on which the white line to be followed by the robot was traced. The ground on which the white path is traced is flat. The camera was fixed to the robot and all the path tracking experiments were conducted using the same fixed position of the camera. From this image, and using the intrinsic parameters found earlier, the extrinsic parameters were found using the camera calibration toolbox. The extrinsic parameters were found to be

R = [−0.0002 −1.0000 0.0056; −0.4610 −0.0049 −0.8874; 0.8874 −0.0028 −0.4610],
t = [−11.8299, −2.5442, 364.5480]^T                (22)

Using the extrinsic parameters found, the images of the lane were projected from image pixel coordinates to the ground plane. The projection on the ground plane was in millimeters, but later converted to meters. A sample image of the path taken during the experiments is shown in Fig. 9. The pixel coordinates of the midpoint of the path were extracted from the image and these coordinates were projected onto the ground plane as shown in Fig. 10. Note that the image processing techniques for extraction of the path are outside the scope of this paper. We also take radial and tangential lens distortion into account when computing the projections. From this projection we estimated parameters such as the value of the curvature of the bend ahead of the mobile robot, the lateral error and the heading error by fitting a cubic polynomial through the points and using the model of the path described in [16].
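The projection pipeline of (16)-(22) and the polynomial fit used to obtain el, eh and the curvature can be sketched as follows. The back-projection assumes that the calibration target's plane (z = 0 in the world frame) coincides with the ground, the cubic fit is a simplified stand-in for the road model of [16], and the pixel coordinates in the example are made up; only K, R and t are the calibrated values reported in (21) and (22).

```python
import numpy as np

def pixel_to_ground(pix, K, R, t):
    """Back-project a pixel onto the ground plane z = 0 using the intrinsics K
    and extrinsics R, t of Eq. (20). For points on that plane the mapping
    reduces to a homography H = K [r1 r2 t]; this inverse step is our sketch
    of the projection described in the text, not code from the paper."""
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))
    w = np.linalg.solve(H, np.array([pix[0], pix[1], 1.0]))
    return w[:2] / w[2]                      # ground-plane (x, y), in the units of t

def path_parameters(points_m, lookahead=0.5):
    """Fit a cubic y(x) through the projected path points (in meters) and read
    off the lateral offset, tangent direction and curvature at a lookahead
    distance (a simplified stand-in for the road model of [16])."""
    x, y = points_m[:, 0], points_m[:, 1]
    p = np.poly1d(np.polyfit(x, y, 3))
    dp, ddp = p.deriv(1), p.deriv(2)
    e_l = p(lookahead)                                       # lateral offset
    e_h = np.arctan(dp(lookahead))                           # tangent direction
    c = ddp(lookahead) / (1.0 + dp(lookahead) ** 2) ** 1.5   # signed curvature
    return e_l, e_h, c

# Calibrated intrinsics (21) and extrinsics (22) from the paper (t is in mm).
K = np.array([[728.0, 0.0, 371.0], [0.0, 728.0, 237.0], [0.0, 0.0, 1.0]])
R = np.array([[-0.0002, -1.0000, 0.0056],
              [-0.4610, -0.0049, -0.8874],
              [0.8874, -0.0028, -0.4610]])
t = np.array([-11.8299, -2.5442, 364.5480])

pixels = [(320, 440), (325, 400), (332, 360), (340, 320), (350, 280)]   # hypothetical path pixels
ground_mm = np.array([pixel_to_ground(p, K, R, t) for p in pixels])
print(path_parameters(ground_mm / 1000.0, lookahead=0.5))
```
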
Fig. 10. Projection of the points belonging to the path onto the ground plane. These points are shown as red squares. The trapezoid shows the area of the ground captured by the camera. The blue line is the curve that has been fit through the points belonging to the path.

Fig. 11. Lateral (el) and heading (eh) error of the mobile robot for tracking a straight path

Fig. 12. The position (x, y) of the mobile robot in relation to the actual path traced on the ground for straight path tracking

Fig. 13. The trajectory followed by the mobile robot while tracking three different paths containing bends of different curvatures

B. Straight Path Tracking

Our first experiment was tracking a straight path traced on the ground. The initial position of the mobile robot was about 0.1 meters offset from the path on the ground. The lateral error and the heading error for a duration of about 12 seconds are shown in Fig. 11. It can be seen that the lateral error approached zero in about 2 seconds. It must also be noted that there is no overshoot in the lateral position error. This is a feature of the derivative term of the fuzzy PD controller. The heading error also stabilized with very little overshoot and is within ±1 %. The small error is encountered because of the nonholonomic constraint of the mobile robot and because more weight is given to the lateral error. The lateral error is in meters while the heading error is in radians. The plot showing the position of the mobile robot in relation to the tracked path is given in Fig. 12. This plot was obtained by the use of robot odometry only. Techniques such as dead reckoning, which involved setting the initial (x, y) position of the robot as (0, 0), were used to obtain this plot.

C. Curved Path Tracking

We performed a series of experiments on curved paths of different curvature values. In these experiments, a section of the mobile robot path was curved and had a constant curvature, while the rest of the path was straight. We selected three different curvature values for the three paths: 3.33 m^-1, 2.00 m^-1 and 1.43 m^-1. The curved section for each of the three paths started from the same point. This can be seen from the combined plots in Fig. 13 for each of the three trajectories taken by the mobile robot. The path taken by the mobile robot was determined by the use of odometry only. It was also assumed that the initial position of the mobile robot is (0, 0).

In addition, the velocity of the mobile robot for the duration of the experiment for each of the three paths was also acquired. The velocity is plotted in Fig. 14. The plots show that the velocity of the mobile robot was reduced by the fuzzy velocity controller while tracking the curved section of the path. It can be noted from Fig. 13 that the curved section of the path was encountered after about 1 m of straight path. At a velocity of 0.2 m/s, the mobile robot approaches the bend or curved section after 5 seconds. Thus, to safely track the curved section of the path, the fuzzy velocity controller reduces the linear forward velocity of the mobile robot. The three plots in Fig. 14a, Fig. 14b and Fig. 14c show the velocity profile of the mobile robot for curvatures of 3.33 m^-1, 2.00 m^-1 and 1.43 m^-1 respectively.
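The robot trajectories in Figs. 12 and 13 were recovered from wheel odometry alone. For completeness, a standard dead-reckoning update for a differential drive robot is sketched below; the update rule and the encoder increments are illustrative and are not taken from the paper (only the 0.33 m wheel base is).

```python
import math

def dead_reckoning(pose, d_left, d_right, wheel_base=0.33):
    """One dead-reckoning update from the incremental left/right wheel
    travel distances (m). The wheel base of 0.33 m is the value reported in
    Section IV; the update rule itself is a generic odometry sketch."""
    x, y, theta = pose
    d_centre = 0.5 * (d_left + d_right)
    d_theta = (d_right - d_left) / wheel_base
    x += d_centre * math.cos(theta + 0.5 * d_theta)
    y += d_centre * math.sin(theta + 0.5 * d_theta)
    return x, y, theta + d_theta

pose = (0.0, 0.0, 0.0)                               # initial position assumed to be (0, 0)
for _ in range(100):
    pose = dead_reckoning(pose, 0.0020, 0.0022)      # hypothetical encoder increments
print(pose)
```
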
Fig. 14. Forward linear velocity of the mobile robot while tracking a path of three different curvatures of (a) 3.33 m^-1, (b) 2.00 m^-1 and (c) 1.43 m^-1

It can be seen that the greater the curvature value, the more the velocity controller reduces the linear forward velocity. After the mobile robot exits the curved section, the velocity controller accelerates the mobile robot back to 0.2 m/s.

V. CONCLUSION

The contribution of this paper is summarized as follows. We have proposed a fuzzy logic based method for autonomous path tracking of a mobile robot. The proposed method is designed using fuzzy rules that emulate human driving behavior. The path tracking algorithm is for tracking straight and curved paths of different curvatures. Path information, such as the lateral position of the path with respect to the mobile robot, the heading of the robot with respect to the direction of the tangent line to the path and the curvature of the path ahead of the robot, is estimated by the vision system of the robot using images of the path captured by a camera. The fuzzy logic based approach is taken by designing two fuzzy systems, one for steering control and the other for velocity control. The steering controller controls the steering angle of the mobile robot while the velocity controller controls the forward linear velocity of the mobile robot. Experiments were performed on a mobile robot platform. The performance of the two combined controllers is shown through the experiments on different paths. Firstly, the mobile robot is able to successfully track a straight path, which shows the effectiveness of the fuzzy steering controller. Secondly, the advantages of the fuzzy velocity controller are shown by tracking a path with a curved section. The fuzzy velocity controller is able to reduce the velocity of the mobile robot to safely track the bend in the path.

ACKNOWLEDGMENT

The authors wish to thank the Faculty Research Committee of the Faculty of Science, Technology and Environment at the University of the South Pacific for their grant of $5012.12 FJD under the vote code 6C373-1351-70306-00 towards this study.

REFERENCES

[1] S. Hiremath, F. van Evert, G. van der Heijden, C. ter Braak, and A. Stein, "Image-based particle filtering for robot navigation in a maize field."
[2] J.-h. Li and S.-a. Wang, "Adaptive trajectory tracking control of wheeled mobile robots with nonholonomic constraint," Journal of Electronic Science and Technology of China, vol. 3, no. 4, pp. 342-347, 2005.
[3] T.-C. Lee, K.-T. Song, C.-H. Lee, and C.-C. Teng, "Tracking control of unicycle-modeled mobile robots using a saturation feedback controller," IEEE Trans. Control Syst. Technol., vol. 9, no. 2, pp. 305-318, Mar 2001.
[4] D. Chwa, "Sliding-mode tracking control of nonholonomic wheeled mobile robots in polar coordinates," IEEE Trans. Control Syst. Technol., vol. 12, no. 4, pp. 637-644, 2004.
[5] Y. Ma, J. Kosecka, and S. Sastry, "Vision guided navigation for a nonholonomic mobile robot," in Proc. 36th IEEE Conference on Decision and Control, vol. 3, Dec 1997, pp. 3069-3074.
[6] A. Cherubini, F. Chaumette, and G. Oriolo, "An image-based visual servoing scheme for following paths with nonholonomic mobile robots," in 10th International Conference on Control, Automation, Robotics and Vision (ICARCV 2008), Dec 2008, pp. 108-113.
[7] A. Rezoug and M. Djouadi, "Visual based lane following for non-holonomic mobile robot," in IEEE EUROCON 2009, May 2009, pp. 902-907.
[8] G. Antonelli and S. Chiaverini, "Experiments of fuzzy lane following for mobile robots," in Proc. 2004 American Control Conference, vol. 2, June 2004, pp. 1079-1084.
[9] T. H. Lee, H. Lam, F. H. F. Leung, and P.-S. Tam, "A practical fuzzy logic controller for the path tracking of wheeled mobile robots," IEEE Control Syst. Mag., vol. 23, no. 2, pp. 60-65, Apr 2003.
[10] J. Guo, P. Hu, L. Li, and R. Wang, "Design of automatic steering controller for trajectory tracking of unmanned vehicles using genetic algorithms," IEEE Trans. Veh. Technol., vol. 61, no. 7, pp. 2913-2924, Sept 2012.
[11] G. Antonelli, S. Chiaverini, and G. Fusco, "A fuzzy-logic-based approach for mobile robot path tracking," IEEE Trans. Fuzzy Syst., vol. 15, no. 2, pp. 211-221, April 2007.
[12] H. Ying, "The Takagi-Sugeno fuzzy controllers using the simplified linear control rules are nonlinear variable gain controllers," Automatica, vol. 34, no. 2, pp. 157-167, 1998.
[13] ——, "Constructing nonlinear variable gain controllers via the Takagi-Sugeno fuzzy control," IEEE Trans. Fuzzy Syst., vol. 6, no. 2, pp. 226-234, May 1998.
[14] J.-Y. Bouguet. (2013) Camera Calibration Toolbox for Matlab. [Online]. Available: http://www.vision.caltech.edu/bouguetj/calib_doc/
[15] Z. Zhang, "Flexible camera calibration by viewing a plane from unknown orientations," in Proc. Seventh IEEE International Conference on Computer Vision, vol. 1, 1999, pp. 666-673.
[16] B. Southall and C. Taylor, "Stochastic road shape estimation," in Proc. Eighth IEEE International Conference on Computer Vision, vol. 1, 2001, pp. 205-212.
