
UGV ASSISTANCE ROBOT CAPABLE

OF MANEUVERING THROUGH
CLUTTERED ENVIRONMENT BY
MEANS OF A UAV

Group Members

ANAS SIDDIQUI (130687)

ABDUL BASIT (130705)

M. FASIL HAYAT (130787)

BE MECHATRONICS (2013-2017)

Project supervisor

Dr. Umer Khan

Assistant Professor

DEPARTMENT OF MECHATRONICS ENGINEERING

FACULTY OF ENGINEERING,

AIR UNIVERSITY, ISLAMABAD


UGV ASSISTANCE ROBOT CAPABLE
OF MANEUVERING THROUGH
CLUTTERED ENVIRONMENT BY
MEANS OF A UAV

Final Year Project Report

(2013-2017)

DEPARTMENT OF MECHATRONICS ENGINEERING



Submitted by:

ANAS SIDDIQUI (130687)

ABDUL BASIT (130705)

M. FASIL HAYAT (130787)

Project Supervisor

Dr. Umer Khan

Assistant Professor

Head of Department

Dr. Zareena Kauser

Assistant Professor
Acknowledgement

First and foremost, praise is due to Allah Almighty, the Most Beneficent and Most Merciful, on whom we all ultimately depend for sustenance and guidance. Working on the project "UGV assistance robot capable of maneuvering through a cluttered environment by means of a UAV" has been a source of learning, experience and knowledge for us. We would like to express our sincere gratitude to our project supervisor, Dr. Muhammad Umer Khan, for his guidance and valuable support throughout the course of this project work. We also acknowledge the encouragement, inspiration and guidance of our respected faculty.

Abstract

Two robots are used: a UGV (Unmanned Ground Vehicle) fitted with a wireless data receiving unit, and a UAV (Unmanned Aerial Vehicle) carrying an onboard computer (a Raspberry Pi) and a camera that provides live video feedback wirelessly to the ground station. The UAV, programmed to fly a user-defined autonomous mission using GPS, guides the UGV so that it can accomplish the desired task. The UGV navigates among the walls and obstacles using waypoints provided by the UAV. A cooperation scheme between the two robots is developed in which image processing techniques extract information about the cluttered environment, and a path planning algorithm then finds the optimal path, along with waypoints, for the UGV to maneuver through the environment. The waypoints are sent wirelessly from the ground station, which houses the wireless transmission unit.

List of Figures
Figure 1-1: Scenario .......................... 2
Figure 1-2: System Flow .......................... 3
Figure 1-3: Multi-UAV and UGV cooperation system .......................... 4
Figure 1-4: Multi-UAV based vision for global path planning .......................... 4
Figure 1-5: Five directional distance sensors .......................... 5
Figure 1-6: AR-100 UAV .......................... 5
Figure 1-7: Overall scenario for object transportation .......................... 5
Figure 1-8: Simulation .......................... 6
Figure 1-9: SLAM-Based navigation .......................... 6
Figure 1-10: Quadrotor body fixed frame .......................... 7
Figure 1-11: Image scaling .......................... 9
Figure 1-12: Image Processing .......................... 10
Figure 1-13: Different viewpoints of UAV & UGV .......................... 10
Figure 2-1: Different Possible Paths .......................... 16
Figure 2-2: UAV Schematic Flow .......................... 17
Figure 2-3: UGV Schematic Flow .......................... 17
Figure 2-4: RGB image from camera .......................... 18
Figure 2-5: Binary image .......................... 18
Figure 2-6: Boundary wall thickness increased .......................... 18
Figure 2-7: Object filling .......................... 19
Figure 2-8: Image after filtering .......................... 19
Figure 2-9: RGB image of Maze .......................... 20
Figure 2-10: Gray Scale image .......................... 20
Figure 2-11: Binary image after filtering .......................... 21
Figure 2-12: Specified area with Numeric assignment .......................... 21
Figure 2-13: Paths Found Based on Algorithm .......................... 22
Figure 2-14: Applied Algorithm in Maze .......................... 22
Figure 2-15: Probabilistic road map .......................... 23
Figure 2-16: 3DR Solo Drone .......................... 24
Figure 3-1: Geometric parameters and Coordinate frames .......................... 25
Figure 3-2: Quadcopter Inertial frame and Body frame .......................... 27
Figure 4-1: Arena construction .......................... 33
Figure 4-2: Overhead camera image .......................... 33
Figure 4-3: Complete Arena .......................... 33
Figure 4-4: Complete working system .......................... 34
Figure 4-5: Mission Planner for UAV flight planning .......................... 35
Figure 4-6: Raspberry Pi .......................... 35
Figure 4-7: NRF24L01 module .......................... 36
Figure 4-8: Ground Station .......................... 36
Figure 5-1: Processed Image .......................... 38
Figure 5-2: Applied Algorithm .......................... 39
Figure 5-3: Final Image After Algorithms .......................... 40

List of Tables
Table 1: Cost Analysis .......................................................................................... 13
Table 2: Quadcopter rotations ............................................................................... 30

Nomenclature

UAV	Unmanned Aerial Vehicle
UGV	Unmanned Ground Vehicle
SLAM	Simultaneous Localization and Mapping
PID	Proportional-Integral-Derivative
RC	Remote Control
CCD	Charge-Coupled Device
GPS	Global Positioning System
IMU	Inertial Measurement Unit
GRTR	Geometric Relation-based Triangle Representations
FOV	Field of View

Table of Contents
Acknowledgement ................................................................................................. iv
Abstract ................................................................................................................... v
List of Figures ........................................................................................................ vi
List of Tables ....................................................................................................... viii
Nomenclature ......................................................................................................... ix
Table of Contents .................................................................................................... x
CHAPTER 1 Introduction ................................................................................... 2
1.1 Introduction .............................................................................................. 2
1.1.1 Scenario Building.............................................................................. 2
1.1.2 System flows ..................................................................................... 3
1.2 Literature review ...................................................................................... 4
1.3 Utilizations ............................................................................................. 11
1.4 Organization of report ............................................................................ 11
1.5 Cost analysis ........................................................................................... 12
1.6 Timeline ................................................................................................. 14
CHAPTER 2 Project Description ...................................................................... 16
2.1 Problem Statement ................................................................................. 16
2.2 Objectives ............................................................................................... 17
2.3 System Description ................................................................................ 17
2.3.1 Image Processing ............................................................................ 18
2.3.2 Path Planning .................................................................................. 21
2.3.3 Flood fill and wave front algorithm ................................................ 21
2.3.4 Probabilistic road map algorithm .................................................... 23
2.4 UAV selection ........................................................................................ 24
CHAPTER 3 Kinematic & Mathematical Modeling......................................... 25
3.1 Kinematics .............................................................................................. 25

3.1.1 Unmanned Ground Vehicle (UGV) ................................................ 25
3.1.2 Unmanned Aerial Vehicle (UAV) .................................................. 27
3.2 Sinkage Calculations of UGV ................................................................ 30
3.2.1 For two tracks: ................................................................................ 31
3.2.2 For sand ........................................................................................... 31
3.2.3 For Heavy Clay ............................................................................... 32
CHAPTER 4 Experimental Setup and Implementation ......................... 33
4.1 Environment Building ............................................................................ 33
4.2 Autonomous UAV flight planning ......................................................... 35
4.3 Single board computer on UAV ............................................................. 35
4.4 Wireless communication between Robots ............................................. 36
4.4.1 Wireless Data Transmission unit at ground station ........................ 36
CHAPTER 5 Discussion and Conclusion ......................................................... 38
5.1 Use of Image Processing ........................................................................ 38
5.2 Path Planning Algorithms ...................................................................... 39
5.3 Conclusion.............................................................................................. 39
References ............................................................................................................. 41

CHAPTER 1 Introduction

1.1 Introduction
In this project, a collaborative mission is carried out by two robots, a UGV and a UAV. These robots perform given tasks in a cluttered environment. The UGV performs its task efficiently under the guidance of the UAV: the UAV provides the UGV with the path to follow, including information about the obstacles in the environment.

1.1.1 Scenario Building

A UGV equipped only with onboard sensors or camera(s) can be used to perform desired operations in an unknown environment, as shown in Figure 1-1, but this approach takes longer, which results in higher power consumption as well as more time spent on task completion. Moreover, in military and search-based operations, if checkpoints or target objects (circles) are placed behind obstacles (walls/blocks), it is difficult for the UGV to find them optimally, i.e. in minimal time. This task can be accomplished far more feasibly if the UGV is guided by aerial support: the UAV provides a 2D map of the environment, supplying information about the location of obstacles (ditches, cliffs, etc.).

Figure 1-1 Scenario


The area the UGV has to cover in the scenario under consideration is maze-like. Different scenarios are developed inside the maze, as shown in Figure 1-1; for instance, square patches indicate no-go areas.

1.1.2 System flows


The UAV is an RC-controlled quadcopter, programmable for autonomous flight missions and equipped with a single-board computer, a Raspberry Pi, along with its camera module and power supply. The quadcopter has a built-in altitude sensor, flight controller and GPS. It is either operated by a human operator or programmed for autonomous flight. It takes picture(s) of the whole map from a pre-fixed height with respect to the ground; these are sent from the Raspberry Pi to a laptop/computer, where the image(s) are processed using MATLAB and the optimal path is planned using path planning techniques. All of this data is then provided to the UGV via wireless communication. The system flow schematic is shown in Figure 1-2.

Figure 1-2 System Flow


1.2 Literature review


So far, research related to the project under consideration has been performed from several different aspects.

In one approach, the global path or map of the environment is built for an unmanned ground vehicle (UGV) with the help of a multi-UAV-based stereo vision system, shown in Figure 1-3. Designed specifically for GPS-denied environments, it uses CCD cameras to generate depth maps of objects; the system also provides a movable baseline by using two UAVs, overcoming the altitude problems of a single-UAV-based system.

Figure 1-3: Multi-UAV and UGV cooperation system

This research has also focused on path planning and cooperative control. It overcomes the line-of-sight limitation of the UGV, which can be observed in Figure 1-4. A depth (CCD) camera is used to overcome the problem arising from the altitude of the UAV, and markers are used instead of GPS to obtain the relative position and orientation between the UAVs and the UGV [1].

Figure 1-4: Multi-UAV based vision for global path planning

Another research work, which addresses the visual navigation of an unmanned aerial vehicle (UAV), establishes robust point correspondences between consecutive image frames and a visual steering algorithm. The advantage of this approach lies in the fact that the decision for collision avoidance is made immediately, using purely visual information extracted from the live video sequence. It also eliminates the time-consuming steps of explicit obstacle recognition and global reconstruction of the environment. This is shown in Figure 1-5.

Figure 1-5: Five directional distance sensors

The focus of this research is visual steering for safe maneuvering, not localization or mapping.

The AR-100 UAV shown in Figure 1-6 is capable of vertical take-off and landing as well as autonomous flight. It has a diameter of 1 m and can fly at speeds up to 5 m/s. It can be tele-operated either by a human pilot via a remote controller (in the form of 3D joysticks) or by controlling software running on a laptop that serves as the UAV's ground station. In both cases the control commands are sent via an interface built upon radio communication; image sequences captured by the camera mounted on the UAV are likewise sent to the ground station via radio link [2].

Figure 1-6: AR-100 UAV

In research on multi-robot cooperation for object transportation in an industrial area, three UGVs and one UAV are used, as displayed in Figure 1-7. Of the three UGVs, one is the leader and the other two are followers. Real-time visual feedback and global coverage of the environment from the camera mounted on the UAV are first sent to a PC or ground station, where image processing techniques are applied and obstacles are detected; a human operator then selects the waypoints and an obstacle-free path, which is provided to the leader UGV through wireless communication [3].

Figure 1-7: Overall scenario for object transportation

A system of two robots, a UGV and a UAV, that perform a desired task cooperatively by sharing sensor information is shown in Figure 1-8. Real-time mapping for obstacle detection is performed using information from both the UAV's camera and the UGV's positioning system. The UAV's camera is treated as a relative sensor, and its information is merged with the absolute position of the UGV to obtain the absolute position of each obstacle. To obtain obstacle positions relative to the UGV, different coordinate systems and a simple perspective projection are used. The robot pose is obtained using a measurement model (odometry, GPS and IMU) [4].

Figure 1-8: Simulation

Navigation of micro aerial vehicles in a completely unknown, GPS-denied environment has been demonstrated using only an onboard camera and inertial sensors, within a monocular Simultaneous Localization and Mapping (SLAM) framework, shown in Figure 1-9. SLAM is used to stabilize the vehicle in six degrees of freedom. The algorithms used in this research run in real time, and a linear optimal controller performs the control calculations.

Figure 1-9: SLAM-Based navigation

Aerial vehicles will play a major role in search, surveillance and rescue operations, but autonomous flight in GPS-denied, unknown environments remains a challenge. The major problems are control of the vehicle's six degrees of freedom, its stability, and position and altitude control. Of the various methods available (GPS, laser or visual navigation), visual navigation is the best choice: its important benefits are low cost, low weight, and the rich, distinctive information about the environment that an onboard camera provides. The camera is used to build a 3D map of the environment.

Aerial vehicles are nonlinear systems. To control them, two control units are used: one for altitude and the other for 3D position. Often a PID controller suffices for both purposes; for altitude control an inertial approach is used. Laser range finders are not optimal because they have a restricted perception distance and a limited field of view (typically only in a plane), are still heavy for UAVs, and have high power consumption [5].
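The PID loops mentioned above can be sketched in a few lines. The following is an illustrative discrete PID controller driving a toy one-dimensional altitude model, not the project's actual flight controller; the gains, time step and setpoint are assumed values, and gravity is taken as already compensated by a feed-forward hover thrust.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None \
            else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy altitude-hold loop: unit-mass double integrator, so the PID output
# acts directly as vertical acceleration.
pid = PID(kp=2.0, ki=0.5, kd=1.0, dt=0.02)
z = vz = 0.0                   # altitude (m), vertical speed (m/s)
for _ in range(2000):          # 40 s of simulated flight
    u = pid.update(setpoint=5.0, measurement=z)
    vz += u * 0.02             # integrate acceleration
    z += vz * 0.02             # integrate speed
```

In a real quadcopter the same structure is duplicated per axis, with the position loop commanding attitude setpoints for an inner loop.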

The coordination of a UAV and a group of UGVs working together in a known environment has also been studied. The UAV carries a path-planner unit, and one of the UGVs acts as the leader of the group. All UGVs remain at safe distances from each other, as shown in Figure 1-10, while the UAV tracks the centroid of the UGV group. The presented method is considerably more agile and can handle more real-time problems.

Figure 1-10: Quadrotor body fixed frame

Coordination of a UAV and UGVs to reach specific locations in an unknown environment has been studied extensively. A leader-follower approach is used because of its ease of control and coordination. A quadcopter is used as the UAV because it can hold specific positions when required and is easy to operate. The main focus is to implement control over the motions of the UAV and UGV, together with algorithms to avoid obstacles in real time. Obstacle avoidance is handled by the UGV itself because it can observe obstacles more precisely. To counter the external disturbances faced by the UAV, a limit-cycle feature with potential field theory is used.

In one approach to the coordination of a UAV and UGVs in a known environment, the robots must catch a target that is known to the UAV thanks to its ability to observe over long distances. As a result, the UAV is considered the leader. The UGVs, meanwhile, must keep the desired formation while avoiding possible obstacles. A limit-cycle potential field strategy is employed for obstacle avoidance. During obstacle avoidance, the UAV adjusts its motion to maintain connectivity among the vehicles. Finally, an obstacle avoidance algorithm is proposed for large obstacles that both the UAV and the UGVs must avoid [6].

Images captured from a UAV's camera are distorted by disturbances in altitude, roll, yaw and pitch. Post-processing techniques must be applied so that the images become uniform, as if they were taken from a single perspective without any disturbance. These images can then be stitched together to form an accurate map; if the raw images are stitched directly, stitching fails because the images do not share the same perspective. The algorithms used were developed in MATLAB, and rotary-wing UAVs are the focus here.

To apply the mathematical algorithms for post-processing the images, one must know the roll, yaw, pitch and altitude; for this, a circuit with a flight sensor, an altimeter and a gyroscope can be integrated on the UAV. The camera angle with respect to the ground can vary in six dimensions; this particular research paper focuses on roll, yaw, pitch and altitude errors. To measure these errors, a telemetry chip mounted on the UAV can be used.

Images taken from a UAV are distorted in five different ways. Fisheye lenses cause distortion in a radially symmetric way, which is corrected with a distortion function. Roll, yaw and other angular disturbances distort the image, and a rotation matrix is used to take care of these. Disturbances in altitude also distort the image, and a scaling function is used to correct it. This scenario is shown in Figure 1-11 [7].

Figure 1-11: Image scaling
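The rotation-matrix and scaling corrections just described can be sketched for the simplest case: a nadir-pointing camera with in-plane roll and an altitude offset. The function below is an illustrative Python sketch, not the paper's MATLAB implementation; handling pitch and yaw properly would require a full 3-D homography, which this sketch omits.

```python
import math

def correct_point(x, y, roll_rad, altitude, ref_altitude):
    """Undo in-plane rotation (roll about the optical axis) and altitude
    scaling for one image point, measured relative to the image centre."""
    # Inverse rotation: rotate the point by -roll.
    c, s = math.cos(-roll_rad), math.sin(-roll_rad)
    xr = c * x - s * y
    yr = s * x + c * y
    # Scaling: the ground footprint grows linearly with altitude, so map
    # the image back to the reference altitude.
    k = altitude / ref_altitude
    return xr * k, yr * k

# A point at (100, 0) seen with 90 degrees of roll, at twice the
# reference altitude, maps back to approximately (0, -200).
x2, y2 = correct_point(100.0, 0.0, roll_rad=math.pi / 2,
                       altitude=20.0, ref_altitude=10.0)
```

Stitching software would apply this per-pixel (or as one homography) before feature matching.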

To find the shortest path for a UGV in the presence of obstacles, an image of the ground is first captured by a camera; after obstacle detection, the shortest path for the UGV is generated.

The image of the ground is converted to grayscale, because the ground's color is not important. Feature extraction of obstacles and edge detection are then performed on the grayscale image, followed by thresholding; finally a grid is overlaid to support the shortest-path search. These steps are shown in Figure 1-12.

Figure 1-12: Image Processing

The resulting grid nodes are then used to find the shortest path using the A* algorithm. Image-based path planning for an autonomous robot is inexpensive, i.e. energy efficient. Moreover, such vehicles have limited resources, yet they can extract the information required to reach the goal from the environment. A camera is used to capture the obstacles, and the A* algorithm finds the shortest path. For obstacle detection, thresholding and variance, together with a grid-based map strategy, are used [8].
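The grid-based A* search referred to above can be sketched as follows. This is a generic 4-connected implementation with a Manhattan-distance heuristic; the small maze at the bottom is a toy example, not the cited paper's test environment.

```python
import heapq, itertools

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (1 = obstacle, 0 = free).
    Returns the list of cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    tie = itertools.count()            # tie-breaker so heap never compares cells
    open_set = [(h(start), next(tie), 0, start, None)]
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:          # already expanded with a better cost
            continue
        came_from[cell] = parent
        if cell == goal:               # reconstruct path by walking parents
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get((nr, nc), float("inf")):
                    g_best[(nr, nc)] = ng
                    heapq.heappush(open_set,
                                   (ng + h((nr, nc)), next(tie), ng, (nr, nc), cell))
    return None

maze = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(maze, (0, 0), (2, 0))
```

Because the Manhattan heuristic never overestimates on a 4-connected grid, the returned path is guaranteed shortest.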
In a heterogeneous robot system, i.e. the cooperative movement of a UAV (Unmanned Aerial Vehicle) and a ground robot, the UAV's viewpoint differs greatly from the ground robot's: they may have different perceptions of the same objects, as shown in Figure 1-13. That makes cognitive sharing difficult to realize. A better cognitive sharing method between the UAV and the ground robot is obtained by sharing Geometric Relation-based Triangle Representations (GRTR). This provides a robust method for the UAV and the ground robot to identify the same object among similar objects without sharing appearance information.

Figure 1-13: Different viewpoints of UAV & UGV

This study dealt with the issue of applying a UAV and a UGV to execute collaborative tasks. The main focus was to realize cognitive sharing between the UAV and UGV in an unstructured environment, because cognitive sharing between robots is the foundation of a cooperative system. With the proposed method, the UGV can correctly identify the target by matching reference GRTRs sent by the UAV, even when the objects' appearance representations change across viewpoints or similar objects exist. This method enables the UAV and UGV to realize cognitive sharing in real-world environments [9].

1.3 Utilizations
A collaborative robotics scheme can be used for remote viewing and monitoring.

Cooperating surveillance robots find primary applications in the defense and aerospace industries, for example replacing humans in hazardous situations (bomb disposal, etc.). In a mined area, for instance, a ground vehicle fitted with mine detector sensors and led by a UAV that provides the optimal path can be used to detect mines.

Enhancements can be made to use this project for combat, rescue and firefighting purposes.

1.4 Organization of report


The report includes an overview of the work already done on UAV and UGV cooperation, including image processing and path planning, together with a brief description of our project.

Chapter 1 contains the introduction of the project, further divided into subsections covering the project idea, literature survey, problem statement, objectives, tentative cost analysis, timeline and references.


Chapter 2 contains the detailed project description, including the problem statement and objectives. Moreover, the image processing and path planning techniques are explained.

Chapter 3 contains mathematical modeling and kinematics of UGV and UAV.

In Chapter 4, the implementation of the experimental setup of the project is described in detail.

Chapter 5 consists of the evaluated results of the project and a brief discussion.

1.5 Cost analysis


Table 1: Cost Analysis


1.6 Timeline

CHAPTER 2 Project Description

2.1 Problem Statement


Obstacle avoidance, mapping of an unknown environment, and finding an optimal path that provides collision-free navigation for the UGV during the execution of its mission.

Figure 2-1 Different Possible Paths

In our case, as shown in Figure 2-1, the second path appears to be the most optimal for the UGV to follow.


2.2 Objectives
The task-performing efficiency of the UGV is increased by providing aerial support from the UAV, which supplies a global map for maneuvering through the cluttered environment.

Using an aerial vision system has the advantage that the UGV's field of view (FOV) is enhanced.

The safety of the UGV is enhanced, and fuel and time consumption are reduced, by providing the optimal (shortest) path between the initial and goal positions.

2.3 System Description


The system description is shown in Figure 2-2 and Figure 2-3.

Figure 2-2 UAV Schematic Flow

Figure 2-3 UGV Schematic Flow


2.3.1 Image Processing


For object detection in a simple environment, for instance one containing just three blocks as shown in Figure 2-4, the image processing flow is as stated below.

Figure 2-4 RGB image from camera

Let us call this image 'F' and process it step by step. The resultant images are shown alongside the operations that generate them.

The RGB image is converted into a binary (black & white) image, and the edges of the obstacles are detected.

Figure 2-5 Binary image

The boundary lines are thickened for prominence.

Figure 2-6 Boundary wall thickness increased

The obstacles are filled in to obtain complete objects.

Figure 2-7 Object filling

Small unnecessary objects are removed, and morphological filtering with a MATLAB structuring element (strel) is applied to enhance the image quality.

Figure 2-8 Image after filtering
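The binarization and wall-thickening steps above can be sketched in a few lines. The project used MATLAB functions for these operations, so the pure-Python version below (thresholding plus one pass of 3x3 dilation on a toy 4x3 image) is only an illustrative equivalent.

```python
def binarize(img, level):
    """Threshold a grayscale image (values in 0..1) to a 0/1 binary image."""
    return [[1 if px > level else 0 for px in row] for row in img]

def dilate(binary):
    """One pass of 3x3 dilation -- the 'thicken the boundary walls' step."""
    rows, cols = len(binary), len(binary[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # A pixel turns on if any pixel in its 3x3 neighbourhood is on.
            out[r][c] = int(any(
                binary[rr][cc]
                for rr in range(max(r - 1, 0), min(r + 2, rows))
                for cc in range(max(c - 1, 0), min(c + 2, cols))))
    return out

gray = [[0.9, 0.1, 0.1, 0.1],
        [0.1, 0.1, 0.1, 0.1],
        [0.1, 0.1, 0.1, 0.9]]
walls = binarize(gray, 0.5)   # only the two bright pixels survive
thick = dilate(walls)         # each wall pixel grows into a 3x3 block
```

Object filling and small-object removal follow the same pattern: flood-fill enclosed regions, then discard connected components below an area threshold.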

Now consider the maze-like environment shown in Figure 2-9, an image of which is taken by the camera mounted on the UAV in our project. This image is sent to the ground station for processing. The primary purpose of this image processing is to detect the walls and objects placed in the environment; the scenario created below, for instance, contains only walls.

Figure 2-9 RGB image of Maze

This is a colored image, technically in RGB form. First, the RGB image is converted into grayscale; the resultant image is shown in Figure 2-10.

Figure 2-10 Gray Scale image

Next, the acquired grayscale image is converted into a binary image. For this, a threshold value must be calculated; in our example scenario, the calculated threshold level is 0.4196. When the conversion to binary form is applied, the image shown in Figure 2-11 is generated.

Figure 2-11 Binary image after filtering

This is the required binary image of the maze, in which the walls have been detected. The real scenario is now established and the image processing is complete.
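A threshold level such as 0.4196 is typically computed automatically rather than hand-picked; MATLAB's graythresh, presumably used here, implements Otsu's method, which picks the level that maximizes the between-class variance of the intensity histogram. The sketch below reimplements Otsu in Python for illustration; the toy data and the resulting level are not from the project.

```python
def otsu_level(pixels, bins=256):
    """Otsu's method: choose the histogram threshold that maximises
    between-class variance. Returns a normalised level in [0, 1]."""
    hist = [0] * bins
    for p in pixels:                         # p assumed in [0, 1]
        hist[min(int(p * bins), bins - 1)] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = sum0 = 0
    best_t, best_var = 0, -1.0
    for t in range(bins):                    # sweep candidate thresholds
        w0 += hist[t]                        # pixels in the dark class
        if w0 == 0:
            continue
        w1 = total - w0                      # pixels in the bright class
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t / (bins - 1)

# Two well-separated intensity clusters (dark floor, bright walls).
level = otsu_level([0.1] * 50 + [0.8] * 50)
```

Any level between the two clusters separates them perfectly; Otsu returns the first bin achieving the maximum between-class variance.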
2.3.2 Path Planning
For planning an efficient path in a simple environment, the method explained
below can be used.

2.3.3 Flood fill and wave front algorithm


Grid cells containing walls are assigned the value 1. The goal cell is assigned a value higher than 1, for instance 2. Then the four cells connected to the goal cell are assigned the value 3, and the four cells connected to each cell of value 3 are assigned the value 4. This continues until the start point is reached. The final generated matrix is shown in Figure 2-12.

Figure 2-12 Specified area with Numeric assignment

After number assignment is complete, the numbers are traced backwards from the start point, from higher to lower, until the goal point is reached; this generates the desired path. For the matrix generated above, the path formed is shown in Figure 2-13.

Figure 2-13 Paths Found Based on Algorithm

Now, let us proceed to an environment that contains walls. The procedure for planning an optimal path is slightly different from the one used for an open environment. Here, the goal point grid is assigned the lowest number (0), and of the four grids around the goal point grid, each grid with no obstacle/wall between it and the goal point grid is assigned a number one higher than the goal point's, i.e. 1. Each newly numbered grid then assigns a number one greater than its own to the surrounding grids that are not separated from it by a wall, and this process continues until the start position is reached. This number assignment is shown in Figure 2-14.

Figure 2-14 Applied algorithm in the maze

After number assignment is complete, the algorithm creates the desired path starting from the start point and moving grid by grid to the connected grid whose number is one less than the current grid's. In this manner, the desired path from the start point to the goal point is created.
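The wavefront assignment and back-tracing described above can be sketched as follows. This is an occupancy-grid variant written in Python (the project used MATLAB); in the maze case, the obstacle test would become a wall check between adjacent cells, and the start cell is assumed to be reachable from the goal.

```python
from collections import deque

WALL, FREE = 1, 0

def wavefront(grid, goal):
    """Number the free cells outward from the goal: the goal gets 2, its
    free 4-neighbours get 3, and so on. Wall cells keep the value 1."""
    rows, cols = len(grid), len(grid[0])
    dist = [[WALL if grid[r][c] == WALL else FREE for c in range(cols)]
            for r in range(rows)]
    dist[goal[0]][goal[1]] = 2
    queue = deque([goal])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and dist[nr][nc] == FREE:
                dist[nr][nc] = dist[r][c] + 1
                queue.append((nr, nc))
    return dist

def trace_path(dist, start):
    """Walk from the start cell to a 4-neighbour whose number is one
    lower, until the goal (value 2) is reached."""
    rows, cols = len(dist), len(dist[0])
    path = [start]
    r, c = start
    while dist[r][c] != 2:
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and dist[nr][nc] == dist[r][c] - 1:
                r, c = nr, nc
                break
        path.append((r, c))
    return path
```

The breadth-first expansion guarantees that following strictly decreasing numbers from the start yields a shortest collision-free path to the goal.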

2.3.4 Probabilistic road map algorithm


The probabilistic roadmap (PRM) planner is a motion-planning algorithm in robotics that solves the problem of determining a path between a starting configuration of the robot and a goal configuration while avoiding collisions; explicit geometry-based planning is impractical in high-dimensional spaces.

The basic idea behind PRM is to take random samples from the configuration space of the robot, test whether they lie in free space, and use a local planner to attempt to connect these configurations to other nearby configurations. The starting and goal configurations are added in, and a graph search algorithm is applied to the resulting graph to determine a path between the starting and goal configurations.

Figure 2-15 Probabilistic road map

The probabilistic roadmap planner consists of two phases: a construction phase and a query phase. In the construction phase, a roadmap (graph) is built, approximating the motions that can be made in the environment. First, a random configuration is created. Then, it is connected to some neighbors, typically either the k nearest neighbors or all neighbors closer than some predetermined distance. Configurations and connections are added to the graph until the roadmap is dense enough. In the query phase, the start and goal configurations are connected to the graph, and the path is obtained by a Dijkstra shortest-path query, as shown in Figure 2-15.
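As a rough illustration of the two phases, the following Python sketch builds a roadmap over point samples with circular obstacles and answers a query with Dijkstra's algorithm. The collision checker, the neighbour count k, and the sampling scheme are illustrative assumptions, not the project's implementation.

```python
import heapq
import math

def collides(p, q, obstacles, step=0.02):
    """Check sampled points along segment p-q against circular
    obstacles given as (cx, cy, radius) tuples."""
    n = max(2, int(math.dist(p, q) / step))
    for i in range(n + 1):
        t = i / n
        x, y = p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1])
        if any(math.hypot(x - cx, y - cy) <= r for cx, cy, r in obstacles):
            return True
    return False

def build_roadmap(samples, obstacles, k=5):
    """Construction phase: keep collision-free samples and connect each
    to its k nearest neighbours when the connecting segment is free."""
    nodes = [p for p in samples if not collides(p, p, obstacles)]
    edges = {i: [] for i in range(len(nodes))}
    for i, p in enumerate(nodes):
        near = sorted(range(len(nodes)),
                      key=lambda j: math.dist(p, nodes[j]))[1:k + 1]
        for j in near:
            if not collides(p, nodes[j], obstacles):
                d = math.dist(p, nodes[j])
                edges[i].append((j, d))
                edges[j].append((i, d))
    return nodes, edges

def shortest_path(nodes, edges, src, dst):
    """Query phase: Dijkstra from node index src to dst."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in edges[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return None
    path, u = [dst], dst
    while u != src:
        u = prev[u]
        path.append(u)
    return path[::-1]
```

In practice the samples would be drawn at random from the configuration space, with the start and goal configurations appended before querying; a deterministic grid of samples is used here only for reproducibility.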

2.4 UAV selection


The major concerns in selecting a UAV are a payload capacity of at least 350 grams, good stability, cost effectiveness, and programmability. Many quadcopters were reviewed, such as the Phantom and the Iris, but the 3DR Solo quadcopter shown in Figure 2-16 was selected because of its excellent hovering stability and programmability. It also meets the payload requirement, with a capacity of 450 grams.

Figure 2-16 3DR Solo Drone

CHAPTER 3 Kinematic & Mathematical Modeling

3.1 Kinematics
In this section, the movement of the systems (Unmanned Aerial Vehicle and Unmanned Ground Vehicle) is defined without considering the masses or the forces causing the movement.

3.1.1 Unmanned Ground Vehicle (UGV)


The mobile robot shown below has three degrees of freedom. It is described by a body coordinate frame M: {x, y, ϕ} fixed to the UGV, with the x-axis along the tracks, and an inertial coordinate frame I: {X, Y, φ}, as shown in Figure 3-1.

Figure 3-1 Geometric parameters and Coordinate frames

Where,
α = slip angle of the robot
ϕ = rotation of the robot in the body coordinate frame
φ = rotation of the robot in the inertial coordinate frame


The direct kinematics of the main frame is:

(Ẋ, Ẏ, φ̇) = f(ωr, ωl) (3.1)

It relates the linear velocity components to the angular speeds of the tracks on both sides. The right track speed is:

Vr = r ωr (1 − irs) = r ωr ir (3.2)
Where,
r = driving wheel radius
ωr = angular speed
irs = slip coefficient of track
For the right track, the longitudinal slip is also defined as:

irs = (Vt − Vrr) / Vt = 1 − Vrr / Vt (3.3)

Where,
Vt = theoretical speed
Vrr = real speed of right track
Equations (3.2) and (3.3) apply similarly to the left track.
The velocity components in body frame are,
The velocity components in the body frame are:

ẋ = c wd [ωr ir + ωl il] (3.4a)
ẏ = c wd [ωr ir + ωl il] tan(α) (3.4b)
ϕ̇ = c [ωr ir − ωl il] (3.4c)

Where,
wd = half width of the robot
c = constant (r / 2wd)
α = slip angle of the robot (nonzero when there is side slippage)
During movement on curved paths at relatively high speed, the generated centrifugal force causes lateral slippage. This centrifugal force is negligible here, as the maximum longitudinal speed of the chosen platform does not exceed 0.2 m/s. Moreover, the track design creates substantial lateral friction that helps keep the UGV from sliding. Neglecting lateral slippage, the equations become:

ẋ = cwd [ωrir + ωlil] (3.5a)


ẏ=0 (3.5b)
ϕ̇ = c [ωrir - ωlil] (3.5c)
Here, ϕ̇ in (3.5c) is positive for anti-clockwise rotations. Transferring these components into the inertial frame:
Ẋ = ẋ cos (φ) (3.6a)
Ẏ = ẋ sin (φ) (3.6b)
φ̇ = ϕ̇ (3.6c)
Putting the values of (3.5a)–(3.5c) into (3.6a)–(3.6c) and writing the result in matrix form, we get:

    [X]   [c wd ir cos(φ)   c wd il cos(φ)]
 d  [Y] = [c wd ir sin(φ)   c wd il sin(φ)] [ωr]
 dt [φ]   [c ir             −c il          ] [ωl]
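These kinematics can be evaluated numerically as below. This is a Python sketch; the wheel radius, half-width, and slip factors are placeholder values, not the platform's measured parameters.

```python
import math

def ugv_velocities(omega_r, omega_l, phi,
                   r=0.024, wd=0.05, i_r=1.0, i_l=1.0):
    """Inertial-frame velocities (3.6) from the track angular speeds,
    neglecting lateral slip (3.5).
    r: wheel radius [m], wd: half width [m] (placeholder values),
    i_r, i_l: (1 - slip) factors for right and left tracks."""
    c = r / (2.0 * wd)
    x_dot = c * wd * (omega_r * i_r + omega_l * i_l)   # body forward speed
    phi_dot = c * (omega_r * i_r - omega_l * i_l)      # yaw rate (CCW > 0)
    X_dot = x_dot * math.cos(phi)
    Y_dot = x_dot * math.sin(phi)
    return X_dot, Y_dot, phi_dot
```

With equal track speeds the yaw rate is zero and the UGV moves straight along its heading; a faster right track gives a positive (anti-clockwise) ϕ̇, matching (3.5c).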

3.1.2 Unmanned Aerial Vehicle (UAV)


The quadcopter body is presented in Figure 3-2, which shows the corresponding angular velocities, torques, and forces created by the four rotors.

Figure 3-2 Quadcopter Inertial frame and Body frame


The absolute linear position of the quadcopter is defined in the inertial frame I: {x, y, z} by the vector ξ. The angular position is defined in the inertial frame by three Euler angles (roll, pitch, and yaw), collected in the vector η. The pitch angle θ determines the rotation of the quadcopter around the y-axis, the roll angle ϕ determines the rotation around the x-axis, and the yaw angle ψ determines the rotation around the z-axis. Together, the linear and angular position vectors form the vector q.

ξ = [x, y, z]ᵀ,  η = [ϕ, θ, ψ]ᵀ

The origin of the body frame is at the center of mass of the quadcopter. In the body frame, the linear velocities are denoted by VM and the angular velocities by ν:

VM = [vx, vy, vz]ᵀ,  ν = [p, q, r]ᵀ

The rotation matrix from the body frame to the inertial frame is

    [Cψ Cθ    Cψ Sθ Sϕ − Sψ Cϕ    Cψ Sθ Cϕ + Sψ Sϕ]
R = [Sψ Cθ    Sψ Sθ Sϕ + Cψ Cϕ    Sψ Sθ Cϕ − Cψ Sϕ]
    [−Sθ      Cθ Sϕ               Cθ Cϕ           ]

where C and S denote the cosine and sine of the subscripted angle. The rotation matrix R is orthogonal, thus R⁻¹ = Rᵀ, which is the rotation matrix from the inertial frame to the body frame. The body-frame angular rates are related to the Euler-angle rates by ν = Wη η̇:

[p]   [1    0      −Sθ  ] [ϕ̇]
[q] = [0    Cϕ     Cθ Sϕ] [θ̇]
[r]   [0    −Sϕ    Cθ Cϕ] [ψ̇]
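Both matrices can be generated programmatically. The sketch below (Python, with our own helper names) builds R in the Z-Y-X Euler convention used above together with the Euler-rate map, and the orthogonality property R⁻¹ = Rᵀ can be checked numerically.

```python
import math

def rotation_matrix(phi, theta, psi):
    """Body-to-inertial rotation R for roll phi, pitch theta, yaw psi
    (Z-Y-X Euler convention)."""
    cf, sf = math.cos(phi), math.sin(phi)
    ct, st = math.cos(theta), math.sin(theta)
    cp, sp = math.cos(psi), math.sin(psi)
    return [
        [cp * ct, cp * st * sf - sp * cf, cp * st * cf + sp * sf],
        [sp * ct, sp * st * sf + cp * cf, sp * st * cf - cp * sf],
        [-st,     ct * sf,                ct * cf],
    ]

def euler_rate_matrix(phi, theta):
    """W_eta: maps Euler-angle rates to body angular rates [p, q, r]."""
    return [
        [1.0, 0.0,            -math.sin(theta)],
        [0.0, math.cos(phi),   math.cos(theta) * math.sin(phi)],
        [0.0, -math.sin(phi),  math.cos(theta) * math.cos(phi)],
    ]
```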


Wη is the transformation matrix for angular velocities from the inertial frame to the body frame. The quadcopter is assumed to have a symmetric structure with the four arms aligned with the body x- and y-axes. Thus, the inertia matrix is a diagonal matrix I with Ixx = Iyy:

    [Ixx  0    0  ]
I = [0    Iyy  0  ]
    [0    0    Izz]

The angular velocity of rotor i, denoted ωi, creates a force fi in the direction of the rotor axis. The angular velocity and acceleration of the rotor also create a torque TMi around the rotor axis:

fi = k ωi²,  TMi = b ωi² + IM ω̇i

in which k is the lift constant, b is the drag constant, and IM is the inertia moment of the rotor. The effect of ω̇i is usually considered small and is therefore omitted.

The combined forces of the rotors create the thrust T in the direction of the body z-axis, with the body-frame thrust vector TB; the torque vector τB consists of the torques below, in the directions of the corresponding body-frame angles:

T = Σᵢ fi = k Σᵢ ωi²,  TB = [0, 0, T]ᵀ

     [τϕ]   [l k (−ω2² + ω4²)]
τB = [τθ] = [l k (−ω1² + ω3²)]
     [τψ]   [Σᵢ TMi          ]

where the sums run over i = 1…4,

and l is the distance between the rotor and the center of mass of the quadcopter. Thus, the roll movement is obtained by decreasing the velocity of the 2nd rotor and increasing that of the 4th. Similarly, the pitch movement is obtained by decreasing the velocity of the 1st rotor and increasing that of the 3rd. The yaw movement is obtained by increasing the angular velocities of two opposite rotors and decreasing those of the other two; that is, the quadcopter adjusts its yaw by applying more thrust to the rotors rotating in one direction, as shown in Table 2.

Table 2: Quadcopter rotations

Condition | Rotors (velocity increment) | Rotors (velocity decrement) | Movement
1         | 4th                         | 2nd                         | Roll
2         | 3rd                         | 1st                         | Pitch
3         | 4th and 2nd                 | 1st and 3rd                 | Yaw
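The thrust and torque relations above, with the rotor-direction pattern of Table 2, can be sketched as follows. This is a Python illustration; the values of k, b, and l, and the yaw sign convention (rotors 1 and 3 spinning opposite to 2 and 4), are assumptions, not the 3DR Solo's actual constants.

```python
def thrust_and_torques(omega, k=1.0e-5, b=1.0e-6, l=0.23):
    """Total thrust T and body torques (roll, pitch, yaw) from the four
    rotor speeds omega = [w1, w2, w3, w4]. k, b, l are placeholder
    lift constant, drag constant, and arm length."""
    w1, w2, w3, w4 = omega
    T = k * (w1**2 + w2**2 + w3**2 + w4**2)
    t_phi = l * k * (-w2**2 + w4**2)                # roll: 4th up, 2nd down
    t_theta = l * k * (-w1**2 + w3**2)              # pitch: 3rd up, 1st down
    t_psi = b * (-w1**2 + w2**2 - w3**2 + w4**2)    # yaw: net drag torque
    return T, (t_phi, t_theta, t_psi)
```

Speeding up rotor 4 while slowing rotor 2 by the same amount produces a pure positive roll torque with no pitch torque, exactly as row 1 of Table 2 states.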

3.2 Sinkage Calculations of UGV


Sinkage is the measure of immersion of the track into the surface, whether rolling or static, when weight is applied to it:

Z = [P / (kc/b + kØ)]^(1/n) (3.7)

Where

Z = Sinkage in m

n = Sinkage exponent of the soil

b = Width of track


𝑘𝑐 = Cohesive modulus of soil deformation (N/m²)

𝑘Ø = Frictional modulus of soil deformation (N/m³)

P = Pressure

Values of n, kc and kØ are given in Appendix A.

Area = (width of track) x (radius of track wheel)

A = (0.044m) x (0.0240m)

A = 0.001056𝑚2

3.2.1 For two tracks:


A = 2 x (0.001056)

A = 0.002112 𝑚2

Load = Force = mg

F = (2.5kg) (9.81 m/s²)

F = 24.52 N

Ground pressure:

P = F / A = 24.52 / 0.002112

P = 11609.84 N/m²

3.2.2 For sand


𝑘𝑐 = 990 (N/m²)

𝑘Ø = 1528430 (N/m³)

n = 1.1

Putting the above values into Equation (3.7):


Z = [11609.84 (N/m²) / (990 (N/m²) / 0.044 m + 1528430 (N/m³))]^(1/1.1)

Z = 0.0116 m = 1.167 cm

3.2.3 For Heavy Clay


𝑘𝑐 = 12700 (N/m²)

𝑘Ø = 1555950 (N/m³)

n = 0.73

By putting these values in equation 3.7,

Z = [11609.84 (N/m²) / (12700 (N/m²) / 0.044 m + 1555950 (N/m³))]^(1/0.73)

Z = 9.65×10⁻⁴ m = 0.0965 cm

Since the sinkage depth comes out to be very small, i.e. 0.0965–1.167 cm for the different surfaces, a track of the above-mentioned width is appropriate.
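The two calculations above can be reproduced with a short script. This is a Python sketch of Equation (3.7) using the report's own numbers.

```python
def sinkage(load_n, width_m, contact_area_m2, n, k_c, k_phi):
    """Bekker sinkage, Equation (3.7): Z = [P / (k_c/b + k_phi)]^(1/n),
    with ground pressure P = load / contact area."""
    pressure = load_n / contact_area_m2
    return (pressure / (k_c / width_m + k_phi)) ** (1.0 / n)

# Parameters from the report: 2.5 kg platform on two
# 0.044 m x 0.024 m track contact patches.
AREA = 2 * 0.044 * 0.024        # 0.002112 m^2
LOAD = 2.5 * 9.81               # 24.52 N

z_sand = sinkage(LOAD, 0.044, AREA, n=1.1, k_c=990.0, k_phi=1528430.0)
z_clay = sinkage(LOAD, 0.044, AREA, n=0.73, k_c=12700.0, k_phi=1555950.0)
```

Running this reproduces the values quoted above: roughly 0.0116 m for sand and 9.65×10⁻⁴ m for heavy clay.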

CHAPTER 4 Experimental Setup and Implementation

4.1 Environment Building


The environment in which the UGV has to maneuver is built from thermocol sheets, as shown in Figure 4-1, so that the arena can be moved easily when needed. Obstacles in the environment are rectangular, whereas check points are represented by circular objects.

Figure 4-1 Arena construction


The UAV is programmed to reach an altitude of 60 feet after takeoff and fly over the arena to take image(s) of the complete environment, as shown in Figure 4-2. These images are then processed in MATLAB, applying various filters and techniques. The complete environment for experimentation, without obstacles and checkpoints, is shown in Figure 4-3.

Figure 4-2 Overhead camera image

Figure 4-3 Complete Arena


A complete overview of the working system, with the UAV hovering at a certain altitude while the UGV moves in the environment, can be seen in Figure 4-4.

Figure 4-4 Complete working system


4.2 Autonomous UAV flight planning


Mission Planner software (version 1.3.44) is used with the Pixhawk 2 controller embedded in the UAV, an advanced version of the ArduCopter flight controller. Using this flight planner, the UAV can be programmed according to our requirements, as shown in Figure 4-5.

Figure 4-5 Mission Planner for UAV flight planning

4.3 Single board computer on UAV


A Raspberry Pi is used because of its useful features: good processing power and built-in Wi-Fi and Bluetooth. Wi-Fi is used for the live video feedback, captured with the Raspberry Pi camera module, which has a resolution of 8 megapixels; the board is also used for onboard image processing on the UAV. Figure 4-6 shows the Raspberry Pi with the camera attached.

Figure 4-6 Raspberry Pi


4.4 Wireless communication between Robots


To accomplish wireless communication between the UGV and UAV, a portable Wi-Fi device and a wireless terminal unit are used, shown in Figure 4-7. In the first phase of communication, a live video feed is received from the UAV via Wi-Fi at the ground station, where image processing and the path-planning algorithm are applied. In the second phase, the 2D coordinates of the path are sent using the wireless terminal unit, which is built around NRF24L01 wireless modules. The range of this module is from 50 feet up to 750 meters at 1 Mbps. It was selected for its low cost and effective data transmission rates, ranging from 250 kbps to 2 Mbps.

Figure 4-7 Wireless terminal unit with Wi-Fi device

4.4.1 Wireless Data Transmission unit at ground station


The ground station comprises a laptop, the wireless transmission unit, and a Wi-Fi router, as shown in Figure 4-8.

Figure 4-8 Ground Station


The wireless transmission unit sends the 2D coordinates generated by the path-planning algorithm, applied in MATLAB to the processed image, from the ground station to the ground vehicle.
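The report does not specify the packet format used on the NRF24L01 link, whose payload is limited to 32 bytes per packet. A hypothetical serialization of the 2D path coordinates, sized to fit that limit, might look like the following Python sketch; every field choice here is our assumption.

```python
import struct

# Hypothetical packet layout: little-endian uint16 sequence number,
# uint8 waypoint count, then up to 7 (x, y) int16 pairs.
# 3 + 7 * 4 = 31 bytes, within the NRF24L01's 32-byte payload limit.
MAX_PAIRS = 7

def pack_waypoints(seq, waypoints):
    """Serialize a sequence number and a list of (x, y) waypoints."""
    if len(waypoints) > MAX_PAIRS:
        raise ValueError("too many waypoints for one 32-byte payload")
    payload = struct.pack("<HB", seq, len(waypoints))
    for x, y in waypoints:
        payload += struct.pack("<hh", x, y)
    return payload

def unpack_waypoints(payload):
    """Recover the sequence number and waypoint list on the UGV side."""
    seq, count = struct.unpack_from("<HB", payload, 0)
    pts = [struct.unpack_from("<hh", payload, 3 + 4 * i)
           for i in range(count)]
    return seq, pts
```

Longer paths would simply be split across several sequenced packets, with the sequence number letting the UGV detect lost or reordered frames.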

CHAPTER 5 Discussion and Conclusion

5.1 Use of Image Processing


Instead of moving through the complete area to gather information about the environment, a better approach is developed to acquire the information necessary to plan the path for the UGV to maneuver in the environment. This decreases the time taken and increases the overall efficiency of the complete process. With a few further enhancements, the overhead camera could also provide real-time navigation. Experiments showed that the path constructed from the acquired image depends on the kind of environment to which the process is applied, as well as on the preprocessing stages.

Sometimes, after image processing is applied to the environment, a few extra traces may appear in the output, which can mislead the ground vehicle. To minimize this effect, an area threshold is set and erosion and dilation filters are applied, as shown in Figure 5-1. These enhance the specific regions of the image, and safety is improved when the path is generated. The filter parameters must also be set, and their values depend on the image quality and on the distance or height from which the image is captured.

Figure 5-1 Processed Image
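The erosion and dilation cleanup can be sketched as a morphological opening with a 3×3 structuring element. The project applied these filters in MATLAB, so the Python version below is only an illustrative equivalent on a binary image stored as a list of lists.

```python
def _neighbourhood(img, r, c):
    """Yield the values in the 3x3 window around (r, c), clipped at the
    image border."""
    rows, cols = len(img), len(img[0])
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                yield img[nr][nc]

def erode(img):
    """3x3 erosion: keep a pixel only if its whole neighbourhood is set."""
    return [[1 if all(v == 1 for v in _neighbourhood(img, r, c)) else 0
             for c in range(len(img[0]))] for r in range(len(img))]

def dilate(img):
    """3x3 dilation: set a pixel if anything in its neighbourhood is set."""
    return [[1 if any(v == 1 for v in _neighbourhood(img, r, c)) else 0
             for c in range(len(img[0]))] for r in range(len(img))]

def opening(img):
    """Erosion followed by dilation: removes small stray traces while
    roughly preserving larger regions."""
    return dilate(erode(img))
```

An isolated one-pixel trace disappears under the opening, while a solid block of the size of a real obstacle survives, which is exactly the cleanup effect described above.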


5.2 Path Planning Algorithms


Planning the UGV's movement is a key requirement for the motion of a ground robot in any unknown environment. The UGV has to plan all its actions to perform the given tasks and pass through must-go areas while ensuring collision-free movement in the considered environment. The resultant path coordinates (x, y) are generated after the algorithm is applied. These waypoints are sent to the UGV wirelessly using the transmission unit, and the UGV navigates safely along these coordinates to reach the goal position. The result for a real environment is shown in Figure 5-2.

Figure 5-2 Applied Algorithm

5.3 Conclusion
A comparative simulation study of the length and efficiency of the paths obtained with different algorithms, i.e. A*, Random Particle Optimization, Flood Fill, and Probabilistic Road Map, was also performed.

The most suitable algorithm for a maze-like environment is the Flood Fill algorithm, but for our simple environment the Probabilistic Road Map (PRM) algorithm was equally efficient. The PRM algorithm applied to our environment is shown in Figure 5-3: the image on the left shows the optimal path from the start location to the check point, and the image on the right shows the optimal path from the check point to the final destination.

Figure 5-3 Final Image after Algorithms

40
UGV ASSIST ANCE ROBOT CAPABLE OF MANEUVERI NG THROUGH CLUTTERED
ENVIRO NMENT BY MEANS OF A UAV

5.4 Future recommendations


In the future, the collaboration between the UAV and UGV can be enhanced by applying online video-processing techniques to unknown environments. Video processing can be performed on a single-board computer mounted on the UAV. This will be a more realistic approach for dealing with real-life dynamic environments.

