
Robotics for Disaster Management

Yasser Nihal Siddiqui, Yawar Jung and Ankit Gureja

9/11/2010
Acknowledgement

We would like to thank the IEEE HTN and the IEEE Bangalore section for pro-
viding us with this opportunity and giving us the financial support required
for making this project a success. We would also like to thank our project
supervisor Dr. M.T. Beg for giving us an opportunity to work under him and
for guiding us in our endeavours. His constant motivation has been a driving
force in the completion of our work.

Yawar Jung
(Team Leader, Membership Number: 90885704)
Md. Yasser Nihal Siddiqui
Date: 11/09/2010

Contents

1 Introduction 4
1.1 Challenges of Disaster Response . . . . . . . . . . . . . . . . . 4
1.2 Robotics in Disaster Management . . . . . . . . . . . . . . . 5
1.3 Objective of the Project . . . . . . . . . . . . . . . . . . . . . 6

2 Preliminaries 7
2.1 Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.1.1 Top 10 Natural Disasters Reported Affected . . . . . . 8
2.1.2 Statistics per Event . . . . . . . . . . . . . . . . . . . 9
2.1.3 Statistics By Disasters Type . . . . . . . . . . . . . . 11
2.2 Contemporary Solutions . . . . . . . . . . . . . . . . . . . . . 12

3 The Prototype 13
3.1 Chassis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
3.2 Driving Mechanism . . . . . . . . . . . . . . . . . . . . . . . . 14
3.3 Communication Module . . . . . . . . . . . . . . . . . . . . . 15
3.4 The Camera Module . . . . . . . . . . . . . . . . . . . . . . . 17
3.5 The Image Processing System . . . . . . . . . . . . . . . . . . 18
3.6 Gas Sensor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
3.7 Robot Controller . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.8 Power Circuit . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

A Appendix 21
A.1 Software used . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
A.1.1 Main File of AVRCAM on ATmega8 Microcontroller . 22
A.1.2 PCB design . . . . . . . . . . . . . . . . . . . . . . . . 23

List of Figures

2.1 (a) Percentage of Reported people killed by disaster type (b) Percentage
    of Reported people affected by disaster type . . . . . . . . . . . . . 11
2.2 Economic Damage . . . . . . . . . . . . . . . . . . . . . . . . 11

3.1 L298 IC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.2 Encoder Decoder Pair . . . . . . . . . . . . . . . . . . . . . . 16
3.3 C3088 Camera Module . . . . . . . . . . . . . . . . . . . . . . 17
3.4 Sunrom’s Combustible gas sensor . . . . . . . . . . . . . . . . 18
3.5 Voltage Regulator . . . . . . . . . . . . . . . . . . . . . . . . . 19

A.1 PCB for AVRcam . . . . . . . . . . . . . . . . . . . . . . . . . 23

Chapter 1

Introduction

Disasters may strike at any time and in any place, and can result in many
casualties. Disasters can be natural (floods, earthquakes, storms, etc.) or
man-made (terrorist attacks, sabotage, minefields blowing up, chemical or
nuclear leaks, etc.). Since 1980, the World Bank has approved more than 500
operations related to disaster management, amounting to more than US$40
billion [1]. Disaster management primarily comprises, but is not limited to [2]:
• Preparing for disaster before it occurs
• Disaster response (emergency evacuation, quarantine, mass decontamination, etc.)
• Supporting and rebuilding society after disasters have occurred.
Disaster-struck places are often not easily accessible and are hazardous even
to the disaster relief forces. In the process of disaster response, the relief
personnel themselves are exposed to many dangers. Analysis shows that
robotics technology can provide an economically and technically viable solution
for the disaster response phase of disaster management. The proposed solution
is a robot, or unmanned ground vehicle, which can relieve the disaster relief
forces of the tasks of surveillance and reconnaissance of the disaster-afflicted
area.

1.1 Challenges of Disaster Response


Listing the challenges of disaster response is virtually an impossible task. A
few of them are listed below:

• The occurrence of a disaster, whether natural or man-made, cannot be
satisfactorily predicted

• Disaster-afflicted areas are not easily accessible and are hazardous for
both the rescuer and the rescued

• Disaster response cannot always be implemented in a timely manner

• Disaster relief equipment is not affordable for every country; it is often
donated by others or procured only at the time of an emergency, which
causes delays

• Only a few pieces of relief equipment are available that can be reused
repeatedly, even in multifaceted situations

1.2 Robotics in Disaster Management


One of the first steps in disaster response is the collection of information
from the disaster afflicted areas to further act upon. Surveillance and recon-
naissance is the area of focus in this project. Thus the robot is primarily
expected to provide surveillance and reconnaissance support to disaster re-
sponse teams. The robot is expected to have the following features:
• Ability to traverse rough terrain (a combination of four-wheel drive and
belt drive mechanisms)

• The robot should have the capability to be remotely operated and
controlled by an operator

• Ability to provide live video transmission to the remote operator

• Detection of hazardous gases in the afflicted area

• Ability to operate in nuclear-contaminated areas and detection of any
dangerous levels of nuclear radiation present in the environment

• The robot should be able to transmit the collected data to the remote
operator

• The robot should have the capability to operate independently (in other
words, the robot should have on-board image processing capabilities)

• It should be easy to repair the robot and replace parts in case of a
breakdown

• Lastly, it should be a low-cost solution to the problem

A particular challenge concerning the project is that the use of robotics in
disaster management is a relatively new concept, and thus no previous models
are available to build on. A prototype with reduced capabilities has been
developed. The developed prototype aims to show the feasibility of a low-cost
robot with the aforementioned capabilities.

1.3 Objective of the Project


The objective of this project is to create a surveillance and reconnaissance
robot for disaster management. The proposed robot is designed to provide
the disaster relief forces with primary information from ground zero at a
disaster-struck place. After a disaster, be it an earthquake, oil spill, fire,
or nuclear or gas leakage, the disaster relief forces themselves are exposed to
hazardous environments in the process of disaster response. In this process we
expect the robot to perform with speed and precision and, above all, prevent
the exposure of the disaster relief force to danger. The robot can, with little
or no modification, be used for low-intensity security patrolling requirements.
Chapter 2

Preliminaries

Since 1980, the World Bank has approved more than 500 operations related
to disaster management, amounting to more than US$40 billion [1]. Huge
amounts of money are spent by the Government of India on rescue operations,
which usually involve the deployment of the army for the rescue work. The
obvious advantage of our solution lies in the fact that robots can go where
humans cannot.
India - Disaster Statistics: data related to human and economic losses from
disasters that occurred between 1980 and 2008 are given below.

No of events: 395
No of people killed: 139,393
Average killed per year: 4,807
No of people affected: 1,506,794,740
Average affected per year: 51,958,439
Economic Damage (US$ X 1,000): 45,184,830
Economic Damage per year (US$ X 1,000): 1,558,098

Table 2.1: Overview



2.1 Statistics
2.1.1 Top 10 Natural Disasters Reported Affected

Top 10 disasters by number of people affected:

Disaster       Date    People Affected
Drought        1987    300,000,000
Drought        2002    300,000,000
Flood          1993    128,000,000
Drought        1982    100,000,000
Drought        2000    50,000,000
Flood          2002    42,000,000
Flood          1982    33,500,000
Flood          2004    33,000,000
Flood          1995    32,704,000
Flood          1980    30,000,023

Top 10 disasters by number of people killed:

Disaster       Date    People Killed
Earthquake*    2001    20,005
Earthquake*    2004    16,389
Storm          1999    9,843
Earthquake*    1993    9,748
Epidemic       1984    3,290
Epidemic       1988    3,000
Storm          1998    2,871
Extreme temp.  1998    2,541
Flood          1994    2,001
Flood          1998    1,811

Top 10 disasters by economic damages (US$ X 1,000):

Disaster       Date    Economic Damages
Flood          1993    7,000,000
Flood          2006    3,390,000
Flood          2005    3,330,000
Earthquake*    2001    2,623,000
Storm          1999    2,500,000
Flood          2004    2,500,000
Flood          2005    2,300,000
Storm          1990    2,200,000
Storm          1996    1,500,300
Earthquake*    2004    1,022,800

2.1.2 Statistics per Event


People killed per event:

Drought: 53.33
Earthquake*: 3,108.19
Epidemic: 274.78
Extreme temp.: 320.56
Flood: 225.01
Insect infestation: ...
Mass mov. dry: 45.00
Mass mov. wet: 90.26
Storm: 279.99
Wildfire: 3.00

People affected per event:

Drought: 125,195,833.33
Earthquake*: 1,741,700.25
Epidemic: 7,205.18
Extreme temp.: 6.62
Flood: 4,005,617.70
Insect infestation: ...
Mass mov. dry: ...
Mass mov. wet: 123,677.55
Volcano: ...
Storm: 624,422.81

Economic damages per event (US$ X 1,000):

Drought: 340,187.00
Earthquake*: 318,893.75
Epidemic: ...
Extreme temp.: 16,000.00
Flood: 164,938.27
Insect infestation: ...
Mass mov. dry: ...
Mass mov. wet: 1,758.06
Volcano: ...
Storm: 120,139.25
Wildfire: 1,000.00

2.1.3 Statistics By Disasters Type

Figure 2.1: (a) Percentage of Reported people killed by disaster type; (b)
Percentage of Reported people affected by disaster type

Figure 2.2: Economic Damage

*: Including tsunami
The above statistics show how serious the problem of natural disasters in
India is, let alone the man-made ones. It has been observed that during such
calamitous times manpower alone is not enough; this is where robots come in.

2.2 Contemporary Solutions


As far as this solution is concerned, there is no comparable contemporary or
past solution. The disaster management forces have always relied heavily on
manpower. The only instance of machines being used in disaster response is
the use of JCB machines, which entirely defeats the purpose of a safe rescue.
Solution proposed and its advantages
One of the first steps in disaster response is the collection of information
from the disaster-afflicted areas to further act upon. Surveillance and recon-
naissance is the area of focus in this project. Thus the robot is primarily
expected to provide surveillance and reconnaissance support to disaster
response teams. The robot is expected to have the following features:

• Ability to traverse rough terrain (a combination of four-wheel drive and
belt drive mechanisms)

• The robot should have the capability to be remotely operated and
controlled by an operator

• Ability to provide live video transmission to the remote operator

• Detection of hazardous gases in the afflicted area

• Ability to operate in nuclear-contaminated areas and detection of any
dangerous levels of nuclear radiation present in the environment

• The robot should be able to transmit the collected data to the remote
operator

• The robot should have the capability to operate independently (in other
words, the robot should have on-board image processing capabilities)

• It should be easy to repair the robot and replace parts in case of a
breakdown

• Lastly, it should be a low-cost solution to the problem


Chapter 3

The Prototype

The prototype has reduced capabilities that can be enhanced to achieve a
robot with the capabilities listed previously. In this report the robot is
described module by module. The robot's hardware can be divided into the
following sections/categories:
1. Chassis
2. Drive mechanism
3. Communication Platform
4. Camera Module
5. Image Processing system (AVRcam)
6. Sensors Used
7. Robot Controller
8. Power Circuit

3.1 Chassis
Aluminium was chosen for the chassis as it is lightweight and easily workable
for this purpose. The body is thus resistant to rust and other atmospheric
corrosion. The chassis is in the form of a skeleton made from L-shaped
aluminium angles. The cross-section of the angle is 19 mm x 38 mm. The
chassis itself has a rectangular shape with dimensions of 420 mm x 300 mm.
The holes for mounting the motors are spaced 337 mm apart.


3.2 Driving Mechanism


A combination of four-wheel drive and a belt drive mechanism has been used.
The robot can use either of the two mechanisms of locomotion interchangeably,
depending on the operational requirements. To change from one mechanism of
locomotion to the other, human intervention is required.

Figure 3.1: L298 IC

The robot utilizes geared DC motors on each wheel for improved performance
and to meet the horsepower requirements of various operations. Large-radius
tires and tracks allow the robot to work on various terrains. The motors of
the robot are driven by the L298 dual full-bridge driver. The L298 is an
integrated monolithic circuit in 15-lead Multiwatt and PowerSO20 packages. It
is a high-voltage, high-current dual full-bridge driver designed to accept
standard TTL logic levels and drive inductive loads such as relays, solenoids,
DC and stepping motors. Two enable inputs are provided to enable or disable
the device independently of the input signals. The emitters of the lower
transistors of each bridge are connected together and the corresponding
external terminal can be used for the connection of an external sensing
resistor. An additional supply input is provided so that the logic works at
a lower voltage. The IC is capable of driving currents of up to 2 A per
channel at a maximum of 46 V [3]. The IC is shown in Figure 3.1. The motor
driver contains four half-bridges, of which two (one full bridge) are needed
to drive a single motor. The functioning of a single bridge is shown in
Table 3.1.
The L298 circuit also uses a bridge of eight 1N4007 diodes, which prevent the
back EMF of the motors from damaging the motor driver. The current output is
controlled through the resistors between the current-sense pins and ground.

Inputs                      Function
Ven = H    C = H, D = L     Forward
           C = L, D = H     Reverse
           C = D            Fast motor stop
Ven = L    C = X, D = X     Free running motor stop

Table 3.1: Function of a single bridge of the L298
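
As an illustration of the control logic in Table 3.1, the sketch below drives
one bridge of the L298 from an AVR microcontroller. The port and pin
assignments (ENABLE_PIN, C_PIN, D_PIN on PORTB) and the clock value are
assumptions made for the example only, not the actual wiring of the prototype.

/* Minimal sketch: driving one L298 bridge from an AVR.
 * Pin assignments and clock below are hypothetical; adapt to the real wiring. */
#define F_CPU 1000000UL          /* assumed clock, needed by the delay macro */
#include <avr/io.h>
#include <util/delay.h>

#define ENABLE_PIN PB0           /* L298 enable input (Ven) */
#define C_PIN      PB1           /* L298 input C */
#define D_PIN      PB2           /* L298 input D */

static void motor_forward(void)
{
    PORTB |= _BV(ENABLE_PIN) | _BV(C_PIN);   /* Ven = H, C = H */
    PORTB &= ~_BV(D_PIN);                    /* D = L -> forward */
}

static void motor_reverse(void)
{
    PORTB |= _BV(ENABLE_PIN) | _BV(D_PIN);   /* Ven = H, D = H */
    PORTB &= ~_BV(C_PIN);                    /* C = L -> reverse */
}

static void motor_fast_stop(void)
{
    PORTB |= _BV(ENABLE_PIN);                /* Ven = H */
    PORTB &= ~(_BV(C_PIN) | _BV(D_PIN));     /* C = D -> fast motor stop */
}

static void motor_free_run(void)
{
    PORTB &= ~_BV(ENABLE_PIN);               /* Ven = L -> free running stop */
}

int main(void)
{
    DDRB |= _BV(ENABLE_PIN) | _BV(C_PIN) | _BV(D_PIN);  /* pins as outputs */
    motor_forward();
    _delay_ms(1000);
    motor_fast_stop();
    motor_free_run();
    for (;;)
        ;
    return 0;
}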

3.3 Communication Module


The robot is wirelessly remote controlled. Since the transmission of images
requires a high data rate, the robot establishes communication with other
systems using CC2500-based transceiver modules produced by Texas Instruments.
The camera module has a baud rate of 115.2 kbps, which is not achievable
using standard 434/315 MHz ASK-modulated transmitter-receiver modules, whose
maximum data rate is 9600 bps.
The CC2500 is a low-cost 2.4 GHz transceiver designed for very low-power
wireless applications. The circuit is intended for the 2400-2483.5 MHz ISM
(Industrial, Scientific and Medical) and SRD (Short Range Device) frequency
band. The RF transceiver is integrated with a highly configurable baseband
modem. The modem supports various modulation formats and has a configurable
data rate of up to 500 kBaud. The CC2500 provides extensive hardware support
for packet handling, data buffering, burst transmissions, clear channel
assessment, link quality indication, and wake-on-radio. Its data stream can
be Manchester coded by the modulator and decoded by the demodulator. It
offers high performance while remaining easy to design with. The main
operating parameters and the 64-byte transmit/receive FIFOs of the CC2500 can
be controlled via an SPI interface. In a typical system, the CC2500 is used
together with a microcontroller and a few additional passive components.
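
The sketch below shows how such an SPI register write and command strobe
might look from an ATmega8. The strobe and register addresses used (SRES,
CHANNR) follow the CC2500 register map but should be verified against the
datasheet, and the chip-ready handshake on MISO after asserting chip select
is omitted for brevity; this is an illustrative sketch, not the prototype's
actual configuration sequence.

/* Minimal sketch of writing one CC2500 register over SPI from an ATmega8.
 * Register/strobe values and the channel number are for illustration only. */
#include <avr/io.h>
#include <stdint.h>

#define CC2500_SRES   0x30   /* reset command strobe */
#define CC2500_CHANNR 0x0A   /* channel number register (example) */

static void spi_init(void)
{
    /* SS = PB2, MOSI = PB3, SCK = PB5 as outputs; MISO = PB4 stays an input */
    DDRB |= _BV(PB2) | _BV(PB3) | _BV(PB5);
    PORTB |= _BV(PB2);                        /* deselect the CC2500 */
    SPCR = _BV(SPE) | _BV(MSTR) | _BV(SPR0);  /* SPI master, F_CPU/16 */
}

static uint8_t spi_transfer(uint8_t byte)
{
    SPDR = byte;
    while (!(SPSR & _BV(SPIF)))
        ;                                     /* wait for the transfer to finish */
    return SPDR;
}

static void cc2500_strobe(uint8_t strobe)
{
    PORTB &= ~_BV(PB2);                       /* select the chip */
    spi_transfer(strobe);
    PORTB |= _BV(PB2);                        /* deselect */
}

static void cc2500_write_reg(uint8_t addr, uint8_t value)
{
    PORTB &= ~_BV(PB2);
    spi_transfer(addr);                       /* header byte: single register write */
    spi_transfer(value);
    PORTB |= _BV(PB2);
}

int main(void)
{
    spi_init();
    cc2500_strobe(CC2500_SRES);               /* reset the transceiver */
    cc2500_write_reg(CC2500_CHANNR, 0x05);    /* example: select RF channel 5 */
    for (;;)
        ;
    return 0;
}
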
Since digital data is transmitted serially, an encoder and decoder are
required for the transmission of the individual bits. A minimum of four
channels is required just to drive the robot; one more is required for the
sensor data traffic and another for the transmission of images from the
camera. Two encoder-decoder pairs are required since two transmitter/receiver
modules are used. The encoder-decoder pair used is the HT12E/HT12D, a
four-channel encoder/decoder. These devices handle 12 bits in total: 8
address bits and 4 data bits. The circuit connections of the HT12E/HT12D are
shown in Figure 3.2.

Figure 3.2: Encoder Decoder Pair

The encoder begins a 4-word transmission cycle upon receipt of a transmission
enable (TE for the HT12E, active low). This cycle repeats itself as long as
the transmission enable (TE) is held low. Once the transmission enable
returns high, the encoder output completes its final cycle and then stops.
The decoders receive the data transmitted by an encoder and interpret the
first N bits of each code period as address bits and the last 12-N bits as
data bits, where N is the address code number. A signal on the DIN pin
activates the oscillator, which in turn decodes the incoming address and
data. The decoder then checks the received address three times continuously.
If the received address codes all match the contents of the decoder's local
address, the 12-N bits of data are decoded to activate the output pins and
the VT pin is set high to indicate a valid transmission. This lasts unless
the address code is incorrect or no signal is received. The output of the
VT pin is high only when the transmission is valid; otherwise it is always low.
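
To make the address/data split concrete, the sketch below packs and unpacks a
12-bit HT12-style word in C, assuming N = 8 address bits and 4 data bits as
used by the HT12E/HT12D. The helper names are hypothetical and this only
models the framing and address check, not the actual serial waveform or
timing of the devices.

/* Model of the HT12E/HT12D 12-bit word: 8 address bits followed by 4 data bits. */
#include <stdint.h>
#include <stdio.h>

#define HT12_DATA_BITS 4           /* 12 - N data bits, with N = 8 address bits */

/* Pack an 8-bit address and a 4-bit data nibble into one 12-bit word. */
static uint16_t ht12_pack(uint8_t address, uint8_t data)
{
    return ((uint16_t)address << HT12_DATA_BITS) | (data & 0x0F);
}

/* Model of the decoder: accept the word only if the address matches the
 * locally configured address, as the HT12D does in hardware. */
static int ht12_decode(uint16_t word, uint8_t local_address, uint8_t *data_out)
{
    uint8_t address = (uint8_t)(word >> HT12_DATA_BITS);
    if (address != local_address)
        return 0;                  /* address mismatch: no valid transmission */
    *data_out = word & 0x0F;
    return 1;                      /* equivalent of the VT pin going high */
}

int main(void)
{
    uint8_t data;
    uint16_t word = ht12_pack(0xA5, 0x3);       /* example address and data */
    if (ht12_decode(word, 0xA5, &data))
        printf("valid transmission, data = 0x%X\n", data);
    return 0;
}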

3.4 The Camera Module


The camera module used in our machine is the C3088 camera module. It uses
Omnivision's OV6620 CMOS sensor. The camera module works at a native clock
of 17.72 MHz. It is able to produce an image of 144 x 82 pixels, which is
just small enough to be handled by an 8-bit microcontroller; hence the module
is apt for low-cost applications. The camera module is reported to provide
30 frames per second when the complete camera control system is connected to
a computer using a serial port. However, the efficiency of the camera is yet
to be tested on the wireless platform. The camera module has 32 pins and
communication with the sensor can be made through the I2C interface.
The I2C interface was developed by Philips for communication between
different ICs in a system. I2C allows for multi-master/slave configurations.
I2C is a Two Wire Interface (TWI) which uses the SCL (serial clock) and SDA
(serial data) pins of an IC. Through the use of I2C, the registers of the
camera module can be accessed and the module can be directed to perform
specific actions. Thus this module can also be used for object tracking.
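
As a sketch of how a camera register might be written over I2C using the
ATmega8's TWI peripheral, the fragment below sends one register write to the
sensor. The slave address (0xC0), the register/value pair and the bus speed
are assumptions for illustration; the actual addresses and the full
configuration sequence should be taken from the OV6620 datasheet and the
AVRcam source.

/* Minimal sketch: one I2C (TWI) register write from an ATmega8 master.
 * Addresses and values below are illustrative assumptions. */
#include <avr/io.h>
#include <stdint.h>

#define OV6620_WRITE_ADDR 0xC0     /* assumed I2C write address of the sensor */

static void twi_init(void)
{
    TWSR = 0x00;                   /* prescaler = 1 */
    TWBR = 72;                     /* ~100 kHz SCL for a 16 MHz clock (adjust to F_CPU) */
}

static void twi_start(void)
{
    TWCR = _BV(TWINT) | _BV(TWSTA) | _BV(TWEN);
    while (!(TWCR & _BV(TWINT)))
        ;                          /* wait for the START condition to go out */
}

static void twi_write(uint8_t byte)
{
    TWDR = byte;
    TWCR = _BV(TWINT) | _BV(TWEN);
    while (!(TWCR & _BV(TWINT)))
        ;                          /* wait for the byte to be shifted out */
}

static void twi_stop(void)
{
    TWCR = _BV(TWINT) | _BV(TWSTO) | _BV(TWEN);
}

/* Write one 8-bit value to one camera register. */
static void ov6620_write_reg(uint8_t reg, uint8_t value)
{
    twi_start();
    twi_write(OV6620_WRITE_ADDR);  /* slave address + write bit */
    twi_write(reg);                /* register (sub)address */
    twi_write(value);
    twi_stop();
}

int main(void)
{
    twi_init();
    ov6620_write_reg(0x12, 0x24);  /* example register/value, illustration only */
    for (;;)
        ;
    return 0;
}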

Figure 3.3: C3088 Camera Module



3.5 The Image Processing System


The image processing system consists of two microcontrollers and the camera
module. Besides these, an RS-232 level shifter is used which converts the
data from the microcontroller's TTL levels to RS-232 levels so that
communication with a PC can be established.
The first microcontroller is an ATtiny12. Its sole purpose in this system is
to establish initial communication between the camera module and the main
microcontroller. First, it sends instructions to the camera module to route
its clock to the ATmega8 for synchronised operation, then it configures the
I2C bus so that the mega8 is the master and the C3088 is the slave. Due to
pin limitations on the ATmega8, the programming pins are shared with the UV
bus of the C3088; thus another function of the tiny12 is to instruct the
camera module to tri-state its UV bus on system start-up, which allows the
programming of the mega8 to be changed during this time. After this the
microcontroller waits forever in an infinite loop.
The heart of the system is the mega8 microcontroller. The mega8 is used here
to convert the data so that it can be read by a computer system, thereby
acting as a signal processor. The microcontroller directs the OV6620 to
output the image in QCIF format, thereby reducing the image resolution. The
reduced image can then be processed by the mega8.
The image processing system used is called AVRcam.
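
The serial link to the PC mentioned above is an ordinary AVR UART running at
the camera's 115.2 kbps rate. The sketch below shows a minimal UART setup and
single-byte transmit on the ATmega8; the 7.3728 MHz clock is an assumption
chosen so that 115200 baud divides exactly, and the AVRcam firmware derives
its own baud setting from the camera clock instead.

/* Minimal sketch: ATmega8 UART at 115200 baud sending one byte to the PC.
 * The clock frequency is an assumption for the example. */
#define F_CPU 7372800UL            /* assumed crystal, gives 0% baud error */
#include <avr/io.h>
#include <stdint.h>

static void uart_init(void)
{
    uint16_t ubrr = (F_CPU / (16UL * 115200UL)) - 1;   /* = 3 for 7.3728 MHz */
    UBRRH = (uint8_t)(ubrr >> 8);
    UBRRL = (uint8_t)ubrr;
    UCSRB = _BV(RXEN) | _BV(TXEN);                     /* enable receiver and transmitter */
    UCSRC = _BV(URSEL) | _BV(UCSZ1) | _BV(UCSZ0);      /* 8 data bits, 1 stop bit */
}

static void uart_send(uint8_t byte)
{
    while (!(UCSRA & _BV(UDRE)))
        ;                          /* wait until the transmit buffer is empty */
    UDR = byte;
}

int main(void)
{
    uart_init();
    uart_send('A');                /* example: send one byte towards the PC */
    for (;;)
        ;
    return 0;
}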

3.6 Gas Sensor

Figure 3.4: Sunrom’s Combustible gas sensor



Although many sensors could be used, to show the feasibility of our project
we have fitted only a combustible gas sensor. To demonstrate the
communication and the detection purpose of this robot, a combustible gas
sensor has been selected. The gas sensor used, manufactured by Sunrom
Technologies, can detect combustible gases in the atmosphere from 100 ppm to
10,000 ppm. The simple analog output of the gas sensor can be connected
either to a microcontroller's analog-to-digital converter or to an alarm
system. The implementation here is for demonstration purposes only: the gas
sensor sends its signal back to the receiver, where it can be processed to
obtain the concentration of combustible gas in air. The sensor is not
sensitive to alcohol in air.
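
A minimal sketch of reading such an analog gas-sensor output with the
ATmega8's ADC is shown below. The choice of channel ADC0, the alarm output
pin and the threshold value are assumptions for illustration, and converting
the raw reading into a ppm figure would additionally require the sensor's
calibration curve.

/* Minimal sketch: polling an analog gas sensor on ADC0 of an ATmega8.
 * Channel, alarm pin and threshold are illustrative assumptions. */
#include <avr/io.h>
#include <stdint.h>

#define GAS_ADC_CHANNEL 0      /* assumed: sensor output wired to ADC0 (PC0) */
#define GAS_ALARM_LEVEL 512    /* arbitrary demonstration threshold (0-1023) */

static void adc_init(void)
{
    ADMUX = _BV(REFS0) | GAS_ADC_CHANNEL;          /* AVcc reference, channel 0 */
    ADCSRA = _BV(ADEN) | _BV(ADPS2) | _BV(ADPS1);  /* enable ADC, clock/64 */
}

static uint16_t adc_read(void)
{
    ADCSRA |= _BV(ADSC);                           /* start a single conversion */
    while (ADCSRA & _BV(ADSC))
        ;                                          /* wait until it completes */
    return ADC;                                    /* 10-bit result */
}

int main(void)
{
    adc_init();
    DDRB |= _BV(PB0);                              /* hypothetical alarm output pin */
    for (;;) {
        if (adc_read() > GAS_ALARM_LEVEL)
            PORTB |= _BV(PB0);                     /* raise alarm / flag for transmission */
        else
            PORTB &= ~_BV(PB0);
    }
    return 0;
}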

3.7 Robot Controller


The robot is controlled by the use of a self-designed controller board. The
controller board is used to control the movement of the robot. Besides the
movement control, the data received from the robot is handled by a
microcontroller which displays the sensor readings.
The remote navigation of the robot can also be done through the use of a
personal computer. The images captured by the robot are sent to a computer
via a CC2500 module connected to the computer's serial port. The application
software AVRcamVIEW is used to interface the microcontroller and the computer.

3.8 Power Circuit

Figure 3.5: Voltage Regulator



Power sources that can be used are Ni-Cd or SLA batteries, or 9 V dry cells.
Since the circuits run on a 5 V supply, we use a voltage regulator circuit to
obtain a constant supply.
A 7805 voltage regulator is used to obtain a +5 V output from a higher
voltage supply (7.5 V to 20 V). We use an adapter's supply to generate the
+5 V here. Connect the GND and +12 V of the adapter to the pins as shown and
the +5 V output is taken directly from the third pin. Current of up to 0.5 A
can be drawn from this regulator without any significant fall in voltage
level. The regulator circuit is shown in Figure 3.5; we use two capacitors of
0.1 uF and 1 uF to filter noise at the input and output of the regulator.
Appendix A

Appendix

A.1 Software used


The camera hardware and the computer can be interfaced by the AVRcamVIEW
software (version 1.2). This application was developed by John Royce Orlando.
AVRcamVIEW provides the following capabilities:
• Take full-colour snapshots (176 x 144 pixels) with the system and display
the images (both raw Bayer data and interpolated colour data).
• Easily create a Colour Map of colours to track based on a snapshot (just
click on the colours of interest and add them to the Colour Map).
• Adjust the precision of each tracked colour (i.e. provide a range of
acceptable R-G-B values for each colour), allowing the user to adjust the
Colour Map to the surrounding environment.
• Display the real-time tracking results of each tracked object (with colour
and bounding-box information).
• Record a tracking session for playback at a later time.
• Test the system on multiple OS platforms that support Java 1.5 (both
Windows and Linux are currently supported).
The other software used were WinAVR, to compile the source code and burn the
hex files onto the microcontroller, and DipTrace, for designing the printed
circuit board on which the C3088 camera module is mounted.


A.1.1 Main File of AVRcam on the ATmega8 Microcontroller
/* Module Name: Main.c
   Module Date: 24/8/2010
   Description: This module is responsible for providing the entry point to
   the code through the "main" function. */

/* Includes */
#include <avr/io.h>
#include <stdlib.h>
#include <string.h>
#include "UIMgr.h"
#include "UartInterface.h"
#include "I2CInterface.h"
#include "CamInterface.h"
#include "DebugInterface.h"
#include "FrameMgr.h"
#include "CommonDefs.h"
#include "CamConfig.h"
#include "Executive.h"
#include "Utility.h"

/* Local Structures and Typedefs */
/* Extern Variables */
/* Definitions */

/* Function Name: main
   Function Description: This function provides the entry point into the
   AVRcam application.
   Inputs: none
   Outputs: int */
int main(void)
{
    /* initialize all of the interface modules */
    DebugInt_init();
    UartInt_init();
    I2CInt_init();
    CamInt_init();

    /* initialize the remaining modules that will process data...
       interrupts need to be on for these */
    ENABLE_INTS();
    CamConfig_init();
    UIMgr_init();
    FrameMgr_init();

    /* provide a short delay for the camera to stabilize before we let the
       executive start up */
    Utility_delay(1000);

    /* the rest of the application will be under the control of the Executive */
    Exec_run();

    /* this should never be reached */
    return(0);
}
Due to the size limitation of the page, the other source code files are
attached separately. This main file has been derived from the original
developed by John R. Orlando in 2004.

A.1.2 PCB design

Figure A.1: PCB for AVRcam


Bibliography

[1] http://web.worldbank.org/

[2] Datasheets: L298N, L298HN, FS100A, HT12E, HT12D, CC2500, C3088 (OV6620),
USBASP AVR programmer, ATmega8, ATmega16, ATmega32, LM7805 and ATtiny12.

[3] Saurabh Sankule, "Beginners Guide to Embedded Systems", IIT Kanpur
Robotics Club.

[4] Joe Pardue, "Programming and Customizing AVR with Butterfly", Smiley
Micros, TN, USA.

[5] www.winavr.scienceprog.com/example-avr-projects

[6] www.jrobot.net