
A Study on Development of a Hybrid Aerial/Terrestrial Robot System for Avoiding Ground Obstacles by Flight

Chinthaka Premachandra, Member, IEEE, Masahiro Otsuka, Ryo Gohara, Takao Ninomiya, and Kiyotaka Kato, Member, IEEE

Abstract—To date, many studies related to robots have been performed around the world. Many of these studies have assumed operation at locations where entry is difficult, such as disaster sites, and have focused on various terrestrial robots, such as snake-like, humanoid, spider-type, and wheeled units. Another area of active research in recent years has been aerial robots with small helicopters for operation indoors and outdoors. However, less research has been performed on robots that operate both on the ground and in the air. Accordingly, in this paper, we propose a hybrid aerial/terrestrial robot system. The proposed robot system was developed by equipping a quadcopter with a mechanism for ground movement. It does not use power dedicated to ground movement, and instead uses the flight mechanism of the quadcopter to achieve ground movement as well. Furthermore, we addressed the issue of obstacle avoidance as part of studies on autonomous control. Thus, we found that autonomous control of ground movement and flight was possible for the hybrid aerial/terrestrial robot system, as was autonomous obstacle avoidance by flight when an obstacle appeared during ground movement.

Index Terms—Ground movement/flight control, hybrid aerial/terrestrial robot, obstacle avoidance by flight, obstacle recognition.

Manuscript received December 4, 2017; revised March 22, 2018; accepted May 17, 2018. Recommended by Associate Editor Xiaoming Hu. (Corresponding author: Chinthaka Premachandra.)
Citation: C. Premachandra, M. Otsuka, R. Gohara, T. Ninomiya, and K. Kato, "A study on development of a hybrid aerial/terrestrial robot system for avoiding ground obstacles by flight," IEEE/CAA J. Autom. Sinica, vol. 6, no. 1, pp. 327−336, Jan. 2019.
C. Premachandra is with the Department of Electronic Engineering, Graduate School of Engineering, Shibaura Institute of Technology, Tokyo 135-8548, Japan (e-mail: chintaka@sic.shibaura-it.ac.jp).
M. Otsuka, R. Gohara, and T. Ninomiya are with the Department of Electrical Engineering, Graduate School of Engineering, Tokyo University of Science, Tokyo 135-8548, Japan (e-mail: watashidesu1209@gmail.com; j4311045@ed.tus.ac.jp; reqza.t.n.a.s@gmail.com).
K. Kato is with the Department of Electrical Engineering, Graduate School of Engineering, Tokyo University of Science, Tokyo 135-8548, Japan (e-mail: kato@ee.kag.tus.ac.jp).
Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/JAS.2018.7511258

I. INTRODUCTION

HERETOFORE, much research has been conducted around the world on ground mobile robots, such as those intended for use at disaster sites where entry of a human is impossible. Examples include robots that can traverse rough terrain by using multiple legs in the manner of a spider or centipede [1]−[3], robots that can move under various conditions in the manner of a snake [4]−[6], robots that perform investigations in indoor spaces contaminated with chemicals where entry by humans is impossible [7], [8], and robots that attach to a wall and travel thereupon [9]−[11]. Research on bipedal robots [12]−[14] is also an area of active investigation. Cameras, GPS systems, ultrasonic sensors, and the like are being used for behavioral environment recognition and self-position estimation for such robots [15]−[18].

In recent years, various lines of research have been pursued in regard to aerial robots, such as those using small helicopters or multirotors. Applications of small aerial robots are anticipated for indoor and outdoor information gathering, conveyance, rescue tasks, and other purposes. Like ground mobile robots, aerial robots must recognize their movement environment and estimate self-position by means such as cameras, GPS systems, and ultrasonic sensors. For aerial robots, the negative effects of additional weight can be substantial. Furthermore, mounting areas are smaller, so devices to be installed on the robot must be smaller and lighter compared to those installed on ground mobile robots. Various applications of aerial robots are being investigated, including teleoperation [19], [20], movement to a defined destination [21], obstacle avoidance [22], [23], route planning [24], [25], and intrusion detection [26]. By using some of these capabilities, technology is now being established for navigation and mapping by an aerial robot equipped with an ultrasonic device or an infrared laser device [27]−[29].

The advantages of ground mobile robots include the ability to perform work in a stable state, ease of work in tight locations, and easy operation. The advantages of aerial robots include the ability to move quickly without being affected by the terrain. Conversely, disadvantages of ground mobile robots include the difficulty of arriving quickly at a site when there are locations with poor footing as a result of rubble or cracks in the ground. Disadvantages of aerial robots include the difficulty of performing inspections under the ground or within rubble, the difficulty of working in close contact with an object, and susceptibility to wind. Therefore, we propose a hybrid aerial/terrestrial robot system for operation both on the ground and in the air. A concept image of this system is shown in Fig. 1. The robot combines an aerial robot and a ground mobile robot, enabling compensation for the aforementioned disadvantages while combining their advantages. Applications are anticipated in a wider range than has been possible for conventional robots, for example, in large-area soil investigation, searching through rubble during a disaster, large-area cleaning, and selection of a movement method according to wind speed.

Fig. 1. Concept of the hybrid aerial/terrestrial robot system.

In the development of our robot system, the weight of the chassis and expandability were considered, and a method was adopted in which the power of a quadcopter was used for ground movement. For this hybrid aerial/terrestrial robot, the quadcopter was equipped with a mechanism enabling ground movement, and designed for durability by use of dampers and aluminum. Furthermore, as part of the control of ground movement and flight for the robot, studies were performed on automatic obstacle avoidance by flight for obstacles that appeared during ground movement. Robots capable of both ground movement and flight have been developed previously [30], [31]; however, these robots required manual maneuvering, and almost no research has been done on robots capable of both ground movement and flight with a complete on-board embedded control system. In the obstacle avoidance presented here, obstacles that appear during ground movement are avoided by flight. Very few previous works have performed a basic study of hybrid aerial/terrestrial robot systems [32], [33]. In these, movement and flight have been controlled entirely by external hardware for environmental recognition and control. However, when the hardware is placed externally, operation is restricted to the installation site and versatility is impaired. Accordingly, in this study, the hardware for autonomous control of ground movement and flight is mounted on the chassis. Furthermore, the chassis configuration of previous machines was examined, and this robot was retrofitted with shock-resistant material, wheels with higher mobility, and other improvements. In obstacle avoidance, the presence of an obstacle was recognized on the basis of an image from a small camera, and if an obstacle was present, it was avoided by flight. For the robot, experiments were performed on avoidance by flight of obstacles that appear during movement, including during autonomous ground movement and flight. The usefulness of this means of obstacle avoidance was confirmed.

The remainder of this paper is organized as follows. In Section II, the robot system that we developed is described. In Section III, the main operation of the robot is described. Section IV presents automatic avoidance by flight of obstacles appearing during ground movement. Then, in Section V, experiments are described. Section VI gives our conclusions.

II. DEVELOPMENT OF THE HYBRID ROBOT

A. Configuration of the Chassis

An image of the developed hybrid aerial/terrestrial robot system is shown in Fig. 2. The robot system is equipped with a mechanism that enables movement of the quadcopter on the ground. Aerial robots have strict limitations on the weight of equipment that can be mounted, and therefore, mounted hardware must be small and lightweight, and should be miniaturized to the fullest extent possible. As shown in Fig. 3, the robot system is equipped with omni wheels, in which barrel-shaped small wheels are placed on the perimeter of a large wheel, thus allowing 360° movement. The positional relationship between the axle and the ground plane is uniform, thus facilitating control. The dampers for the shafts of the chassis are shown in Fig. 4. By using dampers, shocks to the chassis are suppressed.

Fig. 2. Hybrid aerial/terrestrial robot system.

Fig. 3. Omni wheel.

Fig. 4. Damper.


To maintain axle stability, the framework of aluminum shown in Fig. 5 was sandwiched between the shafts and the wheels. The framework allows mounting of hardware and a battery, and is provided with fins to promote lift generation. Furthermore, plastic screws were used that are designed to break in the event of strong shock to the chassis, thus reducing its effects on the robot.

Fig. 5. Framework for maintaining stability.

Here, recognition of the behavioral environment of the robot system was performed using image processing. Therefore, the front part of the chassis was equipped with a small camera and IMAPCAR2 (integrated memory array processor for CAR2) as a small image-processing unit. A microcomputer used for determining control quantities and the like was also mounted on the front part of the chassis. Details of these hardware units mounted on the front part of the chassis are given in Section IV.

1) Selection of the Rotor Blades and Motor

When mounting the flight mechanism, it is necessary to select a mechanism appropriate for the weight of the chassis. The flight mechanism consists of rotor blades and a motor, which were selected by calculating static thrust (i.e., the weight that can be lifted at liftoff). For the chassis configuration used here, if the static thrust exceeds the weight of the chassis, flight is possible. The mass corresponding to the static thrust is given by the following formula:

T = (D/10)^3 × (P/10) × (N/1000)^2 × 22.    (1)

This formula is based on the static thrust defined by some studies in the literature [34]. Here, T represents the static thrust in grams (g), D represents the diameter of the rotor blades in inches (in), P represents the pitch of the rotor blades in inches (in), and N represents the rotation speed of the motor in revolutions per minute (rpm). In this study, this formula and other factors were considered in selecting the rotor blades and motor. Rotor blades with a diameter of 8 in and a pitch of 4.5 in were used. A motor of 1000 kv was used, where kv represents the speed of the motor (rpm) per 1 V. Assuming the voltage applied to the motor to be 12 V, the mass corresponding to the static thrust generated by the selected flight mechanism is 730 g. The chassis is equipped with four rotors, giving a total mass of 2920 g for the generated static thrust. The rotor blades, motor, chassis, and battery together total 1750 g in weight, making it possible to achieve liftoff with an output of approximately 60%, a value sufficient for the purpose of the present research.
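As a quick check of (1), the short Python sketch below (not part of the paper's implementation) evaluates the static thrust for the selected rotor/motor combination and estimates the output fraction needed to lift the 1750 g chassis; the assumption that the required output scales linearly with the supported weight is a simplification used only for illustration.

def static_thrust_grams(diameter_in: float, pitch_in: float, rpm: float) -> float:
    """Static thrust per rotor in grams, following formula (1)."""
    return (diameter_in / 10) ** 3 * (pitch_in / 10) * (rpm / 1000) ** 2 * 22

if __name__ == "__main__":
    kv = 1000           # motor speed constant, rpm per volt
    voltage = 12.0      # assumed supply voltage (V)
    rpm = kv * voltage  # rotation speed used in the estimate

    per_rotor = static_thrust_grams(8.0, 4.5, rpm)  # about 730 g
    total = 4 * per_rotor                           # about 2920 g for four rotors

    takeoff_weight = 1750.0  # rotor blades, motor, chassis, and battery (g)
    print(f"per rotor: {per_rotor:.0f} g, total: {total:.0f} g")
    # Rough proportion only; consistent with the ~60% output quoted above.
    print(f"output fraction needed to lift the chassis: {takeoff_weight / total:.0%}")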
B. Flight

Flight of the hybrid aerial/terrestrial robot system was performed in a manner similar to quadcopter flight, using throttle, elevator, aileron, and rudder operations. An overview of these four actions is shown in Table I and Fig. 6.

TABLE I
DESCRIPTION OF ACTIONS FOR FLIGHT

Name        Action
Throttle    Up and down
Elevator    Forward and backward
Aileron     Left and right
Rudder      Rotation

Fig. 6. Details of flight actions. (a) Throttle. (b) Elevator. (c) Aileron. (d) Rudder.

Throttle in Fig. 6 (a) is an operation that uniformly increases or decreases the output of all four motors. If the magnitude of the throttle does not exceed a certain value, then flight is impossible; therefore, when flying, throttle action is always implemented. Elevator in Fig. 6 (b) and aileron in Fig. 6 (c) are operations that cause the output of two motors on one side of the robot to be greater than the output of the other motors, thus inclining the chassis and causing it to move to the front, back, left, or right. Rudder control increases the output of one of two pairs of motors, where the motors are located diagonally to each other, thus rotating the chassis to the left or right by the generated torque.
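The paper does not specify the mixing rule implemented on the control board. The following sketch shows one conventional quadcopter mixer that is consistent with the description above (throttle acts on all motors, elevator/aileron bias one side, rudder biases one diagonal pair); the motor layout, signs, and 0−100% output range are illustrative assumptions, not the authors' code.

def mix(throttle: float, elevator: float, aileron: float, rudder: float):
    """Map the four actions to four motor outputs:
    (front-left, front-right, rear-left, rear-right)."""
    fl = throttle + elevator + aileron + rudder
    fr = throttle + elevator - aileron - rudder
    rl = throttle - elevator + aileron - rudder
    rr = throttle - elevator - aileron + rudder
    # Clamp to the admissible output range assumed here (0-100 %).
    return [max(0.0, min(100.0, m)) for m in (fl, fr, rl, rr)]

# Example: negative elevator -> rear motors spin faster in this sign convention.
print(mix(throttle=60, elevator=-18, aileron=0, rudder=0))  # [42, 42, 78, 78]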

C. Ground Movement

A schematic diagram of ground movement is shown in Fig. 7. Similar to the flight mechanism, ground movement is
possible using the motors and the rotor blades for power. In
multirotor flight, aileron and elevator operations incline the
chassis, and by acquiring thrust in a lateral direction, the
chassis moves. Furthermore, the rudder operation performs
rotation by using the torque produced by the motors. In
ground movement, the chassis is made to move forward and
backward by elevator operations, to move left and right by
aileron operations, and to rotate to the left and right using
rudder operations. When moving to the front, back, left, or
right, as shown in Fig. 7, it is necessary for lateral thrust to
be greater than downward thrust. Therefore, the aileron or
elevator operation is set so that the chassis inclines at a 45°
angle. The throttle operation is set so that output is at such
a level that the chassis lifts off the ground, so that sufficient
lateral thrust is acquired. In this manner, by integrating the
power for flight and ground movement, weight is reduced and
expandability is increased.
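As an illustration of the thrust split at the 45° inclination described above, the sketch below (an assumed helper, not the authors' implementation) decomposes a given output into its lateral and vertical components; treating output percentage as proportional to thrust, and reusing the approximately 60% hover output from Section II, are simplifications for illustration only.

import math

HOVER_OUTPUT = 60.0  # output (%) at which vertical thrust would equal the 1750 g weight

def thrust_components(output_percent: float, tilt_deg: float = 45.0):
    """Return (lateral, vertical) thrust as equivalent output percentages."""
    tilt = math.radians(tilt_deg)
    return output_percent * math.sin(tilt), output_percent * math.cos(tilt)

lateral, vertical = thrust_components(62.0)   # ground-movement throttle from Table III
print(round(lateral, 1), round(vertical, 1))  # roughly 43.8 and 43.8
print(vertical < HOVER_OUTPUT)                # True: the chassis stays on the ground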

Fig. 7. Ground movement.

III. OPERATION OF THE ROBOT

For this robot system, autonomous movement is presumed; however, manual operation is also accommodated. In consideration of managing the chassis, manual operation was introduced in order to perform start-up, shut-down, and other such operations safely. Furthermore, additional management of safety aspects was also considered, such as autonomous movement under conditions where a person may be subject to harm or prevention of autonomous movement in a location that is not an anticipated site. Fig. 8 shows the system configuration for manual operation of the chassis. The system consists of a transmitter for a radio-controlled toy, a receiver, a microcomputer, and a control board. The transmitter is a controller used for general multirotors and the like, and the receiver is a module for receiving signals from the transmitter. The receiver is connected in series to the microcomputer. The microcomputer generates and receives various signals, and communicates with the control board, the receiver, and the hardware for environmental recognition. The control board transmits to the motors signals corresponding to those acquired from the microcomputer and controls the chassis. A detailed description of the microcomputer and control board is given in Section IV. General manual operation and switching between autonomous operation and manual operation are performed by operation of the transmitter.

Fig. 8. System configuration for manual operation.

A. Manual Operation

Manual operation of the chassis has the same specification as operation of a general quadcopter. Sticks and switches on the transmitter are operated, and by adjusting the amount of motion as described in Section II, the chassis is operated. In order to accommodate autonomous operation, the robot system transmits signals from the receiver via the microcomputer to the control board as shown in Fig. 8; however, a general quadcopter directly transmits signals from the receiver to the control board. In order to make the communication method similar to that of a general quadcopter, the robot has been configured so that signals similar to those from the receiver are transmitted from the microcomputer to the control board during manual operation.

B. Switching Between Autonomous Operation and Manual Operation

In autonomous operation, the transmitter is not operated and the robot system is operated using only commands from the microcomputer corresponding to the results of environmental recognition. Switching between autonomous operation and manual operation is performed by operation of the transmitter. Communication from the receiver to the microcomputer then uses pulse position modulation (PPM) signals. Table II shows channel correspondence for PPM signals, and Fig. 9 shows an overview of the PPM signals. As the figure shows, the PPM signals consist of eight channels and a dummy pulse, for which
control can be performed by varying each channel width. Here, switching between manual operation and autonomous operation is performed by varying the pulse width of channel 5; channels 1 to 4 and channels 6 to 8 are not used. Channels 1 to 4 are already used in operation of a general quadcopter, and therefore, these channels cannot be used for switching between manual operation and autonomous operation. The pulse width of the PPM signals is varied by operation of the transmitter. Furthermore, the PPM signals are transmitted in cycles of 22 ms, and maintain their period by means of a dummy pulse.

Fig. 9. PPM signals.

TABLE II
CHANNEL ASSIGNMENTS OF PPM SIGNALS

Channel        Action
CH1            Already used
CH2            Already used
CH3            Already used
CH4            Already used
CH5            Switch to auto or manual
CH6            Unused
CH7            Unused
CH8            Unused
Dummy pulse    Pulse for periodic tuning
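The switching rule itself is simple; the sketch below is a minimal illustration of how the microcomputer could read the decoded channel widths of one PPM frame and use channel 5 to select the operating mode. The function names and the mid-scale threshold are assumptions, and the 1.02−2.02 ms example widths are borrowed from the PWM description in Section IV for illustration only.

FRAME_PERIOD_MS = 22.0
CH5 = 4  # zero-based index of channel 5

def split_ppm_frame(channel_widths_ms):
    """channel_widths_ms: the eight channel widths decoded from one PPM frame.
    The dummy pulse simply pads the frame to the 22 ms period."""
    dummy = FRAME_PERIOD_MS - sum(channel_widths_ms)
    return channel_widths_ms, dummy

def is_autonomous(channel_widths_ms, threshold_ms=1.5):
    """Treat a CH5 width above the (assumed) mid-scale threshold as 'auto'."""
    return channel_widths_ms[CH5] > threshold_ms

widths, dummy = split_ppm_frame([1.02, 1.02, 1.02, 1.02, 2.02, 1.02, 1.02, 1.02])
print(is_autonomous(widths), round(dummy, 2))  # True, remaining dummy-pulse time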

IV. OBSTACLE AVOIDANCE BY FLIGHT

The proposed hybrid aerial/terrestrial robot system is assumed to work in areas such as disaster sites, where humans cannot easily enter. In such locations, there are environments outside the effective range of radio. Performing remote operation in such environments is extremely difficult, and therefore, when robots are made to operate in these environments, autonomous control is essential. Here, as a basic study on the control of ground movement and flight, avoidance of obstacles appearing during ground movement by flight was investigated. When an obstacle appears in front of the chassis during movement, the object is automatically recognized by image processing and is automatically avoided by the robot taking flight. This obstacle avoidance is assumed as an avoidance method when the obstacle appears during work operations during ground movement, or when avoidance by ground movement is impossible or takes a significant amount of time.

A. Configuration for Obstacle Avoidance

The system configuration for obstacle avoidance is shown in Fig. 10. The transmitter and receiver shown in Fig. 8 are not used; instead, an obstacle recognition unit is used. In autonomous operation, the transmitter is not used for operation; instead, operation of the chassis is performed using only commands from the microcomputer using the results of environmental recognition. Among the hardware constituting the control command unit in Fig. 10, the R8C/25 microcomputer and the control board in Fig. 8 are the same. The ground movement/flight control system used here is constituted by an obstacle recognition unit that detects obstacles and a control command unit that executes command and control of the chassis.

Fig. 10. System configuration for autonomous operation.

B. Obstacle Recognition

Obstacle recognition is performed using a small monocular camera and the IMAPCAR2 image-processing unit. Fig. 11 shows the external appearance of the IMAPCAR2 and the monocular camera, which are connected together with pins and mounted on the front part of the chassis. Furthermore, the IMAPCAR2 is connected in series to the microcomputer of the control command unit. Characteristics of the IMAPCAR2 include a high-performance parallel processor with a high
image processing capability, while retaining a small size and being lightweight. Image processing is performed by software, and its power consumption does not exceed 2 W. From these characteristics, we determined that it is a suitable processor for mounting hardware onto the chassis and performing obstacle avoidance by image processing. Furthermore, for the IMAPCAR2, by developing an algorithm in a form suitable for parallel processing, it is possible to perform high-speed image processing [35]−[39].

Fig. 11. Camera and IMAPCAR2.

A schematic diagram of obstacle recognition is shown in Fig. 12. For recognition of an obstacle, background subtraction of the previous frame from the current frame is used. Attention is set on a pixel (x, y) in an image; treating pixels in the current frame as I(x, y), and pixels in the previous frame as Ib(x, y), the difference Id(x, y) for the pixel (x, y) is expressed as follows:

Id(x, y) = |I(x, y) − Ib(x, y)|.    (2)

Fig. 12. Image processing for object recognition.

The difference Id(x, y) is binarized, as in (3), on the basis of T, which is a threshold for binarization:

Ir(x, y) = 255, if Id(x, y) > T,
Ir(x, y) = 0, otherwise.    (3)

The result of image summing for all the Ir(x, y) pixels is the processing result. A threshold is set for determining the presence or absence of an obstacle. If the processing result exceeds the threshold, an obstacle is judged to exist; otherwise, no obstacle is judged to exist. If the value of the difference exceeds the threshold, an obstacle is recognized to exist in the direction of travel, and obstacle avoidance by flight is performed. Otherwise, ground movement continues.
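The following Python/NumPy sketch reproduces the frame-differencing of (2), the binarization of (3), and the summed-result decision described above. The binarization threshold T and the detection threshold are illustrative values rather than those used on the actual IMAPCAR2 implementation.

import numpy as np

def obstacle_present(current: np.ndarray, previous: np.ndarray,
                     t_binarize: int = 30, t_detect: int = 200_000) -> bool:
    """current/previous: grayscale frames of equal size (e.g., 640x480 VGA)."""
    diff = np.abs(current.astype(np.int16) - previous.astype(np.int16))  # (2)
    binary = np.where(diff > t_binarize, 255, 0)                         # (3)
    return int(binary.sum()) > t_detect  # summed binarized image vs. threshold

# Toy usage with synthetic frames: a bright region "appears" in the new frame.
prev = np.zeros((480, 640), dtype=np.uint8)
curr = prev.copy()
curr[100:200, 200:300] = 255
print(obstacle_present(curr, prev))  # True -> switch to avoidance by flight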
C. Control Command Unit

The control command unit communicates with the receiver when manual operation is performed, and with the IMAPCAR2 when autonomous operation is performed. During manual operation, it is possible to switch between manual operation and autonomous operation on the basis of a signal from the receiver. During autonomous operation, according to image processing results from the IMAPCAR2, the microcomputer generates and transmits signals, controlling the chassis. The control command unit is constituted by the control board and the R8C/25, which is the microcomputer. The external appearance of the R8C/25 is shown in Fig. 13. The R8C/25 is small and has performance capable of sufficiently processing the control required in this research. Some key features of the R8C/25 include: an 8-bit multifunction timer with 8-bit prescaler (Timers RA and RB, 2 channels), an input capture/output compare timer (Timer RC, 16-bit × 1 channel), and a real-time clock timer with compare match function (Timer RE, 1 channel). The control board transmits signals corresponding to those acquired from the microcomputer to the motors to perform control of the chassis.

Fig. 13. R8C/25.

1) Communication From IMAPCAR2 to the Microcomputer

The IMAPCAR2 and the microcomputer have different communication bit rates, and so one unit of information must be separated across a number of sessions and communicated. Therefore, for communication from the IMAPCAR2 to the microcomputer, 4-line bus communication is employed. Here, the two hardware units are separated into a master side and a slave side, and communication is performed using four lines. In performing communication, the microcomputer was treated as the master side, and the IMAPCAR2 as the slave side [34], [37].
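The bus protocol itself is not detailed in the paper; the sketch below only illustrates the idea that one unit of information is split across several byte-wide transfers on the slave side and reassembled by the master. The function names, transfer size, and example value are assumptions.

def pack_result(value: int, num_bytes: int = 3):
    """Split a non-negative integer into big-endian bytes for transfer."""
    return [(value >> (8 * i)) & 0xFF for i in reversed(range(num_bytes))]

def unpack_result(chunks):
    """Reassemble the bytes received over successive bus transactions."""
    value = 0
    for byte in chunks:
        value = (value << 8) | byte
    return value

summed = 2_550_000  # example summed binarized image from (3)
chunks = pack_result(summed)
assert unpack_result(chunks) == summed
print(chunks)  # the byte sequence sent over the 4-line bus in this sketch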
2) Transmission of Commands From the Microcomputer to the Control Board

Transmission of a command from the microcomputer to the control board is performed in the cases of both manual and autonomous operation. During manual operation, a command signal similar to a command from the receiver is generated
by the microcomputer, and transmitted to the control board. During autonomous operation, on the basis of information from the IMAPCAR2, a signal corresponding to the presence or absence of an obstacle is generated by the microcomputer, and transmitted to the control board. For signals sent from the microcomputer to the control board, four pulse width modulation (PWM) signals are employed. Examples of PWM signals are shown in Fig. 14.

Fig. 14. Examples of PWM signals. (a) Minimum-width (1.02 ms) PWM signal. (b) Maximum-width (2.02 ms) PWM signal.

Each of the four PWM signals corresponds to one of the four actions in Table I, and by modifying the pulse width between 1.02 ms and 2.02 ms, the parameter of each is modified so that if there is no obstacle, control for ground movement is performed, and if there is an obstacle, obstacle avoidance by flight is performed. Settings for the parameters are illustrated in (4)−(7):

Ai = 200(W − 1.02) − 100    (4)
El = 200(W − 1.02) − 100    (5)
Th = 100(W − 0.02) − 100    (6)
Ru = 200(W − 1.02) − 100    (7)

where W is the pulse width in milliseconds, Ai is the parameter for the aileron, El is the parameter for the elevator, Th is the parameter for the throttle, and Ru is the parameter for the rudder. The pulse width varies between 1.02 and 2.02 ms, and the pulse period is 22 ms. For the aileron, if the value of the parameter is positive, movement is to the right, and if it is negative, movement is to the left. As the absolute value of the parameter for the aileron is increased, the inclination of the chassis increases. For the elevator, if the parameter is positive, the chassis moves backward, and if it is negative, it moves forward. As the absolute value of the parameter for the elevator is increased, the inclination of the chassis increases. The throttle parameter cannot take a negative value, and as its value increases, the output increases. For the rudder, if the value of the parameter is negative, the chassis rotates to the right, and if it is positive, the chassis rotates to the left. As the absolute value of the parameter for the rudder is increased, the speed of rotation increases. The pulse width in this study was set on the basis of experience during the experiments.
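Equations (4)−(7) translate directly into code. The inverse function below is not given in the paper; it is simply the rearrangement of (6) that a command generator would need in order to output a pulse width for a desired throttle parameter.

def params_from_width(w_ai, w_el, w_th, w_ru):
    ai = 200 * (w_ai - 1.02) - 100   # (4) aileron:  -100 .. +100
    el = 200 * (w_el - 1.02) - 100   # (5) elevator: -100 .. +100
    th = 100 * (w_th - 0.02) - 100   # (6) throttle:    0 .. 100
    ru = 200 * (w_ru - 1.02) - 100   # (7) rudder:   -100 .. +100
    return ai, el, th, ru

def width_for_throttle(th):
    """Pulse width (ms) that realizes a desired throttle parameter, from (6)."""
    return (th + 100) / 100 + 0.02

print(params_from_width(1.02, 1.02, 1.64, 1.52))  # roughly (-100, -100, 62, 0)
print(width_for_throttle(62))                     # roughly 1.64 ms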
V. EXPERIMENTS

A. Methods

Fig. 15 shows the flow of an experiment on the obstacle avoidance system. The system, from the perspective of safety, allows for switching between manual operation and autonomous operation, and this switching is performed by operation of the transmitter. During autonomous operation, the robot determines the presence or absence of an obstacle; if no obstacle is detected, then the robot automatically performs movement on the ground, and if an obstacle is detected, it automatically avoids the obstacle by flight. In this paper, the obstacles appear approximately 30 cm to 2 m away from the robot and can be avoided by flight. Only obstacles within the field of view of the camera are detected. The field of view is 120 degrees. In addition, the camera frame rate is 30 fps, and the image size is VGA.

Fig. 15. Flowchart for control system.

Fig. 16. Experimental scenario.

The experiments were performed for a scenario like the one shown in Fig. 16. In the course of ground movement by the hybrid aerial/terrestrial robot system, an obstacle was manually caused to appear in front of the chassis to test whether the chassis could avoid the obstacle by flight and land. The obstacle was approximately the same height as the robot, assuming a case in which avoidance by ground movement was impossible. Parameter settings for movement on the ground
are listed in Table III, and parameter settings for obstacle avoidance by flight are listed in Table IV. The values in the tables were determined by (4) to (7), representing the values of parameters subsequent to starting each of the actions of ground movement and obstacle avoidance by flight. These values cannot be set without taking into consideration detailed data on the shape, weight, traveling performance, and the like of the chassis. Therefore, the values were set on the basis of experience acquired from the experiments.

TABLE III
PARAMETERS FOR GROUND MOVEMENT

            Start to 0.5 s    0.5−0.55 s
Aileron     −1                −1
Elevator    −18               0
Throttle    62                38
Rudder      −1                −1

TABLE IV
PARAMETERS FOR OBSTACLE AVOIDANCE BY FLIGHT

            Start to 0.5 s    0.8−1.3 s    1.3−2.3 s
Aileron     −6                −6           −6
Elevator    0                 −4           0
Throttle    74                74           50
Rudder      0                 0            0
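The parameter schedules in Tables III and IV can be read as short open-loop command sequences. The sketch below plays such a schedule back; the (aileron, elevator, throttle, rudder) tuples are the table values, while the lookup helper and the hold-last-value behaviour between the listed intervals are assumptions for illustration.

GROUND_MOVEMENT = [   # (start_s, end_s, (Ai, El, Th, Ru)) from Table III
    (0.0, 0.5, (-1, -18, 62, -1)),
    (0.5, 0.55, (-1, 0, 38, -1)),
]

AVOIDANCE_BY_FLIGHT = [  # from Table IV
    (0.0, 0.5, (-6, 0, 74, 0)),
    (0.8, 1.3, (-6, -4, 74, 0)),
    (1.3, 2.3, (-6, 0, 50, 0)),
]

def command_at(schedule, t):
    """Return the commanded parameters at time t (seconds since the action started)."""
    last = schedule[0][2]
    for start, end, params in schedule:
        if start <= t < end:
            return params
        if t >= end:
            last = params
    return last  # hold the last listed values outside the given intervals

print(command_at(AVOIDANCE_BY_FLIGHT, 0.25))  # (-6, 0, 74, 0)
print(command_at(AVOIDANCE_BY_FLIGHT, 1.0))   # (-6, -4, 74, 0)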

The obstacle was a bar that was manually placed in front of the chassis. Furthermore, ordinances and safety issues for the area were taken into consideration in performing the experiments in a gymnasium, which is an indoor environment.

B. Avoidance of Ground Obstacles by Flight

Figs. 17 and 18 show images in which the robot avoids an obstacle by flight. In Fig. 17, ground movement was performed in the order of (a) to (c), obstacle avoidance by flight was performed in the order of (d) to (f), and landing was performed in (g). In Fig. 18, ground movement was performed in the order of (a) to (b), obstacle avoidance by flight was performed in the order of (d) to (f), and landing was performed in (g). As these figures show, the robot avoided the obstacle by flight when it appeared (please see the submitted video).

VI. CONCLUSION

A hybrid aerial/terrestrial robot system was introduced and a basic autonomous control system was installed on it. The results of this study show that it was possible to manufacture a robot system that performs both ground movement and flight. Furthermore, by mounting a basic autonomous control system, it was possible to control the chassis by autonomous control so as to create a more practical robot. A topic for future investigation is use of a more advanced image-processing algorithm for measuring the distance to an obstacle, its size, and its shape after mounting an improved system for obstacle avoidance. In this study, we have not proposed a mechanism to check the ground condition before the robot lands after a flight. We plan to work on this part in the future. In addition, as another future direction, we plan to widen the obstacle detection range of the robot by installing an omnidirectional camera on the robot.

Fig. 17. Obstacle avoidance by flight. (a)−(c) Ground movement. (d)−(f) Obstacle avoidance by flight. (g) Landing.

Fig. 18. Obstacle avoidance by flight (alternate view). (a), (b) Ground movement. (d)−(f) Obstacle avoidance by flight. (g) Landing.


REFERENCES

[1] G. Takeo, T. Takubo, K. Ohara, Y. Mae, and T. Arai, "Rotational operation of polygonal prism by multi-legged robot," in Proc. IEEE Int. Conf. Mechatronics and Automation, Changchun, China, 2009, pp. 2677−2682.
[2] G. Takeo, T. Takubo, K. Ohara, Y. Mae, and T. Arai, "Rotation control of polygonal prism by multi-legged robot," in Proc. 11th IEEE Int. Workshop on Advanced Motion Control, Nagaoka, Niigata, Japan, 2010, pp. 601−606.
[3] S. Inagaki, T. Niwa, and T. Suzuki, "Follow-the-contact-point gait control of centipede-like multi-legged robot to navigate and walk on uneven terrain," in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Taiwan, China, 2010, pp. 5341−5346.
[4] Z. Yang, K. Ito, K. Hirotsune, K. Saijo, and A. Gofuku, "A mechanical intelligence in assisting the navigation by a force feedback steering wheel for a snake rescue robot," in Proc. 13th IEEE Int. Workshop on Robot and Human Interactive Communication, Kurashiki, Okayama, Japan, 2004, pp. 113−118.
[5] J. Y. Gao, X. S. Gao, W. Zhu, J. G. Zhu, and B. Y. Wei, "Design and research of a new structure rescue snake robot with all body drive system," in Proc. IEEE Int. Conf. Mechatronics and Automation, Takamatsu, Japan, 2008, pp. 119−124.
[6] P. Polchankajorn and T. Maneewarn, "Helical controller for modular snake robot with non-holonomic constraint," in Proc. 8th Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology Conf. (ECTI-CON), Khon Kaen, Thailand, 2011, pp. 589−592.
[7] A. Gargade, D. Tambuskar, and G. Thokal, "Modelling and analysis of pipe inspection robot," Int. J. Emerg. Technol. Adv. Eng., vol. 3, no. 5, pp. 120−126, May 2013.
[8] G. J. Bruemmer, C. W. Nielsen, and D. I. Gertman, "How training and experience affect the benefits of autonomy in a dirty-bomb experiment," in Proc. 3rd ACM/IEEE Int. Conf. Human-Robot Interaction (HRI), Amsterdam, Netherlands, 2008, pp. 161−168.
[9] P. Liljeback, I. U. Haugstuen, and K. Y. Pettersen, "Path following control of planar snake robots using a cascaded approach," IEEE Trans. Control Syst. Technol., vol. 20, no. 1, pp. 111−126, Feb. 2012.
[10] S. Q. Wu, L. J. Wu, and T. Liu, "Design of a sliding wall climbing robot with a novel negative adsorption device," in Proc. 8th Int. Conf. Ubiquitous Robots and Ambient Intelligence, Incheon, South Korea, 2011, pp. 97−100.
[11] A. Sekhar R, A. Mary, S. N. Raju, A. G. Ravi, V. Sharma, and G. Bala, "A novel design technique to develop a low cost and highly stable wall climbing robot," in Proc. 4th Int. Conf. Intelligent Systems, Modelling and Simulation, Bangkok, Thailand, 2013, pp. 360−363.
[12] T. Takuma, K. Hosoda, and M. Asada, "Walking stabilization of biped with pneumatic actuators against terrain changes," in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Edmonton, Alta., Canada, 2005, pp. 4095−4100.
[13] T. Sugihara, "Standing stabilizability and stepping maneuver in planar bipedalism based on the best COM-ZMP regulator," in Proc. IEEE Int. Conf. Robotics and Automation, Kobe, Japan, 2009, pp. 1966−1971.
[14] J. Z. Zhao and H. H. Ju, "Detail analysis on human bipedalism and a natural gait generation method for biped robots," in Proc. IEEE Int. Conf. Mechatronics and Automation, Takamatsu, Japan, 2013, pp. 873−878.
[15] Y. S. Suh, S. K. Park, D. N. Kim, and K. H. Jo, "Remote control of a moving robot using the virtual link," in Proc. IEEE Int. Conf. Robotics and Automation, Roma, Italy, 2007, pp. 2343−2348.
[16] C. Lee, J. Park, W. Bahn, T. Kim, T. Lee, M. M. Shaikh, K. Kim, and D. Cho, "Vision tracking of a moving robot from a second moving robot using both relative and absolute position referencing methods," in Proc. 37th Ann. Conf. IEEE Industrial Electronics Society, Melbourne, VIC, Australia, 2011, pp. 325−330.
[17] Q. Y. Li and P. L. Yang, "Keep up with me: A gesture guided moving robot with Microsoft Kinect," in Proc. 10th IEEE Int. Conf. Mobile Ad-Hoc and Sensor Systems, Hangzhou, China, 2013, pp. 435−436.
[18] C. R. Torres, G. L. Torres, L. E. B. da Silva, and J. M. Abe, "Intelligent system of paraconsistent logic to control autonomous moving robots," in Proc. 32nd Ann. Conf. IEEE Industrial Electronics, Paris, France, 2006, pp. 4009−4013.
[19] X. L. Hou, R. Mahony, and F. Schill, "Comparative study of haptic interfaces for bilateral teleoperation of VTOL aerial robots," IEEE Trans. Syst. Man Cybernet.: Syst., vol. 46, no. 10, pp. 1352−1363, Oct. 2016.
[20] X. L. Hou and R. Mahony, "Dynamic kinesthetic boundary for haptic teleoperation of VTOL aerial robots in complex environments," IEEE Trans. Syst. Man Cybernet.: Syst., vol. 46, no. 5, pp. 694−705, May 2016.
[21] A. Ruangwiset, "Path generation for ground target tracking of airplane-typed UAV," in Proc. IEEE Int. Conf. Robotics and Biomimetics, Bangkok, Thailand, 2008, pp. 1354−1358.
[22] J. C. Zufferey and D. Floreano, "Toward 30-gram autonomous indoor aircraft: Vision-based obstacle avoidance and altitude control," in Proc. IEEE Int. Conf. Robotics and Automation, Barcelona, Spain, 2005, pp. 2594−2599.
[23] N. Gageik, P. Benz, and S. Montenegro, "Obstacle detection and collision avoidance for a UAV with complementary low-cost sensors," IEEE Access, vol. 3, pp. 599−609, Jun. 2015.
[24] Y. G. Fu, M. Y. Ding, C. P. Zhou, and H. P. Hu, "Route planning for unmanned aerial vehicle (UAV) on the sea using hybrid differential evolution and quantum-behaved particle swarm optimization," IEEE Trans. Syst. Man Cybernet.: Syst., vol. 43, no. 6, pp. 1451−1465, Nov. 2013.
[25] K. Dorling, J. Heinrichs, G. G. Messier, and S. Magierowski, "Vehicle routing problems for drone delivery," IEEE Trans. Syst. Man Cybernet.: Syst., vol. 47, no. 1, pp. 70−85, Jan. 2017.
[26] R. Mitchell and I. R. Chen, "Adaptive intrusion detection of malicious unmanned air vehicles using behavior rule specifications," IEEE Trans. Syst. Man Cybernet.: Syst., vol. 44, no. 5, pp. 593−604, May 2014.
[27] K. Cho, J. Shin, M. S. Kang, W. Shon, and S. Park, "Indoor flying robot control and 3D indoor localization system," in Proc. 11th WSEAS Int. Conf. Automatic Control, Modelling and Simulation, Istanbul, Turkey, 2013, pp. 524−528.
[28] C. R. Yuan, F. Recktenwald, and H. Mallot, "Visual steering of UAV in unknown environments," in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, St. Louis, MO, USA, 2009, pp. 3906−3911.
[29] K. Schmid, T. Tomic, F. Ruess, H. Hirschmüller, and M. Suppa, "Stereo vision based indoor/outdoor navigation for flying robots," in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Tokyo, Japan, 2013, pp. 3955−3962.
[30] A. Kossett and N. Papanikolopoulos, "A robust miniature robot design for land/air hybrid locomotion," in Proc. IEEE Int. Conf. Robotics and Automation, Shanghai, China, 2011, pp. 4595−4600.
[31] K. Kawasaki, M. J. Zhao, K. Okada, and M. Inada, "MUWA: Multi-field universal wheel for air-land vehicle with quad variable-pitch propellers," in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Tokyo, Japan, 2013, pp. 1880−1885.
[32] M. Ootsuka, C. Premachandra, and K. Kato, "Development of an air-ground operational robot and its fundamental controlling approach," in Proc. Joint 7th Int. Conf. Soft Computing and Intelligent Systems (SCIS) and 15th Int. Symp. Advanced Intelligent Systems (ISIS), Kitakyushu, Japan, 2014, pp. 1470−1474.
[33] C. Premachandra and M. Otsuka, "Development of hybrid aerial/terrestrial robot system and its automation," in Proc. IEEE Int. Systems Engineering Symp., Vienna, Austria, 2017.
[34] M. T. Chen, "Static thrust measurement for propeller-driven light aircraft," in Proc. Int. Conf. Computer Application and System Modeling, Taiyuan, China, 2012, pp. 650−652.
[35] R. Gohara, C. Premachandra, and K. Kato, "Smooth automatic vehicle stopping control system for unexpected obstacles," in Proc. 10th Asia-Pacific Symp. Information and Telecommunication Technologies, Colombo, Sri Lanka, 2015, pp. 103−105.
[36] Y. Okamoto, C. Premachandra, and K. Kato, "A study on computational time reduction of road obstacle detection by parallel image processor," J. Adv. Comput. Intell. Intell. Informat., vol. 18, no. 5, pp. 849−855, Aug. 2014.
[37] Y. Okamoto, C. Premachandra, and K. Kato, "Parallel processing based speed-up road obstacle detection algorithm using discriminant analysis," in Proc. 14th Int. Symp. Advanced Intelligent Systems, Daejeon, Korea, 2013.
[38] R. Gohara, C. Premachandra, and K. Kato, "A study on smooth automatic vehicle stopping control for suddenly-appeared obstacles," in Proc. IEEE Int. Conf. Vehicular Electronics and Safety, Yokohama, Japan, 2015, pp. 86−90.
[39] C. Premachandra, Y. Okamoto, and K. Kato, "High performance embedding environment for reacting suddenly appeared road obstacles," in Proc. IEEE Int. Conf. Robotics and Biomimetics, Bali, Indonesia, 2014, pp. 2394−2397.

Chinthaka Premachandra received the B.Sc. and M.Sc. degrees from Mie University, Tsu, Japan in 2006 and 2008 respectively, and the Ph.D. degree from Nagoya University, Nagoya, Japan, in 2011.
From 2012 to 2015, Dr. Premachandra was an Assistant Professor in the Department of Electrical Engineering, Faculty of Engineering, Tokyo University of Science, Tokyo, Japan. From 2016 to 2017, he was an Assistant in the Department of Electronic Engineering, School of Engineering, Shibaura Institute of Technology, Tokyo, Japan. In 2018, he was promoted to Associate Professor in the Department of Electronic Engineering, Graduate School of Engineering, Shibaura Institute of Technology. In addition, he is the manager of the Image Processing and Robotic Lab at the same department. His lab conducts research in two main fields: image processing and robotics. The former includes computer vision, pattern recognition, speed-up of image processing, and camera-based intelligent transportation systems, while the latter includes terrestrial robotic systems, flying robotic systems, and the integration of terrestrial and flying robots.
He is a member of IEEE, IEICE (Japan), SICE (Japan) and SOFT (Japan).

Masahiro Otsuka received the B.Sc. and M.Sc. degrees from the Tokyo University of Science, Tokyo, Japan in 2014 and 2016 respectively. He is currently working in Daifuku Inc., Japan as an Engineer. His research interests include image processing and aerial robotics.

Ryo Gohara received the B.S. degree in electrical engineering from Tokyo University of Science, Tokyo, Japan, in 2015.
From 2016 to 2017, he was an M.S. student in the Department of Electrical Engineering, Graduate School of Engineering, Tokyo University of Science, Tokyo, Japan. His research interests include robotics and intelligent transport systems.

Takao Ninomiya received the B.S. degree in electrical engineering from Tokyo University of Science, Tokyo, Japan, in 2016.
From 2017 to 2018, he was an M.S. student in the Department of Electrical Engineering, Graduate School of Engineering, Tokyo University of Science, Tokyo, Japan. His research interests include image processing and aerial robotics.

Kiyotaka Kato (M'10) received the B.S. degree from Waseda University, Tokyo, Japan in 1978. From 1978 to 2000, he was a Researcher at Mitsubishi Electric Corporation, Japan. Since 2000, he has been a Professor in the Department of Electrical Engineering, Graduate School of Engineering, Tokyo University of Science, Tokyo, Japan.
He is a member of IEEE, the Japan Society for Precision Engineering, and the Society of Instrument and Control Engineers, Japan. His research interests include aerial robotics, factory automation, numerical control, geometric modeling, virtual reality and computer graphics.