
Article
An Indoor Autonomous Inspection and Firefighting Robot
Based on SLAM and Flame Image Recognition
Sen Li 1, Junying Yun 1, Chunyong Feng 2, Yijin Gao 1, Jialuo Yang 1, Guangchao Sun 1 and Dan Zhang 1,*

1 The School of Building Environment Engineering, Zhengzhou University of Light Industry, Zhengzhou 450001, China
2 The School of Mechanical and Electrical Engineering, Xi’an University of Architecture and Technology,
Xi’an 710055, China
* Correspondence: zhangdan@zzuli.edu.cn

Abstract: Indoor fire accidents have become increasingly common in recent years. More and more
firefighting robots have been designed to solve the problem of fires. However, the indoor environment
is very complex, with high temperatures, thick smoke, many turns, and various burning substances.
In this study, a firefighting robot with autonomous inspection and automatic fire-extinguishing
functions intended for use in unknown indoor environments was designed. Considering water’s
poor efficiency and its inability to extinguish some combustion materials, other fire extinguishers
were applied to design the extinguishing system. The robot can install four different extinguishers
as required and select the appropriate fire extinguisher to spray it automatically. Based on the
Cartographer SLAM (simultaneous localization and mapping) theory, a robot map-building system
was built using Lidar scanners, IMU (inertial measurement unit) sensors, encoders, and other sensors.
The accurate identification and location of the fire source were achieved using an infrared thermal
imager and the YOLOv4 deep learning algorithm. Finally, the performance of the firefighting robot
was evaluated by creating a simulated-fire experimental environment. In an autonomous inspection
process of the on-fire environment, the firefighting robot could identify the flame in real-time, trigger
the fire-extinguishing system to carry out automatic fire extinguishing, and contain the fire in its
embryonic stage.

Keywords: firefighting robot; laser SLAM; YOLOv4 deep learning algorithm; autonomous inspection; automatic fire extinguishing

1. Introduction
More and more people live and work in dense indoor buildings with the acceleration of urbanization [1]. Large numbers of casualties and serious economic losses can be caused when indoor building fires occur. From a global perspective, the cost of fires is high. According to statistics, the direct and indirect losses caused by fires amount to ~1% of global GDP [2]. Public fire departments across the U.S. attended 499,000 fires in indoor buildings in 2018, which caused 2910 deaths and 12,700 injuries [3]. The Chinese Fire Statistical Yearbook recorded 395,052 indoor building fires in China, resulting in 1815 deaths, 1513 injuries, and a loss of RMB 4.7 billion in 2014 alone [4]. Why are there so many serious indoor building fires? On the one hand, fires are often not detected promptly, so a fire spreads rapidly before anyone responds. On the other hand, firefighting facilities are inadequate, and effective firefighting cannot be carried out once a fire has been discovered [5,6]. Some practical measures to reduce the frequency of major fires are to locate the fire source early in the fire, quickly and accurately mark the fire source, and automatically carry out fire extinguishing. Given the above situation, the development of firefighting robots with autonomous inspection and automatic extinguishing functions has evolved into a new development direction in the field of fire protection.

A firefighting robot can assist or replace firefighters in carrying out firefighting work, reducing firefighters' labor intensity and improving their firefighting and rescue ability and efficiency. Scholars worldwide have been working on developing firefighting robots as robotics technology advances. Liu [7] designed a firefighting robot with multiple sensor fusions. The robot precisely positions itself in smoke scenes by fusing magnetic encoder trajectory estimation poses, UWB absolute poses, and IMU heading angle poses through an improved Kalman filtering algorithm. The shortcoming of this design is that the robot still lacks a fire-extinguishing system. Yazhou Jia [8] developed a crawler firefighting robot. It is equipped with a water gun device on the front end of the robot's body and a water pipe on the back end, and it can accurately detect high-temperature objects in a fire scene and assist firefighters in firefighting. Xinyu Gao [9] developed a firefighting robot capable of autonomous navigation and extinguishing. The front part of the robot is equipped with a water cannon, which must use an external fire hydrant to extinguish distant flames. SnakeFighter [10] is a snake-like hydraulic robot that can avoid obstacles while walking. The front-mounted nozzles on the robot spray onto the fire.
The firefighting robot LUF60 [11] has a blower and a water mist nozzle. The blower refines the water flow and sprays it as water mist, reducing the ambient temperature over a wide range to achieve the goal of firefighting. The robot's back end is connected to a water hose so that it can spray water mist indefinitely. The LUF60 firefighting robot is shown in Figure 1. The firefighting robot GARM [12] has a tele-operated monitor nozzle and an IR video system that can detect hidden glows and other heat sources. The robot can often automatically detect and extinguish remaining glows and newly igniting fires. The firefighting robot Thermite 3.0 [13] has a multidirectional nozzle that can pump 2500 gallons of water per minute from the cannon. The robot can continuously spray water that is pumped in through the rear hose. Shuo Zhang [14] designed an intelligent firefighting robot based on multisensor fusion technology with autonomous inspection and automatic firefighting functions. The robot uses the tandem operation of a binocular vision camera and an infrared thermal imager to locate a fire source. Like the above robots, this robot uses water to extinguish fires.

Figure 1. LUF60 robot.

In addition, there are the MVF-5 [15], the WATER CANNON [16], aerial-hose-type robots that use water jets [17], intelligent firefighting robots [18], ROS-based elevating firefighting robots [19], and others. These firefighting robots are currently unsuitable for use in indoor environments, as evidenced by the following factors: (1) Because of the large size of these firefighting robots, it is inconvenient for them to move indoors. (2) Currently, these firefighting robots use water as the primary extinguishing medium and require an external water hose, but the indoor space is often small, there are many turns, and the water hose restricts the robots' movement. (3) Water as a medium has a low fire-extinguishing efficiency and cannot meet the fire-extinguishing needs of various burning substances.

This paper discusses the development of a firefighting robot for indoor use based on the abovementioned issues. The robot not only has an autonomous inspection function but also an automatic fire-extinguishing function. The firefighting robot has a flexible appearance, does not need an external water source, and can move freely around the house. Moreover, the robot can also utilize different fire-extinguishing media according to the different burning materials identified.
2. Related Work

Currently, the four primary components of the laser SLAM framework for mobile robots are front-end matching, back-end optimization, loopback detection, and map building [20]. The SLAM framework is shown in Figure 2. According to the different back-end optimization methods, SLAM can be divided into filter-based SLAM and graph-optimization-based SLAM [21]. The first and most popular SLAM approach is filter-based. It is possible to consider filter-based SLAM as a continuous Bayesian estimation process. Extended Kalman filters [22], particle filters [23], and Rao–Blackwellized particle filters (RBPF) [24] are the three basic filters that filter-based SLAM has progressed through. Based on the RBPF, Montemerlo et al. [25] proposed the FastSLAM method, while Grisetti et al. [26] proposed the Gmapping SLAM algorithm.

Figure 2. SLAM framework.

In SLAM based on graph optimization, the pose of the robot at various times is represented as vertices in the graph. The interaction between vertices due to spatial constraints is represented as edges in the graph [27]. The spatial constraint relationship between vertices is obtained by matching odometer information or observation information. Then, the SLAM problem is transformed into a graph optimization problem to solve the optimal robot pose in attitude space under the premise of satisfying the constraint relations [28]. The Cartographer SLAM algorithm was proposed by Hess et al. [29] and is based on a standardized graph-optimized SLAM framework. The algorithm consists of two parts: front end and back end. The front end is used to construct the pose map, and the back end is used to optimize the pose map. This algorithm can build a map with good quality in a large-scale environment because the concept of the submap is proposed, and the accumulated errors between submaps can be eliminated by loopback detection. In this study, the Cartographer SLAM algorithm was used for robot map construction.

Firefighting robots mostly use visual sensors for flame detection and location, and they carry out firefighting and rescue according to the flame location information [30]. Thermal imaging cameras and visible cameras are two categories of visual sensors. Since visible-light cameras are significantly impacted by smoke, they cannot detect flames in smoke environments. Thermal imaging cameras, which passively pick up the object's thermal radiation, are unaffected by the smoke. Therefore, they can detect flames in a smoke-filled atmosphere.
There are currently three methods for flame detection systems based on vision sensors: traditional image detection, traditional machine learning, and deep learning [31].

Different flame detection systems have been designed by scholars such as Toreyin [32], Celik [33], Marbach [34], and Lin [35] based on traditional image processing methods. Among the flame detection technologies based on traditional machine learning, the most representative are the flame detection methods proposed by Byoung [36], Borges [37], and Chi et al. [38]. The deep-learning-based flame detection technology uses a convolutional neural network to design a self-learning classifier, which can automatically extract flame features. Currently, there are two types of object detectors based on convolutional neural networks: two-stage and one-stage. The most representative one-stage algorithms include YOLO, SSD, and RetinaNet [39]. One of the best target detection algorithms that strikes a compromise between speed and accuracy is the recently proposed YOLOv4 method. The YOLOv4 method is used as the flame detection algorithm in this paper. The network structure of the YOLOv4 target detection model is shown in Figure 3.

Figure 3. The network structure of the YOLOv4 target detection model.
3. Hardware Design of the Autonomous Inspection and Firefighting Robot

The hardware design for the autonomous inspection and firefighting robot is divided into two sections: autonomous inspection, and automatic firefighting. The autonomous inspection system includes a motion unit and a sensor unit. The motion unit comprises a mobile carrier, a drive controller, an industrial computer, and a power supply system, while the sensing unit comprises a Lidar, an IMU, and an encoder. The automatic fire-extinguishing system includes video monitoring and fire-extinguishing systems. The video monitoring system uses a camera and a pan/tilt to detect and locate the flame. The fire-extinguishing system consists of a fire extinguisher trigger device, a fire-extinguishing system rotating device, and a proximity switch. The hardware system architecture is depicted in Figure 4.

Figure 4. Schematic diagram of the robot's hardware system architecture.
picted in Figure 4.
ronment to generate laser point cloud information, and the encoder and IMU estimate
the robot’s pose as it moves. The SLAM algorithm in the industrial computer creates its
positioning and map based on the laser point cloud information and the robot posture
information. Using the positioning and map data, the robot conducts an autonomous
inspection. The image data collected by the video surveillance system and the map created
by SLAM are communicated to the host computer via the wireless local area network for
real-time viewing. At the same time, the flame detection algorithm on the host computer
identifies and locates the flame. The fire-extinguishing system targets the flame and ac-
tivates the fire-extinguishing device to put out the fire, completing the robot’s automatic
fire-extinguishing function. Figure 5 depicts the autonomous inspection firefighting robot
proposed in this paper.

Figure 5. Indoor autonomous inspection firefighting robot.

3.1. Design of the Robot Map Construction System


3.1.1. Robot Motion Unit
The hardware of the robot motion unit is shown in Figure 6. The robot must possess certain abilities to overcome obstacles in the fire field. The Safari-600T crawler chassis was chosen as the mobile carrier in this study. The crawler support surface is crawler-toothed and has strong traction and adhesion, meeting the criteria for climbing obstacles. The robot moves around by controlling the rotation of its motors. A brushless DC motor drives each of the left and right tracks of the robot chassis. The KYDBL4850-2E controller was chosen as the drive controller because it can control two DC brushless motors simultaneously. The power supply used for the firefighting robot described in this paper must be strong because it must transport a large volume of fire-extinguishing materials. This paper chose a 48 V, 30 AH LiFePO4 battery with a large capacity and high temperature resistance to power the robot.

Figure 6. The main hardware of the motion unit.

The industrial computer in the robot directs the work of the drive controller. This robot uses a Core i5 8250U industrial computer for robot data processing and algorithm operations, because the Core i5 8250U is a high-performance industrial computer that is dustproof, small in size, and can run stably in a fire field. The industrial computer is housed in the inside cabin of the robot's chassis. The industrial computer's antenna is connected to the robot's shell so that the industrial computer can receive information reliably, and the chosen antenna is a 2.4G/Wi-Fi high-gain antenna.

3.1.2. Robot Sensor Unit
The robot sensor unit comprises three sensors: Lidar, IMU, and encoder. The sensor placement positions are schematically illustrated in Figure 7. The RPLIDAR A2 Lidar is positioned upside-down in front of the robot, 0.7 m from its center point (the Lidar technical parameters are shown in Table 1). In the robot's center, the SC-AHRS-100D IMU is mounted. The incremental 1024-line encoder EB58W12R is installed at the crawler drive motor's tail.
motor’s tail.

Figure 7. Schematic diagram of the sensor installation positions.

Table 1. Main parameters of the RPLIDAR A2 Lidar.

Technical Indicators      Typical Value    Maximum Value
Ranging range             0.2~12 m         Not applicable
Ranging resolution        <0.5 mm          Not applicable
Scan angle                0~360°           Not applicable
Scanning frequency        10 Hz            15 Hz
Measurement frequency     –                8010 Hz
Angular resolution        –                1.35°
Laser wavelength          785 nm           795 nm

The coordinate system of the RPLIDAR A2 Lidar is based on the left-hand rule, whereas the ROS system is based on the right-hand rule. In the left- and right-handed coordinate systems, the direction of a specific coordinate axis is reversed. The forward direction of the robot is set as the positive direction of the x-axis, and the x-axis of the Lidar is set as the reverse direction to complete the transition of the left- and right-hand coordinate systems. The installation of the robot Lidar is depicted in Figure 8. The robot's primary sensor is the Lidar. The Lidar is installed upside-down to protect it from falling objects or the fire-extinguishing substances sprayed by the fire extinguisher. The conversion coordinate system must be changed in the startup file because the coordinate system of the Lidar is mounted upside-down, and the coordinate systems of the usual installation are mirror images of one another. Due to the robot's shell and track shielding, the Lidar scanning range is 0–270°.
scanning range is 0–270◦ .
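As an illustration of this startup-file change, the flipped mounting can be expressed as a static transform whose roll angle is π. The following ROS Kinetic Python sketch is an assumption-laden example rather than the authors' actual launch configuration; the frame names ("base_link", "laser") and the 0.2 m mounting height are hypothetical, while the 0.7 m forward offset follows the installation described above.

    import math
    import rospy
    import tf2_ros
    from geometry_msgs.msg import TransformStamped
    from tf.transformations import quaternion_from_euler

    rospy.init_node("lidar_static_tf")
    broadcaster = tf2_ros.StaticTransformBroadcaster()

    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = "base_link"   # robot base frame (assumed name)
    t.child_frame_id = "laser"        # Lidar frame (assumed name)
    t.transform.translation.x = 0.7   # Lidar is 0.7 m ahead of the robot center
    t.transform.translation.z = 0.2   # hypothetical mounting height
    # Roll = pi flips the upside-down Lidar back into the right-handed ROS convention.
    q = quaternion_from_euler(math.pi, 0.0, 0.0)
    t.transform.rotation.x, t.transform.rotation.y = q[0], q[1]
    t.transform.rotation.z, t.transform.rotation.w = q[2], q[3]

    broadcaster.sendTransform(t)
    rospy.spin()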

Figure 8. Lidar mounted upside-down on the robot: (a) Lidar. (b) Lidar mounted on the robot.

3.1.3. Robot Kinematics Model


The robot’s driving mechanism is rear-drive, with an independent driving wheel
on each left and right track. The robot’s motion node controls the two driving wheels’
speed changes, allowing the robot to perform a variety of maneuvers. Using encoder data
from the drive wheel motor, the robot’s motion state is determined in real time, and the
robot’s pose and odometer data are then estimated. The current robot’s real-time control
information is based on the odometer feedback information, which is used to give speed
control instructions. The robot’s motion control node calculates the speed of the driving
wheel motor and converts the motor speed into the robot’s forward speed in response to
the speed command. The robot’s steering is determined by the difference between the left
and right motors. The calculation structure of the tracked robot’s kinematic solution is
integrated into the ROS robot motion control node to achieve motor speed control.
Figure 9 depicts the robot’s motion model, where V is the robot’s linear velocity, ω
is the robot’s angular velocity, O is the robot’s center position, Vl is the robot’s left track
speed, Vr is the robot’s right track speed, R is the robot’s turning radius, and L is the robot’s
wheel spacing. The turning radius, angular velocity, and linear velocity of the robot are
calculated using Formulas (1)–(3), respectively. The linear and angular velocity of the robot
can be used to calculate the travel speed of the left and right tracks. The motor speed is then calculated based on the track driving wheel's diameter.
R = \frac{L(V_r + V_l)}{2(V_r - V_l)},   (1)

\omega = \frac{V_r - V_l}{L},   (2)

V = \omega R = \frac{V_r - V_l}{L} \cdot \frac{L(V_r + V_l)}{2(V_r - V_l)} = \frac{V_l + V_r}{2}   (3)
Figure 9. Motion model of crawler robot.

Depending on the track speeds, the robot has four motion states: If the track speed on both sides is the same, the robot moves straight ahead or straight backward. Steering motion is produced if the track speeds on both sides differ, with the track on the slower side serving as the steering direction. If one track rotates while the other remains stationary, the robot pivots around the stationary track. The robot rotates in place about the midpoint between the two tracks when the two tracks rotate at the same speed in opposite directions.
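As a concrete reading of Formulas (1)–(3), the sketch below inverts them to map a commanded body velocity to per-track speeds and then to a driving-wheel motor speed. It is a minimal illustration, not the robot's control code; the track spacing and wheel diameter values are hypothetical placeholders.

    import math

    L = 0.58          # track spacing in meters (hypothetical value)
    WHEEL_D = 0.15    # driving-wheel diameter in meters (hypothetical value)

    def track_speeds(v, omega):
        """Invert Formulas (2) and (3): body velocity (v, omega) -> (Vl, Vr)."""
        v_r = v + omega * L / 2.0
        v_l = v - omega * L / 2.0
        return v_l, v_r

    def motor_rpm(track_speed):
        """Convert a track's linear speed to driving-wheel speed in rpm."""
        return track_speed * 60.0 / (math.pi * WHEEL_D)

    # Example: 0.5 m/s forward while turning at 0.3 rad/s.
    v_l, v_r = track_speeds(0.5, 0.3)
    print(motor_rpm(v_l), motor_rpm(v_r))

Note that equal and opposite track speeds give v = 0 with a nonzero omega, which is exactly the in-situ rotation state described above.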
3.2. Design of the Robot’s Automatic Fire-Extinguishing System


An automatic fire-extinguishing system includes video monitoring and fire-extinguishing
systems. Flames are detected and located using a video surveillance system. The fire-
extinguishing system aims at the flame based on its location information and sprays an
extinguishing agent to put out the fire.

3.2.1. Video Surveillance System


The video surveillance system consists of a thermal imaging camera and a PTZ
(Pan/Tilt/Zoom). A Dahua DH-TPC-BF5400 thermal imaging camera with clear and
reliable imaging was used in this study. The camera is mounted on the PTS-303Z cradle
to extend the monitoring range, and the thermal camera is connected to the industrial
computer via network connections. Figure 10 shows the video surveillance system.

Figure 10. Video surveillance system.

The thermal imaging camera's video image data are fed into the flame detection model for detection. If a flame is detected, the deviation between the flame center coordinate and the image center is calculated; if a flame is not detected, the flame detection continues. When the position deviation exceeds the defined pixel value, its sign is judged. If the position deviation is negative, the robot receives a left-rotation control instruction; if the position deviation is positive, the robot receives the corresponding right-rotation control instruction. The flame position deviation is recalculated continuously during rotation, and the rotation stops when the position deviation is less than the set pixel value.
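The aiming procedure above reduces to a small piece of decision logic. The following Python sketch illustrates it under assumptions: the 20-pixel dead band and the command names are placeholders, not the authors' parameters.

    DEAD_BAND = 20  # assumed pixel threshold below which rotation stops

    def aiming_command(flame_cx, image_cx):
        """Return a rotation command from the horizontal flame-center deviation."""
        deviation = flame_cx - image_cx
        if abs(deviation) < DEAD_BAND:
            return "stop"          # flame centered: stop rotating, ready to spray
        return "rotate_left" if deviation < 0 else "rotate_right"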
3.2.2. Fire-Extinguishing System

Due to the inability of current firefighting robots' fire-extinguishing systems to cope with the diversity of the environment, this paper proposes a rotating fire-extinguishing system, as illustrated in Figure 11. The robot's fire-extinguishing equipment consists of four portable fire extinguishers. Four fire extinguisher cans are evenly spaced on the robot's disc. Various types of fire extinguishers can be combined and matched to combat various types of fires. The four fire extinguisher canisters can be sprayed alternately or simultaneously by controlling the rotating device and the fire extinguisher trigger.

Figure 11. Fire-extinguishing system installed on the robot.

The fire-extinguishing system comprises the following components: fire extinguisher fixtures, trigger device, rotating device, and system control device.
(1) Fire extinguisher fixtures

The robot's fire extinguisher fixtures can install and replace various fire extinguishers. The fire extinguishers are inserted into four circular grooves in the rotating disc of the robot. Two metal clasps are positioned above the grooves to secure the fire extinguishers further. Screws can be used to adjust the clasps' tightness.
(2) Fire extinguisher trigger device
The fire extinguisher trigger device is adjacent to the fire extinguisher fastening frame.
The fire extinguisher trigger device comprises a pushrod motor and a concave metal card
slot. The concave metal card slot is mounted on the top of the pushrod motor, which
is attached to the rotating disc via a pushrod. The pushrod motor descends when the
fire extinguisher must be triggered, driving the concave metal card slot to press the fire
extinguisher handle.
(3) Rotating device

The motor's rotation drives the revolving disc's rotation. The rotating disc is secured to the rotating bearing, which is secured to the robot chassis via the rotating bearing base. A large gear is installed on the outside of the rotating bearing, and a small gear is installed on the top of the motor's rotating shaft. A DC motor drives the meshing between the large and small gears.
To ensure precise fire extinguishing, it is necessary to ensure that the fire extinguisher rotates precisely to the front of the robot when the rotating disc rotates. In this article, the revolving disc is made of aluminum. A 1004NO NPN proximity switch sensor that is only sensitive to iron is installed below it, along with a metal foil attached to the underside of each fire extinguisher fixture. Figure 12 illustrates the proximity switch's installation position schematically. When the proximity switch detects iron, the robot's fire extinguisher nozzle is directly ahead. At this point, the proximity switch signal is received by the fire-extinguishing system's control device, which immediately issues the stop command.

Figure 12. Schematic diagram of the installation position of the proximity switch.

(4) System control device

In this paper, the STM32F103ZET6 microcontroller serves as the controller for the automatic fire-extinguishing system. To drive and control the four pushrod motors and the rotating device's DC motor, the L289N DC motor drive module is selected. The L289N is capable of driving and controlling two direct-current motors simultaneously. As a result, this system employs three L289Ns to control the drive of the five DC motors.

4. ROS-Based Map-Building System and Automatic Fire-Extinguishing System Implementation

ROS is a software framework for developing robots that supports various programming languages and includes the Gazebo simulation platform and the RVIZ visualization tool, enabling programmers to more easily design and debug mobile robot-related applications [40]. ROS was installed on Ubuntu 16.04 in this study, and its corresponding version was Kinetic. The fire inspection robot's software system is composed primarily of a remote control system, an industrial computer software system, and a bottom-drive control system. Through ROS distributed multi-machine communication, the computer running the remote control system communicates with the industrial computer. This work transplanted the map-building and flame detection algorithms into the ROS system as nodes to increase the stability of the system's programs, allowing all programs to perform unified engineering management and data communication within the ROS system.

4.1. Map-Building System Implementation


Figure 13 depicts the Cartographer map construction system's node diagram. The robot map construction system comprises multiple ROS nodes: /mbot_bringup is the robot's bottom control node, and its topic is the /firerobot_odom wheel odometer; /rplidarNode is the Lidar node, and its topic is /scan, i.e., the Lidar point cloud information; /imu is the node for the IMU sensor, and its topic is /imu/data_raw; /base_link_to_imu_link is a node for static coordinate conversion, and its topic is the /tf coordinate conversion, which performs the static conversion between the robot's base coordinate base_link and the IMU coordinate. The extended Kalman filter node /robot_pose_ekf subscribes to the imu and odom topics to perform information fusion and then publishes the fused coordinate information via the /tf coordinate change.

Figure 13. ROS node diagram of the Cartographer map construction system.
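For illustration, the sketch below shows one way the bottom control node could dead-reckon the /firerobot_odom wheel odometer from encoder increments, applying Formulas (2) and (3) from Section 3.1.3. It is a sketch under assumptions (the tick-per-meter scale and track spacing are hypothetical), not the authors' implementation.

    import math
    import rospy
    from nav_msgs.msg import Odometry
    from tf.transformations import quaternion_from_euler

    TICKS_PER_M = 40744.0   # hypothetical: encoder ticks per meter of track travel
    TRACK_SPACING = 0.58    # hypothetical track spacing in meters

    x = y = theta = 0.0

    def publish_odom(d_left_ticks, d_right_ticks, pub):
        """Integrate one encoder sampling interval and publish the pose.

        Assumes rospy.init_node(...) has been called by the enclosing node.
        """
        global x, y, theta
        d_l = d_left_ticks / TICKS_PER_M
        d_r = d_right_ticks / TICKS_PER_M
        d_center = (d_l + d_r) / 2.0           # Formula (3): V = (Vl + Vr)/2
        theta += (d_r - d_l) / TRACK_SPACING   # Formula (2): w = (Vr - Vl)/L
        x += d_center * math.cos(theta)
        y += d_center * math.sin(theta)

        odom = Odometry()
        odom.header.stamp = rospy.Time.now()
        odom.header.frame_id = "odom"
        odom.child_frame_id = "base_link"
        odom.pose.pose.position.x = x
        odom.pose.pose.position.y = y
        q = quaternion_from_euler(0.0, 0.0, theta)
        (odom.pose.pose.orientation.x, odom.pose.pose.orientation.y,
         odom.pose.pose.orientation.z, odom.pose.pose.orientation.w) = q
        pub.publish(odom)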
4.2. Realization of the Automatic Fire-Extinguishing System

An ROS node diagram of the automatic fire-extinguishing system is shown in Figure 14. At the far left of the illustration is the image acquisition node /ThermalImagingCamera of the thermal imaging camera, which is responsible for capturing video images; /image_listener is the image preprocessing node, which subscribes to the compressed image topic published by /ThermalImagingCamera and preprocesses the video image to complete the image format conversion; /darknet_ros is the flame detection model node used for flame detection, which processes the flame detection results and the flame coordinate information and publishes the /darknet_ros/bounding_boxes topic; /fire_aim is the flame aiming node. The flame aiming function is accomplished by controlling the robot's rotation based on the flame coordinate information.
Figure 14. ROS node diagram of the automatic fire-extinguishing system.
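A minimal sketch of how the /fire_aim node could consume the /darknet_ros/bounding_boxes topic is shown below, assuming the standard darknet_ros message layout; the "fire" class label, the dead band, and the /cmd_vel wiring are assumptions for illustration, and the dead-band logic mirrors Section 3.2.1.

    import rospy
    from geometry_msgs.msg import Twist
    from darknet_ros_msgs.msg import BoundingBoxes

    IMAGE_CENTER_X = 320   # assumed image width of 640 pixels
    DEAD_BAND = 20         # assumed stop threshold in pixels

    def on_boxes(msg, pub):
        flames = [b for b in msg.bounding_boxes if b.Class == "fire"]  # assumed label
        if not flames:
            return
        box = max(flames, key=lambda b: b.probability)
        deviation = (box.xmin + box.xmax) / 2.0 - IMAGE_CENTER_X
        cmd = Twist()
        if abs(deviation) > DEAD_BAND:
            # Rotate toward the flame: negative deviation -> turn left (positive yaw).
            cmd.angular.z = 0.2 if deviation < 0 else -0.2
        pub.publish(cmd)

    rospy.init_node("fire_aim")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rospy.Subscriber("/darknet_ros/bounding_boxes", BoundingBoxes, on_boxes,
                     callback_args=pub)
    rospy.spin()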
5. Results and Analysis

5.1. Building Fire Simulation Experimental Device Setup
In this study, a building fire simulation experimental device, as shown in Figure 15a, was constructed to evaluate the autonomous inspection and automatic fire-extinguishing functions of robots in the fire environment. The experimental device as a whole is an "L"-shaped corridor. A simulated fire field experimental environment with a length of 10 m, a width of 1.5 m, and a height of 2.2 m is formed inside the device. The external console and the device's display screen can control and display the experimental equipment inside the device. The dry powder fire extinguisher agent employed in the experiment is corrosive to some extent. To protect the experimental device in the experiment, a glass magnesium fireproof board with a length of 2 m and a width of 1.2 m is placed on both sides of the burning area, and a glass magnesium fireproof board is also placed in the internal test, as shown in Figure 15b.

Figure 15. Experimental setup: (a) Building fire simulation experimental device. (b) Inside the device before the experiment.
5.2. Robot Map Construction

The cotton rope was lit during the experiment to create a fire environment. Figure 16 depicts the cotton rope used in the experiment. At the robot's starting position, the distance between the Lidar and the innermost wall of the corridor was 3.8 m. The experimental procedure is depicted in Figure 17. On the left is a picture of the experimental environment taken by the camera embedded in the experimental device. On the right is the laser point cloud generated by Lidar scanning. A dot in a laser point cloud represents a laser beam, i.e., a ranging point.

Figure 16. Cotton rope to be ignited.

Figure 17. Laser point cloud in the fire environment: (a) Freshly lit cotton rope. (b) Cotton rope lit for
5 min. (c) Cotton rope lit for 9 min. (d) Cotton rope lit for 10 min. (e) Cotton rope lit for 11 min.

As illustrated in Figure 17, smoke and other factors at the fire site interfered with the Lidar ranging five minutes after the cotton rope was lit. At 9 min, the number of laser beams on the far right of the laser point cloud decreased. After 10 min, only three laser beams remained on the laser point cloud's far right. At 11 min, the laser beam to the right of the laser point cloud vanished completely. The experimental results indicate that the fire environment affects the maximum measuring distance of the Lidar. However, it can continue to operate normally, except that the laser beam at the farthest point is significantly affected, while the laser beams at other locations remain normal. At this point, the robot is instructed to advance by 1 m, as illustrated in Figure 18.

Figure 18. The laser point cloud after controlling the robot to move forward 1 m.

As illustrated in Figure 18, the laser beam on the far right reappeared after the machine moved forward by 1 m. The distance between the Lidar and the experimental device's innermost wall at this point was 2.8 m, indicating that the Lidar can detect typical obstacles within a range of 5.6 m in a fire environment. At this point, the robot is controlled to create a map of the environment, as illustrated in Figure 19.

Figure 19. The environment map constructed by the robot in the fire environment.

Figure 19 illustrates that the constructed map is comprehensive and has smooth edges. Due to the location of the fire prevention board, the map corresponding to the inner side of the experimental device is slightly frayed. Six locations marked in Figure 19 were chosen for measurement to determine the map's accuracy. The six measurement locations correspond to the six edges of the map that the robot constructed in this paper. The error data for the six measurement positions are shown in Table 2, and the error percentages for the six measurement positions are shown in Figure 20.
The maximum error percentage for each of the six measurement locations was 1.69%,
Table 2. Six edgewas
the minimum errors of the and
1.00%, map the
constructed
averageby the 1.33%.
was robot inAdditionally,
the experiment.as shown in Table 2
and Figure 20, the errors at positions 4, 5, and 6 were slightly greater due to the inclusion
Measuring Position Actual Length (cm) Map Build Length (cm) Absolute Error (cm) Error Percentage (%)
of the fire prevention board in the experimental device. The experimental results show
1 330.00the overall effect is 326.20
that insignificant, although the3.80 fire environment interferes1.15 with the
2 150.00 151.50 1.50 1.00
laser ranging. The robot designed in this paper has a high degree of accuracy in mapping
3 180.00 178.10 1.90 1.06
4 the fire
550.00 scene, which provides
541.20 a solid foundation for the
8.80 robot’s automatic inspection
1.60 and
5 fire extinguishing.
700.00 688.20 11.80 1.69
6 150.00 147.80 2.20 1.47
Table 2. Six edge errors of the map constructed by the robot in the experiment.

Measuring Posi- Actual Length Map Build Absolute Error Error Percentage
tion (cm) Length (cm) (cm) (%)
1 330.00 326.20 3.80 1.15
2 150.00 151.50 1.50 1.00
Fire
Fire2023,
2023,6,
6,x93FOR PEER REVIEW 1615ofof21
21

Figure20.
Figure Errorpercentage
20.Error percentageof
ofmeasured
measuredpositions.
positions.
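
The error percentages in Table 2 and Figure 20 follow directly from the measurements: absolute error divided by actual length, multiplied by 100. A quick Python check, with the values copied from Table 2:

# Actual length and map-built length (cm) for the six measured edges.
edges = {1: (330.00, 326.20), 2: (150.00, 151.50), 3: (180.00, 178.10),
         4: (550.00, 541.20), 5: (700.00, 688.20), 6: (150.00, 147.80)}
errors = {k: abs(actual - built) / actual * 100
          for k, (actual, built) in edges.items()}
for k, e in sorted(errors.items()):
    print(f"position {k}: {e:.2f}%")                 # 1.15, 1.00, 1.06, 1.60, 1.69, 1.47
print(f"average: {sum(errors.values()) / 6:.2f}%")   # 1.33, as reported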

The maximum error percentage among the six measurement locations was 1.69%, the minimum was 1.00%, and the average was 1.33%. Additionally, as shown in Table 2 and Figure 20, the errors at positions 4, 5, and 6 were slightly greater due to the inclusion of the fire prevention board in the experimental device. The experimental results show that the overall effect is insignificant, although the fire environment interferes with the laser ranging. The robot designed in this paper has a high degree of accuracy in mapping the fire scene, which provides a solid foundation for the robot's automatic inspection and fire extinguishing.

5.3. Robot Flame Detection and Automatic Aiming

The flame detection model for the firefighting robot uses an image dataset that was self-constructed in our lab. Firstly, the experimental setup was built to obtain thermal imaging flame images by experimental means and select flame images with high quality; secondly, the flame targets in the images were labeled; finally, the training and validation sets were divided to construct a standard dataset. The dataset contained a total of 14,757 thermal infrared flame images, of which 70% were used as the training set and 30% were used as the test set. The training environment parameters for the flame detection model are shown in Table 3.

Table 3. Training environment parameters for flame detection model.

Hardware Environment | Software Environment | Programming Language
NVIDIA Quadro K2000 4 G | Operating system Ubuntu 16.04, deep learning framework CSPDarknet53 | Python, C language

The definitions and settings of the major experimental parameters in YOLOv4 are shown in Table 4. Training was performed using pretrained weight files of the official YOLOv4 dataset for migration learning to accelerate convergence and improve learning speed.

Table 4. Definition and setting of the experimental parameters.

Parameter | Parameter Definition | Parameter Value
max_batches | Maximum number of iterations | 4000
batch | The number of samples participating in a batch of training | 64
subdivision | Number of sub-batches | 16
momentum | Momentum parameter in gradient descent | 0.9
decay | Weight attenuation regular term coefficient | 0.0005
learning_rate | Learning rate | 0.001
ignore_thresh | The threshold size of the IOU involved in the calculation | 0.5
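
For illustration, the 70/30 split described above can be reproduced in a few lines; the directory name and Darknet-style list files below are assumptions, not the authors' actual tooling:

import random
from pathlib import Path

images = sorted(Path("thermal_flame_images").glob("*.jpg"))  # 14,757 images in this paper
random.seed(42)                  # fixed seed so the split is reproducible
random.shuffle(images)

split = int(len(images) * 0.7)   # 70% training, 30% test
train, test = images[:split], images[split:]
Path("train.txt").write_text("\n".join(str(p) for p in train))
Path("test.txt").write_text("\n".join(str(p) for p in test))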

The flame detection model was trained and then tested for its ability to detect flames with automatic flame aiming. As shown in Figure 21, the oil pool was placed on a mobile platform with wheels, and the oil pool could be moved by pulling the iron bar at the window. The oil pool was filled with the fuel n-heptane, and the oil pool was 4.5 m away from the robot. The robot flame detection process and automatic flame aiming process are shown in Figure 22.

Figure 21. Mobile platform and oil pool.


Figure 22. The robot can still aim at the flame after the flame is moved: (a) The robot is aimed at the flame. (b) After the flame moves, the robot turns to automatically aim at it.

When the flame is stationary, the robot can automatically identify the flame and aim at it; if the mobile platform moves, the rotating robot can still identify the flame and automatically aim at it. The flame detection confidence of the robot is 72%, and the frame rate is 11 fps, as displayed on the flame identification terminal of the robot's fire-extinguishing system. This frame rate meets the requirement of real-time processing.
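
Although the aiming code itself is not listed in the paper, the behaviour described here can be approximated with a simple proportional controller that rotates the robot until the detected bounding box is centred in the thermal image. A hedged sketch, to be called once per detection; the image width, gain, deadband, and /cmd_vel topic are assumptions:

import rospy
from geometry_msgs.msg import Twist

IMAGE_WIDTH = 640   # assumed thermal image width in pixels
GAIN = 0.002        # assumed proportional gain (rad/s per pixel of offset)
DEADBAND = 10       # pixels; stop turning when the flame is roughly centred

rospy.init_node("flame_aimer")
cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)

def aim_at_flame(box):
    """box = (x_min, y_min, x_max, y_max) from the YOLOv4 detector."""
    centre_x = (box[0] + box[2]) / 2.0
    offset = centre_x - IMAGE_WIDTH / 2.0
    cmd = Twist()
    if abs(offset) > DEADBAND:
        cmd.angular.z = -GAIN * offset  # turn towards the flame
    cmd_pub.publish(cmd)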

5.4. Robot Autonomous Inspection and Automatic Fire Extinguishing
The experiment involved autonomous robot inspection and automatic fire extinguishing. The robot plans its path using the map created in Section 5.2 and the move_base function package in ROS [41]. Cotton rope and cardboard boxes were used as combustion materials in the experiment, and four 3 kg portable dry powder fire extinguishers were used as extinguishing agents. The burning materials and fire extinguishers used in this experiment are shown in Figure 23.
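
Sending an inspection goal to move_base takes only a few lines with the standard ROS actionlib client; the coordinates below are placeholders, not the experiment's actual waypoints:

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node("inspection_client")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"    # plan against the map built in Section 5.2
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 2.0      # placeholder waypoint
goal.target_pose.pose.orientation.w = 1.0   # identity orientation
client.send_goal(goal)
client.wait_for_result()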


Figure 23. Burning materials and fire extinguishers carried by robot: (a) Burning materials. (b) The robot carries four 3 kg dry powder fire extinguishers.
During the experiment, a large amount of smoke was generated using a shaded combustion cotton rope fire to rapidly reduce the experimental device's visibility, and the cardboard box was ignited. The entrance of the experimental device was set as the starting point of the robot inspection, the built environment map was loaded, the inspection locations were set, and path planning was carried out using the move_base function package of ROS to achieve independent inspection. The inspection path of the robot is shown in Figure 24. The robot detects the flame during the inspection process and automatically extinguishes the fire; the robot's automatic fire-extinguishing process is shown in Figure 25.

Figure 24. Inspection path of the robot.
Figure 25. The robot finds flames and extinguishes them during the autonomous inspection.
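
The inspection logic demonstrated in Figures 24 and 25 (visit each waypoint in order, break off to extinguish when a flame is detected) has a simple skeleton; the helper functions here are hypothetical stand-ins for the robot's navigation, detection, and extinguishing subsystems:

def patrol(waypoints, send_goal, detect_flame, extinguish):
    # Visit the preset inspection locations one by one.
    for wp in waypoints:
        send_goal(wp)          # e.g., the move_base client shown earlier
        if detect_flame():     # YOLOv4 detection on the thermal image stream
            extinguish()       # trigger the automatic fire-extinguishing system
            return "fire handled"
    return "inspection complete"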

6. Conclusions
This study developed a fire inspection robot with two parts: autonomous inspection and automatic fire extinguishing. The hardware for the autonomous inspection system primarily consists of a motion unit and a sensor unit, while the hardware of the automatic fire-extinguishing system primarily consists of a video monitoring system and a fire-extinguishing system. The Cartographer map construction algorithm is used to create a real-time map of the fire scene environment, and YOLOv4 deep-learning-based flame detection technology is used to detect flames. We constructed an L-shaped corridor fire environment simulation experiment device to evaluate the robot's performance. A cotton rope was lit in the experimental device to create a fire environment. When the cotton rope burned, a portion of the laser point cloud near the fire vanished; when the robot advanced 1 m, the point cloud reappeared, indicating that the fire environment did not prevent the robot's map building. By analyzing the constructed map, we determined that the average error percentage of map construction accuracy is only 1.33%, providing a solid foundation for autonomous robot inspection and fire extinguishing. Subsequently, in the robot's autonomous inspection and automatic fire-extinguishing experiment, the robot was able to detect and extinguish the flame in real time during the autonomous inspection process.

The robot performs two functions, autonomous inspection and automatic fire extinguishing, reducing firefighters' workload and decreasing the frequency of significant fires. Limited by laboratory space and the danger of fire experiments, the current fire simulation experimental device is small. In future work, we will actively work with relevant companies to conduct further testing on the usage performance of the robot through industry–academia integration.

Author Contributions: Conceptualization, S.L.; methodology, S.L., J.Y. (Junying Yun) and C.F.;
software, J.Y. (Junying Yun) and C.F.; validation, S.L., J.Y. (Junying Yun), C.F. and G.S.; formal analysis,
D.Z.; investigation, Y.G. and J.Y. (Jialuo Yang); writing—original draft preparation, S.L. and J.Y.
(Junying Yun); writing—review and editing, S.L. and J.Y. (Junying Yun). All authors have read and
agreed to the published version of the manuscript.
Funding: This work was supported by the National Natural Science Foundation for Young Scientists
of China (Grant No. 51804278), the Training Plan for Young Backbone Teachers in Colleges and
Universities in Henan Province (Grant No. 2021GGJS094), the Henan Science and Technology
Development Plan (Grant No. 212102210535), and the National Innovation Training Program for
College Students (Grant No. 202210462007).
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Not applicable.
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Zhou, Y.; Pang, Y.; Chen, F.; Zhang, Y. Three-dimensional indoor fire evacuation routing. ISPRS Int. J. Geo-Inf. 2020, 9, 558.
[CrossRef]
2. Kodur, V.; Kumar, P.; Rafi, M.M. Fire hazard in buildings: Review, assessment and strategies for improving fire safety. PSU Res.
Rev. 2020, 4, 1–23.
3. Li, Z.; Huang, H.; Li, N.; Zan, M.L.C.; Law, K. An agent-based simulator for indoor crowd evacuation considering fire impacts.
Autom. Constr. 2020, 120, 103395. [CrossRef]
4. Zhou, Y.-C.; Hu, Z.-Z.; Yan, K.-X.; Lin, J.-R. Deep learning-based instance segmentation for indoor fire load recognition. IEEE
Access 2021, 9, 148771–148782.
5. Birajdar, G.S.; Baz, M.; Singh, R.; Rashid, M.; Gehlot, A.; Akram, S.V.; Alshamrani, S.S.; AlGhamdi, A.S. Realization of people
density and smoke flow in buildings during fire accidents using raspberry and openCV. Sustainability 2021, 13, 11082. [CrossRef]
6. Ahn, Y.-J.; Yu, Y.-U.; Kim, J.-K. Accident cause factor of fires and explosions in tankers using fault tree analysis. J. Mar. Sci. Eng.
2021, 9, 844. [CrossRef]
7. Liu, Y.T.; Sun, R.Z.; Zhang, X.N.; Li, L.; Shi, G.Q. An autonomous positioning method for fire robots with multi-source sensors.
Wirel. Netw. 2021, 1–13. [CrossRef]
8. Jia, Y.-Z.; Li, J.-S.; Guo, N.; Jia, Q.-S.; Du, B.-F.; Chen, C.-Y. Design and research of small crawler fire fighting robot. In Proceedings
of the 2018 Chinese Automation Congress (CAC), Xi’an, China, 30 November–2 December 2018; pp. 4120–4123.
9. Gao, X.; Zhang, F.; Chaoxia, C.; He, G.; Chong, B.; Pang, S.; Shang, W. Design and Experimental Verification of an Intelligent
Fire-fighting Robot. In Proceedings of the 6th IEEE International Conference on Advanced Robotics and Mechatronics (ICARM),
Chongqing, China, 3–5 July 2021; pp. 943–948.
10. Liljeback, P.; Stavdahl, O.; Beitnes, A. SnakeFighter—Development of a water hydraulic fire fighting snake robot. In Proceedings
of the 9th International Conference on Control, Automation, Robotics and Vision, Singapore, 5–8 December 2006; pp. 1–6.
11. Kim, J.-H.; Starr, J.W.; Lattimer, B.Y. Firefighting robot stereo infrared vision and radar sensor fusion for imaging through smoke.
Fire Technol. 2015, 51, 823–845. [CrossRef]
12. Schneider, F.E.; Wildermuth, D. Using robots for firefighters and first responders: Scenario specification and exemplary system
description. In Proceedings of the 18th International Carpathian Control Conference (ICCC), Sinaia, Romania, 28–31 May 2017;
pp. 216–221.
13. Liu, P.; Yu, H.; Cang, S.; Vladareanu, L. Robot-assisted smart firefighting and interdisciplinary perspectives. In Proceedings of the
22nd International Conference on Automation and Computing (ICAC), Colchester, UK, 7–8 September 2016; pp. 395–401.
14. Zhang, S.; Yao, J.; Wang, R.; Liu, Z.; Ma, C.; Wang, Y.; Zhao, Y. Design of intelligent fire-fighting robot based on multi-sensor
fusion and experimental study on fire scene patrol. Robot. Auton. Syst. 2022, 154, 104122. [CrossRef]
15. Hong, J.H.; Min, B.-C.; Taylor, J.M.; Raskin, V.; Matson, E.T. NL-based communication with firefighting robots. In Proceedings of
the 2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Seoul, Republic of Korea, 14–17 October 2012;
pp. 1461–1466.
16. Tamura, Y.; Amano, H.; Ota, J. Analysis of cognitive skill in a water discharge activity for firefighting robots. ROBOMECH J. 2021,
8, 13. [CrossRef]
Fire 2023, 6, 93 20 of 21

17. Ando, H.; Ambe, Y.; Ishii, A.; Konyo, M.; Tadakuma, K.; Maruyama, S.; Tadokoro, S. Aerial hose type robot by water jet for fire
fighting. IEEE Robot. Autom. Lett. 2018, 3, 1128–1135. [CrossRef]
18. Mingsong, L.; Tugan, L. Design and Experiment of Control System of Intelligent Fire Fighting Robot. In Proceedings of the 2020
IEEE 4th Information Technology, Networking, Electronic and Automation Control Conference (ITNEC), Chongqing, China,
12–14 June 2020; pp. 2570–2573.
19. Gan, Z.; Huang, G.; Zhang, J.; Liu, X.; Shan, C. The Control System and Application Based on ROS Elevating Fire-Fighting Robot.
J. Phys. Conf. Ser. 2021, 2029, 012004. [CrossRef]
20. Wang, J.; Meng, Z.; Wang, L. A UPF-PS SLAM algorithm for indoor mobile robot with NonGaussian detection model. IEEE/ASME
Trans. Mechatron. 2021, 27, 1–11. [CrossRef]
21. Zong, W.P.; Li, G.Y.; Li, M.L.; Wang, L.; Li, S.X. A survey of laser scan matching methods. Chin. Opt. 2018, 11, 914–930. [CrossRef]
22. Van der Merwe, R.; Wan, E.A. The square-root unscented Kalman filter for state and parameter-estimation. In Proceedings of
the 2001 IEEE International Conference on Acoustics, Speech, and Signal Processing, Salt Lake City, UT, USA, 7–11 May 2001;
pp. 3461–3464.
23. Pagac, D.; Nebot, E.M.; Durrant-Whyte, H. An evidential approach to map-building for autonomous vehicles. IEEE Trans. Robot.
Autom. 1998, 14, 623–629. [CrossRef]
24. Murphy, K.P. Bayesian map learning in dynamic environments. In Advances in Neural Information Processing Systems; MIT Press:
Cambridge, MA, USA, 1999.
25. Montemerlo, M.; Thrun, S. Simultaneous localization and mapping with unknown data association using FastSLAM. In Pro-
ceedings of the 2003 IEEE International Conference on Robotics and Automation (Cat. No. 03CH37422), Taipei, Taiwan, 14–19
September 2003; pp. 1985–1991.
26. Grisetti, G.; Stachniss, C.; Burgard, W. Improved techniques for grid mapping with rao-blackwellized particle filters. IEEE Trans.
Robot. 2007, 23, 34–46. [CrossRef]
27. Taguchi, S.; Deguchi, H.; Hirose, N.; Kidono, K. Fast Bayesian graph update for SLAM. Adv. Robot. 2022, 36, 333–343. [CrossRef]
28. Gao, L.L.; Dong, C.Y.; Liu, X.Y.; Ye, Q.F.; Zhang, K.; Chen, X.Y. Improved 2D laser slam graph optimization based on Cholesky
decomposition. In Proceedings of the 8th International Conference on Control, Decision and Information Technologies (CoDIT),
Istanbul, Turkey, 17–20 May 2022; pp. 659–662.
29. Hess, W.; Kohler, D.; Rapp, H.; Andor, D. Real-time loop closure in 2D LIDAR SLAM. In Proceedings of the 2016 IEEE International
Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1271–1278.
30. Tadic, V.; Toth, A.; Vizvari, Z.; Klincsik, M.; Sari, Z.; Sarcevic, P.; Sarosi, J.; Biro, I. Perspectives of RealSense and ZED Depth
Sensors for Robotic Vision Applications. Machines 2022, 10, 183. [CrossRef]
31. Altay, F.; Velipasalar, S. The Use of Thermal Cameras for Pedestrian Detection. IEEE Sens. J. 2022, 22, 11489–11498. [CrossRef]
32. Töreyin, B.U.; Dedeoğlu, Y.; Güdükbay, U.; Cetin, A.E. Computer vision based method for real-time fire and flame detection.
Pattern Recognit. Lett. 2006, 27, 49–58. [CrossRef]
33. Celik, T.; Demirel, H.; Ozkaramanli, H. Automatic fire detection in video sequences. In Proceedings of the 2006 14th European
Signal Processing Conference, Florence, Italy, 4–8 September 2006; pp. 1–5.
34. Marbach, G.; Loepfe, M.; Brupbacher, T. An image processing technique for fire detection in video images. Fire Saf. J. 2006, 41,
285–289. [CrossRef]
35. Lin, M.X.; Chen, W.L.; Liu, B.S.; Hao, L.N. An Intelligent Fire-Detection Method Based on Image Processing. In Proceedings
of the IEEE 37th Annual 2003 International Carnahan Conference on Security Technology, Taipei, Taiwan, 14–16 October 2003;
pp. 172–175.
36. Ko, B.C.; Cheong, K.H.; Nam, J.Y. Fire detection based on vision sensor and support vector machines. Fire Saf. J. 2009, 44, 322–329.
[CrossRef]
37. Borges, P.V.K.; Izquierdo, E. A probabilistic approach for vision-based fire detection in videos. IEEE Trans. Circuits Syst. Video
Technol. 2010, 20, 721–731. [CrossRef]
38. Chi, R.; Lu, Z.M.; Ji, Q.G. Real-time multi-feature based fire flame detection in video. IET Image Process. 2017, 11, 31–37. [CrossRef]
39. Li, S.; Wang, Y.; Feng, C.; Zhang, D.; Li, H.; Huang, W.; Shi, L. A Thermal Imaging Flame-Detection Model for Firefighting Robot
Based on YOLOv4-F Model. Fire 2022, 5, 172. [CrossRef]
40. Conte, G.; Scaradozzi, D.; Mannocchi, D.; Raspa, P.; Panebianco, L.; Screpanti, L. Development and experimental tests of a ROS
multi-agent structure for autonomous surface vehicles. J. Intell. Robot. Syst. 2018, 92, 705–718. [CrossRef]
41. Li, S.; Feng, C.; Niu, Y.; Shi, L.; Wu, Z.; Song, H. A fire reconnaissance robot based on SLAM position, thermal imaging
technologies, and AR display. Sensors 2019, 19, 5036. [CrossRef]

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.
