A LOW COST INDOOR MAPPING ROBOT BASED ON TINYSLAM ALGORITHM

Zheng Gong1, Jonathan Li2, Wei Li1


1 Fujian Key Laboratory of Sensing and Computing for Smart Cities, School of Information Science and Engineering, Xiamen University, Xiamen, Fujian 361005, China
2 Mobile Mapping Lab, Department of Geography and Environmental Management, University of Waterloo, Waterloo, Ontario N2L 3G1, Canada

ABSTRACT

It is important for a robot or a smart device to locate itself and create a map of its indoor environment. A number of approaches and techniques exist for indoor mapping, many of which require expensive devices and highly complex computational algorithms. In our work, we introduce a low-cost robot architecture based on a cheap LiDAR system and an NVIDIA Jetson TK1 platform and perform real-time indoor mapping based on the tinySLAM algorithm.

Index Terms— LiDAR, SLAM, robot, low cost, real time

1. INTRODUCTION

A perception of the environment is essential to many mobile robotic systems. Much research has aimed to solve the problem of Simultaneous Localization and Mapping (SLAM).
There are several kinds of SLAM approaches based on different sensors and different algorithms. In early studies, some researchers used sonar or infrared sensors as the eyes of the robot [1]. Others employed a single camera [2] or a stereo camera [3] to perform SLAM. Of course, 2-D and 3-D LiDAR sensors are also widely used in SLAM research [1]. Two main algorithms for robot location estimation are the Kalman filter [4] and the particle filter [5]. Choset et al. [6] used the EKF and some other filters in SLAM; however, we did not include them in this study.
This article introduces a low-cost robot architecture based on a cheap LiDAR system and an NVIDIA Jetson TK1 platform and performs real-time indoor mapping based on the tinySLAM algorithm [7], which uses particle filtering.
First, we present the hardware and software architecture of our robot platform. Second, we discuss the algorithmic process. Section 4 of this article gives the experimental results.

2. PLATFORM DESCRIPTION

The platform we designed is a robot with two odometry wheels (Fig. 1). Because the two odometry wheels remain in contact with the ground and offer protection against the problem of slippage, the mechanical architecture of the robot provides excellent odometry.
The robot is powered by a 10 Ah, 14.8 V LiPo battery. Two 30 W DC-geared motors give the robot a top speed of about 2 m/s.

Fig. 1 Two-wheel robot platform comprising the crash sensors, the RPLiDAR system, and the main processing platform, the NVIDIA Jetson TK1.

2.1. Hardware Description

Fig. 2 shows the hardware architecture diagram of our robot platform. There are three kinds of sensors in this platform: three crash sensors (right, middle, left), two odometry sensors (663 pulses per revolution), and a low-cost LiDAR sensor (RPLiDAR), each described in detail later. Additionally, there are two processing boards in our system: the NVIDIA Jetson TK1 and the Arduino Mega2560.

2.1.1. NVIDIA Jetson TK1

Jetson TK1 is NVIDIA's embedded Linux development platform featuring a Tegra K1 SoC (CPU, GPU, and ISP in a single chip). Jetson TK1 comes pre-installed with the Linux4Tegra OS (essentially Ubuntu 14.04 with pre-configured drivers). The TK1 has a quad-core 2.3 GHz ARM Cortex-A15 CPU and the Tegra K1 GPU.
The Jetson TK1 is also equipped with 2 GB of DDR3L 933 MHz DRAM and 16 GB of fast eMMC storage.
In our work, we use the Jetson TK1 as the robot's main processing board. The tinySLAM algorithm and the Astar algorithm run on this board. The Jetson TK1 also provides our system's network and UART communication.

Fig. 2 Hardware architecture diagram of our robot platform.

2.1.2. Arduino

Arduino, intended for anyone making interactive projects, is an open-source electronics platform based on easy-to-use hardware and software.
The Mega2560 (an Arduino-series board) is a microcontroller board based on the ATmega2560. The Mega2560 has 54 digital input/output pins (of which 15 can be used as PWM outputs), 16 analog inputs, 4 UARTs (hardware serial ports), a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP header, and a reset button.
In our work, we use the Arduino Mega2560 as the robot's co-processing board. It parses the TK1's control signals and uses them to control the robot. It also collects sensor data (from the crash sensors and odometry sensors) and transmits the data to the TK1 board.
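
As a concrete illustration of how the odometry pulses might be turned into a pose estimate, the C sketch below performs standard differential-drive dead reckoning. Only the 663 pulses-per-revolution figure comes from this paper; the wheel radius, wheel base, and function names are assumptions for illustration.

#include <math.h>

/* Only TICKS_PER_REV comes from the paper (Sec. 2.1); the wheel
 * radius and wheel base below are assumed placeholder values. */
#define TICKS_PER_REV  663.0
#define WHEEL_RADIUS_M 0.05   /* assumed */
#define WHEEL_BASE_M   0.30   /* assumed */

typedef struct { double x, y, theta; } Pose;

/* Dead reckoning: convert the encoder ticks accumulated by each wheel
 * since the last update into a new pose estimate. */
void odometry_update(Pose *p, long left_ticks, long right_ticks)
{
    double dl = 2.0 * M_PI * WHEEL_RADIUS_M * (double)left_ticks  / TICKS_PER_REV;
    double dr = 2.0 * M_PI * WHEEL_RADIUS_M * (double)right_ticks / TICKS_PER_REV;
    double d      = 0.5 * (dl + dr);          /* forward displacement */
    double dtheta = (dr - dl) / WHEEL_BASE_M; /* heading change */

    /* integrate along the average heading over the interval */
    p->x     += d * cos(p->theta + 0.5 * dtheta);
    p->y     += d * sin(p->theta + 0.5 * dtheta);
    p->theta += dtheta;
}
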
2.1.3. Low-cost LiDAR System

Fig. 3 The low-cost laser scanner.

Fig. 3 shows a low-cost, 360-degree 2-D laser scanner based on the triangulation principle. The scanner performs a 360-degree laser scan with a distance detection range greater than 6 meters and a scan rate of approximately 5.5 Hz (2000 samples per second), which is sufficient for our robot to travel at a top speed of 1.5 m/s.
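
To illustrate how such a scanner's output is typically consumed by a mapping pipeline, the following C sketch projects one polar range sample into world coordinates given the current robot pose. The sample layout is a hypothetical simplification, not the actual RPLiDAR driver API.

#include <math.h>

/* One range sample from the 360-degree scan (hypothetical layout). */
typedef struct {
    double angle_deg; /* beam angle in the scanner frame, 0-360 */
    double range_m;   /* measured distance in meters */
} ScanSample;

/* Project a polar sample into world coordinates given the robot pose
 * (rx, ry, rtheta), as needed when inserting scan hits into the map. */
void sample_to_world(const ScanSample *s,
                     double rx, double ry, double rtheta,
                     double *wx, double *wy)
{
    double a = rtheta + s->angle_deg * M_PI / 180.0;
    *wx = rx + s->range_m * cos(a);
    *wy = ry + s->range_m * sin(a);
}
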

2.2. Software Description

The software comprises two parts: the server (the software embedded on the robot) and the client (the iPad software).

Fig. 4 Software architecture diagram of our robot platform.

The embedded software on the Jetson TK1 runs on the Ubuntu Linux distribution, which provides management of the USB, UART, and network interfaces. Most importantly, we also run the tinySLAM algorithm (which includes a particle filter) and the Astar algorithm in real time on this platform.
The embedded software on the Arduino provides PID (proportional, integral, derivative) speed control, collects sensor data, and sends the data to the TK1 board; a minimal PID sketch appears at the end of this section.
The client software, shown in Fig. 8, displays the robot's mapping result and location (X, Y, Theta) and also provides a control panel that allows users to maneuver the robot.
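
The sketch below is a minimal C illustration of the PID speed loop described above. The structure and function names are assumptions for illustration; the authors' actual firmware is not published in this paper.

/* Minimal PID speed controller sketch (illustrative; gains and names
 * are assumed, not taken from the authors' firmware). */
typedef struct {
    double kp, ki, kd;  /* proportional, integral, derivative gains */
    double integral;    /* accumulated error */
    double prev_error;  /* error from the previous control cycle */
} Pid;

/* One control cycle: compare the target wheel speed against the speed
 * measured from the odometry pulses and return a motor command. */
double pid_step(Pid *c, double target, double measured, double dt)
{
    double error = target - measured;
    c->integral += error * dt;
    double derivative = (error - c->prev_error) / dt;
    c->prev_error = error;
    return c->kp * error + c->ki * c->integral + c->kd * derivative;
}

In a setup like this, the loop would run at a fixed rate on the Arduino, with the measured speed derived from the odometry pulse counts and the output mapped to a PWM duty cycle.
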
3. METHOD

There are four steps in our system (Fig. 5): (1) data collection, (2) the tinySLAM algorithm, (3) the auto navigation algorithm, and (4) the Astar algorithm with PID speed control.

Step 1: We perform measurements to collect sensor data and send it to the SLAM algorithm.
Step 2: We perform SLAM using the tinySLAM algorithm, the core method of our system, which consists of two main operations: calculating distance using a particle filter and updating the map using the Monte-Carlo algorithm [7]; a minimal sketch of this step follows the step list.
Step 3: We use the auto navigation algorithm to find a destination on the map generated in the previous step.

Step 4: We apply the Astar algorithm and PID speed control to guide the robot to the destination determined in Step 3.
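
The following C sketch captures the spirit of Step 2's Monte-Carlo position search: candidate poses around the current estimate are scored by projecting the scan into the map, and the best-scoring pose is kept. The map representation, constants, and perturbation magnitudes are assumptions for illustration; the authors' actual method is the tinySLAM implementation of [7].

#include <math.h>
#include <stdlib.h>

#define MAP_SIZE 2048   /* grid cells per side (assumed) */
#define CELL_M   0.01   /* assumed cell resolution: 1 cm */

typedef struct { double x, y, theta; } Pose;
typedef struct { double angle, dist; } Beam; /* one LiDAR beam */

/* Likelihood map maintained by the mapping step; in tinySLAM this is a
 * grid of 16-bit values where lower means "closer to an obstacle". */
extern unsigned short map_grid[MAP_SIZE][MAP_SIZE];

/* Score a candidate pose by projecting every beam endpoint into the map
 * and summing the values found there (lower sum = better match). */
static long score_pose(const Pose *p, const Beam *scan, int n)
{
    long s = 0;
    for (int i = 0; i < n; i++) {
        double a = p->theta + scan[i].angle;
        int cx = (int)((p->x + scan[i].dist * cos(a)) / CELL_M);
        int cy = (int)((p->y + scan[i].dist * sin(a)) / CELL_M);
        if (cx >= 0 && cx < MAP_SIZE && cy >= 0 && cy < MAP_SIZE)
            s += map_grid[cy][cx];
        else
            s += 65535; /* penalize beams that fall outside the map */
    }
    return s;
}

/* Monte-Carlo search: randomly perturb the current estimate (seeded by
 * odometry) and keep the best-scoring candidate. */
Pose monte_carlo_search(Pose start, const Beam *scan, int n, int iters)
{
    Pose best = start;
    long best_s = score_pose(&best, scan, n);
    for (int i = 0; i < iters; i++) {
        Pose cand = best;
        cand.x     += ((double)rand() / RAND_MAX - 0.5) * 0.10; /* +-5 cm */
        cand.y     += ((double)rand() / RAND_MAX - 0.5) * 0.10;
        cand.theta += ((double)rand() / RAND_MAX - 0.5) * 0.10; /* +-0.05 rad */
        long s = score_pose(&cand, scan, n);
        if (s < best_s) { best_s = s; best = cand; }
    }
    return best;
}
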

Fig. 5 Flowchart of the robot's algorithmic process.

Due to the structural constraints of the 2-D LiDAR system, our robot cannot locate an object that lies outside the scanner's detection plane (the robot would collide with an object positioned lower or higher than the LiDAR). Thus, we add three crash sensors to the front of the robot. The crash sensors generate a pulse signal if the robot has hit an object; the robot will then stop and plan an alternate route in the Astar step. A minimal A* sketch follows this paragraph.
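
For context on the replanning step, here is a compact, self-contained A* sketch in C on a 4-connected occupancy grid with a Manhattan heuristic. The grid size, array layout, and the linear open-set scan are simplifications assumed for illustration (a real implementation would use a priority queue); the rough map of Fig. 8 would be the actual input.

#define W 64            /* grid width in cells (assumed) */
#define H 64            /* grid height in cells (assumed) */
#define INF 0x7fffffff

static int abs_i(int v) { return v < 0 ? -v : v; }

/* Returns 1 if a path from (sx, sy) to (gx, gy) exists; came_from holds
 * each cell's parent (encoded y * W + x) so the path can be walked back
 * from the goal. grid[y][x] != 0 marks an obstacle cell. */
int astar(const unsigned char grid[H][W],
          int sx, int sy, int gx, int gy, int came_from[H][W])
{
    static int g[H][W];            /* cost from start */
    static unsigned char closed[H][W];
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            g[y][x] = INF; closed[y][x] = 0; came_from[y][x] = -1;
        }
    g[sy][sx] = 0;

    for (;;) {
        /* pick the open cell with the lowest f = g + Manhattan heuristic
         * (a linear scan keeps the sketch short) */
        int bx = -1, by = -1, bestf = INF;
        for (int y = 0; y < H; y++)
            for (int x = 0; x < W; x++) {
                if (closed[y][x] || g[y][x] == INF) continue;
                int f = g[y][x] + abs_i(x - gx) + abs_i(y - gy);
                if (f < bestf) { bestf = f; bx = x; by = y; }
            }
        if (bx < 0) return 0;               /* open set empty: no path */
        if (bx == gx && by == gy) return 1; /* goal reached */
        closed[by][bx] = 1;

        static const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
        for (int i = 0; i < 4; i++) {
            int nx = bx + dx[i], ny = by + dy[i];
            if (nx < 0 || nx >= W || ny < 0 || ny >= H) continue;
            if (grid[ny][nx] || closed[ny][nx]) continue;
            if (g[by][bx] + 1 < g[ny][nx]) {      /* relax the neighbor */
                g[ny][nx] = g[by][bx] + 1;
                came_from[ny][nx] = by * W + bx;
            }
        }
    }
}
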
4. RESULTS AND DISCUSSION

Experimental results are given in Figs. 6, 7, and 8. As seen in Fig. 6, an environment containing many objects is very difficult to detect and trace using the laser scanner.

Fig. 6 The first experiment, simulated on the PC; it is not real-time.

Fig. 7 The robot's first experimental environment. The orange robot was the first version and did not employ a real-time system; rather, it sent the data to a computer, where the data was used to verify the tinySLAM algorithm.

Unlike other algorithms and methods [8], our system exhibits superior performance in different environments. Fig. 8 shows the final results of the new robot. The positioning accuracy is about 10 mm to 30 mm.

Fig. 8 A map of the laboratory obtained during our experiments, displayed on an iPad. The map on the left is the real map generated by the tinySLAM algorithm. The map on the right is a rough map, which is used in the Astar algorithm.
5. CONCLUSION

We developed a low-cost robot based on a cheap LiDAR system and the NVIDIA Jetson TK1 platform. We performed real-time indoor mapping based on the tinySLAM algorithm that uses particle filtering. From this work, we conclude the following:

1. tinySLAM is a real-time approach (on an ARM platform) that enables our robot to locate itself and build a map with a positioning accuracy of 10 mm to 30 mm.
2. Our system runs without odometer data, but the accuracy will increase if we add odometer data to the algorithm, especially if the robot is operating in a uniform environment such as a long corridor.
3. The kidnapped robot problem, in which a mobile robot must recover from localization failure, is yet to be solved [4].
4. Because of the inherent sampling limitation of the 2-D laser sensor, the robot sometimes fails when avoiding an obstacle.

6. REFERENCES

[1] Riisgaard, Søren, and Morten Rufus Blas. "SLAM for Dummies." A Tutorial Approach to Simultaneous Localization and Mapping 22.1-127 (2003): 126.

[2] Davison, Andrew J. "Real-time simultaneous localisation and mapping with a single camera." Computer Vision, 2003. Proceedings. Ninth IEEE International Conference on. IEEE, 2003.

[3] Sola, Joan, et al. "Fusing monocular information in multicamera SLAM." Robotics, IEEE Transactions on 24.5 (2008): 958-968.

[4] Thrun, Sebastian, Wolfram Burgard, and Dieter Fox. Probabilistic Robotics. MIT Press, 2005.

[5] Eliazar, Austin, and Ronald Parr. "DP-SLAM: Fast, robust simultaneous localization and mapping without predetermined landmarks." IJCAI. Vol. 3. 2003.

[6] Choset, Howie, and Keiji Nagatani. "Topological simultaneous localization and mapping (SLAM): toward exact localization without explicit localization." Robotics and Automation, IEEE Transactions on 17.2 (2001): 125-137.

[7] Steux, Bruno, and Oussama El Hamzaoui. "tinySLAM: A SLAM algorithm in less than 200 lines C-language program." Control Automation Robotics & Vision (ICARCV), 2010 11th International Conference on. IEEE, 2010.

[8] Ouellette, Robert, and Kotaro Hirasawa. "A comparison of SLAM implementations for indoor mobile robots." Intelligent Robots and Systems, 2007. IROS 2007. IEEE/RSJ International Conference on. IEEE, 2007.
