Development of Mobile Robot Using LIDAR Technology Based on Arduino Controller

Pavlo Denysyuk, Vasyl Teslyuk
CAD Department
Lviv Polytechnic National University
Lviv, Ukraine
pavlo.denysyuk@gmail.com, vasyl.teslyuk@gmail.com

Iryna Chorna
Separated structural subdivision "Educational and Scientific Institute of Enterprise and Perspective Technologies"
Lviv Polytechnic National University
Lviv, Ukraine
iryna.chorna@ukr.net

Abstract—In this work, a software model based on the graph-based SLAM methodology is developed using LIDAR technology, which allows direct and automatic control of a mobile robot in real time. The architecture of the developed software and the main algorithms on which the system is based are described. The technical support built around the Arduino microcontroller is developed. Examples of the application of the built-in software are given.

Keywords—SLAM methodology, LIDAR technology, mobile robot system, Arduino microcontroller.

I. INTRODUCTION

Nowadays, robotics occupies a significant place in the everyday life of a person [1]. Robots are used as assistants in smart home systems [2], in industry, and in many fields of science and technology, and perform a number of tasks much better than a person [3, 4]. The development of a system that can independently move and orient itself in space is the first step towards the realization of a robotic system with artificial intelligence.

From a technological point of view, the easiest way to measure the distance to objects in the environment is to use an ultrasonic sensor [5]. However, this type of sensor has an imperfection, the "blind zone": when the sound wave is reflected from surfaces at too large an angle, the sound does not return to the receiver.

LIDAR technology [6, 7] is the most effective method of collecting data about the distance to surrounding objects. It obtains environmental information through active optical systems. Using a LIDAR sensor allows a two-dimensional or three-dimensional map of the space to be built. Anthropomorphic behavior of the system [8] can be achieved by analyzing this map with an artificial neural network [9] or with simpler algorithms.

The task of simultaneously constructing a map of the space and positioning a mobile robot on it is not trivial, since it requires the system to work in real time, to ensure the accuracy of measurements, and to reduce the impact of errors on the results.

There is a wide range of similar systems on the market, but in most cases they do not provide the user with graphical interface software that allows them to control the robot and monitor the operations that the system performs automatically. The ability to automatically find a route also distinguishes the developed system from conventional radio-controlled devices and provides a basis for further improvement and the implementation of useful functions.

Accordingly, the purpose of the work is to develop a system based on the SLAM methodology [10, 11] and LIDAR technology that is able to automatically plan a path and move to a specified point on the map.

II. DEVELOPMENT OF TECHNICAL SUPPORT OF THE SYSTEM

The Arduino DUE [12] microcontroller is selected for the implementation, because it meets all the requirements and is much faster than other boards with similar functionality.

For the realization of the system, a LIDAR sensor is used. Its principle of operation is similar to that of a radar: a directed light beam is reflected from objects and returned to the receiver [13]. Its disadvantage is the inability to obtain correct data about the distance to transparent objects. The connection scheme of the LIDAR sensor is shown in Fig. 1.

Fig. 1. Simplified connection scheme of the LIDAR sensor

In the design, the PiccoloLaserDistanceSensor is used. In standard mode, the laser performs 240 revolutions per minute, which provides a complete update of the data 4 times per second.


To connect the control program with the microcontroller, the Bluetooth module HC-06 is used. The signal strength provided by this module is quite sufficient for control in a large room.

A gyroscope MPU6050 is used to measure the angle to which the platform has turned. For correct operation of the gyroscope, the error must be measured before use and a correction factor for the results must be determined. The Pirate-4WD Mobile Platform with four FIT0016 motors is used as the mobile platform. A PowerBank 10000 mAh with 5 accumulators of 2700 mAh is used as the power source.
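The paper does not detail this calibration step. As one common interpretation, the sketch below estimates the stationary bias of the gyroscope by averaging raw readings taken while the platform stands still and subtracts it from later measurements; the class and method names are illustrative assumptions, not taken from the implementation.

```java
/**
 * Sketch of a possible gyroscope calibration: while the platform is
 * stationary, the raw angular-rate readings are averaged to estimate the
 * drift (error), which is then subtracted from every subsequent reading.
 */
public class GyroCalibration {
    private double bias; // estimated drift of the sensor

    /** samples - raw angular-rate readings taken while the platform is stationary. */
    public void calibrate(double[] samples) {
        double sum = 0;
        for (double s : samples) {
            sum += s;
        }
        bias = sum / samples.length; // average error measured before use
    }

    /** Applies the correction to a raw reading. */
    public double correct(double rawRate) {
        return rawRate - bias;
    }
}
```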
The orientation in space and the positioning on the graph are provided by three types of sensors: encoder, gyroscope and LIDAR, whose data is processed continuously, in real time.

A protocol for data exchange has been developed to establish communication and exchange information between the control program and the executing program. The type of data received through the data link is identified by corresponding numerical codes.
III. DEVELOPMENT OF SOFTWARE SYSTEM MODEL

In the process of software development, Java was used; the interface was developed with the standard Java Swing library, which provides an easy implementation of the interface using the MVC architecture. IntelliJ IDEA [14] and the Arduino IDE are used as the development environments.

The program is built according to a scheme (Fig. 2) that splits the data, the user interface and the control logic into three separate Model-View-Controller (MVC) components. Below is a description of the main classes shown in the diagram, split according to the MVC pattern [15, 16].

Model: Execution starts with the Main class. Its main function is to launch the user interface. The Robo class is used to manage the robot: after a control method is called, it delegates the transfer of the corresponding command through the communication channel to the executing program via the COMport class.

The COMport class is used to transmit data through the communication channel. It passes commands that the executing program recognizes as numeric codes. There are four types of commands:

• LIDAR data transfer code. After receipt, two numbers are expected: angle and distance. The resulting data is transmitted to the module for constructing the graph, the GraphCreator class.

• Data transfer code from the gyroscope. After receipt, the running angle by which the wheeled platform has turned relative to the original direction is expected. After receiving the information, the data status in the Robo class is updated.

• Transmission code of the distance traveled. After receiving this code, a number that means the distance traveled in centimeters is expected. The status of the Robo.distance variable is updated.

• Completion code for the operation.

After receiving the code for completing the operation, the control module goes out of standby and passes the following command (if any) to the executing program.
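For illustration, a minimal sketch of the receiving side of this protocol is given below. The concrete numeric values of the codes, the Scanner-based parsing and the helper method names are assumptions, since the paper does not specify them; only the four command types follow the description above.

```java
import java.util.Scanner;

/** Minimal stand-ins for the classes described in the paper. */
class Robo {
    double angle;       // running angle of the platform
    double distance;    // distance traveled, cm
    void onOperationCompleted() { /* leave standby, send the next command */ }
}
class GraphCreator {
    void addPolarMeasurement(double angleDeg, double distanceCm) { /* pass to graph building */ }
}

/** Sketch of decoding the numeric command codes received from the executing program. */
public class CommandDecoder {
    static final int CODE_LIDAR = 1;      // followed by: angle, distance
    static final int CODE_GYRO = 2;       // followed by: running angle
    static final int CODE_DISTANCE = 3;   // followed by: distance traveled, cm
    static final int CODE_DONE = 4;       // operation completed

    private final Robo robo;
    private final GraphCreator graphCreator;

    public CommandDecoder(Robo robo, GraphCreator graphCreator) {
        this.robo = robo;
        this.graphCreator = graphCreator;
    }

    /** Reads codes from the COM-port stream and updates the program state. */
    public void process(Scanner in) {
        while (in.hasNextInt()) {
            int code = in.nextInt();
            switch (code) {
                case CODE_LIDAR: {          // two numbers: angle and distance
                    double angle = in.nextDouble();
                    double distance = in.nextDouble();
                    graphCreator.addPolarMeasurement(angle, distance);
                    break;
                }
                case CODE_GYRO:             // rotation relative to the original direction
                    robo.angle = in.nextDouble();
                    break;
                case CODE_DISTANCE:         // distance traveled in centimeters
                    robo.distance = in.nextDouble();
                    break;
                case CODE_DONE:             // completion code for the operation
                    robo.onOperationCompleted();
                    break;
            }
        }
    }
}
```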
After receiving data from the LIDAR sensor, a graph is constructed. The graph is used to find a path and to position the wheelbase of the technical platform on it. After that, the task of constructing a path on the graph is solved and control commands are formed to automatically move the platform to the given point.

View: Used to build the user interface. The interface contains a field for typing text commands and a visual image of the graph, which corresponds to a two-dimensional map of the space constructed from the data obtained from the LIDAR sensor.

Controller: The key features of managing the mobile platform with the keyboard and mouse are realized here.

Additionally, auxiliary classes were developed to display graphical and textual information, to reset key parameters when the system freezes, to log all operations, and to track the threads in the program.

Fig. 3 presents the general scheme of the work of the control and executing programs. The user interacts with the system through the graphical interface and a line for entering commands and settings. Management can be done in three ways: using the arrow keys on the keyboard; using the graphical interface, by placing points on the graph to which the wheeled platform should move; or through the console line, by entering the appropriate commands.

In any control mode, the program switches to the last command received from the user. By pointing out the path on the graph and entering the command "run", the program switches to automatic mode and control is transferred to the corresponding module to find the path to the specified point. The control program stops working automatically when all preset destination points are reached or when the user stops the automatic search.

All commands, in the form of the corresponding codes, are sent to the executing program via the COM port. After that, the Robo class object gets a label (inProgress), which is used by the program in automatic mode and signals that the executing program is currently in run mode. The inProgress label is removed after receiving the code from the executing program about the completion of the operation.
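Complementing the decoding sketch above, a minimal sketch of the sending side and of the inProgress label lifecycle is given below. The COMport.send signature and the queueing of pending commands are assumptions made for illustration and are not described in the paper.

```java
import java.util.ArrayDeque;
import java.util.Queue;

/** Minimal stand-in for the COMport class from the paper. */
class COMport {
    void send(int[] codes) { /* write the numeric codes to the serial stream */ }
}

/**
 * Sketch of the sending side: commands are queued and sent one at a time
 * through the COM port; the inProgress label blocks the next command until
 * the completion code arrives from the executing program.
 */
public class CommandSender {
    private final Queue<int[]> pending = new ArrayDeque<>(); // queued commands
    private boolean inProgress = false;                       // "run mode" label
    private final COMport port;

    public CommandSender(COMport port) { this.port = port; }

    /** Queues a command; sends it immediately if the robot is idle. */
    public synchronized void submit(int... commandCodes) {
        pending.add(commandCodes);
        sendNextIfIdle();
    }

    /** Called when the operation-completion code is received from the robot. */
    public synchronized void onOperationCompleted() {
        inProgress = false;      // the label is removed
        sendNextIfIdle();        // pass the following command, if any
    }

    private void sendNextIfIdle() {
        if (!inProgress && !pending.isEmpty()) {
            port.send(pending.poll()); // transmit the codes via the COM port
            inProgress = true;         // wait for the completion code
        }
    }
}
```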
A software model of the robotic system that makes it possible to process the data in real time is thus elaborated. The uniqueness of the developed software model is the use of graph theory to find the optimal path of motion.


In the state of waiting for commands from the user, the program on the microcontroller queries the LIDAR class and the MPU class to read the data about obstacles in the room and to track the angle of rotation. The data obtained from the LIDAR and the MPU are sent to the control program through the communication channel with the corresponding data code. In the control program, a separate thread processes the information received through the COM port and updates the program data state.

In automatic mode, the control program receives from the mobile robot the distance traveled, which is used to update the position of the start node in the graph.

Fig. 2. The general scheme of work with the data received from the MD

Fig. 3. General scheme of the system work

At the beginning of the program, the start node of the graph is created (Fig. 4) with the coordinates N(X0, Y0). The starting point is the vertex at which the robot is located at the moment. While moving, the robot can change its position on the graph.

Fig. 4. Starting node of the graph at the beginning of the program

To construct the graph, the polar coordinates of the points that the program receives from the robot through the communication channel are used. Before use, the polar coordinates are turned into the Cartesian coordinates of a node N(X1, Y1): if d is the obtained distance and μ is the obtained angle, then X1 = d * cos(μ) and Y1 = d * sin(μ). If a negative distance is obtained, there is no obstacle in the given radius of visibility, and all nodes through which the straight line passes from the node with coordinates N(X0, Y0) to the node with the received coordinates N(X1, Y1) are designated as free (a path can be laid through them).

If the received distance is positive, all nodes from the start node up to the node with the received coordinates are denoted as free, and the node with the coordinates N(X1, Y1) is denoted as occupied - laying a path through this vertex is impossible.
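A minimal sketch of this conversion and of the sign rule is given below. The assumptions that the angle arrives in degrees and that a negative distance is replaced by the radius of visibility, as well as the record name, are illustrative and not taken from the paper.

```java
/**
 * Sketch of converting one LIDAR measurement from polar to Cartesian
 * coordinates and applying the sign rule described above.
 */
public class PolarToCartesian {

    /** Result of one measurement: end point of the ray and whether it hit an obstacle. */
    public record RayEnd(double x1, double y1, boolean obstacle) { }

    /**
     * d - measured distance (negative means "no obstacle within the radius"),
     * muDegrees - measured angle, visibilityRadius - scan radius.
     */
    public static RayEnd convert(double d, double muDegrees, double visibilityRadius) {
        boolean obstacle = d >= 0;                  // positive distance: obstacle at the end node
        double r = obstacle ? d : visibilityRadius; // negative distance: free up to the radius
        double mu = Math.toRadians(muDegrees);
        double x1 = r * Math.cos(mu);               // X1 = d * cos(mu)
        double y1 = r * Math.sin(mu);               // Y1 = d * sin(mu)
        return new RayEnd(x1, y1, obstacle);
    }
}
```

The nodes along the line from N(X0, Y0) to this end point are then marked as free, and the end node is marked as occupied only when an obstacle was hit; a sketch of that step is given after the next paragraph.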
Figure 5 shows an example of constructing the graph from the coordinates of three points. After receiving the Cartesian coordinates, the distance is divided by the dimension of one node of the graph, and the straight line is divided into the resulting number of segments. During generation, the nodes through which the straight line passes from the starting node N(X0, Y0) to the point with the received coordinates N(X1, Y1) are created. All created nodes are added to a hash table. If the node of the graph through which the straight line passes already exists, the previously created object is used. While creating a new node, all adjacent nodes are identified and linked by coordinates to preserve the integrity of the grid.
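The sketch below illustrates this node generation with a hash table, assuming integer grid coordinates as the key and a fixed node size; the state names, the key format and the method names are assumptions, and the neighbour linking is only indicated by a comment.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Sketch of generating grid nodes along the segment from the start node
 * N(X0, Y0) to the measured point N(X1, Y1) and storing them in a hash table,
 * reusing a node if it already exists.
 */
public class GraphCreator {
    enum NodeState { FREE, NO_WAY }

    static class Node {
        final int gx, gy;                       // integer grid coordinates
        NodeState state = NodeState.FREE;
        Node(int gx, int gy) { this.gx = gx; this.gy = gy; }
    }

    private final Map<Long, Node> nodes = new HashMap<>(); // hash table of all nodes
    private final double nodeSize;                          // dimension of one grid node

    public GraphCreator(double nodeSize) { this.nodeSize = nodeSize; }

    private static long key(int gx, int gy) {
        return (((long) gx) << 32) ^ (gy & 0xffffffffL);
    }

    /** Returns the existing node at (gx, gy) or creates and registers a new one. */
    private Node nodeAt(int gx, int gy) {
        // In the full system, creating a node would also link it to its
        // adjacent nodes here to preserve the integrity of the grid.
        return nodes.computeIfAbsent(key(gx, gy), k -> new Node(gx, gy));
    }

    /** Marks the segment from (x0, y0) to (x1, y1) as free; the end node as occupied if needed. */
    public void addMeasurement(double x0, double y0, double x1, double y1, boolean obstacleAtEnd) {
        double length = Math.hypot(x1 - x0, y1 - y0);
        int steps = Math.max(1, (int) Math.round(length / nodeSize)); // segments of one node length
        for (int i = 0; i <= steps; i++) {
            double t = (double) i / steps;
            int gx = (int) Math.round((x0 + t * (x1 - x0)) / nodeSize);
            int gy = (int) Math.round((y0 + t * (y1 - y0)) / nodeSize);
            Node n = nodeAt(gx, gy);
            boolean isEnd = (i == steps);
            n.state = (isEnd && obstacleAtEnd) ? NodeState.NO_WAY : NodeState.FREE;
        }
    }
}
```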


Fig. 5. Construction of the graph from the coordinates of three points: N(4, 2), N(-3, 2), N(-2, 3)

To save the memory and computing resources of the processor, the graph has a limited size. When the coordinates of the start node N(X0, Y0) change on the graph in automatic mode, all nodes that go beyond the specified radius of visibility are deleted. Objects that overlap with nodes in the "NO_WAY" state are also deleted.
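A pruning method in the spirit of this step, which could be added to the GraphCreator sketched above, is shown below; the Euclidean radius check in grid units is an assumption, since the paper does not state how the radius of visibility is evaluated.

```java
// A pruning method that could be added to the GraphCreator sketched above;
// the Euclidean radius check in grid units is an illustrative assumption.
public void pruneAround(int startGx, int startGy, double radiusInNodes) {
    // Remove every node that has left the radius of visibility of the new start node.
    nodes.values().removeIf(n ->
        Math.hypot(n.gx - startGx, n.gy - startGy) > radiusInNodes);
}
```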
During generation, a coherent graph (mesh) is formed. An example of the links between the vertices of the graph is shown in Fig. 6. All objects that lie on the line to the node with the resulting coordinates are deleted.

Fig. 6. Links between the nodes of the graph

Using graph theory in the software model makes it possible to analyze the obstacles that can appear on the path of the robot and to adjust the motion in real time.

IV. RESULTS OF THE SYSTEM WORK

The results of the system work are shown in Fig. 7 and Fig. 8. In particular, Fig. 7 shows the movement of the robotic system towards the indicated destination point, and Fig. 8 shows a screenshot of the graphics window of the program. The tests of the developed system allow us to state that the system works correctly.

Fig. 7. The next destination point is indicated on the picture


Fig. 8. Screenshot of the graphics window of the program. The function of displaying all vertices of the graph is enabled. The nodes that are behind obstacles or are not within the scan radius are deleted.

V. CONCLUSIONS

For orientation in space, the SLAM methodology, using a graph to track the current position in space, was applied in the development of the system. The software of the system, based on Java, the standard Java Swing library and the IntelliJ IDEA and Arduino IDE development environments, has been developed. The software includes a module that builds the graph based on the data obtained from the mobile robot via the communication channel. The program can work in three modes: direct control, text command control, and an automatic mode in which the software installed on the computer independently looks for a path on the graph and directs the robot to the user-specified point.

A graphical interface has been developed that visualizes a two-dimensional map built on the basis of the LIDAR data and provides the ability to monitor the operations performed by the robot in real time.

The technical support based on the Arduino DUE microcontroller has been developed. The mobile platform is equipped with sensors to ensure accurate positioning on the two-dimensional map and to track its movement in space.

The scientific novelty of the proposed development is the use of graph theory in the developed software model for laying and selecting the path of the mobile robot's movement. The use of graphs made it possible to analyze the obstacles that may appear on the path of the robot and to adjust the motion in real time.

REFERENCES

[1] C. Hernandez Corbato, M. Bharatheesha, J. van Egmond, J. Ju and M. Wisse, "Integrating Different Levels of Automation: Lessons from Winning the Amazon Robotics Challenge 2016," in IEEE Transactions on Industrial Informatics, vol. PP, no. 99, pp. 1-11.
[2] V. Teslyuk, V. Beregovskyi, P. Denysyuk, T. Teslyuk and A. Lozynskyi, "Development and Implementation of the Technical Accident Prevention Subsystem for the Smart Home System," International Journal of Intelligent Systems and Applications (IJISA), vol. 10, no. 1, pp. 1-8, 2018. DOI: 10.5815/ijisa.2018.01.01
[3] R. Silva Ortigoza et al., "Wheeled Mobile Robots: A review," in IEEE Latin America Transactions, vol. 10, no. 6, pp. 2209-2217, Dec. 2012.
[4] I. Tsmots, V. Teslyuk and I. Vavruk, "Hardware and software tools for motion control of mobile robotic system," 2013 12th International Conference on the Experience of Designing and Application of CAD Systems in Microelectronics (CADSM), Polyana Svalyava, 2013, p. 368.
[5] http://wiki.ros.org/gmapping
[6] http://www.robogeek.ru/bytovye-roboty
[7] S. Hwang, N. Kim, Y. Choi, S. Lee and I. S. Kweon, "Fast multiple objects detection and tracking fusing color camera and 3D LIDAR for intelligent vehicles," 2016 13th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), Xi'an, 2016, pp. 234-239.
[8] P. Denysyuk and T. Teslyuk, "Main algorithm of mobile robot system based on the microcontroller Arduino," 2013 XVIIIth International Seminar/Workshop on Direct and Inverse Problems of Electromagnetic and Acoustic Wave Theory (DIPED), Lviv, 2013, pp. 209-212.
[9] A. M. Zou, Z. G. Hou, S. Y. Fu and M. Tan, "Neural Networks for Mobile Robot Navigation: A Survey," in J. Wang, Z. Yi, J. M. Zurada, B. L. Lu and H. Yin (eds.), Advances in Neural Networks - ISNN 2006, Lecture Notes in Computer Science, vol. 3972, Springer, Berlin, Heidelberg, 2006.
[10] J. J. Leonard and H. F. Durrant-Whyte, "Simultaneous map building and localization for an autonomous mobile robot," Proceedings IROS '91: IEEE/RSJ International Workshop on Intelligent Robots and Systems, Osaka, 1991, pp. 1442-1447, vol. 3.
[11] G. Grisetti, R. Kummerle, C. Stachniss and W. Burgard, "A Tutorial on Graph-Based SLAM," in IEEE Intelligent Transportation Systems Magazine, vol. 2, no. 4, pp. 31-43, Winter 2010.
[12] https://www.arduino.cc/en/Main/arduinoBoardDue
[13] https://en.wikipedia.org/wiki/Lidar
[14] https://www.jetbrains.com/idea/
[15] https://en.wikipedia.org/wiki/Model-view-controller
[16] W. Pree and H. Sikora, "Design patterns - essentials, experience, Java case study," Proceedings of Joint 4th International Computer Science Conference and 4th Asia Pacific Software Engineering Conference, 1997, pp. 534-535.
