
GENERALIZED SOFTWARE APPLICATION FOR OPERATION OF A 3D VEHICLE IN AIR, WATER AND LAND

Diogo Peres(1)(2), António Raimundo(1)(2), Nuno Santos(1)(2), Pedro Sebastião(1)(2), Nuno Souto(1)(2), Alexandre Almeida(1)(2)

(1) ISCTE-Instituto Universitário de Lisboa, DCTI, Avenida das Forças Armadas, 1649-026 Lisboa
(2) Instituto de Telecomunicações, Av. Rovisco Pais 1, 1049-001 Lisboa

email: drbps@iscte.pt; aslro@iscte.pt; nmass@iscte.pt; pedro.sebastiao@iscte.pt; nuno.souto@iscte.pt; alexandre.almeida@iscte.pt

Abstract — Unmanned vehicles (UVs) and their applications are growing exponentially. Radio control is the most common way to operate these vehicles, since it is a simple and cheap method, but it offers no visual interface through which the user can see vehicle information such as battery status, speed, distance or geolocation. To address this problem, several mobile and desktop applications have been developed. To connect the device to the vehicle, dongles are commonly used, communicating over radio, Bluetooth or Wi-Fi. In most cases these technologies do not allow control at long distances, beyond line of sight, the applications are focused mostly on multi-copters, and most of the time they allow only one vehicle to be connected at a time.
The purpose of this work is to study the reliability of a software application able to control various types of vehicles: aerial, land and water. The software allows the user to connect multiple vehicles to the same device at the same time and to easily change which vehicle is being controlled, using mobile networks for the communication between the software and the vehicle. In this way it becomes possible to connect a 3D vehicle, a hybrid vehicle capable of moving in water, land and air environments, and to control it at long distances with video feedback.

Keywords: 3D vehicle, Android, unmanned vehicles, multi-vehicle, mobile networks.

1. INTRODUCTION

In recent years the use of unmanned vehicles has grown quickly. The technology was developed primarily for the military area: the ability to use these vehicles in dangerous missions without risking human lives was the main factor that led to their creation. The success of the technology aroused the curiosity of the public, who quickly realized its potential. Nowadays unmanned vehicles can perform all kinds of tasks. In the military field they can help with search and rescue operations, surveillance, explosives disarming, pursuit operations, delivery of supplies to remote and/or inaccessible regions, border patrol missions, crowd monitoring, and so on [1]. This usefulness has led to fast growth of the UV market and, with it, of the number of applications capable of controlling and monitoring UVs, usually called ground control stations (GCS). Since one of the fastest growing markets is the UAV market, more precisely that of multi-copters, most applications are focused on multi-copters. However, this fast growth created the need for applications capable of controlling new types of UVs. This paper focuses on developing an application capable of controlling and monitoring not only various types of UVs but also a hybrid 3D vehicle, a vehicle able to move in three environments: water, land and air.

2. 3D VEHICLE

To simulate a vehicle capable of moving in three environments, a program called SITL was used, which can simulate various types of UVs. It was used to simulate an unmanned aerial vehicle (UAV), an unmanned ground vehicle (UGV) and an unmanned surface vehicle (USV).
The three vehicles were first tested separately, to prepare and test the application with a single vehicle connected at a time, and lastly the three vehicles were connected simultaneously to simulate a 3D vehicle.
A real 3D hybrid vehicle was not constructed, since it would take a long time to build and the main focus of this paper is the application that controls such a vehicle. In addition, the available controller boards are not prepared for 3D vehicles, so it was decided to build three vehicles; if the user wants to make a 3D vehicle it is only necessary to put the three sets of electronics together in a single frame.
The application was built in a way that allows the user to change between vehicles easily; in a 3D vehicle this action corresponds to switching between the different driving modes. Two real vehicles were also constructed, a hexacopter and a scaled model of a car, to test the application in a real scenario.
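To make the vehicle-switching idea concrete, the sketch below shows one possible way the application could map the driving modes of a 3D vehicle onto the three connected vehicles. The type and method names are ours and are not taken from the application's source code; the listing only illustrates the concept described above.

    // Illustrative sketch only: a 3D vehicle is handled as three connected vehicles,
    // and changing the driving mode simply changes which of them receives the commands.
    // All names here are assumptions, not the application's actual classes.
    import java.util.EnumMap;
    import java.util.Map;

    public class HybridVehicle {
        public enum DrivingMode { AIR, LAND, WATER }

        private final Map<DrivingMode, String> vehicleIds = new EnumMap<>(DrivingMode.class);
        private DrivingMode active = DrivingMode.LAND;

        public HybridVehicle(String copterId, String roverId, String boatId) {
            vehicleIds.put(DrivingMode.AIR, copterId);    // UAV electronics
            vehicleIds.put(DrivingMode.LAND, roverId);    // UGV electronics
            vehicleIds.put(DrivingMode.WATER, boatId);    // USV electronics
        }

        /** Switching driving mode amounts to selecting which of the three vehicles is controlled. */
        public void switchMode(DrivingMode mode) {
            active = mode;
        }

        /** Identifier of the vehicle that should currently receive the user's commands. */
        public String activeVehicleId() {
            return vehicleIds.get(active);
        }
    }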

3. HARDWARE USED

Besides the common electronics used in a UV, such as the motors and servos, the electronic speed controllers (ESCs) that regulate the power delivered to the motors, and the battery, additional hardware was used to connect the vehicle to the network and communicate over wireless networks with the mobile application. Figure 1 shows the technical architecture of the proposed solution. The Raspberry Pi and the controller board were placed in the vehicle and communicate with the server; the server in turn communicates with the mobile application, exchanging the messages coming from and going to the vehicle.

Figure 1 – Technical architecture: the controller board connects to MAVProxy and the Java program on the Raspberry Pi, which communicate with the server; the server relays the messages to and from the mobile application.

1) Controller Board

Controller boards for UVs are generally referred to as flight controllers because they are mostly used in aerial vehicles, where they are responsible for managing the flight of multi-rotor aircraft and airplanes. Their purpose is to stabilize the aircraft during flight: the board takes the signals from the on-board gyroscopes and passes them to the processor, which processes them according to the firmware selected by the user (corresponding to the type of vehicle) and passes the control signals to the installed ESCs, making fine adjustments to the motors' rotational speed, which in turn stabilizes the aircraft. Controller boards also use signals from radio or from other sources such as Bluetooth or Wi-Fi modules, and pass these signals (e.g. aileron, elevator, throttle and rudder), together with the stabilization signals, to the processor, which, once they are processed, sends them to the ESCs to turn and control the vehicle's orientation. There are several controller boards available on the market, but a board with advanced functionalities such as GPS and autonomous guidance was necessary. One of the best-known flight controllers is the Ardupilot Mega (APM); its evolution, the Pixhawk, was chosen as the main controller because it has better performance, uses open-source software and is suitable for most types of radio-controlled vehicles (fixed wings, copters, cars and boats), depending on the firmware loaded.
To communicate with external systems such as ground control stations, this controller board uses a messaging protocol called MAVLink.

2) Raspberry Pi

To connect the vehicle to the network, a way to forward the messages coming from the controller board over Wi-Fi or 3G/4G is needed; a Raspberry Pi was used for this purpose. The Raspberry Pi was chosen for being a tiny and affordable computer, capable of running Linux, with a large developer community, and able to connect to real-world devices through its GPIO pins, USB ports, HDMI port and audio jack.
A Raspberry Pi camera module was also used, together with Video4Linux, a collection of drivers and APIs that supports real-time video capture on Linux systems. The camera provides the video feedback that lets the user see where the vehicle is going when it is out of line of sight. The Raspberry Pi was placed in the vehicle, connected to the controller board, and equipped with a 3G/4G dongle to be able to connect to the Internet.

4. SETTING UP

Several software programs were used to prepare the hardware for the different types of vehicles and to make the different units communicate with each other.

1) Mission Planner

To make the controller board functional it is necessary to load a firmware, and several GCSs available on the market are capable of doing it. For this paper, Mission Planner was chosen for being a simple, free and widely used GCS available for Windows. After installing it, the controller board is connected to the computer with a micro USB cable and the firmware corresponding to the intended type of vehicle is installed (ArduRover for cars and boats, ArduPlane for fixed wings, ArduCopter for helicopters, ArduCopter Quad for quadcopters, ArduCopter Hexa for hexacopters, etc.) [2].

2) MAVLink

Micro Air Vehicle Link (MAVLink) is a communication protocol used mostly for communications between a GCS and UVs. It can be used to transmit the vehicle's altitude, speed, battery status, GPS location and other information. MAVLink can pack C structs over serial channels with high efficiency and send these packets to the GCS. It has been extensively tested on various controller boards such as the PX4, Pixhawk and APM. In other words, a MAVLink message is just a stream of bytes encoded by the GCS to be sent to the controller board, or vice versa. The stream of bytes can be sent over USB serial or telemetry (both cannot be used at the same time; if both are plugged in, USB is given preference and telemetry is ignored).

Table 1 – MAVLink message packet structure

    Byte index | 0   | 1   | 2   | 3   | 4    | 5   | 6 to (n+6) | n+7 | n+8
    Content    | STX | LEN | SEQ | SYS | COMP | MSG | PAYLOAD    | CKA | CKB

A MAVLink packet has the structure described in Table 1. The first six bytes are the message header: the first byte indicates the start of a new MAVLink packet; the second byte holds the message length; the third byte is the sequence number; the fourth corresponds to the system ID and indicates which system is sending the message, allowing each vehicle or GCS in the same network to be distinguished; the fifth byte holds the component ID and identifies different components within the same system, such as two GCSs running on the same machine; and the last byte of the header is the message ID, which allows the payload to be decoded correctly. The payload, as shown in Table 1, is the actual data, and the last two bytes are the checksum, which the software uses to check whether the message is valid and not corrupted. A MAVLink message has a minimum length of 8 bytes, typically for acknowledgement (ACK) packets without payload, and a maximum length of 263 bytes for a full payload [3].
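As an illustration of the byte layout in Table 1, the following minimal Java sketch decodes the header fields of a MAVLink 1.0 frame from a raw byte array. The class and field names are ours (not part of the MAVLink Java library used later), and checksum verification is omitted for brevity.

    // Minimal decoder for the MAVLink 1.0 frame layout of Table 1.
    // Illustrative only: names are ours and the checksum is not verified.
    public final class MavlinkFrame {
        public static final int STX = 0xFE;   // start-of-frame marker for MAVLink 1.0

        public final int length;       // LEN  - payload length n
        public final int sequence;     // SEQ  - packet sequence number
        public final int systemId;     // SYS  - sending system (vehicle or GCS)
        public final int componentId;  // COMP - component inside that system
        public final int messageId;    // MSG  - tells how to decode the payload
        public final byte[] payload;   // n bytes of actual data

        private MavlinkFrame(int length, int sequence, int systemId,
                             int componentId, int messageId, byte[] payload) {
            this.length = length;
            this.sequence = sequence;
            this.systemId = systemId;
            this.componentId = componentId;
            this.messageId = messageId;
            this.payload = payload;
        }

        /** Decodes one frame laid out as in Table 1; returns null if the buffer is not a valid frame. */
        public static MavlinkFrame decode(byte[] buffer) {
            if (buffer.length < 8 || (buffer[0] & 0xFF) != STX) {
                return null;                       // 8 bytes is the minimum frame (empty payload)
            }
            int len = buffer[1] & 0xFF;            // byte 1: payload length
            if (buffer.length < len + 8) {
                return null;                       // truncated frame
            }
            byte[] payload = new byte[len];
            System.arraycopy(buffer, 6, payload, 0, len);  // bytes 6..(n+6): payload
            return new MavlinkFrame(len,
                    buffer[2] & 0xFF,              // byte 2: sequence number
                    buffer[3] & 0xFF,              // byte 3: system ID
                    buffer[4] & 0xFF,              // byte 4: component ID
                    buffer[5] & 0xFF,              // byte 5: message ID
                    payload);                      // bytes n+7, n+8 hold the checksum (CKA, CKB)
        }
    }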

3) MAVProxy

MAVProxy is a command-line GCS with the ability to forward the messages from the UV's controller board over the network, via UDP, to other GCSs or devices. It is written entirely in Python and is open-source software. It runs on the Raspberry Pi, reading the messages coming from the controller board and forwarding them through the network to other devices.

4) SITL

To help with the tasks of this work, a program was used to simulate the controller board in the vehicle (able to emulate the APM or the Pixhawk). The simulator is called SITL (Software In The Loop) and is based on ArduPilot, allowing Plane, Copter or Rover to run without any hardware. SITL is especially useful because malfunctions in the program during the development phase can cause the vehicle to crash; with SITL, the program can first be tested in the simulator and then, when stable, moved to the real vehicle. The simulator can run on either Windows or Linux. Since SITL only emulates the controller board, a way to communicate with it is still needed, so MAVProxy is installed alongside SITL.

5) Server

One of the main challenges of this work was dealing with network address translation (NAT). The public IP of a device changes according to the network it is in, and because of this the application does not know the IP of its vehicles, and vice versa. To overcome this problem, the communications were routed through a server.
Each time a vehicle or an application is turned on, it connects to the server, which authenticates the device and registers its IP; the server then redirects the messages from the application to its vehicles and vice versa. The server stores the user information, such as the username, the password and the UVs belonging to each user.

6) Vehicle Java Program

A Java application was also developed and exported to a jar file. This jar runs on the Raspberry Pi placed in the vehicle and is started each time the vehicle is turned on. The program is responsible for connecting the vehicle to the server. It stores, in a .txt file, an individual ID and token that are different for each vehicle and are used by the server to authenticate the vehicle, increasing security. The program redirects to the server all the packets coming from MAVProxy, and vice versa (Figure 1).
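A minimal sketch of such a vehicle-side bridge is shown below, assuming MAVProxy forwards MAVLink over local UDP and the server accepts a simple "id:token" line on connection. The host name, port numbers and authentication line format are assumptions; the paper does not specify the actual wire format.

    // Hedged sketch of the vehicle-side Java program described above.
    // Host, ports and the authentication line format are assumptions.
    import java.io.*;
    import java.net.*;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.*;
    import java.util.List;

    public class VehicleBridge {
        public static void main(String[] args) throws IOException {
            // The vehicle ID and token are stored locally, one per line (assumed layout).
            List<String> credentials = Files.readAllLines(Paths.get("vehicle_id.txt"));
            String id = credentials.get(0);
            String token = credentials.get(1);

            // Register with the server so it can map this vehicle's current IP to its ID.
            try (Socket server = new Socket("server.example.org", 5760);      // assumed server address
                 DatagramSocket mavproxy = new DatagramSocket(14550)) {        // assumed MAVProxy --out port

                OutputStream toServer = server.getOutputStream();
                toServer.write((id + ":" + token + "\n").getBytes(StandardCharsets.UTF_8));
                toServer.flush();

                // Upstream: forward every MAVLink datagram coming from MAVProxy to the server.
                Thread upstream = new Thread(() -> {
                    byte[] buf = new byte[263];                // maximum MAVLink 1.0 frame size
                    DatagramPacket packet = new DatagramPacket(buf, buf.length);
                    try {
                        while (true) {
                            mavproxy.receive(packet);
                            toServer.write(packet.getData(), 0, packet.getLength());
                            toServer.flush();
                        }
                    } catch (IOException ignored) { }          // socket closed: stop relaying
                });
                upstream.start();

                // Downstream: hand every byte block coming from the server back to MAVProxy.
                InputStream fromServer = server.getInputStream();
                InetSocketAddress local = new InetSocketAddress("127.0.0.1", 14551);  // assumed input port
                byte[] down = new byte[263];
                int read;
                while ((read = fromServer.read(down)) != -1) {
                    mavproxy.send(new DatagramPacket(down, read, local));
                }
            }
        }
    }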

5. MOBILE APPLICATION

The mobile application was developed in Android Studio. Android was chosen because it has many users on a global scale, it is free and it is a native technology, giving better performance on Android devices. The minimum version targeted was Android 4.1.x (Jelly Bean), so that the application runs on 97.5% of Android phones (at the time the paper was written).
To decode the MAVLink messages, a library called "MAVLink Java" was used, which decodes the bytes with the help of XML files and generates Java code.
The mobile application, called 'Mobile Control Station', starts with a login screen that receives a username and a password to authenticate with the server. If the server responds with a success message, the user is redirected to a main screen with the vehicle list, displaying the vehicles ready to be controlled, so that the user can select one.
The main screen has two tabs at the top. In the first one it is possible to manually control the vehicle using two joysticks; in the other tab, called the mission tab, it is possible to monitor and control the vehicle by sending waypoints.
The main screen also has a menu with five options, which will be explained later.

A. Controller Screen

The controller tab opens a controller screen, which is used to check the vehicle's information and manually control it.

Figure 2 – Controller Screen

There are several features in the controller screen, such as the Arm/Disarm button. When the vehicle is disarmed it will not respond to the commands sent by the user; the vehicle has to be armed first in order to be controlled.
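For illustration, the sketch below shows how the arm/disarm action and the decoding of incoming telemetry could look with the generated "MAVLink Java" classes mentioned above. The package, class, field and constant names follow commonly generated bindings and should be read as assumptions, since the paper does not list the exact API it used.

    // Sketch only: class and constant names below follow commonly generated
    // "MAVLink Java" bindings and may differ between generator versions.
    import com.MAVLink.MAVLinkPacket;
    import com.MAVLink.Parser;
    import com.MAVLink.common.msg_command_long;
    import com.MAVLink.enums.MAV_CMD;

    public class ArmCommand {

        /** Builds the raw bytes of a COMMAND_LONG that arms (true) or disarms (false) the vehicle. */
        public static byte[] buildArmPacket(int systemId, int componentId, boolean arm) {
            msg_command_long cmd = new msg_command_long();
            cmd.target_system = (short) systemId;        // which vehicle should execute the command
            cmd.target_component = (short) componentId;
            cmd.command = MAV_CMD.MAV_CMD_COMPONENT_ARM_DISARM;
            cmd.param1 = arm ? 1 : 0;                    // 1 = arm, 0 = disarm
            MAVLinkPacket packet = cmd.pack();
            return packet.encodePacket();                // byte stream sent towards the vehicle
        }

        /** Feeds received bytes to the generated parser and reports complete messages. */
        public static void onBytesReceived(Parser parser, byte[] data, int length) {
            for (int i = 0; i < length; i++) {
                MAVLinkPacket packet = parser.mavlink_parse_char(data[i] & 0xFF);
                if (packet != null) {
                    System.out.println("Received message id " + packet.msgid
                            + " from system " + packet.sysid);
                }
            }
        }
    }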

Mode button. In Figure 2 this corresponds to the button displayed as "Stabilize". Once this button is clicked, a popup is displayed that allows the user to change the guiding mode of the vehicle. Each type of vehicle has its own modes; for example, the stabilize mode, which exists only in flying vehicles, corresponds to the manual mode in the rover firmware. This mode allows the user to control the vehicle with the joysticks in the application.
Auto mode. Once the user sends a set of waypoints to the vehicle and changes to auto mode, the vehicle starts the mission and travels through the waypoints autonomously until it reaches the final one, where it remains still.
Guided mode. If the vehicle is in guided mode, the user can send a single waypoint to the vehicle, which will immediately move to it.
RTL (Return to Launch) mode. As the name suggests, if this mode is enabled the vehicle returns to home, which by default is the point where it was turned on.
Some modes require a GPS module to be installed on the controller board.
The controller screen also has a battery level indicator and a connection status indicator. The latter displays the estimated round trip time (RTT) of the messages, computed as

$\widehat{RTT}_{new} = \alpha \, \widehat{RTT}_{old} + (1 - \alpha) \, RTT_{sample}$    (1)

The value of the variable $\alpha$ (which can range from 0 to 1) was set to 0, so that the new $\widehat{RTT}$ equals the last round trip time measured, $RTT_{sample}$.
In the grey top bar there are three labels: the first indicates the height of the vehicle, in meters; the second, labelled 'VS', indicates the vertical speed, in meters per second; and the last one, labelled 'HS', indicates the horizontal speed, also in meters per second.
The blue button at the bottom centre enables and disables the joysticks used to manually control the vehicle.
The controller screen also shows the video feedback overlaid with a HUD (head-up display, shown in green), whose purpose is to give the user a better perception of the vehicle's direction and position.

B. Missions Screen

Like the controller screen, the missions screen displays the vehicle's information (altitude, speed, battery status, connection status), has an arm/disarm button, a mode button and a HUD. The mission tab allows the user to monitor one or more drones and send them waypoints so they can move autonomously.

Figure 3 – Missions Screen

There are several buttons in the missions screen:
1. The takeoff button, displayed in Figure 3. With this button the user can edit the desired altitude and/or take off; the vehicle then takes off autonomously and holds or circles at the desired altitude.
2. The request waypoints button fetches the last waypoints sent to the controller board, which are the ones the vehicle will travel through once the user changes the vehicle mode to auto.
3. The edit waypoints button enables waypoint editing. When it is clicked, its label changes to "stop editing" and two more buttons appear, one to clear all the waypoints and one to send the waypoints created to the controller board. To create waypoints the user only has to enable the edit waypoints option and click on the map at the desired locations. The user can then click on a waypoint to edit its altitude; the vehicle will move to the altitude introduced in the waypoint, which is 10 meters by default (ground and surface UVs ignore this parameter). Once the user finishes editing the waypoints, the send button must be clicked to send them to the controller board, and the vehicle mode changed to auto to start the autonomous mission.
The vehicle moves through the waypoints in the order they were introduced; the application helps the user by showing the UV's path while the waypoints are being edited (a sketch of the in-app waypoint representation is given below).
The user can also make a long press on a location on the map, and the application automatically changes the vehicle mode to guided and sends the vehicle to the clicked location while maintaining its altitude.
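The following minimal sketch shows one way the waypoint list edited on the map could be represented inside the application before being sent to the controller board. The class and field names are ours; the actual transfer uses the MAVLink mission protocol, which is not shown here.

    // Minimal in-app waypoint list, illustrative only; names are assumptions.
    import java.util.ArrayList;
    import java.util.List;

    public class Mission {
        public static final double DEFAULT_ALTITUDE_M = 10.0;  // default altitude of a new waypoint

        public static class Waypoint {
            public final double latitude;
            public final double longitude;
            public double altitude = DEFAULT_ALTITUDE_M;        // ignored by ground and surface UVs

            public Waypoint(double latitude, double longitude) {
                this.latitude = latitude;
                this.longitude = longitude;
            }
        }

        // Waypoints are kept in the order the user tapped them on the map,
        // which is the order the vehicle follows in auto mode.
        private final List<Waypoint> waypoints = new ArrayList<>();

        public void addFromMapTap(double latitude, double longitude) {
            waypoints.add(new Waypoint(latitude, longitude));
        }

        public void clear() {
            waypoints.clear();
        }

        public List<Waypoint> toSend() {
            return new ArrayList<>(waypoints);
        }
    }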
C. Side Menu

The three dots at the top left of the screen open a side menu with five options. These options are:

1) UV selection

The first option displays a popup list with the vehicles connected to the application, an image corresponding to each vehicle type and a unique identifier.
In this list the user can change which vehicle is selected for control. This is necessary because the application is able to display and monitor several vehicles at a time, but it can only control a single one, the one selected by the user.

2) Parameters

This option requests all the parameters and their values from the controller board. Once all the parameters are received, a screen opens with the full list so the user can see or edit their values. Parameters allow the user to change the vehicle's configuration; an example of a parameter is the vehicle's sensitivity to the control commands.

3) Settings

The settings screen holds the settings for each joystick axis. In this screen it is possible to invert the controls, define the minimum and maximum of each axis, and trim the middle value up or down, as in a real radio controller.

4) Statistics

The statistics option opens a screen displaying four graphs. The first one indicates the battery level over time. The second graph has two lines, a blue one for the vertical speed and a brown one for the horizontal speed, indicating the variation of speed over time. The third graph shows the number of messages received, with a label for the average number of messages per second and another for the total number of messages received since the application was started.

Figure 4 – Statistics screen

The last graph displays the RTT over time and the packet loss. The RTT is calculated with equation (1). To obtain the packet loss it is necessary to calculate the timeout for a packet, given by

$Timeout = \widehat{RTT} + 4 \, Dev$    (2)

and

$Dev_{new} = (1 - \alpha) \, Dev_{old} + \alpha \, | RTT_{sample} - \widehat{RTT} |$    (3)

where $Dev$ represents the deviation, $Dev_{new}$ is its new value and $Dev_{old}$ is the previously estimated deviation.
If a response message is not received within the timeout, the packet is counted as lost. The percentage of received messages, RM, is given by

$RM = \frac{T_{pms} - T_{ml}}{T_{pms}} \times 100$    (4)

where $T_{pms}$ is the total number of ping messages sent and $T_{ml}$ is the total number of messages lost.
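As a worked sketch, the class below combines equations (1) to (4) into one estimator; the class and method names are ours, not the application's. As noted above, the application set $\alpha$ to 0 for equation (1), which makes the estimate equal to the last sample.

    // Illustrative implementation of equations (1)-(4); names are ours.
    public class LinkStatistics {
        private final double alpha;    // smoothing factor, 0..1 (the application used 0 for eq. (1))
        private double estimatedRtt;   // RTT^ in the equations, in milliseconds
        private double deviation;      // Dev in the equations, in milliseconds
        private int pingsSent;         // Tpms - total ping messages sent
        private int pingsLost;         // Tml  - total ping messages lost

        public LinkStatistics(double alpha) {
            this.alpha = alpha;
        }

        /** Equation (1): RTT^_new = alpha * RTT^_old + (1 - alpha) * RTT_sample. */
        public void onRttSample(double sampleMs) {
            // Equation (3): Dev_new = (1 - alpha) * Dev_old + alpha * |RTT_sample - RTT^|,
            // computed with the old estimate before it is updated.
            deviation = (1 - alpha) * deviation + alpha * Math.abs(sampleMs - estimatedRtt);
            estimatedRtt = alpha * estimatedRtt + (1 - alpha) * sampleMs;
            pingsSent++;
        }

        /** Equation (2): a ping not answered within RTT^ + 4 * Dev is counted as lost. */
        public double timeoutMs() {
            return estimatedRtt + 4 * deviation;
        }

        public void onPingTimedOut() {
            pingsSent++;
            pingsLost++;
        }

        /** Equation (4): RM = (Tpms - Tml) / Tpms * 100. */
        public double receivedMessagesPercent() {
            if (pingsSent == 0) {
                return 100.0;
            }
            return (pingsSent - pingsLost) * 100.0 / pingsSent;
        }
    }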
5) Logout

Lastly, the logout option sends a TCP message to disconnect the user.

6. SIMULATION AND EXPERIMENTAL RESULTS

To test the application, several tests were carried out using SITL and real unmanned vehicles. The application was first tested with one vehicle at a time, then with two, and finally with three vehicles connected to the same user at the same time, simulating a 3D vehicle.
The 3D vehicle appears in the vehicle list as three separate vehicles, and the user only needs to select the type of vehicle to control (Figure 5).

Figure 5 – Vehicle list

Figure 6 – Multi-vehicle monitoring

Three types of firmware were tested, Rover, Copter and Plane, both for controlling and for monitoring. The Plane firmware was only tested in the simulator, since no fixed-wing was constructed. The car, the boat and the copter were tested both in simulation and with real vehicles.
The modes tested were:
• Guided (Copter, Plane and Rover);
• Auto (Copter, Plane and Rover);
• Stabilize (Copter and Plane);
• Position Hold (Copter);
• Manual (Rover);
• RTL (Copter and Plane).
The application was tested continuously to ensure that bugs were found and fixed and to maximize the application's performance.
The statistics screen made it possible to gather information. One of the values collected was the average RTT of the control messages, which was measured several times in different scenarios to construct Table 2. It was measured with and without video feedback; the RTT increases with video because the video streaming requires more processing power, has a higher bit rate than the control messages (MAVLink) and requires more bandwidth.

Table 2 – Control messages RTT

    Video feedback | 1 vehicle [ms] | 2 vehicles [ms] | 3 vehicles [ms]
    No             | 127            | 134            | 142
    Yes            | 172            | 188            | 210

The approach used to provide video feedback was video streaming via WebSockets. First, a multimedia framework called FFmpeg has to be installed on the Raspberry Pi to handle the video coming from the camera; this framework encodes the video to MPEG and sends it to the server via HTTP. The server needs the JavaScript runtime Node.js installed to run a script that distributes the MPEG stream via WebSockets to the connected clients, which decode it to reproduce the video [4].
To test the connection, HTML and JavaScript files were downloaded from the jsmpeg project on GitHub. These files were loaded into a WebView in the application; they connect the application to the server via a WebSocket and display the image in the WebView.
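As a hedged sketch of this last step, the snippet below loads the jsmpeg player page into an Android WebView. The asset path and the convention of passing the WebSocket address in the query string are assumptions, since the paper does not list them; only the WebView calls themselves are standard Android API.

    // Loading the jsmpeg HTML/JS player into the application's WebView.
    // Asset path and query-string convention are assumptions.
    import android.webkit.WebView;

    public final class VideoFeed {

        /** Loads the jsmpeg player page and points it at the server's WebSocket stream. */
        public static void attach(WebView webView, String serverHost, int wsPort) {
            webView.getSettings().setJavaScriptEnabled(true);   // jsmpeg needs JavaScript
            // The files downloaded from the jsmpeg project are bundled as app assets (assumed path);
            // the page is assumed to read the WebSocket address from the query string.
            webView.loadUrl("file:///android_asset/jsmpeg/index.html"
                    + "?ws=ws://" + serverHost + ":" + wsPort);
        }
    }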

7. CONCLUSIONS AND FUTURE WORK

As a main conclusion, an application was successfully developed for the control and monitoring of one or more vehicles, as well as of a 3D vehicle, using mobile networks to communicate and to receive video feedback, allowing control at long distances. The application provides the user with an affordable and versatile way to control various types of vehicles in real time.
For future work, a firmware for the controller board able to deal with a 3D vehicle could be developed, so that it would no longer be necessary to keep three separate units together. Another suggestion is to reduce the delay in the video streaming.

ACKNOWLEDGEMENTS

The authors would like to thank Instituto de Telecomunicações (Portuguese R&D centre) and ISCTE-University Institute of Lisbon for providing the funds for the acquisition of all the material needed to carry out this project. Without these funds the work would not have been possible.

REFERENCES

[1] C. E. Nehme, "Modeling human supervisory control in heterogeneous unmanned vehicle systems," Massachusetts, 2009.
[2] "Mission Planner Overview," [Online]. Available: http://ardupilot.org/planner/docs/mission-planner-overview.html. [Accessed 12 August 2016].
[3] "MAVLink Micro Air Vehicle Communication Protocol," [Online]. Available: http://qgroundcontrol.org/mavlink/start. [Accessed 29 August 2016].
[4] D. Szablewski, "HTML5 live video streaming via websockets," 11 September 2013. [Online]. Available: http://phoboslab.org/log/2013/09/html5-live-video-streaming-via-websockets. [Accessed 6 December 2016].
