
Proceedings of the International Conference on Intelligent Computing and Control Systems (ICICCS 2020)

IEEE Xplore Part Number: CFP20K74-ART; ISBN: 978-1-7281-4876-2

Robot Operating System based Charging Pad Detection for Multirotors

Abhijith V.S.
Department of Mechanical Engineering
Defence Institute of Advanced Technology
Pune, India
mr.abhijithvs@gmail.com

A.A. Bazil Raj
Department of Electronics Engineering
Defence Institute of Advanced Technology
Pune, India
brazilraj.a@diat.ac.in

Abstract—The objective of this paper is to develop a wireless charging pad detection mechanism for autonomous quadrotors using AprilTag, and to analyze its performance. The system uses a Raspberry Pi 3B+ single board computer for data processing, a Microsoft webcam for image acquisition, the Robot Operating System (ROS) framework, and a PIXHAWK series flight controller. The charging pad has an AprilTag marker on its top; the camera captures the image and gives it to a ROS node to obtain the relative position between the camera and the charging pad. The system can detect the charging pad at moderate altitudes with a limited number of false detections and give the relative coordinates. An increase in tag size causes an increase in the detection range. The system is compatible with any Single Board Computer (SBC) and camera. This method is mainly for detecting a wireless charging pad so that a quadrotor can charge itself without human involvement.

Keywords— Quadrotor; ROS; AprilTag; Wireless Charging; Raspberry Pi

I. INTRODUCTION

Nowadays, quadrotors and other multirotors are becoming more and more popular. They are widely used in military surveillance, delivery services, warehouse automation, photography, etc. [1]. The major problem faced by these multirotors is limited flight time. The leading causes are that the motors and other electronic components consume a large amount of power (more than in ground robots, since extra energy is required to overcome gravity), and that bulky batteries must be carried. A bulkier battery increases the capacity and in turn the flight time, but as the capacity of the battery increases, the weight of the multirotor also increases. There are many solutions to tackle this problem; one of the leading solutions is a wireless charging pad. Whenever the battery of the multirotor is low, it can come to the charging pad, charge itself, and continue its work. Here, we propose a method to implement the detection of the charging pad by a quadrotor during its flight. The system is implemented entirely on ROS. ROS is a meta-operating system that works along with the primary operating system (OS) (Linux or Windows) on which it is installed, and it acts as an intermediary between the main OS and the hardware. In ROS, each node represents a program that has a specific function (for example, fetching sensor data or sending a command to rotate a motor). ROS has an inter-process communication facility: a topic is a link through which two nodes communicate, and the information passed between nodes is called a message. The data type of a message is the data type of the information it contains. Messages can be of complex data types; that is, a message can include different kinds of data types [2].
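To make this node/topic/message model concrete, the following is a minimal sketch of ours of a rospy publisher node; the node name "talker" and the topic "/chatter" are illustrative choices and not part of the system described in this paper.

```python
#!/usr/bin/env python
# Minimal ROS node: publishes std_msgs/String messages on the /chatter topic.
import rospy
from std_msgs.msg import String

def talker():
    rospy.init_node('talker')                    # register this program as a node
    pub = rospy.Publisher('/chatter', String, queue_size=10)
    rate = rospy.Rate(1)                         # publish once per second
    while not rospy.is_shutdown():
        pub.publish(String(data='battery low'))  # the message carried by the topic
        rate.sleep()

if __name__ == '__main__':
    talker()
```

A second node would subscribe to the same topic with rospy.Subscriber('/chatter', String, callback), which is how two ROS programs exchange data without sharing memory.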
AprilTag is a marker widely used in robotic applications. It uses a barcode-like tag system to obtain 6-DOF localization features from a single image, and it is robust to occlusion, lens distortion, and warping. The AprilTag system consists of different tag families [3]. The AprilTag 2 system uses a new algorithm that reduces false detections, increases detection rates, and reduces computation time [4]. Dynamic markers use an LCD screen to display the AprilTag; the size of the marker is reduced as the Unmanned Aerial Vehicle (UAV) approaches it, which is useful for UAVs flying at high altitudes [5]. For indoor autonomous multirotors, a Motion Capture system (MoCap), together with AprilTag, can be used for localization [6]. The center of the marker is the origin of the world frame, and it is in the East North Up (ENU) coordinate format. The physical center of the camera is considered the origin of the body frame, and it follows the North East Down (NED) coordinate system. The transformation between the two frames gives the relative pose between the marker and the camera [7,8]. There are mainly two representations available for this transformation: one is Euler angles, the other is quaternions. A quaternion is a four-dimensional complex number used to represent a transformation. Quaternions reduce the computational complexity and do not have the problem of gimbal lock [9].
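To make the quaternion representation concrete, the following NumPy sketch of ours expands a unit quaternion (x, y, z, w) into the equivalent 3x3 rotation matrix using the standard quaternion-to-matrix formula; it is generic and not specific to this paper's implementation.

```python
import numpy as np

def quat_to_rotmat(x, y, z, w):
    """Convert a unit quaternion (x, y, z, w) to a 3x3 rotation matrix."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

# Identity rotation: the quaternion (0, 0, 0, 1) gives the identity matrix.
print(quat_to_rotmat(0.0, 0.0, 0.0, 1.0))
```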
This relative pose can be sent directly to any PIXHAWK family flight controller using the ROS package called MAVROS [10]. In this article, we describe only the implementation of the AprilTag detection system using ROS.

The paper is divided into the following sections. Section 2 describes the experimental setup used. Section 3 presents the methodology adopted for implementing the system. Section 4 illustrates the results obtained and their interpretations. Finally, Section 5 concludes the paper.

II. EXPERIMENTAL SETUP AND DESCRIPTIONS

The complete project consists of three parts. The first is a downward-facing camera mounted on a quadcopter, the second is an AprilTag marker placed on the ground, and the third is a slave laptop for visualizing the output data [11,12]. Fig. 1 shows a pictorial representation of the entire system.


In this paper, we did not consider the quadcopter hardware; it was used just for mounting the camera and the SBC. The SBC ran the main program and was connected to a slave computer (a laptop) for visualizing the output data. The SBC used was a Raspberry Pi 3B+. To install ROS on the Raspberry Pi, a Linux OS is required. We installed Ubuntu 18.04 as the base OS, and on top of that, the ROS version Melodic Morenia was installed [13-15]. The important point is that the SBC and the slave computer (laptop) should have the same version of ROS. For WiFi communication, we used a D-Link router as a hotspot, which gave faster data transfer between the laptop and the SBC. A dedicated router is required for lag-free visualization of the camera output data. In ROS, we used three prebuilt nodes, modified for our requirements. Table 1 shows a list of the components used and their specifications.

Fig. 1. Pictorial representation of the system.

We took printouts of different tag IDs of the same AprilTag family (36h11) on A4 size paper. We made tags of dimensions 10.8x10.8 cm, 12.6x12.6 cm, 15.6x15.6 cm, and 20.1x20.1 cm; the tag dimension is the length of the outer black border.

Table 1. List of components used and their specifications

Component                      Specification
Raspberry Pi 3B+               Broadcom BCM2837 ARM Cortex-A53, 1.2 GHz quad-core processor; 1 GB LPDDR2 (900 MHz) RAM; Broadcom VideoCore IV GPU
Microsoft LifeCam Cinema       5 MP, 720p HD camera; 73° field of view; 3 µm pixel size
D-Link DSL-2730U WiFi Router   2.4 GHz, single band; 150 Mbps speed

The camera calibration was the first step of this project. We fixed the camera on a table. A checkerboard of 8 squares by 6 squares was used for the calibration, and the size of each square was 2.39 cm. Before starting the camera calibration node, we ran the cv_camera node to get outputs from the USB camera connected to the SBC. For the camera calibration procedure, we used the camera calibration ROS node [16-18]. OpenCV libraries were used for creating the camera node. When we ran the camera calibration node, a Graphical User Interface (GUI) was displayed. We held the checkerboard in front of the camera and then moved it within the camera's field of view until the parameter bars on the right side of the GUI turned green. We moved the checkerboard left/right, up/down, and nearer/farther, and tilted it, to obtain the X, Y, size, and skew calibration, respectively. When all four bars turned green, the calibration was complete [19-21]. We then pressed the calibrate button, and the program created the .yaml file containing the calibration parameters. After this process had completed, we pressed the save button; the .yaml file was stored in the "~/.ros/camera_info/" directory of the Linux system.

There are two reference frames in this system: one is the camera reference frame and the other is the AprilTag reference frame. The mathematical relation involved in estimating the relative position between the camera and the charging pad is given by (1):

$P_C = R \, P_A + t$    (1)

where $P_C$ is the coordinate of a point P with respect to the camera coordinate frame, $P_A$ is the coordinate of the same point with respect to the AprilTag coordinate frame, and $R$ and $t$ are the rotation matrix and the translation vector of the AprilTag frame relative to the camera frame, respectively.
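As a worked sketch of (1), the snippet below applies an assumed rotation matrix R and translation vector t to a point expressed in the AprilTag frame; all numbers are made up for illustration.

```python
import numpy as np

# Assumed pose of the AprilTag frame relative to the camera frame:
# a 90-degree yaw rotation and a 1.5 m offset along the camera Z-axis.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([0.0, 0.0, 1.5])

p_A = np.array([0.1, 0.0, 0.0])  # point P in the AprilTag frame (meters)
p_C = R @ p_A + t                # equation (1): the same point in the camera frame
print(p_C)                       # -> [0.   0.1  1.5]
```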
An experiment was conducted to get the relative pose between the camera and an AprilTag placed on the floor. For this experiment, we fixed the camera under the quadrotor and moved the quadrotor over the AprilTag to obtain different relative poses [22-24]. The quadrotor was translated in the X, Y, and Z directions and rotated about the roll, pitch, and yaw axes.

Another experiment was conducted to measure the maximum distance up to which the camera could detect the tag. We also measured the minimum orientation at which the camera could detect the tag at different distances between the camera and the AprilTag. For this experiment, we fixed the camera on a table and marked distances at intervals of 1 meter on the floor. At each marking, the AprilTag was placed and the detection was observed [25-28]. We measured the orientation angle by tilting the tag; the orientation angle is the acute angle between the plane of the tag and the camera's Z-axis (the axis normal to the camera's lens). The minimum orientation angle was noted, and the experiment was repeated for the various tag dimensions.

III. METHODOLOGY

Camera calibration is the process of finding the camera parameters. For the calibration, we used the checkerboard method. The critical calibration parameters are: i) Camera matrix: a 3x3 matrix that contains the focal length, f, and the optical center (o_x, o_y), the point where the optic axis intersects the image plane. ii) Distortion coefficients: a 1x5 matrix containing the parameters corresponding to radial and tangential distortion. iii) Rectification matrix: a matrix that defines whether the camera system used is monocular or stereo; here we used a monocular camera. iv) Projection matrix: a matrix that represents the map from 3D to 2D; it is the product of the camera matrix and the transformation between camera and world coordinates [11,12,29].
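To make the role of the camera matrix concrete, the following sketch of ours builds a 3x3 pinhole camera matrix from assumed values and projects a 3D point in the camera frame to pixel coordinates; the numbers are illustrative, not our calibration output.

```python
import numpy as np

fx = fy = 593.0        # assumed focal length in pixels
ox, oy = 320.0, 240.0  # assumed optical center in pixels

# Pinhole camera matrix (the "camera matrix" of item i above).
K = np.array([[fx, 0.0, ox],
              [0.0, fy, oy],
              [0.0, 0.0, 1.0]])

p_C = np.array([0.1, 0.05, 1.5])  # a point in the camera frame (meters)
uvw = K @ p_C                     # homogeneous pixel coordinates
u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
print(u, v)                       # pixel location of the projected point
```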


First, we started the camera node. The Linux terminal command to run the camera calibration node is:

"$ rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.0239 image:=/cv_camera/image_raw camera:=/cv_camera_node --no-service-check"

Here we used a checkerboard of size 8x6 (in terms of the number of squares), and each square had a size of 2.39 cm. The above command opened a Graphical User Interface (GUI), as shown in Fig. 2.

Fig. 2. The camera calibration GUI.

The camera node can capture the image with any USB webcam. The node captures the image and publishes two outputs, the "image_raw" message and the "camera_info" message. The "image_raw" message is the unprocessed image frame. The "camera_info" message contains information such as the image height and width and the location of the camera calibration (.yaml) file. The resolution of the camera is reduced to 640x480 to increase the processing speed. Fig. 3a shows the structure of the cv_camera node.
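For illustration, a stripped-down camera node along these lines might look like the sketch below, using OpenCV and cv_bridge; the topic name follows the "/cv_camera/image_raw" convention used in this paper, but the code is our own sketch and not the actual cv_camera source.

```python
#!/usr/bin/env python
# Sketch of a camera node: grab frames with OpenCV, publish them as ROS images.
import cv2
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

def camera_node():
    rospy.init_node('cv_camera')
    pub = rospy.Publisher('/cv_camera/image_raw', Image, queue_size=1)
    bridge = CvBridge()
    cap = cv2.VideoCapture(0)                 # first USB webcam
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)    # reduce resolution for speed,
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)   # as described above
    rate = rospy.Rate(30)
    while not rospy.is_shutdown():
        ok, frame = cap.read()
        if ok:
            pub.publish(bridge.cv2_to_imgmsg(frame, encoding='bgr8'))
        rate.sleep()

if __name__ == '__main__':
    camera_node()
```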
The image processing node converts the raw image captured by the camera into a rectified image. The raw image contains distortions, both radial and tangential; if we fed it directly to the AprilTag detection node, it could give erroneous results. The node structure is given in Fig. 3b. The inputs to the AprilTag detection node are the rectified image and the camera calibration parameters. There is a tag configuration file associated with the AprilTag detection node; we edited this file to enter values such as the tag ID numbers (the system detects only those tags whose ID numbers are provided in the configuration file) and the tag size (in meters). This node can detect all the tags of the 36h11 AprilTag family. The topics the node publishes are "/tag_detections_image" and the transformation "/tf". "/tag_detections_image" is the same as the input "cv_camera/image_rect", but the detected tag is highlighted and its tag ID is displayed. "/tf" is the relative pose between the camera frame and the detected tag. Fig. 3c shows the AprilTag node structure.

Fig. 3. Block diagrams of all the three nodes created: (a) Camera node (b) Image processing node (c) AprilTag detection node

Launch files were created for each node. The order of calling the launch files can be represented as a flow chart, as given in Fig. 4.

Fig. 4. A flow chart of the order of running launch files in the Linux terminal.
To see the messages passed through "/tf" and "/tag_detections_image", we need to give two terminal commands: "$ rostopic echo /tf" and "$ rqt_image_view". The "rqt_image_view" tool is used for visualizing the contents of a topic in a GUI, whereas the "rostopic echo" command displays the contents in the terminal window. The ROS graph is a pictorial representation of how all the nodes are linked to each other. It is a great tool to check whether all the nodes are appropriately linked to the others through the right topics. ROS has an inbuilt command to display the ROS graph.
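The transform published on "/tf" can also be read programmatically inside a node instead of echoed in a terminal. Below is a minimal sketch of ours using the tf listener API; the frame names "camera" and "tag_2" are assumptions for illustration, not the exact frame names of our setup.

```python
#!/usr/bin/env python
# Sketch: look up the camera-to-tag transform published on /tf.
import rospy
import tf

rospy.init_node('tag_pose_reader')
listener = tf.TransformListener()
rate = rospy.Rate(10)
while not rospy.is_shutdown():
    try:
        # Translation in meters, rotation as a quaternion (x, y, z, w).
        trans, rot = listener.lookupTransform('camera', 'tag_2', rospy.Time(0))
        rospy.loginfo('t=%s q=%s', trans, rot)
    except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
        pass  # transform not available yet
    rate.sleep()
```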
IV. RESULTS AND DISCUSSIONS

Fig. 5 shows the ".yaml" file output for the camera calibration. All the parameters in the calibration parameter matrices are expressed in pixels. To obtain each parameter in millimeters, we multiplied it by the size of one pixel of the camera sensor (see Table 1). Therefore, the focal length f = 593.487 × 3 × 0.001 = 1.779 mm, and the optical center (o_x, o_y) = (332.447, 216.293) pixels = (0.997 mm, 0.649 mm). The pixel skew (α) is 0.
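This unit conversion can be checked in a few lines; the 3 µm pixel pitch is from Table 1, and the pixel values are those reported in the calibration file.

```python
PIXEL_SIZE_MM = 3e-3             # 3 µm pixel pitch (Table 1), in millimeters

f_px = 593.487                   # focal length from the .yaml file, in pixels
ox_px, oy_px = 332.447, 216.293  # optical center, in pixels

print(f_px * PIXEL_SIZE_MM)                          # ~1.78 mm focal length
print(ox_px * PIXEL_SIZE_MM, oy_px * PIXEL_SIZE_MM)  # ~0.997 mm, ~0.649 mm
```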


Here the camera coordinate frame and the world coordinate frame were the same. The projection matrix obtained was similar to the camera matrix, the camera rotation matrix was an identity matrix, and the translation vector was null.

Fig. 5. Camera calibration .yaml file contents.

Fig. 6 shows the complete ROS graph of the system. Nodes are represented as ellipses, and topics as rectangles. No node subscribes to (takes input from) the "/tf" topic. The node "image_proc" simultaneously published to and subscribed from the topic "/cv_camera/image_mono". We did not use the contents of this topic; it only contained the unrectified monochrome image.

Fig. 6. The complete ROS graph.

The detected AprilTag is shown in Fig. 7a. The coloured marking of the detected tag identifies the region of detection, and the tag ID is displayed at the center of the detected marker; in Fig. 7a, the tag ID is 2. The relative pose consists of both translation and rotation, as given in Fig. 7b. The translation is given in a right-handed Cartesian coordinate system, with all displayed values in meters, while the rotation is given in quaternion form.

Fig. 7. The two main ROS topic outputs: (a) The detected AprilTag, (b) The relative pose between camera and AprilTag.
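When Euler angles are easier to interpret than the quaternion shown in Fig. 7b, the quaternion can be converted with the tf helper function; the quaternion value below is made up for illustration.

```python
from tf.transformations import euler_from_quaternion

# A made-up quaternion (x, y, z, w), such as one read from the /tf topic.
q = [0.0, 0.0, 0.3826834, 0.9238795]   # 45-degree rotation about Z
roll, pitch, yaw = euler_from_quaternion(q)
print(roll, pitch, yaw)                # -> 0.0 0.0 0.785... (radians)
```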
Fig. 8 shows the relation between the distance from the camera to the AprilTag and the orientation of the AprilTag for the various tag dimensions. From the experiment, we found that the maximum detection range increases with an increase in tag dimension. The maximum detection ranges for the 10.8x10.8 cm, 12.6x12.6 cm, 15.6x15.6 cm and 20.1x20.1 cm tags were 4.35 m, 5.21 m, 6.12 m and 8.36 m, respectively. For distances near and above the maximum range, the system took a considerable amount of time to detect the tag, and many false detections occurred above the maximum range. There was a steady decrease in detectability with orientation as the distance increased. The minimum detectable angle was almost the same for all the tag dimensions (around 20°), and it occurred near the maximum detectable distance.

We also observed that there was a minimum detectable distance between the tag and the camera. Like the maximum detectable distance, the minimum detectable distance also increased with an increase in the tag dimension. The minimum detection ranges for the 10.8x10.8 cm, 12.6x12.6 cm, 15.6x15.6 cm and 20.1x20.1 cm tags were 5.1 cm, 8.3 cm, 13.4 cm and 18.8 cm, respectively.

Fig. 8. Distance from camera to AprilTag vs. AprilTag orientation for various tag sizes.

V. CONCLUSION

The system gave acceptable results. The camera can detect the marker at medium altitude and at moderately large angles of orientation. To detect an AprilTag from a very high altitude, we should increase the size of the marker. Using markers of different dimensions helps to land a multirotor flying at high altitude in a step-by-step manner: the biggest tag helps in detecting the landing pad or charging pad from high altitude,


and the smallest tag helps in landing precisely on the charging pad. This method can be used for precision landing, since the AprilTag detection is robust to orientation. The camera-carrying quadrotor can detect the tag and give reasonably accurate results even under unstable flying conditions. We observed a delay of about 1 s in detection, since the Raspberry Pi is not a powerful computer for image processing applications; choosing an SBC with higher processing speed and more Random Access Memory (RAM) will improve the detection speed. The pose message obtained from the "/tf" topic is compatible with the MAVROS message format, so it can be sent directly to any PIXHAWK model FCU using the MAVLink communication protocol. A free-space optical beam based drone guidance system is another futuristic research field for this type of application [30,31].
fields for this type of application [30,31]. [20] A.A. Bazil Raj, J.A. Vijaya Selvi, and S. Raghavan, “ Real-time
Measurement of Meteorological Parameters for Estimating Low Altitude
REFERENCES Atmospheric Turbulence Strength (Cn2)”, IET- Sci. Meas. Tech., vol. 8,
[1] R. Mahony, V. Kumar, and P. Corke, “ Multirotor Aerial Vehicles: no.6, pp. 459-469, Nov. 2014.
Modeling, Estimation, and Control of Quadrotor,” Robotics Automation [21] S.R. Nishad and A. A. Bazil Raj, “ Sliding Mode Control of Robotic Gait
Magazine, IEEE, vol. 19, pp. 20-32, Sept. 2012. Simulator,” 2019 International Conference on Intelligent Computing and
[2] “ ROSConcepts.” [Online]. Available: http://wiki.ros.org/ROS/Concepts. Control Systems (ICCS), May, 2019, pp. 48-53.
[Accessed: 10 August 2019]. [22] A.A.Raj, J.A. Vijaya Selvi,D. Kumar, and N. Sivakumaran, “Mitigation
[3] E. Olson, “ AprilT ag: A robust and flexible visual fiducial system,” in of beam fluctuation due to atmospheric turbulence and prediction of
2011 IEEE International Conference on Robotics and Automation control quality using intelligent decision-making tools”, Applied opti.,
(ICRA), 2011, pp. 3400-3407. vol. 53, no.17, pp.3796-806, Jun. 2014.
[4] J. Wang and E. Olson, “ AprilT ag 2: Efficient and robust fiducial [23] Upasana Garg , A A Bazil Raj, and K P Ray, “Cognitive Radar Assisted
detection,” in 2016 IEEE/RSJ International Conference on Intelligent T arget T racking: A Study”, in 3rd International Conference on
Robots and Systems (IROS), 2016, pp. 4193-4198. Communication and Electronics Systems (ICCES), Oct. 2018, pp. 427 -
430.
[5] R. Acuna and V. Willert, “ Dynamic Markers: UAV landing proof of
concept,” in 15th Latin American Robotic Symposium, 6th Brazilian [24] C. Rajkumar and A. A. Bazil Raj, “ Design and Development of DSP
Symposium on Robotics (SBR) and 9th Workshop on Robotics in Interfaces and Algorithm for FMCW Radar Altimeter,” in 4th
Education (WRE), 2018, pp. 496-502. International Conference on Recent Trends on Electronics, Information,
Communication & Technology (RTEICT), May 2019, pp. 720-725.
[6] G. Zhenglong, F. Qiang and Q. Quan, “Pose Estimation for Multicopters
Based on Monocular Vision and AprilT ag,” in 37th Chinese Control [25] L. Prasad and A. A. B. Raj, “ Design of 2D-WH/TS OCDMA PON ONU
Conference (CCC), 2018, pp. 4717-4722. Receiver with FLC T echnique,” 2019 4th International Conference on
Recent T rends on Electronics, Information, Communication &
[7] J. Qi, X. Guan and X. Lu, “ An Autonomous Pose Estimation Method of T echnology (RT EICT ), May 2019, pp. 470 -474.
MAV Based on Monocular Camera and Visual Markers,” in 13th World
Congress on Intelligent Control and Automation (WCICA), 2018, pp. [26] Priyanka Shakya , and A. A. Bazil Raj, “Inverse Synthetic Aperture
252-257. Radar Imaging Using Fourier T ransform Technique”, in 1st International
Conference on Innovations in Information and Communication
[8] J. Deeds, Z. Engstrom, C. Gill, Z. Wood, J. Wang, I. S. Ahn, Y. Lu, T echnology (ICIICT ), Apr. 2019,pp. 1-4.
“ Autonomous Vision-based T arget Detection Using Unmanned Aerial
Vehicle,” in IEEE 61st International Midwest Symposium on Circuits [27] A. Gupta and A. A. Bazil Rai, “ Feature Extraction of Intra-Pulse
and Systems (MWSCAS), 2018, pp. 1078-1081. Modulated LPI Waveforms Using ST FT ,” 2019 4th International
Conference on Recent T rends on Electronics, Information,
[9] P. Corke, “Representing Position and Orientation,” in Robotics, Vision Communication & Technology (RT EICT ), May 2019, pp. 742-746.
and Control: Fundamental algorithms in MATLAB, 2nd ed.
Switzerland: Springer International Publishing, 2017, pp. 44-45. [28] D. Gupta, A. A. Bazil Raj and A. Kulkarni, “ Multi-Bit Digital Receiver
Design For Radar Signature Estimation,” in 3rd IEEE International
[10] “ ROS.” [Online]. Available: https://ardupilot.org/dev/docs/ros.html. Conference on Recent T rends in Electronics, Information &
[Accessed: 3 September 2019]. Communication Technology (RT EICT ), May,2018, pp. 1072-1075.
[11] X. Liang, Y. Du and D. Wei, “ An Integrated Camera Parameters [29] R. Vaishnavi, G. Unnikrishnan and A. A. Bazil Raj, “ Implementation of
Calibration Approach for Robotic Monocular Vision Guidance,” in 34th algorithms for Point target detection and tracking in Infrared image
Youth Academic Annual Conference of Chinese Association of sequences,” in 4th International Conference on Recent T rends on
Automation (YAC), 2019, pp. 455-459 Electronics, Information, Communication & T echnology (RT EICT ),
[12] “ Camera Calibration.” [Online]. Available: https://opencv -python- May 2019, pp. 904-909.
tutroals.readthedocs.io/en/latest/py_tutorials/py_calib3d/py_calibration/ [30] A.A. Bazil Raj, “ Free space optical communication: system design,
py_calibration.html. [Accessed: 04 September 2019]. modleing, characterization and dealing with turbulence”, 1 st ed., Ghbh:
[13] D. V. T hiruvoth, A. A. B. Raj, B. P. Kumar, V. S. Kumar and R. D. De Gruyter, 2015.
Gupta, “ Dual-Band Shared-Aperture Reflectarray Antenna Element at [31] Arun K Majumder, Zabih Ghassemlooy and A.A. Bazil Raj, “ Principles
Ku-Band for the TT&C Application of a Geostationary Satellite,” in 4th and applications of free space optical communications”, 1 st ed., UK:
International Conference on Recent Trends on Electronics, Information, IET , 2018.
Communication & Technology (RT EICT ), May 2019, pp. 361-364.
[14] A.B.A. Anthonisamy, P. Durairaj, and L.J Paul, “ Performance Analysis
of Free Space Optical Communication in Open-Atmospheric Turbulence
Conditions with Beam Wandering Compensation Control,” IET Comm. ,
vol. 10, no. 9, pp.1096-1103, Jun. 2016.
[15] S. Batabyal and A. A. Bazil Rai, “ Design of A Ring Oscillator Based
PUF with Enhanced Challenge Response pair and Improved Reliability,”

