This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/TIE.2018.2807401, IEEE Transactions on Industrial Electronics

IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS
Abstract—This paper develops a method of calculating the three-dimensional coordinates of a target with image information taken from an unmanned aerial vehicle. Unlike the usual approaches to target geolocation, which rely heavily on a georeferenced terrain database or accurate attitude sensors, the contribution of the proposed method is to offset these constraints and perform three-dimensional geolocation of the target based on the relative altitude between the aircraft and the target, calculated using a stereo-vision technique. Considering the poor performance of the yaw-angle measurement provided by the low-quality attitude sensors on the unmanned aerial vehicle, a novel vision-based three-dimensional geolocation method is designed. The proposed method is proven valid and practical via simulation results and actual flight experiments.

Index Terms—UAV, target 3D geolocation, computer vision, altitude estimation.

I. INTRODUCTION

Accurate and reliable vision-based target position estimation using a UAV is a fundamental issue for disaster monitoring, target tracking, and rescue missions. Generally speaking, it depends on accurate yaw-angle measurements and a known terrain map that provides the altitude of the target, so it is not suited to an onboard low-quality attitude sensor or an unknown environment. For this reason, this paper aims to solve the problem by using only vision to estimate the yaw-angle error and the altitude of the target, which is a challenging task.

In recent years, researchers have paid much attention to target geolocation using visual cameras for UAV systems. Kaminer et al. [11], [12] showed that a vision-based tracking system could estimate the coordinates of a moving target in real time with a gimbal-stabilized camera, but an accurate georeferenced terrain database like the Perspective View Nascent Technology (PVNT) system [13] is required to obtain the target's altitude. Barber et al. [14] discussed four techniques
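The stereo-vision idea above — recovering the relative altitude between aircraft and target from two views — rests on standard two-view triangulation. Below is a minimal sketch using linear (DLT) triangulation; the intrinsics, the two poses 1 m apart, and the ground point are illustrative values, not the paper's flight data:

```python
import numpy as np

def project(P, X):
    """Project a 3D world point X through a 3x4 projection matrix P to pixels."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen from two UAV poses."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null-space of A is the homogeneous 3D point
    X = Vt[-1]
    return X[:3] / X[3]

# Illustrative setup: pinhole camera (f = 500 px), two poses 1 m apart.
K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), [[-1.], [0.], [0.]]])  # second camera center at (1, 0, 0)

X_true = np.array([0.5, 0.2, 10.0])   # ground object, 10 m along the optical axis
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
# For a nadir-pointing camera, the recovered depth is the relative altitude.
```

With noise-free pixels the DLT recovers the point exactly; in practice the two views come from different instants along the flight path and the baseline is provided by GPS.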
0278-0046 (c) 2018 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.
TABLE II
SINGLE SHOT ESTIMATION PERFORMANCE WITH THE USUAL APPROACH: 100 MC RUNS

TABLE III
ITERATIVE PROCESS FOR LOITERING SCENARIO

k          0      1      2      3      4
δψ, deg    0      2.57   28.17  28.14  28.15
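Table III shows the yaw-angle bias estimate δψ settling after a few passes, the signature of a fixed-point style iteration: each pass re-estimates the bias from the current geolocation and repeats until the update stalls. A generic skeleton of such a loop is sketched below; the toy contraction standing in for the paper's measurement update is purely hypothetical (its fixed point is a 0.5 rad bias):

```python
def fixed_point(update, x0, tol=1e-3, max_iter=20):
    """Iterate x <- update(x) until the step size drops below tol."""
    x, history = x0, [x0]
    for _ in range(max_iter):
        x_new = update(x)
        history.append(x_new)
        converged = abs(x_new - x) < tol
        x = x_new
        if converged:
            break
    return x, history

# Hypothetical stand-in for "re-estimate yaw bias from current residuals":
# a contraction that pulls the estimate toward a 0.5 rad bias.
toy_update = lambda d: d - 0.7 * (d - 0.5)

bias, history = fixed_point(toy_update, 0.0)
```

The stopping rule (step size below a tolerance) matches the behaviour in Table III, where δψ changes by only hundredths of a degree after the third pass.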
TABLE IV
RELATIVE ALTITUDE AND YAW-ANGLE BIAS ESTIMATION PERFORMANCE: 100 MC RUNS

TABLE V
SINGLE SHOT ESTIMATION PERFORMANCE WITH OUR METHOD: 100 MC RUNS
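Performance figures reported per "100 MC runs", as in Tables IV and V, come from Monte Carlo evaluation: rerun the estimator on freshly sampled sensor noise and summarize the error statistics. A minimal sketch follows; the 50 m altitude, 1 m noise level, and the trivial averaging "estimator" are illustrative placeholders, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
true_alt = 50.0           # illustrative relative altitude, metres
n_runs, n_meas = 100, 20  # 100 MC runs, 20 noisy measurements per run

errors = []
for _ in range(n_runs):
    meas = true_alt + rng.normal(0.0, 1.0, n_meas)  # 1 m measurement noise
    est = meas.mean()                               # placeholder estimator
    errors.append(est - true_alt)

errors = np.asarray(errors)
rmse = np.sqrt(np.mean(errors**2))
print(f"mean error {errors.mean():.3f} m, RMSE {rmse:.3f} m")
```

Reporting both the mean error (bias) and the RMSE over the runs is what makes columns like those in Tables IV and V comparable across methods.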
Fig. 9. Flight test: an outdoor scenario. (a) UAV takes visual measurements of the GOI. (b) GOI detection. (c) Target detection.
TABLE VII
TARGET 3D GEOLOCATION RESULTS WITH OUR METHOD
calculated using only a one-shot measurement. Compared with the usual approach, the position uncertainty using this method is decreased by about 5 m in the horizontal direction, while the achieved position accuracy is about 0.5 m in the vertical direction.

For generality of the proposed method, future work will address moving-GOI scenarios and also geolocating and tracking a moving target.

REFERENCES

[1] D. J. Pack, G. W. P. York, R. E. Sward, and S. D. Cooper, "Searching, Detecting, and Monitoring Emergency Sites Using Multiple Unmanned Aerial Vehicles," AIAA Paper 2005-7031, Sep. 2005.
[2] B. Grocholsky, S. Bayraktar, V. Kumar, and G. Pappas, "UAV and UGV Collaboration for Active Ground Feature Search and Localization," AIAA Paper 2004-6565, Sep. 2004.
[3] J. E. Gomez-Balderas, G. Flores, L. G. Carrillo, and R. Lozano, "Tracking a ground moving target with a quadrotor using switching control," Journal of Intelligent and Robotic Systems, vol. 70, no. 1, pp. 65–78, Apr. 2013.
[4] S. Minaeian, J. Liu, and Y. Zhong, "Vision-based target detection and localization via a team of cooperative UAV and UGVs," IEEE Trans. Syst., Man, Cybern., Syst., vol. 46, no. 7, pp. 1005–1016, Jul. 2016.
[5] C. Kanellakis and G. Nikolakopoulos, "Survey on Computer Vision for UAVs: Current Developments and Trends," Journal of Intelligent and Robotic Systems, vol. 87, no. 1, pp. 141–168, Jul. 2017.
[6] S. Y. Zhao, Z. Y. Hu, B. M. Chen, and T. H. Lee, "A Robust Real-Time Vision System for Autonomous Cargo Transfer by an Unmanned Helicopter," IEEE Trans. Ind. Electron., vol. 62, no. 2, pp. 1210–1218, Feb. 2015.
[7] F. Lin, X. Dong, B. M. Chen, K. Lum, and T. H. Lee, "A robust real-time embedded vision system on an unmanned rotorcraft for ground target following," IEEE Trans. Ind. Electron., vol. 59, no. 2, pp. 1038–1049, Feb. 2012.
[8] X. Zhang, B. Xian, B. Zhao, and Y. Zhang, "Autonomous flight control of a nano quadrotor helicopter in a GPS-denied environment using on-board vision," IEEE Trans. Ind. Electron., vol. 62, no. 10, pp. 6392–6403, Oct. 2015.
[9] Q. Fu, Q. Quan, and K. Y. Cai, "Robust Pose Estimation for Multirotor UAVs Using Off-Board Monocular Vision," IEEE Trans. Ind. Electron., vol. 64, no. 10, pp. 7942–7951, Oct. 2017.
[10] L. Wang and Z. J. Zhang, "Automatic Detection of Wind Turbine Blade Surface Cracks Based on UAV-Taken Images," IEEE Trans. Ind. Electron., vol. 64, no. 9, pp. 7293–7303, Sep. 2017.
[11] I. H. Wang, V. N. Dobrokhodov, I. I. Kaminer, and K. D. Jones, "On vision-based target tracking and range estimation for small UAVs," AIAA Paper 2005-6401, 2005.
[12] V. N. Dobrokhodov, I. I. Kaminer, K. D. Jones, and R. Ghabcheloo, "Vision-based tracking and motion estimation for moving targets using small UAVs," American Control Conference, Minneapolis, USA, Jun. 2006.
[13] V. N. Dobrokhodov, I. I. Kaminer, K. D. Jones, and R. Ghabcheloo, "Vision-based tracking and motion estimation for moving targets using unmanned air vehicles," Journal of Guidance, Control, and Dynamics, vol. 31, no. 4, pp. 907–917, Jul. 2008.
[14] D. B. Barber, J. D. Redding, T. W. McLain, R. W. Beard, and C. N. Taylor, "Vision-Based Target Geolocation Using a Fixed-Wing Miniature Air Vehicle," Journal of Intelligent and Robotic Systems, vol. 47, no. 4, pp. 361–382, Dec. 2006.
[15] M. E. Campbell and M. Wheeler, "Vision-Based Geolocation Tracking System for Uninhabited Aerial Vehicles," Journal of Guidance, Control, and Dynamics, vol. 33, no. 2, pp. 521–532, Mar. 2010.
[16] H. K. Min and G. N. DeSouza, "Geolocation of multiple targets from airborne video without terrain data," Journal of Intelligent and Robotic Systems, vol. 62, no. 1, pp. 159–183, Apr. 2011.
[17] M. Pachter, N. Ceccarelli, and P. R. Chandler, "Vision-Based Target Geolocation Using Micro Air Vehicles," Journal of Guidance, Control, and Dynamics, vol. 31, no. 3, pp. 597–615, 2008.
[18] N. R. Draper and H. Smith, Applied Regression Analysis, Wiley-Interscience, New York, pp. 108–111, 1998.
[19] M. Z. Brown, D. Burschka, and G. D. Hager, "Advances in computational stereo," IEEE Trans. Pattern Anal. Mach. Intell., vol. 25, no. 8, pp. 993–1008, Aug. 2003.
[20] E. Trucco and A. Verri, Introductory Techniques for 3-D Computer Vision, Prentice Hall, Englewood Cliffs, pp. 150–170, 1998.
[21] F. Deng, S. P. Guan, X. H. Yue, X. D. Guo, J. Chen, J. Y. Lv, and J. H. Li, "Energy-Based Sound Source Localization with Low Power Consumption in Wireless Sensor Networks," IEEE Trans. Ind. Electron., vol. 64, no. 6, pp. 4894–4902, Jan. 2017.
[22] F. Y. Chen, R. Q. Jiang, K. K. Zhang, B. Jiang, and G. Tao, "Robust Backstepping Sliding-Mode Control and Observer-Based Fault Estimation for a Quadrotor UAV," IEEE Trans. Ind. Electron., vol. 63, no. 8, pp. 5044–5055, Aug. 2016.
[23] H. Liu, J. X. Xi, and Y. S. Zhong, "Robust Attitude Stabilization for Nonlinear Quadrotor Systems With Uncertainties and Delays," IEEE Trans. Ind. Electron., vol. 64, no. 7, pp. 5585–5594, Jul. 2017.
[24] B. L. Tian, L. H. Liu, H. C. Lu, Z. Y. Zuo, Q. Zong, and Y. P. Zhang, "Multivariable Finite Time Attitude Control for Quadrotor UAV: Theory and Experimentation," IEEE Trans. Ind. Electron., vol. 65, no. 3, pp. 2567–2577, Mar. 2018.
[25] D. H. Titterton and J. L. Weston, Strapdown Inertial Navigation Technology, The Institution of Electrical Engineers, Michael Faraday House, pp. 17–55, 2004.
[26] D. G. Lowe, "Distinctive Image Features from Scale-Invariant Keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, Nov. 2004.
[27] H. Bay, A. Ess, T. Tuytelaars, and L. V. Gool, "SURF: Speeded Up Robust Features," European Conference on Computer Vision (ECCV), Graz, Austria, pp. 404–417, May 2006.
[28] E. Rublee, V. Rabaud, K. Konolige, and G. Bradski, "ORB: An efficient alternative to SIFT or SURF," IEEE International Conference on Computer Vision (ICCV), Barcelona, Spain, pp. 2564–2571, Nov. 2011.
[29] R. L. Plackett, "A Historical Note on the Method of Least Squares," Biometrika, vol. 36, no. 3/4, pp. 458–460, Dec. 1949.
[30] O. Faugeras, Three-Dimensional Computer Vision: A Geometric Viewpoint, MIT Press, pp. 165–230, 1993.
[31] L. J. Ding and Q. Y. Chen, Numerical Analysis Methods, 2nd ed., Beijing Institute of Technology Press, 2008.
[32] M. Quigley, B. Gerkey, K. Conley, et al., "ROS: an open-source Robot Operating System," ICRA Workshop on Open Source Software, vol. 3, no. 3.2, 2009.
[33] E. Olson, "AprilTag: A robust and flexible visual fiducial system," Proc. of IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China, pp. 3400–3407, May 2011.
[34] J. Wang and E. Olson, "AprilTag 2: Efficient and robust fiducial detection," Proc. of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, pp. 4193–4198, Oct. 2016.
Lele Zhang received his B.E. degree in automation from Inner Mongolia University, Inner Mongolia Autonomous Region, China, in 2013. He engaged in research work as a visiting Ph.D. student with the Department of Electrical and Computer Engineering, National University of Singapore, from November 2016 to November 2017. He is currently a Ph.D. candidate with the School of Automation, Beijing Institute of Technology. His current research interests include camera calibration, SLAM, and deep learning, specifically in the area of aerial robotics.

Swee King Phang (M'14) received his B.Eng. and Ph.D. degrees in electrical engineering from the National University of Singapore (NUS), in 2010 and 2014, respectively. He is currently a lecturer at Taylor's University, Malaysia. His research interests are in system and control of unmanned aerial systems (UAS), including modelling and flight controller design of UAS, indoor navigation of UAS, and trajectory optimization.