
A Hybrid Localization Approach for UAV in GPS Denied Areas

S. Rady, A.A. Kandil, and E. Badreddin

Abstract—In this paper, the localization problem of an autonomous unmanned aerial vehicle (UAV) in the case of losing the GPS signal is handled. A vision-based solution approach is proposed, consisting of two phases. In the first phase, a hybrid map is constructed. This map consists of a reduced set of features obtained by information-theoretic analysis, which enables faster UAV localization processing without degrading the accuracy. The features are represented by local descriptors which are additionally tagged with their metric positions. The second phase localizes the UAV using the map, and is performed on two scales. A fast and coarse topological location is identified by matching features of images taken by the camera against the local descriptor information in the map. This guides the UAV in fast and safe emergency homing. A precise metric position can additionally be estimated with respect to the previously identified topological location, with the aid of the features' metric position information. This can assist the UAV navigation in case the mission should be completed without interruption despite the GPS signal loss.

I. INTRODUCTION

The integration of Unmanned Aerial Vehicles (UAVs) into civil application fields is growing every day. Fixed-wing aircraft and vertical take-off and landing (VTOL) unmanned vehicles are increasingly used for surveillance, search and rescue missions, aerial mapping, inspection, and agricultural imaging, to name just a few application domains [1]. Unlike fixed-wing aircraft, helicopters (which will be considered here) are distinguished by their maneuverability and hovering ability, which makes them suitable for performing a wide variety of tasks.

The autonomous navigation of UAVs relies on accurate determination of their location, which is commonly acquired by a Global Positioning System (GPS). An autonomous UAV is usually equipped with a GPS receiver and an Inertial Measurement Unit (IMU), consisting of three gyroscopes and three accelerometers, to determine the position and orientation of the UAV during flight. However, the GPS signal can be lost or disrupted by bad weather conditions, obstacles or terrain, inconvenient satellite positions, or even jamming. If the GPS signal is lost, the IMU alone cannot provide reliable UAV position information, as the state estimation solution (position, velocity and attitude) drifts in time [2]. This results in errors in determining the accurate position of the UAV during the flight mission, which could have disastrous consequences [3], especially when the UAV is out of the pilot's sight and cannot be brought back to the homing point manually. Thus, such a safety-critical UAV system should be able to navigate safely even if the GPS signal is disrupted or totally lost.

This problem has been investigated by researchers in the last few years. One potential solution is based on the integration of a vision system to support the already existing navigation system (fig. 1). A localization method can be utilized to detect pre-extracted features in the images and hence to calculate the position of the UAV when the GPS signal is lost. A Kalman filter can be used to fuse the signals from the inertial sensors with the position sensor (the GPS, or the vision system in case the GPS is not available). Vision-based solutions are attractive since video cameras have become standard equipment for UAVs and provide rich information such as color and texture.

Figure 1. Navigation architecture.

Many research works utilize the simultaneous localization and mapping (SLAM) method, which is well described in [4, 5]. SLAM algorithms are used to develop landmark-based navigation systems with the capability of online map building, and SLAM became a standard technique for indoor applications. Numerous research works were carried out to apply the SLAM technique to UAVs. Wang et al. [6] and Kim et al. [7] presented the integration of vision-based measurements with an inertial navigation system to simultaneously localize a UAV through the positions of features extracted by image processing. However, SLAM is still a challenge when applied to large areas.

Manuscript received October 10, 2011.
S. Rady is with the Institute for Computer Engineering of the University of Heidelberg, 68131 Mannheim, Germany (phone: 0049-621-1812471; fax: 0049-621-1812740; e-mail: sherine.rady@ziti.uni-heidelberg.de).
A. A. Kandil is with the Institute for Computer Engineering of the University of Heidelberg, 68131 Mannheim, Germany (e-mail: amr.kandil@ziti.uni-heidelberg.de).
E. Badreddin is with the Institute for Computer Engineering of the University of Heidelberg, 68131 Mannheim, Germany (e-mail: essameddin.badreddin@ziti.uni-heidelberg.de).
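The fusion scheme sketched in fig. 1 can be illustrated with a minimal example: a linear Kalman filter that propagates an IMU acceleration measurement and corrects the prediction with a position fix from either the GPS or the vision system. This is only an illustrative sketch; the state layout, the noise magnitudes and all numerical values below are assumptions, not the filter used in the system described here.

```python
import numpy as np

# Minimal constant-velocity Kalman filter for one axis.
# State x = [position, velocity]; the prediction step integrates the
# IMU acceleration, the update step corrects with a position fix from
# GPS or, when GPS is unavailable, from the vision system.
dt = 0.1                                   # sample time [s] (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition
B = np.array([[0.5 * dt**2], [dt]])        # acceleration input matrix
H = np.array([[1.0, 0.0]])                 # only position is measured
Q = 0.05 * np.eye(2)                       # process noise (assumed)
R_gps, R_vision = 1.0, 4.0                 # measurement variances (assumed)

def predict(x, P, accel):
    """Propagate the state with the measured IMU acceleration."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, R):
    """Correct the prediction with a position fix (GPS or vision)."""
    S = H @ P @ H.T + R                    # innovation covariance (1x1)
    K = P @ H.T / S                        # Kalman gain
    x = x + K * (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros((2, 1)), np.eye(2)
for k in range(50):
    x, P = predict(x, P, accel=1.0)        # constant 1 m/s^2 acceleration
    true_pos = 0.5 * 1.0 * ((k + 1) * dt)**2
    # use the GPS variance while the fix is available, else vision
    R = R_gps if k < 25 else R_vision
    x, P = update(x, P, z=true_pos, R=R)

print("estimated position:", x[0, 0])
```

Switching only the measurement variance when the source changes, as done here, is the simplest way to let one filter accept fixes from either sensor.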
Lindsten et al. [8] presented a method which relies on flying over the area of interest to construct a reference map. The constructed map is then saved onboard to assist the later localization of the UAV. According to the authors, using existing maps instead of creating a new one online results in a more accurate position determination of the UAV. The UAV localization can be performed by matching the images taken during the flight mission against the pre-existing map saved onboard. The images are segmented into uniform regions (superpixels), each superpixel is classified using a neural network classifier, and the classified images are then matched with the pre-existing map. This method performs well if the overflown area is rich in information. However, if the area or the pre-existing map is low on information, the UAV cannot be localized with acceptable accuracy.

The work presented here also relies on constructing a reference map to be used later by the helicopter for stand-by localization. The paper is organized as follows: section 2 describes the problem to be handled, followed by a description of the UAV in section 3. The vision-based map construction and localization approach is presented in section 4. Section 5 explains the triangulation method used in the map building and in the UAV metric position estimation. The paper is finished by a conclusion section.

II. PROBLEM DESCRIPTION

The loss of the GPS signal represents an emergency situation for a UAV executing a mission. Researchers have developed several measures to be taken in this case, depending on the situation itself. If the pilot has sight contact with the UAV, he can switch from the autonomous to the manual mode and bring the vehicle to the nearest possible landing position. If the UAV is out of the pilot's sight, it cannot be flown manually and the homing procedure has to be automated. In other situations, it might be preferable that the UAV continue the mission without interruption and only afterwards fly directly to the landing point [9]. For these two latter situations, an additional integrated vision navigation system can handle such emergencies. It helps guiding the UAV to its home point or supports the UAV in continuing its mission when the GPS is not available. However, in such emergencies, the vision-based localization methods need to be fast and accurate.

SLAM algorithms perform well for indoor applications, but have problems when applied to large areas (such as outdoor applications). Furthermore, the utilization of previously existing maps to localize the UAV when the GPS signal is lost has proven to lead to better results. However, the accuracy of the solution depends mainly on the number of features extracted from the images. On the one side, if the number of features is high, an accurate localization of the UAV can be obtained. On the other side, the computation time can be high, which is actually a main drawback of using pre-existing maps. In emergency cases, high computational time cannot be afforded and a fast determination of the position of the UAV is crucial.

In this work, a novel approach for the localization of a UAV based on pre-existing maps is presented. The approach applies an information-theoretic technique to construct a map with a reduced set of informative features. The technique identifies the most informative data for the map and filters out the less informative ones. In this way, the number of features is reduced without negatively affecting the localization accuracy. Finally, an efficient map is generated that can be mounted on the UAV platform for localization in case the GPS system malfunctions. Feature reduction based on information-theoretic analysis has the double benefit of minimizing the recognition and localization uncertainties as well as reducing the feature matching time, which directly influences the time needed to calculate the position of the UAV. The proposed approach integrates an additional compression of the filtered features. This compressed feature format can assist the UAV in quick and safe homing in emergency situations.

III. HELICOPTER DESCRIPTION

The autonomous helicopter (fig. 2) at the Automation Laboratory is of the type Scout B1-100 [9]. It has been modified to cope with the foreseen research requirements of the research group; for instance, a radar-based collision avoidance system has been developed [10]. The helicopter is powered by an air-cooled single-piston internal combustion engine (100 cc) and has a main rotor diameter of 3.2 meters. It has two fuel tanks of 5 liters each and a total payload of 31 kg. The flight endurance reaches about one and a half hours. The helicopter is equipped with a differential GPS system and can fly planned missions autonomously. In manual mode, the pilot can fly the helicopter using an RC-control device. It can also fly in a so-called assisted mode.

Figure 2. Autonomous helicopter from Aeroscout, type Scout B1-100.

IV. VISION-BASED LOCALIZATION APPROACH

Figure 3 shows the structure of the proposed vision-based localization solution. Besides the common feature extraction process for constructing the map, the solution relies on two additional processes of feature evaluation and feature
compression (fig. 3.a). These processes enable generating an efficient map that provides high localization accuracy with fewer features, and consequently yields computational savings in memory and processing time [11,12]. Moreover, the solution structure makes use of combined topological and metric features to construct a hybrid map. This map is capable of localizing the UAV on two scales: a coarse topological region and a precise metric position.

As shown in fig. 3.b, the realization of the map generation concept is divided into two phases: the map construction phase and the UAV localization phase. The main processing for the map generation is executed offline after data gathering to produce the final map, which is then installed on the UAV platform for stand-by localization.

The proposed solution structure processes the space and the features on two different levels, a topological and a geometric level. It employs three main modules, as shown in figure 3: (A) hybrid feature extraction, (C) map generation and (D) hybrid localization. The hybrid feature extraction module combines two different feature representations in order to resolve the UAV location at the two scales. The map generation module employs components for feature evaluation and feature compression: the first filters the most relevant set from the extracted features using an information-theoretic technique; the second further reduces the size of the filtered features through a codebook compression to generate the final map. The hybrid localization module affords localizing the UAV to a node (i.e. a topological region) and extends the localization to a precise metric one, if desired, confined to the identified node in a hierarchical manner. Hierarchical localization frameworks [12,13,14] provide computational efficiency, since the search is executed in more than one step and the final search is projected onto a smaller space rather than exhaustively searching the entire space. Additionally, they scale well to large-space environments.

The three modules will be explained in detail in the subsequent subsections.

A. Hybrid Feature Extraction Module

The hybrid feature extraction consists of a local feature extraction component and a feature localization component. A local point feature extraction technique is suggested for the first component, such as the Scale Invariant Feature Transform (SIFT) [15] or the Speeded-Up Robust Features (SURF) [16]. These techniques are distinguished by the richness of their descriptors and their robustness of detection under scale changes, viewpoint changes and other transformations and distortions. Additionally, they suit mapping environments with poor structure. Their disadvantage is the high computational power required for extraction and matching. The map generation described in subsection C will treat this problem by speeding up the matching.

The feature localization component calculates the position of the extracted features in the global frame of reference. It
Figure 3. (a) Map generation concept. (b) Solution structure for a hybrid map building and localization.
applies triangulation for this purpose. Different positions of the moving vehicle are required for the triangulation execution (fig. 4). For this, the GPS system must be functioning in the map construction phase to supply the vehicle position. The different vehicle positions are used together with the correspondingly measured bearings (i.e. the angles between the unknown camera heading direction and the features identified in the camera image, in pixels) to calculate the positions of the features.

Figure 4. Feature localization in the map building phase using the vehicle position measured by the GPS and the angles to the features measured by the camera.

B. Initial Map

Using the hybrid feature extraction, the helicopter can fly over the area of interest, collect image data and construct a feature map for localization. The area is divided into a set of discrete regions (nodes) which are related to each other by neighborhood interconnectivities. The nodes can be assigned automatically by generating them at equidistant intervals traveled by the flying UAV. An alternative option is to assign a new node at a region exhibiting a variation in the scene detail, which can be monitored through the dynamic change of the tracked features in the camera view.

After the UAV executes the mission of collecting data at each node, a hybrid map can be constructed. The hybrid map in this case will employ nodes with local features which possess both non-geometric and geometric information. The non-geometric information part is represented by the gradient descriptor in the case of SIFT or SURF feature extraction and is used to characterize the topological nodes. The geometric information part is represented by the tagged position information of the features in the nodes and will be used together with the non-geometric part to resolve the UAV metric location. The hybrid structure of the map information will be used for two different localization modes of the UAV, as will be explained in subsection D. This constructed map, however, does not specifically contain the most informative data. Excessive or redundant data may exist, which increases the localization uncertainty and the computational overhead. Therefore, further processing is suggested, in order to extract the minimum possible data that maximizes the localization efficiency in terms of accuracy and computational performance.

C. Map Generation Module

In order to generate a more compact and efficient map than the previous one, the initial map is processed by an information-theoretic evaluation approach described in [11,17]. This approach makes use of two components, information-theoretic feature evaluation and codebook feature compression, to generate the final map. It utilizes an entropy-based evaluation criterion combined with a clustering technique that provides additional compression for the evaluated data. It has proven to provide accuracy combined with computational efficiency for indoor robot navigation [11] and under severe illumination conditions [17]. The approach aims at maximizing the topological node recognition by filtering the most discriminative information at each node. The filtered set after evaluation is called entropy-based features, as identified in fig. 3. The feature evaluation component is implemented in a way that supplies cluster information to a second codebook component. This codebook allows further compression to be applied to the evaluated features, resulting in a feature format called the codewords (see the figure). This additionally compressed version of the features, which contains the non-geometric information part only, is targeted for use in the topological matching, in order to accelerate recognition while still maintaining its accuracy.

The idea of the information-theoretic technique is to have every topological node initially described with a set of local features, and moreover to compress those features to define a set of local codewords. The compression is done by clustering the features extracted in every node using k-means clustering [18]. Next, the whole feature pool of the nodes is sampled to determine the true variation (distribution) of the features in the high-dimensional space. Afterwards, a conditional-entropy-based criterion is applied to evaluate the contribution of each feature to uniquely identifying a local codeword. This correspondingly measures the feature's contribution to the identification of a specific node, because the codewords are defined locally. The criterion has the following form [19]:

H(O|f_i) = -\sum_{k=1}^{\Omega} P(o_k|f_i) \log_2 P(o_k|f_i)    (1)

where o_k represents a local codeword and f_i represents a true variation sample; k = 1, …, Ω, with Ω the total number of codewords, and i = 1, …, Ψ, with Ψ the total number of feature samples. Finally, the codewords and their corresponding uncompressed features which exhibit relatively large entropy values are discarded, while those exhibiting low entropy values are preserved in the final map. Previous investigations have indicated that a large number of the features (i.e. the features with high entropy values) can be eliminated while still maintaining the same localization accuracy [11,17].

Therefore, the final form of the hybrid map contains the reduced feature set, where every node preserves its features with their measured gradient and position information, and with an additional compressed gradient information version.
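The map-generation pipeline described above (k-means codewords per node, then conditional-entropy filtering with eq. (1)) can be sketched as follows. This is a minimal illustration only: random vectors stand in for SIFT/SURF descriptors, and the distance-based soft assignment used to approximate P(o_k|f_i) as well as the 25% keep-quantile are assumptions of the sketch, not the exact implementation of [11,17].

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=20):
    """Plain k-means; returns the k cluster centres (the 'codewords')."""
    centres = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centres[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return centres

# Toy stand-ins for the descriptors extracted at one node (8-D instead
# of 128-D to keep the sketch small).
features = rng.normal(size=(200, 8))
codewords = kmeans(features, k=5)            # compressed node description

def entropy_of_feature(f, codewords):
    """Conditional entropy H(O|f) of eq. (1), with P(o_k|f) approximated
    by a distance-based soft assignment (an assumption of this sketch)."""
    d2 = ((codewords - f) ** 2).sum(axis=1)
    p = np.exp(-d2 / d2.mean())
    p /= p.sum()
    return float(-(p * np.log2(p + 1e-12)).sum())

# Evaluate every feature and keep only the most discriminative ones:
# low entropy means the feature points clearly at one codeword.
H = np.array([entropy_of_feature(f, codewords) for f in features])
threshold = np.quantile(H, 0.25)             # keep the best 25 %
reduced_map = features[H <= threshold]

print(len(reduced_map), "of", len(features), "features kept")
```

Discarding the high-entropy features, as in the last lines, is what shrinks the per-node feature set while preserving the descriptors that identify the node unambiguously.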
D. Hybrid Localization Module

The hybrid localization module consists of two components: a topological matching component and a feature triangulation component. The former matches the current camera view to the map to identify the current topological node, using the compressed map data, while the latter triangulates the identified features to estimate the position of the UAV in 6-DOF, using the non-compressed map data format.

The UAV localization is processed hierarchically, making use of the divided topological space. At the first level of the hierarchy, the image captured by the camera mounted on the UAV is compared to the hybrid map. Features are extracted from the current image and used in a fast matching against the compressed codewords to identify the corresponding topological node. After the topological node has been identified, the UAV location estimation is executed locally in this region, at the second level of the hierarchy. The identity of the features in the node is resolved by matching the non-geometric information part, and the position of the UAV is estimated by applying a triangulation method to the features using their metric position data.

The two localization components use different versions of the map data, supplying either a coarse topological location or a precise metric position estimate for the UAV. In case of flight emergencies signaled by the loss of the GPS signal, the topological localization component can assist the UAV in fast and safe emergency homing, since less data is involved (the codewords). In case the mission should be completed without interruption and exact positioning is required, the non-compressed data with the metric position information are used, which still provides good computational performance because of the hierarchical processing.

The proposed localization method is robust against uncertainties in node recognition and data correspondences. This is because the features used for the map building are selected based on feature discrimination and node distinctiveness, as measured by the introduced evaluation criterion (i.e. ambiguities are minimized). Moreover, the local matching confined to a single topological node minimizes the possibility of mismatches and the data correspondence problem, compared to matching features against the whole map. Additional confidence for the data correspondence can be provided by utilizing the features' metric relationships, as proposed in [20].

V. METRIC LOCALIZATION USING TRIANGULATION

Both the hybrid feature extraction and the hybrid localization modules apply triangulation to estimate the positions of the features and of the UAV, respectively (i.e. the feature localization and feature triangulation components). The triangulation is based on the photogrammetric projective model shown in figure 5. The model relates the 3-D Cartesian coordinates of features or landmarks in the real world to their corresponding 2-D coordinates projected on the image plane. The fundamental model used in the majority of applications is the collinearity equations [20]. They relate the object point and its corresponding image point to the perspective camera center. The projection model in the figure shows the alignment of the three points, with the image plane set in the positive focal position (i.e. z = -f, where f is the camera focal length).

Figure 5. Photogrammetric projective model.

Projections in the x and y directions are proportional to the horizontal and vertical bearings, respectively, and are described by the collinearity equations:

x_a = -f \frac{r_{11}(X_A - X_C) + r_{21}(Y_A - Y_C) + r_{31}(Z_A - Z_C)}{r_{13}(X_A - X_C) + r_{23}(Y_A - Y_C) + r_{33}(Z_A - Z_C)}    (2)

y_a = -f \frac{r_{12}(X_A - X_C) + r_{22}(Y_A - Y_C) + r_{32}(Z_A - Z_C)}{r_{13}(X_A - X_C) + r_{23}(Y_A - Y_C) + r_{33}(Z_A - Z_C)}    (3)

for a feature position vector {}^{W}q_A = (X_A, Y_A, Z_A)^T and a camera position vector, located by its perspective center, {}^{W}q_C = (X_C, Y_C, Z_C)^T. The r_{ij} are the elements of the rotation matrix describing the transformation between the camera and the global world coordinate systems; they are functions of the roll, pitch and yaw angles of the camera:

{}^{W}_{C}R = R_x(\omega) \cdot R_y(\phi) \cdot R_z(\kappa) = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{pmatrix}    (4)
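As a concrete illustration of the collinearity model of eqs. (2)-(4), the sketch below projects known landmarks into image coordinates and then recovers the 6-DOF camera pose with a small Gauss-Newton least-squares iteration over the reprojection residuals. The scene, the initial guess and the use of a numerical Jacobian are invented for the example; the paper specifies the solver only as a nonlinear least-squares formulation, not this particular implementation.

```python
import numpy as np

def rotation(omega, phi, kappa):
    """Rotation matrix R = Rx(omega) Ry(phi) Rz(kappa) of eq. (4)."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def project(cam, landmark, f=1.0):
    """Image coordinates (x_a, y_a) of eqs. (2) and (3).
    cam = (XC, YC, ZC, omega, phi, kappa)."""
    R = rotation(*cam[3:])
    d = landmark - cam[:3]          # (XA - XC, YA - YC, ZA - ZC)
    q = R.T @ d                     # the terms U, V, W of eqs. (6)-(8)
    return np.array([-f * q[0] / q[2], -f * q[1] / q[2]])

def localize(landmarks, measurements, guess, f=1.0, iters=30):
    """Gauss-Newton minimization of the reprojection residuals with a
    numerical Jacobian; estimates the 6-DOF camera pose."""
    x = guess.astype(float)
    for _ in range(iters):
        r = np.concatenate([project(x, p, f) - z
                            for p, z in zip(landmarks, measurements)])
        J = np.zeros((len(r), 6))
        for j in range(6):
            dx = np.zeros(6); dx[j] = 1e-6
            rj = np.concatenate([project(x + dx, p, f) - z
                                 for p, z in zip(landmarks, measurements)])
            J[:, j] = (rj - r) / 1e-6
        x -= np.linalg.lstsq(J, r, rcond=None)[0]
    return x

# A hypothetical test scene: four non-coplanar landmarks seen from a
# camera whose true pose we then try to recover from the bearings.
true_pose = np.array([1.0, 2.0, 10.0, 0.05, -0.03, 0.4])
landmarks = np.array([[0, 0, 0], [4, 1, 1], [-3, 5, 2], [2, -4, 0.5]], float)
meas = np.array([project(true_pose, p) for p in landmarks])
est = localize(landmarks, meas, guess=np.array([0, 0, 8, 0, 0, 0.2]))
print("estimated pose:", np.round(est, 4))
```

In the map-building phase the same machinery runs the other way around: the camera poses are known from the GPS and each landmark position is the unknown, which makes the residual linear in the sought parameters and the solve correspondingly easier, as the text below notes.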
The problem of either the camera or the landmark position estimation is defined as a nonlinear least squares (NLLS) problem. Therefore, the solution equations for the UAV localization are similar to those of the map-building feature localization.

For the camera localization problem, it is required to estimate the position vector {}^{W}q_C \in R^6 using the horizontal and vertical bearings \hat{x}_a^1, …, \hat{x}_a^m \in R and \hat{y}_a^1, …, \hat{y}_a^m \in R and the given feature location vectors {}^{W}p_A^1, …, {}^{W}p_A^m \in R^3, where m is the number of features identified in the image. Therefore, a residual cost function is defined, from which {}^{W}q_C is estimated. The function is based on minimizing the sum of the squared errors between the actually measured positions of the features in the image and their values calculated from the model equations (2) and (3). Consequently, the cost function has the form:

r = \sum_{i=1}^{m} \left[ \left( \hat{x}_a^i + f \frac{U^i}{W^i} \right)^2 + \left( \hat{y}_a^i + f \frac{V^i}{W^i} \right)^2 \right]    (5)

with (\hat{x}_a, \hat{y}_a) defining the measured feature position in the camera in pixels, and

U^i = r_{11}(X_A^i - X_C) + r_{21}(Y_A^i - Y_C) + r_{31}(Z_A^i - Z_C)    (6)

V^i = r_{12}(X_A^i - X_C) + r_{22}(Y_A^i - Y_C) + r_{32}(Z_A^i - Z_C)    (7)

W^i = r_{13}(X_A^i - X_C) + r_{23}(Y_A^i - Y_C) + r_{33}(Z_A^i - Z_C)    (8)

By solving (5), the camera position {}^{W}q_C can be determined. The UAV position is deduced from the position of the camera. Since the camera is fixed on the helicopter platform, a fixed offset vector q_{off} and a rotation matrix R define the relationship between the UAV body frame and the camera frame:

{}^{W}q_r = {}^{W}q_C + R \cdot q_{off}    (9)

where {}^{W}q_r and {}^{W}q_C define the positions of the UAV and the camera, respectively.

In a similar sense, the same solution is used for localizing the features in the map-building phase. The solution of equation (5) is easier this time, since the collinearity equations are then not functions of the nonlinear parameters of {}^{W}_{C}R. What is required is to estimate each feature's location vector {}^{W}p_A \in R^3, given the camera position vectors {}^{W}q_C \in R^6 and with the help of the bearing measurements, as described previously in figure 4.

VI. CONCLUSION

This paper presents a vision-based approach for localizing a UAV in case of a malfunction of the primary localization system (GPS). The approach is capable of localizing the UAV on both the topological and the metric level. A hybrid map construction is proposed, which is based on maintaining informative features that provide recognition accuracy with less computational overhead. The features carry double information that can resolve the topological location and the metric position separately. The proposed hybrid structure and hierarchical localization processing offer combined accuracy and computational efficiency of localization for the purposes of homing or mission execution under flight emergency situations.

REFERENCES

[1] P. Parker, The 2010-2015 World Outlook for Unmanned Aerial Vehicles (UAV) and Systems. ICON Group International Inc., 2009.
[2] G. Conte and P. Doherty, "An Integrated UAV Navigation System Based on Aerial Image Matching," in IEEE Aerospace Conference, pp. 1-10, 2008.
[3] G. Mao, S. Drake, and B. Anderson, "Design of an Extended Kalman Filter for UAV Localization," in Proc. Information, Decision and Control Conference, pp. 224-229, 2007.
[4] H. Durrant-Whyte and T. Bailey, "Simultaneous Localization and Mapping: Part I," IEEE Robotics & Automation Magazine, 13, 2006.
[5] T. Bailey and H. Durrant-Whyte, "Simultaneous Localization and Mapping: Part II," IEEE Robotics & Automation Magazine, 13, 2006.
[6] J. Wang, M. Garratt, A. Lambert, J. J. Wang, S. Han, and D. Sinclair, "Integration of GPS/INS/Vision Sensors to Navigate Unmanned Aerial Vehicles," The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XXXVII, Part B1, Beijing, pp. 963-969, 2008.
[7] J. Kim and S. Sukkarieh, "6 DoF SLAM Aided GNSS/INS Navigation in GNSS Denied and Unknown Environments," Journal of Global Positioning Systems, Vol. 4, No. 1-2, pp. 120-128, 2005.
[8] F. Lindsten, J. Callmer, H. Ohlsson, D. Törnqvist, T. B. Schön, and F. Gustafsson, "Geo-referencing for UAV Navigation Using Environmental Classification," in Proc. ICRA, pp. 1420-1425, 2010.
[9] Aeroscout GmbH, http://www.aeroscout.ch.
[10] A. Kandil, A. Wagner, and E. Badreddin, "Collision Avoidance in a Recursive Nested Behaviour Control Structure for Unmanned Aerial Vehicles," IEEE International Conference on Systems, Man, and Cybernetics, pp. 4276-4281, 2010.
[11] S. Rady and E. Badreddin, "Information-Theoretic Environment Modeling for Efficient Topological Localization," 10th International Conference on Intelligent Systems Design and Applications, pp. 1042-1046, 2010.
[12] S. Rady, A. Wagner, and E. Badreddin, "Hierarchical Localization using Entropy-based Feature Maps and Triangulation Techniques," IEEE International Conference on Systems, Man, and Cybernetics, pp. 519-525, 2010.
[13] S. Tully, H. Moon, D. Morales, G. Kantor, and H. Choset, "Hybrid Localization using the Hierarchical Atlas," IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2857-2864, 2007.
[14] B. Lisien, D. Morales, D. Silver, G. Kantor, I. Rekleitis, and H. Choset, "Hierarchical Simultaneous Localization and Mapping," in IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 448-453, 2003.
[15] D. G. Lowe, "Distinctive Image Features from Scale-Invariant Keypoints," International Journal of Computer Vision, 60(2), pp. 91-110, 2004.
[16] H. Bay, A. Ess, T. Tuytelaars, and L. Van Gool, "SURF: Speeded Up Robust Features," Computer Vision and Image Understanding, 110(3), pp. 346-359, 2008.
[17] S. Rady, A. Wagner, and E. Badreddin, "Building Efficient Topological Maps for Mobile Robot Localization: An Evaluation Study on COLD Benchmarking Database," IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 542-547, 2010.
[18] J. Han and M. Kamber, Data Mining: Concepts and Techniques, 2nd Edition, Morgan Kaufmann, 2006.
[19] T. M. Cover and J. A. Thomas, Elements of Information Theory. Wiley Series in Telecommunications, 1991.
[20] S. S. Welch, R. C. Montgomery, and M. F. Barsky, "The Spacecraft Control Laboratory Experiment Optical Attitude Measurement System," NASA Technical Memorandum 102624, March 1991.
