
https://github.com/YoujieXia/Awesome-SLAM

Hot SLAM Repos on GitHub


0. Awesome-SLAM: Resources and Resource Collections of SLAM
1. awesome-slam: A curated list of awesome SLAM tutorials, projects and communities.
2. SLAM: learning SLAM, courses, papers and others
3. A list of current SLAM (Simultaneous Localization and Mapping) / VO (Visual Odometry) algorithms
4. awesome-visual-slam: The list of vision-based SLAM / Visual Odometry open source, blogs, and papers
5. Lee-SLAM-source: SLAM development learning resources and experience sharing
6. awesome-SLAM-list
7. VIO-Resources

PC-Side SLAM
Visual SLAM

General

1. BreezySLAM: Simple, efficient, open-source package for Simultaneous Localization and Mapping in Python, Matlab, Java, and C++

Monocular Visual SLAM

1. ORB_SLAM: A Versatile and Accurate Monocular SLAM

Stereo Visual SLAM

1. ORB_SLAM2
2. ORBSLAM2_with_pointcloud_map
3. PL-SLAM: a Stereo SLAM System through the Combination of Points and Line Segments
4. StVO-PL: Stereo Visual Odometry by combining point and line segment features
5. stereo-dso: Direct Sparse Odometry with Stereo Cameras
6. S-PTAM: Stereo Parallel Tracking and Mapping
7. Robust Stereo Visual Odometry

RGB-D Visual SLAM

Visual Inertial SLAM

General
1. maplab: An open visual-inertial mapping framework.

Monocular Visual-Inertial SLAM

1. ROVIO (Robust Visual Inertial Odometry)


2. OKVIS: Open Keyframe-based Visual-Inertial SLAM (ROS Version)
3. LearnVIORB: Visual Inertial SLAM based on ORB-SLAM2 (ROS Version), LearnViORB_NOROS (Non-ROS Version)

Stereo Visual-Inertial SLAM

1. msckf_vio: Robust Stereo Visual Inertial Odometry for Fast Autonomous Flight
2. ORBSLAM_DWO: stereo + inertial input based on ORB_SLAM
3. LearnVIORBnorosgai2: Visual Inertial SLAM based on ORB-SLAM2 (Non-ROS Version)
4. ygz-stereo-inertial: a stereo-inertial visual odometry

Mobile-Side SLAM


1. VINS-Mobile: Monocular Visual-Inertial State Estimator on Mobile Phones
2. ORB_SLAM2-iOS
3. ORB_SLAM-iOS
4. MobileSLAM - LSD SLAM on Mobile Phone

Depth Image API with iPhone 7 Plus (or newer)

1. DepthAPISampleForiOS11
2. AVDepthCamera
3. ios11-depth-map-test

Tutorials
1. Computer Vision/Geometric Fundamentals of SLAM

General

1. 14 Lectures on Visual SLAM (视觉SLAM十四讲): English version and Chinese version


2. Practice of the SlamBook
3. GraphSLAM_tutorials_code
4. SLAM development learning resources and experience sharing
5. Visual SLAM/VIO algorithm notes

Lie Algebra and Lie Groups

1. Lie groups for Computer Vision


2. Lie groups for 2D and 3D Transformations
3. Hermite Splines in Lie Groups as Products of Geodesics
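The Lie-group references above all build on the exponential map from the algebra to the group. As a concrete companion, here is Rodrigues' formula for SO(3) in plain Python; this is a minimal sketch (the function name and pure-list matrix representation are illustrative choices, not taken from any of the listed tutorials):

```python
import math

def so3_exp(w):
    """Map an axis-angle vector w (unit axis * angle in radians) to a 3x3
    rotation matrix via Rodrigues' formula: R = I + sin(t) K + (1 - cos(t)) K^2."""
    theta = math.sqrt(sum(c * c for c in w))
    if theta < 1e-12:                       # near zero rotation: R ~ identity
        return [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    kx, ky, kz = (c / theta for c in w)     # unit rotation axis
    K = [[0.0, -kz, ky], [kz, 0.0, -kx], [-ky, kx, 0.0]]  # skew-symmetric matrix
    s, c1 = math.sin(theta), 1.0 - math.cos(theta)
    R = [[0.0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(3):
            KK_ij = sum(K[i][m] * K[m][j] for m in range(3))
            R[i][j] = (1.0 if i == j else 0.0) + s * K[i][j] + c1 * KK_ij
    return R
```

For example, `so3_exp([0, 0, math.pi / 2])` yields the 90-degree rotation about the z-axis, which maps the x-axis onto the y-axis.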

Optimization Techniques

1. Gauss-Newton/Levenberg-Marquardt Optimization
2. How a Kalman filter works, in pictures
3. Kalman Filter (卡爾曼濾波)
4. [Translation] Understanding the Basis of the Kalman Filter Via a Simple and Intuitive Derivation
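The three Kalman-filter references above all describe the same predict/update cycle. A minimal scalar sketch, assuming a random-walk state observed directly and illustrative noise values:

```python
def kalman_1d(measurements, q=1e-3, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter: q is the process-noise variance,
    r the measurement-noise variance (both illustrative)."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: the random-walk state stays put, uncertainty grows by q.
        p = p + q
        # Update: blend prediction and measurement by the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates
```

With constant measurements the estimate converges toward the measured value, and the gain settles where process and measurement noise balance.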

Selected Blogs
1. The Future of Real-Time SLAM and Deep Learning vs SLAM
2. IMU Data Fusing: Complementary, Kalman, and Mahony Filter
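The complementary filter discussed in the IMU fusion post blends the integrated gyro rate (accurate short-term) with the accelerometer-derived angle (stable long-term). A one-axis sketch with an illustrative blend factor:

```python
def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """One-axis attitude estimate: weight the integrated gyro by alpha
    and the accelerometer angle by (1 - alpha). Values are illustrative."""
    angle = accel_angles[0]  # initialize from the accelerometer
    out = []
    for rate, acc_angle in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * acc_angle
        out.append(angle)
    return out
```

The high-pass/low-pass split is what makes this a cheap alternative to a full Kalman filter on resource-constrained IMU boards.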

Number of methods by compute configuration:

Cores used     Methods
1              64
1.5            1
2              21
4              10
5              1
8              4
>8             1
AGX            1
GPU            5
(unspecified)  1
Total          109
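A distribution like the one above can be recomputed from the per-method table with a simple counter. The rows below are a small illustrative subset, not the full 109-method list:

```python
from collections import Counter

# (method, compute) pairs; an illustrative subset of the benchmark entries
rows = [
    ("OFSVO", "1"), ("V-LOAM", "2"), ("DLOAM", "8"),
    ("cv4xv1-sc", "GPU"), ("ElbrusFast", "1.5"),
]
# Count how many methods use each core/accelerator configuration.
by_cores = Counter(cores for _, cores in rows)
total = sum(by_cores.values())
```

Running this over all rows reproduces the totals in the table (64 single-core methods, 5 GPU methods, and so on).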
PAPERS
https://github.com/liulinbo/slam
SLAM TYPES
http://wiki.ros.org/crsm_slam
http://wiki.ros.org/orb_slam2_ros
https://github.com/tum-vision/lsd_slam

http://wiki.ros.org/hector_slam
http://www.cvlibs.net/datasets/kitti/eval_odometry.php
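The KITTI odometry leaderboard that follows reports translation error as drift relative to distance traveled. A simplified sketch of that idea (the official evaluation averages errors over sub-sequences of 100 m to 800 m; this toy endpoint-only version and its function name are illustrative):

```python
import math

def translation_drift_pct(gt_xy, est_xy):
    """Endpoint drift as a percentage of ground-truth path length.

    Toy version of the benchmark's translation metric: the official tool
    instead averages errors over 100 m .. 800 m sub-sequences."""
    path = sum(math.dist(gt_xy[i], gt_xy[i + 1]) for i in range(len(gt_xy) - 1))
    end_error = math.dist(gt_xy[-1], est_xy[-1])
    return 100.0 * end_error / path
```

A trajectory that ends 0.5 m off after a 10 m drive thus scores 5 % translation drift; the leaderboard leaders stay below 1 %.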

KITTI Odometry Benchmark leaderboard. The per-method metrics (translation error in % of distance traveled, rotation error in deg/m, runtime, language, cores and clock) are consolidated in the table at the end of this section. Methods with published code on the benchmark page: LIMO2_GP, LIMO2, LIMO, IsaacElbrus, VINS-Fusion, ORB-SLAM2, S-PTAM, S-LSD-SLAM, RTAB-Map, ORBVB, A-LOAM, 2FO-CC, ProSLAM, StereoSFM, SSLAM, SSLAM-HR, Stereo DWO, GFM, VISO2-S, VISO2-M.

References for the ranked methods (methods sharing references are grouped):

V-LOAM:
J. Zhang and S. Singh: Visual-lidar Odometry and Mapping: Low drift, Robust, and Fast. IEEE International Conference on Robotics and Automation (ICRA) 2015.
J. Zhang and S. Singh: LOAM: Lidar Odometry and Mapping in Real-time. Robotics: Science and Systems Conference (RSS) 2014.

SOFT2:
I. Cvišić, J. Ćesić, I. Marković and I. Petrović: SOFT-SLAM: Computationally Efficient Stereo Visual SLAM for Autonomous UAVs. Journal of Field Robotics 2017.

IMLS-SLAM:
J. Deschaud: IMLS-SLAM: Scan-to-Model Matching Based on 3D Data. 2018 IEEE International Conference on Robotics and Automation (ICRA) 2018.

MC2SLAM:
F. Neuhaus, T. Koss, R. Kohnen and D. Paulus: MC2SLAM: Real-Time Inertial Lidar Odometry using Two-Scan Motion Compensation. German Conference on Pattern Recognition 2018.

LG-SLAM:
K. Lenac, J. Ćesić, I. Marković and I. Petrović: Exactly sparse delayed state filter on Lie groups for long-term pose graph SLAM. The International Journal of Robotics Research 2018.

RotRocc+ / RotRocc:
M. Buczko and V. Willert: Flow-Decoupled Normalized Reprojection Error for Visual Odometry. 19th IEEE Intelligent Transportation Systems Conference (ITSC) 2016.
M. Buczko, V. Willert, J. Schwehr and J. Adamy: Self-Validation for Automotive Visual Odometry. IEEE Intelligent Vehicles Symposium (IV) 2018.
M. Buczko: Automotive Visual Odometry. 2018.

LIMO2_GP / LIMO2:
J. Graeter, A. Wilczynski and M. Lauer: LIMO: Lidar-Monocular Visual Odometry. arXiv preprint arXiv:1807.07524 2018.

GDVO:
J. Zhu: Image Gradient-based Joint Direct Visual Odometry for Stereo Camera. International Joint Conference on Artificial Intelligence, IJCAI 2017.

CPFG-slam:
K. Ji and T. Huiyan Chen: CPFG-SLAM: a robust Simultaneous Localization and Mapping based on LIDAR in off-road environment. IEEE Intelligent Vehicles Symposium (IV) 2018.

SOFT:
I. Cvišić and I. Petrović: Stereo odometry based on careful feature selection and tracking. European Conference on Mobile Robots (ECMR) 2015.

DVSO:
N. Yang, R. Wang, J. Stueckler and D. Cremers: Deep Virtual Stereo Odometry: Leveraging Deep Depth Prediction for Monocular Direct Sparse Odometry. European Conference on Computer Vision (ECCV) 2018.

LIMO / LiViOdo:
J. Graeter, A. Wilczynski and M. Lauer: LIMO: Lidar-Monocular Visual Odometry. ArXiv e-prints 2018.

Stereo DSO:
R. Wang, M. Schwörer and D. Cremers: Stereo DSO: Large-scale direct sparse visual odometry with stereo cameras. International Conference on Computer Vision (ICCV), Venice, Italy 2017.

ROCC:
M. Buczko and V. Willert: How to Distinguish Inliers from Outliers in Visual Odometry for High-speed Automotive Applications. IEEE Intelligent Vehicles Symposium (IV) 2016.

SuMa++:
X. Chen, A. Milioto, E. Palazzolo, P. Giguère, J. Behley and C. Stachniss: SuMa++: Efficient LiDAR-based Semantic SLAM. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2019.

cv4xv1-sc:
M. Persson, T. Piccini, R. Mester and M. Felsberg: Robust Stereo Visual Odometry from Monocular Techniques. IEEE Intelligent Vehicles Symposium 2015.

VINS-Fusion:
T. Qin, J. Pan, S. Cao and S. Shen: A General Optimization-based Framework for Local Odometry Estimation with Multiple Sensors. 2019.

MonoROCC:
M. Buczko and V. Willert: Monocular Outlier Detection for Visual Odometry. IEEE Intelligent Vehicles Symposium (IV) 2017.

DEMO:
J. Zhang, M. Kaess and S. Singh: Real-time Depth Enhanced Monocular Odometry. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2014.

ORB-SLAM2:
R. Mur-Artal and J. Tardós: ORB-SLAM2: an Open-Source SLAM System for Monocular, Stereo and RGB-D Cameras. IEEE Transactions on Robotics 2017.

ElbrusFast:
D. Slepichev, S. Volodarskiy and E. Vendrovsky: Realtime Stereo Visual Odometry.

NOTF:
J. Deigmoeller and J. Eggert: Stereo Visual Odometry without Temporal Filtering. German Conference on Pattern Recognition (GCPR) 2016.

S-PTAM:
T. Pire, T. Fischer, G. Castro, P. De Cristóforis, J. Civera and J. Jacobo Berlles: S-PTAM: Stereo Parallel Tracking and Mapping. Robotics and Autonomous Systems (RAS) 2017.
T. Pire, T. Fischer, J. Civera, P. Cristóforis and J. Jacobo-Berlles: Stereo parallel tracking and mapping for robot localization. IROS 2015.

S-LSD-SLAM:
J. Engel, J. Stückler and D. Cremers: Large-Scale Direct SLAM with Stereo Cameras. Int. Conf. on Intelligent Robot Systems (IROS) 2015.

VoBa:
J. Tardif, M. George, M. Laverne, A. Kelly and A. Stentz: A new approach to vision-aided inertial navigation. 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, October 18-22, 2010, Taipei, Taiwan.

STEAM-L WNOJ:
T. Tang, D. Yoon and T. Barfoot: A White-Noise-On-Jerk Motion Prior for Continuous-Time Trajectory Estimation on SE(3). arXiv preprint arXiv:1809.06518 2018.

SLUP:
X. Qu, B. Soheilian and N. Paparoditis: Landmark based localization in urban environment. ISPRS Journal of Photogrammetry and Remote Sensing 2017.

STEAM-L:
T. Tang, D. Yoon, F. Pomerleau and T. Barfoot: Learning a Bias Correction for Lidar-only Motion Estimation. 15th Conference on Computer and Robot Vision (CRV) 2018.

FRVO:
W. Meiqing, L. Siew-Kei and S. Thambipillai: A Framework for Fast and Robust Visual Odometry. IEEE Transactions on Intelligent Transportation Systems 2017.

MFI:
H. Badino, A. Yamamoto and T. Kanade: Visual Odometry by Multi-frame Feature Integration. First International Workshop on Computer Vision for Autonomous Driving at ICCV 2013.

TLBBA:
W. Lu, Z. Xiang and J. Liu: High-performance visual odometry with two-stage local binocular BA and GPU. Intelligent Vehicles Symposium (IV), 2013 IEEE 2013.

2FO-CC:
I. Krešo and S. Šegvić: Improving the Egomotion Estimation by Correcting the Calibration Bias. VISAPP 2015.

SALO:
D. Kovalenko, M. Korobkin and A. Minin: Sensor Aware Lidar Odometry. 2019 European Conference on Mobile Robots (ECMR) 2019.

SuMa:
J. Behley and C. Stachniss: Efficient Surfel-Based SLAM using 3D Laser Range Data in Urban Environments. Robotics: Science and Systems (RSS) 2018.

ProSLAM:
D. Schlegel, M. Colosi and G. Grisetti: ProSLAM: Graph SLAM from a Programmer's Perspective. ArXiv e-prints 2017.

DQV-SLAM:
S. Bultmann, K. Li and U. Hanebeck: Stereo Visual SLAM Based on Unscented Dual Quaternion Filtering. Proceedings of the 22nd International Conference on Information Fusion (Fusion 2019) 2019.

StereoSFM:
H. Badino and T. Kanade: A Head-Wearable Short-Baseline Stereo System for the Simultaneous Estimation of Structure and Motion. IAPR Conference on Machine Vision Application 2011.

SSLAM / SSLAM-HR:
F. Bellavia, M. Fanfani, F. Pazzaglia and C. Colombo: Robust Selective Stereo SLAM without Loop Closure and Bundle Adjustment. ICIAP 2013.
F. Bellavia, M. Fanfani and C. Colombo: Selective visual odometry for accurate AUV localization. Autonomous Robots 2015.
M. Fanfani, F. Bellavia and C. Colombo: Accurate Keyframe Selection and Keypoint Tracking for Robust Visual Odometry. Machine Vision and Applications 2016.

eVO:
M. Sanfourche, V. Vittori and G. Besnerais: eVO: A realtime embedded stereo odometry for MAV applications. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2013.

Stereo DWO:
J. Huai, C. Toth and D. Grejner-Brzezinska: Stereo-inertial odometry using nonlinear optimization. Proceedings of the 27th International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS+).

BVO:
F. Pereira, J. Luft, G. Ilha, A. Sofiatti and A. Susin: Backward Motion for Estimation Enhancement in Sparse Visual Odometry. 2017 Workshop of Computer Vision (WVC) 2017.

D6DVO:
A. Comport, E. Malis and P. Rives: Accurate Quadrifocal Tracking for Robust 3D Visual Odometry. ICRA 2007.
M. Meilland, A. Comport and P. Rives: Dense visual mapping of large scale environments for real-time localisation. ICRA 2011.

PMO / PbT-M2:
N. Fanani, A. Stuerck, M. Ochs, H. Bradler and R. Mester: Predictive monocular odometry (PMO): What is possible without RANSAC and multiframe bundle adjustment?. Image and Vision Computing 2017.

GFM:
Y. Zhao and P. Vela: Good Feature Matching: Towards Accurate, Robust VO/VSLAM with Low Latency. Submitted to IEEE Transactions on Robotics 2019.

FTMVO:
H. Mirabdollah and B. Mertsching: Fast Techniques for Monocular Visual Odometry. Proceeding of 37th German Conference on Pattern Recognition (GCPR) 2015.

PbT-M1:
N. Fanani, M. Ochs, H. Bradler and R. Mester: Keypoint trajectory estimation using propagation based tracking. Intelligent Vehicles Symposium (IV) 2016.
N. Fanani, A. Stuerck, M. Barnada and R. Mester: Multimodal scale estimation for monocular visual odometry. Intelligent Vehicles Symposium (IV) 2017.

VISO2-S / VISO2-M:
A. Geiger, J. Ziegler and C. Stiller: StereoScan: Dense 3d Reconstruction in Real-time. IV 2011.

MLM-SFM:
S. Song and M. Chandraker: Robust Scale Estimation in Real-Time Monocular SFM for Autonomous Driving. CVPR 2014.
S. Song, M. Chandraker and C. Guest: Parallel, Real-time Monocular Visual Odometry. ICRA 2013.

GT_VO3pt:
C. Beall, B. Lawrence, V. Ila and F. Dellaert: 3D reconstruction of underwater structures. IROS 2010.

RMCPE+GP:
M. Mirabdollah and B. Mertsching: On the Second Order Statistics of Essential Matrix Elements. Proceeding of 36th German Conference on Pattern Recognition 2014.

VO3pt / VO3ptLBA:
P. Alcantarilla: Vision Based Localization: From Humanoid Robots to Visually Impaired People. 2011.
P. Alcantarilla, J. Yebes, J. Almazán and L. Bergasa: On Combining Visual SLAM and Dense Scene Flow to Increase the Robustness of Localization and Mapping in Dynamic Environments. ICRA 2012.

TGVO:
B. Kitt, A. Geiger and H. Lategahn: Visual Odometry based on Stereo Image Sequences with RANSAC-based Outlier Rejection Scheme. IV 2010.

PLSVO:
R. Gomez-Ojeda and J. Gonzalez-Jimenez: Robust Stereo Visual Odometry through a Probabilistic Combination of Points and Line Segments. Robotics and Automation (ICRA), 2016 IEEE International Conference on 2016.

BLF / BCC / BLO:
M. Velas, M. Spanel, M. Hradis and A. Herout: CNN for IMU Assisted Odometry Estimation using Velodyne LiDAR. ArXiv e-prints 2017.

CFORB:
D. Mankowitz and E. Rivlin: CFORB: Circular FREAK-ORB Visual Odometry. arXiv preprint arXiv:1506.05257 2015.

VOFS / VOFSLBA:
M. Kaess, K. Ni and F. Dellaert: Flow separation for fast and robust stereo odometry. ICRA 2009.
P. Alcantarilla, L. Bergasa and F. Dellaert: Visual Odometry priors for robust EKF-SLAM. ICRA 2010.

CUDA-EgoMotion:
A. Aguilar-González, M. Arias-Estrada, F. Berry and J. Osuna-Coutiño: The Fastest Visual Ego-motion Algorithm in the West. Microprocessors and Microsystems 2019.

EB3DTE+RJMCM:
Z. Boukhers, K. Shirahama and M. Grzegorzek: Example-based 3D Trajectory Extraction of Objects from 2D Videos. Circuits and Systems for Video Technology (TCSVT), IEEE Transactions on 2017.
Z. Boukhers, K. Shirahama and M. Grzegorzek: Less restrictive camera odometry estimation from monocular camera. Multimedia Tools and Applications 2017.

VISO2-M + GP:
A. Geiger, J. Ziegler and C. Stiller: StereoScan: Dense 3d Reconstruction in Real-time. IV 2011.
S. Song and M. Chandraker: Robust Scale Estimation in Real-Time Monocular SFM for Autonomous Driving. CVPR 2014.

OABA:
D. Frost, O. Kähler and D. Murray: Object-Aware Bundle Adjustment for Correcting Monocular Scale Drift. Proceedings of the International Conference on Robotics and Automation (ICRA) 2012.
Environment-Related Datasets

CMU Visual Localization Data Set: Dataset collected using the Navlab 11 equipped with IMU, GPS, Lidars and cameras.
NYU RGB-D Dataset: Indoor dataset captured with a Microsoft Kinect that provides semantic labels.
TUM RGB-D Dataset: Indoor dataset captured with Microsoft Kinect and high-accuracy motion capturing.
New College Dataset: 30 GB of data for 6 D.O.F. navigation and mapping (metric or topological) using vision and/or laser.
The Rawseeds Project: Indoor and outdoor datasets with GPS, odometry, stereo, omnicam and laser measurements for visual, laser-based, omnidirectional, sonar and multi-sensor SLAM evaluation.
Victoria Park Sequence: Widely used sequence for evaluating laser-based SLAM. Trees serve as landmarks, detection code is included.
Malaga Dataset 2009 and Malaga Dataset 2013: Datasets with GPS, cameras and 3D laser information, recorded in the city of Malaga, Spain.
Ford Campus Vision and Lidar Dataset: Dataset collected by a Ford F-250 pickup, equipped with IMU, Velodyne and Ladybug.

KITTI citation: when using this dataset in your research, we will be happy if you cite us:

@INPROCEEDINGS{Geiger2012CVPR,
  author = {Andreas Geiger and Philip Lenz and Raquel Urtasun},
  title = {Are we ready for Autonomous Driving? The KITTI Vision Benchmark Suite},
  booktitle = {Conference on Computer Vision and Pattern Recognition (CVPR)},
  year = {2012}
}
Consolidated results (one row per method):

#  Method  Translation  Rotation (deg/m)  Runtime (s)  Language  Cores  Clock (GHz)
1 OFSVO 0.00 % 0.0000 0.02 s C/C++ 1 3.5

2 V-LOAM 0.55 % 0.0013 0.1 s C/C++ 2 2.5

3 SOFT2 0.65 % 0.0014 0.1 s C/C++ 2 2.5

4 IMLS-SLAM 0.69 % 0.0018 1.25 s C/C++ 1 >3.5

5 MC2SLAM 0.69 % 0.0016 0.1 s C/C++ 4 2.5

6 ESO 0.80 % 0.0026 0.08 s C/C++ 4 3

7 sGAN-VO 0.81 % 0.0025 0.1 s C/C++ 1 2.5

8 LG-SLAM 0.82 % 0.0020 0.2 s C/C++ 4 2.5

9 RotRocc+ 0.83 % 0.0026 0.25 s C/C++ 2 2

10 LIMO2_GP 0.84 % 0.0022 0.2 s C/C++ 2 2.5


11 UFSF-VLO 0.84 % 0.0023 0.05 s C/C++ 4 3

12 GDVO 0.86 % 0.0031 0.09 s C/C++ 1 >3.5

13 LIMO2 0.86 % 0.0022 0.2 s C/C++ 2 2.5

14 ICP_LO 0.87 % 0.0036 0.05 s C/C++ 1 2.5

15 CPFG-slam 0.87 % 0.0025 0.03 s C/C++ 4 2.5

16 MLG-VSLAM+ 0.88 % 0.0026 0.8 s C/C++ 1 3.5

17 C/C++ 1 3.5

18 SOFT 0.88 % 0.0022 0.1 s C/C++ 2 2.5

19 RotRocc 0.88 % 0.0025 0.3 s C/C++ 2 2


20 U-DVSO 0.88 % 0.0021 0.1 s C/C++ 1 2.5

21 scan-to-map PNDT-D2D 0.89 % 0.0030 0.5 s C/C++ 4 3.5

22 DVSO 0.90 % 0.0021 0.1 s C/C++ GPU 2.5

23 MLG-VSLAM 0.93 % 0.0028 0.8 s C/C++ 1 3.5

24 C/C++ 1 3.5

25 LIMO 0.93 % 0.0026 0.2 s C/C++ 2 2.5

26 Stereo DSO 0.93 % 0.0020 0.1 s C/C++ 1 3.4

27 MLG-SLAM 0.96 % 0.0034 0.5 s C/C++ 1 3.5

28 ROCC 0.98 % 0.0028 0.3 s C/C++ 2 2

29 S4-SLAM 0.98 % 0.0044 0.2 s C/C++ 2 3


30 DLOAM 0.99 % 0.0037 2 s Python 8 3.5

31 LiOd 1.01 % 0.0025 1 s C/C++ >8 2.5

32 IsaacElbrus 1.02 % 0.0023 0.0095 s AGX Jetson Xavier (0.03 s on Jetson Nano)

33 DLO 1.02 % 0.0040 0.1 s C/C++ 1 2.8
34 S4OM 1.03 % 0.0053 0.15 s C/C++ 1 2.5

35 NDT_LO 1.05 % 0.0043 0.15 s C/C++ 1 2.5

36 SuMa++ 1.06 % 0.0034 0.1 s C/C++ 1 3.5

37 cv4xv1-sc 1.09 % 0.0029 0.145 s C/C++ GPU 3.5

38 VINS-Fusion 1.09 % 0.0033 0.1 s C/C++ 1 3

39 FPVO 1.10 % 0.0023 0.08 s C/C++ 4 2.3

40 MonoROCC 1.11 % 0.0028 1 s C/C++ 2 2

41 ISRI_VO2 1.12 % 0.0029 0.1 s C/C++ 1 2.5

42 ISRI_VO 1.13 % 0.0030 0.1 s C/C++ 1 2.5

43 DEMO 1.14 % 0.0049 0.1 s C/C++ 2 2.5

44 ORB-SLAM2 1.15 % 0.0027 0.06 s C/C++ 2 >3.5

45 ElbrusFast 1.15 % 0.0032 0.018 s C/C++ 1.5 3.3

46 NOTF 1.17 % 0.0035 0.45 s C/C++ 1 3

47 FSMVO 1.18 % 0.0022 0.1 s C/C++ 1 2.5

48 S-PTAM 1.19 % 0.0025 0.03 s C/C++ 4 3.0

49 S-LSD-SLAM 1.20 % 0.0033 0.07 s C/C++ 1 3.5

50 VoBa 1.22 % 0.0029 0.1 s C/C++ 1 2

51 STEAM-L WNOJ 1.22 % 0.0058 0.2 s C/C++ 1 2.5

52 LiViOdo 1.22 % 0.0042 0.5 s C/C++ 1 2.5


53 SLUP 1.25 % 0.0041 0.17 s C/C++ 4 3.3

54 STEAM-L 1.26 % 0.0061 0.2 s C/C++ 1 2.5

55 FRVO 1.26 % 0.0038 0.03 s C/C++ 1 3.5

56 RTAB-Map 1.26 % 0.0026 0.1 s C/C++ 1 2.5

57 ORBVB 1.26 % 0.0029 0.06 s C/C++ 2 >3.5

58 A-LOAM 1.26 % 0.0034 0.1 s C/C++ 1 2.5

59 MIGP 1.28 % 0.0022 0.2 s C/C++ 2 2.5

60 MFI 1.30 % 0.0030 0.1 s C/C++ 1 2.2

61 TLBBA 1.36 % 0.0038 0.1 s C/C++ 1 2.8

62 2FO-CC 1.37 % 0.0035 0.1 s C/C++ 1 3

63 SALO 1.37 % 0.0051 0.6 s C/C++ 1 2.5

64 SuMa 1.39 % 0.0034 0.1 s C/C++ 1 3.5

65 ProSLAM 1.39 % 0.0035 0.02 s C/C++ 1 3

66 JFBVO 1.43 % 0.0038 0.05 s C/C++ 1 3.4

67 DQV-SLAM 1.47 % 0.0051 0.2 s C/C++ 2 2.5

68 StereoSFM 1.51 % 0.0042 0.02 s C/C++ 2 2.5

69 Leso 1.57 % 0.0050 0.05 s C/C++ 1 2.5

70 SSLAM 1.57 % 0.0044 0.5 s C/C++ 8 3.5

71 ICP SLAM 1.61 % 0.0061 0.1 s C/C++ 1 >3.5

72 MLLAB 1.72 % 0.0054 0.02 s C/C++ 1 2.5

73 eVO 1.76 % 0.0036 0.05 s C/C++ 2 2


74 Stereo DWO 1.76 % 0.0026 0.1 s C/C++ 4 2.5

75 BVO 1.76 % 0.0036 0.1 s Python 1 2.5

76 D6DVO 2.04 % 0.0051 0.03 s C/C++ 1 2.5

77 PMO / PbT-M2 2.05 % 0.0051 1 s Python + C/C++ 1 2.5

78 GFM 2.12 % 0.0056 0.03 s C/C++ 2 1.5

79 SSLAM-HR 2.14 % 0.0059 0.5 s C/C++ 8 3.5

80 FTMVO 2.24 % 0.0049 0.11 s C/C++ 1 2.5

81 PbT-M1 2.38 % 0.0053 1 s C/C++ 1 2.5

82 VISO2-S 2.44 % 0.0114 0.05 s C/C++ 1 2.5

83 MLM-SFM 2.54 % 0.0057 0.03 s C/C++ 5 2.5

84 GT_VO3pt 2.54 % 0.0078 1.26 s C/C++ 1 2.5

85 PeNet 2.54 % 0.0089 0.1 s C/C++ GPU 2.5

86 RMCPE+GP 2.55 % 0.0086 0.39 s C/C++ 1 2.5

87 LKNMVO 2.66 % 0.0079 0.3 s C/C++ GPU 3

88 VO3pt 2.69 % 0.0068 0.56 s C/C++ 1 2

89 TGVO 2.94 % 0.0077 0.06 s C/C++ 1 2.5

90 OISEL 3.02 % 0.0132 0.2 s C/C++ 1 2.5

91 VO3ptLBA 3.13 % 0.0104 0.57 s C/C++ 1 2

92 PLSVO 3.26 % 0.0095 0.20 s C/C++ 2 2.5

93 NO_OISEL 3.45 % 0.0144 0.1 s C/C++ 1 2.5


94 BLF 3.49 % 0.0128 0.7 s C/C++ 1 2.5

95 CFORB 3.73 % 0.0107 0.9 s C/C++ 8 3

96 VOFS 3.94 % 0.0099 0.51 s C/C++ 1 2

97 UnDFVO 4.03 % 0.0096 1s Python 1 2.5

98 VOFSLBA 4.17 % 0.0112 0.52 s C/C++ 1 2

99 CUDA-EgoMotion
4.36 % 0.0052 .001 s Matlab GPU 2.5

100 BCC 4.59 % 0.0175 1s C/C++ 1 2.5

101 EB3DTE+RJMCM5.45 % 0.0274 1s Matlab 1 2.5

102 LVT 5.80 % 0.0065 0.02 s C/C++ 2 2.5

103 VISO2-M + GP 7.46 % 0.0245 0.15 s C/C++ 1 2.5

104 unscene 8.68 % 0.0227 0.1 s C/C++ GPU 2.5

105 BLO 9.21 % 0.0163 0.1 s C/C++ 1 2.5

106 PJ-test 11.79 % 0.0069 0.1 s C/C++ 1 2.5

107 VISO2-M 11.94 % 0.0234 0.1 s C/C++ 1 2.5

108 MTL 14.95 % 0.0115 0.1 s C/C++ 1 2.5

109 OABA 20.95 % 0.0135 0.5 s C/C++ 1 3.5

Python +
110 3DC-VO 21.00 % 0.0394 0.04 s 1 2.5
C/C++

111 DeepVO 24.55 % 0.0489 1s Python 1 2.5
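As a rough illustration of how such a leaderboard can be queried programmatically, here is a minimal sketch. The rows are a hand-copied subset of the table above; the field order and names are our own labeling for this example, not an official schema.

```python
# Minimal sketch: querying a few leaderboard rows from the table above.
rows = [
    # (method, translation_%, rotation_deg_per_m, runtime_s)
    ("SuMa++",      1.06, 0.0034, 0.10),
    ("VINS-Fusion", 1.09, 0.0033, 0.10),
    ("ORB-SLAM2",   1.15, 0.0027, 0.06),
    ("S-PTAM",      1.19, 0.0025, 0.03),
    ("ProSLAM",     1.39, 0.0035, 0.02),
    ("DeepVO",     24.55, 0.0489, 1.00),
]

# Methods fast enough for a 10 Hz camera (runtime <= 0.1 s),
# ranked by translation error.
realtime = sorted((r for r in rows if r[3] <= 0.1), key=lambda r: r[1])
for method, trans, rot, runtime in realtime:
    print(f"{method:12s} {trans:5.2f} %  {rot:.4f} deg/m  {runtime:.2f} s")
```

The same pattern extends to the full table, e.g. filtering by language or core count before ranking.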


Environment evaluation
CAS 1
Industrial environment: Oil & Gas - EMR
Scene: Indoor
Duration: Short
Reflectance: Low
Body emissivity: High
POI type: Visual [visible spectrum]; Chemical
Available lighting: Supplemental or Available
Site geometry: Spherical or quadrilateral
Volume amplitude: Variable
Obstacles: No obstacles
Movement dynamics: Low
Environment dynamics: Very low
Revisit frequency: To be defined
Game analogy (AI): Look, Smell, Mark and Seek

SLAM approaches

Approach:
- Localization-based methods
- Multi-sensor fusion
- Other approaches

Sensors
Proprioceptive sensors → Positioning type:
Servo encoders: Relative
IMU: Relative
Gyrometers: Relative
Doppler-effect sensor (velocity): Relative

Method and Sensors
CAS 1
Duration: Short
Available lighting: Supplemental or Available
Site geometry: Spherical or quadrilateral
Volume amplitude: Variable
Obstacles: No obstacles
Movement dynamics: Low
Environment dynamics: Very low
Reflectance: Low
Body emissivity: High
POI type: Visual [visible spectrum]; Chemical

POI sensors: HD camera; Gas sensor (industrial)

Perturbed physical quantity: Lighting (reflectance)
Sensors at risk of perturbation: Cameras
Countermeasure: Sonar, infrared, thermal, laser
Specific SLAM sensors: Laser, …
Odometry: IMU
Performance requirements: Accurate, robust
Dataset: No dataset
Related SLAM utility: Recognized visible point + gas topology
SLAM method:

SLAM methods
Lidar / Camera
1. ORB_SLAM2
Environment evaluation
CAS 2
Underground quarry
Scene: Indoor
Duration: Medium
Reflectance: Medium
Body emissivity: Medium
POI type: Visual [visible spectrum and extrema]; Volumetric (3D)
POI: Rockfalls*; Cracks**; [Clues*,**]
Available lighting: Supplemental and Unavailable
Site geometry: Connected cylinders (galleries, connected corridors)
Obstacles: Infrequent and under 25 cm
Movement dynamics: Medium
Environment dynamics: Medium
Revisit frequency: To be defined
Game analogy (AI): Look, Record, Mark, Compare and Seek (seven-differences game)

SLAM approaches
https://blog.cometlabs.io/teaching-robots-presence-what-you-need-to-know-about-slam-9bf0ca037553
Description

Position and orientation of the robot together with the mapping.

Major limitations due to the technical limits of each sensor taken separately. Sensor fusion has made it possible to go beyond the limitations of each individual sensor.

Using the relative position = consistently accurate positioning
Using the absolute position = corrects potential errors
Can more "non-usual" exteroceptive sensors be used to correct / improve the quality of the SLAM?

Sensors
Exteroceptive sensors
GPS
Temperature
Humidity
Pressure
Wind
Gas
Photoelectric sensors
Acoustic sensors
Low-frequency electromagnetic sensors
Visible-light sensors
Electromagnetic waves (oriented toward parameters external to the system)
Infrared and pyroelectric waves

Method and Sensors
CAS 2
Duration: Medium
Available lighting: Supplemental and Unavailable
Obstacles: Infrequent and under 25 cm
Movement dynamics: Medium
Environment dynamics: Medium
Site geometry: Medium
Volume amplitude: Medium
Reflectance: Medium (depending on rock type)
Body emissivity: Medium
POI type: Visual [visible spectrum and extrema]; Volumetric (3D)

POI sensors: HD camera; Multi-spectral camera (?); 3D camera or 3D LIDAR

Perturbed physical quantity: Ground geometry
Sensors at risk of perturbation: -
Specific SLAM sensors: Laser, visible
Odometry: IMU
Performance requirements: Very accurate, robust, +Ground
Dataset: No dataset
Related SLAM utility: Recognized visible point + volumetric difference ∆t(d,m,y)

SLAM methods by category


Dataset
CAS 3
Destroyed and unstable construction
Scene: Indoor
Duration: Long
Reflectance: High
Body emissivity: Low
POI type: Domestic gas (butane, propane); Thermal signature
Available lighting: No supplemental lighting and Unavailable
Site geometry: Broken quadrilateral
Volume amplitude: Low
Obstacles: Frequent and over 25 cm (difficult to access)
Movement dynamics: High
Environment dynamics: Medium-high
Revisit frequency: To be defined
Game analogy (AI): Look, Smell, Mark and Seek

Method: Position acquisition
- Relative position (odometry such as encoders and inertial navigation, IMU)
- Absolute position (beacons, wifi, rangefinder and vision systems)
Linear interpolation; likelihood function

Difficulties / limitations:
[Relative position] System independent of other information sources; drift estimation; accumulated error
[Absolute position]

Method:
- merging multiple sensor feeds at the lowest level before being processed homogeneously
- hierarchical approaches to fuse state estimates derived independently from multiple sensors
- Markov Localization framework (on all available information): estimating the probability density over the space of all locations. Relative + Absolute = Belief of location
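To make the Markov Localization idea concrete, here is a minimal, self-contained sketch; the 1D grid world, cell count and probabilities are our own invention for illustration, not taken from any of the cited systems. A motion update blurs the belief using the relative (proprioceptive) step, and a measurement update re-weights it with an absolute (exteroceptive) observation; the product is the "belief of location".

```python
# Minimal 1D Markov localization sketch (illustrative grid world).
# Belief = probability of the robot being in each of N cells.
N = 10
belief = [1.0 / N] * N          # uniform prior: location unknown

def motion_update(belief, p_exact=0.8):
    """Shift the belief one cell right (relative/odometry step), with noise."""
    new = [0.0] * len(belief)
    for i, p in enumerate(belief):
        new[(i + 1) % len(belief)] += p * p_exact   # intended move succeeded
        new[i] += p * (1.0 - p_exact)               # wheel slip: stayed put
    return new

def measurement_update(belief, world, z, p_hit=0.9):
    """Re-weight the belief by an absolute observation z (e.g. 'door seen')."""
    new = [p * (p_hit if world[i] == z else 1.0 - p_hit)
           for i, p in enumerate(belief)]
    s = sum(new)
    return [p / s for p in new]  # normalize back to a probability distribution

world = ["door", "wall", "wall", "door", "wall",
         "wall", "wall", "wall", "wall", "wall"]

belief = measurement_update(belief, world, "door")  # absolute fix
belief = motion_update(belief)                      # relative step
belief = measurement_update(belief, world, "wall")  # another absolute fix
print(max(range(N), key=lambda i: belief[i]))       # most likely cell
```

After "door, move, wall" the belief peaks just past each door, which is exactly the relative + absolute fusion described above.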
r la qualité du SLAM

Exteroceptive sensors → Positioning type:
GPS: Relative and Absolute
Temperature: Absolute
Humidity: Absolute
Pressure: Absolute
Wind: Absolute
Gas: Absolute
Photoelectric sensors: Absolute
Acoustic sensors: Absolute
Low-frequency electromagnetic sensors: Absolute
Visible-light sensors: Absolute
Electromagnetic waves: Absolute
Infrared and pyroelectric waves: Absolute

CAS 3
Duration: Long
Available lighting: No supplemental lighting and Unavailable
Obstacles: Frequent and over 25 cm (difficult to access)
Movement dynamics: High
Environment dynamics: Medium-high
Reflectance: High
Body emissivity: Low
POI type: Domestic gas (butane, propane); Thermal signature

POI sensors: Gas sensor (butane, propane); Thermal camera

Perturbed physical quantity: Environment geometry, volume, atmosphere
Sensors at risk of perturbation: -
Specific SLAM sensors: Laser, camera (M, S, R, 3D)
Performance requirements: Accurate, robust
Dataset: Domestic dataset (without alteration)
Related SLAM utility: Recognized thermal point + gas topology

Stereo Visual SLAM

Others
1. ORB_SLAM2
2. ORBSLAM2_with_pointcloud_map
3. PL-SLAM: a Stereo SLAM System through the Combination of Points and Line Segments
4. StVO-PL: Stereo Visual Odometry by combining point and line segment features
5. stereo-dso: Direct Sparse Odometry with Stereo Cameras
6. S-PTAM: Stereo Parallel Tracking and Mapping
7. Robust Stereo Visual Odometry
EKF SLAM
FastSLAM 1.0
FastSLAM 2.0
L-SLAM[1] (Matlab code)

GraphSLAM
Occupancy Grid SLAM[2]
DP-SLAM
Parallel Tracking and Mapping (PTAM)[3]
LSD-SLAM[4] (available as open-source)
S-PTAM[5] (available as open-source)
ORB-SLAM[6] (available as open-source)
ORB-SLAM2 (available as open-source)
MonoSLAM
CoSLAM[7]
SeqSLAM[8]
iSAM (Incremental Smoothing and Mapping)[9]
CT-SLAM (Continuous Time)[10]

State equation (u = proprioceptive input, y = exteroceptive measurement):
ẋ = f(x, u)
y = g(x)
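The state equation can be exercised with a toy discrete-time example. The unicycle model, beacon, and Euler integration below are our own illustration (not from any cited system): u is the proprioceptive odometry input driving ẋ = f(x, u), and y = g(x) is an exteroceptive range measurement.

```python
import math

# Toy state-space model illustrating x' = f(x, u), y = g(x).
# State x = (px, py, heading); u = (linear speed, turn rate) from odometry.

def f(x, u, dt=0.1):
    """Euler-integrated unicycle motion model (proprioceptive update)."""
    px, py, th = x
    v, w = u
    return (px + v * math.cos(th) * dt,
            py + v * math.sin(th) * dt,
            th + w * dt)

def g(x, beacon=(0.0, 0.0)):
    """Exteroceptive measurement: range to a known beacon."""
    px, py, _ = x
    return math.hypot(px - beacon[0], py - beacon[1])

x = (1.0, 0.0, 0.0)            # start 1 m from the beacon, facing +x
for _ in range(10):            # drive straight for 1 s at 1 m/s
    x = f(x, (1.0, 0.0))
print(round(g(x), 2))          # → 2.0 (range to the beacon after the motion)
```

A filter such as the EKF in EKF SLAM combines exactly these two maps: f propagates the state, g predicts the measurement to be corrected against.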

Interval analysis for loop-closure detection in the trajectory of a mobile robot

[5] Taihú Pire, Thomas Fischer, Gastón Castro, Pablo De Cristóforis, Javier Civera and Julio Jacobo Berlles (2017). "S-PTAM: Stereo Parallel Tracking and Mapping". Robotics and Autonomous Systems. doi:10.1016/j.robot.2017.03.019. ISSN 0921-8890.
[6] R. Mur-Artal, J. M. M. Montiel and J. D. Tardós (2015). "ORB-SLAM: A Versatile and Accurate Monocular SLAM System". IEEE Transactions on Robotics. 31 (5): 1147–1163. arXiv:1502.00956. doi:10.1109/TRO.2015.2463671. ISSN 1552-3098.
[7] D. Zou and P. Tan (2013). "CoSLAM: Collaborative Visual SLAM in Dynamic Environments". IEEE Transactions on Pattern Analysis and Machine Intelligence. IEEE.
[8] Michael J. Milford and Gordon F. Wyeth. "SeqSLAM: Visual Route-Based Navigation for Sunny Summer Days and Stormy Winter Nights". Proc. of Intl. Conf. on Robotics and Automation.
[9] "iSAM: Incremental Smoothing and Mapping". people.csail.mit.edu. Retrieved 2018-02-14.
[10] M. Bosse and R. Zlot (2009). "Continuous 3D scan-matching with a spinning 2d laser". 2009 IEEE International Conference on Robotics and Automation: 4312–4319. doi:10.1109/ROBOT.2009.5152851. ISSN 1050-4729.
It is assumed that, within the framework of regular inspections, verification takes place every X for case 1, every X for case 2 and every X for case 3, with the possibility of intervening without notice.

Short: under 25 min
Medium: between 25 min and 40 min
Long: over 40 min
Case → Scene → Duration:
CAS 1 (2 cases: "Coke" case, "Alu" case): Indoor, Short. Industrial environment: Oil & Gas - EMR
CAS 2: Indoor, Medium. Underground quarry
CAS 3: Indoor, Long. Destroyed and unstable construction

[Clues*,**]: Searching for traces of events (rockfalls, cracks)

Seismograph

Earthquakes are detected using a seismometer.

A seismometer, also known as a seismograph, is an instrument that records movements of the ground. It is used to detect seismic waves generated by earthquakes and nuclear explosions. A seismoscope can also be used for the detection of underground movements.
https://shop.raspberryshake.org/
http://www.jlhmesure.fr/details-emissivite+et+mesure+de+temperature+par+infra+rouge+l+etalo

Absorbed energy A
Reflected energy R
Transmitted energy T

Reflectance → Body emissivity:
Low → High (Reflectance = ?, Emissivity = 1 ?)
Medium → Medium (Reflectance = ?, Emissivity = 0.5 ?)
High → Low (Reflectance = ?, Emissivity = 0.3 ?)

Coke is a "black body": it has no reflection but a maximum emissivity equal to 1 (photo 1). Rolled aluminium, by contrast, has a high reflectivity and therefore a low emissivity (photo 2).

Coke has a low reflectivity and therefore an emissivity close to 1.
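Under the usual energy balance, A + R + T = 1, and for an opaque body (T = 0) Kirchhoff's law at thermal equilibrium gives emissivity ≈ absorptance, so ε ≈ 1 − R. The sketch below only illustrates that arithmetic; the reflectance values are placeholders, since the table above leaves the actual reflectances as "?".

```python
def emissivity_opaque(reflectance):
    """Kirchhoff's law for an opaque body (transmittance T = 0):
    absorbed fraction A = 1 - R, and emissivity = A at thermal equilibrium."""
    if not 0.0 <= reflectance <= 1.0:
        raise ValueError("reflectance must lie in [0, 1]")
    return 1.0 - reflectance

# Placeholder reflectances (the document leaves the real values as '?'):
print(emissivity_opaque(0.0))   # coke-like "black body" -> 1.0
print(emissivity_opaque(0.7))   # shiny rolled-aluminium-like surface -> 0.3
```

This is why the low-reflectance coke reads well on a thermal camera while rolled aluminium does not.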


Characterizing SLAM Benchmarks and Methods for the Robust Perception Age. Wenkai Ye, Yipu Zhao, and Patricio A. Vela.

Characteristic images of selected typical sequences
Time to be determined:
([robot speed + acquisition speed + processing speed - latency - noise] / acquisition angle) × spatial translation time + time to reach all POIs

POI type → POI → Available lighting:
Visual [visible spectrum]; Chemical → Supplemental or Available
Visual [visible spectrum and extrema]; Volumetric (3D) → Rockfalls*, Cracks** [Clues*,**] → Supplemental and Unavailable
Domestic gas (butane, propane); Thermal signature → No supplemental lighting and Unavailable


Fig. 2. Trained Decision Tree: factors influencing difficulty level




We performed a literature search for benchmark datasets associated to SLAM algorithms, and any other visual sequence data with ground truth pose information permitting quantitative evaluation of camera pose versus time. Published benchmarks for which the data is no longer available were excluded, such as Rawseeds [18]. In the end, the following corpus of benchmark datasets was identified: NewCollege [1], Alderley [19], Karlsruhe [20], Ford Campus [21], Malaga 2009 [22], CMU-VL [23], TUM RGBD [16], KITTI [2], Malaga Urban [24], ICL NUIM [9], UMich NCLT [25], EuRoC [8], Nordland [26], TUM Mono [27], PennCOSYVIO [28], Zurich Urban MAV [29], RobotCar [10], TUM VI [7], BlackBird [30], and a Hololens benchmark of our own. Altogether, they reflect over 310 sequences with available ground truth signals.
Geometric: surface area =
Geometric combined: surface area =
Geometric broken (RDM): surface area =

Site geometry → Volume amplitude:
Spherical or quadrilateral → Variable
Connected cylinders (galleries, connected corridors) →
Broken quadrilateral → Low
ng difficulty level
https://github.com/ivalab/Benchmarking_SLAM
(Matrix 1-5-9)

Obstacle frequency × obstacle difficulty → hexapod solicitation
Angle × speed; speed
IMU [tx, ty, tz, rx, ry, rz]
5x1; 1x9; 5x9

Obstacles → Movement dynamics:
No obstacles → Low
Infrequent and under 25 cm → Medium
Frequent and over 25 cm (difficult to access) → High

Possible deformation rate

Stays in place if no activity

Periodicity case 1
Periodicity case 2
Periodicity case 3

Environment dynamics → Revisit frequency → Game analogy (AI):
Very low → to be defined → Look, Smell, Mark and Seek
Medium → to be defined → Look, Record, Mark and Seek (seven-differences game)
Medium-high → to be defined → Look, Smell, Mark, Seek



Entries 5-35 of the same ranking:

Rank  Method                 Translation  Rotation [deg/m]  Runtime   Language  Cores  Freq. (GHz)
5     SOFT2                  0.65 %       0.0014            0.1 s     C/C++     2      2.5
6     IMLS-SLAM              0.69 %       0.0018            1.25 s    C/C++     1      >3.5
7     MC2SLAM                0.69 %       0.0016            0.1 s     C/C++     4      2.5
8     ESO                    0.80 %       0.0026            0.08 s    C/C++     4      3.0
9     sGAN-VO                0.81 %       0.0025            0.1 s     C/C++     1      2.5
10    LG-SLAM                0.82 %       0.0020            0.2 s     C/C++     4      2.5
11    RotRocc+               0.83 %       0.0026            0.25 s    C/C++     2      2.0
12    LIMO2_GP (code)        0.84 %       0.0022            0.2 s     C/C++     2      2.5
13    UFSF-VLO               0.84 %       0.0023            0.05 s    C/C++     4      3.0
14    GDVO                   0.86 %       0.0031            0.09 s    C/C++     1      >3.5
15    LIMO2 (code)           0.86 %       0.0022            0.2 s     C/C++     2      2.5
16    ICP_LO                 0.87 %       0.0036            0.05 s    C/C++     1      2.5
17    CPFG-slam              0.87 %       0.0025            0.03 s    C/C++     4      2.5
18    MLG-VSLAM+             0.88 %       0.0026            0.8 s     C/C++     1      3.5
19    SOFT                   0.88 %       0.0022            0.1 s     C/C++     2      2.5
20    RotRocc                0.88 %       0.0025            0.3 s     C/C++     2      2.0
21    U-DVSO                 0.88 %       0.0021            0.1 s     C/C++     1      2.5
22    scan-to-map PNDT-D2D   0.89 %       0.0030            0.5 s     C/C++     4      >3.5
23    DVSO                   0.90 %       0.0021            0.1 s     C/C++     GPU    2.5
24    MLG-VSLAM              0.93 %       0.0028            0.8 s     C/C++     1      3.5
25    LIMO (code)            0.93 %       0.0026            0.2 s     C/C++     2      2.5
26    Stereo DSO             0.93 %       0.0020            0.1 s     C/C++     1      3.4
27    MLG-SLAM               0.96 %       0.0034            0.5 s     C/C++     1      3.5
28    ROCC                   0.98 %       0.0028            0.3 s     C/C++     2      2.0
29    S4-SLAM                0.98 %       0.0044            0.2 s     C/C++     2      3.0
30    DLOAM                  0.99 %       0.0037            2 s       Python    8      3.5
31    LiOd                   1.01 %       0.0025            1 s       C/C++     >8     2.5
32    IsaacElbrus (code)     1.02 %       0.0023            0.0095 s  AGX Jetson Xavier (0.03 s on Jetson Nano)
33    DLO                    1.02 %       0.0040            0.1 s     C/C++     1      2.8
34    S4OM                   1.03 %       0.0053            0.15 s    C/C++     1      2.5
35    NDT_LO                 1.05 %       0.0043            0.15 s    C/C++     1      2.5
