This repository includes the code of the experiments introduced in the paper:
Álvarez-Tuñón, O., Brodskiy, Y., & Kayacan, E. (2023). Monocular visual simultaneous localization and mapping: (R)evolution from geometry to deep learning-based pipelines. IEEE Transactions on Artificial Intelligence.
Clone this repository with its submodules:

```shell
git clone --recurse-submodules -j8 git@github.com:olayasturias/monocular_visual_slam_survey.git
```

If you forgot to clone the submodules, you can fetch them afterwards with:

```shell
git submodule update --init --recursive
```
This repository contains a folder called `algorithms`, with submodules pointing to the algorithms implemented in the experimental comparison for the survey. The scripts under the `scripts` folder can be easily modified to run each algorithm independently or all the algorithms at once. They automatically generate the necessary configs and store the results in the respective folders, as shown below:
```
monocular_visual_slam
├── algorithms
│   └── <algorithm>            # as a github submodule
├── configs
│   └── <algorithm>
│       ├── <dataset_i>
│       │   └── <track_name>.yml
│       └── <dataset_i+1>
│           └── <track_name>.yml
├── results
│   └── <algorithm>
│       ├── <dataset_i>
│       │   └── <track_name>
│       │       └── result.txt
│       └── <dataset_i+1>
│           └── <track_name>
│               └── result.txt
├── media
│   └── some_cool_slam_pic.png
├── scripts
│   ├── run_<algorithm>.sh
│   └── run_all.sh
└── README.md
```
For the Python-based libraries, we create corresponding virtual environments to ensure the correct dependency versions for each. Follow the naming given here for the environments, or at least be aware that the scripts refer to these names. If you use different names, you'll need to modify the script `scripts/run_all.sh`.
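As a quick orientation, here is a hypothetical sketch of the algorithm-to-environment mapping the scripts rely on. The env names `dfvo` and `trianflow` are assumptions for illustration, not verified against `scripts/run_all.sh`; check that script for the real ones.

```shell
# Hypothetical mapping from algorithm to conda environment name.
# The names below are assumptions -- check scripts/run_all.sh for the real ones.
env_for() {
    case "$1" in
        DF-VO)     echo "dfvo" ;;
        TrianFlow) echo "trianflow" ;;
        *)         echo "unknown" ;;
    esac
}

env_for DF-VO   # name of the env the scripts would activate for DF-VO
```

If you renamed an environment, updating one branch of a mapping like this is the kind of one-line change `run_all.sh` would need.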
Create a conda environment for DF-VO as follows:

```shell
cd algorithms/DF-VO/envs
conda env create -f requirement.yml
```

Note that this could take a while :tea:
For DF-VO, the models are downloaded here and saved in the directory `algorithms/DF-VO/model_zoo`.
Create a conda environment for TrianFlow as follows:

```shell
cd algorithms/TrianFlow
conda env create -f requirement.yml
```
For TrianFlow, the models are downloaded here and saved in the directory `algorithms/TrianFlow/models`.
Install the requirements indicated in ORB_SLAM3's README. Once you've done that, build ORB_SLAM3 as:

```shell
cd algorithms/ORB_SLAM3
./build.sh
```
Note: for Pangolin and Eigen, I recommend following the instructions in this post. Tested with Eigen 3.4 and Pangolin v0.6.
Install the requirements indicated in DSO's README. Once you've done that, build DSO as:

```shell
cd algorithms/dso
mkdir build
cd build
cmake ..
make -j$(nproc)  # or pass however many cores you want to use
```
Note: for Pangolin and Eigen, I recommend following the instructions in this post. Tested with Eigen 3.4 and Pangolin v0.6.
We provide scripts for running each of the algorithms independently, as well as all of them at once. For the Python-based methods, the scripts use the conda environments named as in the Installation section. If you use different names, you'll need to modify the script `scripts/run_all.sh`.
First of all, DF-VO requires config files that handle the data-loading parameters for each dataset. Some of them are already provided by the original repository. We additionally provide scripts that automatically generate those configs for additional datasets such as MIMIR and TUM-RGBD:

```shell
cd scripts/
./generate_dfvo_configs.sh <DatasetName> <DatasetRoot> <Tracks> <ModelZooDir>
# For example:
./generate_dfvo_configs.sh TUM $HOME/Datasets rgbd_dataset_freiburg1_xyz $HOME/dev/monocular_visual_slam_survey/algorithms/DF-VO/model_zoo
```
This will automatically generate a config file called `<Tracks>.yml` under `DFVO/<DatasetName>/`, e.g. `DFVO/TUM/rgbd_dataset_freiburg1_xyz.yml`.
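If you want configs for several tracks, the generator can simply be called in a loop. A minimal sketch, following the argument order shown above; the second track name is just an example, and `echo` is left in so the commands are printed rather than executed:

```shell
# Print (or, without the echo, run) one config-generation command per track.
tracks="rgbd_dataset_freiburg1_xyz rgbd_dataset_freiburg2_xyz"
for track in $tracks; do
    echo ./generate_dfvo_configs.sh TUM "$HOME/Datasets" "$track" \
        "$HOME/dev/monocular_visual_slam_survey/algorithms/DF-VO/model_zoo"
done
```

Drop the `echo` once the printed commands look right.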
Then, you're ready to execute DF-VO on your favourite dataset as:

```shell
cd scripts/
./run_dfvo.sh <DatasetName> <DatasetRoot> <Tracks>
# For example:
./run_dfvo.sh TUM $HOME/Datasets rgbd_dataset_freiburg1_xyz
```
Running TrianFlow with any of the supported datasets is as simple as:

```shell
cd scripts/
./run_trianflow.sh <DatasetName> <DatasetRoot> <Tracks>
# For example:
./run_trianflow.sh TUM $HOME/Datasets rgbd_dataset_freiburg1_xyz
```
Running ORB-SLAM3 with any of the supported datasets is as simple as:

```shell
cd scripts/
./run_orb3_mono.sh <DatasetName> <DatasetRoot> <Tracks>
# For example:
./run_orb3_mono.sh TUM $HOME/Datasets rgbd_dataset_freiburg1_xyz
```
Running DSO with any of the supported datasets is as simple as:

```shell
cd scripts/
./dso.sh <DatasetName> <DatasetRoot> <Tracks>
# For example:
./dso.sh TUM $HOME/Datasets rgbd_dataset_freiburg1_xyz
```
Too busy to run those scripts and manually specify the input parameters? Well, you can run them ALL automatically by executing a single script:

```shell
cd scripts/
./run_all.sh
```
This script assumes that the folder containing all the supported datasets is `$HOME/Datasets`. If you have them in a different folder, you just need to modify the path that the variable `$datasetRoot` points to. You also need to specify the tracks available for each dataset in the `<DatasetName>_tracks` variables.
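For orientation, here is a hedged sketch of what those variables might look like inside `scripts/run_all.sh`. The variable names follow the convention described above; the track lists and the `run_dfvo.sh` call are illustrative examples, not the script's actual contents:

```shell
# Hypothetical excerpt mirroring the variables described above.
datasetRoot="$HOME/Datasets"               # edit if your datasets live elsewhere
TUM_tracks="rgbd_dataset_freiburg1_xyz"    # space-separated tracks per dataset
KITTI_tracks="00 01 02"

# The script can then iterate over every dataset/track pair, e.g.:
for track in $TUM_tracks; do
    echo "./run_dfvo.sh TUM $datasetRoot $track"
done
```

Editing `datasetRoot` and the `*_tracks` lists is all that should be needed to adapt the run to your local setup.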
The datasets that can be used to run these algorithms off-the-shelf are:
| Dataset | Publication | Sample |
|---|---|---|
| KITTI | Geiger, A., Lenz, P., Stiller, C., & Urtasun, R. (2013). Vision meets robotics: The KITTI dataset. The International Journal of Robotics Research, 32(11), 1231-1237. | |
| EuRoC | Burri, M., Nikolic, J., Gohl, P., Schneider, T., Rehder, J., Omari, S., ... & Siegwart, R. (2016). The EuRoC micro aerial vehicle datasets. The International Journal of Robotics Research, 35(10), 1157-1163. | |
| Aqualoc | Ferrera, M., Creuze, V., Moras, J., & Trouvé-Peloux, P. (2019). AQUALOC: An underwater dataset for visual–inertial–pressure localization. The International Journal of Robotics Research, 38(14), 1549-1559. | |
| TUM-RGBD | Sturm, J., Engelhard, N., Endres, F., Burgard, W., & Cremers, D. (2012). RGB-D SLAM dataset and benchmark. Computer Vision Group, Department of Informatics, Technical University of Munich. | |
| MIMIR | Álvarez-Tuñón, O., et al. (2023). MIMIR-UW: A multipurpose synthetic dataset for underwater navigation and inspection. 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). | |