Commit 6dc0eb0fcf by Ivan, 2022-04-05 11:42:28 +03:00 (5565 changed files with 1200500 additions and 0 deletions)

File: doc/BatchEvaluation.md (new file, 261 lines)
# Batch Evaluation of Square Root Optimization and Marginalization
In this tutorial we detail how you can use the batch evaluation
scripts to reproduce the results of the ICCV'21 paper Demmel et al.,
"Square Root Marginalization for Sliding-Window Bundle Adjustment".
In the paper we discuss how square root estimation techniques can be
used in Basalt's optimization-based sliding-window odometry to make
optimization faster and marginalization numerically more stable. See
the [project page](https://go.vision.in.tum.de/rootvo) for further
details.
Basalt's VIO/VO now runs with single-precision floating point numbers
by default, using the new square root formulation. The conventional
squared (Hessian-based) formulation is still available via config
options. For manual testing, you can pass `--use-double true` or
`--use-double false` (default) as a command line argument to
`basalt_vio`. In the config file, `config.vio_sqrt_marg` controls
whether the marginalization prior is stored in Hessian or Jacobian
form (default: `true`), and `config.vio_linearization_type` controls
whether Schur complement (`"ABS_SC"`) or nullspace projection with
QR decomposition (`"ABS_QR"`, default) is used for optimization and
marginalization.
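For reference, the relevant entries in a VIO config file would look something like the following sketch (the `value0` wrapper is an assumption about how Basalt's JSON configs are structured; check your actual config file):

```json
{
  "value0": {
    "config.vio_sqrt_marg": true,
    "config.vio_linearization_type": "ABS_QR"
  }
}
```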
In the following tutorial we systematically compare the different
formulations in single and double precision to reproduce the results
from the ICCV'21 paper. You can of course adjust the corresponding
config files to evaluate other aspects of the system.
## Prerequisites
1. **Source installation of Basalt:** The batch evaluation scripts
by default assume that the `build` folder is directly inside the
source checkout `basalt`. See
[README.md](../README.md#source-installation-for-ubuntu-1804-and-macos-1014-mojave)
for instructions.
2. **Downloads of the datasets:** We evaluate EuRoC (all 11
sequences), TUMVI (euroc format in 512x512 resolution; sequences:
corridor1-2, magistrale1-2, room1-2, slides1-2), and Kitti
Odometry (sequences 00-10). It's recommended to store the data
locally on an SSD to ensure that reading the images is not the
bottleneck during evaluation (on a multicore desktop Basalt runs
many times faster than real-time). There are instructions for
downloading these datasets: [EuRoC](VioMapping.md#euroc-dataset),
[TUMVI](VioMapping.md#tum-vi-dataset),
[KITTI](Vo.md#kitti-dataset). Calibration for EuRoC and TUMVI is
provided in the `data` folder. For KITTI you can use the
`basalt_convert_kitti_calib.py` script to convert the provided
calibration to a Basalt-compatible format (see
[KITTI](Vo.md#kitti-dataset)).
3. **Dependencies of evaluation scripts:** You need pip packages
`py_ubjson`, `matplotlib`, `numpy`, `munch`, `scipy`, `pylatex`,
`toml`. How to install depends on your Python setup (virtualenv,
conda, ...). To just install for the local user with pip you can
use the command `python3 -m pip install --user -U py_ubjson
matplotlib numpy munch scipy pylatex toml`. For generating result
tables and plots you additionally need latexmk and a LaTeX
distribution (Ubuntu: `sudo apt install texlive-latex-extra
latexmk`; macOS with Homebrew: `brew install --cask mactex`).
## Folder Structure
The batch evaluation scripts and config files assume a certain folder
structure inside a "parent" folder, since relative paths are used to
find the compiled executable and calibration files. So **it's
important to follow the folder structure**.
```
parent-folder/
├─ basalt/
│ ├─ build/
│ │ ├─ basalt_vio
│ │ ├─ ...
│ ├─ data/
│ │ ├─ euroc_ds_calib.json
│ │ ├─ ...
│ ├─ ...
├─ experiments/
│ ├─ iccv_tutorial/
│ │ ├─ basalt_batch_config.toml
│ │ ├─ experiments-iccv.toml
│ │ ├─ 01_iccv_all/
│ │ │ ├─ ...
│ │ ├─ 02_iccv_runtime/
│ │ │ ├─ ...
```
As a sibling of the `basalt` source checkout we'll have an
`experiments` folder, and inside, a folder `iccv_tutorial` for this
tutorial. Into that folder, we copy the provided
`basalt_batch_config.toml` file that defines the configurations we
want to evaluate and from which we generate individual config files
for each VIO / VO run. We also copy the provided
`experiments-iccv.toml` config file, which defines the results tables
and plots that we generate from the experiments' logs.
> *Note:* Commands in this tutorial are assumed to be executed from
> within `parent-folder` unless specified otherwise.
```bash
mkdir -p experiments/iccv_tutorial
cp basalt/data/iccv21/basalt_batch_config.toml experiments/iccv_tutorial/
cp basalt/data/iccv21/experiments-iccv.toml experiments/iccv_tutorial/
```
## Generate Experimental Configs
First, edit the copied configuration file
`experiments/iccv_tutorial/basalt_batch_config.toml` and modify all
`"dataset-path"` lines to point to the locations where you downloaded
the datasets to.
Now, we can generate per-experiment config files:
```bash
cd experiments/iccv_tutorial/
../../basalt/scripts/batch/generate-batch-configs.py .
```
This will create the subfolder `01_iccv_all` containing the folders
`vio_euroc`, `vio_tumvi`, and `vo_kitti`, which in turn contain all
generated `basalt_config_...json` files, one for each experiment we
will run.
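Conceptually, this generation step takes the cartesian product of named config fragments and merges each combination into one per-experiment config. A minimal Python sketch of that idea (the keys and fragment names here are illustrative, not the script's actual data format):

```python
import itertools

def expand(variants_per_axis):
    """Merge every combination of config fragments into one dict per
    experiment; later fragments override earlier ones."""
    for combo in itertools.product(*variants_per_axis):
        merged = {}
        for frag in combo:
            merged.update(frag)
        yield merged

# Illustrative axes: two linearization types times two precisions.
methods = [{"vio_linearization_type": "ABS_QR"},
           {"vio_linearization_type": "ABS_SC"}]
precisions = [{"use_double": False}, {"use_double": True}]
configs = list(expand([methods, precisions]))
print(len(configs))  # → 4
```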
## Run Experiments
We can now run all experiments for the generated configs. Each config
/ sequence combination will automatically be run twice and only the
second run is evaluated, which is meant to ensure that file system
caches are hot.
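The run-twice logic can be sketched as follows (a hypothetical helper, not the actual `run-all-in.sh` script): the first run warms the caches and is discarded, and only the second run is timed.

```python
import subprocess
import sys
import time

def run_twice(cmd):
    """Run cmd twice; discard the first run (cache warm-up) and return
    the wall-clock duration of the second run in seconds."""
    subprocess.run(cmd, check=True)   # first run: warm caches, discard
    t0 = time.perf_counter()
    subprocess.run(cmd, check=True)   # second run: the one we evaluate
    return time.perf_counter() - t0

elapsed = run_twice([sys.executable, "-c", "pass"])
print(f"second run took {elapsed:.3f} s")
```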
Since we also evaluate runtimes, we recommend that you don't use the
machine running the experiments for anything else and also ensure no
expensive background tasks are running during the evaluation. On one
of our desktops with 12 (virtual) cores the total evaluation of all
sequences takes around 3:30 h. Your mileage may of course vary depending
on the hardware.
```bash
cd experiments/iccv_tutorial/
time ../../basalt/scripts/batch/run-all-in.sh 01_iccv_all/
```
Inside `01_iccv_all`, a new folder with the start-time of the
experimental run is created, e.g., `20211006-143137`, and inside that
you can again see the same per-dataset subfolders `vio_euroc`,
`vio_tumvi`, and `vo_kitti`, inside of which there is a folder for
each config / run. Inside these per-run folders you can find log files
including the command line output, which you can inspect in case
something doesn't work.
In a second terminal, you can check the status of evaluation while it
is running (adjust the argument to match the actual folder name).
```bash
cd experiments/iccv_tutorial/
../../basalt/scripts/batch/list-jobs.sh 01_iccv_all/20211006-143137
```
If you see failed experiments for the square root solver in single
precision, don't worry, that is expected.
## Generate Results Tables and Plots
After all experimental runs have completed, you can generate a PDF
file with tabulated results and plots, similar to those in the ICCV'21
paper.
```bash
cd experiments/iccv_tutorial/
../../basalt/scripts/batch/generate-tables.py --config experiments-iccv.toml --open
```
The results are in the generated `tables/experiments-iccv.pdf` file
(and with the `--open` argument should automatically open with the
default PDF reader).
## Better Runtime Evaluation
The experiments above have the extended logging of eigenvalue and
nullspace information enabled, which does cost a little extra
runtime. To get a better runtime comparison, you can re-run the
experiments without this extended logging. The downside is that you
can only generate the results tables, but not the plots.
We assume that you have already followed the tutorial above, including
the initial folder setup. For these modified experiments, we redo all
three steps (generating config files; running experiments; generating
results) with slight modifications.
First, edit the `experiments/iccv_tutorial/basalt_batch_config.toml`
file at the bottom, and uncomment the commented entries in
`_batch.combinations` as well as the commented `revision`. At the same
time, comment out the initially uncommented lines. It should look
something like this after the modifications:
```toml
[_batch.combinations]
#vio_euroc = ["vio", "savetumgt", "extlog", "runtwice", "all_meth", "all_double", "all_euroc"]
#vio_tumvi = ["vio", "tumvivio", "savetumgt", "extlog", "runtwice", "all_meth", "all_double", "more_tumvi"]
#vo_kitti = ["vo", "kittivo", "savetumgt", "extlog", "runtwice", "all_meth", "all_double", "all_kitti"]
vio_euroc = ["vio", "runtwice", "all_meth", "all_double", "all_euroc"]
vio_tumvi = ["vio", "tumvivio", "runtwice", "all_meth", "all_double", "more_tumvi"]
vo_kitti = ["vo", "kittivo", "runtwice", "all_meth", "all_double", "all_kitti"]
```
```toml
[_batch]
#revision = "01_iccv_all"
revision = "02_iccv_runtime"
```
You can see that we removed the `savetumgt` and `extlog` named config
elements and that generated config files and results for this second
run of experiments will be placed in `02_iccv_runtime`.
Now generate config files and start the experimental runs:
```bash
cd experiments/iccv_tutorial/
../../basalt/scripts/batch/generate-batch-configs.py .
time ../../basalt/scripts/batch/run-all-in.sh 02_iccv_runtime/
```
Before generating the results PDF, you now need to edit the
`experiments-iccv.toml` file to point it to the new location of the
experimental logs and to disable the generation of plots. Check the place
towards the start of the file where substitutions for
`EXP_PATTERN_VIO` and `EXP_PATTERN_VO` are defined, as well as
`SHOW_TRAJECTORY_PLOTS`, `SHOW_EIGENVALUE_PLOTS`, and
`SHOW_NULLSPACE_PLOTS`. After your modifications, that section should
look something like:
```toml
###################
## where to find experimental runs
[[substitutions]]
#EXP_PATTERN_VIO = "01_iccv_all/*-*/vio_*/"
#EXP_PATTERN_VO = "01_iccv_all/*-*/vo_*/"
EXP_PATTERN_VIO = "02_iccv_runtime/*-*/vio_*/"
EXP_PATTERN_VO = "02_iccv_runtime/*-*/vo_*/"
###################
## which kind of plots to show
[[substitutions]]
SHOW_TRAJECTORY_PLOTS = false
SHOW_EIGENVALUE_PLOTS = false
SHOW_NULLSPACE_PLOTS = false
```
Now we can generate the results tables for the new experimental runs
with the same command as before:
```bash
cd experiments/iccv_tutorial/
../../basalt/scripts/batch/generate-tables.py --config experiments-iccv.toml --open
```

File: doc/Calibration.md (new file, 174 lines)
# Calibration
Here, we explain how to use the calibration tools with [TUM-VI](https://vision.in.tum.de/data/datasets/visual-inertial-dataset), [EuRoC](https://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets), [UZH-FPV](http://rpg.ifi.uzh.ch/uzh-fpv.html) and [Kalibr](https://github.com/ethz-asl/kalibr/wiki/downloads) datasets as an example.
## TUM-VI dataset
Download the datasets for camera and camera-IMU calibration:
```
mkdir ~/tumvi_calib_data
cd ~/tumvi_calib_data
wget http://vision.in.tum.de/tumvi/calibrated/512_16/dataset-calib-cam3_512_16.bag
wget http://vision.in.tum.de/tumvi/calibrated/512_16/dataset-calib-imu1_512_16.bag
```
### Camera calibration
Run the camera calibration:
```
basalt_calibrate --dataset-path ~/tumvi_calib_data/dataset-calib-cam3_512_16.bag --dataset-type bag --aprilgrid /usr/etc/basalt/aprilgrid_6x6.json --result-path ~/tumvi_calib_result/ --cam-types ds ds
```
The command line options have the following meaning:
* `--dataset-path` path to the dataset.
* `--dataset-type` type of the dataset. Currently only the `bag` and `euroc` dataset formats are supported.
* `--result-path` path to the folder where the resulting calibration and intermediate results will be stored.
* `--aprilgrid` path to the configuration file for the aprilgrid.
* `--cam-types` camera models for the image streams in the dataset. For more details see [arXiv:1807.08957](https://arxiv.org/abs/1807.08957).
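The `ds` camera type is the Double Sphere model from the referenced paper. As a sketch of what such a model computes, here is its projection function in Python (parameter names follow the paper; the numeric values in the example are made up):

```python
import math

def ds_project(x, y, z, fx, fy, cx, cy, xi, alpha):
    """Double Sphere projection (arXiv:1807.08957): 3D point -> pixel."""
    d1 = math.sqrt(x * x + y * y + z * z)
    d2 = math.sqrt(x * x + y * y + (xi * d1 + z) ** 2)
    denom = alpha * d2 + (1.0 - alpha) * (xi * d1 + z)
    return fx * x / denom + cx, fy * y / denom + cy

# A point on the optical axis projects to the principal point:
print(ds_project(0.0, 0.0, 1.0, 350.0, 350.0, 256.0, 256.0, -0.2, 0.6))
# → (256.0, 256.0)
```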
After that, you should see the calibration GUI:
![tumvi_cam_calib](/doc/img/tumvi_cam_calib.png)
The buttons in the GUI are arranged in the order you should follow to calibrate the camera. After pressing a button, the system will print the output to the command line:
* `load_dataset` loads the dataset.
* `detect_corners` starts corner detection in a background thread. Since it is the most time-consuming part of the calibration process, the detected corners are cached and loaded if you run the executable again pointing to the same result folder path.
* `init_cam_intr` computes an initial guess for camera intrinsics.
* `init_cam_poses` computes an initial guess for camera poses given the current intrinsics.
* `init_cam_extr` computes an initial transformation between the cameras.
* `init_opt` initializes optimization and shows the projected points given the current calibration and camera poses.
* `optimize` runs an iteration of the optimization and visualizes the result. You should press this button until the error printed in the console output stops decreasing and the optimization converges. Alternatively, you can use the `opt_until_converge` checkbox that will run the optimization until it converges automatically.
* `save_calib` saves the current calibration as `calibration.json` in the result folder.
* `compute_vign` **(Experimental)** computes a radially-symmetric vignetting for the cameras. For the algorithm to work, **the calibration pattern should be static (camera moving around it) and have a constant lighting throughout the calibration sequence**. If you run `compute_vign` you should press `save_calib` afterwards. The png images with vignetting will also be stored in the result folder.
You can also control the process using the following buttons:
* `show_frame` slider to switch between the frames in the sequence.
* `show_corners` toggles the visibility of the detected corners shown in red.
* `show_corners_rejected` toggles the visibility of rejected corners. Works only when `show_corners` is enabled.
* `show_init_reproj` shows the initial reprojections computed by the `init_cam_poses` step.
* `show_opt` shows reprojected corners with the current estimate of the intrinsics and poses.
* `show_vign` toggles the visibility of the points used for vignetting estimation. The points are distributed across white areas of the pattern.
* `show_ids` toggles the ID visualization for every point.
* `huber_thresh` controls the threshold for the Huber norm in pixels for the optimization.
* `opt_intr` controls if the optimization can change the intrinsics. For some datasets it might be helpful to disable this option for several first iterations of the optimization.
* `opt_until_converge` runs the optimization until convergence.
* `stop_thresh` defines the stopping criterion. Optimization will stop when the maximum increment is smaller than this value.
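The `stop_thresh` criterion can be illustrated with a generic Gauss-Newton loop (a simplified sketch, not Basalt's actual optimizer): iterate until the largest element of the increment drops below the threshold.

```python
import numpy as np

def optimize(residual_fn, jac_fn, x0, stop_thresh=1e-8, max_iters=100):
    """Generic Gauss-Newton loop that stops once the maximum element
    of the increment falls below stop_thresh."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        r = residual_fn(x)
        J = jac_fn(x)
        dx = np.linalg.solve(J.T @ J, -J.T @ r)  # normal equations step
        x = x + dx
        if np.max(np.abs(dx)) < stop_thresh:
            break
    return x

# Fit y = a*x to data; converges in one step since the problem is linear.
xs = np.array([1.0, 2.0, 3.0])
ys = np.array([2.0, 4.0, 6.0])
res = lambda p: p[0] * xs - ys
jac = lambda p: xs.reshape(-1, 1)
print(optimize(res, jac, [0.0]))  # → [2.]
```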
### Camera + IMU + Mocap calibration
After calibrating the camera you can run the camera + IMU + Mocap calibration. The result path should point to the **same folder as before**:
```
basalt_calibrate_imu --dataset-path ~/tumvi_calib_data/dataset-calib-imu1_512_16.bag --dataset-type bag --aprilgrid /usr/etc/basalt/aprilgrid_6x6.json --result-path ~/tumvi_calib_result/ --gyro-noise-std 0.000282 --accel-noise-std 0.016 --gyro-bias-std 0.0001 --accel-bias-std 0.001
```
The command line options for the IMU noise are continuous-time quantities defined as in [Kalibr](https://github.com/ethz-asl/kalibr/wiki/IMU-Noise-Model):
* `--gyro-noise-std` gyroscope white noise.
* `--accel-noise-std` accelerometer white noise.
* `--gyro-bias-std` gyroscope random walk.
* `--accel-bias-std` accelerometer random walk.
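These continuous-time quantities relate to discrete per-sample standard deviations through the IMU sampling interval; a small sketch of the standard Kalibr conversion (the 200 Hz rate below is just an example, not a value from this dataset):

```python
import math

def discrete_sigmas(noise_density, bias_random_walk, rate_hz):
    """Convert continuous-time IMU noise parameters (Kalibr convention)
    into discrete per-sample standard deviations at a given rate."""
    dt = 1.0 / rate_hz
    sigma_white = noise_density / math.sqrt(dt)    # white-noise sigma
    sigma_bias = bias_random_walk * math.sqrt(dt)  # bias random-walk sigma
    return sigma_white, sigma_bias

# Gyro values from the command above, assuming a 200 Hz IMU:
sigma_w, sigma_b = discrete_sigmas(0.000282, 0.0001, 200.0)
print(sigma_w, sigma_b)
```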
![tumvi_imu_calib](/doc/img/tumvi_imu_calib.png)
The buttons in the GUI are arranged in the order you need to follow to calibrate the camera-IMU setup:
* `load_dataset`, `detect_corners`, `init_cam_poses` same as above.
* `init_cam_imu` initializes the rotation between camera and IMU by aligning rotation velocities of the camera to the gyro data.
* `init_opt` initializes the optimization. Shows reprojected corners in magenta and the estimated values from the spline as solid lines below.
* `optimize` runs an iteration of the optimization. You should press it several times until convergence before proceeding to the next steps. Alternatively, you can use the `opt_until_converge` checkbox that will automatically run the optimization until it converges.
* `init_mocap` initializes the transformation from the Aprilgrid calibration pattern to the Mocap coordinate system.
* `save_calib` saves the current calibration as `calibration.json` in the result folder.
* `save_mocap_calib` saves the current Mocap-to-IMU calibration as `mocap_calibration.json` in the result folder.
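The rotation alignment performed by `init_cam_imu` can be sketched as a Wahba/Kabsch problem on matched angular-velocity samples, solved via SVD (a simplified illustration under the assumption of time-aligned samples, not the actual implementation):

```python
import numpy as np

def align_rotations(omega_cam, omega_imu):
    """Find R such that omega_imu[i] ≈ R @ omega_cam[i] (Kabsch/Wahba)."""
    H = omega_cam.T @ omega_imu              # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    # Enforce a proper rotation (det = +1), then compose R = V D U^T.
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    return Vt.T @ D @ U.T

# Synthetic check: rotate samples by 90 degrees about z, then recover R.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
omega_cam = np.random.default_rng(0).normal(size=(100, 3))
omega_imu = omega_cam @ Rz.T                 # rows: Rz @ omega_cam[i]
print(np.allclose(align_rotations(omega_cam, omega_imu), Rz))  # → True
```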
You can also control the visualization using the following buttons:
* `show_frame` - `show_ids` the same as above.
* `show_spline` toggles the visibility of enabled measurements (accel, gyro, position, velocity) generated from the spline that we optimize.
* `show_data` toggles the visibility of raw data contained in the dataset.
* `show_accel` shows accelerometer data.
* `show_gyro` shows gyroscope data.
* `show_pos` shows spline position for `show_spline` and positions generated from camera pose initialization transformed into IMU coordinate frame for `show_data`.
* `show_rot_error` shows the rotation error between spline and camera pose initializations transformed into IMU coordinate frame.
* `show_mocap` shows the mocap marker position transformed to the IMU frame.
* `show_mocap_rot_error` shows rotation between the spline and Mocap measurements.
* `show_mocap_rot_vel` shows the rotation velocity computed from the Mocap.
The following options control the optimization process:
* `opt_intr` enables optimization of intrinsics. Usually should be disabled for the camera-IMU calibration.
* `opt_poses` enables optimization based on the camera pose initializations. Sometimes this helps to better initialize the spline before running the optimization with `opt_corners`.
* `opt_corners` enables optimization based on reprojection corner positions **(should be used by default)**.
* `opt_cam_time_offset` computes the time offset between camera and the IMU. This option should be used only for refinement when the optimization already converged.
* `opt_imu_scale` enables IMU axis scaling, rotation and misalignment calibration. This option should be used only for refinement when the optimization already converged.
* `opt_mocap` enables Mocap optimization. You should run it only after pressing `init_mocap`.
* `huber_thresh` controls the threshold for the Huber norm in pixels for the optimization.
* `opt_until_convg` runs the optimization until convergence.
* `stop_thresh` defines the stopping criterion. Optimization will stop when the maximum increment is smaller than this value.
**NOTE:** In this case we use a pre-calibrated sequence, so most of the refinements or the Mocap-to-IMU calibration will not have any visible effect. If you want to test this functionality, use the "raw" sequences, for example `http://vision.in.tum.de/tumvi/raw/dataset-calib-cam3.bag` and `http://vision.in.tum.de/tumvi/raw/dataset-calib-imu1.bag`.
## EuRoC dataset
Download the datasets for camera and camera-IMU calibration:
```
mkdir ~/euroc_calib_data
cd ~/euroc_calib_data
wget http://robotics.ethz.ch/~asl-datasets/ijrr_euroc_mav_dataset/calibration_datasets/cam_april/cam_april.bag
wget http://robotics.ethz.ch/~asl-datasets/ijrr_euroc_mav_dataset/calibration_datasets/imu_april/imu_april.bag
```
### Camera calibration
Run the camera calibration:
```
basalt_calibrate --dataset-path ~/euroc_calib_data/cam_april.bag --dataset-type bag --aprilgrid /usr/etc/basalt/aprilgrid_6x6.json --result-path ~/euroc_calib_result/ --cam-types ds ds
```
![euroc_cam_calib](/doc/img/euroc_cam_calib.png)
### Camera + IMU calibration
After calibrating the camera you can run the camera + IMU calibration. The result-path should point to the same folder as before:
```
basalt_calibrate_imu --dataset-path ~/euroc_calib_data/imu_april.bag --dataset-type bag --aprilgrid /usr/etc/basalt/aprilgrid_6x6.json --result-path ~/euroc_calib_result/ --gyro-noise-std 0.000282 --accel-noise-std 0.016 --gyro-bias-std 0.0001 --accel-bias-std 0.001
```
![euroc_imu_calib](/doc/img/euroc_imu_calib.png)
## UZH dataset
Download the datasets for camera and camera-IMU calibration:
```
mkdir ~/uzh_calib_data
cd ~/uzh_calib_data
wget http://rpg.ifi.uzh.ch/datasets/uzh-fpv/calib/indoor_forward_calib_snapdragon_cam.bag
wget http://rpg.ifi.uzh.ch/datasets/uzh-fpv/calib/indoor_forward_calib_snapdragon_imu.bag
```
### Camera calibration
Run the camera calibration:
```
basalt_calibrate --dataset-path ~/uzh_calib_data/indoor_forward_calib_snapdragon_cam.bag --dataset-type bag --aprilgrid /usr/etc/basalt/aprilgrid_5x4_uzh.json --result-path ~/uzh_calib_result/ --cam-types ds ds
```
![uzh_cam_calib](/doc/img/uzh_cam_calib.png)
### Camera + IMU calibration
After calibrating the camera you can run the camera + IMU calibration. The result-path should point to the same folder as before:
```
basalt_calibrate_imu --dataset-path ~/uzh_calib_data/indoor_forward_calib_snapdragon_imu.bag --dataset-type bag --aprilgrid /usr/etc/basalt/aprilgrid_5x4_uzh.json --result-path ~/uzh_calib_result/ --gyro-noise-std 0.05 --accel-noise-std 0.1 --gyro-bias-std 4e-5 --accel-bias-std 0.002
```
![uzh_imu_calib](/doc/img/uzh_imu_calib.png)
## Kalibr dataset
Download the datasets for camera and camera-IMU calibration from [here (Sample datasets)](https://github.com/ethz-asl/kalibr/wiki/downloads):
```
mkdir ~/kalibr_calib_data
cd ~/kalibr_calib_data
# Download data
tar xvf static.tar.gz
tar xvf dynamic.tar.gz
```
### Camera calibration
Run the camera calibration:
```
basalt_calibrate --dataset-path ~/kalibr_calib_data/static/static.bag --dataset-type bag --aprilgrid /usr/etc/basalt/aprilgrid_6x6.json --result-path ~/kalibr_calib_result/ --cam-types ds ds ds ds
```
![kalibr_cam_calib](/doc/img/kalibr_cam_calib.png)
### Camera + IMU calibration
After calibrating the camera you can run the camera + IMU calibration. The result-path should point to the same folder as before:
```
basalt_calibrate_imu --dataset-path ~/kalibr_calib_data/dynamic/dynamic.bag --dataset-type bag --aprilgrid /usr/etc/basalt/aprilgrid_6x6.json --result-path ~/kalibr_calib_result/ --gyro-noise-std 0.005 --accel-noise-std 0.01 --gyro-bias-std 4.0e-06 --accel-bias-std 0.0002
```
![kalibr_imu_calib](/doc/img/kalibr_imu_calib.png)

File: doc/DevSetup.md (new file, 71 lines)
### Clang-format
We use clang-format to maintain a consistent formatting of the code. Since there are small differences between different versions of clang-format, we use version 11 on all platforms.
On **Ubuntu 20.04 or 18.04** run the following commands to install clang-format-11
```
wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key|sudo apt-key add -
sudo sh -c 'echo "deb http://apt.llvm.org/$(lsb_release -sc)/ llvm-toolchain-$(lsb_release -sc)-11 main" > /etc/apt/sources.list.d/llvm11.list'
sudo apt-get update
sudo apt-get install clang-format-11
```
On **MacOS** [Homebrew](https://brew.sh/) should install the right version of clang-format:
```
brew install clang-format
```
### Realsense Drivers (Optional)
If you want to use the code with Realsense T265 cameras you should install the realsense library.
On **Ubuntu 20.04 or 18.04** run the following commands:
```
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-key C8B3A55A6F3EFCDE
sudo sh -c 'echo "deb http://realsense-hw-public.s3.amazonaws.com/Debian/apt-repo bionic main" > /etc/apt/sources.list.d/realsense.list'
sudo apt-get update
sudo apt-get install librealsense2-dev librealsense2-gl-dev librealsense2-dkms librealsense2-utils
```
On **MacOS** run:
```
brew install librealsense
```
### Install and configure QtCreator
Download and install QtCreator. On **Ubuntu 20.04 or 18.04** run:
```
wget https://download.qt.io/official_releases/qtcreator/4.10/4.10.0/qt-creator-opensource-linux-x86_64-4.10.0.run
chmod +x qt-creator-opensource-linux-x86_64-4.10.0.run
./qt-creator-opensource-linux-x86_64-4.10.0.run
```
On **MacOS** run:
```
brew install --cask qt-creator
```
After installation, go to `Help` -> `About plugins...` in the menu and enable the Beautifier plugin (it formats the code automatically on save):
![qt_creator_plugins](/doc/img/qt_creator_plugins.png)
Go to `Tools` -> `Options` and select the Beautifier tab. There, select ClangFormat as the tool in the `General` tab.
![qt_creator_beautifier_general](/doc/img/qt_creator_beautifier_general.png)
Select `File` as the predefined style in the `Clang Format` tab. Also select `None` as the fallback style. For **Ubuntu 20.04 or 18.04** change the executable name to `/usr/bin/clang-format-11`.
![qt_creator_beautifier_clang_format](/doc/img/qt_creator_beautifier_clang_format.png)
### Build project
First, clone the project repository.
```
git clone --recursive https://gitlab.com/VladyslavUsenko/basalt.git
```
After that, open the `CMakeLists.txt` from the `basalt` folder in QtCreator and configure the project with the `Release with Debug Info` configuration. The build directory should point to `/<your_installation_path>/basalt/build`.
![qt_creator_configure_project](/doc/img/qt_creator_configure_project.png)
Finally, you should be able to build and run the project.

File: doc/Realsense.md (new file, 159 lines)
# Tutorial on Camera-IMU and Motion Capture Calibration with Realsense T265
![Realsense](/doc/img/realsense_setup.jpg)
In this tutorial we explain how to perform photometric and geometric calibration of the multi-camera setup and then calibrate the transformations between the cameras, the IMU and the motion capture marker setup. To make sure the calibration is successful, we recommend rigidly attaching the markers to the camera as shown in the figure above.
## Dataset
We provide a set of example datasets for the calibration. Even if you plan to record your own datasets (covered in the next section), we recommend downloading and trying the provided examples:
```
mkdir ~/t265_calib_data
cd ~/t265_calib_data
wget http://vision.in.tum.de/tumvi/t265/response_calib.zip
wget http://vision.in.tum.de/tumvi/t265/cam_calib.zip
wget http://vision.in.tum.de/tumvi/t265/imu_calib.zip
wget http://vision.in.tum.de/tumvi/t265/sequence0.zip
unzip response_calib
unzip cam_calib
unzip imu_calib
unzip sequence0
```
## Recording Own Dataset
You can record your own sequences using the `basalt_rs_t265_record` executable:
```
basalt_rs_t265_record --dataset-path ~/t265_calib_data/ --manual-exposure
```
* `--dataset-path` specifies the location where the recorded dataset will be stored. In this case it will be stored in `~/t265_calib_data/<current_timestamp>/`.
* `--manual-exposure` disables the autoexposure. In this tutorial the autoexposure is disabled for all calibration sequences, but for the VIO sequence (sequence0) we enable it.
![t265_record](/doc/img/t265_record.png)
The GUI elements have the following meaning:
* `webp_quality` compression quality. The highest value (101) means lossless compression. For photometric calibration it is important not to have any compression artifacts, so we record these calibration sequences with lossless compression.
* `skip_frames` reduces the framerate of the recorded dataset by skipping frames.
* `exposure` controls the exposure time of the cameras.
* `record` starts/stops the recording of the dataset. If you run the system on a slow PC, pay attention to the number of messages in the queue. If it goes beyond the limit, the recorder will start dropping frames.
* `export_calibration` exports factory calibration in the basalt format.
After recording the dataset it is good practice to verify its quality. You can do this by running:
```
basalt_verify_dataset.py -d ~/t265_calib_data/<dataset_path>/
```
It will report the actual frequencies of the recorded sensor messages and warn you if any files with image data are missing.
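The frequency check itself amounts to comparing timestamp spacing against the expected rate; a hypothetical sketch (not the actual `basalt_verify_dataset.py` logic):

```python
def message_frequency(timestamps_ns):
    """Estimate the message frequency in Hz from nanosecond timestamps."""
    if len(timestamps_ns) < 2:
        return 0.0
    span_s = (timestamps_ns[-1] - timestamps_ns[0]) * 1e-9
    return (len(timestamps_ns) - 1) / span_s

# 20 messages spaced 50 ms apart correspond to 20 Hz.
ts = [i * 50_000_000 for i in range(20)]
print(message_frequency(ts))
```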
Every sequence required for the calibration should have certain properties to enable successful calibration. Pay attention to the **Important for recording the dataset** subsections and inspect the provided examples before recording your own sequences.
## Response calibration
In this project we assume a camera has a linear response function (intensity of the pixel is linearly proportional to the amount of light captured by the pixel). In this section we will verify this for the Realsense T265 cameras. We will need to record a static scene with different exposure times.
**Important for recording the dataset:**
* Keep the camera static and make sure that nothing in the scene moves during the recording.
* Move `webp_quality` slider to the highest value to enable lossless compression.
* Optionally set the `skip_frames` slider to 3-5 to speed up the dataset recording.
* Start recording and slowly move the exposure slider up and down. Stop recording.
* Rename the dataset to `response_calib`.
Run the response function calibration:
```
basalt_response_calib.py -d ~/t265_calib_data/response_calib
```
You should see the response function and the irradiance image similar to the one shown below. For the details of the algorithm see Section 2.3.1 of [[arXiv:1607.02555]](https://arxiv.org/abs/1607.02555). The results suggest that the response function used in the camera is linear.
![t265_inv_resp_irradiance](/doc/img/t265_inv_resp_irradiance.png)
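As a simplified illustration of such a linearity check (synthetic data; the actual `basalt_response_calib.py` algorithm is described in the referenced paper), one can fit intensity against exposure at a fixed pixel of a static scene and inspect the residual:

```python
import numpy as np

def linearity_residual(exposures, intensities):
    """Fit intensity = a * exposure + b and return the maximum absolute
    residual; a small value supports a linear response function."""
    A = np.vstack([exposures, np.ones_like(exposures)]).T
    coeffs, *_ = np.linalg.lstsq(A, intensities, rcond=None)
    return float(np.max(np.abs(A @ coeffs - intensities)))

exposures = np.array([1.0, 2.0, 4.0, 8.0])     # exposure times (ms)
intensities = 30.0 * exposures + 5.0           # a perfectly linear pixel
print(linearity_residual(exposures, intensities))
```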
## Multi-Camera Geometric and Vignette Calibration
For the camera calibration we need to record a dataset with a static aprilgrid pattern.
**Important for recording the dataset:**
* Move `webp_quality` slider to the highest value to enable lossless compression (important for vignette calibration).
* Set the `skip_frames` slider to 5 to speed up the dataset recording.
* Move the camera slowly to reduce the motion blur.
* Cover the entire field of view of the camera with the calibration pattern. Try to observe the pattern from different angles.
* Make sure you do not cast shadows at the pattern (important for vignette calibration).
* Rename the dataset to `cam_calib`
Run the calibration executable:
```
basalt_calibrate --dataset-path ~/t265_calib_data/cam_calib --dataset-type euroc --result-path ~/t265_calib_results/ --aprilgrid /usr/etc/basalt/aprilgrid_6x6.json --cam-types kb4 kb4
```
To perform the calibration follow these steps:
* `load_dataset` load the dataset.
* `detect_corners` detects corners. If the corners were detected before, the cached detections will be loaded, so there is no need to re-run the detection.
* `init_cam_intr` initialize camera intrinsics.
* `init_cam_poses` initialize camera poses using the current intrinsics estimate.
* `init_cam_extr` initialize transformations between multiple cameras.
* `init_opt` initialize optimization.
* `opt_until_converge` optimize until convergence.
* `init_cam_poses` some initial poses computed from the initialized intrinsics can be far from optimum and not converge to the right minimum. To improve the final result we can re-initialize poses with optimized intrinsics.
* `init_opt` initialize optimization with new initial poses.
* `opt_until_converge` optimize until convergence.
* `compute_vign` after optimizing geometric models compute the vignetting of the cameras.
* `save_calib` save calibration file to the `~/t265_calib_results/calibration.json`.
![t265_cam_calib](/doc/img/t265_cam_calib.png)
## IMU and Motion Capture Calibration
After calibrating the cameras we can proceed to geometric and time calibration of the cameras, IMU and motion capture system. Setting up the motion capture system is specific to your setup.
For the motion capture recording we use [ros_vrpn_client](https://github.com/ethz-asl/ros_vrpn_client) with [basalt_capture_mocap.py](/scripts/basalt_capture_mocap.py). We record the data to the `mocap0` folder and then move it to the `mav0` directory of the camera dataset. This script is provided as an example. Motion capture setup is different in every particular case.
**Important for recording the dataset:**
* Set the `skip_frames` slider to 1 to use the full framerate.
* Reduce the exposure time to reduce the motion blur.
* Move the setup such that all axes of the accelerometer and gyroscope are excited. This means accelerating along the X, Y and Z axes and rotating around each of them.
* Do not forget to simultaneously record motion capture data.
* Rename the dataset to `imu_calib`.
Run the calibration executable:
```
basalt_calibrate_imu --dataset-path ~/t265_calib_data/imu_calib --dataset-type euroc --result-path ~/t265_calib_results/ --aprilgrid /usr/etc/basalt/aprilgrid_6x6.json --accel-noise-std 0.00818 --gyro-noise-std 0.00226 --accel-bias-std 0.01 --gyro-bias-std 0.0007
```
To perform the calibration follow these steps:
* `load_dataset` load the dataset.
* `detect_corners` detect corners. If corners were detected previously, the cached detections are loaded during the previous step, so there is no need to re-run the detection.
* `init_cam_poses` initialize camera poses.
* `init_cam_imu` initialize the transformation between cameras and the IMU.
* `init_opt` initialize optimization.
* `opt_until_converge` optimize until convergence.
* Enable `opt_cam_time_offset` and `opt_imu_scale` to optimize the time offset between cameras and the IMU and the IMU scale.
* `opt_until_converge` optimize until convergence.
* `init_mocap` initialize transformation between the motion capture marker frame and the IMU and the transformation between the aprilgrid pattern and the motion capture system origin.
* Enable `opt_mocap`.
* `opt_until_converge` optimize until convergence.
* `save_calib` save calibration file to the `~/t265_calib_results/calibration.json`.
* `save_mocap_calib` save motion capture system calibration file to the `~/t265_calib_results/mocap_calibration.json`.
![t265_imu_calib](/doc/img/t265_imu_calib.png)
## Generating Time-Aligned Ground Truth
Since the motion capture system and the PC where the dataset was recorded might not share the same clock, we need to perform time synchronization. Additionally, we need to transform the ground-truth data from its original coordinate frame (attached to the markers) to the IMU frame.
The raw motion capture data is stored in the `mav0/mocap0/` folder. We can find the time offset by minimizing the error between the gyro measurements and the rotational velocities computed from the motion capture data. If you press the `save_aligned_dataset` button, the resulting trajectory (time-aligned and transformed to the IMU frame) is written to `mav0/gt/data.csv` and automatically loaded when available.
```
basalt_time_alignment --dataset-path ~/t265_calib_data/sequence0 --dataset-type euroc --calibration ~/t265_calib_results/calibration.json --mocap-calibration ~/t265_calib_results/mocap_calibration.json
```
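The offset search can be sketched in a few lines. The following is only a minimal illustration of the cross-correlation idea, not Basalt's actual implementation, and assumes both signals have already been resampled to a common rate:

```python
import numpy as np

def estimate_time_offset(gyro_mag, mocap_mag, dt):
    """Estimate how much the mocap signal lags the gyro signal (in seconds)
    by cross-correlating the two angular-velocity magnitude sequences."""
    g = gyro_mag - gyro_mag.mean()
    m = mocap_mag - mocap_mag.mean()
    corr = np.correlate(g, m, mode="full")
    # Zero lag sits at index len(m) - 1; a peak left of it means mocap lags.
    return ((len(m) - 1) - np.argmax(corr)) * dt

# Synthetic check: simulate a mocap stream delayed by 0.5 s.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
gyro = np.abs(np.sin(2.0 * np.pi * 0.3 * t))
mocap = np.abs(np.sin(2.0 * np.pi * 0.3 * (t - 0.5)))
offset = estimate_time_offset(gyro, mocap, dt)
```

In practice the real signals are noisy and the tool refines the offset by minimizing the error function directly, but the correlation peak gives the same clear minimum you see in the error plot below.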
You should be able to see that, despite some noise, rotational velocity computed from the motion capture data aligns well with gyro measurements.
![t265_time_align_gyro](/doc/img/t265_time_align_gyro.png)
You can also switch to the error function plot and see that there is a clear minimum corresponding to the computed time offset.
![t265_time_align_error](/doc/img/t265_time_align_error.png)
**Note:** If you want to run the time alignment again you should first delete the `~/t265_calib_data/sequence0/mav0/gt` folder. If GT data already exists you will see the `save_aligned_dataset(disabled)` button, which will **NOT** overwrite it.
## Running Visual-Inertial Odometry
Now we can run the visual-inertial odometry on the recorded dataset:
```
basalt_vio --dataset-path ~/t265_calib_data/sequence0 --cam-calib ~/t265_calib_results/calibration.json --dataset-type euroc --config-path /usr/etc/basalt/euroc_config.json --show-gui 1
```
After the system processes the whole sequence you can use the `align_se3` button to align the trajectory to the ground-truth data and compute the RMS ATE.
![t265_vio](/doc/img/t265_vio.png)
## Running Visual-Inertial Odometry Live
It is also possible to run the odometry live with the camera. If no calibration files are provided the factory calibration will be used.
```
basalt_rs_t265_vio --cam-calib ~/t265_calib_results/calibration.json --config-path /usr/etc/basalt/euroc_config.json
```

# doc/Simulation.md
## Simulator
For better evaluation of the system we use a simulated environment where optical flow and IMU data are generated from the ground truth by adding noise.
**Note:** The path to calibration and configuration files used here works for the APT installation. If you compile from source specify the appropriate path to the files in [data folder](/data/).
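Conceptually, the simulator corrupts the ground-truth inertial data with white measurement noise and a slowly drifting random-walk bias. The following is a hedged sketch of such a generator for the gyroscope, with illustrative noise parameters, not Basalt's actual code:

```python
import numpy as np

def simulate_gyro(omega_true, dt, noise_std=0.00226, bias_std=0.0007, seed=0):
    """Generate noisy gyro measurements from ground-truth angular velocity:
    white measurement noise plus a random-walk bias."""
    rng = np.random.default_rng(seed)
    n = len(omega_true)
    # The bias performs a random walk; its increments scale with sqrt(dt).
    bias = np.cumsum(rng.normal(0.0, bias_std * np.sqrt(dt), size=(n, 3)), axis=0)
    # White-noise standard deviation scales with 1/sqrt(dt) for sampled data.
    noise = rng.normal(0.0, noise_std / np.sqrt(dt), size=(n, 3))
    return omega_true + bias + noise

omega_true = np.zeros((2000, 3))  # stationary ground truth for illustration
meas = simulate_gyro(omega_true, dt=0.005)
```

The accelerometer is perturbed the same way with its own noise densities.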
### Visual-inertial odometry simulator
```
basalt_vio_sim --cam-calib /usr/etc/basalt/euroc_ds_calib.json --marg-data sim_marg_data --show-gui 1
```
The command line options have the following meaning:
* `--cam-calib` path to the camera calibration file. Check the [calibration instructions](/doc/Calibration.md) to see how the calibration was generated.
* `--marg-data` folder where the data from keyframe marginalization will be stored. This data can be later used for visual-inertial mapping simulator.
* `--show-gui` enables or disables GUI.
This opens the GUI and runs the sequence.
![SIM_VIO](/doc/img/SIM_VIO.png)
The buttons in the GUI have the following meaning:
* `show_obs` toggles the visibility of the ground-truth landmarks in the image view.
* `show_obs_noisy` toggles the visibility of the noisy landmarks in the image view.
* `show_obs_vio` toggles the visibility of the landmarks estimated by VIO in the image view.
* `show_ids` toggles the IDs of the landmarks.
* `show_accel` shows noisy accelerometer measurements generated from the ground-truth spline.
* `show_gyro` shows noisy gyroscope measurements generated from the ground-truth spline.
* `show_gt_...` shows ground truth position, velocity and biases.
* `show_est_...` shows VIO estimates of the position, velocity and biases.
* `next_step` proceeds to next frame.
* `continue` plays the sequence.
* `align_se3` performs SE(3) alignment with ground-truth trajectory and prints the RMS ATE to the console.
### Visual-inertial mapping simulator
```
basalt_mapper_sim --cam-calib /usr/etc/basalt/euroc_ds_calib.json --marg-data sim_marg_data --show-gui 1
```
The command line arguments are the same as above.
This opens the GUI where the map can be processed.
![SIM_MAPPER](/doc/img/SIM_MAPPER.png)
The system processes the marginalization data and extracts non-linear factors from it. Roll-pitch and relative-pose factors are initially added to the system. One way to verify that they result in a gravity-aligned map is the following:
* `optimize` runs the optimization.
* `rand_inc` applies a random increment to all frames of the system. If you run `optimize` until convergence afterwards and press `align_se3`, the alignment transformation should only contain a rotation around the Z axis.
* `rand_yaw` applies an increment in yaw to all poses. This should not change the error of the optimization once it has converged.
* `setup_points` triangulates the points and adds them to optimization. You should optimize the system again after adding the points.
* `align_se3` performs SE(3) alignment with ground-truth trajectory and prints the RMS ATE to the console.
For comparison we also provide the `basalt_mapper_sim_naive` executable, which takes the same parameters. It runs a global bundle adjustment on the keyframe data and inserts pre-integrated IMU measurements between keyframes.
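The idea of IMU preintegration can be illustrated for the gyroscope alone: between two keyframes, the per-sample rotation increments are composed into a single relative-rotation factor. This is only a conceptual sketch; Basalt's preintegration also handles the accelerometer, biases and covariance propagation:

```python
import numpy as np

def so3_exp(w):
    """Rodrigues' formula: rotation matrix for rotation vector w."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * K @ K

def preintegrate_gyro(omegas, dt):
    """Accumulate the relative rotation between two keyframes from gyro samples."""
    R = np.eye(3)
    for w in omegas:
        R = R @ so3_exp(w * dt)  # body-frame increments compose on the right
    return R

# Constant rotation about z at 0.1 rad/s for 10 s gives 1 rad in total.
R = preintegrate_gyro([np.array([0.0, 0.0, 0.1])] * 1000, dt=0.01)
```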

# doc/VioMapping.md
## EuRoC dataset
We demonstrate the usage of the system with the `MH_05_difficult` sequence of the [EuRoC dataset](https://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets) as an example.
**Note:** The path to calibration and configuration files used here works for the APT installation. If you compile from source specify the appropriate path to the files in [data folder](/data/).
Download the sequence from the dataset and extract it.
```
mkdir euroc_data
cd euroc_data
wget http://robotics.ethz.ch/~asl-datasets/ijrr_euroc_mav_dataset/machine_hall/MH_05_difficult/MH_05_difficult.zip
mkdir MH_05_difficult
cd MH_05_difficult
unzip ../MH_05_difficult.zip
cd ../
```
### Visual-inertial odometry
To run the visual-inertial odometry, execute the following command in the `euroc_data` folder where you downloaded the dataset.
```
basalt_vio --dataset-path MH_05_difficult/ --cam-calib /usr/etc/basalt/euroc_ds_calib.json --dataset-type euroc --config-path /usr/etc/basalt/euroc_config.json --marg-data euroc_marg_data --show-gui 1
```
The command line options have the following meaning:
* `--dataset-path` path to the dataset.
* `--dataset-type` type of the dataset. Currently only the `bag` and `euroc` dataset formats are supported.
* `--cam-calib` path to the camera calibration file. Check the [calibration instructions](/doc/Calibration.md) to see how the calibration was generated.
* `--config-path` path to the configuration file.
* `--marg-data` folder where the data from keyframe marginalization will be stored. This data can be later used for visual-inertial mapping.
* `--show-gui` enables or disables GUI.
This opens the GUI and runs the sequence. The processing happens in the background as fast as possible, and the visualization results are saved in the GUI and can be analysed offline.
![MH_05_VIO](/doc/img/MH_05_VIO.png)
The buttons in the GUI have the following meaning:
* `show_obs` toggles the visibility of the tracked landmarks in the image view.
* `show_ids` toggles the IDs of the points.
* `show_est_pos` shows the plot of the estimated position.
* `show_est_vel` shows the plot of the estimated velocity.
* `show_est_bg` shows the plot of the estimated gyro bias.
* `show_est_ba` shows the plot of the estimated accel bias.
* `show_gt` shows ground-truth trajectory in the 3D view.
By default the system starts with `continue_fast` enabled. This option always shows the latest processed frame until the end of the sequence. Alternatively, `continue` visualizes every frame without skipping. If both options are disabled, the system shows the frame selected with the `show_frame` slider, and the user can move forward and backward with the `next_step` and `prev_step` buttons. The `follow` button switches between a static camera and a camera attached to the current frame.
For evaluation the button `align_se3` is used. It aligns the GT trajectory with the current estimate using an SE(3) transformation and prints the transformation and the root-mean-squared absolute trajectory error (RMS ATE).
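What the SE(3) alignment and the RMS ATE computation do can be sketched with a closed-form point-set alignment. This is a hedged illustration of the idea, not Basalt's code:

```python
import numpy as np

def align_se3(gt, est):
    """Closed-form R, t minimizing sum ||gt_i - (R @ est_i + t)||^2 (no scale)."""
    mu_g, mu_e = gt.mean(axis=0), est.mean(axis=0)
    U, _, Vt = np.linalg.svd((gt - mu_g).T @ (est - mu_e))
    # Fix the sign so R is a proper rotation (det = +1).
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ S @ Vt
    t = mu_g - R @ mu_e
    return R, t

def rms_ate(gt, est):
    """Root-mean-squared absolute trajectory error after SE(3) alignment."""
    R, t = align_se3(gt, est)
    err = gt - (est @ R.T + t)
    return np.sqrt((err ** 2).sum(axis=1).mean())
```

Applied to an estimate that differs from the ground truth only by a rigid-body transform, the residual error is numerically zero; any remaining ATE measures actual drift.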
The button `save_traj` saves the trajectory in one of two formats (`euroc_fmt` or `tum_rgbd_fmt`). In EuRoC format each pose is one line in the file with the format `timestamp[ns],tx,ty,tz,qw,qx,qy,qz`. The TUM RGB-D format can be used with the [TUM RGB-D](https://vision.in.tum.de/data/datasets/rgbd-dataset/tools) or [UZH](https://github.com/uzh-rpg/rpg_trajectory_evaluation) trajectory evaluation tools and has the format `timestamp[s] tx ty tz qx qy qz qw`.
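The two formats differ only in the delimiter, the timestamp unit and the quaternion component order, so converting a line between them is mechanical. A small single-line converter (a sketch, not part of Basalt) could look like this:

```python
def euroc_to_tum(line):
    """Convert one EuRoC pose line (timestamp[ns],tx,ty,tz,qw,qx,qy,qz)
    to TUM RGB-D format (timestamp[s] tx ty tz qx qy qz qw)."""
    ts_ns, tx, ty, tz, qw, qx, qy, qz = line.strip().split(",")
    # Integer arithmetic keeps the nanosecond timestamp exact.
    ts = f"{int(ts_ns) // 10**9}.{int(ts_ns) % 10**9:09d}"
    return f"{ts} {tx} {ty} {tz} {qx} {qy} {qz} {qw}"
```

Note the quaternion order swap (`qw` moves from first to last) and the nanosecond-to-second conversion of the timestamp.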
### Visual-inertial mapping
To run the mapping tool execute the following command:
```
basalt_mapper --cam-calib /usr/etc/basalt/euroc_ds_calib.json --marg-data euroc_marg_data
```
Here `--marg-data` is the folder with the results from VIO.
This opens the GUI and extracts non-linear factors from the marginalization data.
![MH_05_MAPPING](/doc/img/MH_05_MAPPING.png)
The buttons in the GUI have the following meaning:
* `show_frame1`, `show_cam1`, `show_frame2`, `show_cam2` allow you to assign images from different timestamps and cameras to image views 1 and 2.
* `show_detected` shows the detected keypoints in the image view window.
* `show_matches` shows feature matching results.
* `show_inliers` shows inlier matches after geometric verification.
* `show_ids` prints point IDs. Can be used to find the same point in two views to check matches and inliers.
* `show_gt` shows the ground-truth trajectory.
* `show_edges` shows the edges from the factors. Relative-pose factors in red, roll-pitch factors in magenta and bundle adjustment co-visibility edges in green.
* `show_points` shows 3D landmarks.
The workflow for the mapping is the following:
* `detect` detect the keypoints in the keyframe images.
* `match` run the geometric 2D to 2D matching between image frames.
* `tracks` build tracks from 2D matches and triangulate the points.
* `optimize` run the optimization.
* `align_se3` align ground-truth trajectory in SE(3) and print the transformation and the error.
The `num_opt_iter` slider controls the maximum number of iterations executed when pressing `optimize`.
The button `save_traj` works similarly to the VIO case, but saves the keyframe trajectory (a subset of frames).
For more systematic evaluation see the evaluation scripts in the [scripts/eval_full](/scripts/eval_full) folder.
**NOTE: It appears that only the datasets in ASL Dataset Format (`euroc` dataset type in our notation) contain ground truth that is time-aligned to the IMU and camera images. It is located in the `state_groundtruth_estimate0` folder. Bag files have raw Mocap measurements that are not time aligned and should not be used for evaluations.**
### Optical Flow
The visual-inertial odometry relies on the optical flow results. To enable a better analysis of the system we also provide a separate optical flow executable
```
basalt_opt_flow --dataset-path MH_05_difficult/ --cam-calib /usr/etc/basalt/euroc_ds_calib.json --dataset-type euroc --config-path /usr/etc/basalt/euroc_config.json --show-gui 1
```
This will run the GUI and print an average track length after the dataset is processed.
![MH_05_OPT_FLOW](/doc/img/MH_05_OPT_FLOW.png)
## TUM-VI dataset
We demonstrate the usage of the system with the `magistrale1` sequence of the [TUM-VI dataset](https://vision.in.tum.de/data/datasets/visual-inertial-dataset) as an example.
Download the sequence from the dataset and extract it.
```
mkdir tumvi_data
cd tumvi_data
wget http://vision.in.tum.de/tumvi/exported/euroc/512_16/dataset-magistrale1_512_16.tar
tar -xvf dataset-magistrale1_512_16.tar
```
### Visual-inertial odometry
To run the visual-inertial odometry, execute the following command in the `tumvi_data` folder where you downloaded the dataset.
```
basalt_vio --dataset-path dataset-magistrale1_512_16/ --cam-calib /usr/etc/basalt/tumvi_512_ds_calib.json --dataset-type euroc --config-path /usr/etc/basalt/tumvi_512_config.json --marg-data tumvi_marg_data --show-gui 1
```
![magistrale1_vio](/doc/img/magistrale1_vio.png)
### Visual-inertial mapping
To run the mapping tool execute the following command:
```
basalt_mapper --cam-calib /usr/etc/basalt/tumvi_512_ds_calib.json --marg-data tumvi_marg_data
```
![magistrale1_mapping](/doc/img/magistrale1_mapping.png)

# doc/Vo.md
## KITTI dataset
[![teaser](/doc/img/kitti_video.png)](https://www.youtube.com/watch?v=M_ZcNgExUNc)
We demonstrate the usage of the system with the [KITTI dataset](http://www.cvlibs.net/datasets/kitti/eval_odometry.php) as an example.
**Note:** The path to calibration and configuration files used here works for the APT installation. If you compile from source specify the appropriate path to the files in [data folder](/data/).
Download the sequences (`data_odometry_gray.zip`) from the dataset and extract it.
```
# We assume you have extracted the sequences in ~/dataset_gray/sequences/
# Convert calibration to the basalt format
basalt_convert_kitti_calib.py ~/dataset_gray/sequences/00/
# If you want to convert calibrations for all sequences use the following command
for i in {00..21}; do basalt_convert_kitti_calib.py ~/dataset_gray/sequences/$i/; done
```
Optionally you can also copy the provided ground-truth poses to `poses.txt` in the corresponding sequence.
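In the KITTI odometry format, each line of `poses.txt` contains 12 space-separated values: the row-major entries of a 3×4 `[R|t]` matrix giving the left camera pose relative to the first frame. A minimal parser might look like this (illustrative, not a Basalt utility):

```python
import numpy as np

def parse_kitti_pose(line):
    """Parse one poses.txt line into a 3x4 [R|t] pose matrix."""
    return np.array(line.split(), dtype=float).reshape(3, 4)

# The first line of every sequence is the identity pose.
T = parse_kitti_pose("1 0 0 0 0 1 0 0 0 0 1 0")
```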
### Visual odometry
To run the visual odometry execute the following command.
```
basalt_vio --dataset-path ~/dataset_gray/sequences/00/ --cam-calib ~/dataset_gray/sequences/00/basalt_calib.json --dataset-type kitti --config-path /usr/etc/basalt/kitti_config.json --show-gui 1 --use-imu 0
```
![kitti_vio](/doc/img/kitti.png)
