docs/gs-calibration.dox (6 additions, 6 deletions)
@@ -56,7 +56,7 @@ The first task is to calibrate the camera intrinsic values such as the focal len
Our group often uses the [Kalibr](https://github.com/ethz-asl/kalibr/) @cite Furgale2013IROS calibration toolbox to perform both intrinsic and extrinsic offline calibrations, proceeding with the following steps:
1. Clone and build the [Kalibr](https://github.com/ethz-asl/kalibr/) toolbox
- 2. Print out a calibration board to use (we normally use the [Aprilgrid 6x6 0.8x0.8 m (A0 page)](https://drive.google.com/file/d/1TCZJ1KPJrsj3ffCNnj001ege54jffc19/view))
+ 2. Print out a calibration board to use (we normally use the Aprilgrid 6x6 0.8x0.8 m (A0 page) [pdf](https://drive.google.com/file/d/14dY7z8pDb2iEBdveTviDXsoi5H9AaQP1/view?usp=drive_link) [yaml](https://drive.google.com/file/d/1zXfr48_OY0RafwJalBLjqkqgnme-r7Gd/view?usp=drive_link))
3. Ensure that your sensor driver is publishing onto ROS with correct timestamps.
4. Sensor preparations
- Limit motion blur by decreasing exposure time (can be achieved through high framerate)
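To make steps 3-4 above concrete, a minimal recording session for the intrinsic calibration might look like the sketch below. The bag name and camera topics are placeholders for a generic stereo rig, not values from the original guide; substitute the topics your own driver publishes.

```bash
# Sketch only: record the camera topics while slowly moving the rig in front of
# the printed Aprilgrid, covering the full image plane at varied angles and depths.
# Bag and topic names below are assumptions for a generic stereo setup.
rosbag record -O camera_calib_static.bag \
    /cam0/image_raw \
    /cam1/image_raw
```

Keeping the board well lit lets you shorten the exposure time further, which is the main lever for limiting motion blur during this recording.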
@@ -72,7 +72,7 @@ Our group often uses the [Kalibr](https://github.com/ethz-asl/kalibr/) @cite Fur
@image html kalibr_reprojection.png width=60%
- An example script [calibrate_camera_static.sh](https://github.com/rpng/ar_table_dataset/blob/9d556a789e2d01387e5ba2aeb2453269bc2c4001/calibrate_camera_static.sh), dataset, and configuration can be found in our group's [ar\_table\_dataset](https://github.com/rpng/ar_table_dataset/) repository.
+ An example script [calibrate_camera_static.sh](https://github.com/rpng/ar_table_dataset/blob/master/calibrate_camera_static.sh), dataset, and configuration can be found in our group's [ar\_table\_dataset](https://github.com/rpng/ar_table_dataset/) repository.
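For reference, the Kalibr call inside such a script typically resembles the sketch below. The bag name, topics, camera/distortion models, and target file here are assumptions, so defer to the actual calibrate_camera_static.sh for the authors' exact invocation.

```bash
# Sketch of a Kalibr intrinsic calibration run (not the exact command from the
# ar_table_dataset script; file names, topics, and models are placeholders).
rosrun kalibr kalibr_calibrate_cameras \
    --bag camera_calib_static.bag \
    --topics /cam0/image_raw /cam1/image_raw \
    --models pinhole-radtan pinhole-radtan \
    --target aprilgrid_6x6.yaml
```

The resulting camchain YAML from this step is what later feeds into the dynamic IMU-camera calibration.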
@@ -108,7 +108,7 @@ We recommend using the [allan_variance_ros](https://github.com/ori-drs/allan_var
- Finally, process the output via their `analysis.py` script
5. Typically these noise values should be inflated by roughly 10-20x to account for unmodelled errors (one can test to see how different inflation factors perform)
- An example script [calibrate_imu.sh](https://github.com/rpng/ar_table_dataset/blob/9d556a789e2d01387e5ba2aeb2453269bc2c4001/calibrate_imu.sh), dataset, and configuration can be found in our group's [ar\_table\_dataset](https://github.com/rpng/ar_table_dataset/) repository.
+ An example script [calibrate_imu.sh](https://github.com/rpng/ar_table_dataset/blob/master/calibrate_imu.sh), dataset, and configuration can be found in our group's [ar\_table\_dataset](https://github.com/rpng/ar_table_dataset/) repository.
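As a rough illustration of the IMU noise workflow above, the processing might look like the sketch below. The command-line interface follows the allan_variance_ros README at the time of writing, and all paths, bag names, and config names are placeholders.

```bash
# Sketch only: compute IMU noise parameters from a long static recording.
# Check the allan_variance_ros README for the current interface before running.
rosrun allan_variance_ros cookbag.py --input imu_static.bag --output cooked/imu_static.bag
rosrun allan_variance_ros allan_variance cooked/ allan_config.yaml
rosrun allan_variance_ros analysis.py --data cooked/allan_variance.csv
# The reported white-noise and random-walk values are then inflated (e.g. 10-20x)
# before being written into the estimator config and the Kalibr imu.yaml.
```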
@@ -121,7 +121,7 @@ One needs to have at least one translational motion along with two degrees of or
We recommend having as much change in orientation as possible in order to ensure convergence.
1. Clone and build the [Kalibr](https://github.com/ethz-asl/kalibr/) toolbox
- 2. Print out a calibration board to use (we normally use the [Aprilgrid 6x6 0.8x0.8 m (A0 page)](https://drive.google.com/file/d/1TCZJ1KPJrsj3ffCNnj001ege54jffc19/view))
+ 2. Print out a calibration board to use (we normally use the Aprilgrid 6x6 0.8x0.8 m (A0 page) [pdf](https://drive.google.com/file/d/14dY7z8pDb2iEBdveTviDXsoi5H9AaQP1/view?usp=drive_link) [yaml](https://drive.google.com/file/d/1zXfr48_OY0RafwJalBLjqkqgnme-r7Gd/view?usp=drive_link))
3. Ensure that your sensor driver is publishing onto ROS with correct timestamps (inspect the IMU time dt plot carefully in the PDF report!)
4. Sensor preparations
- Limit motion blur by decreasing exposure time
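As with the intrinsic case, data collection here is just a bag recording, now including the IMU. A minimal sketch follows; the bag and topic names are assumptions, not values from the original guide.

```bash
# Sketch only: record IMU and camera topics for the dynamic IMU-camera calibration.
# Keep the Aprilgrid in view while exciting all translational and rotational axes,
# but avoid motions so aggressive that the images blur. Topic names are placeholders.
rosbag record -O imu_camera_calib_dynamic.bag \
    /imu0 \
    /cam0/image_raw \
    /cam1/image_raw
```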
@@ -131,13 +131,13 @@ We recommend having as much change in orientation as possible in order to ensure
6. Finally run the calibration
- Use the `kalibr_calibrate_imu_camera` script
- Input your static calibration file which will have the camera topics in it
-   - You will need to make an [imu.yaml](https://drive.google.com/file/d/1F9e9I1Xdm14nJw_E1j06HyinlBxRp3yu/view) file with your noise parameters.
+   - You will need to make an [imu.yaml](https://drive.google.com/file/d/1oucvH3FABHPUmzkyEH3rX8n4lp8_AH8P/view?usp=drive_link) file with your noise parameters.
- Depending on how many frames are in your dataset, this can take upwards of half a day.
7. Inspect the final result. Make sure the spline fitted to the inertial readings is a good fit, and check that your accelerometer and gyroscope errors stay within their 3-sigma bounds (if not, your IMU noise values or the dataset are incorrect). Ensure that the estimated biases do not leave their 3-sigma bounds; if they do, your trajectory was too dynamic or your noise values are poor. Finally, sanity check the estimated rotation and translation against hand-measured values.
@image html kalibr_accel_err.png width=60%
- An example script [calibrate_camera_dynamic.sh](https://github.com/rpng/ar_table_dataset/blob/9d556a789e2d01387e5ba2aeb2453269bc2c4001/calibrate_camera_dynamic.sh), dataset, and configuration can be found in our group's [ar\_table\_dataset](https://github.com/rpng/ar_table_dataset/) repository.
+ An example script [calibrate_camera_dynamic.sh](https://github.com/rpng/ar_table_dataset/blob/master/calibrate_camera_dynamic.sh), dataset, and configuration can be found in our group's [ar\_table\_dataset](https://github.com/rpng/ar_table_dataset/) repository.
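Putting the pieces of step 6 together, a hedged sketch of the imu.yaml file and the Kalibr call might look as follows. All numeric values, file names, and topics are placeholders; use your own (inflated) allan_variance_ros results and the linked imu.yaml template, and the camchain YAML produced by the static calibration.

```bash
# Sketch only: write a Kalibr-style IMU noise file, then run the IMU-camera calibration.
# Every value below is a placeholder; fill in your own inflated noise parameters.
cat > imu.yaml <<'EOF'
rostopic: /imu0
update_rate: 200.0
accelerometer_noise_density: 2.0e-3   # white noise [m/s^2/sqrt(Hz)]
accelerometer_random_walk:   3.0e-3   # bias random walk [m/s^3/sqrt(Hz)]
gyroscope_noise_density:     1.7e-4   # white noise [rad/s/sqrt(Hz)]
gyroscope_random_walk:       2.0e-5   # bias random walk [rad/s^2/sqrt(Hz)]
EOF

rosrun kalibr kalibr_calibrate_imu_camera \
    --bag imu_camera_calib_dynamic.bag \
    --cam camchain.yaml \
    --imu imu.yaml \
    --target aprilgrid_6x6.yaml
```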
docs/gs-datasets.dox (25 additions, 20 deletions)
@@ -25,22 +25,22 @@ Please take a look at the [run_ros_eth.sh](https://github.com/rpng/open_vins/blo
@par Groundtruth on V1_01_easy
We have found that the groundtruth on the V1_01_easy dataset is not accurate in its orientation estimate.
We have recomputed this by optimizing the inertial and vicon readings in a graph to get the trajectory of the imu (refer to our [vicon2gt](https://github.com/rpng/vicon2gt) @cite Geneva2020TRVICON2GT project).
28
-
You can find the output at this [link](https://drive.google.com/drive/folders/1d62Q_RQwHzKLcIdUlTeBmojr7j0UL4sM?usp=sharing) and is what we normally use to evaluate the error on this dataset.
28
+
You can find the output at this [link](https://drive.google.com/drive/folders/1GospxhpVnyzvJNVUdN4qyrl8GlOd7vw8?usp=drive_link) and is what we normally use to evaluate the error on this dataset.
29
29
30
30
@m_div{m-text-center}
| Dataset Name | Length (m) | Dataset Link | Groundtruth Traj. | Config |
@@ -237,6 +239,9 @@ Typically we process the datasets at 1.5x rate so we get a ~20 Hz image feed and
@m_enddiv
+
+
+
@section gs-data-kaist-vio KAIST VIO Dataset
The [KAIST VIO dataset](https://github.com/url-kaist/kaistviodataset) @cite Jeon2021RAL is a dataset of a MAV that undergoes various trajectory motions in an indoor 3.15 x 3.60 x 2.50 meter environment.