Case study: TLS point cloud registration
This page shows an example of using MULLS-ICP together with TEASER++ for TLS point cloud registration.
Terrestrial laser scanning (TLS) can collect very dense point clouds over a large area with high accuracy.
Taking the Riegl VZ-400 laser scanner as an example, more than 10 million points can be collected within a 400 m measurement radius with 5 mm accuracy. Due to this high accuracy and density, TLS is widely used in geomatics (surveying and mapping) applications.
In this example, we test our registration sub-module on the WHU-TLS dataset. The WHU-TLS Campus scenario is selected, which was scanned by the aforementioned VZ-400. To reproduce our results, you can download the data by filling in the data request.
First, configure the data paths and parameters in run_mulls_reg.sh. An example configuration is as follows:
```sh
#!/bin/sh
#########################################################################################
#                     MULLS pairwise point cloud registration                           #
############################# part to configure (down) ##################################
#data path (*.pcd, *.las, *.ply, *.txt, *.h5)
tpc_path=xxxx/dummy_target_point_cloud.xxx
spc_path=xxxx/dummy_source_point_cloud.xxx
opc_path=xxxx/dummy_transformed_source_point_cloud.xxx
#run
./bin/mulls_reg \
--colorlogtostderr=true \
-stderrthreshold 0 \
-log_dir ./log/test \
--v=10 \
--realtime_viewer_on=true \
--point_cloud_1_path=${tpc_path} \
--point_cloud_2_path=${spc_path} \
--output_point_cloud_path=${opc_path} \
--cloud_1_down_res=0.05 \
--cloud_2_down_res=0.05 \
--gf_grid_size=3.0 \
--gf_in_grid_h_thre=0.3 \
--gf_neigh_grid_h_thre=1.5 \
--dist_inverse_sampling_method=1 \
--unit_dist=50.0 \
--gf_ground_down_rate=8 \
--gf_nonground_down_rate=3 \
--pca_neighbor_radius=1.8 \
--pca_neighbor_count=50 \
--linearity_thre=0.65 \
--planarity_thre=0.65 \
--curvature_thre=0.1 \
--reciprocal_corr_on=true \
--corr_dis_thre=3.0 \
--converge_tran=0.0005 \
--converge_rot_d=0.002 \
--reg_max_iter_num=20 \
--teaser_on=true
```
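Two of the flags above deserve a note: --dist_inverse_sampling_method and --unit_dist control a range-dependent downsampling that compensates for a TLS scan being far denser near the scanner than far away. The sketch below is a hedged Python illustration of the idea (keep probability growing with range and saturating at unit_dist); the exact sampling scheme inside MULLS may differ.

```python
import numpy as np

def dist_inverse_downsample(points, unit_dist=50.0, seed=0):
    """Range-dependent downsampling: a TLS cloud is much denser close to
    the scanner, so nearby points are kept with low probability and far
    points with high probability, giving a more uniform sampled density.
    Hedged sketch of the idea behind --dist_inverse_sampling_method and
    --unit_dist; MULLS' actual scheme may differ."""
    d = np.linalg.norm(points, axis=1)           # range, scanner at origin
    keep_prob = np.minimum(1.0, d / unit_dist)   # saturates at unit_dist
    rng = np.random.default_rng(seed)
    return points[rng.random(len(points)) < keep_prob]
```

With unit_dist=50.0 as in the script above, points 5 m from the scanner would be kept with probability 0.1, while points at 50 m and beyond are all kept.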
Then run sh script/run_mulls_reg.sh and you will see the following feature viewer:
Geometric feature points in the target (left) and source (right) point clouds (gray: ground, blue: facade, green: pillar, yellow: beam, red: roof, purple: vertex keypoint)
Press [Space] and TEASER will be applied to achieve coarse registration. The result is shown in the same feature viewer:
For each vertex keypoint, a neighborhood category context (NCC) feature is extracted, based on the distribution of the different geometric feature points in its neighborhood. NCC is not as descriptive as FPFH or more sophisticated learning-based features, but it is extremely fast to compute on top of the preceding feature point extraction. Correspondences are determined by reciprocal NCC feature distance between the keypoints of the source and target point clouds, and TEASER is then applied to these correspondences.
Left: correspondences determined by the NCC features of the keypoints; Right: registration result of TEASER
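The NCC descriptor and the reciprocal matching step described above can be sketched as follows. This is a hedged illustration only: the category set (taken from the viewer legend), the neighborhood radius, and the L2 feature distance are assumptions for clarity, not the exact MULLS implementation.

```python
import numpy as np

GEO_CLASSES = ["ground", "facade", "pillar", "beam", "roof"]  # assumed categories

def ncc_feature(keypoint, feature_points, labels, radius=1.8):
    """Normalized histogram of geometric-feature categories within `radius`
    of a vertex keypoint -- a sketch of the neighborhood category context
    (NCC) idea; the real descriptor may bin and weight differently."""
    d = np.linalg.norm(feature_points - keypoint, axis=1)
    near = labels[d < radius]
    hist = np.array([(near == c).sum() for c in range(len(GEO_CLASSES))], float)
    n = hist.sum()
    return hist / n if n > 0 else hist

def reciprocal_matches(feats_src, feats_tgt):
    """Pair (i, j) only if j is i's nearest target feature AND i is j's
    nearest source feature -- the reciprocal check described above."""
    dist = np.linalg.norm(feats_src[:, None, :] - feats_tgt[None, :, :], axis=2)
    nn_st = dist.argmin(axis=1)   # best target for each source keypoint
    nn_ts = dist.argmin(axis=0)   # best source for each target keypoint
    return [(i, j) for i, j in enumerate(nn_st) if nn_ts[j] == i]
```

Feeding reciprocal_matches the NCC vectors of all source and target keypoints yields the (i, j) pairs that are then handed to TEASER.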
Press [Space] and TEASER's result will be shown in the registration viewer for comparison. Press [Space] again and you will see the registration result after refinement with MULLS-ICP.
Left: source (golden) and target (silver) point clouds before registration; Right: source and target point clouds after TEASER coarse registration and MULLS-ICP refinement.
Details of the final registered point cloud
In the end, press [Space] and the registered (transformed source) point cloud will be written to the specified path ${opc_path}.
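The MULLS-ICP refinement stage solves a linear least-squares alignment problem each iteration. As a simplified illustration, here is one linearized point-to-plane step under a small-angle assumption; this is a sketch only, since the actual MULLS solver jointly combines several residual types (point, line, plane) over the extracted feature classes.

```python
import numpy as np

def point_to_plane_step(src, tgt, tgt_normals):
    """One linearized point-to-plane ICP update (small-angle assumption):
    solve least squares for the 6-vector [omega, t] that minimizes the
    signed point-to-plane distances, then apply it to the source points.
    Illustrative sketch, not the full multi-metric MULLS solver."""
    # Residual: signed distance of each source point to its target plane.
    r = np.einsum("ij,ij->i", tgt_normals, tgt - src)
    # Jacobian rows: n.(omega x s) + n.t  ==  (s x n).omega + n.t
    J = np.hstack([np.cross(src, tgt_normals), tgt_normals])
    x, *_ = np.linalg.lstsq(J, r, rcond=None)
    omega, t = x[:3], x[3:]
    # Small-angle rotation update R = I + [omega]_x applied to the source.
    R = np.eye(3) + np.array([[0.0, -omega[2], omega[1]],
                              [omega[2], 0.0, -omega[0]],
                              [-omega[1], omega[0], 0.0]])
    return src @ R.T + t
```

Iterating such steps until the translation and rotation updates fall below thresholds corresponds to the --converge_tran, --converge_rot_d, and --reg_max_iter_num flags in the script above.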
The whole TEASER + MULLS-ICP pipeline took only about 3 seconds on a moderate PC:
Function | Consuming time (s)
---|---
Geo-feature point extraction (target) | 1.0
Geo-feature point extraction (source) | 0.8
Rough keypoint correspondence determination | 0.4
TEASER coarse registration | 0.05
MULLS-ICP refinement | 0.4
Total | 2.65
Brief tutorial and review on point cloud registration: link
We thank the authors of TEASER for making their breakthrough work public.
We thank the authors of the WHU-TLS dataset for making this awesome dataset public.