Demonstrations

In this wiki page, we show how to apply the SegMatch algorithm to different datasets and for different applications. This page will be updated over time with new examples.

Localization in the KITTI dataset

This demonstration shows how to reproduce the results of the first part of the SegMatch video.

To run the demonstration on the KITTI dataset, you first need to create the folder segmatch/laser_mapper/demonstration_files/kitti and download the files from the following link into that kitti folder (a shell sketch of this step follows the list below).

For this first demonstration you will need:

  • 2011_10_03_drive_27.bag: a ROS bag file created from the raw KITTI data obtained here.
  • drive27_target_map.pcd: a point cloud to localize against.
  • kitti_localization.rviz: a simple rviz configuration to get you started.
  • random_forest_eigen_25trees.xml: a random forest trained on separate data.
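As a minimal sketch of this setup step (the workspace path ~/catkin_ws/src is an assumption; adjust it to wherever you cloned segmatch):

# assumption: segmatch is cloned under ~/catkin_ws/src
$ mkdir -p ~/catkin_ws/src/segmatch/laser_mapper/demonstration_files/kitti
# after downloading, move the four files listed above into that folder:
$ mv 2011_10_03_drive_27.bag drive27_target_map.pcd kitti_localization.rviz random_forest_eigen_25trees.xml \
     ~/catkin_ws/src/segmatch/laser_mapper/demonstration_files/kitti/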

Once these files are copied into the appropriate folder, you can start the demonstration using:

$ roslaunch laser_mapper kitti_localization.launch

The rosbag will first be paused to give the program some time to load the point cloud from disk and to extract and describe segments. Once that is completed, you should see the segments from the target point cloud in white, as illustrated in the following image.

[Figure: segments of the target point cloud, shown in white]
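While the bag is paused, you can optionally check that the SegMatch topics are up. This is only a sanity-check sketch using the standard rostopic tools; the topic name in the second command is the one quoted further down this page.

$ rostopic list | grep segmatch
$ rostopic info /segmatch/target_segments_centroids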

Once segmentation of the target cloud is complete, you can return to the shell where the launch file was started and begin playing the bag by hitting the space bar. As the vehicle moves, you should start observing localizations, as displayed in the following image.

[Figure: localizations along the vehicle trajectory]

In this image, the segments extracted from the source cloud are represented with random colors. The green lines indicate segment matches that passed the geometric verification step. The vehicle trajectory is illustrated in orange and the red spheres indicate where segmentation occurred.

On some machines, visualizing the point clouds can significantly affect performance. If this is the case, you can try reducing the SegMatchWorker parameter ratio_of_points_to_keep_when_publishing in segmatch/laser_mapper/launch/kitti/kitti_localization.yaml. For optimal performance, you can try subscribing only to the segment centroid clouds, e.g. /segmatch/source_segments_centroids and /segmatch/target_segments_centroids. Please note that the target point cloud and segment centroids are published only once. For your information, the input to the SegMatch algorithm is a local map created by the laser_mapper and published on the topic /laser_mapper/local_map.
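To see how heavy the published clouds actually are on your machine, you can measure topic bandwidth and rate with the standard rostopic tools on the topics mentioned above, for example:

$ rostopic bw /laser_mapper/local_map
$ rostopic hz /segmatch/source_segments_centroids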

Closing loops in the KITTI dataset

This demonstration shows how to reproduce the results of the second part of the SegMatch video. In this scenario, the target segment map is built online and the robot is able to find and close loops based on segment matching.

For this second demonstration you will require:

  • 2011_09_30_drive_18.bag: a ROS bag file created from the raw KITTI data obtained here.
  • kitti_loop_closure.rviz: a simple rviz configuration to get you started.
  • random_forest_eigen_25trees.xml: a random forest trained on separate data.
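These files go into the same kitti folder created for the first demonstration (the random forest file is reused from there). A sketch, again assuming the ~/catkin_ws/src workspace path:

$ cp 2011_09_30_drive_18.bag kitti_loop_closure.rviz \
     ~/catkin_ws/src/segmatch/laser_mapper/demonstration_files/kitti/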

The demonstration can be launched using:

$ roslaunch laser_mapper kitti_loop_closure.launch

Pressing the space bar will then play the bag, and you should see the target map being built incrementally, shown in white in the following figure.

[Figure: target map being incrementally built, shown in white]

Once the robot revisits a part of the environment, you should observe segment matches and loop closures, displayed in green and blue respectively. The trajectory is optimized by feeding the loop closures into a sparse pose-graph optimizer, as illustrated in the following figure.

[Figure: segment matches (green) and loop closures (blue) after revisiting]

This final figure illustrates how the trajectory should look once the bag has finished playing.

[Figure: optimized trajectory at the end of the bag]

As mentioned above, visualizing the point clouds might significantly affect performance. For optimal performance, you can try subscribing only to the segment centroid clouds, e.g. /segmatch/source_segments_centroids and /segmatch/target_segments_centroids.

Further demonstrations

More demonstrations to come :-)
