Re: Demo RTAB-Map on Turtlebot

Posted by matlabbe on
URL: http://official-rtab-map-forum.206.s1.nabble.com/How-to-process-RGBD-SLAM-datasets-with-RTAB-Map-tp939p647.html

Hi Antony,

The ground truths of the RGB-D datasets were taken using an external motion capture system (e.g., MotionAnalysis or VICON). Unless you have a motion capture system, it may be difficult to create a ground truth indoors.

If you want to test RTAB-Map with the hand-held RGB-D datasets, you can do so with the standalone version and the tgz archives (the following example uses rgbd_dataset_freiburg3_long_office_household.tgz).

  1. Synchronize the RGB and depth images using the modified associate.py script from this page:
    $ cd rgbd_dataset_freiburg3_long_office_household
    $ python associate.py rgb.txt depth.txt
    
    This will create "rgb_sync" and "depth_sync" directories, each containing the same number of images.
  2. Create a calibration file. Using the intrinsic parameters from this page, create the file "rgbddatasets.yaml" as follows and copy it into "~/Documents/RTAB-Map/camera_info":
    %YAML:1.0
    camera_name: rgbddatasets
    image_width: 0
    image_height: 0
    camera_matrix:
       rows: 3
       cols: 3
       data: [ 525., 0., 3.1950000000000000e+02, 0., 525.,
           2.3950000000000000e+02, 0., 0., 1. ]
    distortion_coefficients:
       rows: 1
       cols: 5
       data: [ 0., 0., 0., 0., 0. ]
    rectification_matrix:
       rows: 3
       cols: 3
       data: [ 1., 0., 0., 0., 1., 0., 0., 0., 1. ]
    projection_matrix:
       rows: 3
       cols: 4
       data: [ 525., 0., 3.1950000000000000e+02, 0., 0., 525.,
           2.3950000000000000e+02, 0., 0., 0., 1., 0. ]
    
  3. Now open RTAB-Map and open the Preferences dialog. Click on "Reset all settings" to make sure you have the default values. Go to the Source tab, select RGB-D as the source type, set the input rate to 30 Hz or lower, and set the calibration name to "rgbddatasets" (it should be the same name as the calibration file created before).
  4. Scroll down to select "Images" for the camera driver. Fill in the RGB and depth directories (with the "rgb_sync" and "depth_sync" folders), set the depth scale to 5, check "Use RGB file names as timestamps" and leave the "Optional timestamps file" field empty.
  5. To have more points to compare with the ground truth, you can check "Create intermediate nodes..." under the Advanced RTAB-Map settings panel.
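For reference, the timestamp matching that the modified associate.py performs in step 1 can be sketched in plain Python as below. This is a simplified sketch, not the actual script: the helper names and the choice to name each synchronized pair after the RGB timestamp (so the two folders line up file-for-file) are assumptions.

```python
import os
import shutil

def read_timestamps(path):
    """Parse a TUM-style list file: 'timestamp filename' per line, '#' for comments."""
    entries = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            stamp, name = line.split()[:2]
            entries[float(stamp)] = name
    return entries

def associate(rgb, depth, max_difference=0.02):
    """Greedily match RGB and depth timestamps within max_difference seconds.

    Closest pairs are matched first; each timestamp is used at most once.
    """
    candidates = sorted(
        (abs(a - b), a, b)
        for a in rgb for b in depth
        if abs(a - b) < max_difference)
    rgb_left, depth_left = set(rgb), set(depth)
    matches = []
    for _, a, b in candidates:
        if a in rgb_left and b in depth_left:
            rgb_left.remove(a)
            depth_left.remove(b)
            matches.append((a, b))
    return sorted(matches)

def sync_dirs(dataset_dir):
    """Copy matched images into rgb_sync/ and depth_sync/ with pairwise-equal names."""
    rgb = read_timestamps(os.path.join(dataset_dir, 'rgb.txt'))
    depth = read_timestamps(os.path.join(dataset_dir, 'depth.txt'))
    for sub in ('rgb_sync', 'depth_sync'):
        os.makedirs(os.path.join(dataset_dir, sub), exist_ok=True)
    for a, b in associate(rgb, depth):
        # Both copies are named after the RGB timestamp so the folders pair up.
        shutil.copy(os.path.join(dataset_dir, rgb[a]),
                    os.path.join(dataset_dir, 'rgb_sync', '%f.png' % a))
        shutil.copy(os.path.join(dataset_dir, depth[b]),
                    os.path.join(dataset_dir, 'depth_sync', '%f.png' % a))
```

The greedy closest-first matching is why both output folders end up with the same number of images: any RGB or depth frame without a partner within the tolerance is simply dropped.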

Start mapping; when the processing is finished (and stopped), you can export the poses in RGB-D dataset format via the menu Edit->Advanced->Export poses... -> RGB-D SLAM format. You can then compare the exported txt file with the ground truth online here.
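If you want a quick offline sanity check before uploading, a simplified comparison of the exported file against groundtruth.txt can be sketched as below. This is only a translation-only RMSE over timestamp-matched poses; the online evaluation additionally aligns the two trajectories with a rigid-body fit (Horn's method) before computing the error, which this sketch omits, so its numbers will differ.

```python
import math

def read_trajectory(path):
    """Parse TUM trajectory format: 'timestamp tx ty tz qx qy qz qw' per line."""
    traj = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            vals = line.split()
            traj[float(vals[0])] = tuple(float(v) for v in vals[1:4])
    return traj

def ate_rmse(estimated, ground_truth, max_difference=0.02):
    """Translation-only RMSE over timestamp-matched pose pairs (no alignment)."""
    gt_stamps = sorted(ground_truth)
    errors = []
    for t, (x, y, z) in sorted(estimated.items()):
        # Match each estimated pose to the nearest ground-truth timestamp.
        nearest = min(gt_stamps, key=lambda s: abs(s - t))
        if abs(nearest - t) > max_difference:
            continue
        gx, gy, gz = ground_truth[nearest]
        errors.append((x - gx) ** 2 + (y - gy) ** 2 + (z - gz) ** 2)
    return math.sqrt(sum(errors) / len(errors))
```

Usage would be something like ate_rmse(read_trajectory('exported.txt'), read_trajectory('groundtruth.txt')). Since both trajectories start in their own coordinate frames, skipping the alignment step inflates the error; treat the result as a rough consistency check only.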

Note that RTAB-Map's default parameters may not work well for datasets where the camera moves fast, such as freiburg1_360. Using the optical-flow odometry strategy may reduce the chance of getting lost on that particular dataset. The freiburg3_long_office_household dataset works well, and I've also tested it using the ROS bag with rgbdslam_datasets.launch.

For the robot SLAM datasets, which have odometry and laser scans from the robot, this could be done with the ROS package and the ROS bags... but I haven't given it a try yet.

cheers,
Mathieu