Lidar-based 3D SLAM in a sparse-feature environment


Gi_T
Hello, I’m currently working on a drone project whose goal is for the drone to autonomously explore and map a building that has few visual features, e.g. blank walls and sparse furniture.

I recently came across the rtabmap_drone_example repository, which is built on PX4 and RTAB-Map, and wanted to adapt it to my use case. The drone also carries a Velodyne VLP-16 lidar, which I added to the available Gazebo model and configuration, giving promising initial results on maps with lots of visual features such as the aws-robomaker-small-house-world.
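
For illustration, a minimal sketch of one way to attach the VLP-16 in simulation (assuming the velodyne_description package from velodyne_simulator and its VLP-16.urdf.xacro; the parent link, mounting offset and sample count below are placeholders, and an SDF-only PX4 model would instead need the equivalent ray sensor plugin added directly to the .sdf):

<!-- Sketch: adding a simulated VLP-16 to a URDF/xacro drone model (velodyne_description). -->
<xacro:include filename="$(find velodyne_description)/urdf/VLP-16.urdf.xacro"/>
<!-- Mount the lidar on the drone; parent link and origin are placeholders. -->
<xacro:VLP-16 parent="base_link" name="velodyne" topic="/velodyne_points" hz="10" samples="440" gpu="false">
  <origin xyz="0 0 0.1" rpy="0 0 0"/>
</xacro:VLP-16>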

Working map: [screenshot: working house map]

Result: [screenshot]

The current configuration, however, doesn’t work well for maps without many visual features, as the odometry starts to drift right after takeoff. Can you provide some guidance on possible approaches for getting stable odometry in such an environment? The real-world test map will be built out of MDF and contain sparse furniture.


Failure map: [screenshot]

Current slam.launch:
<?xml version="1.0"?>
<launch>
   
    <arg name="localization" default="false"/>
    <arg name="rtabmap_viz"   default="true"/>
    <arg name="ground_truth" default="false"/>
    
    <arg     if="$(arg localization)" name="pre_args" default=""/>
    <arg unless="$(arg localization)" name="pre_args" default="-d"/>
   
    <node pkg="nodelet" type="nodelet" name="imu_to_tf" args="standalone rtabmap_util/imu_to_tf">
      <remap from="imu/data" to="/mavros/imu/data"/>
      <param name="fixed_frame_id" value="base_link_stabilized"/>
      <param name="base_frame_id" value="base_link"/>
    </node>

    <!-- To connect rtabmap planning stuff with move_base below -->
    <param name="/rtabmap/rtabmap/use_action_for_goal" value="true"/>
    <remap from="/rtabmap/move_base" to="/move_base"/>

    <!-- VSLAM -->
    <param name="/rtabmap/rtabmap/latch" value="false"/> <!-- For some reason, if we latch grid_map, the global costmap inflation layer will create holes on robot path. To avoid holes, republish grid_map on each update (latching disabled). -->
    <include file="$(find rtabmap_launch)/launch/rtabmap.launch">
      <arg name="localization"      value="$(arg localization)"/>
      <arg name="args"              value="$(arg pre_args) --Optimizer/GravitySigma 0.1 --Vis/FeatureType 10 --Kp/DetectorStrategy 10 --Grid/MapFrameProjection true --NormalsSegmentation false --Grid/MaxGroundHeight 1 --Grid/MaxObstacleHeight 1.6 --RGBD/StartAtOrigin true" />
      <arg name="rtabmap_viz"        value="$(arg rtabmap_viz)" />
      <arg name="frame_id"          value="base_link" />
      <arg name="odom_guess_frame_id" value="base_link_stabilized" />
      <arg name="rgb_topic"         value="/r200/camera/color/image_raw" />
      <arg name="depth_topic"       value="/r200/camera/depth/image_raw" />
      <arg name="camera_info_topic" value="/r200/camera/color/camera_info" />
      <arg name="imu_topic"         value="/mavros/imu/data"/>
      <arg name="wait_imu_to_init"  value="true"/>
      <arg name="approx_sync"       value="true"/>
      <arg name="subscribe_scan_cloud" value="true"/>
      <param name="subscribe_depth" value="false"/>
      <param name="subscribe_rgbd" value="false"/>
      <param name="subscribe_rgb" value="false"/>
      <arg name="scan_cloud_topic" value="/velodyne_points"/>
      
      <arg if="$(arg ground_truth)" name="ground_truth_frame_id" value="world"/>
      <arg if="$(arg ground_truth)" name="ground_truth_base_frame_id" value="base_link_gt"/>
    </include>
    
    <!-- Costmap -->
    <node pkg="nodelet" type="nodelet" name="camera_points_xyz" args="standalone rtabmap_util/point_cloud_xyz">
      <remap from="depth/image"       to="/r200/camera/depth/image_raw"/>
      <remap from="depth/camera_info" to="/r200/camera/depth/camera_info"/>
      <remap from="cloud"             to="camera_cloud" />

      <param name="decimation"  type="double" value="4"/>
      <param name="voxel_size"  type="double" value="0.0"/>
      <param name="approx_sync" type="bool"   value="true"/>
    </node> 

    <!-- navigation -->  
    <node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen">
       <remap from="map" to="/rtabmap/grid_map"/>
       <remap from="odom" to="/rtabmap/odom"/>
       <param name="base_local_planner" value="dwa_local_planner/DWAPlannerROS" />  
       <rosparam file="$(find rtabmap_drone_example)/param/costmap_common_params.yaml" command="load" ns="global_costmap" />
       <rosparam file="$(find rtabmap_drone_example)/param/costmap_common_params.yaml" command="load" ns="local_costmap" />
       <rosparam file="$(find rtabmap_drone_example)/param/global_costmap_params.yaml" command="load" />
       <rosparam file="$(find rtabmap_drone_example)/param/local_costmap_params.yaml" command="load" />
       <rosparam file="$(find rtabmap_drone_example)/param/base_local_planner_params.yaml" command="load" />
    </node>
   <node name="empty_voxels_markers" pkg="rtabmap_costmap_plugins" type="voxel_markers" args="voxel_grid:=/move_base/local_costmap/voxel_layer/voxel_grid visualization_marker:=/voxels_empty">
       <param name="cell_type" value="0"/>
   </node>
   <node name="marked_voxels_markers" pkg="rtabmap_costmap_plugins" type="voxel_markers" args="voxel_grid:=/move_base/local_costmap/voxel_layer/voxel_grid visualization_marker:=/voxels_marked" />

    <!-- joystick -->
    <rosparam file="$(find rtabmap_drone_example)/config/joy_config.yaml" />
    <node pkg="joy" type="joy_node" name="joy_node">
        <param name="autorepeat_rate" value="5"/>
    </node>
    <node pkg="teleop_twist_joy" type="teleop_node" name="teleop_node" output="screen">
        <param name="autorepeat_rate" value="5"/>
    </node>

    <!-- Ground truth -->
  <node if="$(arg ground_truth)" pkg="tf" type="static_transform_publisher" name="worldtomap_broadcaster" args="0 0 0 0 0 0 1 world map 100" />
  <node if="$(arg ground_truth)" pkg="rtabmap_util" type="gazebo_ground_truth.py" name="gazebo_ground_truth" output="screen">
    <param name="frame_id"        value="world"/>
    <param name="child_frame_id"  value="base_link_gt"/>
    <param name="gazebo_frame_id" value="drone::base_link"/>
  </node>
</launch>
Any help is greatly appreciated.

Update:
I now tried out some different combinations of sensors and maps. Results can be seen below:

AWS RoboMaker Small House World


VSLAM (default settings of the repository)


<include file="$(find rtabmap_launch)/launch/rtabmap.launch">
      <arg name="localization"      value="$(arg localization)"/>
      <arg name="args"              value="$(arg pre_args) --Optimizer/GravitySigma 0.1 --Vis/FeatureType 10 --Kp/DetectorStrategy 10 --Grid/MapFrameProjection true --NormalsSegmentation false --Grid/MaxGroundHeight 1 --Grid/MaxObstacleHeight 1.6 --RGBD/StartAtOrigin true" />
      <arg name="rtabmap_viz"        value="$(arg rtabmap_viz)" />
      <arg name="frame_id"          value="base_link" />
      <arg name="odom_guess_frame_id" value="base_link_stabilized" />
      <arg name="rgb_topic"         value="/r200/camera/color/image_raw" />
      <arg name="depth_topic"       value="/r200/camera/depth/image_raw" />
      <arg name="camera_info_topic" value="/r200/camera/color/camera_info" />
      <arg name="imu_topic"         value="/mavros/imu/data"/>
      <arg name="wait_imu_to_init"  value="true"/>
      <arg name="approx_sync"       value="true"/>
      
      <arg if="$(arg ground_truth)" name="ground_truth_frame_id" value="world"/>
      <arg if="$(arg ground_truth)" name="ground_truth_base_frame_id" value="base_link_gt"/>
</include>
Result: https://youtu.be/X2k-8kg89g8

Camera + Lidar with Visual Odometry

<include file="$(find rtabmap_launch)/launch/rtabmap.launch">
      <arg name="localization"      value="$(arg localization)"/>
      <arg name="args"              value="$(arg pre_args) --Optimizer/GravitySigma 0.1 --Vis/FeatureType 10 --Kp/DetectorStrategy 10 --Grid/MapFrameProjection true --NormalsSegmentation false --Grid/MaxGroundHeight 1 --Grid/MaxObstacleHeight 1.6 --RGBD/StartAtOrigin true" />
      <arg name="rtabmap_viz"        value="$(arg rtabmap_viz)" />
      <arg name="frame_id"          value="base_link" />
      <arg name="odom_guess_frame_id" value="base_link_stabilized" />
      <arg name="rgb_topic"         value="/r200/camera/color/image_raw" />
      <arg name="depth_topic"       value="/r200/camera/depth/image_raw" />
      <arg name="camera_info_topic" value="/r200/camera/color/camera_info" />
      <arg name="imu_topic"         value="/mavros/imu/data"/>
      <arg name="wait_imu_to_init"  value="true"/>
      <arg name="approx_sync"       value="true"/>
      <arg name="subscribe_scan_cloud" value="true"/>
      <arg name="scan_cloud_topic" value="/velodyne_points"/>
      
      <arg if="$(arg ground_truth)" name="ground_truth_frame_id" value="world"/>
      <arg if="$(arg ground_truth)" name="ground_truth_base_frame_id" value="base_link_gt"/>
</include>
Result: https://youtu.be/V5ZD5773K80

Camera + Lidar with ICP Odometry

    <include file="$(find rtabmap_launch)/launch/rtabmap.launch">
      <arg name="localization"      value="$(arg localization)"/>
      <arg name="args"              value="$(arg pre_args) --Optimizer/GravitySigma 0.1 --Vis/FeatureType 10 --Kp/DetectorStrategy 10 --Grid/MapFrameProjection true --NormalsSegmentation false --Grid/MaxGroundHeight 1 --Grid/MaxObstacleHeight 1.6 --RGBD/StartAtOrigin true" />
      <arg name="rtabmap_viz"        value="$(arg rtabmap_viz)" />
      <arg name="frame_id"          value="base_link" />
      <arg name="odom_guess_frame_id" value="base_link_stabilized" />
      <arg name="rgb_topic"         value="/r200/camera/color/image_raw" />
      <arg name="depth_topic"       value="/r200/camera/depth/image_raw" />
      <arg name="camera_info_topic" value="/r200/camera/color/camera_info" />
      <arg name="imu_topic"         value="/mavros/imu/data"/>
      <arg name="wait_imu_to_init"  value="true"/>
      <arg name="approx_sync"       value="true"/>
      <arg name="subscribe_scan_cloud" value="true"/>
      <arg name="scan_cloud_topic" value="/velodyne_points"/>
      <arg name="icp_odometry" value="true"/>
      
      <arg if="$(arg ground_truth)" name="ground_truth_frame_id" value="world"/>
      <arg if="$(arg ground_truth)" name="ground_truth_base_frame_id" value="base_link_gt"/>
 </include>
Result: https://youtu.be/KtJFd-DVsB4

Custom map


Same settings as above:
Custom map camera: https://youtu.be/WW58vcVW380
Custom map camera + lidar: https://youtu.be/hG3MEwAEl9s
Custom map ICP: https://youtu.be/4z_wYMdYGU0

Re: Lidar-based 3D SLAM in a sparse-feature environment

matlabbe
Administrator
If your target environment is textureless and you have access to a lidar, I would go with icp_odometry. The problem in your test with icp_odometry is that the default ICP parameters are tuned for a 2D lidar, not a 3D lidar. There is a test_velodyne.launch example here that uses a Velodyne for hand-held mapping.

Feeding the IMU to icp_odometry can also help ICP converge better.

From your launch file, you may be able to keep using rtabmap.launch by adding these lidar-related parameters (based on test_velodyne.launch):

odom_args:="
  Icp/MaxTranslation 2 Icp/CorrespondenceRatio 0.01 
  Odom/ScanKeyFrameThr 0.4 OdomF2M/ScanSubtractRadius 0.15 
  OdomF2M/ScanMaxSize 15000 
..."
 
rtabmap_args:="
  Icp/PointToPlane true Icp/Iterations 10 Icp/VoxelSize 0.15 Icp/Epsilon 0.001 
  Icp/PointToPlaneK 20 Icp/PointToPlaneRadius 0 Icp/MaxTranslation 3 
  Icp/MaxCorrespondenceDistance 1.5 Icp/PM true Icp/PMOutlierRatio 0.7 
  Icp/CorrespondenceRatio 0.2 RGBD/ProximityMaxGraphDepth 0 
  RGBD/ProximityPathMaxNeighbors 1 Mem/LaserScanNormalK 20 Reg/Strategy 1 
..."

Notes:
 - The icp_odometry node combines rtabmap_args and odom_args together (that's why I've overridden two ICP parameters for odometry).
 - I skipped the deskewing setup to keep the example simple in simulation, but you would need to re-enable it for real-world tests.
 - In the test_velodyne.launch example, there is also a point_cloud_assembler node; you can use it if you want to record ALL scans in the database (a rough sketch follows below).
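
A rough sketch of that assembler node, adapted from test_velodyne.launch (odom_filtered_input_scan is the filtered scan republished by icp_odometry; topic names and namespaces are assumptions that may need remapping to match slam.launch):

<node pkg="nodelet" type="nodelet" name="point_cloud_assembler" args="standalone rtabmap_util/point_cloud_assembler" output="screen">
  <remap from="cloud" to="odom_filtered_input_scan"/>
  <remap from="odom"  to="odom"/>
  <param name="max_clouds"     type="int"    value="10"/>
  <param name="fixed_frame_id" type="string" value=""/>
</node>
<!-- rtabmap would then subscribe to the assembled output instead of the raw scans,
     e.g. scan_cloud_topic:=assembled_cloud -->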

cheers,
Mathieu

Re: Lidar-based 3D SLAM in a sparse-feature environment

Gi_T
Hey Mathieu, thanks for the information about the ICP odometry settings for the 3D lidar. After updating the ICP parameters I was able to fly on the above-mentioned map without flyaways (for the most part). Still, there is quite a large Z drift while flying (around 0.5-1 m when exploring the whole map). As I'm pretty new to RTAB-Map and SLAM in general, parameter tuning proves quite difficult. Do you have any tips on how to best approach such a situation?

Current config: https://github.com/TannerGilbert/rtabmap_drone_example/blob/master/launch/slam.launch

Re: Lidar-based 3D SLAM in a sparse-feature environment

matlabbe
Administrator
Can you share a database? A 0.5-1 m offset seems quite large if you have a 360° 3D lidar.