Re: Lidar based 3d SLAM in sparse feature environment

Posted by matlabbe
URL: http://official-rtab-map-forum.206.s1.nabble.com/Lidar-based-3d-SLAM-in-sparse-feature-environment-tp10174p10288.html

First, I'm not quite sure what happens when a loop closure is detected. As far as I understand, /rtabmap/odom is not changed on loop closure. Here I mainly have two questions: 1) what topic/transform should be used to provide the PX4 with /mavros/vision_pose/pose?
In our rtabmap drone example, we created a node publishing vision_pose at 50 Hz from the latest available TF map->base_link. The loop closure correction is carried by the map->odom transform, so map->base_link already includes it, while /rtabmap/odom indeed does not change on loop closure.
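For reference, a minimal sketch of such a node, assuming ROS1/rospy, the frame names map and base_link, and geometry_msgs/PoseStamped on /mavros/vision_pose/pose (the node name and structure here are illustrative, not the exact code from the drone example):

#!/usr/bin/env python
import rospy
import tf2_ros
from geometry_msgs.msg import PoseStamped

def main():
    rospy.init_node('vision_pose_publisher')
    tf_buffer = tf2_ros.Buffer()
    tf2_ros.TransformListener(tf_buffer)
    pub = rospy.Publisher('/mavros/vision_pose/pose', PoseStamped, queue_size=10)
    rate = rospy.Rate(50)  # 50 Hz, as in the drone example
    while not rospy.is_shutdown():
        try:
            # Latest available map->base_link (already corrected by loop closures)
            t = tf_buffer.lookup_transform('map', 'base_link', rospy.Time(0))
        except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
                tf2_ros.ExtrapolationException):
            rate.sleep()
            continue
        msg = PoseStamped()
        msg.header.stamp = t.header.stamp
        msg.header.frame_id = 'map'
        msg.pose.position.x = t.transform.translation.x
        msg.pose.position.y = t.transform.translation.y
        msg.pose.position.z = t.transform.translation.z
        msg.pose.orientation = t.transform.rotation
        pub.publish(msg)
        rate.sleep()

if __name__ == '__main__':
    main()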


and 2) if I use RTAB-Map with the OctoMap integration for exploration, how can I take loop closure into account?
The OctoMap published by rtabmap is updated on loop closures. You can also use /octomap_global_frontier_space to know where the frontier is (unknown space next to empty space) for exploration (see https://www.youtube.com/watch?v=S_IlPVMYVkM for an example of what it looks like; the frontier is shown as cyan cubes).
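If you want to consume that topic programmatically, here is a hedged sketch that picks the nearest frontier cell as a naive goal. It assumes the topic is published as a sensor_msgs/PointCloud2 of frontier cell centers under the /rtabmap namespace (check with rostopic info on your setup):

import rospy
import sensor_msgs.point_cloud2 as pc2
from sensor_msgs.msg import PointCloud2

def frontier_cb(cloud):
    # Naive goal selection: take the frontier cell closest to the cloud's
    # frame origin; a real exploration planner would also check reachability.
    best, best_d2 = None, float('inf')
    for x, y, z in pc2.read_points(cloud, field_names=('x', 'y', 'z'), skip_nans=True):
        d2 = x * x + y * y + z * z
        if d2 < best_d2:
            best_d2, best = d2, (x, y, z)
    if best is not None:
        rospy.loginfo('Nearest frontier cell: (%.2f, %.2f, %.2f)', *best)

rospy.init_node('frontier_listener')
rospy.Subscriber('/rtabmap/octomap_global_frontier_space', PointCloud2, frontier_cb)
rospy.spin()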

Additionally, for the real data I noticed that the depth and RGB projection from my RealSense D455 is not perfectly aligned with the lidar data (Velodyne VLP-16). Do you have any suggestions for tuning that?
Make sure depth from the RealSense is aligned with RGB. To calibrate the extrinsics between the camera RGB frame and the lidar frame, you can do it manually or automatically with a lidar-camera calibration tool. To do it manually, I generally set an initial translation/rotation (measured roughly with a measuring tape or taken from CAD) between the RGB camera frame of the RealSense and the lidar frame, then refine that TF rotation by looking in RViz with the Camera display and adding the lidar point cloud as an overlay. This may not be as accurate as an automated tool, but it is pretty fast for a rough test.
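As a starting point for the manual approach, here is a hedged sketch that broadcasts a rough initial static TF. All numeric values are placeholders to be refined in RViz, and the frame names velodyne and camera_color_optical_frame are assumptions based on the default driver frames:

import rospy
import tf2_ros
import tf_conversions
from geometry_msgs.msg import TransformStamped

rospy.init_node('camera_lidar_static_tf')
br = tf2_ros.StaticTransformBroadcaster()
t = TransformStamped()
t.header.stamp = rospy.Time.now()
t.header.frame_id = 'velodyne'                   # lidar frame (assumption)
t.child_frame_id = 'camera_color_optical_frame'  # D455 RGB optical frame (assumption)
# Placeholder translation in meters, measured roughly with a tape
t.transform.translation.x = 0.10
t.transform.translation.y = 0.0
t.transform.translation.z = -0.05
# Optical frames are rotated w.r.t. lidar/body frames; placeholder RPY to refine
q = tf_conversions.transformations.quaternion_from_euler(-1.5708, 0.0, -1.5708)
t.transform.rotation.x, t.transform.rotation.y, \
    t.transform.rotation.z, t.transform.rotation.w = q
br.sendTransform(t)
rospy.spin()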


cheers,
Mathieu