Re: RGBD Outdoor Mapping - Offline Database Processing - Localization
Posted by matlabbe
URL: http://official-rtab-map-forum.206.s1.nabble.com/RGBD-Outdoor-Mapping-Offline-Database-Processing-Localization-tp5258p5283.html
Hi,
1. Does every keyframe feature stored in the map have depth information?
Yes.
2. Is the depth of the features used for estimating the pose of the camera through SfM?
It depends. The pose estimation is done by PnP, which is a 2D<->3D estimation. For localization, the depth is not required; however, for the map, the depth of the features must be known.
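For illustration, here is a minimal sketch of that 2D<->3D idea using OpenCV's solvePnPRansac (this is not RTAB-Map's actual code): the 3D points play the role of the map's features, the query frame only contributes 2D keypoints, so no depth is needed on the localization side. Intrinsics and point values are made up for the demo.

# Minimal PnP sketch: recover the camera pose from 2D-3D matches, no depth on the query side.
import numpy as np
import cv2

# Hypothetical 3D map features (in the map frame) matched to 2D keypoints of the query image.
object_points = np.array([[0.0, 0.0, 2.0],
                          [0.5, 0.0, 2.0],
                          [0.0, 0.5, 2.5],
                          [0.5, 0.5, 2.5],
                          [0.25, 0.25, 3.0],
                          [-0.3, 0.1, 2.2]], dtype=np.float32)
K = np.array([[525.0, 0.0, 320.0],
              [0.0, 525.0, 240.0],
              [0.0, 0.0, 1.0]])  # example pinhole intrinsics

# Project with a known ground-truth pose just to generate consistent 2D points for the demo.
rvec_gt = np.array([[0.0], [0.1], [0.0]])
tvec_gt = np.array([[0.1], [0.0], [0.2]])
image_points, _ = cv2.projectPoints(object_points, rvec_gt, tvec_gt, K, None)

# RANSAC PnP estimates the camera pose from the 2D-3D correspondences.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(object_points, image_points, K, None)
print(ok, rvec.ravel(), tvec.ravel())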
3. The point cloud should not be necessary when localizing through the RGB visual words approach. Is it true anyway that the point cloud can be exploited to refine the pose estimate with ICP?
The point cloud is not necessary for appearance-based localization. However, if a laser scan is available, it can be used to refine the transformation with ICP.
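As a rough sketch of what such a refinement can look like (using Open3D here, not RTAB-Map's internal ICP), the appearance-based pose serves as the initial guess and ICP refines it against the reference cloud. The point clouds below are random placeholders.

# ICP refinement sketch: start from the visual localization pose, refine with ICP.
import numpy as np
import open3d as o3d

source = o3d.geometry.PointCloud()  # current scan (e.g. converted from the laser scan)
target = o3d.geometry.PointCloud()  # reference cloud from the map
source.points = o3d.utility.Vector3dVector(np.random.rand(500, 3))
target.points = o3d.utility.Vector3dVector(np.random.rand(500, 3))

initial_guess = np.eye(4)  # pose from the appearance-based localization
result = o3d.pipelines.registration.registration_icp(
    source, target, 0.2, initial_guess,
    o3d.pipelines.registration.TransformationEstimationPointToPoint())
print(result.transformation)  # refined transform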
You will find more details in the paper "RTAB-Map as an Open-Source Lidar and Visual SLAM Library for Large-Scale and Long-Term Online Operation".
Can I use the appearance-based localization to locate the robot within a 2D occupancy map?
With rtabmap, yes: rtabmap keeps the 3D features even if you output a 2D occupancy grid.
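To relate a localized pose to the 2D occupancy grid, the usual nav_msgs/OccupancyGrid convention applies: subtract the grid origin and divide by the resolution. A small sketch, with assumed values:

# Convert a localized pose (x, y in the map frame) to occupancy grid cell indices,
# following the nav_msgs/OccupancyGrid convention (grid origin + resolution).
def pose_to_cell(x, y, origin_x, origin_y, resolution):
    col = int((x - origin_x) / resolution)
    row = int((y - origin_y) / resolution)
    return row, col

# Example: a 0.05 m/cell grid whose origin is at (-10, -10) in the map frame.
print(pose_to_cell(1.25, -0.40, -10.0, -10.0, 0.05))  # -> (192, 225)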
For localization, as rtabmap is updated at 1 Hz by default, you will get discrete jumps in the /map -> /odom transform. This follows the standard ROS frame convention, see REP 105. Localizations may not happen on every frame, so the robot pose may "jump" when a relocalization corrects the odometry drift.
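To watch those jumps, you can query the resulting robot pose in the map frame through TF, for example with a small rospy sketch (ROS 1 and default frame names assumed):

#!/usr/bin/env python
# Minimal sketch: look up the robot pose in the map frame via TF (map -> odom -> base_link).
# The /map -> /odom part is what rtabmap updates (and may jump) on each localization.
import rospy
import tf2_ros

rospy.init_node('map_pose_listener')
tf_buffer = tf2_ros.Buffer()
tf_listener = tf2_ros.TransformListener(tf_buffer)

rate = rospy.Rate(10)
while not rospy.is_shutdown():
    try:
        t = tf_buffer.lookup_transform('map', 'base_link', rospy.Time(0))
        p = t.transform.translation
        rospy.loginfo('robot in map frame: x=%.2f y=%.2f', p.x, p.y)
    except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
            tf2_ros.ExtrapolationException):
        pass
    rate.sleep()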
cheers,
Mathieu