Pure Visual Localization


seanxu112
Hello Mathieu,

It has been a while since I have played around with the robot. I am playing around with the parameters that you have suggested here: http://official-rtab-map-forum.206.s1.nabble.com/Trying-to-Understand-ICP-Registration-During-Localization-td7523.html#a8085

The map looks great, even though there are less local links and global links. However, during localization, the robot sometimes just drifts away. I am wondering during localization, if external odometry is used, how is rtabmap doing pose estimation? It feels like when t265 start drifting, my current setup is not doing any pose estimation based on the keypoints of the map. Should we keep using t265 as external odom, or should we use the built in VO from rtabmap to do F2M pose estimation? If I use t265 odom, does setting Odom/Holonomic to false help at all? It seems like after PnP the output pose has some big y direction translation for some neighbor link refinement.

Previously I have been relying mostly on ICP, so now I am not too sure about doing mostly visual.

EDIT: Are the Vis/ parameters of the rtabmap node responsible for the pose estimation? And how is local loop closure done? It seems like green loop closures are working well, but yellow loop closures shift the robot left and right.

I am currently uploading my db here; it is a bit big. I can see a few problems, such as the d435's motion blur getting a bit crazy. I also think Neighbor Link Refining is causing more drift compared to the t265 odometry, and thus the non-optimized map looks much better than the optimized one.

Sincerely,
Sean Xu

Re: Pure Visual Localization

matlabbe
Administrator
Hi,

Sorry for the late answer; the database is not available anymore. Anyway, in localization mode you should still provide odometry to the rtabmap node. If you used t265 odometry for mapping, continue to use it in localization mode.

Yellow loop closures are proximity detections; they can be disabled with RGBD/ProximityBySpace = false. Proximity detection uses the odometry to guess where the robot is now, then chooses the closest node in the map and tries to estimate a transformation to it (with the Vis/ parameters). If you increase Vis/MinInliers, only loop closures/proximity detections with a better transformation estimation would be accepted. The Odom/ parameters are not used if you feed external odometry to rtabmap.

You can also set Rtabmap/DetectionRate to 0 if you want to localize more often (do this only in localization mode).
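Put together, the parameters above could be passed like this (just a sketch assuming ROS1 rtabmap_ros with topic remappings omitted; Vis/MinInliers = 30 is only an example value, and Mem/IncrementalMemory = false is what enables localization mode):

```shell
# Sketch of a localization-mode launch with the parameters discussed above.
# Mem/IncrementalMemory=false -> localization mode (map is not modified)
# RGBD/ProximityBySpace=false -> disable yellow proximity detections
# Vis/MinInliers=30           -> accept only stronger transformation estimations
# Rtabmap/DetectionRate=0     -> process every frame, to localize more often
rosrun rtabmap_ros rtabmap \
    _Mem/IncrementalMemory:=false \
    _RGBD/ProximityBySpace:=false \
    _Vis/MinInliers:=30 \
    _Rtabmap/DetectionRate:=0
```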

cheers,
Mathieu

Re: Pure Visual Localization

seanxu112
Hi Mathieu,

I see. I have tried RGBD/ProximityBySpace = false and everything seems to work well, but when there is no loop closure for a long period of time, the robot pose starts to drift. This is expected; that is why we need loop closures.

Is the only solution then to make sure there are more loop closures? The database size scales linearly with time, so I am not sure if that is the best solution.

Sincerely,
Sean Xu

Re: Pure Visual Localization

matlabbe
Administrator
Hi Sean,

The database does increase linearly over time in SLAM mode. This is discussed in this paper (see Fig. 17 and 19; note that the Mem/ReduceGraph parameter is disabled by default).

To correct odometry drift, you need loop closure detections and/or external global localization priors (e.g., Vicon, OptiTrack, GPS).

cheers,
Mathieu