Parameters for offline robust RGBD odometry?


Parameters for offline robust RGBD odometry?

Timofejev
I could not find any information or hints regarding which parameters to use for offline mapping, i.e. when I want to build a map from pre-recorded Kinect footage. I understand that the current default parameters are tuned to be real-time capable on a wide range of hardware. I am mainly interested in the other extreme case: unlimited time, CPU and RAM resources, with the SLAM algorithm being as robust as possible.

I tried adding ICP to the visual odometry, increasing the number of features extracted per frame (up to 1500), and increasing the size of the local map (up to 15,000), but it did not help. The problem is that the odometry is often lost in places where there seem to be enough features (although they are all colored yellow in the odometry view). I am using GFTT+ORB features; could that be an issue?
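
For reference, here is roughly what I changed, written out as an RTAB-Map parameter map. This is only a sketch: mapping my GUI settings onto the parameter names Vis/MaxFeatures, OdomF2M/MaxSize and Reg/Strategy is my own assumption, so please correct me if I picked the wrong ones.

    // Sketch of the settings I have been experimenting with, expressed as an
    // RTAB-Map ParametersMap (a std::map of parameter name to value as strings).
    // In practice I set the same values through the standalone GUI preferences.
    #include <rtabmap/core/Parameters.h>

    const rtabmap::ParametersMap offlineOdomParams = {
        {"Vis/MaxFeatures",  "1500"},   // features extracted per frame
        {"OdomF2M/MaxSize",  "15000"},  // size of the local feature map
        {"Reg/Strategy",     "2"}       // 0=Visual, 1=ICP, 2=Visual+ICP (ICP added to visual odometry)
    };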

Re: Parameters for offline robust RGBD odometry?

matlabbe
Administrator
Hi,

Even with unlimited computing power, we are still limited to environments that are friendly to visual SLAM. A lot of features doesn't necessarily mean better quality; however, a lot of discriminative features will give better accuracy. Depth distortion or poor stereo calibration can also cause a bending effect in the overall map.

You may try setting Vis/EstimationType to 1 (3D->2D PnP estimation) to be more robust to noisy features.
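
For example, in code this could look like the following (a minimal sketch; the same parameter can also be changed in the standalone GUI preferences):

    // Minimal sketch: switch odometry motion estimation to 3D->2D PnP.
    #include <rtabmap/core/Parameters.h>

    // Vis/EstimationType: 0 = 3D->3D, 1 = 3D->2D (PnP), 2 = 2D->2D
    const rtabmap::ParametersMap pnpParams = {
        {"Vis/EstimationType", "1"}
    };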

cheers,
Mathieu

Re: Parameters for offline robust RGBD odometry?

Timofejev
Thank you Mathieu,
the 3D->2D PnP estimation did improve the results. I think the environment I have here should be SLAM- and odometry-friendly. Could you please take a look at this short clip and tell me whether something can be done to improve feature matching? https://giphy.com/gifs/3o7bufRxsLpk7HX1PG/html5

Re: Parameters for offline robust RGBD odometry?

matlabbe
Administrator
Hi,

Can you do the same with the depth image shown over the RGB image (right-click on the image view to enable it)?

Pipes can have a lot of visual features, but they are not discriminative. Black/reflective pipes may also not be friendly for the depth sensor, so many visual features may not have corresponding valid depth values. Do you have a small database to share? ICP could work better in this kind of environment (with a lot of geometry) if the depth sensor detects the pipes easily.
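
If you want to try the ICP path, something like the following could be a starting point (a rough sketch only; the Icp/MaxCorrespondenceDistance value is just illustrative and would need tuning for your sensor):

    // Rough sketch: visual registration refined by ICP, which can help in
    // geometry-rich scenes if the depth sensor sees the pipes well.
    #include <rtabmap/core/Parameters.h>

    const rtabmap::ParametersMap visIcpParams = {
        {"Reg/Strategy", "2"},                    // 0=Visual, 1=ICP, 2=Visual+ICP
        {"Vis/EstimationType", "1"},              // keep 3D->2D PnP for the visual part
        {"Icp/MaxCorrespondenceDistance", "0.1"}  // meters, illustrative value only
    };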

cheers,
Mathieu