Handling lost visual odometry when using sensor fusion (robot_localization)
Posted by BrettHemes on
URL: http://official-rtab-map-forum.206.s1.nabble.com/Handling-lost-visual-odometry-when-using-sensor-fusion-robot-localization-tp3725.html
Hi,
I am trying to integrate various additional sources of odometry into rtabmap and have been using the sensor_fusion.launch file as an example.
I have everything working in well-behaved scenarios but run into issues when the visual odometry fails. With the default mapping example, the null output from visual odometry results in a mapping pause with a red screen. With robot_localization, however, the null output (with BAD_VARIANCE) is consumed and filtered, producing non-null output into rtabmap. In that case I end up with some non-zero translational velocity from the filter, and the mapping spirals (literally) out of control.
My question is: is there a suggested/preferred way of handling "lost" odometry components in the sensor fusion case?
My current thought is to add a node that takes in the filtered odometry, checks the visual odometry for the lost condition, and then feeds rtabmap either a null odometry or the EKF output accordingly... but I am concerned that the EKF may still suffer from the null inputs with arbitrarily high variance. Should the variance perhaps be set to -1 instead of the current 9999 constant?
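For what it's worth, here is a minimal sketch of the gating logic I have in mind, stripped of the ROS publisher/subscriber plumbing. The function names are mine, and the 9999 threshold is an assumption based on the lost-odometry covariance constant mentioned above:

```python
# Hypothetical gating logic for the relay node described above.
# Assumption: a "lost" visual odometry message carries a very large
# value on its pose covariance diagonal (the 9999 constant), so any
# diagonal entry at or above that marks the lost condition.

LOST_COVARIANCE = 9999.0  # assumed marker value for lost odometry

def visual_odometry_lost(pose_covariance):
    """Return True if the 6x6 (row-major, 36-entry) pose covariance
    signals lost odometry via an arbitrarily large diagonal entry."""
    diagonal = pose_covariance[0::7]  # entries 0, 7, 14, ... 35
    return any(c >= LOST_COVARIANCE for c in diagonal)

def select_odometry(visual_covariance, ekf_odometry):
    """Forward the EKF output to rtabmap unless visual odometry is
    lost; in that case forward None (a null odometry) so rtabmap
    pauses instead of integrating the filter's residual velocity."""
    if visual_odometry_lost(visual_covariance):
        return None
    return ekf_odometry
```

In the actual node this would subscribe to both odometry topics and republish the selected message on the topic rtabmap listens to.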
I have some other ideas as well but wanted to check in here first to see whether others have experienced/solved this issue, or whether the functionality already exists and I am just missing some parameter settings.
Thanks,
Brett