Optimizing Azure Kinect outdoor autonomous mapping


Optimizing Azure Kinect outdoor autonomous mapping

chris.ferra
Hi Mathieu,

I'm reaching out for your expertise on a university project that uses RTAB-Map with an Azure Kinect to map crop fields under strong sunlight. Despite the known preference for stereo cameras for such tasks, budget constraints have led us to utilize the Kinect. We've adjusted the Kinect’s color camera settings via the Azure Kinect SDK to ensure sharp images even during rapid movements, such as shifting between crop rows.

However, we're encountering issues with RTAB-Map's odometry, which occasionally loses tracking. Through testing, we found that reducing the playback speed of a recorded ROS bag improves odometry stability: playback at 0.1× eliminates odometry loss, while rates above 0.4× introduce sporadic losses.

This leads me to suspect that our setup may be hitting computational limits, particularly since performance varies with the number of concurrently running programs. We are using an Intel NUC NUC13ANKi7 with 64 GB RAM and a 1 TB NVMe disk.

Given that we cannot add hardware accelerators, is there a way to monitor the computational load in real time and slow the robot down accordingly, before odometry is lost?
I considered using the "odom quality" score. However, it drops suddenly rather than degrading gradually, so by the time it signals a problem it is already too late for any preventive action.
Is there another way to make this work without a VIO approach, which you suggested previously but which my professors don't want me to try?
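To make the idea concrete, here is a rough sketch of the kind of speed governor we have in mind: smooth the raw odometry quality (e.g., the inlier count RTAB-Map reports) with an exponential moving average and scale the commanded speed down before the quality collapses. All thresholds and parameter values below are made-up placeholders, not tuned values.

```python
class SpeedGovernor:
    """Map a noisy odometry-quality signal to a speed scale factor.

    Hypothetical sketch: alpha, q_low and q_high are placeholder
    values that would need tuning on the real robot.
    """

    def __init__(self, alpha=0.3, q_low=50, q_high=200,
                 min_scale=0.2, max_scale=1.0):
        self.alpha = alpha          # EMA smoothing factor
        self.q_low = q_low          # quality at/below which we crawl
        self.q_high = q_high        # quality at/above which we go full speed
        self.min_scale = min_scale
        self.max_scale = max_scale
        self.ema = None             # smoothed quality estimate

    def update(self, quality):
        """Feed one raw quality sample; return a speed scale in [min_scale, max_scale]."""
        if self.ema is None:
            self.ema = float(quality)
        else:
            self.ema = self.alpha * quality + (1.0 - self.alpha) * self.ema
        # Linear interpolation between crawl speed and full speed,
        # clamped to [0, 1].
        t = (self.ema - self.q_low) / float(self.q_high - self.q_low)
        t = max(0.0, min(1.0, t))
        return self.min_scale + t * (self.max_scale - self.min_scale)
```

Each control cycle we would multiply the planner's commanded velocity by the returned scale, so the robot slows down smoothly as tracking quality degrades instead of waiting for a hard loss.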

Thank you in advance for your time. Any help will be greatly appreciated.

Christian

Re: Optimizing Azure Kinect outdoor autonomous mapping

matlabbe
Administrator
Hi Christian,

What kind of frame rate can you get from visual odometry? I don't know which image resolution you are using from the Kinect, but using a smaller resolution could help (e.g., `Odom/ImageDecimation=2`).
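For example, in a launch file for the `rgbd_odometry` node (the topic remappings below are placeholders for whatever your Azure Kinect driver publishes):

```xml
<!-- Sketch: halve the image size used by visual odometry.
     Topic names are placeholders for the Azure Kinect driver. -->
<node pkg="rtabmap_ros" type="rgbd_odometry" name="rgbd_odometry">
  <remap from="rgb/image"       to="/k4a/rgb/image_raw"/>
  <remap from="depth/image"     to="/k4a/depth_to_rgb/image_raw"/>
  <remap from="rgb/camera_info" to="/k4a/rgb/camera_info"/>
  <param name="Odom/ImageDecimation" value="2"/>
</node>
```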

If you have access to another kind of odometry that is locally accurate when visual odometry loses tracking, you could use its frame as `guess_frame_id` for the visual odometry node, then set `Odom/ResetCountdown` so VO resets on the latest pose. Use `odom_frame_id` for the rtabmap node and it won't lose TF when VO is lost (implicitly using your other odometry until VO can track again).
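Something like this launch fragment, assuming your other odometry publishes a TF frame called `wheel_odom` (the frame name is just an example):

```xml
<!-- Sketch: external odometry as motion guess and TF fallback. -->
<node pkg="rtabmap_ros" type="rgbd_odometry" name="rgbd_odometry">
  <param name="guess_frame_id"      value="wheel_odom"/>
  <!-- reset VO on the latest pose right after tracking is lost -->
  <param name="Odom/ResetCountdown" value="1"/>
</node>
<node pkg="rtabmap_ros" type="rtabmap" name="rtabmap">
  <!-- TF chain stays valid through wheel_odom while VO is lost -->
  <param name="odom_frame_id" value="wheel_odom"/>
</node>
```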

cheers,
Mathieu