Hi Mathieu,
I'm reaching out for your expertise on a university project that uses RTAB-Map with an Azure Kinect to map crop fields under strong sunlight. We know stereo cameras are usually preferred for this kind of task, but budget constraints led us to use the Kinect. We have adjusted the Kinect's color camera settings through the Azure Kinect SDK so that images stay sharp even during rapid movements, such as shifting between crop rows.
However, we are running into issues with RTAB-Map's odometry, which occasionally loses tracking. Through testing we found that reducing the playback rate of a recorded ROS bag improves odometry stability: at a rate of 0.1 odometry is never lost, while rates above 0.4 introduce sporadic losses.
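For reference, we slow the playback with the standard rate option (the bag file name here is just a placeholder; ros2 bag play has an equivalent --rate flag):

    rosbag play --rate 0.1 field_test.bag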
This leads me to suspect that our setup is hitting computational limits, especially since performance varies with the number of concurrently running programs. We are using an Intel NUC NUC13ANKi7 with 64 GB of RAM and a 1 TB NVMe disk.
Since we cannot add hardware accelerators, is there a way to monitor the computational load in real time and throttle the robot's speed accordingly, so that odometry loss is prevented before it happens?
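To make the idea concrete, here is a rough sketch of the kind of supervisor I was imagining (Python with psutil and rospy; the topic names, thresholds, and scaling rule are made up, not something we have tested):

    import psutil
    import rospy
    from geometry_msgs.msg import Twist

    # Hypothetical supervisor: republish velocity commands, scaled down
    # when CPU load gets high so visual odometry keeps enough headroom.
    CPU_SOFT_LIMIT = 70.0  # percent; start slowing down above this (a guess)
    CPU_HARD_LIMIT = 90.0  # percent; stop the robot above this (a guess)

    def scale_for_load(load):
        # Map CPU load (percent) to a velocity scale in [0, 1].
        if load <= CPU_SOFT_LIMIT:
            return 1.0
        if load >= CPU_HARD_LIMIT:
            return 0.0
        return 1.0 - (load - CPU_SOFT_LIMIT) / (CPU_HARD_LIMIT - CPU_SOFT_LIMIT)

    def on_cmd(msg):
        load = psutil.cpu_percent(interval=None)  # percent since last call
        s = scale_for_load(load)
        out = Twist()
        out.linear.x = msg.linear.x * s
        out.angular.z = msg.angular.z * s
        pub.publish(out)

    rospy.init_node("load_aware_throttle")
    pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
    rospy.Subscriber("cmd_vel_raw", Twist, on_cmd)  # planner would publish here
    rospy.spin()

Would something along these lines be sensible, or is CPU load too indirect a signal for when odometry is about to fail?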
I did consider using the "odom quality" score, but it drops suddenly rather than decreasing gradually, which suggests the system only detects a significant odometry problem when it is already too late for preventive action.
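For completeness, this is roughly how I was polling it (assuming the rtabmap_ros/OdomInfo message on the odometry node's odom_info topic; I am going from memory on the topic and field names, so please correct me if they are off):

    import rospy
    from rtabmap_ros.msg import OdomInfo

    def on_odom_info(msg):
        # 'inliers' is the feature-inlier count behind the quality display;
        # in our runs it stays high and then collapses within a frame or two,
        # so thresholding it gives almost no warning before tracking is lost.
        rospy.loginfo("quality (inliers): %d, lost: %s", msg.inliers, msg.lost)

    rospy.init_node("odom_quality_watch")
    rospy.Subscriber("/rtabmap/odom_info", OdomInfo, on_odom_info)
    rospy.spin()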
Is there another way to make this work without a VIO approach, which you suggested previously but my professors don't want me to try?
Thank you in advance for your time. Any help will be greatly appreciated.
Christian