rtabmap_drone_example in-depth understanding.


rtabmap_drone_example in-depth understanding.

MidKnight
Hey @matlabbe, greetings!

I want to completely understand rtabmap_drone_example's end-to-end pipeline and internal workings. Is there in-depth documentation that I can refer to?

1. Have you added an external source of odometry to RTAB-Map? Or is RTAB-Map's odometry (i.e. RGB-D odometry) reliable enough?
2. Is there a flow sensor being used with PX4 for better odometry?
3. Is the Seeed Studio Jetson Nano adequate for performing autonomous navigation as seen in chapter 2 of the demo video? If not, what are your suggestions on a) better hardware, or b) better performance on the same board with the same setup? Thanks for your response to my previous post; I need some clarification on what your recommendation meant. Did you mean to suggest those VIO methods instead of RTAB-Map in localisation mode, or do they work within RTAB-Map?
4. Is there a ROS 2 port available? I plan to use ROS 2 Humble with Docker on the mentioned board.


Please let me know about any resources you might be aware of for gaining more information on this.
I have referred to some online, including this link.

Regards,
MidKnight.

Re: rtabmap_drone_example in-depth understanding.

matlabbe
Administrator
Hi MidKnight,

There is no in-depth documentation for the drone example. The rtabmap part is relatively simple though: rtabmap subscribes to the IMU and image topics, then provides the map->base_link TF, which is converted to a "vision pose" published at 50 Hz for the PX4 controller.
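
For illustration only (this is not the exact node from the example), a minimal rospy sketch of that TF-to-vision-pose bridge could look like the following, assuming the standard MAVROS /mavros/vision_pose/pose topic and the usual map/base_link frame names:

#!/usr/bin/env python
# Minimal sketch of a map->base_link TF to MAVROS vision pose bridge.
# Not the actual node from rtabmap_drone_example; topic and frame names are the usual defaults.
import rospy
import tf2_ros
from geometry_msgs.msg import PoseStamped

rospy.init_node('vision_pose_bridge')
tf_buffer = tf2_ros.Buffer()
tf_listener = tf2_ros.TransformListener(tf_buffer)
pose_pub = rospy.Publisher('/mavros/vision_pose/pose', PoseStamped, queue_size=10)

rate = rospy.Rate(50)  # PX4 expects the external vision pose at a steady, fairly high rate
while not rospy.is_shutdown():
    try:
        t = tf_buffer.lookup_transform('map', 'base_link', rospy.Time(0))
        msg = PoseStamped()
        msg.header.stamp = t.header.stamp
        msg.header.frame_id = 'map'
        msg.pose.position.x = t.transform.translation.x
        msg.pose.position.y = t.transform.translation.y
        msg.pose.position.z = t.transform.translation.z
        msg.pose.orientation = t.transform.rotation
        pose_pub.publish(msg)
    except (tf2_ros.LookupException, tf2_ros.ConnectivityException, tf2_ros.ExtrapolationException):
        pass  # map->base_link not available yet (e.g., before the first image is processed)
    rate.sleep()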


1. Have you added an external source of odometry to RTAB-Map? Or is RTAB-Map's odometry (i.e. RGB-D odometry) reliable enough?
In that example, we are using the default visual odometry from rtabmap. You can however provide your own odometry if rtabmap's is not fast enough on your onboard computer.
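For example, with rtabmap's generic rtabmap.launch you would pass something like visual_odometry:=false odom_topic:=/your_vio/odom (argument names may differ in the drone example's own launch file, so adapt accordingly).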

2. Is there a flow sensor being used with PX4 for better odometry?
We don't use one in this example, so I am not sure how it could be integrated.

3. Is the Seeed Studio Jetson Nano adequate for performing autonomous navigation as seen in chapter 2 of the demo video? If not, what are your suggestions on a) better hardware, or b) better performance on the same board with the same setup? Thanks for your response to my previous post; I need some clarification on what your recommendation meant. Did you mean to suggest those VIO methods instead of RTAB-Map in localisation mode, or do they work within RTAB-Map?
Are you referring to this board? When it comes to small computers, the bottleneck is often the acquisition of images from the camera, which can use a lot of CPU. I haven't experimented with that specific board, so I cannot confidently tell you it will work. It can also depend on which camera you are using: is rectification / depth computed onboard the camera, or does it have to be computed on the Nano? The former makes it easier to integrate. Also, don't stream 1080p or 720p; it would be a waste of resources and would make visual odometry run slower. Use 360p or lower. When using other VIO approaches, you would only remove the rgbd_odometry or stereo_odometry node from the rtabmap launch, but keep the rtabmap node for mapping and localization.
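On the resolution point: with the RealSense ROS 2 driver, for example, that just means picking a low profile such as rgb_camera.profile:=640x360x30 (and a matching depth profile) instead of the default; exact parameter names depend on the driver version.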

4. Is there a ROS 2 port available? I plan to use ROS 2 Humble with Docker on the mentioned board.
Yup, see the ros2 branch: https://github.com/introlab/rtabmap_ros/tree/ros2 . For Docker usage with ROS 2, see also https://github.com/introlab/rtabmap_ros/tree/ros2/docker#readme

cheers,
Mathieu

Re: rtabmap_drone_example in-depth understanding.

MidKnight
Hey Mathieu,

Thanks for your response.
The camera I am using is an Intel RealSense D435, and I plan to take IMU data from the FCU as done in the rtabmap_drone_example implementation. I use the camera in RGB-D mode, with the launch arguments specified in the rtabmap_ros example launch file realsense_d435i_color.launch.py: ros2 launch realsense2_camera rs_launch.py align_depth.enable:=true enable_sync:=true rgb_camera.profile:="640x360x30"
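I then start rtabmap with the matching example launch, i.e. something like: ros2 launch rtabmap_examples realsense_d435i_color.launch.py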

My fourth question from the previous post regarding the ROS 2 port was about a ROS 2 port for rtabmap_drone_example, not a ROS 2 port for rtabmap_ros; I think we miscommunicated there.

Re: rtabmap_drone_example in-depth understanding.

matlabbe
Administrator
With the D435i, I recommend this example instead, if you don't care about color and care more about accuracy: https://github.com/introlab/rtabmap_ros/blob/ros2/rtabmap_examples/launch/realsense_d435i_infra.launch.py
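It can be launched the usual way, e.g. ros2 launch rtabmap_examples realsense_d435i_infra.launch.py (check the launch file for the camera settings it assumes, such as the infra streams and emitter configuration).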

There is no ROS 2 port of the drone example. I would need to try it someday...

Re: rtabmap_drone_example in-depth understanding.

MidKnight
Hey Mathieu,
I tried out the infra mode, but I found that the map being formed was not accurate to the environment. I did observe the odometry drifting less than in RGB-D mode, though.
In RGB-D mode, on the other hand, the map looks more accurate, but the odometry seems to get lost easily.

Are those observations expected from the infra and RGB-D modes? Is it possible to get both a very accurate map and very accurate odometry at the same time?

Thanks!
Regards,
MidKnight

Re: rtabmap_drone_example in-depth understanding.

matlabbe
Administrator
Hi MidKnight,

Yes, those observations are right. We cannot have the best of both worlds at the same time with that camera.

cheers,
Mathieu

Re: rtabmap_drone_example in-depth understanding.

MidKnight
Hey Mathieu,

That's interesting to know, thanks!
What system (computer + sensors) have you tested RTAB-Map on and found it to work excellently?

If you were to recommend an ideal system for using RTAB-Map for SLAM and autonomous navigation of a drone (like the rtabmap_drone_example implementation), which computer (or system specifications) and sensors would you recommend?

Regards,
MidKnight.

Re: rtabmap_drone_example in-depth understanding.

matlabbe
Administrator

No matter the computer you have, I've found that if odometry (visual or lidar) can run at least at 10 Hz, it is good enough in most cases. It is always possible to downsample the sensor data to go faster, but accuracy may be worse (though in many cases using downsampled data doesn't really affect the pose accuracy).
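With rtabmap's own visual odometry, the usual knobs for that are a lower camera resolution at the source or rtabmap's image decimation parameters (e.g. Odom/ImageDecimation); for lidar, downsampling the point cloud (voxel filtering) before scan matching.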

For the sensors, the more FOV you have, the better the pose accuracy will be. For cameras, look for a large field of view (>100 deg). For lidar, 360 degrees is great.

For cameras, also look for a global shutter and hardware sync between the left and right sensors. Make sure the calibration is as good as it can be! Having a camera with an IR projector can help to "see" textureless surfaces like white walls, but beware that the IR pattern should not be seen by the visual odometry.

cheers,
Mathieu