Hi everyone!
Is there any example of how to use VINS-Fusion as external odometry with RTAB-Map on ROS2 Humble? Or has somebody managed to run it successfully and could provide the launch file?

Best,
Dominik
Administrator
Hi,

Are you referring to these packages?

https://github.com/zinuok/VINS-Fusion-ROS2
https://github.com/JanekDev/VINS-Fusion-ROS2-humble-arm

I don't think I have seen anyone trying rtabmap with one of these.

cheers,
Mathieu
Thanks for the reply!
Exactly. I managed to run these repos on the EuRoC MAV dataset, but I would like to run it in combination with RTAB-Map and my Intel RealSense D435i. Any advice on how to get this done?

Best,
Dominik
Administrator
Hi Dominik,
Can you show which commands you used to test on EuRoC? Can you also show the available topics (ros2 topic list) and the tf tree (ros2 run tf2_tools view_frames) while it is running?

cheers,
Mathieu

EDIT: I could somewhat make it work inside rtabmap by compiling vins and adding it as a dependency to rtabmap. I used that repo, fixed all compilation errors that are referred to in the issues of the repo (disabled cuda/GPU stuff, fixed ros::Duration::from_seconds(0)). Then, to export it as a library, I added SHARED to the vins_lib target and added this at the end of the same CMakeLists.txt:

ament_export_dependencies(rclcpp rcpputils std_msgs visualization_msgs geometry_msgs nav_msgs tf2 tf2_ros cv_bridge camera_models image_transport)
ament_export_include_directories(include)
ament_export_targets(${PROJECT_NAME}) # To include downstream with targets
ament_export_libraries(vins_lib)      # To include downstream without targets
install(TARGETS vins_lib
  EXPORT ${PROJECT_NAME}
  ARCHIVE DESTINATION lib
  LIBRARY DESTINATION lib
  RUNTIME DESTINATION bin
)
install(DIRECTORY src/
  DESTINATION include
  FILES_MATCHING PATTERN "*.h" PATTERN "*.hpp"
)
ament_package()

I also had to add "set_target_properties(camera_models PROPERTIES POSITION_INDEPENDENT_CODE ON)" in the CMakeLists.txt of the camera_models package (for a -fPIC error when linking inside rtabmap).

Build vins (colcon), source the setup.bash of the ros workspace, then rebuild rtabmap with:

cmake -DWITH_VINS=ON

We can then select VINS-Fusion under the odometry parameters (with vins config/realsense_d435i/realsense_stereo_imu_config.yaml and the image resolution in the respective files set to 848x480 instead of 640x480). While it seems to do something, the estimation with my RealSense D435i was very poor. I only tested the stereo-only mode, because with IMU the pose drifts a lot more. From memory, I thought VINS-Fusion was working better than this; maybe there are some differences with the ros2 implementation.
I'll check later if I can retest on a ros1 docker.

EDIT2: tried under docker/noetic, no success with the live camera.
Hi Mathieu,
First of all, thanks for your support. I used the following repo, although I think the specific choice of repository is not critical in this case. I have disabled CUDA and also resolved the ros::Duration::from_seconds(0) issue. The node is launched using the command:

ros2 run vins vins_node ~/ros2_ws/src/VINS-Fusion-ROS2-humble/config/euroc/euroc_stereo_imu_config.yaml

The context is the deployment of RTAB-Map in an industrial logistics environment, where a D435i is mounted on a smart conveyed good. The primary objective is to reconstruct the trajectory of the object during operation (generation of a point cloud is nice to have). However, challenges arise due to the presence of external sensors positioned along the transport path. The sensors/rods are located in close proximity to the camera and frequently obstruct its field of view, leading to a reduction in detectable keypoints. Consequently, the VO occasionally fails. To address this, I considered using VINS-Fusion, as it relies more on IMU data and may thus offer improved robustness during brief interruptions.

I would be interested to know whether you see potential for an RTAB-Map-based setup that could handle such short-term VO failures or a temporary lack of visual features, and that works with ros2 humble.

Best,
Dominik
Administrator
Hi Dominik,
RTAB-Map's VO alone won't handle a temporarily blocked camera well while moving. I don't have much experience with VINS-Fusion in particular, but with ARKit (iOS) you can keep a relatively accurate pose estimate while the camera is temporarily blocked, though not for too long (<1 sec). Another option seen on mobile robots is to use wheel odometry (or some other local motion estimator) as a guess frame for RTAB-Map's VO: when VO cannot be computed, TF is still published relative to wheel odometry during that time, and VO can then be restarted from where the camera should have been based on the wheel odometry. You can also look at the OpenVINS project, which may be better maintained for ROS2.

cheers,
Mathieu
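To make the guess-frame fallback concrete, here is a toy sketch in plain Python (not rtabmap code; in rtabmap_ros this behavior is wired through the odometry node's guess frame parameter rather than written by hand). The 1-D poses and measurement values are made up for illustration:

```python
# Toy illustration of the guess-frame fallback: while VO is lost, the
# published pose keeps following the wheel-odometry delta, so VO can
# later restart from where the robot should be.
def fuse_pose(prev_pose, vo_pose, wheel_delta):
    """Return the pose to publish.

    prev_pose:   last published x position (1-D for simplicity)
    vo_pose:     VO estimate, or None when visual tracking is lost
    wheel_delta: displacement measured by wheel odometry since last update
    """
    if vo_pose is not None:
        return vo_pose                  # VO healthy: trust it
    return prev_pose + wheel_delta      # VO lost: dead-reckon on wheels

pose = 0.0
# (vo_estimate, wheel_delta) pairs; None means the camera was blocked
measurements = [(0.5, 0.5), (None, 0.4), (None, 0.5), (1.9, 0.1)]
for vo, wheel in measurements:
    pose = fuse_pose(pose, vo, wheel)
print(pose)  # 1.9: VO re-acquired close to the dead-reckoned position
```

The point of the sketch is only the branching logic: the pose stream never stops while the camera is blocked, which is what lets VO re-initialize afterwards.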
Hi Mathieu,
thanks for the answer. I have one question regarding this. In theory I have two D435i cameras on my prototype, one facing to the front and one facing to the side (shifted 90 degrees, not sharing the same field of view). Is it possible to fuse both cameras for VO? In cases where the front camera loses VO because of too few keypoints, the other camera would see enough keypoints. Or do both VOs constantly need to track enough keypoints, otherwise it will not work with rtabmap?

Best,
Dominik
Administrator
RTAB-Map's VO can use both cameras at the same time (so if one camera is blocked and some features can still be tracked in the other one, VO won't get lost). There is an example here. Another recommended requirement for multi-camera is to build rtabmap with OpenGV support.
Another caveat: this assumes that the cameras are hardware-synchronized. If not, it may still work, though not optimally.

cheers,
Mathieu
Hi Mathieu,
thanks, I tried this example as well, but only the topic of one camera is given to the VO node. What do I need to change to use both camera streams for the VO? Or am I understanding that wrong?

Best,
Dominik
Administrator
Oh well, just realized that example didn't use two cameras in rgbd_odometry. To use both cameras in VO, you would need to do:
# RGB-D odometry
rgbd_odometry_node = launch_ros.actions.Node(
    package='rtabmap_odom',
    executable='rgbd_odometry',
    output="screen",
    parameters=[{
        "frame_id": 'base_link',
        "odom_frame_id": 'odom',
        "rgbd_cameras": 2,
        "publish_tf": True,
        "approx_sync": True,
        "subscribe_rgbd": True,
    }],
    remappings=[
        ("rgbd_image0", '/realsense_camera1/rgbd_image'),
        ("rgbd_image1", '/realsense_camera2/rgbd_image'),
        ("odom", 'odom')],
    arguments=["--delete_db_on_start", ''],
    prefix='',
    namespace='rtabmap'
)

# SLAM
slam_node = launch_ros.actions.Node(
    package='rtabmap_slam',
    executable='rtabmap',
    output="screen",
    parameters=[{
        "subscribe_sensor_data": True,
        "subscribe_odom_info": True,
        "frame_id": 'base_link',
        "map_frame_id": 'map',
        "publish_tf": True,
        "database_path": '~/.ros/rtabmap.db',
        "approx_sync": False,
        "Mem/IncrementalMemory": "true",
        "Mem/InitWMWithAllNodes": "true"
    }],
    remappings=[
        ("sensor_data", 'odom_sensor_data/raw'),
        ("odom", 'odom')],
    arguments=["--delete_db_on_start"],
    prefix='',
    namespace='rtabmap'
)

I also modified the rtabmap node to get the sensor_data output from rgbd_odometry (which already contains the synchronized images and visual features, and also makes sure that the odometry topic is perfectly synced with the same data used by VO). You can also instantiate a rtabmap_viz node with pretty much the same parameters/remappings as the rtabmap node.

cheers,
Mathieu
Thank you Mathieu, I really appreciate your effort!
Following your advice in this post, I would like to use stereo odometry with my two D435i's, but that shouldn't be a problem and works the same way, right? Last question on this: regarding your last paragraph about what you did to the VO/RTAB-Map nodes with the sensor topic, could you share those nodes by any chance, so I can get a grasp of what exactly you have done? I'm new to all of this and it could save me a lot of time. (I'm using ros2 humble.)

Best,
Dominik
Administrator
Hi Dominik,
To show nodes and topics like in my screenshot, you can use rqt_graph:

$ ros2 run rqt_graph rqt_graph

The same plugin can be opened in rqt directly too (along with other useful tools):

$ rqt
Plugins->Introspection->Node Graph

Other plugins I use often:

# Show TF tree (generates a pdf)
$ ros2 run tf2_tools view_frames

# Show all ros logs and/or filter them
$ ros2 run rqt_console rqt_console

# Show some data over time (e.g., odometry values)
$ ros2 run rqt_plot rqt_plot \
    /odom/pose/pose/position/x \
    /odom/pose/pose/position/y \
    /odom/pose/pose/orientation/z

# Diagnostics (when nodes support it; rtabmap publishes some of them about expected input and output frequencies)
$ ros2 run rqt_runtime_monitor rqt_runtime_monitor

# Re-configuring nodes (particularly useful when tuning the nav2 stack, for example; note that for rtabmap you would use rtabmap_viz instead)
$ ros2 run rqt_reconfigure rqt_reconfigure

# Just looking at an image topic without rviz
$ ros2 run rqt_image_view rqt_image_view

If you want to use stereo mode instead, you can use stereo_sync nodes with stereo_odometry. stereo_odometry also has "subscribe_rgbd" and "rgbd_cameras" parameters to connect the output of stereo_sync. You can do `ros2 node info` on a stereo_sync node to see the topics you would need to remap. Note that the D435i's IR+Depth mode would be similar to stereo mode, but the depth is already computed on the camera.

cheers,
Mathieu
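For the two-camera stereo setup, the wiring would look much like the rgbd_odometry example earlier in the thread. Below is a minimal sketch of the stereo_odometry configuration as plain Python data (not a full launch file); the topic names (/stereo_camera1/..., /stereo_camera2/...) are assumptions for illustration and should be replaced with whatever your stereo_sync nodes actually publish (check with `ros2 node info`):

```python
# Hypothetical sketch: parameters/remappings you would pass to a
# launch_ros Node running rtabmap_odom's stereo_odometry, consuming
# the synchronized output of two stereo_sync nodes.
stereo_odometry_parameters = {
    "frame_id": "base_link",
    "subscribe_rgbd": True,   # take stereo_sync's combined output
    "rgbd_cameras": 2,        # one entry per stereo_sync instance
    "approx_sync": True,      # the two D435i's are not hardware-synced
}
stereo_odometry_remappings = [
    # one synchronized topic per camera pair (names are assumed)
    ("rgbd_image0", "/stereo_camera1/rgbd_image"),
    ("rgbd_image1", "/stereo_camera2/rgbd_image"),
]
print(stereo_odometry_parameters["rgbd_cameras"])
```

In an actual launch file these two structures would go into the `parameters` and `remappings` arguments of a launch_ros.actions.Node, the same way as in the rgbd_odometry example above.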