Using RTAB with Manipulator

Using RTAB with Manipulator

KasaIvas
Hello, I want to estimate the real pose of the robot's end-effector via SLAM.

So far I have managed to attach a depth camera to the wrist_link of my robot (not to the tool_link = end-effector) and create an octomap directly in MoveIt. However, from reading around I understood that it is not possible to do SLAM directly in MoveIt, which is how I came across this ROS package, RTAB-Map.

I then created a custom launch file that first includes the launch file starting Gazebo+RViz+MoveIt and then starts the rtabmap_slam and rtabmap_viz nodes with a delay of 20 seconds (to give Gazebo and MoveIt time to come up). When I start everything I get several problems, many concerning the TF tree:

[rtabmap-11] [ WARN] MsgConversion.cpp:2005::getTransform() (getting transform odom -> base_link) Lookup would require extrapolation into the past. Requested time 373.503000 but the earliest data is at time 373.530000, when looking up transform from frame [base_link] to frame [odom] (wait_for_transform=0.200000)

[rtabmap-11] [ WARN] MsgConversion.cpp:2005::getTransform() (getting transform odom -> base_link) Could not find a connection between 'odom' and 'base_link' because they are not part of the same tree.Tf has two or more unconnected trees. (wait_for_transform=0.200000)


I have several doubts that I can't resolve:
1) Which frames should I use for SLAM? Does it make sense to create a new odometry frame, or can I use base_link? Can I estimate the pose of the end-effector with this? (Or of the camera, in which case I would have to apply a transformation.)
2) Which frame should I use for viewing in rtabmap_viz? Do I have to create a new one (for example "map") or use an existing one (for example "world")? Also, do I need to remap the topics and set "'subscribe_depth': True," or not?

_______________________________________________________________
This is my node launch:

rtabmap_node = Node(
        package='rtabmap_slam',
        executable='rtabmap',
        name='rtabmap',
        output='screen',
        parameters=[{
            'use_sim_time': True,
            'frame_id': 'base_link',
            'odom_frame_id': 'odom',
            'subscribe_depth': True,
            'subscribe_rgbd': False,
            'subscribe_scan': False,
            'approx_sync': True,
            'sync_queue_size': 10,
        }],
        arguments=['-d'],
        remappings=[
            ('rgb/image', '/gazebo_depth_camera/image_raw'),
            ('depth/image', '/gazebo_depth_camera/depth/image_raw'),
            ('rgb/camera_info', '/gazebo_depth_camera/camera_info'),
        ]
    )

rtabmap_viz_node = Node(
        package='rtabmap_viz',
        executable='rtabmap_viz',
        name='rtabmap_viz',
        output='screen',
        parameters=[{
            'use_sim_time': True,
            'frame_id': 'map',
            'odom_frame_id': 'odom',

            #'subscribe_depth': True,   # I don't know if these need to be set or not
            #'subscribe_rgbd': False,
        }]
        # remappings=[
        #     ('rgb/image', '/gazebo_depth_camera/image_raw'),
        #     ('depth/image', '/gazebo_depth_camera/depth/image_raw'),
        #     ('rgb/camera_info', '/gazebo_depth_camera/camera_info'),
        # ]
    )
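
The two node definitions above are assembled into the LaunchDescription with the 20-second delay roughly like this (simplified sketch; the Gazebo+MoveIt package and launch file names here are placeholders, not my real ones):

# Simplified sketch of the top-level launch file; rtabmap_node and
# rtabmap_viz_node are the Node definitions shown above.
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription, TimerAction
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch.substitutions import PathJoinSubstitution
from launch_ros.substitutions import FindPackageShare


def generate_launch_description():
    # Starts Gazebo + RViz + MoveIt (placeholder package/launch file names).
    moveit_gazebo = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(PathJoinSubstitution([
            FindPackageShare('my_robot_moveit_config'),
            'launch', 'gazebo_moveit.launch.py'])))

    # Give Gazebo and MoveIt 20 seconds to come up before starting RTAB-Map.
    delayed_rtabmap = TimerAction(period=20.0,
                                  actions=[rtabmap_node, rtabmap_viz_node])

    return LaunchDescription([moveit_gazebo, delayed_rtabmap])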
_______________________________________________________________

This is my tf tree (robot + camera), where the "world" is a virtual link and the planning frame in MoveIt:





Re: Using RTAB with Manipulator

matlabbe
Administrator

A couple of questions:

* What does world -> base_link mean? Is it always identity?
* Is your robot arm's base moving? Note that if the arm's base is not moving, using the octomap package directly may already just work.

Here are two scenarios:

A) The arm's base is fixed on a table. In this case, you would run rtabmap exactly like you would run octomap_server. You can run rtabmap with odom_frame_id set to base_link and frame_id set to wrist_link. Note that if world->base_link is fixed and identity, you may also use the world frame for odom_frame_id. You can also remove your node publishing world->base_link and set map_frame_id to world instead (rtabmap will then publish world->base_link). Then add these parameters to rtabmap:
# Values should all be of "string" type
'Kp/MaxFeatures': '-1',           # Disable visual loop closure detection
'RGBD/ProximityBySpace': 'false', # Disable proximity detection
'Grid/CellSize': '0.05',          # Adjust octomap's minimum voxel size
'Grid/3D': 'true',
#'Grid/RayTracing': 'true',       # Uncomment if you want octomap to clear dynamic obstacles
'RGBD/LinearUpdate': '0.1',       # Adjust minimum motion to add new nodes to the map
'RGBD/AngularUpdate': '0.1',      # Adjust minimum motion to add new nodes to the map
'Rtabmap/DetectionRate': '1',     # Adjust at which rate new nodes are added to the map

The /rtabmap/octomap topic will be generated. rtabmap will publish the map -> (base_link or world) TF, though it will stay identity the whole time.
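
Put together, the rtabmap node for this scenario could look roughly like this (a sketch only: the frame names and camera topics are taken from your launch file above; adjust the values to your setup):

from launch_ros.actions import Node

rtabmap_node = Node(
        package='rtabmap_slam',
        executable='rtabmap',
        name='rtabmap',
        output='screen',
        parameters=[{
            'use_sim_time': True,
            'frame_id': 'wrist_link',        # camera is attached to the wrist
            'odom_frame_id': 'base_link',    # fixed base acts as the "odometry" frame
            'map_frame_id': 'world',         # only if the static world->base_link publisher is removed
            'subscribe_depth': True,
            'approx_sync': True,
            # RTAB-Map's own parameters are all strings:
            'Kp/MaxFeatures': '-1',
            'RGBD/ProximityBySpace': 'false',
            'Grid/CellSize': '0.05',
            'Grid/3D': 'true',
            'RGBD/LinearUpdate': '0.1',
            'RGBD/AngularUpdate': '0.1',
            'Rtabmap/DetectionRate': '1',
        }],
        arguments=['-d'],                    # delete the previous database on start
        remappings=[
            ('rgb/image', '/gazebo_depth_camera/image_raw'),
            ('depth/image', '/gazebo_depth_camera/depth/image_raw'),
            ('rgb/camera_info', '/gazebo_depth_camera/camera_info'),
        ]
    )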

B) The arm is on a mobile platform. In this case, you will need to get rid of the world frame; it will be replaced by the map frame. You would also need an odometry node, either wheel odometry from your mobile platform or odometry computed from the camera on the wrist of the robot. To compute odometry from the camera, you could try the rgbd_odometry node. For the rtabmap node, the config could stay relatively the same as what you have right now. The odom frame will be published by the odometry node. Note that if world->base_link is moving (i.e., you kind of already have odometry computed), you can set odom_frame_id to world in your original example.
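
To give an idea, a minimal sketch of such a visual-odometry node, reusing the camera topics from your launch file (the parameters would need tuning for a real setup, and the pose of the camera relative to base_link must be available on TF):

from launch_ros.actions import Node

rgbd_odometry_node = Node(
        package='rtabmap_odom',
        executable='rgbd_odometry',
        name='rgbd_odometry',
        output='screen',
        parameters=[{
            'use_sim_time': True,
            'frame_id': 'base_link',     # frame whose odometry is published (odom -> base_link)
            'odom_frame_id': 'odom',     # TF frame published by this node
            'approx_sync': True,
        }],
        remappings=[
            ('rgb/image', '/gazebo_depth_camera/image_raw'),
            ('depth/image', '/gazebo_depth_camera/depth/image_raw'),
            ('rgb/camera_info', '/gazebo_depth_camera/camera_info'),
        ]
    )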

For rtabmap_viz, you don't need to set subscribe_depth, unless you want to see high frame rate feedback of the current camera data. The frame_id should be the same as the frame_id set for the rtabmap node, and the same goes for odom_frame_id (though it is optional for rtabmap_viz).

Re: Using RTAB with Manipulator

KasaIvas
Hi,
"base_link" is a fixed base and the transformation "world"->"base_link" is an identity, there is no movement between the two. So far I have been using "Fake Odometry" by setting the frame_id and odom_frame_id equal to base_link. I have added the frame "map" as "map_frame_id" and the static transformation "map->world".

Below is the launch file "slam_rtab.launch.py" and the new tf tree (sorry for the comments in Italian):

slam_rtab.py

frames_2025-05-09_CON_MAP_page-0001.jpg

I don't think this setup is useful for my purpose (estimating the pose of the end-effector = tool_link and controlling it based on the pose estimation error), but I'm also still fighting various warnings:
1) [move_group]: Message Filter dropping message: frame 'depth_camera_optical_link' at time 370.825 for reason 'discarding message because the queue is full'
2) [rtabmap_viz-13] [ WARN] (2025-05-09 15:47:28.326) MainWindow.cpp:1976::processStats() Processing time (2.017118s) is over detection rate (1.000000s), real-time problem!
3) [rtabmap]: rtabmap (60): Rate=1.00s, Limit=0.000s, Conversion=0.0005s, RTAB-Map=2.2644s, Maps update=0.0001s pub=0.0018s delay=1.3160s (local map=1, WM=1)
4) If I set "'RGBD/LinearUpdate': '0.0'" and "'RGBD/AngularUpdate': '0.0'" I inexplicably get "[ WARN] (2025-05-12 17:11:46.634) SensorData.cpp:762::uncompressDataConst() Requested laser scan data, but the sensor data (182) doesn't have laser scan."

Log of the RTAB-Map part only + planning execution in MoveIt (before this, MoveIt and Gazebo start): log_RTAB.txt
_______________________________

1) What changes if I put "wrist_link" or "tool_link" as "frame_id"? Where can I find the pose estimate for these frames?

2) Are these the octomap topics you mean?
/octomap_binary
/octomap_empty_space
/octomap_full
/octomap_global_frontier_space
/octomap_grid
/octomap_ground
/octomap_obstacles
/octomap_occupied_space

I also saw that I can use https://github.com/Notou/Moveit-External-Octomap-Updater to bring the RTAB-Map octomap into MoveIt instead of using its native one.

One last piece of information: the camera I'm using is Gazebo's (libgazebo_ros_camera.so) and its update_rate is set to 10 Hz.

_________________________________________________
UPDATE 16-05
- I think I eliminated the "..queue is full" warnings by changing the QoS values: 'qos_image': 2, 'qos_camera_info': 2. In fact "depth_camera_controller" had "Reliability: RELIABLE" and "rtabmap" had "Reliability: BEST_EFFORT".
- I also changed the following parameters to lighten the simulation: 'Mem/ImagePreDecimation': '4', 'Grid/RayTracing': 'false', 'Grid/CellSize': '0.1'.
- Finally, I decreased the detection rate with 'Rtabmap/DetectionRate': '0.3', i.e., an update every 3.33 s. I get "[INFO] [1747316578.407095878] [rtabmap]: rtabmap (1): Rate=3.33s, Limit=0.000s, Conversion=0.0086s, RTAB-Map=0.3603s, Maps update=0.0003s pub=0.0001s delay=0.2880s (local map=1, WM=1)" and I don't get the "... real-time problem!" warning anymore.
- I set 'RGBD/LinearUpdate': '0.0' and 'RGBD/AngularUpdate': '0.0'; I think I'll ignore the laser scan warnings for now. (All the changed parameters are collected in the snippet after this list.)
- The only warning I still get is "[move_group-7] [INFO] [1747316592.865220175] [move_group]: Message Filter dropping message: frame 'depth_camera_optical_link' at time 374.105 for reason 'the timestamp on the message is earlier than all the data in the transform cache'", but I only get this at startup and not during the simulation. [Note: I do NOT get this warning if I do NOT start RTAB-Map].
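
Collected in one place, the changes above look like this in the rtabmap node's parameters (same values as listed; RTAB-Map's own parameters are strings, the qos_* ones plain integers):

rtabmap_update_params = {
    'qos_image': 2,                  # QoS overrides that fixed the "queue is full" warnings
    'qos_camera_info': 2,
    'Mem/ImagePreDecimation': '4',   # lighten image processing
    'Grid/RayTracing': 'false',
    'Grid/CellSize': '0.1',
    'Rtabmap/DetectionRate': '0.3',  # one map update every ~3.33 s
    'RGBD/LinearUpdate': '0.0',      # update even when the camera is not moving
    'RGBD/AngularUpdate': '0.0',
}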

Re: Using RTAB with Manipulator

matlabbe
Administrator
You can use odom_frame_id = base_link, but then you need to use RGBD/LinearUpdate=0 to update the map, like you did. However, it will then update all the time; if you want to update the map only when the end effector is moving, set frame_id to wrist_link and set RGBD/LinearUpdate to some minimum motion.

The biggest concern I have seeing the log is:
[rtabmap-12] [INFO] [1746798276.786549815] [rtabmap]: rtabmap (1): Rate=1.00s, Limit=0.000s, Conversion=0.0175s, RTAB-Map=2.9141s, Maps update=0.0003s pub=0.0001s delay=1.4180s (local map=1, WM=1)
It seems that just creating a single node is already slower than 1 Hz. Can you share the database even if there is only a single node? I would like to see the details of the processing time to find out where it is so slow.

For this warning:
If I set "'RGBD/LinearUpdate': '0.0'" and "'RGBD/AngularUpdate': '0.0'" I inexplicably get "[ WARN] (2025-05-12 17:11:46.634) SensorData.cpp:762::uncompressDataConst() Requested laser scan data, but the sensor data (182) doesn't have laser scan."
You have set Reg/Strategy=1, which assumes there is a lidar. In your setup you are not using a lidar or a PointCloud2 input. You could set it back to Reg/Strategy=0 or remove it from your config.

1) What changes if I put "wrist_link" or "tool_link" as "frame_id"? Where can I find the pose estimate for these frames?
Use TF to recover the pose. In that example, I assume that the transforms from base_link to all links on the arm are already updated by the robot controller (forward kinematics).
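
For example, a minimal sketch of reading that pose back from TF in Python (frame names follow this thread; the rest is just an example node):

import rclpy
from rclpy.node import Node
from tf2_ros import TransformException
from tf2_ros.buffer import Buffer
from tf2_ros.transform_listener import TransformListener


class ToolPosePrinter(Node):
    def __init__(self):
        super().__init__('tool_pose_printer')
        self.buffer = Buffer()
        self.listener = TransformListener(self.buffer, self)
        self.timer = self.create_timer(1.0, self.lookup)

    def lookup(self):
        try:
            # Pose of tool_link expressed in the map frame (latest available).
            t = self.buffer.lookup_transform('map', 'tool_link', rclpy.time.Time())
            p = t.transform.translation
            self.get_logger().info(f'tool_link in map: x={p.x:.3f} y={p.y:.3f} z={p.z:.3f}')
        except TransformException as e:
            self.get_logger().warn(f'TF not available yet: {e}')


def main():
    rclpy.init()
    rclpy.spin(ToolPosePrinter())


if __name__ == '__main__':
    main()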

2) Are these the octomap topics you mean?
yes

I am not sure what the Moveit-External-Octomap-Updater package actually does, but yes, if it can provide an external octomap subscriber to MoveIt, rtabmap's octomap output topic could be connected to it.

For your updates, "RTAB-Map=0.3603s" looks more reasonable. Ray tracing is indeed expensive to use with a small Grid/CellSize, if that was the cause of the high processing time.

- The only warning I still get is "[move_group-7] [INFO] [1747316592.865220175] [move_group]: Message Filter dropping message: frame 'depth_camera_optical_link' at time 374.105 for reason 'the timestamp on the message is earlier than all the data in the transform cache'", but I only get this at startup and not during the simulation. [Note: I do NOT get this warning if I do NOT start RTAB-Map].
It sounds like a lack of processing resources when rtabmap is running, which may starve other parts of the system, causing the warning.

Re: Using RTAB with Manipulator

KasaIvas
Hi, thanks for the reply.

I changed 'Reg/Strategy' from 1 to 0, but instead of the warning "[ WARN] (2025-05-12 17:11:46.634) SensorData.cpp:762::uncompressDataConst() Requested laser scan data, but the sensor data (182) doesn't have laser scan." I now get this other one: "[ WARN] (2025-05-18 10:19:42.983) Memory.cpp:3200::computeTransform() Missing visual features or missing raw data to compute them. Transform cannot be estimated."

rtabmap_slam parameters updated: rtabmap_slam_parameters.txt

However, with a detection rate of 0.3 Hz it seems that the map cannot be built in real time. Let me explain better: when the robot is stopped in pose "A", the environment in front of the camera is captured well and the map is built well, but when I start a plan to move the robot from "A" to "B", the environment captured along the path is not built. Once it has stopped in pose "B", the environment captured by the camera in that position is again built well.

If I increase it to 1 Hz then, in addition to the warning mentioned above, I also get again "[move_group]: Message Filter dropping message: frame 'depth_camera_optical_link' at time 370.825 for reason 'discarding message because the queue is full'"

Finally, since I am creating an operational-space controller based on the robot's pose estimation error, which takes the SLAM estimates of the end-effector's pose as input, if I use TF the error would be zero or almost zero. Is there a way to obtain the SLAM estimate itself? Or is that not possible in this configuration?


What do you mean by "Can you share the database even if there is only a single node? I would like to see the details of the processing time to find out where it is so slow."? Is it the rtabmap.db file that is saved in "home/user/.ros/"?

Re: Using RTAB with Manipulator

matlabbe
Administrator
The warning may be coming from RGBD/NeighborLinkRefining; set it to false.

but when I start a plan to move the robot from "A" to "B", the environment captured along the path is not built. Once it has stopped in pose "B", the environment captured by the camera in that position is again built well.
At 0.3 Hz, if the trajectory is completed in under 3 seconds, there could be no map update during the motion.


Finally, since I am creating an operational-space controller based on the robot's pose estimation error, which takes the SLAM estimates of the end-effector's pose as input, if I use TF the error would be zero or almost zero. Is there a way to obtain the SLAM estimate itself? Or is that not possible in this configuration?
In the current setup, the localization part of SLAM is disabled; it does only the "M" (mapping) part, with the pose input coming from TF, i.e., the position output of your robot arm's controller. As I said before, that setup is exactly as if you used octomap_server directly. I don't recommend doing SLAM on the end effector (even if it is possible), as it will always be less accurate (and can drift) compared to using the robot arm's controller output directly (which is faster, more accurate and never drifts), unless it is used to track the base position of the arm, though in your case the base is fixed.

Is it the rtabmap.db file that is saved in "home/user/.ros/"?
Yes



Re: Using RTAB with Manipulator

KasaIvas
Hi,
- The warning "...Missing visual features or missing raw..." no longer appears if I set RGBD/NeighborLinkRefining to false.

- I increased the detection rate back to 1 Hz, as the processing time seems to be much lower than before. However, I still don't get real-time updates of the map, so it's probably down to the speed at which the trajectory is executed.

- This is the rtabmap.db file https://www.dropbox.com/scl/fi/heaquxezb9opvush31za0/rtabmap.db?rlkey=5navbuk2663rakeoxk0inc750&st=agdtqiea&dl=0

This is my Gazebo world:


- What do you mean by "SLAM on the end-effector"? Using "tool_link" (or wrist_link) as frame_id? Or are there other settings to change as well, considering my current setup (like odom_frame_id or other parameters)?