ROS Online RTAB-Map in existing C++ Project (no launch file)


ROS Online RTAB-Map in existing C++ Project (no launch file)

brude
My setup is a mobile robot with an NVIDIA Jetson TX1 running Ubuntu 14.04 with ROS Indigo and the ZED SDK 1.0.0. I want to run an online/live RTAB-Map from my already existing C++ project.

In more detail, I would like to start my program with a class that handles the ZED camera: it takes in, processes, and filters images before sending them to RTAB-Map. A separate odometry source would broadcast position and transform information to match up with the ZED images. I would like to incrementally build a map from this data while also performing loop closing, and make the results (map, occupancy grid, etc.) available to the robot and viewable from an external laptop.

I've looked at some example tutorials, including the C++ RGB-D Mapping and the C++ Loop Closure Detection examples, but haven't really been able to figure out how to do this properly. My hope was to run everything through ROS: my project would publish data to topics, and RTAB-Map would run on its own, consuming that data to incrementally build the map and detect loop closures while publishing map data back out (exactly like the ROS octomap_server).

Is this a possibility? Do I have to run RTAB-Map in my own thread and call all the individual functions (init, process, etc.) myself? Will RTAB-Map work like an octomap_server? What steps do I have to go through to get RTAB-Map working in the already existing C++ project without the use of a launch file?
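(For context, RTAB-Map can indeed run as a standalone ROS node much like octomap_server: your project publishes synchronized RGB, depth, camera_info, and odometry, and the rtabmap node subscribes to them and publishes the map back out. A minimal sketch of such a node configuration follows; the ZED topic names are assumptions, not something confirmed in this thread.)

```xml
<launch>
  <node name="rtabmap" pkg="rtabmap_ros" type="rtabmap" output="screen">
    <param name="frame_id"        value="base_link"/>
    <param name="subscribe_depth" value="true"/>

    <!-- Topics your own C++ node is assumed to publish -->
    <remap from="rgb/image"       to="/zed/rgb/image_rect_color"/>
    <remap from="depth/image"     to="/zed/depth/depth_registered"/>
    <remap from="rgb/camera_info" to="/zed/rgb/camera_info"/>
    <remap from="odom"            to="/odom"/>
  </node>
</launch>
```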

Thanks in advance!

Re: ROS Online RTAB-Map in existing C++ Project (no launch file)

matlabbe
Administrator

Re: ROS Online RTAB-Map in existing C++ Project (no launch file)

brude
Hello again,

I followed your advice and now run a launch file through a bash script in a separate process. I finally have it working and updating the map. However, I am running into a strange issue.
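(For anyone following along, the bash-script approach can look roughly like this; the workspace path and launch-file name are assumptions, adjust them to your setup.)

```bash
#!/bin/bash
# start_rtabmap.sh -- spawned from the C++ program in a separate process,
# e.g. via system("./start_rtabmap.sh &") or fork()/exec().
source /opt/ros/indigo/setup.bash
source ~/catkin_ws/devel/setup.bash   # hypothetical workspace path
roslaunch rtabmap_ros rgbd_mapping.launch rtabmap_args:="--delete_db_on_start"
```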

I am using a ZED camera and updating RTAB-Map with the RGB image and depth map (like the RGB-D Mapping example with external odometry through tf). When I do this, I get correct tf transforms, but the depth data is projected incorrectly. When I view the projected octomap_cloud, the depth points appear above the robot when they should be populating in front of it.

I currently have a tf tree that goes map->odom_data->base_link->camera, with a static publisher for base_link->camera and an origin transform for map->odom_data. I update the odom_data->base_link transform each iteration with the new odometry data.

I've tried changing all of these transformations, but nothing changes the fact that the depth image registers above the camera instead of in front of it.

Any thoughts on why this is happening or how to adjust this?

Thanks for all your help! It's much appreciated!

---------------
UPDATE:

I found my error: I wasn't setting the correct frame_id on the RGB and depth images. It's solved now!
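(For future readers, the fix amounts to stamping both images with the camera's tf frame before publishing them. A sketch of the relevant roscpp fragment; the publisher variables and the "camera" frame name are assumptions based on the tf tree described above:)

```cpp
// Both the RGB and depth messages must carry the camera's tf frame,
// and their timestamps should match so RTAB-Map can synchronize them.
sensor_msgs::Image rgb_msg, depth_msg;
ros::Time stamp = ros::Time::now();

rgb_msg.header.stamp    = stamp;
rgb_msg.header.frame_id = "camera";   // must match the base_link->camera tf frame

depth_msg.header.stamp    = stamp;
depth_msg.header.frame_id = "camera"; // registered depth shares the RGB frame

rgb_pub.publish(rgb_msg);             // hypothetical publishers from your node
depth_pub.publish(depth_msg);
```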