Hello. I am currently trying to get RTAB-Map to work on a Jetson Xavier. I tried to create a map, but the processing was too heavy for it to be of much use. Is there anything I can do to improve it? Or is it wrong to run it on this machine in the first place?
It depends on what you want to map and what you want to visualize (note that you can opt out of the 3D point cloud in rtabmapviz->Preferences->3D Rendering to save visualization time). rtabmapviz is also not required; you can launch rviz instead and subscribe only to the topics you want.
If you have external odometry (wheel encoders), maybe fused with an IMU, and you are mapping in 2D with a 2D lidar, rtabmap can run on a RPi3 (and will actually run faster, with less memory, than gmapping). For RGB-D SLAM, we have examples of rtabmap running on a RPi3, but at a lower rate (slow).
On smaller computers, to do 6DoF mapping, it is preferable to use external odometry (like a fast VIO approach, wheel encoders/IMU, or even the "computation-free" VIO from a T265) rather than rtabmap's own odometry, which may use 100% of one CPU core. You can also do the visualization offboard on another computer.
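As a sketch, this is roughly how rtabmap can be started with external odometry instead of its built-in visual odometry. The argument names assume the stock rtabmap_ros rtabmap.launch file and typical RealSense topic names; verify both against your installed launch file and your camera driver before use.

```shell
# Run rtabmap with external odometry (wheel/VIO) instead of rtabmap's
# visual odometry, and without the onboard rtabmapviz GUI.
# Topic names below are examples; adapt them to your robot.
roslaunch rtabmap_ros rtabmap.launch \
    visual_odometry:=false \
    odom_topic:=/odom \
    rgb_topic:=/camera/color/image_raw \
    depth_topic:=/camera/aligned_depth_to_color/image_raw \
    camera_info_topic:=/camera/color/camera_info \
    rtabmapviz:=false
```

With rtabmapviz disabled, you can run rviz on a remote machine and add only the displays you need (e.g., the 2D grid map instead of the dense cloud).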
If I understand correctly, rtabmap is a graph-based SLAM, which is an offline SLAM that uses graphs, right?
Is the theory behind rtabmap similar to general graph-based SLAM? Also, is it possible to see the source code of the part that computes the SLAM somewhere? I would like to know how it works.
RTAB-Map is a graph-SLAM approach, like Google Cartographer, SLAM Toolbox, ORB_SLAM...
It can be used offline or online, depending on the application. We have CLI tools to process a batch of data offline. With ROS, you can do online SLAM. In this paper, we did continuous online SLAM for 8 hours. By interpolating the results and not saving all the debugging data, the robot could have done online SLAM for 800 hours (33 days) with a 100 GB hard drive. With graph reduction enabled, it could have done continuous online SLAM for over 130 days. It is really a matter of how RTAB-Map is configured and which sensors are used. By default, it keeps all data for debugging purposes, which makes it easier for a new user to get started.
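As a sketch, graph reduction is controlled by an RTAB-Map parameter that can be passed on the command line. I am assuming the parameter name `Mem/ReducedGraph` here; you can list every parameter and its description yourself with `rtabmap --params`.

```shell
# Start rtabmap with graph reduction enabled (assumed parameter name:
# Mem/ReducedGraph). Extra RTAB-Map parameters are forwarded through
# the rtabmap_args argument of the stock launch file.
roslaunch rtabmap_ros rtabmap.launch \
    rtabmap_args:="--delete_db_on_start --Mem/ReducedGraph true"

# To browse all available parameters and their defaults:
rtabmap --params
```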
Graph-SLAM can be split into 3 main modules: odometry, loop closure detection and graph optimization. The rtabmap node does the latter two. See Figure 1 of this paper:
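To make the "graph optimization" module concrete, here is a toy least-squares pose-graph example in 1D. This is not RTAB-Map's actual code (RTAB-Map delegates optimization to external libraries such as g2o, GTSAM or Ceres); it only illustrates how a loop-closure constraint redistributes odometry drift over the graph.

```python
# Toy 1-D pose-graph optimization by gradient descent on the
# sum of squared edge residuals. Pose 0 is anchored at the origin.
def optimize_1d_pose_graph(n_poses, edges, iters=500, lr=0.1):
    """edges: list of (i, j, z) meaning 'pose j minus pose i should be z'."""
    x = [0.0] * n_poses
    for _ in range(iters):
        grad = [0.0] * n_poses
        for i, j, z in edges:
            r = (x[j] - x[i]) - z  # residual of this constraint
            grad[j] += r
            grad[i] -= r
        for k in range(1, n_poses):  # pose 0 stays fixed
            x[k] -= lr * grad[k]
    return x

# Two odometry edges each claim the robot advanced +1.0, but a
# loop-closure edge claims pose 2 is only 1.8 away from pose 0.
# Optimization spreads the 0.2 of drift across the trajectory.
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.8)]
poses = optimize_1d_pose_graph(3, edges)
print([round(p, 3) for p in poses])  # -> [0.0, 0.933, 1.867]
```

In a real system the poses are 2D/3D transforms and the solver is Gauss-Newton or Levenberg-Marquardt rather than plain gradient descent, but the structure (nodes = poses, edges = odometry and loop-closure constraints) is the same.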
1) I did this in a real environment (a long corridor); with lidar SLAM I get a nice straight line, but with RTAB-Map I get noise and jaggedness. I think this is natural, since it is harder for a camera to generate a map, but can it be improved by adjusting some parameters?
According to the explanation above, am I correct in assuming that rtabmap does not compute odometry itself?
Does it just receive wheel odometry, etc., and use it?