Hi,
For robots, I strongly recommend using the ROS version. With the Windows version, you cannot get live outputs of the pose and the map. On Windows, you could at least test the stereo camera (it seems to be UVC compatible). Standalone IMUs are not supported by the standalone RTAB-Map, only under ROS.
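For example, under ROS the pose and the map are published as topics that any of your nodes can subscribe to. Here is a minimal rospy sketch of that; the topic names (/rtabmap/grid_map, /rtabmap/localization_pose) are the usual defaults but may differ on your setup, so check rostopic list.

#!/usr/bin/env python
# Minimal sketch: consume the live map and pose published by rtabmap under ROS.
# Topic names are assumptions (default rtabmap namespace); adapt to your setup.
import rospy
from nav_msgs.msg import OccupancyGrid
from geometry_msgs.msg import PoseWithCovarianceStamped

def map_callback(msg):
    rospy.loginfo('Map update: %dx%d cells at %.2f m/cell',
                  msg.info.width, msg.info.height, msg.info.resolution)

def pose_callback(msg):
    p = msg.pose.pose.position
    rospy.loginfo('Current pose: x=%.2f y=%.2f z=%.2f', p.x, p.y, p.z)

if __name__ == '__main__':
    rospy.init_node('live_outputs_example')
    rospy.Subscriber('/rtabmap/grid_map', OccupancyGrid, map_callback)
    rospy.Subscriber('/rtabmap/localization_pose', PoseWithCovarianceStamped, pose_callback)
    rospy.spin()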
The time-of-flight sensor you referred to is a single-beam lidar; it cannot be used for SLAM, but it could be used for obstacle avoidance.
For the whole setup, having wheel odometry will improve robustness. If the robot is picked up, vision or lidar can also have issues if the sensors are obstructed. Detecting when the robot has been picked up could help you know that the current odometry is unreliable until the robot is put back down on the ground (maybe also resetting the localization); see the sketch below. Ideally, you would do the mapping first, without any external events that could move the robot without it being aware of it, so that you get a nice map. Then switch to localization mode; in that mode you could pick up and drop the robot anywhere (the area has already been mapped) and visual loop closure detection would be able to relocalize it on the map.
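As an illustration only, here is a rough rospy sketch of a pick-up monitor: it watches the IMU's vertical acceleration and, once the robot appears to be back on the ground, resets the visual odometry. The threshold and the /imu/data and /rtabmap/reset_odom names are assumptions that you would have to adapt to your robot.

#!/usr/bin/env python
# Hypothetical sketch: flag odometry as unreliable while the robot is lifted,
# then reset it once the robot is back on the ground. Topic and service names
# are assumptions; the detection heuristic is deliberately crude (tune it).
import rospy
from sensor_msgs.msg import Imu
from std_srvs.srv import Empty

GRAVITY = 9.81          # m/s^2
LIFT_THRESHOLD = 3.0    # deviation from gravity considered "picked up" (tune this)

class PickupMonitor(object):
    def __init__(self):
        self.picked_up = False
        rospy.Subscriber('/imu/data', Imu, self.imu_callback)

    def imu_callback(self, msg):
        # Crude heuristic: vertical acceleration far from gravity => robot lifted.
        deviation = abs(msg.linear_acceleration.z - GRAVITY)
        if deviation > LIFT_THRESHOLD and not self.picked_up:
            self.picked_up = True
            rospy.logwarn('Robot picked up: odometry is unreliable from now on.')
        elif deviation < 0.5 and self.picked_up:
            self.picked_up = False
            rospy.loginfo('Robot back on the ground, resetting odometry.')
            try:
                rospy.wait_for_service('/rtabmap/reset_odom', timeout=2.0)
                rospy.ServiceProxy('/rtabmap/reset_odom', Empty)()
            except (rospy.ROSException, rospy.ServiceException) as e:
                rospy.logerr('Could not reset odometry: %s', e)

if __name__ == '__main__':
    rospy.init_node('pickup_monitor')
    PickupMonitor()
    rospy.spin()

For the localization mode itself, the usual way is to set the Mem/IncrementalMemory parameter to false when restarting rtabmap on the existing database, so the map is not modified anymore.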
You may not need a lidar if you don't care about very accurate localization.
Have fun with your project!
Mathieu