Hello!
My team and I are attempting to create a 2D occupancy grid with our ZED stereo camera, with the goal of achieving obstacle avoidance in the near future. Our current robot setup consists of an NVIDIA TX1 and the ZED camera. We are fairly new to ROS, so we are unsure how to adapt the launch files and tutorials (this one in particular: http://wiki.ros.org/rtabmap_ros/Tutorials/StereoOutdoorMapping) to our ZED camera. We have successfully run demo_stereo_outdoor.launch with the provided rosbag file, but when we run demo_stereo_outdoor.launch on its own, we get no video output from our ZED in rviz/rtabmapviz. However, rtabmap.launch runs successfully after we adjust the topic names and namespace.

When we run the ZED wrapper for ROS with the namespace set to camera, the following topics are published:

/camera/depth/camera_info
/camera/depth/depth_registered
/camera/depth/depth_registered/compressed
/camera/depth/depth_registered/compressed/parameter_descriptions
/camera/depth/depth_registered/compressed/parameter_updates
/camera/depth/depth_registered/compressedDepth
/camera/depth/depth_registered/compressedDepth/parameter_descriptions
/camera/depth/depth_registered/compressedDepth/parameter_updates
/camera/depth/depth_registered/theora
/camera/depth/depth_registered/theora/parameter_descriptions
/camera/depth/depth_registered/theora/parameter_updates
/camera/joint_states
/camera/left/camera_info
/camera/left/camera_info_raw
/camera/left/image_raw_color
/camera/left/image_raw_color/compressed
/camera/left/image_raw_color/compressed/parameter_descriptions
/camera/left/image_raw_color/compressed/parameter_updates
/camera/left/image_raw_color/compressedDepth
/camera/left/image_raw_color/compressedDepth/parameter_descriptions
/camera/left/image_raw_color/compressedDepth/parameter_updates
/camera/left/image_rect_color
/camera/left/image_rect_color/compressed
/camera/left/image_rect_color/compressed/parameter_descriptions
/camera/left/image_rect_color/compressed/parameter_updates
/camera/left/image_rect_color/compressedDepth
/camera/left/image_rect_color/compressedDepth/parameter_descriptions
/camera/left/image_rect_color/compressedDepth/parameter_updates
/camera/left/image_rect_color/theora
/camera/left/image_rect_color/theora/parameter_descriptions
/camera/left/image_rect_color/theora/parameter_updates
/camera/odom
/camera/point_cloud/cloud_registered
/camera/rgb/camera_info
/camera/rgb/camera_info_raw
/camera/rgb/image_raw_color
/camera/rgb/image_raw_color/compressed
/camera/rgb/image_raw_color/compressed/parameter_descriptions
/camera/rgb/image_raw_color/compressed/parameter_updates
/camera/rgb/image_raw_color/compressedDepth
/camera/rgb/image_raw_color/compressedDepth/parameter_descriptions
/camera/rgb/image_raw_color/compressedDepth/parameter_updates
/camera/rgb/image_raw_color/theora
/camera/rgb/image_raw_color/theora/parameter_descriptions
/camera/rgb/image_raw_color/theora/parameter_updates
/camera/rgb/image_rect_color
/camera/rgb/image_rect_color/compressed
/camera/rgb/image_rect_color/compressed/parameter_descriptions
/camera/rgb/image_rect_color/compressed/parameter_updates
/camera/rgb/image_rect_color/compressedDepth
/camera/rgb/image_rect_color/compressedDepth/parameter_descriptions
/camera/rgb/image_rect_color/compressedDepth/parameter_updates
/camera/rgb/image_rect_color/theora
/camera/rgb/image_rect_color/theora/parameter_descriptions
/camera/rgb/image_rect_color/theora/parameter_updates
/camera/right/camera_info
/camera/right/camera_info_raw
/camera/right/image_raw_color
/camera/right/image_raw_color/compressed
/camera/right/image_raw_color/compressed/parameter_descriptions
/camera/right/image_raw_color/compressed/parameter_updates
/camera/right/image_raw_color/compressedDepth
/camera/right/image_raw_color/compressedDepth/parameter_descriptions
/camera/right/image_raw_color/compressedDepth/parameter_updates
/camera/right/image_raw_color/theora
/camera/right/image_raw_color/theora/parameter_descriptions
/camera/right/image_raw_color/theora/parameter_updates
/camera/right/image_rect_color
/camera/right/image_rect_color/compressed
/camera/right/image_rect_color/compressed/parameter_descriptions
/camera/right/image_rect_color/compressed/parameter_updates
/camera/right/image_rect_color/compressedDepth
/camera/right/image_rect_color/compressedDepth/parameter_descriptions
/camera/right/image_rect_color/compressedDepth/parameter_updates
/camera/right/image_rect_color/theora
/camera/right/image_rect_color/theora/parameter_descriptions
/camera/right/image_rect_color/theora/parameter_updates
/camera/zed_wrapper_node/parameter_descriptions
/camera/zed_wrapper_node/parameter_updates
/rosout
/rosout_agg
/tf
/tf_static

It appears the ZED is missing the throttled topics that the Bumblebee camera in the tutorial has. Additionally, my team is not sure whether we should be pursuing these launch files at all if we just want to save 2D occupancy grids from a live video feed. Can somebody provide advice on how to proceed with creating 2D occupancy grids? Unfortunately, I am not sure about our team's navigation stack, as the Vision Subteam currently operates independently of the rest of the software. If anybody could provide insight, or even better, launch files with ZED-specific code, that would be great! Thank you!
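Since the post asks for launch files with ZED-specific code, here is a minimal sketch of such a launch file that feeds the topics listed above into rtabmap.launch. The arg names used (rtabmap_args, rgb_topic, depth_topic, camera_info_topic, visual_odometry, odom_topic, frame_id) are assumptions based on a typical rtabmap_ros rtabmap.launch and can differ between releases, so check them against the file installed on the TX1; the ZED wrapper is assumed to already be running and publishing the /camera/... topics above.

<!-- zed_rtabmap.launch: minimal sketch, arg names to be verified against the
     installed rtabmap_ros version. -->
<launch>
  <include file="$(find rtabmap_ros)/launch/rtabmap.launch">
    <!-- start a fresh map on each run -->
    <arg name="rtabmap_args"      value="--delete_db_on_start"/>
    <!-- feed rtabmap the ZED's rectified color image and registered depth -->
    <arg name="rgb_topic"         value="/camera/rgb/image_rect_color"/>
    <arg name="depth_topic"       value="/camera/depth/depth_registered"/>
    <arg name="camera_info_topic" value="/camera/rgb/camera_info"/>
    <!-- reuse the ZED wrapper's odometry instead of rtabmap's visual odometry -->
    <arg name="visual_odometry"   value="false"/>
    <arg name="odom_topic"        value="/camera/odom"/>
    <!-- base frame of the robot (see the reply below about frame_id) -->
    <arg name="frame_id"          value="base_link"/>
  </include>
</launch>

Using the ZED's registered depth this way gives rtabmap an RGB-D input, so it does not have to recompute disparity from the left/right pair; a pure stereo configuration using the rectified left/right topics is also possible if the installed rtabmap.launch exposes stereo arguments.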
Administrator
Hi,
Following this tutorial for the ZED, you may just subscribe to /rtabmap/grid_map to get the 2D map. Make sure to set frame_id to the base frame of the robot in rtabmap.launch so that the 3D map is correctly aligned with the ground (and the 2D grid therefore makes sense when projected onto the ground). The stereo outdoor demo is very specific to the corresponding rosbag. You don't have to throttle the topics on the real system (unless you want to stream the images over a network); we only did that to reduce the size of the rosbag. With the latest version the default parameters are fine too, so rtabmap.launch can be used directly.
cheers,
Mathieu
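One point worth illustrating from the reply above: for frame_id to be the base frame of the robot, TF needs a transform from that base frame to the camera frame published by the ZED wrapper. A minimal sketch for a fixed mount is shown below; the frame name zed_camera_center and the offset values are placeholders for the real camera frame and mounting position.

<!-- Placeholder static transform: x y z yaw pitch roll, base frame to camera
     frame, republished every 100 ms. Replace the offsets and
     "zed_camera_center" with the actual mounting values and the frame
     published by the ZED wrapper. -->
<node pkg="tf" type="static_transform_publisher" name="base_link_to_zed"
      args="0.1 0.0 0.3 0.0 0.0 0.0 base_link zed_camera_center 100"/>

With the transform in place, the 2D map published on /rtabmap/grid_map is a nav_msgs/OccupancyGrid message and can be consumed directly by a subscriber or displayed in RViz.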
Hello,
Thank you for the prompt response! We have echoed the proj_map topic (since we are using a stereo camera) and are now receiving data in the form of large arrays. Is there a way to visualize this data as an actual 2D map, such as in rtabmapviz? Additionally, we have noticed that the array is populated only by 100s and -1s, which we found troubling. Are the probabilities being rounded?
Best,
Kane
Administrator
Hi,
These arrays are OccupancyGrid messages. They can be visualized as a grid map in RViz using the Map display; see http://wiki.ros.org/rviz/DisplayTypes/Map. A value of 100 means obstacle, -1 means unknown, and 0 means empty. If the ZED cannot see the ground (no texture on the ground), there will be only obstacle and unknown cells. To fill the empty space between the sensor and obstacles in 2D, set "Grid/RayTracing=true" and "Grid/3D=false". When rtabmap sees an obstacle it sets the cell directly to 100; when a cell is empty, it sets 0. There are no values in between.
cheers,
Mathieu
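One way to apply the two parameters mentioned above, as a sketch: they can be forwarded through the rtabmap_args argument of rtabmap.launch, assuming that argument passes raw RTAB-Map parameters to the node in "--Name value" form (as it typically does); the other argument values repeat the ZED topics from the earlier sketch.

<!-- Same include as in the earlier sketch, with ray tracing enabled on a 2D
     grid so that free cells (0) are filled in between the sensor and the
     obstacles it sees. -->
<include file="$(find rtabmap_ros)/launch/rtabmap.launch">
  <arg name="rtabmap_args"
       value="--delete_db_on_start --Grid/RayTracing true --Grid/3D false"/>
  <arg name="rgb_topic"         value="/camera/rgb/image_rect_color"/>
  <arg name="depth_topic"       value="/camera/depth/depth_registered"/>
  <arg name="camera_info_topic" value="/camera/rgb/camera_info"/>
  <arg name="frame_id"          value="base_link"/>
</include>

If a static snapshot of the grid is needed later, map_server's map_saver can save it by remapping its map topic, e.g. rosrun map_server map_saver -f my_grid map:=/rtabmap/grid_map.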