Hi Mathieu,
I'm currently working on a school project that uses rtabmap to map a crop field in order to recognize the cultivations. For this project I am using an Intel NUC NUC13ANKi7, which has 64 GB of RAM and a 1 TB NVMe disk. I've tried using the Azure Kinect on ROS2 (Ubuntu 22.04), but unfortunately I didn't get good results: the odometry estimated by rtabmap was easily lost, even when attempting to map simpler environments such as my bedroom.

In this thread I asked for help using The Rosario Dataset (which includes stereo images captured by a ZED stereo camera) to simulate the mapping process on a crop field. In that discussion, you showed me that the rtabmap odometry was accurate compared to the GPS positions. Even in the presence of irregularities in the field that caused the camera to shake, the odometry estimation appeared to perform well.

I'm now wondering whether I might achieve better results by using a stereo camera like the ZED with rtabmap stereo visual odometry, as opposed to the Azure Kinect with rtabmap visual odometry. Do you think I would have the same problems with odometry estimation? I understand that this question may appear self-evident (experimentation often provides the clearest answer), but it's essential for our lab to make an informed decision regarding the potential purchase of a stereo camera.

Thank you in advance for your time and kindness.
Best regards,
Christian
Administrator
Hi Christian,
Have you tried these steps for the Azure Kinect on ROS2: http://official-rtab-map-forum.206.s1.nabble.com/Differences-between-ROS-output-vs-desktop-td9333.html#a9334 If not, they would improve visual odometry accuracy a lot.

If you are going outdoors on a large crop field like in the link you shared, I would prefer a stereo camera, to get unlimited range for motion estimation. Visual odometry will be more accurate.

cheers,
Mathieu
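To illustrate the "unlimited range" point, here is a small sketch (my own illustrative numbers, not from this thread): a rectified stereo pair triangulates depth as Z = f * B / d, so any feature with even one pixel of measurable disparity still yields a depth estimate and can constrain motion, whereas an active depth camera like the Kinect simply returns no data past its fixed range. The baseline and focal length below are assumed, roughly ZED-like values.

```python
# Illustrative sketch (not rtabmap code). Assumed ZED-like parameters:
# baseline B = 0.12 m, focal length f = 700 px (rectified images).
def stereo_depth(disparity_px, baseline_m=0.12, focal_px=700.0):
    """Triangulated depth Z = f * B / d for a rectified stereo pair."""
    return focal_px * baseline_m / disparity_px

# Even a 1 px disparity still gives a (coarse) depth estimate, so distant
# texture in a field can contribute to motion estimation.
for d in (50.0, 10.0, 2.0, 1.0):
    print(f"disparity {d:5.1f} px -> depth {stereo_depth(d):6.1f} m")
```

Depth resolution degrades quadratically with distance, but for odometry even coarse far-field constraints help stabilize rotation estimates.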
Hi Mathieu,
thank you for your response. These are the commands I am running:

ros2 run imu_filter_madgwick imu_filter_madgwick_node --ros-args -r /imu/data_raw:=/imu -p use_mag:=false -p publish_tf:=false

ros2 run image_proc image_proc --ros-args -r camera_info:=/rgb/camera_info -r image:=/rgb/image_raw -r image_rect:=/rgb/image_rect

ros2 run azure_kinect_ros_driver node --ros-args -p recording_file:=room.mkv -p color_enabled:=true -p fps:=30 -p depth_mode:=WFOV_2X2BINNED

ros2 launch rtabmap_launch rtabmap.launch.py rtabmap_args:="--delete_db_on_start" rgb_topic:=/rgb/image_rect depth_topic:=/depth_to_rgb/image_raw camera_info_topic:=/rgb/camera_info frame_id:=camera_base approx_sync:=true approx_sync_max_interval:=0.1 wait_imu_to_init:=true imu_topic:=/imu/data qos:=1 queue_size:=30

(By the way, I am using OpenCV version 4.3.0.)

As you can see from this video, the odometry is easily lost as I turn back from the bathroom. Is there anything wrong with the recording? Here you can find the .mkv file I used: https://drive.google.com/file/d/1OBqDYqW_TcUtoURmqmM3xIN2kgkNcoce/view?usp=drive_link

Thank you again,
Best regards,
Christian
Administrator
At that particular location, when the camera is rotating, there is not much visual texture in the FOV of the camera and the images are blurry, making it difficult to track features (see also the reasons odometry can get lost on this page). This is kind of expected. Maybe with a VIO approach it could get through those areas that are challenging for visual-only odometry.

Note also that I got relatively better tracking by downsampling the image from 1080p to 540p with odom_args:="--Odom/ImageDecimation 2"
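For reference, a minimal sketch of what a 2x decimation does to a 1080p frame (conceptual, assuming simple row/column subsampling; rtabmap's exact implementation may differ). Halving the resolution effectively spreads motion blur over fewer pixels, which can make features easier to track.

```python
import numpy as np

# Conceptual sketch (assumed behavior, not rtabmap source): decimating by a
# factor of 2 keeps every 2nd row and column, turning 1080p into 540p.
def decimate(image, factor=2):
    return image[::factor, ::factor]

img = np.zeros((1080, 1920), dtype=np.uint8)  # stand-in for a 1080p frame
small = decimate(img, 2)
print(small.shape)  # (540, 960)
```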
Isn't SLAM with rtabmap and the Azure Kinect already a VIO approach? The commands above feed rtabmap with the Kinect's IMU data.
Is there a way to adjust the Azure Kinect's shutter speed, so the images are less blurry?
Administrator
When you feed IMU data to rgbd_odometry, it is a loosely coupled VIO: it uses the IMU to get better visual matches / tracking and to align the results with gravity, but it won't estimate the pose if there are not enough visual features to track.
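As a conceptual sketch (not rtabmap code) of the gravity-alignment part: roll and pitch can be recovered from the accelerometer's gravity reading alone with the standard tilt equations, which is why the IMU helps keep the map oriented, while translation still has to come from tracked visual features.

```python
import math

# Standard accelerometer tilt equations (conceptual, not rtabmap source):
# with the camera at rest, the accelerometer measures only gravity, from
# which roll and pitch (but not yaw or translation) can be recovered.
def gravity_roll_pitch(ax, ay, az):
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# Level camera: gravity is purely along +z, so roll = pitch = 0.
r, p = gravity_roll_pitch(0.0, 0.0, 9.81)
print(round(math.degrees(r), 1), round(math.degrees(p), 1))  # 0.0 0.0
```

This is what "loosely coupled" means here: the IMU constrains orientation, but without enough visual features there is no measurement left to constrain position.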
For the blurriness, you may have to ask on their GitHub; I am not aware of such an option. One thing you can do is avoid fast rotations.