implementing rtabmap_drone_example in real drone


Samim-17
Hello sir, I am trying to implement this on a real drone with PX4 and an RPi4. What sensor does it use in simulation for holding a constant altitude? I saw the px4flow topic, but it was not publishing anything in simulation. You also said the visual odometry must be > 10 Hz; how can I check that on the RPi4? I checked rostopic hz /rtabmap/odom and found it to be around 2.5 Hz. Will this be an issue?
Here are some of the images from when we launched on the real drone.

See the odometry image: why does it look like that, and not like the one in simulation?
Also, I'm passing all the topics for visualization to another computer and only running the computation on the RPi4.
Do you suggest changing to a Jetson Nano?

Re: implementing rtabmap_drone_example in real drone

matlabbe
Administrator
In the simulator, the drone takes off from altitude 0 based on visual odometry. In the offboard node, we target an altitude of 1.5 meters. The controller knows the current altitude from the visual odometry input.

2.5 Hz odometry is too slow. You can decrease the resolution of the images if you can, or set Odom/ImageDecimation to 2 or 4. Also check the camera driver itself: sometimes the slow rate on the RPi4 is because the driver cannot handle a fast frame rate, causing synchronization issues downstream.
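For example, a minimal sketch of passing the parameter on the command line (assuming the generic rtabmap.launch from rtabmap_ros; with the slam.launch of rtabmap_drone_example you would add the same flag to the odometry node's arguments):

    roslaunch rtabmap_ros rtabmap.launch \
        args:="--Odom/ImageDecimation 2"

Then verify the effect with rostopic hz /rtabmap/odom.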




Re: implementing rtabmap_drone_example in real drone

Samim-17
We tried changing Odom/ImageDecimation to 4, but with that slam.launch did not process anything; Odom/ImageDecimation 2 was fine and we saw a small increase in the /rtabmap/odom rate. We even decreased the fps and resolution, but cannot get /rtabmap/odom over 5 Hz. If the odometry rate is 4 Hz, will it create issues?
If there are no obstacles on an open field/ground, will I be able to navigate if the rate of /rtabmap/odom is less than 5 Hz?
Would it be better to switch to a Jetson Nano?
Can you give any ideas on how to tune parameters to run this efficiently on an RPi4-B with 4 GB RAM (4 GB swap already)?
I was thinking of doing the processing on my laptop by subscribing to the RPi4 topics on my drone. I would use the compressed topics for this. Can this be done? How effective do you think it would be?

Re: implementing rtabmap_drone_example in real drone

matlabbe
Administrator
Can you see in the log how much time is required to generate /rtabmap/odom? If it is 50 ms and you still have 4 Hz, maybe there is an issue with the input topics.
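For example, a quick check (a sketch assuming the default rtabmap_ros topic names; /rtabmap/odom_info reports the time spent on the last odometry update):

    rostopic hz /camera/color/image_raw /camera/aligned_depth_to_color/image_raw
    rostopic echo /rtabmap/odom_info/timeEstimation

If the input topics themselves are slow or out of sync, odometry cannot be faster than them.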

I cannot tell if switching to a Nano will be better; some camera drivers may be faster on it.

Using swap doesn't sound right; that can cause latency problems.

If you can ensure a reliable Wi-Fi connection, you may try doing it on a remote computer.
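For example, a sketch of relaying a compressed stream on the remote computer (assuming the image_transport compressed plugins are installed on both sides; the relay topic name is arbitrary):

    rosrun image_transport republish compressed in:=/camera/color/image_raw raw out:=/camera/color/image_raw_relay

rtabmap on the remote computer would then subscribe to the relayed raw topic.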

Re: implementing rtabmap_drone_example in real drone

Samim-17
Hello Mathieu,
I tried rtabmap_drone_example on the real drone but faced some issues. The drone kind of hovers at the given altitude (0.5 m) after launching offboard, but after 1-2 seconds it starts to drift left and right.
We are using an RPi4B (1.5 GHz, 8 GB RAM) for processing. It is running rs_camera.launch for the D435 camera, and slam.launch.
We have set the camera resolution to 640x480 and fps to 15.
Running rostopic hz on the camera topics, we are getting > 10 Hz for color/image_raw but only around 5 Hz for aligned_depth_to_color, and the /rtabmap/odom rate is about 3-4 Hz.
When we run slam.launch, we see odometry > 250 and the update time around 0.15 s.
The drone takes off and tries to hold a constant altitude, but slowly starts to drift.
What could be the issue? I tried changing Odom/ImageDecimation to 2 and 4, but the drone behaves similarly.
Can I really run this on an RPi4B with the given processing power? If I can, what parameters can be tuned so that the odometry rate increases?
Will an RPi5 with 2.4 GHz be able to handle the processing?
I was thinking of running slam.launch on my laptop and only px4.launch and rs_camera.launch on the RPi4B. If I want to do this, will the RPi4B Wi-Fi and my laptop hotspot have enough speed/bandwidth?
Here is a photo of my odometry and the /rtabmap/odom rate.

Re: implementing rtabmap_drone_example in real drone

Samim-17
Just a little update above; really waiting for your reply. I have been stuck on this issue for the last 2 weeks. Do I need the infra1 and infra2 cameras? I don't think we need those, as there are no such parameters in the launch file.

Re: implementing rtabmap_drone_example in real drone

matlabbe
Administrator
Visual odometry takes around 200-250 ms, which matches the 4-5 Hz you are seeing, so it is a CPU limitation. An RPi5 would probably be better. You could also try streaming to a remote computer just to compare, if you can stream the images over Wi-Fi without problems.

For the drone drifting, check whether the visual odometry is also drifting. If not, maybe the VO delay is too high and the EKF of PX4 is ignoring it. One way to work around that is to re-stamp the VO poses before sending them to PX4 (that is what we do here).

If you are using a D435, I strongly recommend using infra1 + the depth image (with the IR emitter disabled) instead of RGB/depth.
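For example, a sketch of starting the camera with the infra stream and turning the emitter off (argument names assume a typical realsense2_camera version; adjust to yours):

    roslaunch realsense2_camera rs_camera.launch enable_infra1:=true enable_infra2:=false
    rosrun dynamic_reconfigure dynparam set /camera/stereo_module emitter_enabled 0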

cheers,
Mathieu


Re: implementing rtabmap_drone_example in real drone

Samim-17
To use infra1+depth images, will remapping the rgb topics to infra1 work?

About re-stamping the VO poses before sending them to PX4: I am rosrun-ning the same offboard source file from rtabmap_drone_example. If the re-stamping is already done in the code, what do you mean? I couldn't understand.

In my current setup, I am passing the /camera/color/image_raw, /camera/color/camera_info and /camera/aligned_depth_to_color/image_raw topics. Should I change these to the image_rect_raw topics, or is this fine?
When doing rostopic hz on the color topics I get about 15 Hz, but on the aligned_depth_to_color topics I get about 5-6 Hz.
But on the /camera/depth/color topics I get 20 Hz. Which topics would you suggest using with the RPi4B and D435 camera setup?

We have completely turned off GPS, and all other parameters are set to use vision. (The PX4 firmware is the latest, not 1.12.3.)

Thanks for the reply, Mathieu.

Re: implementing rtabmap_drone_example in real drone

matlabbe
Administrator
Hi,

Samim-17 wrote
To use infra1+depth images, will remapping the rgb topics to infra1 work?
Yes
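For example, a sketch of the remappings (assuming the generic rtabmap.launch arguments; the example's slam.launch would take equivalent remappings):

    roslaunch rtabmap_ros rtabmap.launch \
        rgb_topic:=/camera/infra1/image_rect_raw \
        depth_topic:=/camera/depth/image_rect_raw \
        camera_info_topic:=/camera/infra1/camera_info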


Samim-17 wrote
About re-stamping the VO poses before sending them to PX4: I am rosrun-ning the same offboard source file from rtabmap_drone_example. If the re-stamping is already done in the code, what do you mean? I couldn't understand.
That may not be a re-stamping issue then.

Samim-17 wrote
In my current setup, I am passing the /camera/color/image_raw, /camera/color/camera_info and /camera/aligned_depth_to_color/image_raw topics. Should I change these to the image_rect_raw topics, or is this fine?
When doing rostopic hz on the color topics I get about 15 Hz, but on the aligned_depth_to_color topics I get about 5-6 Hz.
But on the /camera/depth/color topics I get 20 Hz. Which topics would you suggest using with the RPi4B and D435 camera setup?
In any case, you should send rectified images to rtabmap, though some camera drivers may publish /camera/color/image_raw already rectified. Double-check how different /camera/color/image_raw and /camera/color/image_rect_raw are.
I don't know what /camera/depth/color represents, but you should use the depth image aligned with the RGB camera. If you use infra1 instead, you can use the original depth image, which is already aligned with that camera.
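For example, a quick way to compare the two color topics side by side (a sketch using image_view; topic names as in your setup):

    rosrun image_view image_view image:=/camera/color/image_raw
    rosrun image_view image_view image:=/camera/color/image_rect_raw

If the images look identical (straight lines are straight in both), the raw topic is already rectified.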



Re: implementing rtabmap_drone_example in real drone

Samim-17
Hello Mathieu,
Thanks for suggesting infra1+depth. The drone is now performing much better than with our previous setup (color+depth).
Can you give some reasons why infra1 is used in the real world instead of color, which was used in simulation?

But there is still about a 5-6 cm swing (roll) in the drone while taking off in offboard.
I have used 15 fps and 640x480 resolution for the camera on the RPi4B.
To increase processing speed, as you suggested, the following parameters are used:

 --Vis/MaxFeatures 500 --Mem/ImagePreDecimation 2 --Mem/ImagePostDecimation 2 --OdomF2M/MaxSize 1000 --Odom/ImageDecimation 2 --Odom/Strategy 1
 --Optimizer/GravitySigma 0.1 --Vis/FeatureType 10 --Kp/DetectorStrategy 10 --Grid/MapFrameProjection true --Grid/NormalsSegmentation false --Grid/MaxGroundHeight 1 --Grid/MaxObstacleHeight 1.6 --RGBD/StartAtOrigin true

Currently the IR emitter is not disabled. My question is: what other parameters can be tuned to improve performance on the drone?
I want the drone to be as stable as possible while taking off to the required altitude, without much drift.
Or maybe I can send you the .db file so you can check what's going wrong when the drone takes off.

Also, while giving a nav goal the drone loses altitude; what may be the issue?

 

Re: implementing rtabmap_drone_example in real drone

matlabbe
Administrator
Samim-17 wrote
Can you give some reasons why infra1 is used in the real world instead of color, which was used in simulation?
The simulated camera is a fake camera; it does not really have the same specs as the RealSense. The infra1 camera has a global shutter and a larger field of view, two important things for getting the best visual odometry. If you are now using infra1, I strongly recommend disabling the IR emitter, or putting a sticker over it to block it; otherwise visual odometry will have a hard time, as there would be points in the environment moving with the drone.

To debug the drone takeoff issue, I would need to see the raw data used by visual odometry, so a rosbag of the 15 Hz images would be best.
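For example, a sketch of recording the relevant topics (names assume the infra1+depth setup discussed above):

    rosbag record /camera/infra1/image_rect_raw /camera/infra1/camera_info /camera/depth/image_rect_raw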

I have seen the drone losing altitude when creating that example (an issue that can actually be seen at 3:52 in that video), but it was fixed by updating PX4 to a newer version. I found the YouTube comment:
Update: Bug at 3:56 doesn't seem to appear anymore in latest PX4 versions. On Noetic with PX4 v1.12.3, it could fly fully autonomous over 60 min without any sign of that bug. PX4 v1.8.2 was used in this video under Melodic.

Re: implementing rtabmap_drone_example in real drone

Samim-17
Thanks Mathieu for the reply.

Can you check if the parameters I mentioned above will increase the odometry rate? Are there other parameters that would increase it? Also, does increasing the camera fps to 30 instead of 15, while keeping the resolution at 640x480, increase the odometry rate?
I will send the rosbag to you.

Re: implementing rtabmap_drone_example in real drone

matlabbe
Administrator

If you are using "--Odom/Strategy 1", you may also combine it with "--Vis/CorType 1". That would be the fastest VO approach in rtabmap; otherwise, you may try VIO approaches like msckf_vio or open_vins.
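For example, a sketch combining both flags (Odom/Strategy 1 selects frame-to-frame odometry, Vis/CorType 1 selects optical-flow correspondences; launch file as assumed earlier):

    roslaunch rtabmap_ros rtabmap.launch \
        args:="--Odom/Strategy 1 --Vis/CorType 1"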

Re: implementing rtabmap_drone_example in real drone

Samim-17
Hello Mathieu,
I used the infrared camera and got the drone to hover at the desired altitude of 0.5 m, but now, when giving a nav goal through rviz, the drone goes forward but loses altitude while approaching the goal; it touches the ground trying to reach it.
I am using the same global planner and local planner as you did. What may be the issue? Do you have any ideas, and are there any resources for solving this kind of problem?
PX4 1.15 and the latest MAVROS version.

Re: implementing rtabmap_drone_example in real drone

matlabbe
Administrator
You may try to debug whether the loss of altitude is caused by visual odometry or by the on-board EKF fusion. You can also upload your flight log to https://review.px4.io/ to see some graphs, and the log can be shared from there.
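For example, a quick sketch to compare the two estimates while reproducing the issue (assuming default mavros and rtabmap topic names):

    rostopic echo /rtabmap/odom/pose/pose/position/z
    rostopic echo /mavros/local_position/pose/pose/position/z

If rtabmap's z stays at the commanded altitude while the fused z drops, the issue is on the EKF/controller side rather than in visual odometry.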