If depth doesn't match with rgb only when moving, there is a synchronization problem on the kinect2_bridge side (USB problem? not enough computing power?). Being able to use OpenCL instead of the CPU for depth registration would help.
Re: Troubleshooting mapping with kinect v2, rplidar and jetson tx2
As far as I can tell, in the static case the rgb and depth images only failed to line up when the kinect was right up close to something the depth camera could not see; the RGB image was then projected onto whatever was behind it. Removing objects that could get that close, like the table leg, seems to have helped there. I have performed calibration on the unit twice according to the instructions.
However, the "not lining up" when moving is still present, and I think you may be right about it being a processing power and synchronisation issue. The kinect2_bridge topic only publishes at 2Hz, and when running just the viewer and moving the camera I see a lot of image distortion and lag. Even using the overclocking script on the TX2 does not seem to improve matters. Perhaps the board simply is not capable of running kinect2_bridge fast enough.
Sadly the TX2 does not support OpenCL, so that is not an option either.
What kind of USB problem could cause this? Any suggestions of things to look into would be hugely appreciated. I do not particularly want to abandon the project, as I wanted to use the robot as an interview portfolio piece, but I am starting to think that may be the only option.
Re: Troubleshooting mapping with kinect v2, rplidar and jetson tx2
Hi
I tried switching back to the sd clouds to see if their higher (30Hz) publish rate would help at least with matching each cloud, but they show identical problems, with table legs etc. being reproduced multiple times. Of course, switching back to sd means the black noise is back too! This seems to suggest it's not a speed issue. The help boards for kinect2_bridge and libfreenect2 are unfortunately not quite as good as this forum and I've had no response from them; I think they only react if you post an actual bug rather than ask for help.
I have the new motor coming on Tuesday, which will hopefully rule out odometry as a cause of bad matching. With one wheel always reporting 40 clicks more per revolution, but the roboclaw node only accepting a single QPPS value and therefore assuming both motors are identical, there is bound to be a lot of odometry error as a matter of course. At least once the two encoders agree it will be possible to determine whether that was having any influence.
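To get a feel for how much that 40-click mismatch could matter, here is a rough back-of-envelope calculation using differential-drive kinematics. The tick count, wheel diameter and wheel base below are assumptions for illustration, not measurements from my robot:

```python
# Rough estimate of heading drift caused by one encoder reporting more
# ticks per wheel revolution than the other, when the odometry node
# assumes both wheels are identical. All constants are assumed values.

import math

TICKS_PER_REV = 3200      # assumed quadrature counts per wheel revolution
EXTRA_TICKS = 40          # the mismatch seen on one wheel
WHEEL_DIAMETER_M = 0.10   # assumed wheel diameter (m)
WHEEL_BASE_M = 0.30       # assumed distance between the wheels (m)

def heading_error_per_rev(extra_ticks=EXTRA_TICKS):
    """Heading error (radians) accumulated per wheel revolution."""
    circumference = math.pi * WHEEL_DIAMETER_M
    # Extra distance the odometry thinks one wheel travelled
    extra_distance = circumference * extra_ticks / TICKS_PER_REV
    # Differential drive: heading change = (d_right - d_left) / wheel_base
    return extra_distance / WHEEL_BASE_M

err = heading_error_per_rev()
print(f"{math.degrees(err):.2f} degrees of phantom rotation per wheel revolution")
# -> 0.75 degrees of phantom rotation per wheel revolution
```

Under those assumptions the phantom rotation is under a degree per wheel turn, but it accumulates on every revolution and never cancels, which is exactly the kind of systematic error that smears a map.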
Is it worth also tweaking some of the actual rtabmap settings that influence how the clouds are overlaid on one another?
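For what it's worth, the settings I had in mind are the registration parameters from the rtabmap documentation. A sketch of what I might try in the launch file (the parameter names are real rtabmap ones, but the values here are guesses, not something I have tested):

```xml
<!-- Hypothetical rtabmap launch fragment: parameters affecting how
     consecutive scans/clouds are registered. Values are guesses. -->
<node pkg="rtabmap_ros" type="rtabmap" name="rtabmap">
  <!-- 0 = visual only, 1 = ICP only, 2 = visual then ICP refinement -->
  <param name="Reg/Strategy"                  value="1"/>
  <!-- Refine odometry neighbor links using registration -->
  <param name="RGBD/NeighborLinkRefining"     value="true"/>
  <!-- Downsample clouds before ICP: larger voxel = faster but coarser -->
  <param name="Icp/VoxelSize"                 value="0.05"/>
  <param name="Icp/MaxCorrespondenceDistance" value="0.1"/>
</node>
```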
Re: Troubleshooting mapping with kinect v2, rplidar and jetson tx2
With the new motor and some tweaks to the odometry node settings, I am getting vastly improved maps. I am rotating the bot at the slowest speed possible before the motors give up, and I have found a way to squeeze more performance out of the TX2, so I am now getting at least 4Hz from kinect2_bridge, which gave an immediate improvement.
It's still not perfect, and if I could convince my laptop to run kinect2_bridge without erroring out I would try that, but what I have should be enough to create a small map. It really does look as if the TX2 just doesn't quite have the grunt.
Do you know of anyone using a kinect 2 and a TX2 who gets good performance? Would be good to get some pointers on how to squeeze more out of those rocks! :)
Re: Troubleshooting mapping with kinect v2, rplidar and jetson tx2
Well, that confirms it was the speed: I managed to get the laptop working, which gives qhd clouds at 30Hz. Maps are a million times better! Still got a few niggles to iron out, but it's way further along than it was! Cheers for helping :)