Remote mapping Kinect2


Remote mapping Kinect2

suk1
Hi Matthieu,

I'm having trouble with remote mapping with the Kinect v2. Let me tell you what I have done so far :)
I am running ROS Kinetic on Ubuntu 16.04. What I am trying to do is to start mapping and localizing on our 'robot'. For now it is just a Kinect2 on a rolling platform hooked up to a Jetson TX1. In the future it will be installed in our moving robot with external odometry, but for now we use visual odometry.

When I run
roslaunch kinect2_bridge kinect2_bridge.launch publish_tf:=true
followed by
roslaunch rtabmap_ros rgbd_mapping_kinect2.launch resolution:=qhd
RTAB-Map boots up and my visuals look "okay-ish" (it might be the environment), and the odometry frequently reports an output value of 170+, so I am happy with that. However, when I start to move the platform around, the odometry fails pretty quickly and often. Also the warning
MainWindow.cpp:1432::processStats() Processing time (1.070361s) is over detection rate (1.000000s), real-time problem!
pops up often as well.
My guess is that the processor of the Jetson TX1 can't handle the load of everything, so my plan was to move the visualization and related processing to a remote laptop.
I first set up ROS_MASTER_URI and ROS_IP, then ran the kinect2_bridge launch on the Jetson TX1 and the rgbd_mapping_kinect2 launch on my laptop. However, it keeps printing the following error for both visual_odometry and rtabmap:
Did not receive data since 5 seconds!
while the topics are being published fine and I can see their output with rostopic echo.
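For reference, here is roughly how I set ROS_MASTER_URI and ROS_IP on both machines (the IP addresses are placeholders for my actual ones):

# On the Jetson TX1 (where roscore and kinect2_bridge run):
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.10

# On the laptop:
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.20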
I also tried to use the data_throttle.
In freenect_throttle.launch I replaced the freenect.launch include with kinect2_bridge, then changed the remappings accordingly for the kinect2, e.g.:
<remap from="rgb/image_in"       to="/kinect2/qhd/image_color_rect"/>
Then on the laptop I would run
roslaunch rtabmap_ros rtabmap.launch rgb_topic:=/camera/data_throttled_image depth_topic:=/camera/data_throttled_image_depth camera_info_topic:=/camera/data_throttled_camera_info compressed:=true rtabmap_args:="--delete_db_on_start"
but it gives the same warning that it did not receive data for 5 seconds.
When I view the tf frames, I notice that the "map" and "odom" frames are missing.

It's kinda difficult for me to convert all the documentation to the kinect2, as there is little for that version xD. Can you help me out?

In the future we would like to add navigation to the whole setup. I looked into the Turtlebot navigation a bit to see what would be needed. Is it possible to use the navigation package with a Kinect2 alone?

Thanks!

Sincerely,
Kareem

Re: Remote mapping Kinect2

Eric Schleicher
What options did you use to compile libfreenect2 and iai_kinect2? Are you *actually* using the CUDA acceleration? I would check that first, since (as I understand it) the RGB frame compression is accelerated, as well as the RGB-to-depth registration (via OpenCL).
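If you compiled libfreenect2 yourself, the CMake options control which depth processors get built in; a rough sketch of what I would pass on the Jetson (exact flag support depends on your JetPack version):

cd libfreenect2/build
cmake .. -DENABLE_CUDA=ON -DENABLE_OPENCL=ON -DENABLE_OPENGL=ON
make && sudo make install

You can then start kinect2_bridge with depth_method:=cuda to make sure the CUDA path is actually selected.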

I doubt the Jetson has the headroom to process the qhd data frames without acceleration.

Have you tried the sd quality level? It's 1/4 the amount of data... even if just to see if it can keep up.

Please let us know, since I'm very interested in running a Jetson as well.

-Eric

Re: Remote mapping Kinect2

suk1
Hi Eric,

Thank you for your response :) Yes, with the Jetson more challenges appear, haha. To compile libfreenect2 we used JetsonHacks' scripts, as he tested it all and wrote a nice post: http://www.jetsonhacks.com/2016/07/11/ms-kinect-v2-nvidia-jetson-tx1/
When we run kinect2_bridge, it picks up the cuda setting:
[ INFO] [1492591527.875158821]: [Kinect2Bridge::initialize] parameter:
        base_name: kinect2
           sensor: default
        fps_limit: -1
       calib_path: /home/ubuntu/catkin_ws/src/iai_kinect2/kinect2_bridge/data/
          use_png: false
     jpeg_quality: 90
        png_level: 1
     depth_method: cuda
     depth_device: -1
       reg_method: default
       reg_device: -1
        max_depth: 12
        min_depth: 0.1
       queue_size: 5
 bilateral_filter: true
edge_aware_filter: true
       publish_tf: true
     base_name_tf: kinect2
   worker_threads: 4
But I wouldn't know if the Jetson actually uses it. NVIDIA has the cuda-gdb debugger, but it requires compiling the application with special debug flags, which sounds like a pain in the ass to me. Is there any other way to check if it actually uses cuda acceleration?
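One thing I could try myself is watching the GPU load with NVIDIA's tegrastats while kinect2_bridge is running; if the GR3D (GPU) utilization rises, CUDA is presumably being used (the tool's path may differ per JetPack version):

sudo ~/tegrastats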
And yes, we also tested the sd quality level. This messes up the visual odometry pretty badly: it will not rise above ~80 with objects within a 2 meter radius, and going further away drops it to ~30. It can 'keep up', but it creates a very bad point cloud.

Thanks!

Sincerely,
Kareem

Re: Remote mapping Kinect2

matlabbe
Administrator
Hi,

In your case, implementing a simple wheel odometry (e.g. encoders) approach on the robot would save a lot of computing resources.

For remote mapping, see this tutorial: http://wiki.ros.org/rtabmap_ros/Tutorials/RemoteMapping. Converting it to the Kinect v2 would look like this:

1- Launch kinectv2 on robot
$ roslaunch kinect2_bridge kinect2_bridge.launch publish_tf:=true

2- Launch data throttling on robot
<launch>
  <arg name="rate"  default="5"/>
  <arg name="decimation"  default="1"/> <!-- Reduce the image size, e.g., 2 means "width/2 x height/2". -->
  <arg name="resolution" default="qhd" />
  <arg name="approx_sync" default="false" /> <!-- kinect2_bridge has synchronized images -->

  <!-- Use kinect2 nodelet manager -->
  <node pkg="nodelet" type="nodelet" name="data_throttle" args="load rtabmap_ros/data_throttle kinect2" output="screen">
      <param name="rate" type="double" value="$(arg rate)"/>
      <param name="decimation" type="int" value="$(arg decimation)"/>
      <param name="approx_sync" type="bool" value="$(arg approx_sync)"/>

      <remap from="rgb/image_in"       to="/kinect2/$(arg resolution)/image_color_rect"/>
      <remap from="depth/image_in"     to="/kinect2/$(arg resolution)/image_depth_rect"/>
      <remap from="rgb/camera_info_in" to="/kinect2/$(arg resolution)/camera_info"/>

      <remap from="rgb/image_out"       to="/kinect2/data_throttled_image"/>
      <remap from="depth/image_out"     to="/kinect2/data_throttled_image_depth"/>
      <remap from="rgb/camera_info_out" to="/kinect2/data_throttled_camera_info"/>
  </node>    
</launch>
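Save this to a launch file and start it on the robot (the file name here is just an example):

$ roslaunch kinect2_throttle.launch rate:=5 decimation:=2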

3- Launch rtabmap on remote computer:
roslaunch rtabmap_ros rtabmap.launch \
   rgb_topic:=/kinect2/data_throttled_image \
   depth_topic:=/kinect2/data_throttled_image_depth \
   camera_info_topic:=/kinect2/data_throttled_camera_info \
   compressed:=true \
   approx_sync:=false \
   rtabmap_args:="--delete_db_on_start"

Note that both computers should be synchronized ("ntpdate" for example). Verify that input topics are received:
$ rostopic hz /kinect2/data_throttled_image
$ rostopic hz /kinect2/data_throttled_image_depth
$ rostopic hz /kinect2/data_throttled_camera_info
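For example, to synchronize the robot's clock with the remote computer (the IP is a placeholder for your own time server or remote machine):

$ sudo ntpdate -u 192.168.1.20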

cheers,
Mathieu

Re: Remote mapping Kinect2

suk1
Cheers for the reply!
Yeah, as we have not hooked the system up to the robot yet, we don't have wheel odometry available, but thanks for the advice :)
I managed to get the remote mapping working properly with your code; I just needed to add a transform from kinect2_link to camera_link and then it all worked!
Thanks again, and Eric, I'll let you know in this thread how it works out, in case you are still interested in the Kinect2 + Jetson setup!

Re: Remote mapping Kinect2

simon.xm.lee
In reply to this post by matlabbe
Hi sir,

I followed your steps to try remote mapping with the kinect2. rtabmapviz starts up, but it doesn't show any images.

I echoed the three topics below, and they all have output:
$ rostopic echo /kinect2/data_throttled_image
$ rostopic echo /kinect2/data_throttled_image_depth
$ rostopic echo /kinect2/data_throttled_camera_info

Can you give me any suggestions for this problem?

Re: Remote mapping Kinect2

suk1
Hi Simon,

I don't know if rtabmap shows any errors or warnings when you run it, but if it does show an odometry warning about transforms, you might need to add a transform to camera_link or change the frame_id. In a launch file you could do something like this:

 <arg name="pi/2" value="1.5707963267948966"/>
 <arg name="optical_rotate" value="0 0 0 -$(arg pi/2) 0 -$(arg pi/2)" />
 <node pkg="tf" type="static_transform_publisher" name="kinect2_base_link"
       args="$(arg optical_rotate) camera_link kinect2_link 100" /> 


or you can run:

 rosrun tf static_transform_publisher 0 0 0 -1.5707963267948966 0 -1.5707963267948966 camera_link kinect2_link 100


From the tutorial: http://wiki.ros.org/rtabmap_ros/Tutorials/HandHeldMapping

Cheers


Re: Remote mapping Kinect2

suk1
In reply to this post by matlabbe
Hi Mathieu,

So I managed to create a nice map of the room, and the robot can localize inside of it. The next step would be to send navigation commands through RViz. I am trying to find my way through the turtlebot navigation and use that as a reference point, only with all the different files and guides I am kind of lost.
In the end I figure I want the 'cmd_vel' topic from move_base, but how would I set one up? How would one proceed?

Thank you.

Sincerely,
Kareem



Re: Remote mapping Kinect2

matlabbe
Administrator
Hi,

As you are on turtlebot, see this tutorial: http://wiki.ros.org/rtabmap_ros/Tutorials/MappingAndNavigationOnTurtlebot

EDIT: As pointed out in the tutorial, the turtlebot remaps /cmd_vel to /mobile_base/commands/velocity.

cheers,
Mathieu

Re: Remote mapping Kinect2

suk1
Hi Matthieu,

I'm sorry, maybe I did not explain it very well.
I don't own a turtlebot; I made a custom robot. However, I'm trying to use the turtlebot as a reference point to learn how to implement mapping and navigation.
I'm still not sure how to connect the mapping and localization to a navigation stack.
Again, thanks for your time answering all of these questions :)

Greetings,

Karim

Re: Remote mapping Kinect2

matlabbe
Administrator
Hi Karim,

rtabmap provides a 2D occupancy grid (/rtabmap/grid_map), which is the one used by move_base to generate global plans (see the global costmap parameters). For navigation, this tutorial can also be useful to understand all the move_base parameters: http://wiki.ros.org/navigation/Tutorials/RobotSetup
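As a rough sketch (not a drop-in configuration), hooking move_base to rtabmap's grid could look like the launch below; the yaml parameter files and the package name are placeholders you would write for your own robot, as explained in the RobotSetup tutorial:

<launch>
  <node pkg="move_base" type="move_base" name="move_base" output="screen">
    <!-- Costmap/planner parameters: write these for your robot (see RobotSetup tutorial) -->
    <rosparam file="$(find my_robot_2dnav)/costmap_common_params.yaml" command="load" ns="global_costmap"/>
    <rosparam file="$(find my_robot_2dnav)/costmap_common_params.yaml" command="load" ns="local_costmap"/>
    <rosparam file="$(find my_robot_2dnav)/local_costmap_params.yaml" command="load"/>
    <rosparam file="$(find my_robot_2dnav)/global_costmap_params.yaml" command="load"/>
    <rosparam file="$(find my_robot_2dnav)/base_local_planner_params.yaml" command="load"/>

    <!-- Use rtabmap's occupancy grid for the global costmap -->
    <remap from="map" to="/rtabmap/grid_map"/>
  </node>
</launch>

move_base will then publish velocity commands on /cmd_vel, which your base driver has to subscribe to.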

cheers,
Mathieu