Kinect2 + um7 IMU on Jetson TX2

Kinect2 + um7 IMU on Jetson TX2

hurin
CONTENTS DELETED
The author has deleted this message.
Re: Kinect2 + um7 IMU on Jetson TX2

matlabbe
Administrator
Hi,

It is hard to debug this kind of problem remotely, but here are some hints:

* If visual odometry gets lost often, set inlier_distance=0.1, or use motion_estimation=1 with max_depth=0.

* If you are using robot_localization, set publish_tf=false for rgbd_odometry, since robot_localization should already publish the odometry TF.

* At what frequencies are /vo and /imu/data published? Knowing them lets you set queue_size correctly. You could also reduce the frequency of robot_localization.

* For visual odometry, you could fuse only the linear and/or angular components of the twist, instead of the pose.

* For the orientation of the IMU, I think the Z axis should point upward and the Y axis to the left (ENU convention). From the robot_localization doc:
 The IMU message is currently subject to some ambiguity, though this is being addressed by the ROS community. Most IMUs natively report orientation data in a world-fixed frame whose X and Z axes are defined by the vectors pointing to magnetic north and the center of the earth, respectively, with the Y axis facing east (90 degrees offset from the magnetic north vector). This frame is often referred to as NED (North, East, Down). However, REP-103 specifies an ENU (East, North, Up) coordinate frame for outdoor navigation. As of this writing, robot_localization assumes an ENU frame for all IMU data, and does not work with NED frame data. This may change in the future, but for now, users should ensure that data is transformed to the ENU frame before using it with any node in robot_localization.
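
To illustrate the first two hints, here is a minimal launch sketch for rgbd_odometry. The parameter names (inlier_distance, max_depth, publish_tf) are taken from the hints above and may differ between rtabmap_ros versions, and the kinect2_bridge topic names are an assumption for a default Kinect2 setup; adjust both to your installation:

```xml
<launch>
  <node pkg="rtabmap_ros" type="rgbd_odometry" name="rgbd_odometry" output="screen">
    <!-- Assumed kinect2_bridge topics; change to match your driver -->
    <remap from="rgb/image"       to="/kinect2/qhd/image_color_rect"/>
    <remap from="depth/image"     to="/kinect2/qhd/image_depth_rect"/>
    <remap from="rgb/camera_info" to="/kinect2/qhd/camera_info"/>

    <!-- Looser inlier threshold if odometry gets lost often -->
    <param name="inlier_distance" value="0.1"/>
    <!-- 0 = no depth limit -->
    <param name="max_depth"       value="0"/>
    <!-- Let robot_localization publish the odom TF instead -->
    <param name="publish_tf"      value="false"/>
  </node>
</launch>
```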
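For the fusion side, a sketch of an ekf_localization_node configuration that fuses only the twist from /vo (per the hint above) and the orientation/angular velocity from /imu/data. The frequency value and the exact boolean choices are illustrative, not a tested setup, and the IMU data must already be in the ENU frame as the quoted doc requires:

```xml
<launch>
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization">
    <!-- Reduce this if the filter cannot keep up on the TX2 -->
    <param name="frequency" value="30"/>
    <rosparam>
      odom0: /vo
      # Config order: [x, y, z, roll, pitch, yaw,
      #                vx, vy, vz, vroll, vpitch, vyaw,
      #                ax, ay, az]
      # Fuse only the twist (linear + angular velocities) from visual odometry:
      odom0_config: [false, false, false,
                     false, false, false,
                     true,  true,  true,
                     true,  true,  true,
                     false, false, false]
      imu0: /imu/data
      # Fuse orientation and angular velocity from the um7 (must be ENU):
      imu0_config: [false, false, false,
                    true,  true,  true,
                    false, false, false,
                    true,  true,  true,
                    false, false, false]
    </rosparam>
  </node>
</launch>
```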
cheers,
Mathieu