This post was updated on Aug 09, 2023; 3:05am.
CONTENTS DELETED
The author has deleted this message.
Administrator
With the iOS version, we use ARKit's VIO approach, which is more robust than the VO approaches inside the RTAB-Map library. It is also similar in performance to Google Tango's VIO. Using rtabmap_ros, you may find an open-source VIO approach that could be fed to the rtabmap node to produce similar results. For iOS, it is just so convenient to already have state-of-the-art VIO with a synchronized LiDAR sensor, a powerful computer, a screen and a battery that you can hold in one hand.

With other cameras, there are ways to get close to ARKit, but careful configuration is needed. For example, with the D435i, use IR+Depth mode with the IR emitter disabled. Note also that stereo cameras cannot produce point clouds as accurate as the ToF (LiDAR) camera on the iPhone. With cameras like the L515 or Kinect Azure (which have a ToF camera), you can do ICP odometry, which in some cases is quite accurate.

cheers,
Mathieu
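As a rough sketch of the D435i setup mentioned above, the commands below show the general idea under ROS 1. The launch arguments, reconfigure path, and topic names are assumptions based on typical realsense2_camera / rtabmap_ros setups and should be checked against your installed versions; on the D435i the left IR image shares the depth sensor's viewpoint, which is why it can replace the RGB stream directly.

```shell
# Stream the left infrared image alongside depth
# (argument names are assumptions; verify with your realsense2_camera version)
roslaunch realsense2_camera rs_camera.launch enable_infra1:=true

# Disable the IR emitter so the projected dot pattern does not
# corrupt feature tracking on the IR image
rosrun dynamic_reconfigure dynparam set /camera/stereo_module emitter_enabled 0

# Feed the IR+Depth pair to rtabmap in place of RGB+Depth
roslaunch rtabmap_ros rtabmap.launch \
    rgb_topic:=/camera/infra1/image_rect_raw \
    depth_topic:=/camera/depth/image_rect_raw \
    camera_info_topic:=/camera/infra1/camera_info
```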
Thank you, Mathieu. I really appreciate the detailed thoughts and guidance!
In reply to this post by matlabbe
Hi Mathieu,
On a related topic, could you suggest an approach if I wanted to use the LiDAR on the iPhone and feed it into RTAB-Map's existing lidar odometry? My thought is that this should work well in dark areas; however, the maximum range of the LiDAR is only about 5 m. Might it work for tight spaces?
Administrator
Hi,
It may work for room-size spaces. See http://official-rtab-map-forum.206.s1.nabble.com/Kinect-For-Azure-L515-ICP-lighting-invariant-mapping-td7187.html for parameters on the desktop app. On the iOS app, you may disable loop closure detection and increase the recording rates to get more frames. Set the rendering parameters to the lowest resolution to avoid filling RAM with rendering data. One issue I see is that in total darkness it may stop recording frames when VIO cannot output poses; ideally, we would need to add an option to record the image data without the pose. On the desktop app, you would use as input source a database created from the iPad app. Also uncheck "Use odometry saved in database" to make rtabmap recompute ICP odometry with the parameters from the post linked above.

cheers,
Mathieu
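A sketch of that desktop workflow from the command line might look like the following. The parameter names are borrowed from the linked lighting-invariant-mapping post, but the exact values and the database path are illustrative assumptions; verify the names against `rtabmap --params` on your installation.

```shell
# Open the desktop app with ICP registration parameters
# (values are examples from the linked post, not verified defaults)
rtabmap \
    --Reg/Strategy 1 \
    --Icp/PointToPlane true \
    --Icp/VoxelSize 0.05 \
    --OdomF2M/ScanSubtractRadius 0.05 \
    --OdomF2M/ScanMaxSize 15000 \
    ~/Documents/RTAB-Map/ios_recording.db
```

Then, in Preferences → Source, select the database as input and uncheck "Use odometry saved in database" so ICP odometry is recomputed instead of replaying the ARKit poses.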
Thanks, Mathieu. Could you elaborate on this a little bit more? I'm slightly confused by that statement.
Administrator
ARKit will stop publishing poses when it cannot track features anymore (e.g., in complete darkness). In that case, the data won't be forwarded to rtabmap: https://github.com/introlab/rtabmap/blob/64f79813cd2c3a7e3f916a7ef2637e93f1f13a5f/app/ios/RTABMapApp/ViewController.swift#L979-L996. My comment was about adding an option to still forward the data to rtabmap with a null pose when these events occur.
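The gating logic could be sketched as below. This is a minimal standalone illustration, not the app's actual code: the enum merely mirrors ARKit's ARCamera.TrackingState cases, and the function name and the `forwardWithNullPose` option are hypothetical.

```swift
// Simplified mirror of ARCamera.TrackingState, for illustration only.
enum TrackingState {
    case normal
    case limited
    case notAvailable
}

// Decide whether a frame should be forwarded to rtabmap.
// Today the app drops frames when tracking is unavailable; the
// proposed option would forward them anyway, with a null pose.
func shouldForwardFrame(_ state: TrackingState,
                        forwardWithNullPose: Bool) -> Bool {
    switch state {
    case .normal, .limited:
        return true
    case .notAvailable:
        // e.g., total darkness: keep the image data only if the
        // null-pose option is enabled.
        return forwardWithNullPose
    }
}
```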
Thanks for the clarification. As of now, I actually keep sending the previous pose to RTAB-Map until ARKit regains feature tracking / enough visibility. Do you think this will present any issues for the pose estimation portion? The only effect I can deduce is that it might influence the loop closure storage.
Administrator
Hi,
Resending the previous pose may be a reasonable workaround. If odometry is recomputed with ICP odometry, it may be able to estimate the correct motion even where ARKit cannot.

cheers,
Mathieu