Detect more loop closures... stuck at local maximum?

Posted by Alan
URL: http://official-rtab-map-forum.206.s1.nabble.com/Detect-more-loop-closures-stuck-at-local-maximum-tp5682.html

Hello,

I have noticed that automatic loop closure detection is sensitive to the constraints of the loops already in the graph. In some cases, a single "bad" loop in the database can consistently cause a number of good loop closures to be rejected because the error after optimization is too large. The net effect is that the process converges on a total that is actually lower than what it would find if that one bad loop were rejected instead: it gets stuck at a local maximum. Perhaps it would be a good idea to add some way of handling these bad loop closures, for example based on the number of times they cause newly detected loops to be rejected?
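To make the idea concrete, here is a rough sketch of the kind of heuristic I mean (plain Python, not based on the actual RTAB-Map internals; optimize_max_error is a stand-in for the graph optimization step, and max_error plays the role of the rejection threshold, which I believe is what RGBD/OptimizeMaxError controls):

# Rough sketch of the idea, not RTAB-Map code: for every new candidate loop
# that gets rejected, check whether removing one existing constraint would
# let it pass, and keep a "blame" count per existing constraint.

def flag_suspect_links(existing, candidates, optimize_max_error,
                       max_error, min_blocked=3):
    """Return indices of existing links that block at least `min_blocked`
    rejected candidates (i.e. removing the link lets those candidates pass)."""
    blame = {i: 0 for i in range(len(existing))}
    for cand in candidates:
        if optimize_max_error(existing + [cand]) <= max_error:
            continue  # candidate accepted as-is, nothing to learn from it
        # Candidate rejected: re-test with each existing link removed in turn.
        for i in range(len(existing)):
            trimmed = existing[:i] + existing[i + 1:] + [cand]
            if optimize_max_error(trimmed) <= max_error:
                blame[i] += 1  # removing link i rescues this candidate
    return [i for i, n in blame.items() if n >= min_blocked]


if __name__ == "__main__":
    # Toy stand-in for the optimizer: any graph containing the "bad" link
    # optimizes with a large residual, everything else is fine.
    def toy_error(links):
        return 5.0 if "bad" in links else 0.5

    existing = ["bad", "good1", "good2"]
    candidates = ["cand1", "cand2", "cand3", "cand4"]
    print(flag_suspect_links(existing, candidates, toy_error, max_error=1.0))
    # -> [0], i.e. the "bad" link blocked all four candidates

Obviously re-optimizing once per (candidate, existing link) pair would be expensive on a large database, so in practice it would probably only make sense to re-test the links that are already close to the error threshold, but hopefully it shows what I mean.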

I haven't tested this as much as I would like, because I am having trouble getting the detection process to run to completion without crashing. (Unfortunately I can't open databases when my logger level is set to "debug", and no WARN- or ERROR-level events are logged.) I have avoided the crashes by disabling SBA from the main GUI before starting loop closure detection, but that pre-run settings dialog is missing from the db editor, so I haven't been able to update any of my databases with new loops using the auto-detect option.

I am running version 0.19.0.