Hi,
1) Do you want to run them at the same time (exactly the same trajectory)? You could record a rosbag of both the camera and the lidar, then launch rtabmap with either the camera or the lidar configuration. You can then export the poses of the two generated databases in RGB-D Dataset format and use their tool to compare the paths (see the first sketch below). Ref:
https://vision.in.tum.de/data/datasets/rgbd-dataset/tools

2) You have to debug whether the jump is caused by a wrong loop closure (rtabmap) or by your wheel_imu odometry. If it comes from the odometry, revise your odometry node. If it comes from rtabmap, decrease the covariance set in your odometry twist (see the second sketch below); that way rtabmap won't accept loop closures that deform the map too much.
3) What kind of lidar? But yes, you can use the lidar for local localization (refining the wheel odometry with icp_odometry) and still keep the camera for global loop closure detection.
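
For 1), here is a rough Python sketch of the kind of comparison the TUM benchmark tool (evaluate_ate.py) does, in case you want to script it yourself: it matches the two exported pose files by timestamp, rigidly aligns one trajectory to the other and prints the RMSE of the translational error. The file names at the bottom are placeholders for the poses you export from your two databases.

```python
#!/usr/bin/env python3
# Rough ATE comparison between two trajectories exported in the TUM RGB-D
# Dataset format (timestamp tx ty tz qx qy qz qw per line). This is only a
# sketch of what the benchmark's evaluate_ate.py does.
import numpy as np

def load_tum(path):
    """Return dict {timestamp: xyz position} from a TUM-format pose file."""
    poses = {}
    with open(path) as f:
        for line in f:
            if line.startswith('#') or not line.strip():
                continue
            v = [float(x) for x in line.split()]
            poses[v[0]] = np.array(v[1:4])
    return poses

def associate(a, b, max_dt=0.02):
    """Match poses of the two trajectories by closest timestamp (within max_dt s)."""
    pairs = []
    b_keys = sorted(b.keys())
    for ta in sorted(a.keys()):
        tb = min(b_keys, key=lambda t: abs(t - ta))
        if abs(tb - ta) <= max_dt:
            pairs.append((a[ta], b[tb]))
    return pairs

def ate_rmse(pairs):
    """Rigidly align the second trajectory to the first (Horn/Kabsch, no scale)
    and return the RMSE of the remaining translational differences."""
    P = np.array([p for p, _ in pairs]).T   # 3xN reference trajectory
    Q = np.array([q for _, q in pairs]).T   # 3xN trajectory to align
    pm, qm = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    W = (P - pm) @ (Q - qm).T               # cross-covariance
    U, _, Vt = np.linalg.svd(W)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # avoid reflections
    R = U @ S @ Vt
    t = pm - R @ qm
    err = P - (R @ Q + t)
    return np.sqrt((err ** 2).sum(axis=0).mean())

if __name__ == '__main__':
    cam = load_tum('poses_camera.txt')   # exported from the camera database
    lid = load_tum('poses_lidar.txt')    # exported from the lidar database
    pairs = associate(cam, lid)
    print('%d matched poses, ATE RMSE = %.3f m' % (len(pairs), ate_rmse(pairs)))
```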
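
For 2), a minimal sketch of where the twist covariance would be set in a wheel/IMU odometry node (rospy). The topic name, frame ids and the covariance values below are just placeholders for your setup; the point is that smaller covariance means rtabmap trusts the odometry more, so loop closures that would deform the map too much get rejected.

```python
#!/usr/bin/env python
# Sketch of the covariance part of a wheel/IMU odometry publisher.
import rospy
from nav_msgs.msg import Odometry

rospy.init_node('wheel_imu_odometry')
pub = rospy.Publisher('/odom', Odometry, queue_size=10)

def publish_odom(x, y, orientation_q, vx, vyaw, stamp):
    """Called from your odometry update loop; orientation_q is a
    geometry_msgs/Quaternion."""
    odom = Odometry()
    odom.header.stamp = stamp
    odom.header.frame_id = 'odom'
    odom.child_frame_id = 'base_link'
    odom.pose.pose.position.x = x
    odom.pose.pose.position.y = y
    odom.pose.pose.orientation = orientation_q
    odom.twist.twist.linear.x = vx
    odom.twist.twist.angular.z = vyaw
    # 6x6 row-major twist covariance (vx, vy, vz, vroll, vpitch, vyaw).
    # Lower these values if rtabmap accepts loop closures deforming the map.
    odom.twist.covariance = [0.0] * 36
    odom.twist.covariance[0] = 0.0001   # vx variance (placeholder value)
    odom.twist.covariance[7] = 0.0001   # vy variance (placeholder value)
    odom.twist.covariance[35] = 0.0005  # vyaw variance (placeholder value)
    pub.publish(odom)
```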
cheers,
Mathieu