When IMU is on, estimation is even worse than pure visual mode? #3
Comments
Thank you for the discussion. When you say "IMU is worse", I think there are two possible aspects: (1) VIO may fail, causing the pose estimator to fly away. (2) The IMU is working well, but the accuracy is worse than pure vision. So I think you mean the first case, right?
Yeah, you are right. I mean the first case as you described: VIO fails, the system seems to propagate the IMU integration as the estimator's output, and the trajectory quickly flies away.
On the other hand, I noticed that when VIO fails, the visual part of the system does not seem to try to relocalize itself in the global map, i.e. the visual part does not respond at all. This confuses me a lot. Is it possible to prohibit the IMU from outputting a long duration's integration result (which is certainly not reliable)?
Exactly. You can change the values of these parameters to see the effect, but I'm afraid this won't make a very large difference in performance.
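The idea discussed above, i.e. refusing to trust IMU-only propagation after vision has been lost for too long, can be sketched as a simple guard. This is a minimal illustrative sketch, not code from ORB-SLAM or ygz-slam; the class name, method names, and the 0.5 s threshold are all hypothetical.

```python
# Hypothetical guard that caps how long pure-IMU propagation is trusted.
MAX_IMU_ONLY_SEC = 0.5  # illustrative threshold, not taken from any repo

class ImuGuard:
    def __init__(self, max_imu_only_sec=MAX_IMU_ONLY_SEC):
        self.max_imu_only_sec = max_imu_only_sec
        self.last_visual_fix = None

    def on_visual_update(self, t):
        # Called whenever vision successfully tracks a frame at time t (seconds).
        self.last_visual_fix = t

    def imu_output_ok(self, t):
        # IMU-only output is trusted only for a short window after the last
        # successful visual update; afterwards the estimator should declare
        # itself lost instead of letting the trajectory fly away.
        if self.last_visual_fix is None:
            return False
        return (t - self.last_visual_fix) <= self.max_imu_only_sec

guard = ImuGuard()
guard.on_visual_update(10.0)
print(guard.imu_output_ok(10.3))  # True: still within the trusted window
print(guard.imu_output_ok(11.0))  # False: 1.0 s without vision, stop trusting IMU
```

In a real system the rejected output would trigger the relocalization/reinitialization path instead of simply being dropped.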
I see in the source code that:
Please note that initialization in VIO is a bit more complicated than in the pure-vision case. You need to estimate the gravity, velocity, scale, and IMU biases during initialization, rather than just the initial pose and map as in visual SLAM. Initialization may also fail if you don't provide sufficient motion, because the scale and the accelerometer bias are correlated and therefore ambiguous. We have provided an initialization process in the code, and you may call it after tracking is lost. A probably more convenient way of using VIO is to use only the gyroscope to estimate the rotation and to ignore the translation while vision is not working. Because the integrated angle is likely to be right, you can still obtain a rotation estimate in such cases, whereas the accelerometer integration may diverge fast if you don't know the velocity and the accelerometer bias.
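The gyroscope-only fallback described above boils down to integrating bias-corrected angular rates into an orientation. A minimal sketch of that integration with quaternions follows; it is an illustration of the general technique, not code from this repository.

```python
import math

def quat_mul(q, r):
    # Hamilton product of two quaternions in (w, x, y, z) order.
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(q, omega, bias, dt):
    # Propagate orientation q by one gyro sample: subtract the estimated
    # bias, then right-multiply by the incremental rotation quaternion.
    wx = omega[0] - bias[0]
    wy = omega[1] - bias[1]
    wz = omega[2] - bias[2]
    norm = math.sqrt(wx*wx + wy*wy + wz*wz)
    if norm * dt < 1e-12:
        return q
    half = norm * dt / 2.0
    s = math.sin(half) / norm  # sin(half) times the unit rotation axis
    dq = (math.cos(half), s*wx, s*wy, s*wz)
    out = quat_mul(q, dq)
    n = math.sqrt(sum(c*c for c in out))   # renormalize against drift
    return tuple(c / n for c in out)

# Rotate at 90 deg/s about z for 1 s (100 samples at 100 Hz):
q = (1.0, 0.0, 0.0, 0.0)
for _ in range(100):
    q = integrate_gyro(q, (0.0, 0.0, math.pi / 2), (0.0, 0.0, 0.0), 0.01)
# q now represents a 90-degree yaw, roughly (0.7071, 0, 0, 0.7071)
```

Translation is deliberately absent here: as the comment above explains, double-integrating the accelerometer without a good velocity and bias estimate diverges quickly.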
Since there is an initialization process in the code, why isn't it called? As shown in the code from my last comment, you seem to omit this step and put a TODO entry there. May I know the reason? By the way, could you let me know where I can find the initialization process?
The IMU init is written in an extra thread and it usually needs 30 seconds to converge. I wonder whether this is a good way to initialize the IMU, and I'm trying other methods (in a new repo, because ORB is hard to debug). Please take a look at ygz-stereo-inertial, where I use a very simple strategy to initialize the stereo-IMU VO system.
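One common ingredient of such a simple initialization is estimating the gyroscope bias by averaging measurements while the device is (approximately) stationary. The sketch below illustrates that idea only; it is not the actual ygz-stereo-inertial code, and the sample values are made up.

```python
def estimate_gyro_bias(gyro_samples):
    # Average angular-rate samples (rad/s) collected while the sensor is
    # held still; the mean reading is a simple estimate of the gyro bias.
    n = len(gyro_samples)
    if n == 0:
        raise ValueError("need at least one sample")
    return tuple(sum(s[i] for s in gyro_samples) / n for i in range(3))

# Two batches of noisy stationary readings (illustrative values):
samples = [(0.011, -0.004, 0.002)] * 50 + [(0.009, -0.006, 0.004)] * 50
bias = estimate_gyro_bias(samples)
# bias is approximately (0.010, -0.005, 0.003)
```

The estimated bias would then be subtracted from every subsequent gyro sample before integration; more elaborate initializations additionally solve for gravity, scale, and velocity over a sliding window.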
@highlightz @gaoxiang12 I am also trying to get around this TODO for relocalization in case of IMU being used -- do you guys have any update on that? Any help in this regards will be great! |
@TouqeerAhmad See https://github.com/jingpang/LearnVIORB/ for what you want. In that project, all the needed IMU reinitialization functions are provided in the file Tracking.cc.
Thank you @highlightz, I will look into it. Just a follow-up -- were you able to run YGZ-SLAM on your own data, specifically data from an Android mobile device?
@TouqeerAhmad Yeah, I fed my iPhone 6S image and IMU data to YGZ. Because that dataset is easy to handle (with smooth motion and enough texture), it runs well. But I've never tested Android data.
@highlightz Would you mind sharing the iPhone 6 data and the respective config file? I have S8/S7 data but nothing from an iOS device. How did you do your camera-IMU calibration? Did you use Kalibr for that? Thanks again for your help!
Yeah, please use Kalibr to acquire the extrinsic matrix as well as the IMU parameters.
@highlightz Thanks for sharing the data set. I just tried to run it, but for some reason it crashed right after updating the map scale and Map NavState. Did you make any changes to YGZ-SLAM to run with the iPhone 6 data? For me it runs fine on the EuRoC data sets but crashes on Android and now on iPhone data. Do we need to make any change in the code based on the device/sensors, other than just replacing the config file?
@highlightz Also, is the trajectory you posted above from the same data set you shared? As I see it, you are moving in a loop around the desks, but there is no loop in the trajectory. Please let me know if I missed something here.
As for your crash, you need to debug your program and trace back the reason why it crashes.
@highlightz Can you please confirm the orientation of the phone when you collected the data. Was it in landscape mode? I collected some data with my friend's iPhone 6, and acc_x for me is close to negative 9.8, but in the data set you shared it was always close to positive 9.8. Then I changed the app and it became positive 9.8. Can you please confirm the above?
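The sign discrepancy above is typically a convention issue: different platforms and logging apps report specific force with opposite signs. A quick way to check a recording is to average a few at-rest accelerometer samples and see which axis carries gravity and with which sign. This is a hypothetical helper, not part of any of the projects discussed here, and the sample readings are made up.

```python
def dominant_gravity_axis(acc_samples):
    # Average accelerometer readings (m/s^2) taken at rest and report which
    # axis carries gravity and with which sign. Opposite sign conventions
    # between logging apps can explain acc_x being ~ +9.8 in one recording
    # and ~ -9.8 in another for the same physical orientation.
    n = len(acc_samples)
    mean = [sum(s[i] for s in acc_samples) / n for i in range(3)]
    axis = max(range(3), key=lambda i: abs(mean[i]))
    sign = '+' if mean[axis] > 0 else '-'
    return 'xyz'[axis], sign

# Phone held so gravity shows up on +x (hypothetical readings):
print(dominant_gravity_axis([(9.81, 0.03, -0.12)] * 20))  # → ('x', '+')
```

If the detected sign disagrees with what the SLAM config expects, flipping the axis in a preprocessing step (or in the extrinsic calibration) is usually easier than changing the estimator code.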
@highlightz |
Hi, I recently tried to use the IMU with the mono version and also got bad results. It would be very helpful if someone could comment on this.
Hi! |
First of all, thanks for sharing your work.
As you may know, the EuRoC V102 dataset contains continuous rotational motion in its middle part, which I think poses great challenges to a mono-based SLAM system. Meanwhile, when IMU information is fused, the estimation should become more robust. So I built your algorithm on my Ubuntu 14.04 machine (Intel® Core™ i5-3230M CPU @ 2.60GHz × 4, 8 GB memory). The result surprises me, because the run with the IMU module off is better than with the IMU on: although there are several failures in pure-visual mode, the loop-closure function is available and can successfully relocalize. However, when the IMU is on, the state just flies out of the Vicon room, which I thought could not possibly happen. Below are the runs.
IMU on:
IMU off:
What are the possible reasons? Look forward to hearing from you.