AsynEIO: Asynchronous Monocular Event-Inertial Odometry Using Gaussian Process Regression

Abstract
Event cameras, when combined with inertial sensors, show significant potential for motion estimation in challenging scenarios such as high-speed maneuvers and low-light environments. While numerous methods exist for producing such estimates, most boil down to solving a synchronous discrete-time fusion problem. However, the asynchronous nature of event cameras and their unique fusion mechanism with inertial sensors remain underexplored. In this article, we introduce a monocular event-inertial odometry method called asynchronous event-inertial odometry (AsynEIO), designed to fuse asynchronous event and inertial data within a unified Gaussian process (GP) regression framework. Our approach incorporates an event-driven front-end that tracks feature trajectories directly from raw event streams at high temporal resolution. These tracked feature trajectories, along with various inertial factors, are integrated into the same GP regression framework to enable asynchronous fusion. By deriving analytical residual Jacobians and noise models, our method constructs a factor graph that is iteratively optimized and pruned using a sliding-window optimizer. Comparative assessments highlight the performance of different inertial fusion strategies, suggesting optimal choices for varying conditions. Experimental results on both public datasets and our own event-inertial sequences indicate that AsynEIO outperforms existing methods, especially in high-speed and low-illumination scenarios.
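The abstract's central idea, querying a continuous-time trajectory at arbitrary (asynchronous) event timestamps via GP regression, can be illustrated with a toy sketch. Everything below is an illustrative assumption rather than the paper's actual formulation: AsynEIO uses GP motion priors inside a sliding-window factor graph, whereas this minimal example simply fits a squared-exponential GP to noisy 1-D position samples and evaluates its posterior mean at irregular query times.

```python
# Toy sketch: GP regression over a 1-D trajectory, queried at
# asynchronous timestamps. Illustrative only; not the AsynEIO method.
import numpy as np

def rbf_kernel(t1, t2, length_scale=0.15, variance=1.0):
    """Squared-exponential covariance between two time vectors."""
    d = t1[:, None] - t2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_interpolate(t_train, y_train, t_query, noise=1e-4):
    """Posterior mean of a zero-mean GP at the query timestamps."""
    K = rbf_kernel(t_train, t_train) + noise * np.eye(len(t_train))
    K_star = rbf_kernel(t_query, t_train)
    return K_star @ np.linalg.solve(K, y_train)

# Sparse samples on a regular grid (stand-in for synchronized states) ...
t_train = np.linspace(0.0, 1.0, 11)
y_train = np.sin(2 * np.pi * t_train)
# ... queried at irregular "event" timestamps between the grid points.
t_query = np.array([0.13, 0.37, 0.52, 0.88])
y_query = gp_interpolate(t_train, y_train, t_query)
```

The appeal of a continuous-time GP representation here is that no event needs to be rounded to the nearest discrete state: each asynchronous measurement can be tied to the trajectory at its exact timestamp.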
| Original language | English |
|---|---|
| Pages (from-to) | 5020-5039 |
| Number of pages | 20 |
| Journal | IEEE Transactions on Robotics |
| Volume | 41 |
| DOIs | |
| State | Published - 2025 |
Keywords
- Event-inertial fusion
- Gaussian process (GP) regression
- motion estimation