Continuous-Time Gaussian Process Motion-Compensation for Event-Vision Pattern Tracking with Distance Fields
- Publisher:
- IEEE
- Publication Type:
- Conference Proceeding
- Citation:
- 2023 IEEE International Conference on Robotics and Automation (ICRA), May 2023, pp. 804-812
- Issue Date:
- 2023-07-04
Closed Access
Filename | Description | Size
---|---|---
Continuous-Time_Gaussian_Process_Motion-Compensation_for_Event-Vision_Pattern_Tracking_with_Distance_Fields.pdf | Published version | 1.27 MB
This item is closed access and not available.
This work addresses the issue of motion compensation and pattern tracking in event camera data. An event camera generates asynchronous streams of events, triggered independently by each pixel upon a change in the observed intensity. While providing great advantages in low-light and rapid-motion scenarios, such unconventional data present significant research challenges, as traditional vision algorithms are not directly applicable to this sensing modality. The proposed method decomposes the tracking problem into a local SE(2) motion-compensation step followed by a homography registration of small motion-compensated event batches. The first component relies on Gaussian Process (GP) theory to model the continuous occupancy field of the events in the image plane, and embeds the camera trajectory in the covariance kernel function. In doing so, estimating the trajectory becomes analogous to GP hyperparameter learning: it is performed by maximising the log marginal likelihood of the data. The continuous occupancy fields are then turned into distance fields and used as templates for homography-based registration. By benchmarking the proposed method against other state-of-the-art techniques, we show that our open-source implementation performs high-accuracy motion compensation and produces high-quality tracks in real-world scenarios.
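The abstract's key idea, estimating parameters by maximising the GP log marginal likelihood, can be illustrated with a generic sketch. This is not the paper's implementation: the squared-exponential kernel, 1-D toy data, and grid search below are all assumptions standing in for the paper's trajectory-aware kernel and optimiser; they only show the standard log-marginal-likelihood objective that hyperparameter learning maximises.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale, variance=1.0):
    # Squared-exponential kernel. In the paper, the camera trajectory is
    # embedded in the kernel; here a plain lengthscale stands in for those
    # parameters (illustrative assumption).
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def log_marginal_likelihood(x, y, lengthscale, noise=1e-2):
    # Standard GP log marginal likelihood:
    #   log p(y) = -1/2 y^T K^{-1} y - 1/2 log|K| - n/2 log(2*pi)
    # computed stably via a Cholesky factorisation of K.
    K = rbf_kernel(x, x, lengthscale) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(x) * np.log(2 * np.pi))

# Toy data: noisy samples of a smooth signal.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(40)

# "Estimation as hyperparameter learning": select the hyperparameter
# that maximises the log marginal likelihood (grid search for clarity).
candidates = np.linspace(0.02, 0.5, 50)
best = max(candidates, key=lambda ell: log_marginal_likelihood(x, y, ell))
print(f"best lengthscale: {best:.3f}")
```

In the paper this same objective is maximised over the SE(2) trajectory parameters inside the kernel rather than a scalar lengthscale, and a gradient-based optimiser would replace the grid search.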