SkyLoc: Cross-Modal Global Localization With a Sky-Looking Fish-Eye Camera and OpenStreetMap
- Publisher:
- IEEE (Institute of Electrical and Electronics Engineers)
- Publication Type:
- Journal Article
- Citation:
- IEEE Transactions on Intelligent Transportation Systems, vol. 26, no. 5, pp. 5832-5842, 2025
- Issue Date:
- 2025-01-01
This item is open access.
Global localization estimates geo-referenced locations (e.g., longitude and latitude), which is a fundamental capability for autonomous vehicles. Most existing solutions rely on Global Navigation Satellite Systems (GNSS), whose accuracy can be degraded by multi-path effects or signal occlusion in urban environments. Some GNSS-free methods achieve global localization by comparing current on-line sensory data with pre-built databases/maps. However, they require tedious human effort to drive a vehicle to collect and maintain these databases/maps. Moreover, most of these methods use front-looking cameras or LiDARs, so the captured data can easily be contaminated by dynamic objects (e.g., moving vehicles and pedestrians). To address these problems, this paper proposes a novel global localization method that compares an image from a sky-looking fish-eye camera with the publicly available OpenStreetMap (OSM) and uses a particle filter to achieve real-time metric localization in dynamic traffic environments. To evaluate our method, we extend a public dataset with OSM data retrieved through the given geo-referenced location information. Experimental results demonstrate the effectiveness and efficiency of our method.
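To illustrate the particle-filter idea named in the abstract, the sketch below weights particles over (x, y, heading) by how well a binary "sky mask" from the fish-eye camera matches a sky mask rendered from OSM at each particle pose. This is a minimal, generic sketch under stated assumptions, not the paper's implementation; function names such as render_osm_sky_mask and sky_similarity, the IoU weighting, and all noise parameters are illustrative placeholders.

```python
# Minimal particle-filter localization sketch (illustrative only).
# Assumptions: the camera image has been reduced to a binary sky mask,
# and OSM building footprints can be rendered into a comparable mask
# at any candidate pose. Names and parameters are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

def render_osm_sky_mask(pose, osm_buildings, size=64):
    """Hypothetical renderer: rasterize the sky region visible at `pose`
    from OSM building footprints into a binary image (1 = open sky)."""
    # Placeholder: return an all-sky mask; a real renderer would project
    # nearby building outlines into the fish-eye view at this pose.
    return np.ones((size, size), dtype=np.uint8)

def sky_similarity(observed_mask, rendered_mask):
    """Intersection-over-union between observed and rendered sky masks."""
    inter = np.logical_and(observed_mask, rendered_mask).sum()
    union = np.logical_or(observed_mask, rendered_mask).sum()
    return inter / union if union > 0 else 0.0

def particle_filter_step(particles, weights, odometry, observed_mask, osm_buildings):
    """One predict-update-resample cycle over particles [x, y, theta]."""
    # Predict: propagate particles with odometry plus Gaussian motion noise.
    noise = rng.normal(scale=[0.2, 0.2, 0.02], size=particles.shape)
    particles = particles + odometry + noise

    # Update: weight each particle by sky-mask similarity against OSM.
    for i, pose in enumerate(particles):
        rendered = render_osm_sky_mask(pose, osm_buildings)
        weights[i] = sky_similarity(observed_mask, rendered) + 1e-9
    weights = weights / weights.sum()

    # Resample when the effective sample size drops below half the particles.
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))

    # Weighted mean as the metric pose estimate.
    estimate = np.average(particles, axis=0, weights=weights)
    return particles, weights, estimate

# Toy usage: particles spread around a coarse prior, one filter step.
N = 500
particles = rng.normal(scale=[5.0, 5.0, 0.1], size=(N, 3))
weights = np.full(N, 1.0 / N)
odometry = np.array([1.0, 0.0, 0.01])                 # per-step ego-motion
observed_mask = np.ones((64, 64), dtype=np.uint8)     # binary sky mask from the fish-eye image
particles, weights, estimate = particle_filter_step(
    particles, weights, odometry, observed_mask, osm_buildings=None)
print("estimated pose (x, y, theta):", estimate)
```

In a real pipeline the rendered and observed masks would come from OSM building geometry and a segmented fish-eye frame, and the filter would run once per frame to track the geo-referenced pose.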