Long-term Visual Localization with Mobile Sensors

CVPR 2023

Shen Yan1,3, Yu Liu1, Long Wang2, Zehong Shen3, Zhen Peng2, Haomin Liu2, Maojun Zhang1, Guofeng Zhang3, Xiaowei Zhou3

1National University of Defense Technology    2SenseTime Research    3Zhejiang University   


Contributions:
- A novel outdoor visual localization framework with multi-sensor priors for robust and accurate localization under extreme visual changes.
- A benchmark of existing methods, demonstrating the effectiveness of the proposed approach.
- A new dataset for multi-sensor visual localization with seasonal and illumination variations.

Despite remarkable advances in image matching and pose estimation, image-based localization of a camera in a temporally varying outdoor environment remains a challenging problem due to the huge appearance disparity between query and reference images caused by illumination, seasonal, and structural changes. In this work, we propose to leverage additional sensors on a mobile phone, mainly GPS, compass, and gravity sensor, to solve this challenging problem. We show that these mobile sensors provide decent initial poses and effective constraints to reduce the search space in image matching and final pose estimation. With the initial pose, we are also able to devise a direct 2D-3D matching network to efficiently establish 2D-3D correspondences, instead of the tedious 2D-2D matching used in existing systems. As no public dataset exists for the studied problem, we collect a new dataset that provides a variety of mobile sensor data and significant scene appearance variations, and we develop a system to acquire ground-truth poses for query images. We benchmark our method as well as several state-of-the-art baselines and demonstrate the effectiveness of the proposed approach. The code and dataset will be released publicly.
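To illustrate how such sensor readings can seed localization, here is a minimal sketch, not the paper's actual implementation, of turning GPS, compass, and gravity measurements into a rough 6-DoF initial pose: GPS gives a flat-earth East-North-Up translation, the gravity vector fixes roll and pitch, and the compass supplies yaw. All function names and conventions below are hypothetical assumptions for illustration.

```python
import math

def gps_to_local_enu(lat, lon, alt, ref_lat, ref_lon, ref_alt):
    """Flat-earth approximation: GPS (WGS-84 degrees / meters) to local
    East-North-Up offsets in meters around a reference point."""
    R_EARTH = 6378137.0  # WGS-84 equatorial radius in meters
    east = math.radians(lon - ref_lon) * R_EARTH * math.cos(math.radians(ref_lat))
    north = math.radians(lat - ref_lat) * R_EARTH
    up = alt - ref_alt
    return east, north, up

def gravity_to_roll_pitch(gravity):
    """Recover roll and pitch (radians) from a gravity-sensor reading
    (ax, ay, az) in the device frame; yaw is unobservable from gravity
    alone, which is why the compass is needed."""
    ax, ay, az = gravity
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

def initial_pose(lat, lon, alt, yaw_deg, gravity, ref):
    """Combine GPS (translation), compass (yaw) and gravity (roll/pitch)
    into a coarse 6-DoF prior: (ENU translation, ZYX Euler angles)."""
    t = gps_to_local_enu(lat, lon, alt, *ref)
    roll, pitch = gravity_to_roll_pitch(gravity)
    yaw = math.radians(yaw_deg)
    return t, (yaw, pitch, roll)
```

Such a prior is only metre- and degree-level accurate, but that is enough to restrict retrieval and matching to a small neighborhood of the database, which is the role the sensor priors play in the proposed framework.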

Overview video


Citation

@inproceedings{yan2023longterm,
    title={Long-term Visual Localization with Mobile Sensors},
    author={Yan, Shen and Liu, Yu and Wang, Long and Shen, Zehong and Peng, Zhen and Liu, Haomin and Zhang, Maojun and Zhang, Guofeng and Zhou, Xiaowei},
    booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    year={2023}
}