Smart Mobility and Multi-Sensor Fusion

Status: active

This program turns positioning algorithms into deployable systems across the mobility stack — autonomous vehicles, smartphones, drones, urban service robotics, and smart-city infrastructure. Where Integrity-First Navigation asks “is this position correct?” and AI for Positioning asks “how do we compute it?”, Smart Mobility asks “how do we use it?”

Why it matters. Embodied systems cannot depend on one sensor. They need navigation architectures that remain stable when GNSS is blocked, perception degrades, or low-cost sensors drift. The program spans AV perception integration (LiDAR + camera + GNSS + INS), smartphone-grade urban positioning (3DMA GNSS for L5/E5a/B2a receivers), UAV navigation, connected-AV safety, and urban service robotics.
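The core idea behind such architectures is that no single sensor is trusted unconditionally: each estimate is weighted by its current uncertainty, so a blocked or degraded sensor is automatically down-weighted rather than switched off. A minimal illustrative sketch of inverse-variance weighted fusion (the function and the numbers are hypothetical, not taken from any project code):

```python
def fuse(estimates, variances):
    """Inverse-variance weighted fusion of independent 1-D position
    estimates. A blocked or degraded sensor is modelled simply by a
    large variance, so it contributes little to the fused solution."""
    weights = [1.0 / v for v in variances]
    fused = sum(w * x for w, x in zip(weights, estimates)) / sum(weights)
    fused_var = 1.0 / sum(weights)  # fused estimate is tighter than any input
    return fused, fused_var

# Open sky: GNSS (variance 1 m^2) dominates odometry (variance 4 m^2).
open_sky, _ = fuse([10.0, 10.8], [1.0, 4.0])

# Urban canyon: GNSS degraded (variance 100 m^2); odometry now dominates.
canyon, _ = fuse([10.0, 10.8], [100.0, 4.0])
```

In the open-sky case the fused position stays close to the GNSS estimate; in the urban-canyon case it moves toward the odometry estimate, without any explicit mode switch. Real systems in this space (e.g. factor-graph or Kalman-filter back ends) generalize the same principle to full state vectors and time-varying weights.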


Recognition. PolyU Presidential Award in Knowledge Transfer 2022–24; IPIN 2024 General Chair (Hong Kong, 300+ delegates).

Representative publications. Safety-Quantifiable Planar-Feature LiDAR Localization for Intelligent Vehicles (Zhang et al., IEEE TIV 2024); FGO-Based Smartphone IMU-Only Indoor SLAM (Bai, Wen, Hsu, Yang, IEEE TAES 2024); Adaptive Weighted GNSS/VINS/Wi-Fi RTT-based Seamless Positioning System for Smartphone (Su et al., IPIN 2024).

Funding. RGC CRF Co-PI (HK$7M total, HK$0.6M share, AV safety); ITF-ITSP HK$801k (PI, smartphone urban positioning); Google Research Award US$40k (PI); RC-Metaverse HK$300k (PI).

Started: 2017