Jianyuan Ruan

Public datasets play a crucial role in advancing autonomous robotics research, and the rapid evolution of sensors and applications continually drives the need for new ones. For instance, the shift from mechanical LiDAR (Light Detection and Ranging) sensors on autonomous vehicles to (hybrid) solid-state LiDAR technologies such as MEMS (micro-electromechanical systems) LiDAR has brought enhanced durability and reduced costs. However, datasets supporting research on these sensors are scarce. This paper presents the multi-modal HK-MEMS dataset, incorporating data from LiDARs, a camera, GNSS, and inertial navigation systems. Notably, it is the first dataset to offer automotive-grade MEMS LiDAR data for research in Simultaneous Localization and Mapping (SLAM). Compared with existing datasets, ours emphasizes extreme conditions such as degenerate urban tunnels and highly dynamic scenes, aiming to improve the robustness of SLAM systems. The data were collected on various platforms, including a handheld device, a mobile robot, and, notably, buses exhibiting real driving behavior, totaling 187 minutes and 75.4 kilometers. State-of-the-art SLAM methods are evaluated on this benchmark; the results highlight the challenges posed by extreme environments and underscore the ongoing need for more robust SLAM systems. This dataset thus serves both as a platform for exploring the potential and limitations of MEMS LiDAR and as a challenging benchmark for SLAM in urban navigation scenarios. The data are available at https://github.com/RuanJY/HK_MEMS_Dataset.
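As a quick-start illustration (not taken from the paper), the sketch below shows one plausible way to iterate over LiDAR scans from a sequence, assuming the data are distributed as ROS1 bag files, a common convention for SLAM datasets. The bag filename and topic name are placeholders; consult the repository README for the actual format and topic names.

```python
# Minimal sketch: iterate over LiDAR scans from one dataset sequence.
# Assumptions (hypothetical -- check the repository README for actual values):
#   - sequences are distributed as ROS1 bag files
#   - the MEMS LiDAR publishes sensor_msgs/PointCloud2 on '/lidar_points'
# Requires a ROS1 (e.g. Noetic) Python environment.
import rosbag
from sensor_msgs import point_cloud2

with rosbag.Bag('hk_mems_sequence.bag') as bag:  # placeholder filename
    for topic, msg, stamp in bag.read_messages(topics=['/lidar_points']):
        # Decode each PointCloud2 message into (x, y, z) tuples, skipping
        # NaN returns that LiDARs can produce at range limits.
        xyz = list(point_cloud2.read_points(
            msg, field_names=('x', 'y', 'z'), skip_nans=True))
        print('%.3f s: %d points' % (stamp.to_sec(), len(xyz)))
```

The decoded points can then be fed to any SLAM front end for evaluation against the ground truth provided with the dataset.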