International Journal of Control, Automation, and Systems 2025; 23(1): 41-54
https://doi.org/10.1007/s12555-024-0003-4
© The International Journal of Control, Automation, and Systems
Simultaneous localization and mapping (SLAM) algorithms have been studied for several years to achieve precise pose estimation for autonomous vehicles. However, these studies have primarily targeted small ground vehicles and unmanned aerial vehicles (UAVs). This paper therefore proposes an improved LiDAR-based odometry estimation method for autonomous buses in urban environments, addressing a research gap left by the predominant focus on smaller platforms. Many SLAM algorithms adopt the LiDAR-inertial odometry (LIO) approach, which uses inertial measurement unit (IMU) sensors to enhance accuracy. However, because the IMU is highly sensitive to external conditions, its application to a full-size bus is impractical. This study instead leverages a vehicle kinematics model and chassis information, including wheel speed, to estimate velocity and yaw rate, thereby improving robustness and accuracy over the baseline LiDAR odometry method. Subsequently, the LiDAR map is transformed from the local frame to the world frame by aligning the global navigation satellite system (GNSS) trajectory with the LiDAR SLAM trajectory. Results are presented from actual vehicle data collected on urban tracks. Additionally, a non-Gaussian noise model is used to intentionally corrupt GNSS measurements and validate the robustness of the alignment methods. Experimental results demonstrate that the fault estimation and drift observed in the conventional LIO method are mitigated. In the world transformation of the LiDAR map, the proposed matching methods yield robust results that closely approximate the desired transformation, even in the presence of GNSS position errors.
Keywords: Autonomous bus, LiDAR-inertial odometry, map transformation, sensor fusion, vehicle kinematics model, wheel speed.
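To illustrate the chassis-based motion estimate described in the abstract, the following is a minimal sketch of how velocity and yaw rate can be derived from the rear wheel speeds under a planar kinematic model. This is an illustrative assumption rather than the authors' implementation, and the track-width value used in the example is hypothetical.

```python
# Minimal sketch (assumed, not the paper's implementation): estimating
# vehicle velocity and yaw rate from rear wheel speeds under a planar
# kinematic model. The track width below is a hypothetical example value.

def chassis_motion_estimate(v_rear_left, v_rear_right, track_width):
    """Return (velocity, yaw_rate).

    velocity : mean of the rear wheel speeds (m/s)
    yaw_rate : wheel-speed difference divided by track width (rad/s),
               positive for a left (counter-clockwise) turn
    """
    velocity = 0.5 * (v_rear_left + v_rear_right)
    yaw_rate = (v_rear_right - v_rear_left) / track_width
    return velocity, yaw_rate

# Example: a bus turning gently left at about 10 m/s, assuming a
# 2.0 m track width.
v, r = chassis_motion_estimate(9.9, 10.1, 2.0)
print(v, r)
```

Such a wheel-speed-based estimate provides the velocity and yaw-rate inputs that the paper uses in place of IMU measurements.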
Published online January 1, 2025
Woojin Kwon*, Hyunsung Lee, Ayoung Kim, and Kyongsu Yi
Seoul National University