Park, Yeong Sang (Korea Advanced Institute of Science and Technology, Department of Civil and Environmental Engineering, Daejeon, South Korea);
Shin, Young-Sik (Korea Institute of Machinery and Materials, Daejeon, South Korea);
Kim, Joowan (Korea Advanced Institute of Science and Technology, Department of Civil and Environmental Engineering, Daejeon, South Korea);
Kim, Ayoung (Korea Advanced Institute of Science and Technology, Department of Civil and Environmental Engineering, Daejeon, South Korea)
Achieving general 3D motion estimation under all visibility conditions has been a key challenge in robotics, especially in extreme environments. Widely adopted camera- and LiDAR-based motion estimation deteriorates critically under fog or smoke. In this letter, we devise a unique sensor system for 3D velocity estimation by combining two orthogonal radar sensors. The proposed configuration secures returns from static objects on the ground. Aiming at realistic sensor deployment in harsh environments, we enclose the bare sensor rig in a plastic casing. As will be shown, the proposed velocity-based ego-motion estimation delivers reliable performance compared with existing point-matching-based methods, which degrade when measurements attenuate due to the casing. Furthermore, we introduce a novel radar instant velocity factor for a pose-graph simultaneous localization and mapping (SLAM) framework and solve for 3D ego-motion in integration with an IMU. The validation reveals that the proposed method can estimate general 3D motion both indoors and outdoors, across varying visibility and degrees of environmental structure.
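The core idea summarized above — recovering the full 3D ego-velocity by combining per-point Doppler (radial-speed) returns from two orthogonally mounted radars — can be sketched with a standard least-squares formulation. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: each static scatterer at unit ray direction d_i observes radial speed v_r,i = −d_i · v_ego, and stacking returns from both radars yields an overdetermined linear system that is well-conditioned only because the two sensors' fields of view are orthogonal.

```python
import numpy as np

def estimate_ego_velocity(directions, radial_speeds):
    """Least-squares 3D ego-velocity from static radar returns.

    directions    : (N, 3) unit ray directions to detected points
    radial_speeds : (N,) measured Doppler (radial) speeds

    Each return contributes one row of the linear system
    -d_i . v = v_r,i, solved in the least-squares sense.
    """
    A = -np.asarray(directions, dtype=float)
    b = np.asarray(radial_speeds, dtype=float)
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v

# Synthetic check: one forward-facing and one downward-facing radar
# (hypothetical mounting, chosen only to mimic the orthogonal rig).
rng = np.random.default_rng(0)
v_true = np.array([1.0, 0.2, -0.3])                 # ground-truth ego-velocity
dirs_fwd = rng.normal(size=(20, 3)) + np.array([5.0, 0.0, 0.0])
dirs_down = rng.normal(size=(20, 3)) + np.array([0.0, 0.0, -5.0])
dirs = np.vstack([dirs_fwd, dirs_down])
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)  # normalize to unit rays
v_r = -dirs @ v_true                                 # noise-free Doppler readings
v_est = estimate_ego_velocity(dirs, v_r)
print(np.allclose(v_est, v_true))
```

With returns from only one radar, rays cluster around a single axis and the component of velocity perpendicular to that axis is poorly observable; the orthogonal second radar is what makes the system solvable for all three components.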