Environmental Localization and Detection Using 2D LIDAR on a Non-Holonomic Differential Mobile Robot

Muhammad Azriel Rizqifadiilah*, Trihastuti Agustinah, Achmad Jazidie

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

1 Citation (Scopus)

Abstract

This research presents a novel approach for environmental localization and multi-object detection using a 2D LIDAR system integrated into a non-holonomic differential mobile robot. The proposed methodology combines real-time SLAM-based localization with Euclidean Clustering for detecting surrounding objects, enabling the robot to localize effectively within its working environment and identify multiple obstacles. By segmenting LIDAR data into discrete object clusters, the system can accurately determine the size and shape of detected obstacles, providing a detailed understanding of the environment. Simulation results demonstrate that the proposed approach effectively addresses the challenges of localization and multi-object detection. Euclidean Clustering proved substantially more efficient, completing the mission in 443.1677 seconds, whereas the K-Nearest Neighbors (K-NN) technique required 11,256 seconds with K = 100 and 16,542 seconds with K = 200. Because Euclidean Clustering delivers far better time performance in these scenarios, it improves the overall performance of the autonomous mobile robot.
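The paper does not include source code, so the following is only a minimal sketch of how the Euclidean Clustering step described above is typically applied to a 2D LIDAR scan: points within a fixed distance tolerance of one another (directly or through a chain of neighbours) are grouped into one object cluster, and each cluster's centroid and extent approximate the obstacle's position and size. The tolerance, minimum cluster size, and the use of SciPy's KD-tree for neighbour queries are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def euclidean_clustering(points, tolerance=0.3, min_size=3):
    """Group 2D LIDAR points into clusters.

    Two points belong to the same cluster if they are within
    `tolerance` metres of each other, directly or via a chain
    of intermediate neighbours (region growing).
    """
    tree = cKDTree(points)          # KD-tree for fast radius queries
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()      # start a new cluster from any point
        queue = [seed]
        cluster = [seed]
        while queue:
            idx = queue.pop()
            # Absorb every unvisited neighbour within the tolerance radius.
            for nb in tree.query_ball_point(points[idx], r=tolerance):
                if nb in unvisited:
                    unvisited.remove(nb)
                    queue.append(nb)
                    cluster.append(nb)
        if len(cluster) >= min_size:  # discard isolated noise returns
            clusters.append(points[cluster])
    return clusters

# Hypothetical scan: two small obstacles in front of the robot.
scan = np.array([[1.00, 0.00], [1.05, 0.02], [1.10, 0.05],
                 [3.00, 1.00], [3.02, 1.04], [2.97, 1.08]])
for i, c in enumerate(euclidean_clustering(scan, tolerance=0.2)):
    centroid = c.mean(axis=0)
    extent = c.max(axis=0) - c.min(axis=0)   # bounding-box size estimate
    print(f"object {i}: centroid={centroid}, size={extent}")
```

Unlike a K-NN pass with a fixed K, this region-growing formulation touches each point only once and queries only its local neighbourhood, which is consistent with the large runtime gap the abstract reports between the two techniques.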

Original language: English
Pages (from-to): 277-283
Number of pages: 7
Journal: IET Conference Proceedings
Volume: 2023
Issue number: 11
DOIs
Publication status: Published - 2023
Event: 2023 International Conference on Green Energy, Computing and Intelligent Technology, GEn-CITy 2023 - Hybrid, Iskandar Puteri, Malaysia
Duration: 10 Jul 2023 - 12 Jul 2023

