Abstract

In conditions that are dangerous for humans and their environment, robots offer a practical solution. Various sensors are used to determine an obstacle-free path and the exact position of a robot; however, conventional sensors are limited in detection distance, spatial resolution, and processing complexity. In this study, an autonomous mobile robot equipped with a Light Detection and Ranging (LiDAR) sensor has been developed to avoid obstacles. A Braitenberg vehicle strategy is used to navigate the robot's movements. Sensor data collection and the control algorithm are implemented on a Raspberry Pi 3 single-board computer. The experimental results show that the sensor measures distance consistently, unaffected by object color or ambient light intensity. The mobile robot can avoid colored objects of different sizes, and it can navigate inside a room without striking the walls or obstacles.
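The Braitenberg vehicle strategy mentioned above maps sensor readings directly to wheel speeds, with no explicit path planning. A minimal sketch of one such controller is shown below; all function names, thresholds, and speed values are illustrative assumptions, not the paper's implementation:

```python
MAX_SPEED = 1.0      # normalized wheel command
SENSE_RANGE = 2.0    # metres; readings beyond this are ignored

def proximity(distance_m):
    """Map a LiDAR range reading to an obstacle proximity in [0, 1]."""
    if distance_m >= SENSE_RANGE:
        return 0.0
    return 1.0 - distance_m / SENSE_RANGE

def sector_minima(scan):
    """Split a front-facing scan (list of (angle_deg, distance_m) pairs)
    into left (< 0 deg) and right (>= 0 deg) sectors, keeping the
    nearest reading in each."""
    left = min((d for a, d in scan if a < 0), default=SENSE_RANGE)
    right = min((d for a, d in scan if a >= 0), default=SENSE_RANGE)
    return left, right

def braitenberg_step(left_dist_m, right_dist_m):
    """Ipsilateral excitatory wiring (Braitenberg vehicle 2a, 'fear'):
    a nearby obstacle on one side speeds up the wheel on that same
    side, so the robot turns away from it."""
    left_speed = min(MAX_SPEED, 0.5 * MAX_SPEED + proximity(left_dist_m))
    right_speed = min(MAX_SPEED, 0.5 * MAX_SPEED + proximity(right_dist_m))
    return left_speed, right_speed

# Example: obstacle 0.5 m away on the left, clear on the right.
scan = [(-30, 0.5), (-10, 1.8), (10, 3.0), (30, 3.5)]
left, right = braitenberg_step(*sector_minima(scan))
```

With the left side obstructed, the left wheel is commanded faster than the right, so the robot steers to the right and away from the obstacle.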

Original language: English
Title of host publication: Proceedings of 2019 International Conference on Information and Communication Technology and Systems, ICTS 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 197-202
Number of pages: 6
ISBN (Electronic): 9781728121338
DOIs
Publication status: Published - Jul 2019
Event: 12th International Conference on Information and Communication Technology and Systems, ICTS 2019 - Surabaya, Indonesia
Duration: 18 Jul 2019 → …

Publication series

Name: Proceedings of 2019 International Conference on Information and Communication Technology and Systems, ICTS 2019

Conference

Conference: 12th International Conference on Information and Communication Technology and Systems, ICTS 2019
Country/Territory: Indonesia
City: Surabaya
Period: 18/07/19 → …

Keywords

  • Autonomous mobile robot
  • Braitenberg vehicle
  • LiDAR
  • Obstacle avoidance

Fingerprint

Dive into the research topics of 'Lidar-based obstacle avoidance for the autonomous mobile robot'. Together they form a unique fingerprint.