Abstract

LiDAR is a visual sensor that measures distance and builds a description of the environment. It is required in many kinds of vehicle navigation, especially in autonomous systems. However, 3D LiDAR units are still expensive on the market. This study developed and constructed a 3D LiDAR consisting of a single-point LiDAR as the main sensor and a neural network for classifying objects. A proportional-integral-derivative (PID) controller was used to maintain the motor rotation and thereby stabilize the scanning process. An Arduino Mega microcontroller served as the main processor to acquire the LiDAR data, control the motor speed, and communicate the data to a computer. The 3D LiDAR was tested on five different objects, and the experimental results show that the system recognized all of them with a 100% success rate. The proposed system is expected to support road safety in autonomous vehicles, and the 3D LiDAR can be marketed at a low price.

Original language: English
Title of host publication: 2022 1st International Conference on Information System and Information Technology, ICISIT 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 336-341
Number of pages: 6
ISBN (Electronic): 9781665402002
DOIs
Publication status: Published - 2022
Event: 1st International Conference on Information System and Information Technology, ICISIT 2022 - Virtual, Online, Indonesia
Duration: 27 Jul 2022 - 28 Jul 2022

Publication series

Name: 2022 1st International Conference on Information System and Information Technology, ICISIT 2022

Conference

Conference: 1st International Conference on Information System and Information Technology, ICISIT 2022
Country/Territory: Indonesia
City: Virtual, Online
Period: 27/07/22 - 28/07/22

Keywords

  • LiDAR
  • Neural Network
  • object classification
  • road safety

Fingerprint

Dive into the research topics of 'Design of 3D LiDAR Combined with Neural Network for Object Classification'.
