Abstract

As humans we possess many remarkable abilities, and one of them comes from our vision. Through vision alone we can obtain a great deal of information: an object's identity, faces, events, or even a complete image assembled from partial views. Much research has been devoted to replicating human vision in machines, because a single scene can carry a large amount of information. In this study we are interested in obtaining heading information from a sequence of images captured by a camera. There are many ways to obtain heading, such as gyroscope or compass sensors, each with its own advantages and weaknesses. By using a camera, mechanical limitations that disrupt heading measurement, such as wheel slippage, uneven terrain, and tilt, can be avoided. With the proposed algorithm, heading is calculated solely from the sequence of images. Our experimental results show that heading can be calculated with an average absolute error of 1.23078° in an outdoor environment and 1.02368° in an indoor environment.
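The paper's full method (per its title) combines corner feature detection with optical flow tracking. As a minimal illustrative sketch, not the authors' implementation, the yaw component of heading change can be approximated from the mean horizontal displacement of tracked features between two frames, assuming a pure rotation and a known horizontal field of view; all names and parameters below are hypothetical:

```python
import math

def heading_change_deg(flow_dx_px, image_width_px, hfov_deg):
    """Estimate the camera's yaw (heading change), in degrees, from the
    per-feature horizontal optical-flow displacements (in pixels).

    Assumes a pure rotation: a yaw of theta shifts distant features by
    roughly f * tan(theta) pixels, where f is the focal length in pixels
    derived from the horizontal field of view.
    """
    # Focal length in pixels from the horizontal field of view.
    f_px = (image_width_px / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    # Average the horizontal flow over all tracked corner features.
    mean_dx = sum(flow_dx_px) / len(flow_dx_px)
    # Invert the pinhole relation dx = f * tan(theta).
    return math.degrees(math.atan(mean_dx / f_px))
```

In a full pipeline the `flow_dx_px` values would come from corners detected in one frame and tracked into the next (e.g. Shi-Tomasi corners with pyramidal Lucas-Kanade flow); integrating the per-frame yaw over time yields an absolute heading.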

Original language: English
Title of host publication: Proceeding - 2018 International Seminar on Intelligent Technology and Its Application, ISITIA 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 327-331
Number of pages: 5
ISBN (Electronic): 9781538676547
DOIs
Publication status: Published - 2 Jul 2018
Event: 2018 International Seminar on Intelligent Technology and Its Application, ISITIA 2018 - Bali, Indonesia
Duration: 30 Aug 2018 - 31 Aug 2018

Publication series

Name: Proceeding - 2018 International Seminar on Intelligent Technology and Its Application, ISITIA 2018

Conference

Conference: 2018 International Seminar on Intelligent Technology and Its Application, ISITIA 2018
Country/Territory: Indonesia
City: Bali
Period: 30/08/18 - 31/08/18

Keywords

  • heading calculation
  • vision-based sensing

Fingerprint

Research topics of 'Heading Calculation from Sequence of Images Based on Corner Feature Detection and Optical Flow Algorithm'.
