Abstract

To develop steering control for collision-avoidance behaviour, a robot must be able to determine its heading orientation with respect to the environment. Orientation can be measured by dedicated sensors or perceived through visual features. In vision-based orientation estimation, most approaches rely on a matching process between pairs of frames. This paper proposes a method for estimating a robot's heading orientation using only a single frame of a fish-eye image. The CIE-LAB colour space is applied to handle changes in colour and illumination intensity. Straight line segments are extracted from the thresholded CIE-LAB image using the Progressive Probabilistic Hough Transform. The angle of each corresponding line segment is measured using a combination of the Law of Cosines and the quadrant principle. The heading orientation, expressed as a yaw angle, is estimated by a voting mechanism based on region grouping and the length of the perpendicular line. Experiments were conducted in a robot-soccer field environment to compare the orientation estimation system against an IMU's measurements. A discussion of the performance and limitations of the system is included in this paper.
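The abstract outlines a pipeline of CIE-LAB thresholding, Progressive Probabilistic Hough Transform line extraction, and per-segment angle measurement. The sketch below is a minimal, hedged illustration of such a pipeline using OpenCV, which exposes the Progressive Probabilistic Hough Transform as cv2.HoughLinesP; the function names (extract_field_lines, segment_angle), the LAB threshold bounds, and the Hough parameters are illustrative assumptions, not values taken from the paper.

```python
# Illustrative sketch only: parameter values and helper names are assumptions,
# not the paper's actual implementation.
import cv2
import numpy as np

def extract_field_lines(bgr_image,
                        lab_lower=(0, 0, 120), lab_upper=(255, 135, 255),
                        hough_threshold=50, min_line_length=40, max_line_gap=10):
    """Threshold in CIE-LAB, then extract straight segments with the
    Progressive Probabilistic Hough Transform (cv2.HoughLinesP)."""
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    mask = cv2.inRange(lab,
                       np.array(lab_lower, dtype=np.uint8),
                       np.array(lab_upper, dtype=np.uint8))
    segments = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180,
                               threshold=hough_threshold,
                               minLineLength=min_line_length,
                               maxLineGap=max_line_gap)
    # Each row is (x1, y1, x2, y2); return an empty list if nothing was found.
    return [] if segments is None else segments.reshape(-1, 4)

def segment_angle(x1, y1, x2, y2):
    """Angle of a segment against a reference axis, using the cosine-rule
    (dot-product) form, with the quadrant resolved from the cross product."""
    v = np.array([x2 - x1, y2 - y1], dtype=float)
    ref = np.array([1.0, 0.0])                         # assumed reference axis
    cos_theta = np.dot(v, ref) / (np.linalg.norm(v) + 1e-9)
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))   # unsigned angle in [0, pi]
    if ref[0] * v[1] - ref[1] * v[0] < 0:              # quadrant / sign check
        theta = -theta
    return np.degrees(theta)
```

In a voting-based scheme like the one described, the angles returned by segment_angle would then be grouped (e.g. by image region and segment length) and the dominant bin taken as the yaw estimate; that aggregation step is omitted here because its details are specific to the paper.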

Original language: English
Article number: 022092
Journal: Journal of Physics: Conference Series
Volume: 1569
Issue number: 2
DOIs
Publication status: Published - 23 Jul 2020
Event: 3rd International Conference on Science and Technology 2019, ICST 2019 - Surabaya, Indonesia
Duration: 17 Oct 2019 - 18 Oct 2019
