Drowsiness Detection Based on Facial Landmark and Uniform Local Binary Pattern

Dini Adni Navastara*, Widhera Yoza Mahana Putra, Chastine Fatichah

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

8 Citations (Scopus)

Abstract

When driving a vehicle, drivers often force themselves to keep driving even when drowsy, which can cause traffic accidents. One characteristic of a drowsy driver is that the eyes remain closed for a certain period. This research proposes a system that detects drowsiness and can therefore alert a drowsy driver. The first step detects the face using a funnel-structured cascade algorithm, and then extracts facial landmark features from the face to locate the eyes. Eye features are extracted using a Uniform Local Binary Pattern (ULBP) and the Eye Aspect Ratio (EAR), where EAR is computed from distances between points at the eye landmarks. After the features have been extracted, the system classifies the eyes as closed or open using the Support Vector Machine (SVM) method. The system then calculates the percentage of eye closure (PERCLOS) to detect drowsiness. Based on the experimental results, the proposed method yields a best accuracy of 95.5%, and the optimal PERCLOS value for drowsiness detection is greater than or equal to 60% over a period of 20 frames.
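The EAR and PERCLOS steps described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the common six-point eye landmark layout (p1..p6) and the standard EAR formula (vertical landmark distances over twice the horizontal distance), and it substitutes a plain EAR threshold for the paper's SVM classifier. The function names and the `threshold` parameter are hypothetical; only the 60% PERCLOS cutoff and the 20-frame window come from the paper.

```python
import numpy as np

def eye_aspect_ratio(eye):
    """EAR from six eye landmarks, given as a (6, 2) array ordered p1..p6.
    Values near zero indicate a closed eye."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical = np.linalg.norm(p2 - p6) + np.linalg.norm(p3 - p5)
    horizontal = np.linalg.norm(p1 - p4)
    return vertical / (2.0 * horizontal)

def perclos(closed_flags):
    """Percentage of frames in the window in which the eyes were closed."""
    return 100.0 * sum(closed_flags) / len(closed_flags)

def is_drowsy(closed_flags, threshold=60.0, window=20):
    """Drowsy if PERCLOS over the last `window` frames meets the threshold
    (the paper's optimal setting: PERCLOS >= 60% over 20 frames)."""
    if len(closed_flags) < window:
        return False
    return perclos(closed_flags[-window:]) >= threshold
```

In a full pipeline, `closed_flags` would be the per-frame closed/open decisions produced by the SVM from the ULBP and EAR features; here any sequence of 0/1 values works for illustration.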

Original language: English
Article number: 052015
Journal: Journal of Physics: Conference Series
Volume: 1529
Issue number: 5
DOIs
Publication status: Published - 17 Jun 2020
Event: 2nd Joint International Conference on Emerging Computing Technology and Sports, JICETS 2019 - Bandung, Indonesia
Duration: 25 Nov 2019 - 27 Nov 2019

Keywords

  • Drowsiness Detection
  • Facial Landmark
  • Funnel-structured cascade
  • PERCLOS
  • Support Vector Machine
  • Uniform Local Binary Pattern
  • Real-time
