DEEPREPOMEDUNM: A TRAIN DEEP LEARNING NETWORK AND EXTRACTION FEATURE FOR THE CLASSIFICATION OF PAP SMEAR IMAGES

Dwiza Riana, Sri Hadianti, Sri Rahayu, Faruq Aziz, Frieyadie, Oemie Kalsoem

Research output: Contribution to journal › Article › peer-review

Abstract

The Pap smear test is still the best method for early detection of cervical cancer and for preventing fatal outcomes of the disease in women. Routine examinations allow precancerous lesions to be detected early and treatment measures to be taken. Although the Pap smear test is a superior test, it still has a weakness in the form of high false-positive results due to human negligence. Advances in technology allow deep learning and the identification of cell features to be used to classify Pap smear cells. Pap smear cells were acquired to produce Pap smear images, and this process generated several datasets, such as RepoMedUNM. The purpose of this study was to classify cells into two classes and into four classes, consisting of a Normal class and three Abnormal classes, namely L-sil, H-sil, and Koilocyt. DeepRepoMedUNM is a classification process that uses the VGG16, VGG19, AlexNet, and ResNet50 models together with a Euclidean distance method applied to 60 Pap smear cell image features. The classification results were compared and analyzed for the two-class and four-class settings. For the RepoMedUNM dataset, we obtained classification accuracies of 96% for two-class and 91% for four-class classification using the VGG16 model.
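
The abstract names two ingredients: transfer-learning CNN classifiers (VGG16, VGG19, AlexNet, ResNet50) and a Euclidean-distance classifier over 60 extracted cell features. The sketch below is not the authors' implementation; it is a minimal Python/Keras illustration of a VGG16-based classifier and a nearest-centroid Euclidean-distance classifier, with the directory layout, image size, and classification head all assumed for illustration.

```python
# Minimal sketch (not the paper's code): VGG16 transfer learning for Pap smear
# images plus a Euclidean-distance (nearest-centroid) classifier over
# pre-extracted feature vectors. Paths, image size, and head layers are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

IMG_SIZE = (224, 224)  # assumed input resolution

def build_vgg16_classifier(num_classes: int) -> tf.keras.Model:
    """VGG16 backbone (ImageNet weights) with a small trainable classification head."""
    base = VGG16(weights="imagenet", include_top=False, input_shape=IMG_SIZE + (3,))
    base.trainable = False  # freeze the convolutional feature extractor
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def euclidean_nearest_centroid(train_feats, train_labels, test_feats):
    """Assign each test feature vector (e.g. 60 cell features) to the class whose
    mean training vector is closest in Euclidean distance."""
    classes = np.unique(train_labels)
    centroids = np.stack([train_feats[train_labels == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(test_feats[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)]

if __name__ == "__main__":
    # Hypothetical folder layout: one sub-directory per class (2 or 4 classes,
    # e.g. Normal, L-sil, H-sil, Koilocyt).
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "repomedunm/train", image_size=IMG_SIZE, batch_size=32)
    model = build_vgg16_classifier(len(train_ds.class_names))
    model.fit(train_ds, epochs=10)
```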

Original language: English
Pages (from-to): 5787-5800
Number of pages: 14
Journal: Journal of Theoretical and Applied Information Technology
Volume: 100
Issue number: 19
Publication status: Published - 15 Oct 2022
Externally published: Yes

Keywords

  • Cervical Cancer
  • Classification
  • Deep Learning
  • Ensemble Learning
  • Feature Fusion
  • Late Fusion Cervical Cell
  • Pap Smear
