
Abstract

The visual appearance of a fish's head and tail can be used to assess its freshness. A segmentation method that can reliably isolate these parts from the fish body is required for further analysis in an automatic fish-freshness detection system. In this research, we investigated the performance of two CNN-based segmentation methods, YOLO and Mask R-CNN, for separating the head and tail of a fish. We retrained YOLO and Mask R-CNN pre-trained models on the Fish-gres dataset, which consists of images with high variability in background, illumination, and object overlap. An experiment on 200 images containing 724 manually annotated heads and 585 manually annotated tails indicated that both models perform well. YOLO performed slightly better than Mask R-CNN, with a precision of 98.96% versus 96.73% and a recall of 80.93% versus 75.43%. The experimental results also showed that YOLO outperforms Mask R-CNN in mAP, at 80.12% versus 73.39%.
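The abstract evaluates the detectors by precision and recall over matched detections. The paper's own evaluation code is not given here; purely as an illustration, the sketch below (all names and thresholds are assumptions, not from the paper) shows one common way such numbers are obtained: greedily matching each predicted box to an unmatched ground-truth box at an IoU threshold of 0.5, then counting matches as true positives.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def precision_recall(preds, gts, iou_thresh=0.5):
    """Greedy one-to-one matching of predicted boxes to ground-truth boxes."""
    matched, tp = set(), 0
    for p in preds:
        best_i, best_score = None, iou_thresh
        for i, g in enumerate(gts):
            if i not in matched and iou(p, g) >= best_score:
                best_i, best_score = i, iou(p, g)
        if best_i is not None:   # matched: count as a true positive
            matched.add(best_i)
            tp += 1
    precision = tp / len(preds) if preds else 0.0
    recall = tp / len(gts) if gts else 0.0
    return precision, recall
```

mAP additionally requires confidence-ranked detections and averaging precision over recall levels, which this simplified sketch omits.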

Original language: English
Title of host publication: ICICoS 2020 - Proceeding
Subtitle of host publication: 4th International Conference on Informatics and Computational Sciences
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728195261
DOIs
Publication status: Published - 10 Nov 2020
Event: 4th International Conference on Informatics and Computational Sciences, ICICoS 2020 - Semarang, Indonesia
Duration: 10 Nov 2020 - 11 Nov 2020

Publication series

Name: ICICoS 2020 - Proceeding: 4th International Conference on Informatics and Computational Sciences

Conference

Conference: 4th International Conference on Informatics and Computational Sciences, ICICoS 2020
Country/Territory: Indonesia
City: Semarang
Period: 10/11/20 - 11/11/20

Keywords

  • Mask R-CNN
  • YOLO
  • fish freshness
  • head and tail of fish
  • object detection
  • segmentation

Title: A Comparison of YOLO and Mask R-CNN for Segmenting Head and Tail of Fish