Operator-N Layer Construction for Optimizing Capsule Network Methods in Image Classification Problems

Ridho Nur Rohman Wijaya, Budi Setiyono*, Mahmud Yunus, Dwi Ratna Sulistyaningrum

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Capsule Network (CapsNet) is an image classification method that has demonstrated excellent performance in Deep Learning. The main idea of CapsNet is to transform image features into capsules, creating entities that better represent the information contained in those features. However, capsule formation increases the number of training parameters and lengthens computation time. Moreover, CapsNet is less effective on images with complex backgrounds because of its limited feature extraction. We propose a mathematical approach, introducing an operator-N layer, to improve CapsNet performance. Operator-N is an operator constructed from Euclidean norms; it shrinks capsule dimensions and speeds up the translation-equivariance process of a capsule. Experimental results on the MNIST, Fashion MNIST, and Kuzushiji MNIST datasets show that the proposed operator-N layer improves accuracy by up to 1.72% with 2.91 times faster computation than the original CapsNet. Furthermore, the total number of parameters used during training is reduced to 15.7%. Our research provides valuable insights for overcoming the challenges posed by feature extraction limitations and paves the way for more efficient image classification methods.
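The abstract describes operator-N only at a high level: a layer built from Euclidean norms that collapses capsule dimensions. As a rough illustration of that general idea (an assumption about the construction, not the authors' exact layer), a norm-based reduction over capsule vectors might be sketched as:

```python
import numpy as np

def operator_n(capsules):
    """Hypothetical norm-based reduction: replace each capsule vector
    with its Euclidean (L2) norm, shrinking the capsule dimension to a
    single scalar per capsule. This is an illustrative sketch, not the
    paper's exact operator-N construction."""
    return np.linalg.norm(capsules, axis=-1)

# toy example: 4 capsules, each an 8-dimensional vector
caps = np.arange(32, dtype=float).reshape(4, 8)
out = operator_n(caps)
print(out.shape)  # (4,) — one scalar per capsule
```

Collapsing each capsule to its norm preserves the "activation strength" interpretation of capsule lengths while discarding orientation, which is one plausible way such a layer could cut parameter counts in the layers that follow.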

Original language: English
Pages (from-to): 90-101
Number of pages: 12
Journal: Journal of Information Hiding and Multimedia Signal Processing
Issue number: 3
Publication status: Published - 2023


Keywords:
  • Capsule Network
  • Deep Learning
  • Feature Extraction
  • Norm
  • Operator


