TY - JOUR
T1 - Classification of Air-Cured Tobacco Leaf Pests Using Pruning Convolutional Neural Networks and Transfer Learning
AU - Swasono, Dwiretno Istiyadi
AU - Tjandrasa, Handayani
AU - Fatichah, Chastine
N1 - Publisher Copyright:
© 2022
PY - 2022
Y1 - 2022
N2 - A Convolutional Neural Network (CNN) typically relies on a large image dataset and has many parameters, whereas small datasets call for fewer parameters. Existing standard (pre-trained) models such as AlexNet, VGG, Inception, and ResNet achieve high accuracy but contain many parameters. For small datasets, an excess of parameters is inefficient and raises computational cost, making such models unsuitable for resource-limited platforms such as embedded devices and mobile phones. This research proposes depth pruning of the ResNet50 architecture and adds a dimensionality-reduction layer after the pruning point. The approach does not require a complex pruning-criterion algorithm, so it is easy to implement. ResNet50 was chosen for its good performance with batch normalization and skip connections, and we use transfer learning for the ResNet50 weights. Pruning is carried out along the depth of the network by cutting at an activation-function layer, and several pruning points were selected to produce several models with different parameter counts. The more network layers are pruned, the fewer parameters remain. We add a channel-reduction layer after the pruned network to reduce the number of feature maps before the fully connected (FC) classifier layer. We retrained the new network on a dataset of 2000 tobacco leaf pest images across four classes, split into 1600 training and 400 validation images. The results show that accuracy can be kept equal to that of the unpruned network, reaching 100% accuracy with a 74.38% reduction in the number of parameters. An even higher parameter reduction of up to 90.62% still yields high validation accuracy of around 99.3%. These results show that the proposed method effectively maintains accuracy while reducing the number of parameters.
AB - A Convolutional Neural Network (CNN) typically relies on a large image dataset and has many parameters, whereas small datasets call for fewer parameters. Existing standard (pre-trained) models such as AlexNet, VGG, Inception, and ResNet achieve high accuracy but contain many parameters. For small datasets, an excess of parameters is inefficient and raises computational cost, making such models unsuitable for resource-limited platforms such as embedded devices and mobile phones. This research proposes depth pruning of the ResNet50 architecture and adds a dimensionality-reduction layer after the pruning point. The approach does not require a complex pruning-criterion algorithm, so it is easy to implement. ResNet50 was chosen for its good performance with batch normalization and skip connections, and we use transfer learning for the ResNet50 weights. Pruning is carried out along the depth of the network by cutting at an activation-function layer, and several pruning points were selected to produce several models with different parameter counts. The more network layers are pruned, the fewer parameters remain. We add a channel-reduction layer after the pruned network to reduce the number of feature maps before the fully connected (FC) classifier layer. We retrained the new network on a dataset of 2000 tobacco leaf pest images across four classes, split into 1600 training and 400 validation images. The results show that accuracy can be kept equal to that of the unpruned network, reaching 100% accuracy with a 74.38% reduction in the number of parameters. An even higher parameter reduction of up to 90.62% still yields high validation accuracy of around 99.3%. These results show that the proposed method effectively maintains accuracy while reducing the number of parameters.
KW - Convolutional neural network
KW - pruning
KW - tobacco leaf pest
KW - transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85134018036&partnerID=8YFLogxK
U2 - 10.18517/ijaseit.12.3.15950
DO - 10.18517/ijaseit.12.3.15950
M3 - Article
AN - SCOPUS:85134018036
SN - 2088-5334
VL - 12
SP - 1229
EP - 1235
JO - International Journal on Advanced Science, Engineering and Information Technology
JF - International Journal on Advanced Science, Engineering and Information Technology
IS - 3
ER -