TY - GEN
T1 - Fractional Gradient Descent Optimizer for Linear Classifier Support Vector Machine
AU - Hapsari, Dian Puspita
AU - Utoyo, Imam
AU - Purnami, Santi Wulan
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/10/3
Y1 - 2020/10/3
N2 - Supervised learning is a data mining activity that aims to classify or predict data. One powerful supervised learning algorithm is the Support Vector Machine (SVM), a linear classifier. In data prediction, accuracy can be improved by optimizing the parameters of the classification algorithm. This study proposes Fractional Gradient Descent as an unconstrained optimization algorithm for the objective function of the SVM classifier, serving as the optimizer of the classification model during training to improve the accuracy of the prediction model. Fractional Gradient Descent optimizes the SVM classification model using fractional-order values, so that it takes small steps with a small learning rate toward the global minimum and achieves convergence in fewer iterations. At a learning rate of 0.0001, the SVM classifier with fractional gradient descent has an error rate of 0.273083; at a learning rate of 0.001, an error rate of 0.273070; and at a learning rate of 0.01, an error rate of 0.273134. The SVM classifier with stochastic gradient descent optimization reaches the convergence point at iteration 350. With fractional gradient descent optimization, it reaches the convergence point at iteration 50, fewer iterations than the SVM classifier with stochastic gradient descent.
AB - Supervised learning is a data mining activity that aims to classify or predict data. One powerful supervised learning algorithm is the Support Vector Machine (SVM), a linear classifier. In data prediction, accuracy can be improved by optimizing the parameters of the classification algorithm. This study proposes Fractional Gradient Descent as an unconstrained optimization algorithm for the objective function of the SVM classifier, serving as the optimizer of the classification model during training to improve the accuracy of the prediction model. Fractional Gradient Descent optimizes the SVM classification model using fractional-order values, so that it takes small steps with a small learning rate toward the global minimum and achieves convergence in fewer iterations. At a learning rate of 0.0001, the SVM classifier with fractional gradient descent has an error rate of 0.273083; at a learning rate of 0.001, an error rate of 0.273070; and at a learning rate of 0.01, an error rate of 0.273134. The SVM classifier with stochastic gradient descent optimization reaches the convergence point at iteration 350. With fractional gradient descent optimization, it reaches the convergence point at iteration 50, fewer iterations than the SVM classifier with stochastic gradient descent.
KW - Data Mining
KW - Fractional Gradient Descent
KW - Optimization
KW - Supervised learning
KW - Support Vector Machine
UR - http://www.scopus.com/inward/record.url?scp=85096669564&partnerID=8YFLogxK
U2 - 10.1109/ICVEE50212.2020.9243288
DO - 10.1109/ICVEE50212.2020.9243288
M3 - Conference contribution
AN - SCOPUS:85096669564
T3 - Proceeding - 2020 3rd International Conference on Vocational Education and Electrical Engineering: Strengthening the framework of Society 5.0 through Innovations in Education, Electrical, Engineering and Informatics Engineering, ICVEE 2020
BT - Proceeding - 2020 3rd International Conference on Vocational Education and Electrical Engineering
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 3rd International Conference on Vocational Education and Electrical Engineering, ICVEE 2020
Y2 - 3 October 2020 through 4 October 2020
ER -