TY - JOUR
T1 - Unconstrained optimization based fractional order derivative for data classification
AU - Hapsari, Dian Puspita
AU - Utoyo, Imam
AU - Purnami, Santi Wulan
N1 - Publisher Copyright:
© Published under licence by IOP Publishing Ltd.
PY - 2020/9/21
Y1 - 2020/9/21
N2 - Data classification faces several problems, one of which is that a large amount of data increases computing time. The fractional gradient descent method is an unconstrained optimization algorithm for training support vector machine classifiers, whose training problem is convex. Compared with the classical integer-order model, a model built with fractional calculus has a significant advantage in accelerating computing time. This research conducts a qualitative literature review to investigate the current state of how this new optimization method based on fractional derivatives can be implemented in classifier algorithms.
AB - Data classification faces several problems, one of which is that a large amount of data increases computing time. The fractional gradient descent method is an unconstrained optimization algorithm for training support vector machine classifiers, whose training problem is convex. Compared with the classical integer-order model, a model built with fractional calculus has a significant advantage in accelerating computing time. This research conducts a qualitative literature review to investigate the current state of how this new optimization method based on fractional derivatives can be implemented in classifier algorithms.
UR - http://www.scopus.com/inward/record.url?scp=85092765686&partnerID=8YFLogxK
U2 - 10.1088/1742-6596/1613/1/012066
DO - 10.1088/1742-6596/1613/1/012066
M3 - Conference article
AN - SCOPUS:85092765686
SN - 1742-6588
VL - 1613
JO - Journal of Physics: Conference Series
JF - Journal of Physics: Conference Series
IS - 1
M1 - 012066
T2 - 2nd Ahmad Dahlan International Conference on Mathematics and Mathematics Education, ADINTERCOMME 2019
Y2 - 8 November 2019 through 9 November 2019
ER -