Unconstrained optimization based fractional order derivative for data classification

Dian Puspita Hapsari, Imam Utoyo*, Santi Wulan Purnami

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

1 Citation (Scopus)


Data classification faces several problems, one of which is the large amount of data, which increases computing time. Fractional gradient descent is an unconstrained optimization algorithm for training support vector machine classifiers, whose underlying training problem is convex. Compared with the classical integer-order model, a model built with fractional calculus has a significant advantage in accelerating computing time. This research conducts a qualitative literature review to investigate the current state of how this new optimization method based on fractional derivatives can be implemented in classifier algorithms.
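To make the idea concrete, the following is a minimal sketch of fractional-order gradient descent on a simple convex objective. It is not the authors' method; it uses one common Caputo-based approximation from the fractional gradient descent literature, D^α f(x) ≈ f'(x)·|x − c|^(1−α)/Γ(2−α) with the previous iterate as the base point c, and the objective, step size, and order α are illustrative choices:

```python
import math

ALPHA = 0.9                      # fractional order, 0 < alpha < 1 (illustrative)
GAMMA = math.gamma(2 - ALPHA)    # normalizing constant Gamma(2 - alpha)

def f(x):
    """Convex objective with its minimum at x = 3."""
    return (x - 3.0) ** 2

def grad(x):
    """Ordinary (integer-order) gradient of f."""
    return 2.0 * (x - 3.0)

def fractional_gd(x0, lr=0.1, steps=200):
    """Gradient descent with a Caputo-style fractional correction.

    The integer-order gradient is scaled by |x - c|^(1 - alpha) / Gamma(2 - alpha),
    where c is taken to be the previous iterate (an assumption of this sketch).
    With alpha = 1 the factor reduces to 1 and plain gradient descent is recovered.
    """
    x_prev, x = x0, x0 + 1e-3    # small offset so |x - c| > 0 at the first step
    for _ in range(steps):
        d = grad(x) * abs(x - x_prev) ** (1 - ALPHA) / GAMMA
        x_prev, x = x, x - lr * d
    return x

x_star = fractional_gd(0.0)
```

In this toy run the iterates approach the minimizer x = 3; the fractional factor damps steps as successive iterates get close, which is one mechanism the fractional-calculus literature studies when comparing convergence behavior against classical integer-order descent.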

Original language: English
Article number: 012066
Journal: Journal of Physics: Conference Series
Issue number: 1
Publication status: Published - 21 Sept 2020
Event: 2nd Ahmad Dahlan International Conference on Mathematics and Mathematics Education, ADINTERCOMME 2019 - Yogyakarta, Indonesia
Duration: 8 Nov 2019 - 9 Nov 2019


