TY - GEN
T1 - FERNN
T2 - 2019 International Joint Conference on Neural Networks, IJCNN 2019
AU - Das, Monidipa
AU - Pratama, Mahardhika
AU - Ashfahani, Andri
AU - Samanta, Subhrajit
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/7
Y1 - 2019/7
N2 - With the recent explosion of data in motion, there is growing research interest in analyzing streaming data, and consequently, several recent works address data stream analytics. However, the potential of the traditional recurrent neural network (RNN) for streaming data classification remains a little-investigated area. In this paper, we propose a novel variant of RNN, termed FERNN, which features single-pass learning capability along with a self-evolution property. The online learning capability makes FERNN fit for working on streaming data, whereas the self-organizing property makes the model adaptive to rapidly changing environments. FERNN utilizes hyperplane activation in the hidden layer, which not only reduces the number of network parameters to a significant extent, but also makes the model operate by default under the teacher-forcing mechanism, so that it automatically handles the vanishing/exploding gradient issues of traditional RNN learning based on the back-propagation-through-time policy. Moreover, unlike the majority of existing autonomous learning models, FERNN is free from the normal-distribution assumption on streaming data, making it more flexible. The efficacy of FERNN is evaluated on classifying six publicly available data streams under the prequential test-then-train protocol. Experimental results show encouraging performance of FERNN, attaining state-of-the-art classification accuracy at a fairly reduced computation cost.
AB - With the recent explosion of data in motion, there is growing research interest in analyzing streaming data, and consequently, several recent works address data stream analytics. However, the potential of the traditional recurrent neural network (RNN) for streaming data classification remains a little-investigated area. In this paper, we propose a novel variant of RNN, termed FERNN, which features single-pass learning capability along with a self-evolution property. The online learning capability makes FERNN fit for working on streaming data, whereas the self-organizing property makes the model adaptive to rapidly changing environments. FERNN utilizes hyperplane activation in the hidden layer, which not only reduces the number of network parameters to a significant extent, but also makes the model operate by default under the teacher-forcing mechanism, so that it automatically handles the vanishing/exploding gradient issues of traditional RNN learning based on the back-propagation-through-time policy. Moreover, unlike the majority of existing autonomous learning models, FERNN is free from the normal-distribution assumption on streaming data, making it more flexible. The efficacy of FERNN is evaluated on classifying six publicly available data streams under the prequential test-then-train protocol. Experimental results show encouraging performance of FERNN, attaining state-of-the-art classification accuracy at a fairly reduced computation cost.
KW - Classification
KW - Data stream
KW - Hyperplane
KW - Online learning
KW - RNN
KW - Teacher forcing
UR - http://www.scopus.com/inward/record.url?scp=85073185351&partnerID=8YFLogxK
U2 - 10.1109/IJCNN.2019.8851757
DO - 10.1109/IJCNN.2019.8851757
M3 - Conference contribution
AN - SCOPUS:85073185351
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2019 International Joint Conference on Neural Networks, IJCNN 2019
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 14 July 2019 through 19 July 2019
ER -