TY - GEN
T1 - ACyLeR
T2 - 2024 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies, 3ICT 2024
AU - Kamal, Mustafa
AU - Shiddiqi, Ary Mazharuddin
AU - Nurhayati, Ervin
AU - Putra, Andika Laksana
AU - Mahhisa, Farrela Ranku
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Long-term time-series forecasting is critical in numerous domains, including economics, climate modeling, and energy management. Traditional deep learning models often struggle with hyperparameter optimization, which can lead to suboptimal performance and increased sensitivity to initial conditions. This research addresses the problem by proposing an enhanced iTransformer model, named ACyLeR, that integrates an Adaptive Cycling Learning Rate (ACLR) mechanism. The ACLR algorithm dynamically adjusts the learning rate during training for better convergence and generalization while minimizing the risk of overfitting. The experiments were implemented in Python and evaluated on the univariate Water Supply in Melbourne (WSM) and multivariate exchange rate (ER) datasets, using a 70% training, 10% validation, and 20% testing split. Experimental results demonstrate that ACyLeR outperforms existing baseline models, achieving lower loss values and higher accuracy. The results significantly advance time-series forecasting using iTransformer.
AB - Long-term time-series forecasting is critical in numerous domains, including economics, climate modeling, and energy management. Traditional deep learning models often struggle with hyperparameter optimization, which can lead to suboptimal performance and increased sensitivity to initial conditions. This research addresses the problem by proposing an enhanced iTransformer model, named ACyLeR, that integrates an Adaptive Cycling Learning Rate (ACLR) mechanism. The ACLR algorithm dynamically adjusts the learning rate during training for better convergence and generalization while minimizing the risk of overfitting. The experiments were implemented in Python and evaluated on the univariate Water Supply in Melbourne (WSM) and multivariate exchange rate (ER) datasets, using a 70% training, 10% validation, and 20% testing split. Experimental results demonstrate that ACyLeR outperforms existing baseline models, achieving lower loss values and higher accuracy. The results significantly advance time-series forecasting using iTransformer.
KW - Adaptive Cycling Learning Rate (ACLR)
KW - Water Demand Forecasting
KW - iTransformer
UR - http://www.scopus.com/inward/record.url?scp=85217361038&partnerID=8YFLogxK
U2 - 10.1109/3ICT64318.2024.10824647
DO - 10.1109/3ICT64318.2024.10824647
M3 - Conference contribution
AN - SCOPUS:85217361038
T3 - 2024 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies, 3ICT 2024
SP - 475
EP - 480
BT - 2024 International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies, 3ICT 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 17 November 2024 through 19 November 2024
ER -