In this paper, a novel fuzzy neural network termed the dynamic parsimonious fuzzy neural network (DPFNN) is proposed. DPFNN is a four-layer network that combines the Takagi-Sugeno-Kang (TSK) fuzzy architecture with multivariate Gaussian kernels as membership functions. The training procedure is characterized by four aspects: (1) DPFNN may evolve fuzzy rules as new training data arrive, enabling it to cope with non-stationary processes. We propose two criteria for rule generation, system error and ε-completeness, which reflect both the performance and the sample coverage of the existing rule base. (2) Fuzzy rules found insignificant over time, as measured by their statistical contributions, are pruned to reduce the complexity and redundancy of the rule base. (3) The extended self-organizing map (ESOM) theory is employed to dynamically update the centers of the ellipsoidal basis functions in accordance with the input training samples. (4) The optimal fuzzy consequent parameters are updated by a time-localized least squares (TLLS) method that exploits the concept of a sliding window to reduce the computational burden of the standard least squares (LS) method. The viability of the new method is thoroughly investigated on real-world and artificial problems; the results show that our method not only delivers more compact and parsimonious network structures, but also achieves lower predictive errors than state-of-the-art approaches.
- Dynamic parsimonious fuzzy neural network (DPFNN)
- Radial basis function (RBF)
- Rule growing
- Rule pruning
- Self-organizing map (SOM)
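To make the ε-completeness rule-growing criterion from aspect (1) concrete, the following is a minimal sketch, not the paper's implementation: a new sample spawns a fuzzy rule only when no existing multivariate Gaussian rule covers it with membership at least ε. The function names, the unit covariance assigned to a new rule, and the omission of the companion system-error criterion are all our assumptions for illustration.

```python
import numpy as np

def gaussian_membership(x, center, inv_cov):
    """Firing strength of one rule with a multivariate Gaussian kernel."""
    d = x - center
    return float(np.exp(-0.5 * d @ inv_cov @ d))

def maybe_grow_rule(x, centers, inv_covs, eps=0.5):
    """Epsilon-completeness check (illustrative): if no existing rule
    covers sample x with membership >= eps, append a new rule at x.
    Returns True if a rule was added.  The new rule's inverse covariance
    is set to the identity here purely as a placeholder width."""
    x = np.asarray(x, dtype=float)
    if centers and max(gaussian_membership(x, c, s)
                       for c, s in zip(centers, inv_covs)) >= eps:
        return False  # sample already adequately covered
    centers.append(x)
    inv_covs.append(np.eye(len(x)))
    return True
```

In this sketch, a sample identical to an existing center has membership exp(0) = 1 and never triggers growth, while a sample far from all centers has near-zero membership and does; the actual DPFNN additionally gates growth on the system error.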