## Abstract

Given paired observations (x_{i}, v_{1i}, v_{2i}, ⋯, v_{pi}, t_{1i}, t_{2i}, ⋯, t_{qi}, y_{i}), i = 1, 2, ⋯, n, following the additive semiparametric regression model y_{i} = μ(x_{i}, v_{i}, t_{i}) + ε_{i}, where μ(x_{i}, v_{i}, t_{i}) = f(x_{i}) + ∑_{j=1}^{p} g_{j}(v_{ji}) + ∑_{s=1}^{q} h_{s}(t_{si}), v_{i} = (v_{1i}, v_{2i}, ⋯, v_{pi})′, and t_{i} = (t_{1i}, t_{2i}, ⋯, t_{qi})′. The random errors ε_{i} are normally distributed with mean 0 and variance σ². To obtain a mixed estimator of μ(x_{i}, v_{i}, t_{i}), the regression curve f(x_{i}) is approximated by a linear parametric form, g_{j}(v_{ji}) by a kernel estimator with bandwidths Φ = (φ_{1}, φ_{2}, ⋯, φ_{p})′, and the regression curve component h_{s}(t_{si}) by a Fourier series with oscillation parameter N. The Penalized Least Squares (PLS) method gives Min_{c,β} { L(c) + L(β) + ∑_{s=1}^{q} θ_{s} S(h_{s}(t_{si})) } with smoothing parameters θ = (θ_{1}, θ_{2}, ⋯, θ_{q})′. The resulting mixed estimator of μ(v_{i}, t_{i}) is μ̂_{Φ,θ,N}(v_{i}, t_{i}) = Z(Φ, θ, N)y, where Z(Φ, θ, N) = C(Φ, θ, N) + V(Φ) + E(Φ, θ, N), and the matrices C(Φ, θ, N), V(Φ), and E(Φ, θ, N) depend on Φ, θ, and N. Optimal values of Φ, θ, and N are obtained by minimizing the Generalized Cross Validation (GCV) criterion.
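The selection step described above can be illustrated for the simplest case of a single linear smoother ŷ = S(φ)y, where GCV(φ) = n·RSS / (n − tr S(φ))². The sketch below is not the authors' code: the Gaussian kernel, the simulated data, and the candidate bandwidth grid are all assumptions chosen for demonstration only.

```python
import math
import random

def smoother_matrix(t, h):
    # Nadaraya-Watson smoother matrix with a Gaussian kernel (illustrative choice):
    # row i holds the weights mapping the responses y to the fit at t[i].
    S = []
    for ti in t:
        w = [math.exp(-0.5 * ((ti - tj) / h) ** 2) for tj in t]
        total = sum(w)
        S.append([wj / total for wj in w])
    return S

def gcv(y, S):
    # GCV(h) = n * RSS / (n - tr S)^2 for a linear smoother y_hat = S y.
    n = len(y)
    fitted = [sum(S[i][j] * y[j] for j in range(n)) for i in range(n)]
    rss = sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))
    trace = sum(S[i][i] for i in range(n))
    return n * rss / (n - trace) ** 2

# Simulated data (assumption): a sine signal plus Gaussian noise.
random.seed(0)
t = sorted(random.uniform(0.0, 1.0) for _ in range(60))
y = [math.sin(2 * math.pi * ti) + random.gauss(0.0, 0.3) for ti in t]

# Pick the bandwidth with the smallest GCV score over a candidate grid.
candidates = [0.02 + 0.02 * k for k in range(20)]  # 0.02 .. 0.40
scores = {h: gcv(y, smoother_matrix(t, h)) for h in candidates}
best_h = min(scores, key=scores.get)
print(best_h)
```

In the paper's setting the same idea is applied jointly: the smallest GCV over the grid of (Φ, θ, N) selects all bandwidths, smoothing parameters, and the oscillation parameter at once.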

| Original language | English |
|---|---|
| Article number | 012002 |
| Journal | Journal of Physics: Conference Series |
| Volume | 855 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 12 Jun 2017 |
| Event | 1st International Conference on Mathematics: Education, Theory, and Application, ICMETA 2016 - Surakarta, Indonesia. Duration: 6 Dec 2016 → 7 Dec 2016 |