Approximation of least squares regression on nested subspaces. (English) Zbl 0669.62047
This paper considers the regression model \(y_i=\theta(x_i)+\epsilon_i\) \((i=1,\dots,n)\), where \(\theta\) is an unknown function mapping \(\mathbb{R}^d\to\mathbb{R}^q\). Let \(\theta_{nm}\) be the least squares estimator of \(\theta\) obtained from the model under the assumption that \(\theta\) belongs to a given subspace of functions, \(\operatorname{span}\{\psi_1,\dots,\psi_m\}\).
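In coordinates, a minimal sketch of this estimator (written for \(q=1\), using the standard normal-equations form of a series least squares estimator; the matrix notation is not taken from the paper):
\[
\theta_{nm}(x)=\sum_{j=1}^{m}\hat\beta_j\,\psi_j(x),\qquad
\hat\beta=\operatorname*{arg\,min}_{\beta\in\mathbb{R}^m}\sum_{i=1}^{n}\Bigl(y_i-\sum_{j=1}^{m}\beta_j\psi_j(x_i)\Bigr)^{2}
=(\Psi^{\top}\Psi)^{-1}\Psi^{\top}y,
\]
where \(\Psi\) denotes the \(n\times m\) matrix with entries \(\Psi_{ij}=\psi_j(x_i)\), assumed to have full column rank.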
Theorems are given for approximating the bias and variance of \(\theta_{nm}\) in a scale of Hilbert norms natural to the problem, when \(n\) and \(m\) are large and the design determined by the \(x_i\)'s is suitably approximated by a design measure. Two examples (with \(d=q=1\)) illustrate the theory: polynomial and Fourier series regression.
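A minimal numerical sketch of the two illustrative settings (\(d=q=1\)): least squares fits on nested subspaces spanned by polynomial or Fourier bases. The particular bases, the test function \(\exp(x)\), and the noise level are assumptions made here for illustration, not details taken from the paper.

```python
import numpy as np

def design_matrix(x, m, basis="poly"):
    """n x m design matrix Psi with Psi[i, j] = psi_{j+1}(x_i)."""
    if basis == "poly":
        # psi_j(x) = x**(j-1): polynomial regression of degree m-1
        return np.vander(x, N=m, increasing=True)
    # Fourier basis on [0, 1]: 1, cos(2*pi*x), sin(2*pi*x), cos(4*pi*x), ...
    cols = [np.ones_like(x)]
    k = 1
    while len(cols) < m:
        cols.append(np.cos(2 * np.pi * k * x))
        if len(cols) < m:
            cols.append(np.sin(2 * np.pi * k * x))
        k += 1
    return np.column_stack(cols)

def fit_series(x, y, m, basis="poly"):
    """Least squares estimate of theta on span{psi_1, ..., psi_m}."""
    Psi = design_matrix(x, m, basis)
    beta, *_ = np.linalg.lstsq(Psi, y, rcond=None)
    return lambda t: design_matrix(t, m, basis) @ beta

# Illustration (hypothetical setup): theta(x) = exp(x) observed with noise
# at a uniform design on [0, 1].
rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.exp(x) + 0.1 * rng.standard_normal(n)

for m in (3, 6, 12):  # nested subspaces: span grows as m increases
    theta_hat = fit_series(x, y, m, basis="poly")
    mse = np.mean((theta_hat(x) - np.exp(x)) ** 2)
    print(f"m = {m:2d}: mean squared error at the design points = {mse:.5f}")
```

Increasing \(m\) reduces the approximation bias of the fit while inflating its variance, which is the trade-off the paper's bias and variance approximations quantify.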
Reviewer: H. Caussinus

MSC:
62J05 Linear regression; mixed models
62F12 Asymptotic properties of parametric estimators
41A10 Approximation by polynomials