Continuum regression is not always continuous
Stockholm University, Faculty of Science, Department of Mathematics.
1996. In: J. R. Statist. Soc., Vol. B 58, no. 4, pp. 703-710. Article in journal (Refereed), Published
Place, publisher, year, edition, pages
1996. Vol. B 58, no. 4, pp. 703-710.
URN: urn:nbn:se:su:diva-24415, OAI: diva2:197487
Part of: urn:nbn:se:su:diva-7025. Available from: 2007-09-06. Created: 2007-08-28. Bibliographically approved.
In thesis
1. Regression methods in multidimensional prediction and estimation
2007 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

In regression with near-collinear explanatory variables, the least squares predictor has large variance, and ordinary least squares regression (OLSR) often leads to unrealistic regression coefficients. Several regularized regression methods have been proposed as alternatives; well-known examples are principal components regression (PCR), ridge regression (RR) and continuum regression (CR). The latter two involve a continuous metaparameter, offering additional flexibility.
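
As a rough illustration of the collinearity problem and the ridge remedy described above (a minimal sketch, not taken from the thesis; the synthetic data and the ridge constant below are arbitrary choices):

import numpy as np

# Synthetic data with two nearly collinear regressors.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)          # almost a copy of x1
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(scale=0.5, size=n)

# OLSR: solve (X'X) b = X'y; X'X is ill-conditioned here, so b is unstable.
b_ols = np.linalg.solve(X.T @ X, X.T @ y)

# RR: solve (X'X + k I) b = X'y for a ridge constant k > 0.
k = 1.0
b_ridge = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)

print("OLSR coefficients:", b_ols)      # typically large and opposite-signed
print("RR coefficients:  ", b_ridge)    # shrunk towards plausible values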

For a univariate response variable, CR incorporates OLSR, partial least squares regression (PLSR) and PCR as special cases, for particular values of the metaparameter. CR is also closely related to RR. However, CR can in fact yield regressors that vary discontinuously with the metaparameter, so the relation between CR and RR is not always one-to-one. We develop a new class of regression methods, LSRR, essentially the same as CR but without discontinuities, and prove that any optimization principle will yield a regressor proportional to an RR regressor, provided only that the principle implies maximizing some function of the regressor's sample correlation coefficient and its sample variance. For a multivariate response vector we demonstrate that a number of well-established regression methods are related, in that they are special cases of essentially one general procedure. We try a more general method based on this procedure, with two metaparameters. In a simulation study we compare this method to ridge regression, multivariate PLSR and repeated univariate PLSR. For most types of data studied, all methods perform approximately equally well. There are cases where RR and LSRR yield larger errors than the other methods, and we conclude that one-factor methods are not adequate for situations where more than one latent variable is needed to describe the data. Among the methods based on latent variables, none of those tried is superior to the others in any obvious way.
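
To see the metaparameter at work, the sketch below uses one common Stone-Brooks-style formulation of the CR criterion, T_gamma(c) = corr^2(y, Xc) * var(Xc)^gamma, whose maximizer moves from the OLSR direction (gamma = 0) through PLSR (gamma = 1) towards PCR (gamma large); the crude random search and all names are illustrative assumptions, not the notation or algorithm of the thesis:

import numpy as np

def cr_direction(X, y, gamma, n_trials=5000, seed=0):
    # Approximate first continuum-regression direction by random search over
    # unit vectors c, maximizing corr^2(y, Xc) * var(Xc)^gamma (assumed form).
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    best_c, best_T = None, -np.inf
    for _ in range(n_trials):
        c = rng.normal(size=X.shape[1])
        c /= np.linalg.norm(c)
        t = Xc @ c
        corr2 = (t @ yc) ** 2 / ((t @ t) * (yc @ yc))
        T = corr2 * (t @ t / len(t)) ** gamma
        if T > best_T:
            best_c, best_T = c, T
    return best_c

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))
y = X @ np.array([1.0, 0.5, 0.0]) + rng.normal(scale=0.3, size=40)
for gamma in (0.0, 1.0, 10.0):
    # The fitted direction changes with gamma and, as the article shows,
    # can even jump discontinuously at particular metaparameter values.
    print(gamma, cr_direction(X, y, gamma))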

Place, publisher, year, edition, pages
Stockholm: Matematiska institutionen, 2007. 146 p.
Keywords
regression, prediction, principal components regression, ridge regression, partial least squares
National Category
Probability Theory and Statistics
Research subject
Mathematical Statistics
urn:nbn:se:su:diva-7025 (URN), 978-91-7155-486-4 (ISBN)
Public defence
2007-09-28, Room 14, Building 5, Kräftriket, Stockholm, 13:00
Available from: 2007-09-06. Created: 2007-08-28. Bibliographically approved.

Open Access in DiVA

No full text
