A two-parametric class of predictors in multivariate regression
Björkström, Anders (Stockholm University, Faculty of Science, Department of Mathematics)
Sundberg, Rolf (Stockholm University, Faculty of Science, Department of Mathematics)
2007 (English). In: Journal of Chemometrics, ISSN 0886-9383, E-ISSN 1099-128X, Vol. 21, no. 5-6, pp. 215-226. Article in journal (Refereed). Published.
Abstract [en]

We demonstrate that a number of well-established multivariate regression methods for prediction are related in that they are special cases of basically one general procedure. We try a more general method based on this procedure with two metaparameters. In a simulation study, based on a latent structure model, we compare this method to ridge regression (RR), multivariate partial least squares regression (PLSR) and repeated univariate PLSR. For most types of data sets studied, all methods do approximately equally well. There are some cases where RR and least squares ridge regression (LSRR) yield larger errors than the other methods, and we conclude that one-factor methods are not adequate for situations where more than one latent variable is needed to describe the data. Among those based on latent variables, none of the methods tried is superior to the others in any obvious way.
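The kind of simulation the abstract describes can be sketched as follows: data generated from a one-latent-variable structure, with the ridge regression (RR) predictor evaluated on held-out data. This is an illustrative sketch only; the sizes, noise levels, and lambda values are assumptions, not the authors' actual simulation settings.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 20

def simulate(n):
    t = rng.normal(size=(n, 1))                   # latent variable
    w = rng.normal(size=(1, p))                   # loadings
    X = t @ w + 0.3 * rng.normal(size=(n, p))     # near-collinear predictors
    y = 2.0 * t[:, 0] + 0.3 * rng.normal(size=n)  # response driven by t
    return X, y

X, y = simulate(n)
Xnew, ynew = simulate(n)                          # independent test set

def ridge_fit(X, y, lam):
    """Ridge estimator beta = (X'X + lam I)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

for lam in [1e-8, 1.0, 10.0]:                     # lam -> 0 approaches OLS
    beta = ridge_fit(X, y, lam)
    mse = np.mean((ynew - Xnew @ beta) ** 2)
    print(f"lambda={lam:g}  test MSE={mse:.3f}")
```

As lambda grows, the ridge estimator shrinks the coefficient vector toward zero, trading a little bias for a large reduction in variance when the predictors are near-collinear; a full comparison in the spirit of the paper would add PLSR fits to the same loop.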

Place, publisher, year, edition, pages
2007. Vol. 21, no. 5-6, pp. 215-226.
Keyword [en]
joint continuum regression, multivariate prediction, multivariate regression, PCR, PLSR, reduced rank regression, ridge regression, SIMPLS, total least squares
National Category
Mathematics
Identifiers
URN: urn:nbn:se:su:diva-24418
DOI: 10.1002/cem.1063
ISI: 000250098200006
OAI: oai:DiVA.org:su-24418
DiVA: diva2:197490
Note
Part of urn:nbn:se:su:diva-7025. Available from: 2007-09-06. Created: 2007-08-28. Last updated: 2017-12-13. Bibliographically approved.
In thesis
1. Regression methods in multidimensional prediction and estimation
2007 (English)Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

In regression with near collinear explanatory variables, the least squares predictor has large variance. Ordinary least squares regression (OLSR) often leads to unrealistic regression coefficients. Several regularized regression methods have been proposed as alternatives. Well-known are principal components regression (PCR), ridge regression (RR) and continuum regression (CR). The latter two involve a continuous metaparameter, offering additional flexibility.

For a univariate response variable, CR incorporates OLSR, PLSR, and PCR as special cases, for special values of the metaparameter. CR is also closely related to RR. However, CR can in fact yield regressors that vary discontinuously with the metaparameter. Thus, the relation between CR and RR is not always one-to-one. We develop a new class of regression methods, LSRR, essentially the same as CR, but without discontinuities, and prove that any optimization principle will yield a regressor proportional to a RR, provided only that the principle implies maximizing some function of the regressor's sample correlation coefficient and its sample variance. For a multivariate response vector we demonstrate that a number of well-established regression methods are related, in that they are special cases of basically one general procedure. We try a more general method based on this procedure, with two meta-parameters. In a simulation study we compare this method to ridge regression, multivariate PLSR and repeated univariate PLSR. For most types of data studied, all methods do approximately equally well. There are cases where RR and LSRR yield larger errors than the other methods, and we conclude that one-factor methods are not adequate for situations where more than one latent variable are needed to describe the data. Among those based on latent variables, none of the methods tried is superior to the others in any obvious way.

Place, publisher, year, edition, pages
Stockholm: Matematiska institutionen, 2007. 146 p.
Keyword
regression, prediction, principal components regression, ridge regression, partial least squares
National Category
Probability Theory and Statistics
Research subject
Mathematical Statistics
Identifiers
urn:nbn:se:su:diva-7025 (URN)
978-91-7155-486-4 (ISBN)
Public defence
2007-09-28, room 14, building 5, Kräftriket, Stockholm, 13:00
Opponent
Supervisors
Available from: 2007-09-06. Created: 2007-08-28. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Publisher's full text

Search in DiVA

By author/editor
Björkström, Anders; Sundberg, Rolf
By organisation
Department of Mathematics
In the same journal
Journal of Chemometrics
Mathematics
