reviewer: Miloslav Znojil

reviewernum: 9689

revieweremail: znojil@ujf.cas.cz

zblno: DE015066134

author: Osborne, M. R.; Presnell, Brett; Turlach, B. A.

shorttitle: Selection of variables in least squares problems

source: IMA J. Numer. Anal. 20, No. 3, 389-403 (2000).

rpclass: 65F20

rsclass: 65F35; 55Q52; 62J12; 91B44

keywords: exploratory data analysis, design matrix, selection of columns, stepwise regression, minimal sum of squares of residuals, selection mechanisms, homotopy method, descent method

revtext: The least squares technique for solving a large inhomogeneous system of linear algebraic equations is considered within the so-called lasso approach, which reduces the set of variables (columns of the design matrix). The ``non-smooth'' constraint (demanding that the $\ell_1$ norm of the solution vector $x$ be smaller than a constant $\kappa$) has the useful property of forcing components of $x$ to zero when $\kappa$ is small, but precisely this requirement makes the problem difficult. The paper introduces and studies two complementary methods. Both have a finite termination property, and both admit an efficient implementation via modified Gram-Schmidt orthogonalization. One of them (a compact descent method) can operate as a probe at a particular value of $\kappa$; the other (a homotopy method) describes the possible selection regimes globally as $\kappa$ varies.
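
In compact form, writing $A$ for the design matrix and $b$ for the observation vector (illustrative notation, not necessarily that of the paper), the constrained problem reads
$$ \min_{x}\;\|Ax-b\|_2^2 \quad\text{subject to}\quad \|x\|_1=\sum_i |x_i|\le\kappa, $$
so that shrinking $\kappa$ drives more and more components of the minimizer exactly to zero.

A small numerical sketch of the homotopy idea, using scikit-learn's lars_path (which implements a closely related lasso homotopy, parametrized by the penalty level rather than by $\kappa$ itself; this is an illustration, not the authors' code):

    import numpy as np
    from sklearn.linear_model import lars_path

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 10))        # illustrative design matrix
    x_true = np.zeros(10)
    x_true[:3] = [2.0, -1.5, 1.0]            # only three columns are relevant
    b = A @ x_true + 0.1 * rng.standard_normal(50)

    # Trace the piecewise-linear solution path; at each breakpoint a
    # variable enters (or leaves) the active set.
    alphas, active, coefs = lars_path(A, b, method='lasso')
    for alpha, coef in zip(alphas, coefs.T):
        print(f"alpha={alpha:8.4f}  ||x||_1={np.abs(coef).sum():7.3f}  "
              f"selected={np.flatnonzero(coef).tolist()}")

The printed $\ell_1$ norms grow along the path, mirroring the role of $\kappa$ in the paper's selection mechanism.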

