Abstract
We consider the joint processing of n independent, similar sparse regression
problems. Each is based on a sample (y_{i1}, x_{i1}), . . . , (y_{im}, x_{im})
of m i.i.d. observations from y_{ik} = x^T_{ik}β_i + ε_{ik}, y_{ik} ∈ R, x_{ik} ∈ R^p,
with ε_{ik} ∼ N(0, σ²), say. The dimension p is large enough that the
empirical risk minimizer is not feasible. We consider, from a Bayesian
point of view, three possible extensions of the lasso. Each of the three
estimators, the lassoes, the group lasso, and the RING lasso, utilizes
different assumptions on the relation between the n vectors β_1, . . . , β_n.
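The sampling model in the abstract can be sketched numerically. The snippet below is an illustrative sketch only: it simulates n problems y_{ik} = x^T_{ik}β_i + ε_{ik} with a shared sparsity pattern and fits each problem separately with a plain lasso (a simple ISTA solver). It does not implement the paper's joint estimators — the lassoes, group lasso, and RING lasso impose additional structure across the n coefficient vectors. All names, dimensions, and the regularization level are assumptions chosen for the demo.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Minimize (1/(2m))||y - Xb||^2 + lam*||b||_1 by ISTA (proximal gradient)."""
    m, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / m   # Lipschitz constant of the smooth part
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / m
        b = soft_threshold(b - grad / L, lam / L)
    return b

rng = np.random.default_rng(0)
n, m, p, s = 3, 100, 50, 4              # problems, samples each, dimension, sparsity
support = rng.choice(p, size=s, replace=False)  # sparsity pattern shared across problems
betas = np.zeros((n, p))
betas[:, support] = rng.normal(size=(n, s))

# Fit each of the n problems independently (no information sharing).
est = np.zeros_like(betas)
for i in range(n):
    X = rng.normal(size=(m, p))
    y = X @ betas[i] + 0.1 * rng.normal(size=m)
    est[i] = lasso_ista(X, y, lam=0.05)
```

Joint procedures such as the group lasso would instead penalize the rows of the n × p coefficient matrix together, exploiting the assumption that the β_i share a common support.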
| Original language | English |
|---|---|
| Title of host publication | Inverse Problems and High-Dimensional Estimation |
| Subtitle of host publication | Stats in the Château Summer School, August 31 - September 4, 2009 |
| Editors | Pierre Alquier, Eric Gautier, Gilles Stoltz |
| Publisher | Springer |
| Pages | 171-189 |
| Number of pages | 19 |
| Volume | 203 |
| Edition | 1 |
| ISBN (Print) | 978-3-642-19988-2 |
| DOIs | |
| Publication status | Published - 2011 |
Publication series

| Name | Lecture Notes in Statistics |
|---|---|
| Publisher | Springer Berlin Heidelberg |
| ISSN (Print) | 0930-0325 |
Keywords / Materials (for Non-textual outputs)
- sparse regression
- LASSO
- Bayesian inference
- compound decision