Abstract
We develop a principled way of identifying probability distributions whose independent and identically distributed (iid) realizations are compressible, i.e., can be well-approximated as sparse. We focus on Gaussian random underdetermined linear regression (GULR) problems, where compressibility is known to ensure the success of estimators exploiting sparse regularization. We prove that many distributions revolving around maximum a posteriori (MAP) interpretation of sparse regularized estimators are in fact incompressible, in the limit of large problem sizes. A highlight is the Laplace distribution and $\ell^{1}$ regularized estimators such as the Lasso and Basis Pursuit denoising. To establish this result, we identify nontrivial undersampling regions in GULR where the simple least squares solution almost surely outperforms an oracle sparse solution, when the data is generated from the Laplace distribution. We provide simple rules of thumb to characterize classes of compressible (respectively incompressible) distributions based on their second and fourth moments. Generalized Gaussians and generalized Pareto distributions serve as running examples for concreteness.
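The abstract's contrast between incompressible Laplace draws and compressible heavy-tailed draws can be illustrated numerically. The sketch below is not from the paper; the Pareto shape parameter, sample size, and the choice of keeping the top 10% of coefficients are illustrative assumptions. It measures compressibility via the relative $\ell^2$ error of the best $k$-term approximation (hard thresholding to the $k$ largest magnitudes).

```python
import numpy as np

def rel_best_k_term_error(x, k):
    """Relative l2 error left after keeping only the k largest-magnitude entries."""
    mags = np.sort(np.abs(x))[::-1]          # magnitudes, descending
    return np.linalg.norm(mags[k:]) / np.linalg.norm(x)

rng = np.random.default_rng(0)
n = 100_000
k = n // 10                                  # keep the top 10% of coefficients

# iid Laplace draws: the abstract's canonical incompressible example
x_laplace = rng.laplace(size=n)

# Symmetrized Lomax (Pareto II) draws with illustrative shape a = 2:
# heavy tails, so a few large entries carry most of the energy
x_pareto = rng.pareto(2.0, size=n) * rng.choice([-1.0, 1.0], size=n)

err_laplace = rel_best_k_term_error(x_laplace, k)
err_pareto = rel_best_k_term_error(x_pareto, k)
```

For iid Laplace samples, keeping the largest 10% of entries still discards roughly 64% of the signal energy (this follows from the exponential tail of the magnitudes), whereas the heavy-tailed Pareto draws are far better approximated by their few largest entries.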
Original language  English

Pages (from-to)  5016-5034
Journal  IEEE Transactions on Information Theory
Volume  58
Issue number  8
DOIs
Publication status  Published - Aug 2012
Keywords
 Basis pursuit
 Lasso
 compressed sensing
 compressible distribution
 high-dimensional statistics
 instance optimality
Projects
 2 Finished

Extensions to compressed sensing theory with application to dynamic MRI
1/03/09 → 31/03/12
Project: Research
