Learning fast sparsifying overcomplete dictionaries

C. Rusu, J. Thompson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper we propose a dictionary learning method that builds an overcomplete dictionary that is computationally efficient to manipulate, i.e., sparse approximation algorithms have sub-quadratic computational complexity. To achieve this we design the dictionary as a product of two factors, both learned from data: an orthonormal component made up of a fixed number of fast fundamental orthonormal transforms, and a sparse component that builds linear combinations of elements from the orthonormal component. We show how effective the proposed technique is at encoding image data and compare it against a previously proposed method from the literature. We expect the current work to contribute to the spread of sparsity and dictionary learning techniques to hardware scenarios where there are hard limits on the computational capabilities and energy consumption of the computer systems.
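The sketch below (not the authors' code) illustrates why such a factored dictionary is cheap to apply: writing the dictionary as D = Q S, where Q is the orthonormal component (a single DCT stands in here for the product of fast fundamental transforms) and S is the learned sparse component, a matrix-vector product D x costs O(nnz(S)) plus O(n log n) instead of the O(n m) of a dense dictionary. The factorization D = Q S and all variable names are assumptions made for illustration.

```python
# Illustrative sketch only: assumes the learned dictionary factors as D = Q S,
# with Q orthonormal and fast to apply (here approximated by one DCT) and S sparse.
import numpy as np
from scipy.fft import dct, idct
from scipy import sparse

n, m = 256, 512            # signal dimension and number of atoms (overcomplete, m > n)
rng = np.random.default_rng(0)

# Sparse component S (n x m): each atom is a sparse linear combination
# of the columns of the orthonormal component Q.
S = sparse.random(n, m, density=4.0 / n, random_state=0, format="csc")

def apply_dictionary(x):
    """Compute D @ x = Q (S x): O(nnz(S)) for S, O(n log n) for the fast transform."""
    return idct(S @ x, norm="ortho")

def apply_dictionary_T(y):
    """Compute D.T @ y = S.T (Q.T y), the adjoint used inside sparse-approximation iterations."""
    return S.T @ dct(y, norm="ortho")

x = rng.standard_normal(m)
y = apply_dictionary(x)    # fast synthesis; no dense n x m multiply is ever formed
print(y.shape)             # (256,)
```

Keeping the dictionary in this factored form, rather than multiplying the factors out, is what preserves the sub-quadratic cost when it is used inside iterative sparse approximation algorithms.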
Original language: Undefined/Unknown
Title of host publication: 2017 25th European Signal Processing Conference (EUSIPCO)
Pages: 723-727
Number of pages: 5
DOIs
Publication status: Published - 1 Aug 2017

Keywords

  • approximation theory
  • computational complexity
  • iterative methods
  • learning (artificial intelligence)
  • sparse matrices
  • dictionary learning method
  • fast fundamental orthonormal transforms
  • image data
  • orthonormal component
  • overcomplete dictionaries
  • sparse approximation algorithms
  • sparse component
  • sub-quadratic computational complexity
  • Approximation algorithms
  • Dictionaries
  • Linear programming
  • Machine learning
  • Signal processing algorithms
  • Sparse matrices
  • Transforms