Warm-Starting in Message Passing Algorithms

Nikolai Skuratovs, Michael E. Davies

Research output: Contribution to conference › Paper › peer-review

Abstract / Description of output

Vector Approximate Message Passing (VAMP) provides the means of solving a linear inverse problem in a Bayes-optimal way, assuming the measurement operator is sufficiently random. However, VAMP requires implementing the linear minimum mean squared error (LMMSE) estimator at every iteration, which makes the algorithm intractable for large-scale problems. In this work, we present a class of warm-started (WS) methods that provides a scalable approximation of the LMMSE estimator within VAMP. We show that a Message Passing (MP) algorithm equipped with a method from this class can converge to the fixed point of VAMP while having a per-iteration computational complexity proportional to that of AMP. Additionally, we provide the Onsager correction and a multi-dimensional State Evolution for MP utilizing one of the WS methods. Lastly, we show that the approximation approach used in the recently proposed Memory AMP (MAMP) algorithm is a special case of the developed class of WS methods.
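To illustrate the idea of warm-starting the LMMSE step, the sketch below approximates the LMMSE estimate with conjugate gradients (CG) initialized from a previous solution, so that across outer MP iterations only a few inner CG steps are needed. This is a minimal illustration, not the authors' method: the function name `lmmse_cg` and all parameters (`gamma` for the prior precision, `sigma2` for the noise variance, `x0` for the warm start) are assumptions introduced here for the example.

```python
import numpy as np

def lmmse_cg(A, y, r, gamma, sigma2, x0=None, iters=20, tol=1e-10):
    """Approximate the LMMSE estimate
        x = (A^T A / sigma2 + gamma I)^{-1} (A^T y / sigma2 + gamma r)
    with conjugate gradients, warm-started from x0 (illustrative sketch)."""
    n = A.shape[1]
    # Matrix-vector product with the LMMSE system matrix; A is never
    # inverted or factorized, which is what makes the step scalable.
    matvec = lambda v: A.T @ (A @ v) / sigma2 + gamma * v
    b = A.T @ y / sigma2 + gamma * r
    # Warm start: reuse the estimate from the previous outer iteration.
    x = np.zeros(n) if x0 is None else x0.copy()
    res = b - matvec(x)
    p = res.copy()
    rs = res @ res
    for _ in range(iters):
        Mp = matvec(p)
        alpha = rs / (p @ Mp)
        x += alpha * p
        res -= alpha * Mp
        rs_new = res @ res
        if np.sqrt(rs_new) < tol:
            break
        p = res + (rs_new / rs) * p
        rs = rs_new
    return x
```

In an outer message-passing loop one would pass the previous iteration's output as `x0`, so each LMMSE approximation starts close to its solution and the per-iteration cost stays at a few matrix-vector products, comparable to AMP.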
Original language: English
Publication status: Accepted/In press - 22 Apr 2022
Event: IEEE International Symposium on Information Theory, ISIT 2022 - Aalto University, Espoo, Finland
Duration: 26 Jun 2022 – 1 Jul 2022


Conference: IEEE International Symposium on Information Theory, ISIT 2022
Abbreviated title: ISIT 2022


