Abstract
Untrained neural network prior (UNNP)-based algorithms have gained increasing popularity in biomedical imaging, offering superior performance compared to hand-crafted priors while requiring no training. UNNP-based methods typically rely on deep architectures, known for their excellent feature extraction ability compared to shallow ones. In contrast to common UNNP-based approaches, we propose a regularized shallow image prior (R-SIP) method that employs a 3-layer multilayer perceptron (MLP) as the UNNP to regularize 2D and 3D electrical impedance tomography (EIT) inversion, and uses handcrafted regularization to promote and stabilize the inversion process. The proposed algorithm is comprehensively evaluated on both simulated and real-world geometric and lung phantoms. We demonstrate significantly improved EIT image quality compared to conventional regularization-based algorithms, particularly in terms of structure preservation — a longstanding challenge in EIT. We also show that 3-layer MLPs with various architectures achieve similar reconstruction quality, indicating that the proposed R-SIP-based algorithm depends less on the network architecture and requires a less complex neural network.
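The abstract does not specify implementation details, but the general R-SIP idea can be illustrated with a minimal sketch: a shallow, untrained 3-layer MLP generates the conductivity image from a fixed latent input, and its weights are fitted so that a differentiable EIT forward model matches the measured voltages, with a handcrafted penalty added for stability. Everything below is an assumption for illustration — the PyTorch setup, the layer sizes, the Tikhonov-style penalty standing in for the paper's handcrafted regularizer, and the `forward_op` placeholder are not taken from the paper.

```python
import torch
import torch.nn as nn

# Hypothetical 3-layer MLP used as an untrained prior: it maps a fixed random
# latent vector to the conductivity image; only its weights are optimized.
class ShallowPrior(nn.Module):
    def __init__(self, latent_dim=128, hidden=256, n_pixels=64 * 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_pixels),
        )

    def forward(self, z):
        return self.net(z)

def reconstruct(forward_op, v_meas, iters=2000, lam=1e-3):
    """Fit the untrained MLP so its output explains the measured voltages.

    `forward_op` is an assumed differentiable EIT forward model
    (conductivity -> boundary voltages); `lam` weights an illustrative
    Tikhonov-style handcrafted penalty that stabilizes the inversion.
    """
    z = torch.randn(1, 128)                  # fixed latent input, never updated
    prior = ShallowPrior()
    opt = torch.optim.Adam(prior.parameters(), lr=1e-3)
    for _ in range(iters):
        sigma = prior(z).view(-1)            # current conductivity estimate
        data_fit = ((forward_op(sigma) - v_meas) ** 2).sum()
        reg = lam * (sigma ** 2).sum()       # stand-in handcrafted regularizer
        loss = data_fit + reg
        opt.zero_grad()
        loss.backward()
        opt.step()
    return prior(z).detach().view(64, 64)
```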
| Original language | English |
|---|---|
| Article number | 4502911 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Instrumentation and Measurement |
| Volume | 74 |
| Early online date | 25 Feb 2025 |
| DOIs | |
| Publication status | E-pub ahead of print - 25 Feb 2025 |
Keywords
- Electrical impedance tomography (EIT)
- inverse problem (IP)
- untrained neural network prior (UNNP)
- handcrafted prior
- shallow multilayer perceptron (MLP)