Document detail
ID

oai:arXiv.org:2402.16623

Topic
Mathematics - Numerical Analysis; Mathematics - Optimization and Control; MSC: 62F15, 65F22, 65K10, 68Q25, 68U10
Author
Lindbloom, Jonathan; Glaubitz, Jan; Gelb, Anne
Category

Computer Science

Year

2024

listing date

2/28/2024

Keywords
unknown models

Abstract

Bayesian hierarchical models can provide efficient algorithms for finding sparse solutions to ill-posed inverse problems.

The models typically comprise a conditionally Gaussian prior model for the unknown, which is augmented by a generalized gamma hyper-prior model for the variance hyper-parameters.

This investigation generalizes these models and their efficient maximum a posteriori (MAP) estimation using the iterative alternating sequential (IAS) algorithm in two ways: (1) General sparsifying transforms: Diverging from conventional methods, our approach permits the use of sparsifying transformations with nontrivial kernels; (2) Unknown noise variances: We treat the noise variance as a random variable that is estimated during the inference procedure.

This is important in applications where the noise variance cannot be accurately estimated a priori.

Remarkably, these augmentations neither significantly burden the computational expense of the algorithm nor compromise its efficacy.

We include convexity and convergence analysis for the method and demonstrate its efficacy in several numerical experiments.
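The alternating structure behind an IAS-type solver can be sketched as follows. This is a minimal, hypothetical Python sketch of the classical setting only (identity sparsifying transform, known noise variance, plain gamma hyper-prior); the function name, parameter names, and values are illustrative assumptions, not from the paper, whose contribution is precisely to extend such a loop to general transforms and unknown noise variance.

```python
import numpy as np

def ias_sparse_map(A, b, sigma=0.05, beta=1.6, theta_star=1e-4, iters=50):
    """Illustrative IAS loop for a sparse MAP estimate (assumed setup:
    x_i | theta_i ~ N(0, theta_i), theta_i ~ Gamma(beta, theta_star),
    known noise std sigma). Alternates closed-form block updates."""
    m, n = A.shape
    theta = np.full(n, theta_star)          # variance hyper-parameters
    eta = beta - 1.5                        # shape offset used in the theta-update
    x = np.zeros(n)
    for _ in range(iters):
        # x-update: conditionally Gaussian prior -> weighted ridge solve
        M = A.T @ A / sigma**2 + np.diag(1.0 / theta)
        x = np.linalg.solve(M, A.T @ b / sigma**2)
        # theta-update: closed form under the gamma hyper-prior
        theta = theta_star * (eta / 2
                              + np.sqrt(eta**2 / 4 + x**2 / (2 * theta_star)))
    return x, theta
```

Small theta_star drives near-zero coefficients toward zero while leaving large coefficients essentially unpenalized, which is the sparsity-promoting mechanism the abstract refers to.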

Comment: 30 pages, 6 figures

Lindbloom, Jonathan; Glaubitz, Jan; Gelb, Anne, 2024, Generalized sparsity-promoting solvers for Bayesian inverse problems: Versatile sparsifying transforms and unknown noise variances
