Parameter estimation in regularization models for Poisson data / Zanni, Luca. - (2014). (Contribution presented at the First French-German Mathematical Image Analysis Conference (FGMIA-14), held in Paris, France, 13-15 January 2014).
Parameter estimation in regularization models for Poisson data
ZANNI, Luca
2014
Abstract
In many imaging applications the image intensity is measured by counting incident particles; consequently, the fluctuations in the counting process can be taken into account by modeling the data as realizations of Poisson random variables. In this case, the maximum likelihood approach to image restoration leads to minimization problems in which the data-fidelity function is the generalized Kullback-Leibler (KL) divergence. Since, in general, these optimization problems are ill-conditioned, regularization approaches are necessary, and the design of strategies for selecting a proper value of the regularization parameter is a crucial issue. This is still an open problem for Poisson data, although interesting contributions have recently been provided in special cases. In this work we consider some regularization models and the theoretical and numerical issues concerning their parameter estimation. Following the idea of the discrepancy principle, we discuss strategies that provide a parameter estimate as the solution of a discrepancy equation, as well as regularization models in which a suitable estimate is obtained by solving constrained optimization problems that impose an upper bound on the discrepancy function. Furthermore, reconstruction strategies that require only an overestimate of the regularization parameter will also be presented.
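To make the quantities mentioned in the abstract concrete, the following is a minimal sketch of a typical problem formulation, under the assumption of a linear imaging model with operator H, nonnegative background b, observed counts y with n data points, regularization functional R and regularization parameter beta; this notation is introduced here for illustration and is not part of the original abstract.

\[
D_{\mathrm{KL}}(Hx+b;\,y)\;=\;\sum_{i=1}^{n}\Big(\,y_i\ln\frac{y_i}{(Hx+b)_i}\;+\;(Hx+b)_i\;-\;y_i\Big),
\]
\[
x_\beta\;\in\;\arg\min_{x\ge 0}\;\Big\{\,D_{\mathrm{KL}}(Hx+b;\,y)\;+\;\beta\,R(x)\Big\},
\qquad
\frac{2}{n}\,D_{\mathrm{KL}}(Hx_\beta+b;\,y)\;=\;1,
\]
\[
\min_{x\ge 0}\;R(x)\quad\text{subject to}\quad D_{\mathrm{KL}}(Hx+b;\,y)\;\le\;\eta .
\]

In this sketch, the discrepancy-equation strategy selects beta as a root of the normalized-discrepancy relation in the second line (a commonly used form of the discrepancy principle for Poisson data), while the constrained formulation in the last line replaces the penalized problem by imposing an upper bound eta on the KL discrepancy.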