February 25th, 2019
Kelley Engineering Center (KEC) Room 1001
This seminar is free and open to the public.
Optimal low-rank approximations for Bayesian linear inverse problems
In the Bayesian approach to large-scale inverse problems, the data are typically informative relative to the prior only on a low-dimensional subspace of the parameter space. We propose optimal dimensionality reduction techniques for the solution of linear–Gaussian inverse problems that can focus on a quantity of interest (QoI), i.e., a function of the inversion parameters. We study approximations of the posterior covariance of the QoI as low-rank negative updates of its prior covariance, and prove optimality of this update with respect to a natural geodesic distance on the manifold of symmetric positive definite matrices first proposed by Rao. Assuming exact knowledge of the posterior mean of the QoI, the optimality results extend to optimality in distribution with respect to the Kullback–Leibler divergence and the Hellinger distance between the associated distributions. We also propose an approximation of the posterior mean of the QoI as a low-rank linear function of the data, and prove optimality of this approximation with respect to a weighted Bayes risk. These optimal approximations do not require the explicit computation of the full posterior distribution of the parameters; instead, they focus on directions that are well informed by the data and are relevant to the QoI.
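As a concrete illustration of the kind of low-rank negative update described above, the sketch below builds a linear–Gaussian problem and approximates the posterior covariance as the prior covariance minus a rank-r correction along the dominant generalized eigenvectors of the prior-preconditioned data-misfit Hessian. This is a minimal NumPy sketch with made-up dimensions and randomly generated matrices; it illustrates the standard low-rank update formula for the full parameter posterior, not the speaker's specific QoI-focused construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 50, 30, 5  # parameter dim, data dim, update rank (hypothetical sizes)

# Hypothetical linear-Gaussian model: y = G x + noise
G = rng.standard_normal((m, n))
Gamma_obs = np.eye(m)                   # observation noise covariance
A = rng.standard_normal((n, n))
Gamma_pr = A @ A.T + n * np.eye(n)      # SPD prior covariance

# Prior-preconditioned data-misfit Hessian S^T H S, where
# H = G^T Gamma_obs^{-1} G and Gamma_pr = S S^T (Cholesky factor).
S = np.linalg.cholesky(Gamma_pr)
H = G.T @ np.linalg.solve(Gamma_obs, G)
M = S.T @ H @ S
lam, V = np.linalg.eigh(M)              # ascending eigenvalues
lam, V = lam[::-1], V[:, ::-1]          # sort descending

# Rank-r negative update of the prior covariance:
# Gamma_pos ~= Gamma_pr - sum_i lam_i/(1+lam_i) w_i w_i^T, with w_i = S v_i
W = S @ V[:, :r]
d = lam[:r] / (1.0 + lam[:r])
Gamma_pos_r = Gamma_pr - (W * d) @ W.T

# Exact posterior covariance, for comparison
Gamma_pos = np.linalg.inv(H + np.linalg.inv(Gamma_pr))
print(np.linalg.norm(Gamma_pos_r - Gamma_pos))
```

When r equals the full parameter dimension, the update recovers the exact posterior covariance; for small r, the retained directions are exactly those along which the data are most informative relative to the prior.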
Speaker: Luis Tenorio.