In this talk, we will present preconditioning strategies for sparse, adaptive quadrature methods in computational Bayesian inversion of operator equations with distributed, uncertain input parameters. Based on sparsity results for the posterior density, the error bounds and convergence rates of dimension-adaptive Smolyak quadratures can be shown to be independent of the parameter dimension; the error bounds, however, grow exponentially in the inverse covariance of the additive Gaussian observation noise. We will then discuss asymptotic expansions of the Bayesian estimates. These expansions can be combined with a curvature-based reparametrization of the parametric posterior density near its (assumed unique) global maximum to construct quadrature methods whose convergence rates are independent of both the number of parameters and the observation noise variance.
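As a rough, self-contained illustration of the curvature-based preconditioning idea (a sketch, not code from the talk), the Python snippet below builds a Laplace-type change of variables for a two-parameter toy posterior and then integrates in the transformed variable. The forward map, the noise level `sigma`, and all function names are invented for the example, and a plain tensor Gauss-Hermite rule stands in for the dimension-adaptive Smolyak quadrature discussed above.

```python
import numpy as np
from scipy.optimize import minimize
from numpy.polynomial.hermite_e import hermegauss

sigma = 0.05                         # observation noise std; small sigma concentrates the posterior
data = np.array([0.8, 0.3])          # synthetic observations

def forward(u):
    # toy nonlinear forward operator G(u), standing in for the operator equation
    return np.array([u[0] + 0.1 * u[1] ** 2, u[1]])

def neg_log_post(u):
    # negative log-posterior: Gaussian misfit plus standard Gaussian prior
    r = forward(u) - data
    return 0.5 * (r @ r) / sigma**2 + 0.5 * (u @ u)

# Step 1: locate the (assumed unique) global maximum of the posterior (MAP point).
u_map = minimize(neg_log_post, x0=np.zeros(2)).x

# Step 2: second-order finite differences give the Hessian (curvature) at the MAP.
d, h = len(u_map), 1e-4
H = np.empty((d, d))
for i in range(d):
    for j in range(d):
        ei, ej = h * np.eye(d)[i], h * np.eye(d)[j]
        H[i, j] = (neg_log_post(u_map + ei + ej) - neg_log_post(u_map + ei - ej)
                   - neg_log_post(u_map - ei + ej) + neg_log_post(u_map - ei - ej)) / (4 * h**2)
L = np.linalg.cholesky(np.linalg.inv(H))  # u = u_map + L @ z matches the posterior curvature

# Step 3: quadrature in the preconditioned variable z.  After the change of
# variables the integrand varies on an O(1) scale independently of sigma;
# a tensor Gauss-Hermite rule replaces the adaptive Smolyak rule for brevity.
nodes, weights = hermegauss(10)           # probabilists' rule, weight exp(-z^2/2)
num = den = 0.0
for wi, zi in zip(weights, nodes):
    for wj, zj in zip(weights, nodes):
        z = np.array([zi, zj])
        u = u_map + L @ z
        # ratio of the posterior to the Gaussian reference density at z
        w = wi * wj * np.exp(neg_log_post(u_map) - neg_log_post(u) + 0.5 * (z @ z))
        num += w * u[0]                   # numerator for E[u_0 | data]
        den += w
print("posterior mean of u_0:", num / den)
```

Because the change of variables rescales by the posterior curvature at the mode, shrinking sigma leaves the transformed integrand, and hence the quadrature effort, essentially unchanged; without the reparametrization the same rule would need ever finer resolution as the posterior concentrates.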