MA40189: Topics in Bayesian statistics |
Quick links |
Lectures and timetable information |
Lecturer: | Simon Shaw; s.shaw at bath.ac.uk | |
Timetable: | Lectures: | Monday 09:15 (3W4.7) and Tuesday 14:15 (3W4.7). |
Problems classes: | Thursday 16:15 (3W4.7). |
The full unit timetable is available here. A schedule for the course is available here.
Syllabus |
Credits: | 6 |
Level: | Masters |
Period: | Semester 2 |
Assessment: | EX 100% |
Other work: | There will be weekly question sheets. These will be set and handed in during problems classes. Any work submitted by the hand-in deadline will be marked and returned to you. Full solutions to all exercises and general feedback sheets will be made available. |
Requisites: | Before taking this unit you must take MA40092 (home-page). |
Description: | Aims & Learning Objectives: Aims: To introduce students to the ideas and techniques that underpin the theory and practice of the Bayesian approach to statistics. Objectives: Students should be able to formulate the Bayesian treatment and analysis of many familiar statistical problems. Content: Bayesian methods provide an alternative approach to data analysis which can incorporate prior knowledge about a parameter of interest into the statistical model. The prior knowledge takes the form of a prior (to sampling) distribution on the parameter space, which is updated to a posterior distribution via Bayes' theorem using the data. Summaries about the parameter are described using the posterior distribution. The Bayesian paradigm; decision theory; utility theory; exchangeability; the Representation Theorem; prior, posterior and predictive distributions; conjugate priors. Tools to undertake a Bayesian statistical analysis will also be introduced: simulation-based methods, such as Markov chain Monte Carlo and importance sampling, for use when analytical methods fail. |
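As a concrete instance of the prior-to-posterior updating described above: with a Beta prior and Binomial sampling (the conjugate Beta-Binomial example of Lectures 3 and 4), Bayes' theorem reduces to simple parameter arithmetic. The sketch below is a Python illustration only, not part of the unit's R material; the function names are mine.

```python
# Conjugate Beta-Binomial update: a Beta(a, b) prior on theta combined
# with x successes in n Bernoulli trials gives a Beta(a + x, b + n - x)
# posterior, so updating is just parameter arithmetic.

def beta_binomial_update(a, b, x, n):
    """Posterior Beta parameters after observing x successes in n trials."""
    return a + x, b + n - x

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# A weak Beta(1, 1) (uniform) prior versus a stronger Beta(20, 20) prior,
# both updated with the same data: 7 successes in 10 trials.
for a, b in [(1, 1), (20, 20)]:
    a_post, b_post = beta_binomial_update(a, b, x=7, n=10)
    print(f"Beta({a},{b}) prior -> Beta({a_post},{b_post}) posterior, "
          f"mean {beta_mean(a_post, b_post):.3f}")
```

The posterior mean is a compromise between the prior mean and the sample proportion 0.7: the weak prior gives 8/12 ≈ 0.667, while the strong prior pulls it to 27/50 = 0.54, the weak/strong prior behaviour discussed in Lectures 4 and 5.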
Some useful books |
We won't follow a book as such, but useful references, in ascending order of difficulty, include:
Lecture notes and summaries |
Lecture 1 (05 Feb 18): | Introduction: working definitions of classical and Bayesian approaches to inference about parameters. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p4-5 (middle of Example 3). | |
Lecture 2 (06 Feb 18): | §1 The Bayesian method: Bayes' theorem, using Bayes' theorem for parametric inference. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p5 (middle of Example 3)-7 (end of page). | |
Lecture 3 (12 Feb 18): | Sequential data updates, conjugate Bayesian updates, Beta-Binomial example. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p7 (end of page)-9 (after equation (1.11)). | |
Lecture 4 (13 Feb 18): | Definition of conjugate family, role of prior (weak and strong) and likelihood in the posterior. Handout of beta distributions: pdf. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p9 (after equation (1.11))-10 (end of page). | |
Lecture 5 (19 Feb 18): | Example of weak/strong prior finished, kernel of a density, conjugate Normal example. Handout of weak/strong prior example: pdf. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p10 (end of page)-13 (prior to equation (1.17)). | |
Lecture 6 (20 Feb 18): | Conjugate Normal example concluded. Using the posterior for inference, credible interval. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p13 (prior to equation (1.17))-15 (table in Example 9). | |
Lecture 7 (26 Feb 18): | Highest density regions, §2 Modelling: predictive distribution, Binomial-Beta example. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p15 (table in Example 9)-19 (equation (2.3)). | |
Lecture 8 (27 Feb 18): | Predictive summaries, finite exchangeability. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p19 (equation (2.3))-21 (end of Example 14). | |
Lecture 9 (06 Mar 18): | Infinite exchangeability, example of non-extendability of finitely exchangeable sequence, general representation theorem for infinitely exchangeable events. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p21 (end of Example 14)-23 (end of point 3). | |
Lecture 10 (08 Mar 18): | General representation theorem for infinitely exchangeable random variables, example of exchangeable Normal random variables. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p23 (end of point 3)-25 (prior to Section 2.3). | |
Lecture 11 (13 Mar 18): | Sufficiency, k-parameter exponential family, sufficient statistics, conjugate priors for exchangeable k-parameter exponential family random quantities. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p25 (prior to Section 2.3)-27 (equation (2.22)). | |
Lecture 12 (15 Mar 18): | Hyperparameters, usefulness of conjugate priors. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p27 (equation (2.22))-29 (prior to Section 2.4). | |
Lecture 13 (20 Mar 18): | Improper priors, Fisher information matrix, Jeffreys' prior. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p29 (prior to Section 2.4)-31 (end of Example 24). | |
Lecture 14 (22 Mar 18): | Invariance property under transformation of the Jeffreys prior, final remarks about noninformative priors. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p31 (end of Example 24)-34 (end of Chapter 2). | |
Lecture 15 (10 Apr 18): | §3 Computation: normal approximation, expansion about the mode. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p34 (end of Chapter 2)-37 (prior to Section 3.2.1). | |
Lecture 16 (12 Apr 18): | Monte Carlo integration, importance sampling. Basic idea of Markov chain Monte Carlo (MCMC): transition kernel. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p37 (prior to Section 3.2.1)-40 (first paragraph of Section 3.3.1). | |
Lecture 17 (17 Apr 18): | Basic definitions (irreducible, periodic, recurrent, ergodic, stationary) and theorems (existence/uniqueness, convergence, ergodic) of Markov chains and their consequences for MCMC techniques. The Metropolis-Hastings algorithm. Handout: pdf. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p40 (first paragraph of Section 3.3.1)-43 (end of algorithm). | |
Lecture 18 (19 Apr 18): | Example of the Metropolis-Hastings algorithm. Handout of example: pdf. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p43 (end of algorithm)-48 (end of Figure 3.3). | |
Lecture 19 (23 Apr 18): | The Gibbs sampler algorithm and example. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p48 (end of Figure 3.3)-53 (after conditional distributions). | |
Lecture 19A (24 Apr 18): | Gibbs sampler example illustrated. Handout of example: pdf. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p53 (after conditional distributions)-59 (prior to Section 3.3.4). | |
Lecture 20 (26 Apr 18): | Overview of why the Metropolis-Hastings algorithm works, efficiency of MCMC algorithms. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p59 (prior to Section 3.3.4)-61 (after Section 3.3.6). | |
Lecture 21 (30 Apr 18): | §4 Decision theory: Statistical decision theory: loss, risk, Bayes risk and Bayes rule. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p61 (after Section 3.3.6)-65 (start of Example 36). Note that Section 4.1 Utility is omitted this year. | |
Lecture 22 (01 May 18): | Quadratic loss, Bayes risk of the sampling procedure, worked example. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p65 (start of Example 36)-67 (after equation (4.7)). | |
Lecture 23 (03 May 18): | Example concluded. |
Lecture overview: pdf. Handwritten notes: pdf. Online notes: p67 (after equation (4.7))-69 (end of course). |
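Lectures 17 and 18 develop the Metropolis-Hastings algorithm, illustrated in Example 30 by sampling a N(mu.p, sig.p^2) target with a N(theta[t-1], sig.q^2) random-walk proposal. The course's own code is in R (see the R functions below); the following is a minimal Python sketch of that setup, with parameter names mirroring the notes' mu.p, sig.p and sig.q but an implementation that is mine, not the course's.

```python
import math
import random

def metropolis_normal(n_iter, mu_p=0.0, sig_p=1.0, sig_q=1.0, theta0=0.0, seed=1):
    """Random-walk Metropolis sampler for a N(mu_p, sig_p^2) target."""
    random.seed(seed)

    def log_target(theta):
        # Log density of N(mu_p, sig_p^2), up to an additive constant.
        return -0.5 * ((theta - mu_p) / sig_p) ** 2

    theta, chain = theta0, []
    for _ in range(n_iter):
        proposal = random.gauss(theta, sig_q)        # symmetric proposal N(theta, sig_q^2)
        log_alpha = log_target(proposal) - log_target(theta)
        if math.log(random.random()) < log_alpha:    # accept with probability min(1, ratio)
            theta = proposal
        chain.append(theta)                          # on rejection the chain repeats theta
    return chain

chain = metropolis_normal(20000)
mean = sum(chain) / len(chain)
var = sum((t - mean) ** 2 for t in chain) / len(chain)
print(f"sample mean {mean:.2f}, sample variance {var:.2f}")  # near 0 and 1
```

Because the proposal is symmetric, the Hastings correction cancels and the acceptance ratio is just the ratio of target densities; as Lecture 20 discusses, the proposal scale sig_q governs how efficiently the chain explores the target.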
R functions |
gibbs2 | Gibbs sampler for θ1|θ2 ~ Bin(n, θ2), θ2|θ1 ~ Beta(θ1+α, n-θ1+β), illustration of Example 32 (p54) of the lecture notes. Sample plots for n=10, α=β=1 and n=10, α=2, β=3: pdf. |
gibbs.update | Step by step illustration of the Gibbs sampler for bivariate normal, X, Y standard normal with Cov(X, Y) = rho; press return to advance. |
gibbs | Long run version of the Gibbs sampler for bivariate normal, X, Y standard normal with Cov(X, Y) = rho. |
metropolis.update | Step by step illustration of Metropolis-Hastings for sampling from N(mu.p,sig.p^2) with proposal N(theta[t-1],sig.q^2); press return to advance. |
metropolis | Long run version of Metropolis-Hastings for sampling from N(mu.p,sig.p^2) with proposal N(theta[t-1],sig.q^2). |
Illustration of Example 30 (p45) of the lecture notes using metropolis.update and metropolis with mu.p=0, sig.p=1 and firstly sig.q=1 and secondly sig.q=0.6: pdf. | |
plot.mcmc | Plot time series summaries of output from a Markov chain. Allows you to specify burn-in and thinning. |
f | Function for plotting bivariate Normal distribution in gibbs.update. |
All above | All of the above functions in one file for easy reading into R; thanks to Ruth Salway for these functions. |
The following functions are for sampling from bivariate normals, with thanks to Merrilee Hurn.
gibbs1 | Gibbs sampler (arguments: n the number of iterations, rho the correlation coefficient of the bivariate normal, start1 and start2 the initial values for the sampler). |
metropolis1 | Metropolis-Hastings (arguments: n the number of iterations, rho the correlation coefficient of the bivariate normal, start1 and start2 the initial values for the sampler, tau the standard deviation of the Normal proposal). |
metropolis2 | Metropolis-Hastings for sampling from a mixture of bivariate normals (arguments: n the number of iterations, rho the correlation coefficient of the bivariate normal, start1 and start2 the initial values for the sampler, tau the standard deviation of the Normal proposal, sigma2 the variance of the normal mixtures). |
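The bivariate-normal Gibbs samplers above rest on the fact that for standard normal X, Y with correlation rho, each full conditional is again normal: X | Y = y ~ N(rho*y, 1 - rho^2), and symmetrically for Y. The course functions are in R; the following Python sketch of the same algorithm is purely illustrative, with argument names echoing gibbs1's (n, rho, start1, start2).

```python
import math
import random

def gibbs_bivariate_normal(n_iter, rho=0.5, start1=0.0, start2=0.0, seed=1):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Full conditionals: X | Y = y ~ N(rho * y, 1 - rho^2), and symmetrically for Y.
    """
    random.seed(seed)
    sd = math.sqrt(1 - rho ** 2)       # conditional standard deviation
    x, y = start1, start2
    xs, ys = [], []
    for _ in range(n_iter):
        x = random.gauss(rho * y, sd)  # update X from its full conditional
        y = random.gauss(rho * x, sd)  # update Y given the freshly drawn X
        xs.append(x)
        ys.append(y)
    return xs, ys

xs, ys = gibbs_bivariate_normal(20000, rho=0.8)
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
print(f"empirical Cov(X, Y) = {cov:.2f}")  # close to rho = 0.8
```

In practice one would discard an initial burn-in and possibly thin the chain before computing summaries, as the plot.mcmc helper above allows.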
Question sheets and solutions |
Question Sheet Zero: pdf. | Solution Sheet Zero: pdf. Problems class notes: pdf. |
Question Sheet One: pdf. | Solution Sheet One: pdf. Problems class notes: pdf. |
Question Sheet Two: pdf. | Solution Sheet Two: pdf. Problems class notes: pdf. |
Question Sheet Three: pdf. | Solution Sheet Three: pdf. Problems class notes: pdf. |
Question Sheet Four: pdf. | Solution Sheet Four: pdf. Problems class notes: pdf. |
Question Sheet Five: pdf. | Solution Sheet Five: pdf. |
Question Sheet Six: pdf. | Solution Sheet Six: pdf. Problems class notes: pdf. |
Question Sheet Seven: pdf. | Solution Sheet Seven: pdf. Problems class notes: pdf, R functions: txt. |
Question Sheet Eight: pdf. | Solution Sheet Eight: pdf. Problems class notes: pdf (pages C and D). |
Question Sheet Nine: pdf. | Solution Sheet Nine: pdf. |
Past exam papers and solutions |
Exams: | 2016/17 | 2015/16 | 2014/15 | 2013/14 | 2012/13 |
Paper: | pdf | pdf | pdf | pdf | pdf |
Solutions: | pdf | pdf | pdf | pdf | pdf |