Lecturer: Simon Shaw, s.shaw@bath.ac.uk
Unit homepage: http://people.bath.ac.uk/masss/ma40189.html
Lecture notes: http://people.bath.ac.uk/masss/ma40189/MA40189-notes.pdf
\[\begin{eqnarray*} \begin{array}{|c|c|c|c|} \hline
\mbox{Week} & \mbox{Tuesday 15:15} & \mbox{Thursday 11:15} & \mbox{Thursday 17:15} \\ \hline
\mbox{19 (07 Feb 22)} & \mbox{Lecture 1} & \mbox{Lecture 2} & \mbox{Problems Class} \\ \hline
\mbox{20 (14 Feb 22)} & \mbox{Lecture 3} & \mbox{Lecture 4} & \begin{array}{c} \mbox{Problems Class} \\ \mbox{Question Sheet One out} \end{array} \\ \hline
\mbox{21 (21 Feb 22)} & \mbox{Lecture 5} & \mbox{Lecture 6} & \begin{array}{c} \mbox{Problems Class} \\ \mbox{Question Sheet One in} \\ \mbox{Question Sheet Two out} \end{array} \\ \hline
\mbox{22 (28 Feb 22)} & \mbox{Lecture 7} & \mbox{Lecture 8} & \begin{array}{c} \mbox{Problems Class} \\ \mbox{Question Sheet Two in} \\ \mbox{Question Sheet Three out} \end{array} \\ \hline
\mbox{23 (07 Mar 22)} & \mbox{Lecture 9} & \mbox{Lecture 10} & \begin{array}{c} \mbox{Problems Class} \\ \mbox{Question Sheet Three in} \\ \mbox{Question Sheet Four out} \end{array} \\ \hline
\mbox{24 (14 Mar 22)} & \mbox{Lecture 11} & \mbox{Lecture 12} & \begin{array}{c} \mbox{Problems Class} \\ \mbox{Question Sheet Four in} \\ \mbox{Question Sheet Five out} \end{array} \\ \hline
\mbox{25 (21 Mar 22)} & \mbox{Lecture 13} & \mbox{Lecture 14} & \begin{array}{c} \mbox{Problems Class} \\ \mbox{Question Sheet Five in} \\ \mbox{Question Sheet Six out} \end{array} \\ \hline
\mbox{26 (28 Mar 22)} & \mbox{Lecture 15} & \mbox{Lecture 16} & \begin{array}{c} \mbox{Problems Class} \\ \mbox{Question Sheet Six in} \\ \mbox{Question Sheet Seven out} \end{array} \\ \hline
\mbox{27 (04 Apr 22)} & \mbox{Lecture 17} & \mbox{Lecture 18} & \begin{array}{c} \mbox{Problems Class} \\ \mbox{Question Sheet Seven in} \\ \mbox{Question Sheet Eight out} \end{array} \\ \hline
\mbox{30 (25 Apr 22)} & \mbox{Lecture 19} & \mbox{Lecture 20} & \begin{array}{c} \mbox{Problems Class} \\ \mbox{Question Sheet Eight in} \\ \mbox{Question Sheet Nine out} \end{array} \\ \hline
\mbox{31 (02 May 22)} & \begin{array}{c} \mbox{Lecture 21} \\ \mbox{Revision class?} \end{array} & \begin{array}{c} \mbox{Lecture 22} \\ \mbox{Revision class?} \end{array} & \mbox{Revision class?} \\ \hline
\end{array} \end{eqnarray*}\]
Homework: There will be weekly question sheets handed out in the Thursday problems class. They should be submitted in the problems class on the following Thursday. The question sheets contain questions for submission and also extra questions which may be discussed in the problems class. The extra questions provide additional insight into both the course material and the questions for submission.
Feedback: Any work submitted by the hand-in deadline will be marked and returned to you with personal feedback. Full solutions to the extra questions will be published on Moodle following the problems class in which they were discussed. Full solutions to all questions will be published on Moodle immediately after the hand-in deadline, with hard copies available in the corresponding problems class. General feedback sheets will also be made available.
Office hours: I am happy to discuss any matters relating to the course at any time, either via email or one-to-one. If you would like to meet then just send me an email with a list of proposed times, saying whether you wish to meet in person or on Teams.
Assessment: 100% exam.
Credits: 6
Level: Masters
Period: Semester 2
Assessment: Examination 100%
Other work: There will be weekly question sheets. These will be set and handed in during problems classes.
Requisites: Before taking this unit you must take MA40092.
Description: Aims & Learning Objectives:
Aims: To introduce students to the ideas and techniques that underpin the theory and practice of the Bayesian approach to statistics.
Objectives: Students should be able to formulate the Bayesian treatment and analysis of many familiar statistical problems.
Content: Bayesian methods provide an alternative approach to data analysis, which has the ability to incorporate prior knowledge about a parameter of interest into the statistical model. The prior knowledge takes the form of a prior (to sampling) distribution on the parameter space, which is updated to a posterior distribution via Bayes’ Theorem, using the data. Summaries about the parameter are described using the posterior distribution. Topics: the Bayesian paradigm; decision theory; utility theory; exchangeability; the Representation Theorem; prior, posterior and predictive distributions; conjugate priors. Tools to undertake a Bayesian statistical analysis will also be introduced, including simulation-based methods such as Markov chain Monte Carlo and importance sampling for use when analytical methods fail.
Lecture 1 (08 Feb 22): Introduction: working definitions of classical and Bayesian approaches to inference about parameters.
Lecture 2 (10 Feb 22): 1 The Bayesian method: Bayes’ theorem, using Bayes’ theorem for parametric inference, sequential data updates.
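As a small sketch of how Bayes' theorem drives parametric inference and sequential updating (a toy example assumed here, not taken from the notes), consider a discrete grid of candidate values for a Bernoulli parameter:

```python
# Discrete illustration of Bayes' theorem: posterior over a grid of candidate
# theta values for Bernoulli data, updated one observation at a time.
thetas = [0.1, 0.3, 0.5, 0.7, 0.9]
prior = [0.2] * 5                          # uniform prior on the grid

def update(probs, obs):
    """One application of Bayes' theorem for a single Bernoulli observation."""
    unnorm = [p * (t if obs == 1 else 1 - t) for p, t in zip(probs, thetas)]
    z = sum(unnorm)                        # normalising constant p(x)
    return [u / z for u in unnorm]

post = prior
for obs in [1, 1, 0, 1]:                   # data arriving sequentially
    post = update(post, obs)               # yesterday's posterior is today's prior
```

Processing all four observations in a single batch, with likelihood θ³(1−θ), produces the identical posterior — this is the sequential-update property of Bayes' theorem.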
Lecture 3 (15 Feb 22): Conjugate Bayesian updates, Beta-Binomial example.
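The Beta-Binomial conjugate update can be sketched in a few lines (the hyperparameter and data values below are assumed for illustration, not the lecture's example):

```python
# Conjugate Beta-Binomial update: prior theta ~ Beta(a, b); observe x
# successes in n Bernoulli trials; posterior is theta | x ~ Beta(a + x, b + n - x).
a, b = 2.0, 2.0        # prior hyperparameters (assumed for illustration)
n, x = 10, 7           # observed: 7 successes in 10 trials
a_post, b_post = a + x, b + n - x
post_mean = a_post / (a_post + b_post)     # (a + x) / (a + b + n) = 9/14
```

Note how the posterior mean 9/14 ≈ 0.64 sits between the prior mean a/(a+b) = 0.5 and the maximum likelihood estimate x/n = 0.7.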
Lecture 4 (17 Feb 22): Definition of a conjugate family, role of prior (weak and strong) and likelihood in the posterior.
Lecture 5 (22 Feb 22): Kernel of a density, conjugate Normal example, using the posterior for inference: credible interval.
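The conjugate Normal update and a symmetric credible interval can be sketched as follows (the numerical inputs are assumed for illustration):

```python
import math

# Conjugate Normal update with known variance: X_i | mu ~ N(mu, sigma2),
# prior mu ~ N(mu0, tau02). The posterior for mu is Normal with
# precision-weighted mean.
def normal_posterior(xbar, n, sigma2, mu0, tau02):
    prec = 1.0 / tau02 + n / sigma2                     # posterior precision
    mean = (mu0 / tau02 + n * xbar / sigma2) / prec     # precision-weighted mean
    return mean, 1.0 / prec                             # posterior mean, variance

mean, var = normal_posterior(xbar=5.2, n=20, sigma2=4.0, mu0=0.0, tau02=100.0)
# symmetric 95% credible interval from the Normal posterior
lo, hi = mean - 1.96 * math.sqrt(var), mean + 1.96 * math.sqrt(var)
```

With a vague prior (large tau02), as here, the posterior mean is dominated by the sample mean and the interval resembles the classical confidence interval, though its interpretation is a direct probability statement about mu.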
Lecture 6 (24 Feb 22): Highest density regions. 2 Modelling: predictive distribution.
Lecture 7 (01 Mar 22): Binomial-Beta example, finite and infinite exchangeability, example of non-extendibility of finitely exchangeable sequence.
Lecture 8 (03 Mar 22): General representation theorem for infinitely exchangeable events and random quantities, example of exchangeable Normal random quantities, sufficiency.
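For reference, the representation theorem (de Finetti) for an infinitely exchangeable sequence of 0-1 random quantities \(X_1, X_2, \ldots\) states:

```latex
\[
P(X_1 = x_1, \ldots, X_n = x_n)
  = \int_0^1 \prod_{i=1}^{n} \theta^{x_i}(1-\theta)^{1-x_i} \, dQ(\theta)
\]
```

for some distribution function \(Q\) on \([0,1]\): conditional on \(\theta\), the \(X_i\) are judged to be independent Bernoulli(\(\theta\)) quantities, with \(Q\) playing the role of a prior distribution on \(\theta\).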
Lecture 9 (08 Mar 22): k-parameter exponential family, sufficient statistics and conjugate priors for exchangeable k-parameter exponential family random quantities.
Lecture 10 (10 Mar 22): Hyperparameters, improper priors, Fisher information matrix, Jeffreys’ prior.
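A standard worked instance of Jeffreys' prior (for the Bernoulli model, which may or may not be the example used in the lecture):

```latex
\[
I(\theta)
  = \mathrm{E}\!\left[-\frac{\partial^2}{\partial\theta^2}\log f(X \mid \theta)\,\Big|\,\theta\right]
  = \frac{1}{\theta(1-\theta)}, \qquad
\pi(\theta) \propto I(\theta)^{1/2} = \theta^{-1/2}(1-\theta)^{-1/2},
\]
```

i.e. Jeffreys' prior for the Bernoulli parameter is the Beta(1/2, 1/2) distribution.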
Lecture 11 (15 Mar 22): Transformations, invariance property under transformation of Jeffreys’ prior, final remarks about noninformative priors.
Lecture 12 (17 Mar 22): 3 Computation: normal approximation, expansion about the mode.
Lecture 13 (22 Mar 22): Monte Carlo integration, importance sampling, basic idea of Markov chain Monte Carlo (MCMC), transition kernel.
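A minimal importance sampling sketch (the target and proposal are assumed toy choices, not the lecture's example): estimate \(E_f[X^2] = 1\) for \(f = N(0,1)\) using draws from a wider proposal \(g = N(0, 2^2)\), reweighting each draw by \(f(x)/g(x)\):

```python
import math
import random

def norm_pdf(x, mu, sd):
    """Density of N(mu, sd^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

random.seed(1)                              # fixed seed for reproducibility
draws = [random.gauss(0.0, 2.0) for _ in range(100_000)]
# importance weights f(x)/g(x) correct for sampling from g rather than f
weights = [norm_pdf(z, 0.0, 1.0) / norm_pdf(z, 0.0, 2.0) for z in draws]
estimate = sum(w * z * z for w, z in zip(weights, draws)) / len(draws)
# estimate approximates E_f[X^2] = 1
```

The proposal must have heavier tails than the target here; sampling from a narrower g would give weights with large or infinite variance.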
Lecture 14 (24 Mar 22): Basic definitions (irreducible, periodic, recurrent, ergodic, stationary) and theorems (existence/uniqueness, convergence, ergodic) of Markov chains and their consequences for MCMC techniques.
Lecture 15 (29 Mar 22): The Metropolis-Hastings algorithm and example.
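A random-walk Metropolis-Hastings sketch (the target is an assumed toy choice, not necessarily the lecture's example): sample from an unnormalised target \(p(x) \propto \exp(-x^2/2)\), i.e. N(0,1):

```python
import math
import random

def log_target(x):
    return -0.5 * x * x                     # log p(x) up to an additive constant

random.seed(0)
x, chain = 0.0, []
for _ in range(50_000):
    prop = x + random.gauss(0.0, 1.0)       # symmetric random-walk proposal
    # accept with probability min(1, p(prop)/p(x)); the symmetric proposal
    # density cancels from the Hastings ratio
    log_alpha = log_target(prop) - log_target(x)
    if log_alpha >= 0 or random.random() < math.exp(log_alpha):
        x = prop
    chain.append(x)                         # a rejection repeats the old state
mean = sum(chain) / len(chain)              # should be near the target mean 0
```

Only the ratio of target densities is needed, so the normalising constant — typically unavailable in posterior computations — never has to be evaluated.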
Lecture 16 (31 Mar 22): The Gibbs sampler: algorithm and example.
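A Gibbs sampler sketch for a standard bivariate Normal with correlation rho (a common toy example, assumed here). The full conditionals are \(X \mid Y = y \sim N(\rho y, 1-\rho^2)\) and \(Y \mid X = x \sim N(\rho x, 1-\rho^2)\):

```python
import random

random.seed(0)
rho = 0.8
cond_sd = (1.0 - rho ** 2) ** 0.5           # sd of each full conditional
x = y = 0.0
xs, ys = [], []
for _ in range(50_000):
    x = random.gauss(rho * y, cond_sd)      # draw from X | Y = y
    y = random.gauss(rho * x, cond_sd)      # draw from Y | X = x
    xs.append(x)
    ys.append(y)
# the sample correlation of (xs, ys) should approach rho
```

Each sweep only ever samples from one-dimensional conditionals, which is the appeal of Gibbs sampling when the joint posterior is awkward but its full conditionals are standard distributions.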
Lecture 17 (05 Apr 22): Overview of why the Metropolis-Hastings algorithm works, efficiency of MCMC algorithms and using the MCMC samples for inference.
Lecture 18 (07 Apr 22): 4 Decision theory: preferences, preference ordering, gambles.
Lecture 19 (26 Apr 22): Utility, statistical decision theory: loss, risk, Bayes risk and Bayes rule.
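In the usual notation (which may differ in detail from the notes), the central objects are:

```latex
\[
R(\theta, \delta) = \mathrm{E}\left[L(\theta, \delta(X)) \mid \theta\right], \qquad
r(\pi, \delta) = \mathrm{E}_{\pi}\left[R(\theta, \delta)\right],
\]
```

where \(R(\theta, \delta)\) is the risk of decision rule \(\delta\) under loss \(L\), \(r(\pi, \delta)\) is the Bayes risk with respect to prior \(\pi\), and a Bayes rule \(\delta^*\) minimises \(r(\pi, \delta)\) over \(\delta\).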
Lecture 20 (28 Apr 22): Bayes risk of the sampling procedure, worked example.