MA40092: Classical Statistical Inference
AIMS To develop a formal basis for methods of statistical inference, including criteria for the comparison of procedures. To give an in-depth description of the asymptotic theory of maximum likelihood methods and hypothesis testing.
OBJECTIVES On completing the course, students should be able to:
calculate properties of estimates and tests;
derive efficient estimates and tests for a broad range of problems, including applications to a variety of standard distributions;
use the asymptotic theory for maximum likelihood estimators to derive approximate confidence intervals and tests.
Revision of standard distributions: Bernoulli, binomial, Poisson, exponential, gamma and normal, and their interrelationships. Sufficiency and exponential families.
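As a concrete illustration of these two ideas together (my own worked example, not taken from the course notes): writing the Bernoulli likelihood in one-parameter exponential family form immediately exhibits a sufficient statistic via the factorisation theorem.

```latex
% One-parameter exponential family form:
%   f(x \mid \theta) = h(x)\,\exp\{\eta(\theta)\,T(x) - A(\theta)\}.
%
% Bernoulli(p) example: for x_1,\dots,x_n \in \{0,1\},
\[
\prod_{i=1}^{n} p^{x_i}(1-p)^{1-x_i}
  = \exp\!\left\{\left(\sum_{i=1}^{n} x_i\right)\log\frac{p}{1-p}
      + n\log(1-p)\right\}.
\]
% The right-hand side depends on the data only through
% T(x) = \sum_{i=1}^{n} x_i, so by the factorisation theorem
% (take g(T(x), p) to be the whole exponential and h(x) = 1)
% T is sufficient for p.
```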
Point estimation: Bias and variance considerations. Rao-Blackwell theorem. Cramér-Rao lower bound and efficiency. Minimum variance unbiased estimators and a direct appreciation of efficiency through some examples.
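A short simulation gives the "direct appreciation" mentioned above (this sketch and its parameter values are my own, not part of the course materials): for a Bernoulli(p) sample the Fisher information is I(p) = 1/(p(1-p)), so the Cramér-Rao lower bound for unbiased estimators of p is p(1-p)/n, and the sample mean attains it.

```python
import random

random.seed(0)

def crlb_bernoulli_mean(p, n):
    """Cramer-Rao lower bound for unbiased estimators of p from n
    Bernoulli(p) observations: 1/(n * I(p)), where I(p) = 1/(p(1-p))."""
    return p * (1.0 - p) / n

def simulated_variance_of_sample_mean(p, n, reps):
    """Monte Carlo estimate of Var(X-bar) over `reps` samples of size n."""
    means = []
    for _ in range(reps):
        xs = [1 if random.random() < p else 0 for _ in range(n)]
        means.append(sum(xs) / n)
    grand = sum(means) / reps
    return sum((m - grand) ** 2 for m in means) / (reps - 1)

p, n = 0.3, 50
bound = crlb_bernoulli_mean(p, n)                       # = 0.0042 exactly
var_hat = simulated_variance_of_sample_mean(p, n, 4000)
# var_hat lands close to `bound`: X-bar is unbiased for p and attains
# the CRLB, i.e. it is the minimum variance unbiased estimator here.
```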
Asymptotic theory for maximum likelihood estimators and its uses.
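One standard use of that asymptotic theory is a Wald-type confidence interval (a minimal sketch with my own choice of distribution and parameters, not a course-prescribed method): for X_i ~ Exp(lambda) the MLE is 1/x-bar, the per-observation Fisher information is 1/lambda^2, so the MLE is approximately N(lambda, lambda^2/n) for large n.

```python
import math
import random

random.seed(1)

def exp_rate_mle_ci(xs, z=1.96):
    """MLE and approximate 95% CI for the rate of an exponential sample.

    For X_i ~ Exp(lam), the MLE is lam_hat = 1/x_bar and I(lam) = 1/lam^2,
    so asymptotically lam_hat ~ N(lam, lam^2/n), giving the Wald interval
    lam_hat +/- z * lam_hat / sqrt(n).
    """
    n = len(xs)
    lam_hat = n / sum(xs)
    half_width = z * lam_hat / math.sqrt(n)
    return lam_hat, lam_hat - half_width, lam_hat + half_width

true_rate, n = 2.0, 400
sample = [random.expovariate(true_rate) for _ in range(n)]
lam_hat, lo, hi = exp_rate_mle_ci(sample)
# lam_hat should be close to true_rate, with the interval (lo, hi)
# covering it roughly 95% of the time over repeated samples.
```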
Hypothesis testing: Review of the Neyman-Pearson lemma and maximisation of power. Composite alternative hypotheses, uniformly most powerful tests. Composite null hypotheses, monotone likelihood ratio property. Generalised likelihood ratio tests, asymptotic theory, nuisance parameters. Examples relevant to other final year statistics units.
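The simplest generalised likelihood ratio test can be worked end to end (an illustrative sketch with a toy dataset of my own, not an example from the course): testing H0: mu = 0 against mu != 0 for a N(mu, sigma^2) sample with sigma known, the GLR statistic reduces to -2 log(Lambda) = n * x_bar^2 / sigma^2, referred to a chi-squared distribution with 1 degree of freedom (exactly here, and asymptotically in general by Wilks' theorem).

```python
def glrt_normal_mean(xs, sigma=1.0):
    """Generalised likelihood ratio test of H0: mu = 0 vs mu != 0
    for a N(mu, sigma^2) sample with sigma known.

    The GLR simplifies to -2 log(Lambda) = n * x_bar^2 / sigma^2; under
    H0 this is chi-squared with 1 df (exactly, since x_bar is normal;
    asymptotically in general by Wilks' theorem).
    """
    n = len(xs)
    x_bar = sum(xs) / n
    return n * x_bar ** 2 / sigma ** 2

data = [0.5, -0.2, 1.1, 0.3, 0.8, -0.4, 0.6, 0.9]  # toy data, n = 8
stat = glrt_normal_mean(data)        # x_bar = 0.45, so stat = 8 * 0.45^2 = 1.62
CHI2_1_95 = 3.841                    # upper 5% point of chi-squared_1
reject = stat > CHI2_1_95            # False: do not reject H0 at the 5% level
```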
Detailed course outline
Problem sheet 1 (exponential families, sufficient statistics)
Problem sheet 2 (MLEs, sufficient statistics, bias)
Problem sheet 3 (sufficiency, bias, Rao-Blackwell theorem)
Problem sheet 4 (Rao-Blackwell theorem, Cramér-Rao lower bound)
Problem sheet 5 (Cramér-Rao lower bound, minimum variance unbiased estimators)
Problem sheet 6 (MLEs, functional invariance)
Problem sheet 7 (Neyman-Pearson lemma, UMP tests, monotone likelihood ratio tests)
Problem sheet 8 (monotone likelihood ratio tests, generalised likelihood ratio tests)
Chapter 1 Exponential families, sufficiency, factorisation theorem, invariance of MLEs.
Chapter 2 Rao-Blackwell theorem, Cramér-Rao lower bound, efficiency, minimum variance unbiased estimators.
Chapter 3 Asymptotic theory of maximum likelihood estimators.
Chapter 4 Review of hypothesis testing and the Neyman-Pearson lemma, simple hypotheses, composite alternatives (one and two sided).
Chapter 5 Monotone likelihood ratio tests; Generalised likelihood ratio tests.
A quick note on Chebyshev's inequality
Extra for Chapter 3 A sketch proof of the asymptotic distribution for MLEs in the univariate case (not examinable).
Two links to some lecture notes with a more formal treatment of the asymptotics (again, just out of interest).
ASSESSMENT 100% examination.
Exam papers for the last five years can be found on the library's archive.
SOME SUGGESTED READING
Bain and Engelhardt
Introduction to probability and mathematical statistics 512.75 BAI
Cox and Hinkley
Theoretical statistics 512.76 COX
Hogg and Craig
Introduction to mathematical statistics 512.75 HOG
Silvey
Statistical inference 512.76 SIL
For details of how to contact me, see my homepage.