Set: Problems class, Thursday 22nd February 2024.
Due in: Problems class, Thursday 29th February 2024. Paper copies may be submitted in the problems class, or directly to me in lectures or at my office, 4W4.10. PDF copies may be submitted via the portal on the Moodle page.
Task: Attempt questions 1-2; questions 3-4 are extra questions which may be discussed in the problems class.
1. Let \(X_{1}, \ldots, X_{n}\) be conditionally independent given \(\lambda\) so that \(f(x \, | \, \lambda) = \prod_{i=1}^{n} f(x_{i} \, | \, \lambda)\) where \(x = (x_{1}, \ldots, x_{n})\). Suppose that \(\lambda \sim Gamma(\alpha, \beta)\) and \(X_{i} \, | \, \lambda \sim Exp(\lambda)\) where \(\lambda\) represents the rate so that \(E(X_{i} \, | \, \lambda) = \lambda^{-1}\).
Show that \(\lambda \, | \, x \sim Gamma(\alpha+n, \beta + n\bar{x})\).
Show that the posterior mean for the failure rate \(\lambda\) can be written as a weighted average of the prior mean of \(\lambda\) and the maximum likelihood estimate, \(\bar{x}^{-1}\), of \(\lambda\).
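[Note: not part of the required answer, but the identity can be sanity-checked numerically. The sketch below uses the \(Gamma(\alpha+n, \beta+n\bar{x})\) posterior from the previous part; the values of \(\alpha\), \(\beta\) and the data are arbitrary illustrative choices.]

```python
import numpy as np

# Illustrative values only: any alpha, beta > 0 and positive data will do.
alpha, beta = 2.0, 3.0
x = np.array([0.8, 1.3, 0.5, 2.1])
n, xbar = len(x), x.mean()

# Mean of the Gamma(alpha + n, beta + n*xbar) posterior.
post_mean = (alpha + n) / (beta + n * xbar)

# Weighted average of the prior mean alpha/beta and the MLE 1/xbar.
w = beta / (beta + n * xbar)
weighted = w * (alpha / beta) + (1 - w) * (1 / xbar)

print(post_mean, weighted)  # the two values agree
```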
A water company is interested in the failure rate of water pipes. They ask two groups of engineers about their prior beliefs about the failure rate. The first group believe the mean failure rate is \(\frac{1}{8}\) with coefficient of variation \(\frac{1}{\sqrt{11}}\), whilst the second group believe the mean is \(\frac{1}{11}\) with coefficient of variation \(\frac{1}{2}\). [Note: The coefficient of variation is the standard deviation divided by the mean.] Let \(X_{i}\) be the time until water pipe \(i\) fails and assume that the \(X_{i}\) follow the exponential likelihood model described above. A sample of five pipes was taken and the following times to failure were observed: \(8.2, \ 9.2, \ 11.2, \ 9.8, \ 10.1.\)
Find the members of the Gamma family that represent the prior statements of the two groups of engineers. In each case find the posterior mean and variance. Approximating the posterior by \(N(E(\lambda \, | \, x), Var(\lambda \, | \, x))\), where \(x = (x_{1}, \ldots, x_{5})\), estimate, in each case, the probability that the failure rate is less than 0.1.
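[Note: one possible way to carry out these calculations is sketched below, using the rate parameterisation above, for which the prior mean is \(\alpha/\beta\) and the coefficient of variation is \(1/\sqrt{\alpha}\); `scipy` is simply a convenient choice of tool.]

```python
import numpy as np
from scipy.stats import norm

x = np.array([8.2, 9.2, 11.2, 9.8, 10.1])
n, total = len(x), x.sum()

# For lambda ~ Gamma(alpha, beta): mean = alpha/beta and CV = 1/sqrt(alpha),
# so alpha = 1/CV^2 and beta = alpha/mean.
for name, mean, cv in [("Group 1", 1/8, 1/np.sqrt(11)),
                       ("Group 2", 1/11, 1/2)]:
    alpha = 1 / cv**2
    beta = alpha / mean
    a_post, b_post = alpha + n, beta + total  # Gamma(alpha + n, beta + n*xbar)
    post_mean = a_post / b_post
    post_var = a_post / b_post**2
    # Normal approximation to the posterior for P(lambda < 0.1).
    prob = norm.cdf(0.1, loc=post_mean, scale=np.sqrt(post_var))
    print(name, alpha, beta, post_mean, post_var, prob)
```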
How do you expect any differences between the two groups of engineers to be reconciled as more data become available?
2. Let \(x\) be the number of successes in \(n\) independent Bernoulli trials, each one having unknown probability \(\theta\) of success. It is judged that \(\theta\) may be modelled by a \(Unif(0, 1)\) distribution so \[\begin{eqnarray*} f(\theta) & = & 1, \ \ \ \ \ 0 < \theta < 1. \end{eqnarray*}\] An extra trial, \(z\), is performed, independent of the first \(n\) given \(\theta\), but with probability \(\frac{\theta}{2}\) of success. The full data is thus \((x, z)\), where \(z = 1\) if the extra trial is a success and \(0\) otherwise.
Show that \[\begin{eqnarray*} f(\theta \, | \, x, z=0) & = & c\{\theta^{\alpha - 1}(1-\theta)^{\beta - 1} + \theta^{\alpha -1}(1 - \theta)^{\beta}\} \end{eqnarray*}\] where \(\alpha = x+1\), \(\beta = n-x+1\) and \(c = \frac{1}{B(\alpha, \beta) + B(\alpha, \beta+1)}\).
Hence show that \[\begin{eqnarray*} E(\theta \, | \, x, z = 0) & = & \frac{(x+1)(2n-x+4)}{(n+3)(2n-x+3)}. \end{eqnarray*}\] [Hint: Show that \(c = \frac{\alpha + \beta}{B(\alpha, \beta)(\alpha + 2\beta)}\) and work with \(\alpha\) and \(\beta\).]
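[Note: the closed form can be checked numerically, if desired, by integrating the unnormalised posterior \(\theta^{x}(1-\theta)^{n-x}(1-\frac{\theta}{2})\) directly; the values of \(n\) and \(x\) below are arbitrary.]

```python
from scipy.integrate import quad

n, x = 10, 3  # illustrative values

# Unnormalised posterior: binomial likelihood times P(z = 0 | theta) = 1 - theta/2.
post = lambda t: t**x * (1 - t)**(n - x) * (1 - t / 2)

Z, _ = quad(post, 0, 1)
m, _ = quad(lambda t: t * post(t), 0, 1)
print(m / Z)                                                # numerical posterior mean
print((x + 1) * (2*n - x + 4) / ((n + 3) * (2*n - x + 3)))  # closed form; the two agree
```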
Show that, for all \(x\), \(E(\theta \, | \, x, z = 0)\) is less than \(E(\theta \, | \, x, z = 1)\).
3. Let \(X_{1}, \ldots, X_{n}\) be conditionally independent given \(\theta\), so \(f(x \, | \, \theta) = \prod_{i=1}^{n} f(x_{i} \, | \, \theta)\) where \(x = (x_{1}, \ldots, x_{n})\), with each \(X_{i} \, | \, \theta \sim N(\mu, \theta)\) where \(\mu\) is known.
Let \(s(x) = \sum_{i=1}^{n} (x_{i} - \mu)^{2}\). Show that we can write \[\begin{eqnarray*} f(x \, | \, \theta) & = & g(s, \theta)h(x) \end{eqnarray*}\] where \(g(s, \theta)\) depends upon \(s(x)\) and \(\theta\) and \(h(x)\) does not depend upon \(\theta\) but may depend upon \(x\). The equation shows that \(s(X) = \sum_{i=1}^{n} (X_{i} - \mu)^{2}\) is sufficient for \(X_{1}, \ldots, X_{n}\) for learning about \(\theta\).
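[Note: the following sketch illustrates, with arbitrary numbers, what sufficiency means here: two samples with the same value of \(s(x)\) produce identical likelihoods at every \(\theta\).]

```python
import numpy as np
from scipy.stats import norm

mu, theta = 0.0, 2.0             # illustrative known mean and variance
x1 = np.array([1.0, -1.0, 2.0])  # s(x1) = 1 + 1 + 4 = 6
x2 = np.array([-2.0, 1.0, 1.0])  # s(x2) = 4 + 1 + 1 = 6

for x in (x1, x2):
    s = ((x - mu) ** 2).sum()
    likelihood = norm.pdf(x, loc=mu, scale=np.sqrt(theta)).prod()
    print(s, likelihood)  # equal s(x) gives an identical likelihood
```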
An inverse-gamma distribution with known parameters \(\alpha, \beta > 0\) is judged to be the prior distribution for \(\theta\). So, \[\begin{eqnarray*} f(\theta) & = & \frac{\beta^{\alpha}}{\Gamma(\alpha)}\theta^{-(\alpha+1)}e^{-\beta/\theta}, \ \ \ \ \ \theta > 0. \end{eqnarray*}\]
Show that the distribution of the precision \(\tau = \frac{1}{\theta}\) is \(Gamma(\alpha, \beta)\).
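[Note: a quick Monte Carlo check of this change of variables, with arbitrary parameter values; `scipy.stats.invgamma` with `scale=beta` matches the inverse-gamma density above.]

```python
import numpy as np
from scipy import stats

alpha, beta = 3.0, 2.0  # illustrative values
rng = np.random.default_rng(0)

# Draw from the inverse-gamma prior for theta and invert to get tau = 1/theta.
theta = stats.invgamma.rvs(alpha, scale=beta, size=100_000, random_state=rng)
tau = 1 / theta

# Kolmogorov-Smirnov comparison with Gamma(alpha, beta) in the rate
# parameterisation, i.e. scipy's gamma with scale = 1/beta.
print(stats.kstest(tau, stats.gamma(alpha, scale=1/beta).cdf))
```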
Find the posterior distribution of \(\theta\) given \(x = (x_{1}, \ldots, x_{n})\).
Show that the posterior mean for \(\theta\) can be written as a weighted average of the prior mean of \(\theta\) and the maximum likelihood estimate, \(s(x)/n\), of \(\theta\).
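[Note: the weighted-average identity can be confirmed symbolically, taking the posterior from the previous part to be inverse-gamma with parameters \(\alpha + \frac{n}{2}\) and \(\beta + \frac{s}{2}\), and assuming \(\alpha > 1\) so that the prior mean \(\frac{\beta}{\alpha-1}\) exists; `sympy` is one convenient tool.]

```python
import sympy as sp

a, b, n, s = sp.symbols('alpha beta n s', positive=True)

# Mean of the InvGamma(alpha + n/2, beta + s/2) posterior.
post_mean = (b + s / 2) / (a + n / 2 - 1)

# Weighted average of the prior mean beta/(alpha - 1) and the MLE s/n.
w = (a - 1) / (a - 1 + n / 2)
weighted = w * b / (a - 1) + (1 - w) * s / n

print(sp.simplify(post_mean - weighted))  # 0, so the two expressions coincide
```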
4. Suppose that \(X_{1}, \ldots, X_{n}\) are identically distributed discrete random variables taking \(k\) possible values with probabilities \(\theta_{1}, \ldots, \theta_{k}\). Inference is required about \(\theta = (\theta_{1}, \ldots, \theta_{k})\) where \(\sum_{j=1}^{k} \theta_{j} = 1\).
Assuming that the \(X_{i}\)s are independent given \(\theta\), explain why \[\begin{eqnarray*} f(x \, | \, \theta) & \propto & \prod_{j=1}^{k} \theta_{j}^{n_{j}} \end{eqnarray*}\] where \(x = (x_{1}, \ldots, x_{n})\) and \(n_{j}\) is the number of \(x_{i}\)s observed to take the \(j\)th possible value.
Suppose that the prior for \(\theta\) is Dirichlet distributed with known parameters \(a = (a_{1}, \ldots, a_{k})\) so \[\begin{eqnarray*} f(\theta) & = & \frac{1}{B(a)} \prod_{j=1}^{k} \theta_{j}^{a_{j}-1} \end{eqnarray*}\] where \(B(a) = B(a_{1}, \ldots, a_{k}) = \frac{\prod_{j=1}^{k} \Gamma(a_{j})}{\Gamma(\sum_{j=1}^{k} a_{j})}\). Show that the posterior for \(\theta\) given \(x\) is Dirichlet with parameters \(a + n = (a_{1} + n_{1}, \ldots, a_{k} + n_{k})\).
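[Note: conjugacy can be checked pointwise with arbitrary values: the ratio of prior times likelihood to the \(Dirichlet(a+n)\) density is the same constant, \(B(a+n)/B(a)\), at every \(\theta\) on the simplex.]

```python
import numpy as np
from scipy.stats import dirichlet

a = np.array([2.0, 3.0, 1.5])  # illustrative prior parameters
counts = np.array([4, 1, 2])   # illustrative category counts n_j

rng = np.random.default_rng(1)
for _ in range(3):
    theta = rng.dirichlet(np.ones(len(a)))  # a random point on the simplex
    unnorm_post = dirichlet.pdf(theta, a) * np.prod(theta ** counts)
    # The ratio below equals B(a + n)/B(a) whichever theta is drawn.
    print(unnorm_post / dirichlet.pdf(theta, a + counts))
```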