Question Sheet Six

Set: Problems class, Thursday 21st March 2024.

Due in: Problems class, Thursday 28th March 2024. Paper copies may be submitted in the problems class, or directly to me in lectures or at my office, 4W4.10. PDF copies may be submitted via the portal available on the Moodle page.

Task: Attempt questions 1-2; questions 3 and 4 are extra questions which may be discussed in the problems class.

Question 1

Let \(X_{1}, \ldots, X_{n}\) be exchangeable so that the \(X_{i}\) are conditionally independent given a parameter \(\theta = (\mu, \sigma^{2})\). Suppose that \(X_{i} \, | \, \theta \sim N(\mu, \sigma^{2})\). It is judged that the improper joint prior distribution \(f(\mu, \sigma^{2}) \propto 1/\sigma^{2}\) is appropriate.

  1. Show that the likelihood \(f(x \, | \, \mu, \sigma^{2})\), where \(x = (x_{1}, \ldots, x_{n})\), can be expressed as \[\begin{eqnarray*} f(x \, | \, \mu, \sigma^{2}) & = & \left(2\pi \sigma^{2}\right)^{-\frac{n}{2}}\exp\left\{-\frac{1}{2\sigma^{2}}\left[(n-1)s^{2} + n(\overline{x} - \mu)^{2}\right]\right\}, \end{eqnarray*}\] where \(\overline{x} = \frac{1}{n} \sum_{i=1}^{n} x_{i}\), \(s^{2} = \frac{1}{n-1} \sum_{i=1}^{n} (x_{i} - \overline{x})^{2}\) are respectively the sample mean and variance. Hence, explain why \(\overline{X}\) and \(S^{2}\) are sufficient for \(X = (X_{1}, \ldots, X_{n})\) for learning about \(\theta\).

  2. Find, up to a constant of integration, the posterior distribution of \(\theta\) given \(x\).

  3. Show that \(\mu \, | \, \sigma^{2}, x \sim N(\overline{x}, \sigma^{2}/n)\). Hence, explain why, in this case, the chosen prior distribution for \(\theta\) is noninformative.

[Hint: You should use the property that \[\begin{eqnarray*} f(\mu \, | \, \sigma^{2}, x) \ = \ \frac{f(\mu, \sigma^{2} \, | \, x)}{f(\sigma^{2} \, | \, x)} \ \propto \ f(\mu, \sigma^{2} \, | \, x) \end{eqnarray*}\] where the proportionality is with respect to \(\mu\).]

  4. By integrating \(f(\mu, \sigma^{2} \, | \, x)\) over \(\sigma^{2}\), show that \[\begin{eqnarray*} f(\mu \, | \, x) & \propto & \left[1 + \frac{1}{n-1}\left(\frac{\overline{x} - \mu}{s/\sqrt{n}}\right)^{2}\right]^{-\frac{n}{2}}. \end{eqnarray*}\] Thus, explain why \(\mu \, | \, x \sim t_{n-1}(\overline{x}, s^{2}/n)\), the non-central \(t\)-distribution with \(n-1\) degrees of freedom, location parameter \(\overline{x}\) and squared scale parameter \(s^{2}/n\). How does this result relate to the classical problem of making inferences about \(\mu\) when \(\sigma^{2}\) is also unknown?
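For anyone who wants an informal numerical check of the final result of Question 1, the sketch below (Python with numpy and scipy; the data are hypothetical, generated only to give concrete values of \(n\), \(\overline{x}\) and \(s^{2}\)) samples from the joint posterior by drawing \(\sigma^{2} \, | \, x\) from its scaled inverse-\(\chi^{2}\) distribution and then \(\mu \, | \, \sigma^{2}, x \sim N(\overline{x}, \sigma^{2}/n)\), and compares the resulting draws of \(\mu\) with \(t_{n-1}(\overline{x}, s^{2}/n)\). This is an optional check, not part of the question.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical data, only to give concrete values of n, x_bar and s^2.
x = rng.normal(loc=10.0, scale=2.0, size=25)
n, x_bar, s2 = len(x), x.mean(), x.var(ddof=1)

# Sample from the joint posterior under f(mu, sigma^2) proportional to 1/sigma^2:
#   sigma^2 | x       ~  (n - 1) s^2 / chi^2_{n-1}   (scaled inverse chi-squared)
#   mu | sigma^2, x   ~  N(x_bar, sigma^2 / n)
m = 200_000
sigma2 = (n - 1) * s2 / rng.chisquare(df=n - 1, size=m)
mu = rng.normal(loc=x_bar, scale=np.sqrt(sigma2 / n))

# The draws of mu should match t_{n-1}(x_bar, s^2 / n); equivalently,
# (mu - x_bar) / (s / sqrt(n)) should be standard t with n - 1 degrees of freedom.
t_draws = (mu - x_bar) / np.sqrt(s2 / n)
print(stats.kstest(t_draws, stats.t(df=n - 1).cdf))   # expect a large p-value
```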

Question 2

Let \(X_{1}, \ldots, X_{n}\) be exchangeable with \(X_{i} \, | \, \theta \sim Bern(\theta)\).

  1. Using the improper prior distribution \(f(\theta) \propto \theta^{-1}(1-\theta)^{-1}\), find the posterior distribution of \(\theta \, | \, x\), where \(x = (x_{1}, \ldots, x_{n})\). Find a normal approximation to this distribution about the mode.

  2. Show that the prior distribution \(f(\theta)\) is equivalent to a uniform prior on \[\begin{eqnarray*} \beta & = & \log \left(\frac{\theta}{1-\theta}\right) \end{eqnarray*}\] and find the posterior distribution of \(\beta \, | \, x\). Find a normal approximation to this distribution about the mode.

  3. For which parameterisation does it make more sense to use a normal approximation?
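As an optional illustration of Question 2, the sketch below (Python with numpy and scipy, using a hypothetical data summary \(n\) and \(\sum x_{i}\) rather than anything given in the question) compares the exact posterior with normal approximations about the mode on the \(\theta\) and \(\beta\) scales. Comparing the resulting intervals, mapped back to the \(\theta\) scale, is one way to think about part 3.

```python
import numpy as np
from scipy import stats

# Hypothetical data summary (not taken from the question): n trials, sum_x successes.
n, sum_x = 20, 3

# Exact posterior under f(theta) proportional to theta^{-1} (1 - theta)^{-1}:
# Beta(sum_x, n - sum_x), valid provided 0 < sum_x < n.
exact = stats.beta(sum_x, n - sum_x)

# (a) Normal approximation about the mode on the theta scale.
theta_hat = (sum_x - 1) / (n - 2)                                   # posterior mode
info_theta = (sum_x - 1) / theta_hat**2 + (n - sum_x - 1) / (1 - theta_hat)**2
approx_theta = stats.norm(theta_hat, 1 / np.sqrt(info_theta))

# (b) Normal approximation about the mode on beta = log(theta / (1 - theta)),
# whose posterior is proportional to exp(beta * sum_x) / (1 + exp(beta))^n.
p_hat = sum_x / n
approx_beta = stats.norm(np.log(p_hat / (1 - p_hat)),
                         1 / np.sqrt(n * p_hat * (1 - p_hat)))

# Compare central 95% intervals, all expressed on the theta scale.
lo, hi = approx_beta.ppf([0.025, 0.975])
print("exact Beta:           ", exact.ppf([0.025, 0.975]))
print("normal approx (theta):", approx_theta.ppf([0.025, 0.975]))
print("normal approx (beta): ", 1 / (1 + np.exp(-np.array([lo, hi]))))
```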

Question 3

In viewing a section through the pancreas, doctors see what are called “islands”. Suppose that \(X_{i}\) denotes the number of islands observed in the \(i\)th patient, \(i = 1, \ldots, n\), and we judge that \(X_{1}, \ldots, X_{n}\) are exchangeable with \(X_{i} \, | \, \theta \sim Po(\theta)\). A doctor believes that for healthy patients \(\theta\) will be on average around 2; he thinks it is unlikely that \(\theta\) is greater than 3. The numbers of islands seen in 100 patients are summarised in the following table. \[\begin{eqnarray*} & \begin{array}{|l|rrrrrrr|} \hline \mbox{Number of islands} & 0 & 1 & 2 & 3 & 4 & 5 & \geq 6 \\ \hline \mbox{Frequency} & 20 & 30 & 28 & 14 & 7 & 1 & 0 \\ \hline \end{array} \end{eqnarray*}\]

  1. Express the doctor’s prior beliefs as a normal distribution for \(\theta\). You may interpret the term “unlikely” as meaning “with probability 0.01”.

  2. Find, up to a constant of proportionality, the posterior distribution \(\theta \, | \, x\) where \(x = (x_{1}, \ldots, x_{100})\).

  3. Find a normal approximation to the posterior about the mode. Thus, estimate the posterior probability that the average number of islands is greater than 2.

  4. Why might you prefer to express the doctor’s prior beliefs as a normal distribution on some other parameterisation \(\phi = g(\theta)\)? Suggest an appropriate choice of \(g(\cdot)\) in this case. Now express the doctor’s beliefs using a normal prior for \(\phi\): that for healthy patients \(\phi\) will be on average around \(g(2)\), and that it is “unlikely” that \(\phi\) is greater than \(g(3)\). Give an expression for the density of \(\phi \, | \, x\) up to a constant of proportionality.
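A possible numerical sketch for parts 1–3 of Question 3 follows (Python with numpy and scipy). The reading of the prior as \(N(2, \sigma_{0}^{2})\) with \(P(\theta > 3) = 0.01\), and the use of a one-dimensional optimiser for the mode, are choices made here for illustration rather than prescribed by the question.

```python
import numpy as np
from scipy import stats, optimize

# Data summary from the table: n = 100 patients, sum of the observed counts.
counts = np.array([0, 1, 2, 3, 4, 5])
freqs = np.array([20, 30, 28, 14, 7, 1])
n, sum_x = freqs.sum(), (counts * freqs).sum()          # n = 100, sum_x = 161

# Prior read off from "mean around 2" and "P(theta > 3) = 0.01".
prior_mean = 2.0
prior_sd = (3.0 - prior_mean) / stats.norm.ppf(0.99)    # approx 0.43

def neg_log_post(theta):
    # Negative unnormalised log posterior: normal prior plus Poisson likelihood.
    return ((theta - prior_mean) ** 2 / (2 * prior_sd ** 2)
            - sum_x * np.log(theta) + n * theta)

# Mode by one-dimensional minimisation; curvature at the mode analytically.
res = optimize.minimize_scalar(neg_log_post, bounds=(0.01, 5.0), method="bounded")
theta_hat = res.x
info = 1 / prior_sd ** 2 + sum_x / theta_hat ** 2       # observed information
approx = stats.norm(theta_hat, 1 / np.sqrt(info))

print("posterior mode:", theta_hat)
print("approximate P(theta > 2 | x):", approx.sf(2.0))
```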

Question 4

Let \(X_{1}, \ldots, X_{10}\) be the lengths of time between arrivals at an ATM, and assume that the \(X_{i}\)s may be viewed as exchangeable with \(X_{i} \, | \, \lambda \sim Exp(\lambda)\), where \(\lambda\) is the rate per minute at which people arrive at the machine. Suppose we observe \(\sum_{i=1}^{10} x_{i} = 4\), and that the prior distribution for \(\lambda\) is given by \[\begin{eqnarray*} f(\lambda) & = & \left\{\begin{array}{ll} c\exp\{-20(\lambda-0.25)^{2}\} & \lambda \geq 0, \\ 0 & \mbox{otherwise} \end{array} \right. \end{eqnarray*}\] where \(c\) is a known constant.
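Although \(c\) is treated as known, it is determined by normalisation: the prior kernel is that of a \(N(0.25, 1/40)\) distribution truncated to \(\lambda \geq 0\). The short optional sketch below (Python with numpy and scipy) evaluates \(c\) both in closed form and by quadrature; nothing in the question requires its numerical value.

```python
import numpy as np
from scipy import stats, integrate

# The prior kernel exp{-20 (lam - 0.25)^2} is a N(0.25, 1/40) kernel truncated
# to lam >= 0 (since 20 = 1 / (2 * sd^2)), so c follows from normalisation.
sd = np.sqrt(1 / 40)

# Closed form: 1/c = sqrt(2 * pi) * sd * P(N(0.25, sd^2) >= 0).
c_closed = 1 / (np.sqrt(2 * np.pi) * sd * stats.norm.sf(0, loc=0.25, scale=sd))

# Numerical check by quadrature.
area, _ = integrate.quad(lambda lam: np.exp(-20 * (lam - 0.25) ** 2), 0, np.inf)
print(c_closed, 1 / area)      # both roughly 2.68
```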

  1. Find, up to a constant \(k\) of proportionality, the posterior distribution \(\lambda \, | \, x\) where \(x = (x_{1}, \ldots, x_{10})\). Find also an expression for \(k\) which you need not evaluate.

  2. Find a normal approximation to this posterior distribution about the mode.

  3. Let \(Z_{i}\), \(i = 1, \ldots, N\) be a sequence of independent and identically distributed standard Normal random variables. Assuming the normalising constant \(k\) is known, explain carefully how the \(Z_{i}\) may be used to obtain estimates of the mean of \(\lambda \, | \, x\).
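The following optional Python sketch (numpy and scipy) works through parts 2 and 3 of Question 4 numerically: the posterior mode and curvature give the normal approximation, which is then used as an importance-sampling proposal driven by standard normal draws \(Z_{i}\). The self-normalised form shown avoids using \(k\) explicitly; with \(k\) known, as part 3 assumes, the unnormalised weights could be used directly, as noted in the final comment.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
n, sum_x = 10, 4.0

def log_post_kernel(lam):
    # Unnormalised log posterior for lam > 0:
    # prior kernel exp{-20 (lam - 0.25)^2} times likelihood lam^n exp(-lam * sum_x).
    return -20 * (lam - 0.25) ** 2 + n * np.log(lam) - lam * sum_x

# Part 2: normal approximation about the mode.
res = optimize.minimize_scalar(lambda lam: -log_post_kernel(lam),
                               bounds=(1e-6, 5.0), method="bounded")
m = res.x
info = 40 + n / m ** 2          # negative second derivative of the log kernel at m
v = 1 / info

# Part 3: importance sampling driven by standard normal draws Z_i,
# with the normal approximation N(m, v) as the proposal: lam_i = m + sqrt(v) * Z_i.
N = 100_000
z = rng.standard_normal(N)
lam = m + np.sqrt(v) * z
q = stats.norm.pdf(lam, loc=m, scale=np.sqrt(v))

w = np.zeros(N)
pos = lam > 0                   # the posterior density is zero for lam <= 0
w[pos] = np.exp(log_post_kernel(lam[pos])) / q[pos]

# Self-normalised estimate of E[lambda | x]; with k known one could instead use
# the plain average (1/N) * sum(k * w * lam) without dividing by sum(w).
print("estimated E[lambda | x]:", np.sum(w * lam) / np.sum(w))
```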