Question Sheet Five

Set: Problems class, Thursday 14th March 2024.

Due in: Problems class, Thursday 21st March 2024. Paper copies may be submitted in the problems class or directly to me in lectures or at my office, 4W4.10. PDF copies may be submitted via the portal on the Moodle page.

Task: Attempt Questions 1 and 2; Questions 3 and 4 are additional questions which may be discussed in the problems class.

Question 1

Let \(X_{1}, \ldots, X_{n}\) be exchangeable so that the \(X_{i}\) are conditionally independent given a parameter \(\theta\). For each of the following distributions for \(X_{i} \, | \, \theta\), find the Jeffreys prior and the corresponding posterior distribution for \(\theta\). (A symbolic sanity check for this style of calculation is sketched after the list.)

  1. \(X_{i} \, | \, \theta \sim Bern(\theta)\).

  2. \(X_{i} \, | \, \theta \sim Po(\theta)\).

  3. \(X_{i} \, | \, \theta \sim Maxwell(\theta)\), the Maxwell distribution with parameter \(\theta\) so that \[\begin{eqnarray*} f(x_{i} \, | \, \theta) = \left(\frac{2}{\pi}\right)^{\frac{1}{2}}\theta^{\frac{3}{2}}x_{i}^{2}\exp\left\{-\frac{\theta x_{i}^{2}}{2}\right\}, \ \ x_{i} > 0 \end{eqnarray*}\] and \(E(X_{i} \, | \, \theta) = 2\sqrt{\frac{2}{\pi \theta}}\), \(Var(X_{i} \, | \, \theta) = \frac{3\pi - 8}{\pi \theta}\).
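
Recall that the Jeffreys prior is defined through the Fisher information, \(f(\theta) \propto I(\theta)^{\frac{1}{2}}\) where \(I(\theta) = -E\left(\frac{\partial^{2}}{\partial \theta^{2}} \log f(X \, | \, \theta) \, \Big| \, \theta\right)\). As promised above, here is a minimal symbolic sanity check for this style of calculation; it is not part of the question, assumes you have SymPy available, and reproduces the Lecture 12 result quoted in Question 3 below: the Jeffreys prior for \(X_{i} \, | \, \theta \sim N(\theta, \sigma^{2})\) with \(\sigma^{2}\) known is constant in \(\theta\).

```python
import sympy as sp

x, theta = sp.symbols('x theta', real=True)
sigma = sp.symbols('sigma', positive=True)

# Log-density of a single observation X | theta ~ N(theta, sigma^2),
# the known-variance case from Lecture 12.
log_f = -sp.log(sigma * sp.sqrt(2 * sp.pi)) - (x - theta)**2 / (2 * sigma**2)

# Fisher information: I(theta) = -E[ d^2/dtheta^2 log f(X | theta) | theta ].
# Here the second derivative is free of x, so the expectation is immediate;
# for the distributions in this question, substitute the stated moments of X.
info = sp.simplify(-sp.diff(log_f, theta, 2))

print(info)           # 1/sigma**2, which does not involve theta
print(sp.sqrt(info))  # Jeffreys prior proportional to sqrt(I(theta)): constant in theta
```

For \(n\) conditionally independent observations the information is \(n I(\theta)\), which changes the prior only by a constant of proportionality.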

Question 2

Let \(X_{1}, \ldots, X_{n}\) be exchangeable so that the \(X_{i}\) are conditionally independent given a parameter \(\lambda\). Suppose that \(X_{i} \, | \, \lambda \sim Exp(\lambda)\) where \(\lambda\) represents the rate, so that \(E(X_{i} \, | \, \lambda) = \lambda^{-1}\).

  1. Show that \(X_{i} \, | \, \lambda \sim Exp(\lambda)\) is a member of the \(1\)-parameter exponential family. Hence write down a sufficient statistic \(t(X)\), based on \(X = (X_{1}, \ldots, X_{n})\), for learning about \(\lambda\).

  2. Find the Jeffreys prior and comment upon whether or not it is improper. Find the posterior distribution corresponding to this prior.

  3. Consider the transformation \(\phi = \log \lambda\).

    1. By expressing \(L(\lambda) = f(x \, | \, \lambda)\) as \(L(\phi)\), find the Jeffreys prior for \(\phi\).

    2. By transforming the distribution of the Jeffreys prior for \(\lambda\), \(f(\lambda)\), find the distribution of \(\phi\). [This should yield the same answer as the previous part and illustrates the invariance of the Jeffreys prior to reparameterisation; the change-of-variables formula is recalled below.]
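
For the final part, it may help to recall the standard univariate change-of-variables result: if \(\phi = g(\lambda)\) is monotone with inverse \(\lambda = g^{-1}(\phi)\) then \[\begin{eqnarray*} f_{\phi}(\phi) & = & f_{\lambda}(g^{-1}(\phi)) \left|\frac{d\lambda}{d\phi}\right|. \end{eqnarray*}\]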

Question 3

The Jeffreys prior for Normal distributions. In Lecture 12 we showed that for an exchangeable collection \(X = (X_{1}, \ldots, X_{n})\) with \(X_{i} \, | \, \theta \sim N(\theta, \sigma^{2})\), where \(\sigma^{2}\) is known, the Jeffreys prior for \(\theta\) is \(f(\theta) \propto 1\).

  1. Consider, instead, that \(X_{i} \, | \, \theta \sim N(\mu, \theta)\), where \(\mu\) is known. Find the Jeffreys prior for \(\theta\).

  2. Now suppose that \(X_{i} \, | \, \theta \sim N(\mu, \sigma^{2})\) where \(\theta = (\mu, \sigma^{2})\). Find the Jeffreys prior for \(\theta\); the multiparameter form of the Jeffreys prior is recalled after this list.

  3. Comment upon your answers for these three Normal cases.
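
For part 2, recall that in the multiparameter case the Jeffreys prior is defined through the determinant of the Fisher information matrix, \[\begin{eqnarray*} f(\theta) & \propto & |I(\theta)|^{\frac{1}{2}}, \ \ \mbox{where} \ \ I(\theta)_{jk} = -E\left(\frac{\partial^{2}}{\partial \theta_{j} \partial \theta_{k}} \log f(X \, | \, \theta) \, \Big| \, \theta\right). \end{eqnarray*}\]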

Question 4

Consider, given \(\theta\), a sequence of independent Bernoulli trials with parameter \(\theta\). We wish to make inferences about \(\theta\) and consider two possible methods. In the first, we carry out \(n\) trials and let \(X\) denote the total number of successes in these trials. Thus, \(X \, | \, \theta \sim Bin(n, \theta)\) with \[\begin{eqnarray*} f_{X}(x \, | \, \theta) & = & \binom{n}{x} \theta^{x}(1- \theta)^{n-x}, \ \ x = 0, 1, \ldots, n. \end{eqnarray*}\] In the second method, we count the total number \(Y\) of trials up to and including the \(r\)th success so that \(Y \, | \, \theta \sim Nbin(r, \theta)\), the negative binomial distribution, with \[\begin{eqnarray*} f_{Y}(y \, | \, \theta) & = & \binom{y-1}{r-1} \theta^{r}(1- \theta)^{y-r}, \ \ y = r, r+1, \ldots. \end{eqnarray*}\]

  1. Obtain the Jeffreys prior distribution for each of the two methods. You may find it useful to note that \(E(Y \, | \, \theta) = \frac{r}{\theta}\).

  2. Suppose we observe \(x = r\) and \(y = n\). For each method, calculate the posterior distribution for \(\theta\) under the corresponding Jeffreys prior. Comment upon your answers; a numerical sanity check is sketched below.
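
Once you have derived the two posteriors, a quick numerical comparison can serve as a check on your comment. The sketch below is purely illustrative: it assumes both posteriors turn out to be Beta distributions, and the shape parameters a1, b1, a2, b2 are placeholders rather than the answers to the question.

```python
import numpy as np
from scipy.stats import beta

# Placeholder shape parameters: replace these with the ones you derive.
a1, b1 = 2.5, 4.5   # hypothetical posterior from the binomial method
a2, b2 = 2.5, 5.0   # hypothetical posterior from the negative binomial method

# Largest pointwise gap between the two posterior densities on a grid.
grid = np.linspace(0.001, 0.999, 999)
gap = np.max(np.abs(beta.pdf(grid, a1, b1) - beta.pdf(grid, a2, b2)))
print(f"largest pointwise difference between the densities: {gap:.4f}")
```

A gap of zero everywhere would indicate that the two methods lead to the same posterior.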