Set: Problems class, Thursday 7th March 2024.
Due in: Problems class, Thursday 14th March 2024. Paper copies may be submitted in the problems class, or directly to me in lectures or at my office, 4W4.10. PDF copies may be submitted to the portal available on the Moodle page.
Task: Attempt questions 1-2. Questions 3-5 are extra questions which may be discussed in the problems class.
1. Let \(X_{1}, \ldots, X_{n}\) be exchangeable so that the \(X_{i}\) are conditionally independent given a parameter \(\theta\). For each of the distributions
(a) \(X_{i} \, | \, \theta \sim Bern(\theta)\);
(b) \(X_{i} \, | \, \theta \sim N(\mu, \theta)\) with \(\mu\) known;
(c) \(X_{i} \, | \, \theta \sim Maxwell(\theta)\), the Maxwell distribution with parameter \(\theta\) so that \[\begin{eqnarray*} f(x_{i} \, | \, \theta) = \left(\frac{2}{\pi}\right)^{\frac{1}{2}}\theta^{\frac{3}{2}}x_{i}^{2}\exp\left\{-\frac{\theta x_{i}^{2}}{2}\right\}, \ \ x_{i} > 0 \end{eqnarray*}\] and \(E(X_{i} \, | \, \theta) = 2\sqrt{\frac{2}{\pi \theta}}\), \(Var(X_{i} \, | \, \theta) = \frac{3\pi - 8}{\pi \theta}\); answer the following questions.
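As an aside (not part of the sheet), the quoted Maxwell moments can be sanity-checked by simulation, using the fact that if \(Z_{1}, Z_{2}, Z_{3}\) are iid \(N(0,1)\) then \(\sqrt{(Z_{1}^{2} + Z_{2}^{2} + Z_{3}^{2})/\theta}\) has the density above; the choice \(\theta = 2\) and the sample size are arbitrary.

```python
# Monte Carlo check of the stated Maxwell mean and variance (a sketch, not a
# solution): X = sqrt((Z1^2 + Z2^2 + Z3^2) / theta) with Z_j iid N(0,1) has
# the Maxwell density with parameter theta.
import math
import random

def sample_maxwell(theta, n, seed=0):
    rng = random.Random(seed)
    return [math.sqrt(sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(3)) / theta)
            for _ in range(n)]

theta = 2.0                      # hypothetical parameter value
xs = sample_maxwell(theta, 200_000)
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)

print(mean, 2.0 * math.sqrt(2.0 / (math.pi * theta)))   # should be close
print(var, (3.0 * math.pi - 8.0) / (math.pi * theta))   # should be close
```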
Show that \(f(x_{i} \, | \, \theta)\) belongs to the 1-parameter exponential family and for \(X = (X_{1}, \ldots, X_{n})\) state the sufficient statistic for learning about \(\theta\).
By viewing the likelihood as a function of \(\theta\), of which generic family of distributions (over \(\theta\)) is the likelihood the kernel?
By first finding the corresponding posterior distribution for \(\theta\) given \(x = (x_{1}, \ldots, x_{n})\), show that this family of distributions is conjugate with respect to the likelihood \(f(x \, | \, \theta)\).
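For the Bernoulli case the conjugacy claim can also be checked numerically (illustrative only; the sheet asks for the algebraic derivation): the grid-normalised product of a Beta\((a, b)\) prior and the Bernoulli likelihood should match the Beta\((a + \sum x_{i}, b + n - \sum x_{i})\) density. The values of \(a\), \(b\) and the data below are hypothetical.

```python
# Grid-based check of Beta-Bernoulli conjugacy (a sketch, assuming a
# Beta(a, b) prior and made-up data): normalise prior x likelihood on a grid
# and compare pointwise with the claimed Beta posterior density.
import math

def beta_pdf(t, a, b):
    logB = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return math.exp((a - 1) * math.log(t) + (b - 1) * math.log(1 - t) - logB)

a, b = 2.0, 3.0
x = [1, 0, 1, 1, 0, 1]          # hypothetical Bernoulli data
n, s = len(x), sum(x)

grid = [(i + 0.5) / 1000 for i in range(1000)]
unnorm = [beta_pdf(t, a, b) * t ** s * (1 - t) ** (n - s) for t in grid]
norm = sum(unnorm) / 1000       # midpoint-rule approximation of the integral
posterior = [u / norm for u in unnorm]

max_err = max(abs(p - beta_pdf(t, a + s, b + n - s))
              for p, t in zip(posterior, grid))
```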
2. Let \(X_{1}, \ldots, X_{n}\) be exchangeable so that the \(X_{i}\) are conditionally independent given a parameter \(\theta\). Suppose that \(X_{i} \, | \, \theta\) is geometrically distributed with probability mass function \[\begin{eqnarray*} f(x_{i} \, | \, \theta) & = & (1-\theta)^{x_{i}-1}\theta, \ \ x_{i} = 1, 2, \ldots. \end{eqnarray*}\]
Show that \(f(x \, | \, \theta)\), where \(x = (x_{1}, \ldots, x_{n})\), belongs to the \(1\)-parameter exponential family. Hence, or otherwise, find the conjugate prior distribution and corresponding posterior distribution for \(\theta\).
Show that the posterior mean for \(\theta\) can be written as a weighted average of the prior mean of \(\theta\) and the maximum likelihood estimate, \(\bar{x}^{-1}\).
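The weighted-average identity can be verified numerically (a sketch, assuming the Beta\((\alpha, \beta)\) conjugate prior from the previous part, with hypothetical hyperparameters and data): the Beta\((\alpha + n, \beta + \sum x_{i} - n)\) posterior mean should equal a weighted average of the prior mean and \(\bar{x}^{-1}\).

```python
# Numerical check that the geometric posterior mean is a weighted average of
# the prior mean and the MLE 1/xbar (hypothetical hyperparameters and data).
alpha, beta = 3.0, 4.0
x = [2, 1, 4, 3, 1]                      # hypothetical geometric observations
n, s = len(x), sum(x)
xbar = s / n

posterior_mean = (alpha + n) / (alpha + beta + s)   # mean of Beta(alpha+n, beta+s-n)
prior_mean = alpha / (alpha + beta)
w = (alpha + beta) / (alpha + beta + s)             # weight on the prior mean
weighted = w * prior_mean + (1 - w) * (1 / xbar)
```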
Suppose now that the prior for \(\theta\) is instead given by the probability density function \[\begin{eqnarray*} f(\theta) & = & \frac{1}{2B(\alpha+1, \beta)}\theta^{\alpha}(1-\theta)^{\beta - 1} + \frac{1}{2B(\alpha, \beta+1)}\theta^{\alpha-1}(1-\theta)^{\beta}, \end{eqnarray*}\] where \(B(\alpha, \beta)\) denotes the Beta function evaluated at \(\alpha\) and \(\beta\). Show that the posterior probability density function can be written as \[\begin{eqnarray*} f(\theta \, | \, x) & = & \lambda f_{1}(\theta) + (1 - \lambda) f_{2}(\theta) \end{eqnarray*}\] where \[\begin{eqnarray*} \lambda & = & \frac{(\alpha + n)\beta}{(\alpha + n)\beta + (\beta -n + \sum_{i=1}^{n} x_{i})\alpha} \end{eqnarray*}\] and \(f_{1}(\theta)\) and \(f_{2}(\theta)\) are probability density functions.
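The stated mixture weight can be sanity-checked numerically (illustrative; \(\alpha\), \(\beta\) and the data are hypothetical choices): the posterior component weights are proportional to the marginal likelihoods of the two Beta components, and their normalised ratio should reproduce the displayed \(\lambda\).

```python
# Check of the mixture weight lambda via Beta-function marginal likelihoods
# (a sketch with made-up hyperparameters and data).
import math

def betafn(a, b):
    return math.exp(math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b))

alpha, beta, x = 3.0, 5.0, [2, 1, 3, 2]   # hypothetical values
n, s = len(x), sum(x)

# Marginal likelihood of each Beta component under the geometric likelihood:
w1 = betafn(alpha + n + 1, beta + s - n) / betafn(alpha + 1, beta)
w2 = betafn(alpha + n, beta + s - n + 1) / betafn(alpha, beta + 1)
lam_numeric = w1 / (w1 + w2)

lam_formula = (alpha + n) * beta / ((alpha + n) * beta + (beta - n + s) * alpha)
```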
3. Let \(X_{1}, \ldots, X_{n}\) be exchangeable so that the \(X_{i}\) are conditionally independent given a parameter \(\theta\). Suppose that \(X_{i} \, | \, \theta\) is distributed as a double-exponential distribution with probability density function \[\begin{eqnarray*} f(x_{i} \, | \, \theta) & = & \frac{1}{2\theta} \exp \left\{- \frac{|x_{i}|}{\theta}\right\}, \ \ -\infty < x_{i} < \infty \end{eqnarray*}\] for \(\theta > 0\).
Find the conjugate prior distribution and corresponding posterior distribution for \(\theta\) following observation of \(x = (x_{1}, \ldots, x_{n})\).
Consider the transformation \(\phi = \theta^{-1}\). Find the posterior distribution of \(\phi \, | \, x\).
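A change-of-variables identity of the kind needed here can be checked pointwise (a sketch: the inverse-gamma family is the conjugate choice suggested by the \(\theta^{-n}\exp\{-c/\theta\}\) shape of the likelihood, and the shape/scale values below are hypothetical): if \(\theta\) has an inverse-gamma density then \(\phi = \theta^{-1}\) should have the corresponding gamma density, via \(f_{\phi}(\phi) = f_{\theta}(\phi^{-1})\,\phi^{-2}\).

```python
# Pointwise check of the Jacobian formula linking inverse-gamma and gamma
# densities under phi = 1/theta (hypothetical shape a and scale/rate b).
import math

def gamma_pdf(p, a, b):          # shape a, rate b
    return b ** a * p ** (a - 1) * math.exp(-b * p) / math.gamma(a)

def inv_gamma_pdf(t, a, b):      # shape a, scale b
    return b ** a * t ** (-a - 1) * math.exp(-b / t) / math.gamma(a)

a, b = 4.0, 2.5
pts = [0.3, 0.8, 1.5, 3.0]
max_err = max(abs(inv_gamma_pdf(1 / p, a, b) / p ** 2 - gamma_pdf(p, a, b))
              for p in pts)
```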
4. Let \(X_{1}, \ldots, X_{n}\) be the first \(n\) terms of an infinitely exchangeable sequence of random quantities with joint density function \[\begin{eqnarray*} f(x_{1}, \ldots, x_{n}) & = & n! \left(1 + \sum_{i=1}^{n} x_{i}\right)^{-(n+1)}. \end{eqnarray*}\] Show that they can be represented as conditionally independent and exponentially distributed.
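The representation can be checked numerically for particular data (illustrative; the observations and the \(Exp(1)\) mixing distribution over \(\theta\) are the assumed ingredients, not given in the question): mixing conditionally iid \(Exp(\theta)\) observations over \(\theta \sim Exp(1)\) and integrating \(\theta\) out should recover \(n!\,(1 + \sum x_{i})^{-(n+1)}\).

```python
# Quadrature check of the de Finetti-style representation (a sketch with
# hypothetical data): integrate theta^n e^{-theta * sum(x)} e^{-theta} over
# theta > 0 and compare with the stated closed form.
import math

x = [0.7, 1.2, 0.4]              # hypothetical observations
n, s = len(x), sum(x)

h, upper = 0.001, 40.0           # trapezoidal rule on [0, upper]
grid = [i * h for i in range(int(upper / h) + 1)]
vals = [t ** n * math.exp(-t * (1 + s)) for t in grid]
integral = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

closed_form = math.factorial(n) / (1 + s) ** (n + 1)
```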
5. Let \(X_{1}, \ldots, X_{n}\) be exchangeable so that the \(X_{i}\) are conditionally independent given a parameter \(\theta\). Suppose that \(X_{i} \, | \, \theta\) is distributed as a Poisson distribution with mean \(\theta\).
Show that, with respect to this Poisson likelihood, the gamma family of distributions is conjugate.
Interpret the posterior mean of \(\theta\) paying particular attention to the cases when we may have weak prior information and strong prior information.
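A small numerical illustration of this interpretation (a sketch, assuming a \(Gamma(\alpha, \beta)\) prior parameterised by shape \(\alpha\) and rate \(\beta\); all values are hypothetical): the posterior mean \((\alpha + n\bar{x})/(\beta + n)\) is a weighted average of the prior mean \(\alpha/\beta\) and \(\bar{x}\), so weak prior information (small \(\beta\)) pulls it towards \(\bar{x}\) and strong prior information (large \(\beta\)) towards \(\alpha/\beta\).

```python
# Poisson-gamma posterior mean as a weighted average of prior mean and xbar
# (hypothetical counts and hyperparameters).
x = [3, 5, 2, 4, 4]                       # hypothetical Poisson counts
n, xbar = len(x), sum(x) / len(x)         # xbar = 3.6

def posterior_mean(alpha, beta):
    return (alpha + n * xbar) / (beta + n)

weak = posterior_mean(0.1, 0.01)          # weak prior: close to xbar
strong = posterior_mean(500.0, 100.0)     # strong prior: close to 500/100 = 5

# Exact weighted-average identity, with weight beta/(beta + n) on the prior:
alpha, beta = 500.0, 100.0
w = beta / (beta + n)
identity_gap = abs(posterior_mean(alpha, beta)
                   - (w * (alpha / beta) + (1 - w) * xbar))
```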
Suppose now that the prior for \(\theta\) is given hierarchically. Given \(\lambda\), \(\theta\) is judged to follow an exponential distribution with mean \(\frac{1}{\lambda}\) and \(\lambda\) is given the improper distribution \(f(\lambda) \propto 1\) for \(\lambda > 0\). Show that \[\begin{eqnarray*} f(\lambda \, | \, x) & \propto & \frac{\lambda}{(n+\lambda)^{n\bar{x}+1}} \end{eqnarray*}\] where \(\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_{i}\).
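The displayed marginal posterior can be checked by quadrature (illustrative data only): integrating \(\theta\) out of \(f(x \, | \, \theta) f(\theta \, | \, \lambda)\) should give something proportional to \(\lambda/(n+\lambda)^{n\bar{x}+1}\), with the same constant of proportionality, \(\Gamma(n\bar{x}+1)\), at every \(\lambda\).

```python
# Quadrature check of the stated marginal posterior for lambda (a sketch with
# hypothetical Poisson counts): the ratio of the numerical integral to the
# stated kernel should be the same constant for different lambda values.
import math

x = [2, 4, 3, 1]                 # hypothetical Poisson counts
n, s = len(x), sum(x)            # s = n * xbar

def marginal(lam, h=0.001, upper=30.0):
    # trapezoidal rule for int_0^inf theta^s e^{-n theta} lam e^{-lam theta} dtheta
    grid = [i * h for i in range(int(upper / h) + 1)]
    vals = [t ** s * math.exp(-(n + lam) * t) * lam for t in grid]
    return h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

def stated(lam):
    return lam / (n + lam) ** (s + 1)

r1 = marginal(0.5) / stated(0.5)
r2 = marginal(2.0) / stated(2.0)   # both ratios should equal Gamma(s+1) = s!
```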