Set: Lecture 24, Friday 2nd May 2025.
Due in: There is no deadline for this work. I will happily mark and return any work you submit. Paper copies may be submitted directly to me, either in lectures or at my office, 4W4.10. PDF copies may be submitted to the portal available on the Moodle page. Solution sheets will be made available online on Wednesday 7th May 2025.
Consider the following loss function \[\begin{eqnarray*} L(\theta, d) & = & \frac{d}{\theta} - \log \left(\frac{d}{\theta}\right) - 1. \end{eqnarray*}\] Let \(X_{1}, \ldots, X_{n}\) be exchangeable so that the \(X_{i}\) are conditionally independent given parameter \(\theta\). Suppose that \(X_{i} \, | \, \theta \sim \mbox{Po}(\theta)\) and \(\theta \sim \mbox{Gamma}(\alpha, \beta)\) with \(\alpha > 1\).
Find the Bayes rule of an immediate decision.
Find the Bayes rule after observing \(x = (x_{1}, \ldots, x_{n})\).
Interpret the Bayes rules that you have found.
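If you would like to check your algebra for this question numerically, the following sketch (the hyperparameter values and simulated data are illustrative assumptions of mine, not part of the question) approximates the Bayes rule after observing \(x\) by minimising a Monte Carlo estimate of the posterior expected loss, using only the standard conjugacy of the Gamma prior with the Poisson likelihood.

```python
# Minimal numerical check: the Bayes rule is the d that minimises the
# posterior expected loss E[ d/theta - log(d/theta) - 1 | x ].
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)

alpha, beta = 3.0, 2.0                 # assumed prior hyperparameters (alpha > 1)
x = rng.poisson(lam=1.5, size=10)      # assumed illustrative data

# Poisson likelihood with a Gamma(alpha, beta) prior gives the posterior
# Gamma(alpha + sum(x), beta + n); note numpy's gamma takes a scale, not a rate.
theta = rng.gamma(alpha + x.sum(), 1.0 / (beta + len(x)), size=200_000)

def expected_loss(d):
    # Monte Carlo estimate of E[ L(theta, d) | x ] over posterior draws
    return np.mean(d / theta - np.log(d / theta) - 1.0)

res = minimize_scalar(expected_loss, bounds=(1e-6, 20.0), method="bounded")
print("numerical Bayes rule given x:", res.x)
```

Setting the sample size to zero (so the posterior is just the prior) gives the corresponding check for the immediate decision.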
We wish to estimate the parameter, \(\theta\), of an exponential distribution. We consider that \(X_{1}, \ldots, X_{n}\) are exchangeable with \(X_{i} \, | \, \theta \sim \mbox{Exp}(\theta)\) so that \(E(X_{i} \, | \, \theta) = 1/\theta\). Our prior for \(\theta\) is \(\mbox{Gamma}(\alpha, \beta)\) with \(\alpha > 3\). We wish to produce an estimate, \(d\), for \(\theta\), with loss function \[\begin{eqnarray*} L(\theta, d) & = & \frac{(\theta - d)^{2}}{\theta^{3}}. \end{eqnarray*}\]
Find the Bayes rule and Bayes risk of an immediate decision.
Suppose that we may take a sample of \(n\) observations before estimating \(\theta\). Find the Bayes rule and Bayes risk when we have observed \(x = (x_{1}, \ldots, x_{n})\). Show that the Bayes rule can be expressed as a weighted average of the Bayes rule for the immediate decision and the maximum likelihood estimate of \(\theta\), \(1/\overline{x}\).
Find the Bayes risk of the sampling procedure.
If each observation costs a fixed amount \(c\), find the total risk of a sample of size \(n\) and hence the optimal choice of \(n\). (Hint: the total risk of the sample is the Bayes risk of the sampling procedure added to the cost of the sample.)
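As with the first question, you can verify the form of your Bayes rule numerically. The sketch below (again with assumed, illustrative hyperparameters and simulated data) minimises the posterior expected loss \((\theta - d)^{2}/\theta^{3}\) over \(d\), using draws from the conjugate Gamma posterior for an exponential likelihood; the minimised value is the Bayes risk given the observed sample.

```python
# Minimal numerical check for the exponential/Gamma question.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)

alpha, beta = 5.0, 4.0                        # assumed prior hyperparameters (alpha > 3)
x = rng.exponential(scale=1 / 0.8, size=12)   # assumed data, generated with theta = 0.8

# Exponential likelihood with a Gamma(alpha, beta) prior gives the posterior
# Gamma(alpha + n, beta + sum(x)); scale = 1 / rate for numpy's gamma sampler.
theta = rng.gamma(alpha + len(x), 1.0 / (beta + x.sum()), size=200_000)

def expected_loss(d):
    # Monte Carlo estimate of E[ (theta - d)^2 / theta^3 | x ]
    return np.mean((theta - d) ** 2 / theta ** 3)

res = minimize_scalar(expected_loss, bounds=(1e-6, 10.0), method="bounded")
print("numerical Bayes rule given x:", res.x)
print("minimised posterior expected loss:", res.fun)
```

Comparing the printed minimiser with your closed-form answer, and with the immediate rule and \(1/\overline{x}\), is a quick way to check the weighted-average expression requested above.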
A certain coin has probability \(\omega\) of landing heads. We assess that the prior for \(\omega\) is a Beta distribution with parameters \(\alpha, \beta > 1\). The loss function for estimate \(d\) and value \(\omega\) is \[\begin{eqnarray*} L(\omega, d) & = & \frac{(\omega - d)^{2} + d^{2}}{\omega}. \end{eqnarray*}\]
Find the Bayes rule and Bayes risk for an immediate decision.
Suppose that we toss the coin \(n\) times before making our decision, and observe \(k\) heads and \(n-k\) tails. Find the new Bayes rule and Bayes risk.
Find the Bayes risk of the sampling procedure of tossing the coin \(n\) times and then making a decision when \(\alpha = \beta = 2\).
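A similar numerical check works for this question. The sketch below uses the \(\alpha = \beta = 2\) values from the final part, together with an assumed, illustrative sample size and head count, and minimises the posterior expected loss under the Beta posterior obtained after observing \(k\) heads in \(n\) tosses.

```python
# Minimal numerical check for the coin-tossing question.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)

alpha, beta = 2.0, 2.0     # hyperparameters from the final part of the question
n, k = 20, 8               # assumed illustrative number of tosses and heads

# Binomial likelihood with a Beta(alpha, beta) prior gives the posterior
# Beta(alpha + k, beta + n - k).
omega = rng.beta(alpha + k, beta + n - k, size=200_000)

def expected_loss(d):
    # Monte Carlo estimate of E[ ((omega - d)^2 + d^2) / omega | k heads in n tosses ]
    return np.mean(((omega - d) ** 2 + d ** 2) / omega)

res = minimize_scalar(expected_loss, bounds=(1e-6, 1.0), method="bounded")
print("numerical Bayes rule given the data:", res.x)
```

Setting \(n = k = 0\) recovers a check for the immediate decision; averaging the minimised posterior expected loss over data simulated from the prior predictive gives a (noisy) check on the Bayes risk of the sampling procedure.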