Set: Problems class, Thursday 15th February 2024.
Due in: Problems class, Thursday 22nd February 2024. Paper copies may be submitted in the problems class or directly to me in lectures or my office, 4W4.10. PDF copies may be submitted to the portal available on the Moodle page.
Task: Attempt questions 1-3; questions 4-6 are extra questions which may be discussed in the problems class. You can find the table of useful distributions here.
For each of the following distributions, write down the probability density function and find a corresponding kernel. Your kernel should be as simple as possible.
\(X \, | \, \theta \sim Po(\theta)\).
\(Y \, | \, \theta, \beta \sim Beta(\beta \theta, \beta)\).
\(\theta \, | \, \alpha, \beta, x \sim Gamma(\alpha + x + 1, \beta - 3x)\).
\(\phi \, | \, \mu, \overline{x}, \tau \sim N(\tau \mu + (1 - \tau)\overline{x}, \overline{x}^{2}\tau^{-2})\).
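As a warm-up example, using a distribution not among the parts above: if \(X \, | \, \lambda \sim Exp(\lambda)\) then \[\begin{eqnarray*} f(x \, | \, \lambda) & = & \lambda e^{-\lambda x}, \quad x > 0. \end{eqnarray*}\] Viewed as a function of \(x\), any multiplicative factor not involving \(x\) may be dropped, so \(e^{-\lambda x}\) is a kernel.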
In each of the following, state the distribution of the corresponding random variable.
\(f(x \, | \, \theta) \propto \frac{\theta^{x}}{x!}\), \(x = 0, 1, \ldots\), \(\theta > 0\).
\(f(x) \propto e^{-2x}\), \(x > 0\).
\(f(x) \propto 1\), \(0 \leq x \leq 1\).
\(f(x \, | \, \alpha, \beta) \propto x^{-\alpha}e^{-2\beta/x}\), \(x > 0\), \(\alpha > 1\), \(\beta > 0\).
\(f(x \, | \, m) \propto (1-x)^{(m-1)/2}\), \(0 \leq x \leq 1\), \(m > - 1\).
\(f(x \, | \, \theta, \phi) \propto \exp\left\{-\frac{\theta}{\phi}x^{2} + (\phi + 1)x \right\}\), \(-\infty < x < \infty\), \(\theta > 0\), \(\phi > 0\).
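As a warm-up example, again using a case not among the parts above: if \(f(x) \propto x^{2}e^{-3x}\) for \(x > 0\), compare with the \(Gamma(\alpha, \beta)\) density, whose kernel is \(x^{\alpha - 1}e^{-\beta x}\); matching terms gives \(\alpha - 1 = 2\) and \(\beta = 3\), so \(X \sim Gamma(3, 3)\).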
Find the following sums and integrals by identifying a kernel of a probability density function and using properties of probability density functions.
\(\sum_{x=0}^{\infty} \frac{\theta^{x}}{x!}\), \(\theta > 0\).
\(\sum_{x=0}^{\infty} \frac{x\theta^{x}}{x!}\), \(\theta > 0\).
\(\int_{0}^{1} x^{\alpha - 1}(1-x)^{\beta - 1} \, dx\), \(\alpha > 0\), \(\beta > 0\).
\(\int_{0}^{1} \frac{1}{2}(\beta+3)(\beta+2)(\beta+1)x(1-x)^{\beta} \, dx\), \(\beta > -1\).
\(\int_{-\infty}^{\infty} 4\mu^{(a-1)}\left(\frac{\sigma^{2}+\tau^{2}}{\sigma\tau}\right)^{\frac{1}{2}}\exp\left\{-\left(\frac{\sigma^{2}+\tau^{2}}{\sigma\tau}\right)(\theta - \mu)^{2}\right\} \, d\theta\), \(\sigma > 0\), \(\tau > 0\).
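Once you have identified a kernel and written down a closed form, you can sanity-check your answer numerically. A minimal sketch in Python (purely illustrative; the values \(\alpha = 2\), \(\beta = 3.5\) are arbitrary) for the beta-function identity behind the third integral above:

```python
from math import gamma

def beta_fn(a, b):
    # Normalising constant of the Beta(a, b) density: B(a, b) = Gamma(a)Gamma(b)/Gamma(a + b)
    return gamma(a) * gamma(b) / gamma(a + b)

def kernel_integral(a, b, n=200000):
    # Midpoint-rule approximation of the integral of x^(a-1) (1-x)^(b-1) over [0, 1]
    h = 1.0 / n
    return h * sum(((i + 0.5) * h) ** (a - 1) * (1 - (i + 0.5) * h) ** (b - 1)
                   for i in range(n))

a, b = 2.0, 3.5  # arbitrary illustrative values
print(kernel_integral(a, b), beta_fn(a, b))  # the two should agree closely
```

The same idea works for the sums: truncate the series at a large upper limit and compare with your closed form.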
Suppose that we are interested in \(\theta\), the probability that a coin will yield a ‘head’ when spun in a specified manner. We judge that the prior distribution is \(\theta \sim Beta(4, 4)\). The coin is spun ten times (you do not witness the spins) and, rather than being told how many heads were seen, you are told only that the number is less than three.
Find the posterior distribution up to proportionality, and show that the normalising constant \(k\) is given by \[\begin{eqnarray*} k & = & \frac{\Gamma(18)}{1536 \times \Gamma(4) \Gamma(12)}. \end{eqnarray*}\]
Show that the posterior mean is \(\frac{39}{128}\).
Consider the random variables \(X\) and \(Y\) which, for convenience, you may assume are continuous. Recall that the conditional expectation of \(g(X)\) given \(Y\) is defined as \[\begin{eqnarray*} E(g(X) \, | \, Y) & = & \int_{X} g(x)f(x \, | \, y) \, dx \end{eqnarray*}\] for any function \(g(\cdot)\), where \(f(x \, | \, y)\) is the conditional density of \(X\) given \(Y\). Prove the tower property, \[\begin{eqnarray*} E\{E(g(X) \, | \, Y)\} & = & E\{g(X)\}. \end{eqnarray*}\]
Consider the random variables \(\theta\) and \(X\). Suppose that \(t(X)\) is an estimator of \(\theta\) and that we measure the quality of an estimator by its mean squared error, the expected squared distance between the estimator and \(\theta\). Show that, given \(X\), the choice \(t = E(\theta \, | \, X)\) minimises this mean squared error.