Renewal processes and stationarity

You will get more from the exercises by trying them before looking at the solutions!

  1. Suppose that X is a simple symmetric random walk on \mathbb{Z}, started from 0. Show that T=\inf\{n \ge 0: X_n \in \{-10,10\}\} is a stopping time (i.e. show that the event \{T \le n\} is determined by X_0, X_1, \ldots, X_n). What is the value of \operatorname{\mathbb{P}}\left(T < \infty\right)? What is the distribution of X_{T}?

Hint: It may be easier to describe \{T>n\} (why is this enough?). Will X eventually make 20 consecutive jumps to the right? For X_T, think symmetry.

Solution: The event \{T >n \} = \{ X_1 \not\in \{-10,10\}, X_2 \not\in \{-10,10\},\dots, X_n \not\in \{-10,10\} \} is clearly determined by X_1, X_2, \dots, X_n, and therefore so is the complementary event \{ T \leq n\} = \{T>n\}^\mathsf{c}. Since a simple symmetric random walk must eventually leave any bounded interval containing its starting point, the stopping time is almost surely finite: \operatorname{\mathbb{P}}\left(T<\infty\right) = 1. To prove this rigorously, consider X walking on the whole of \mathbb{Z}, and define events A_k = \{ \text{jumps } 20k+1, 20k+2, \dots, 20k+20 \text{ are all to the right} \}, for k= 0,1,\dots. Observe that: (i) the A_k are independent, (ii) if A_k occurs then T < 20k+20 (either X already left (-10,10) by time 20k, or X_{20k} \in (-10,10) and the next 20 steps are to the right, meaning X_{20k+20} > 10), (iii) \operatorname{\mathbb{P}}\left(A_k\right) = 2^{-20}>0. Hence the smallest k for which A_k occurs is a Geometric random variable with positive success probability, which is almost surely finite, and hence T is too. The distribution of X_T follows from symmetry of the random walk; since the walk starts at the midpoint of the interval [-10,10], it is equally likely to hit either end first: \operatorname{\mathbb{P}}\left(X_T = 10\right) = \operatorname{\mathbb{P}}\left(X_T=-10\right) = 1/2.
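A quick Monte Carlo sanity check of the answer (not part of the proof): the sketch below simulates the walk until it first leaves (-10, 10) and records which endpoint is hit. The function name and parameter choices are illustrative.

```python
import random

def hit_time_endpoint(a=10, rng=random):
    """Run a simple symmetric random walk from 0 until it first hits -a or a;
    return the pair (T, X_T)."""
    x, t = 0, 0
    while abs(x) < a:
        x += rng.choice((-1, 1))
        t += 1
    return t, x

random.seed(0)
trials = 20_000
hits_plus = sum(hit_time_endpoint()[1] == 10 for _ in range(trials))
print(hits_plus / trials)  # estimate of P(X_T = 10); symmetry predicts 1/2
```

Every run terminates because T is almost surely finite, and the empirical frequency of hitting +10 should be close to 1/2.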

  2. For an irreducible recurrent Markov chain (X_n)_{n \ge 0} on a discrete state-space S, fix i \in S and let H_0^{(i)} = \inf\{n \ge 0: X_n=i\}. For m \ge 0, let H_{m+1}^{(i)} = \inf \{n > H_m^{(i)}: X_n = i\}. Show that H_0^{(i)}, H_1^{(i)}, \ldots is a sequence of stopping times.

Hint: You might find it helpful to view H_m^{(i)} as the (m+1)-th visit to state i.

Solution: Guided by the hint, observe that H_{m+1}^{(i)} is the first time after H_m^{(i)} that X visits state i. Hence an induction argument shows that H_{m}^{(i)} is the time of the (m+1)-th visit to i. Then, \{ H_m^{(i)} \leq n\} is just the event that at least m+1 of the random variables X_0, X_1, \dots, X_n are equal to i, which is clearly determined by X_0, \dots, X_n.

  3. Check that it follows from the strong Markov property that (H_{m+1}^{(i)} - H_m^{(i)} , m \ge 0) is a sequence of i.i.d. random variables, independent of H_0^{(i)}.

Hint: Use the strong Markov property at the times H_0^{(i)}, H_1^{(i)}, \dots. Use the fact that X_{H_m^{(i)}} is (almost surely) constant.

Solution: We apply the strong Markov property sequentially to the stopping times H_0^{(i)}, H_1^{(i)}, \dots. First, we remark that H_0^{(i)} is finite almost surely, since X is irreducible and recurrent. Moreover X_{H_0^{(i)}} \equiv i almost surely, by definition. Hence the strong Markov property implies that (X_{H_0^{(i)}+n})_{n\geq 0} has the same distribution as (X_n)_{n\geq 0} started from X_0=i, whatever the value of H_0^{(i)}. This implies that H_1^{(i)}-H_0^{(i)} is finite and has the same distribution as the first return time to state i of the chain when started in state i, i.e., the distribution of R_i = \inf\{n > 0: X_n = i\} given X_0 = i, and H_1^{(i)}-H_0^{(i)} is independent of H_0^{(i)}. Now, suppose that H_m^{(i)} is finite; applying the strong Markov property to H_m^{(i)} implies that (X_{H_m^{(i)}+n})_{n\geq 0} has the same distribution as (X_n)_{n\geq 0} started from X_0=i, whatever the value of H_m^{(i)}. So, as before, H_{m+1}^{(i)}-H_m^{(i)} is finite, has the same distribution as R_i given X_0=i, and H_{m+1}^{(i)}-H_m^{(i)} is independent of H_m^{(i)}. Moreover, the strong Markov property also shows that (X_{H_m^{(i)}+n})_{n\geq 0} and (X_n)_{0\leq n< H_m^{(i)}} are independent, so H_{m+1}^{(i)}-H_m^{(i)} is independent of H_k^{(i)} for all k=0,1,\dots, m-1. Hence, by induction, the random variables H_{m+1}^{(i)}-H_m^{(i)}, m=0,1,\dots are independent and identically distributed and also independent of H_0^{(i)}.
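To see the result in action, here is a minimal simulation sketch with an arbitrarily chosen 3-state transition matrix P: the gaps H_{m+1}^{(i)} - H_m^{(i)} are i.i.d. copies of the return time R_i, so their sample mean should approximate \operatorname{\mathbb{E}}\left[R_i\right] = 1/\pi_i, where \pi is the stationary distribution of the chain.

```python
import random

# An irreducible chain on {0, 1, 2}; the matrix is chosen purely for illustration.
P = [[0.5, 0.5, 0.0],
     [0.2, 0.3, 0.5],
     [0.4, 0.0, 0.6]]

def simulate(steps, x0, rng=random):
    """Return one trajectory X_0, ..., X_steps of the chain with matrix P."""
    path = [x0]
    for _ in range(steps):
        path.append(rng.choices(range(3), weights=P[path[-1]])[0])
    return path

random.seed(1)
path = simulate(200_000, x0=2)                       # start away from i = 0
i = 0
visits = [n for n, x in enumerate(path) if x == i]   # H_0, H_1, H_2, ...
gaps = [b - a for a, b in zip(visits, visits[1:])]   # H_{m+1} - H_m
print(sum(gaps) / len(gaps))   # sample mean of R_0; for this P, 1/pi_0 = 73/28 ~ 2.61
```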

  4. Suppose that (N(n))_{n \ge 0} is a delayed renewal process with inter-arrival times Z_0, Z_1, \ldots where Z_0 is a non-negative random variable, independent of Z_1, Z_2, \ldots which are i.i.d. strictly positive random variables with common mean \mu. Use the Strong Law of Large Numbers for T_k = \sum_{i=0}^k Z_i to show that \frac{N(n)}{n} \to \frac{1}{\mu} \quad \text{ a.s. as $n \to \infty$}.

Hint: Note that T_{N(n)-1} \le n < T_{N(n)} so that N(n)/n can be sandwiched between N(n)/T_{N(n)} and N(n)/T_{N(n)-1}. Use this and the fact that N(n) \to \infty as n \to \infty.

Solution: Since N(n) = \#\{k\geq 0: T_k \leq n\}, we have N(n) \geq 1 as soon as n \geq T_0, which holds for all sufficiently large n because Z_0 is almost surely finite. For such n, N(n) = \ell if and only if T_{\ell-1} \leq n < T_{\ell}, for \ell \geq 1; in other words, T_{N(n)-1} \leq n < T_{N(n)}. This implies that N(n)/T_{N(n)} < N(n)/n \leq N(n)/T_{N(n)-1}. Since the Z_i, i\geq 1, are finite almost surely (they have finite mean \mu) and Z_0 is finite almost surely, every T_k is finite almost surely, and hence N(n) \to\infty almost surely as n\to\infty. Moreover, by the Strong Law of Large Numbers, T_m/m \to \mu almost surely (the contribution Z_0/m \to 0), so m/T_m \to 1/\mu. Thus, since N(n)\to\infty, we have N(n)/T_{N(n)} \to 1/\mu and (N(n)-1)/T_{N(n)-1} \to 1/\mu almost surely, and therefore \liminf_{n\to\infty}(N(n)/n) \geq \lim_{n\to\infty}(N(n)/T_{N(n)}) = 1/\mu, while \limsup_{n\to\infty}(N(n)/n) \leq \limsup_{n\to\infty}\left(\frac{N(n)}{N(n)-1}\cdot\frac{N(n)-1}{T_{N(n)-1}}\right) = 1\cdot\frac{1}{\mu} = \frac{1}{\mu}. Combining the two bounds gives N(n)/n \to 1/\mu almost surely.
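The convergence is easy to observe numerically. The sketch below uses illustrative distributions (Z_0 uniform on \{0,\dots,5\} and Z_i uniform on \{1,2,3\}, so \mu = 2), counts renewals up to a large time n, and compares N(n)/n with 1/\mu.

```python
import random

def renewal_count(n, rng=random):
    """Return N(n) = #{k >= 0 : T_k <= n} for one realisation of the process."""
    t = rng.randint(0, 5)        # T_0 = Z_0, an arbitrary initial delay
    count = 0
    while t <= n:
        count += 1
        t += rng.randint(1, 3)   # add the next inter-arrival time Z_i
    return count

random.seed(2)
n = 1_000_000
print(renewal_count(n) / n)      # should be close to 1/mu = 0.5
```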

  5. Let (Y(n))_{n \ge 0} be the auxiliary Markov chain associated to a delayed renewal process (N(n))_{n \ge 0}, i.e. Y(n) = T_{N(n-1)} - n. Check that you agree with the transition probabilities given in the lecture notes.

Hint: Note that T_{N(n-1)} agrees with the smallest sum T_j = \sum_{i=0}^j Z_i that is greater than or equal to n, because T_{N(n-1)-1} \leq n-1 < T_{N(n-1)}. (This interpretation also holds for n=0, where N(-1) \equiv 0.)

Solution: Suppose Y(n) = k \geq 1, so that T_{N(n-1)} = n + k \geq n+1. In other words, T_{N(n-1)} is greater than or equal to n+1, and it must be the smallest such T_j, since T_{N(n-1)-1} \leq n-1 < n+1. Hence T_{N(n)} = T_{N(n-1)}, and therefore Y(n+1) = T_{N(n)} - (n+1) = T_{N(n-1)} - n - 1 = Y(n) - 1, with probability 1. Otherwise, if Y(n) = 0, then T_{N(n-1)} = n, so T_{N(n-1)} is strictly less than n+1, and T_{N(n-1)+1} = T_{N(n-1)} + Z_{N(n-1)+1} is greater than or equal to n+1 (since Z_{N(n-1)+1} is strictly positive). Hence T_{N(n)} = T_{N(n-1)} + Z_{N(n-1)+1}, and therefore Y(n+1) = T_{N(n)} - (n+1) = T_{N(n-1)} + Z_{N(n-1)+1} - n - 1 = Z_{N(n-1)+1}-1. This means that the probability \operatorname{\mathbb{P}}\left(Y(n+1) = i \mid Y(n)=0\right) is equal to \operatorname{\mathbb{P}}\left(Z_{N(n-1)+1} = i+1\right) = \operatorname{\mathbb{P}}\left(Z_1 = i+1\right).
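The transition rule can also be checked empirically. The sketch below builds a renewal process with Z_0 = 0 and Z_i uniform on \{1,2,3\} (an arbitrary illustrative choice), computes Y(n) as the distance to the next arrival, and tabulates the jumps out of state 0; they should occur with frequencies close to \operatorname{\mathbb{P}}\left(Z_1 = i+1\right) = 1/3 for i = 0,1,2, while from every positive state the chain steps down by exactly one.

```python
import bisect
import random
from collections import Counter

random.seed(3)
steps = 200_000
arrivals, t = [], 0
while t <= steps + 3:
    arrivals.append(t)              # the renewal times T_0, T_1, T_2, ...
    t += random.randint(1, 3)       # next inter-arrival time Z_i

# Y(n) = T_{N(n-1)} - n = (smallest T_j >= n) - n.
Y = [arrivals[bisect.bisect_left(arrivals, n)] - n for n in range(steps)]

# From a positive state the chain steps down by one deterministically ...
assert all(Y[n + 1] == Y[n] - 1 for n in range(steps - 1) if Y[n] >= 1)
# ... and from 0 it jumps to i with probability P(Z_1 = i + 1).
jumps = Counter(Y[n + 1] for n in range(steps - 1) if Y[n] == 0)
total = sum(jumps.values())
print({i: round(c / total, 3) for i, c in sorted(jumps.items())})
```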

  6. Let \nu_i = \frac{1}{\mu} \operatorname{\mathbb{P}}\left(Z_1 \ge i+1\right), \quad i \ge 0. Check that \nu = (\nu_i)_{i \ge 0} defines a probability mass function.

Hint: Recall the “tail sum formula” for expectation of a non-negative integer-valued random variable.

Solution: Since Z_1 is non-negative and integer valued, \mu = \operatorname{\mathbb{E}}\left[Z_1\right] = \sum_{i\geq 0} \operatorname{\mathbb{P}}\left(Z_1 > i\right) = \sum_{i\geq 0} \mu \nu_i implying \sum_{i\geq 0}\nu_i = 1.

  7. Suppose that Z^* has the size-biased distribution associated with the distribution of Z_1, defined by \operatorname{\mathbb{P}}\left(Z^* = i\right) = \frac{i \operatorname{\mathbb{P}}\left(Z_1 = i\right)}{\mu}, \quad i \ge 1.
    1. Verify that this is a probability mass function.
    2. Let L \sim \text{U}\{0,1,\ldots, Z^*-1\}. Show that L \sim \nu.
      Note that you can generate L starting from Z^* by letting U \sim \text{U}[0,1] and then setting L = \lfloor U Z^* \rfloor.
    3. What is the size-biased distribution associated with \text{Po}(\lambda)?

Hint:

  1. Use the usual definition of expectation of a non-negative integer-valued random variable.
  2. Find \operatorname{\mathbb{P}}\left(L=j\right) using the partition theorem/law of total probability (partitioning on the possible values of Z^*).
  3. If Z\sim \text{Po}(\lambda) then \operatorname{\mathbb{P}}\left(Z=i\right) = \exp(-\lambda)\lambda^i/i!.

Solution:

  1. We have \sum_{i\geq 1} \operatorname{\mathbb{P}}\left(Z^* = i\right) = \frac{1}{\mu} \sum_{i\geq 1} i \operatorname{\mathbb{P}}\left(Z_1 = i\right) = \frac{1}{\mu} \sum_{i\geq 0} i \operatorname{\mathbb{P}}\left(Z_1 = i\right) = \operatorname{\mathbb{E}}\left[Z_1\right]/\mu = 1.
  2. By the law of total probability \operatorname{\mathbb{P}}\left(L=j\right) = \sum_{i\geq 1} \operatorname{\mathbb{P}}\left(L=j\mid Z^*=i\right)\operatorname{\mathbb{P}}\left(Z^*=i\right) = \sum_{i\geq j+1} \frac{1}{i} \frac{i \operatorname{\mathbb{P}}\left(Z_1 = i\right)}{\mu} = \frac{1}{\mu}\operatorname{\mathbb{P}}\left(Z_1 \geq j+1\right) = \nu_j.
  3. If Z\sim \text{Po}(\lambda) then \mu =\operatorname{\mathbb{E}}\left[Z\right] = \lambda, and \operatorname{\mathbb{P}}\left(Z^*=i\right) = \frac{i \exp(-\lambda)\lambda^i/i!}{\lambda} = \exp(-\lambda)\lambda^{i-1}/(i-1)! = \operatorname{\mathbb{P}}\left(Z = i-1\right). In other words, the size-biased distribution Z^* has the same distribution as Z+1 (see the numerical check below).
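The identities in parts 2 and 3 are easy to check numerically. The sketch below uses an arbitrary illustrative pmf for Z_1 on \{1,2,3\} to verify that L = \lfloor U Z^* \rfloor has distribution \nu, and then checks the Poisson identity i\operatorname{\mathbb{P}}\left(Z=i\right)/\lambda = \operatorname{\mathbb{P}}\left(Z=i-1\right) directly.

```python
import math
import random
from collections import Counter

random.seed(5)

# Part 2: sample Z* from the size-biased pmf i * P(Z_1 = i) / mu, set
# L = floor(U * Z*), and compare the empirical pmf of L with nu.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}                 # an arbitrary choice for Z_1
mu = sum(k * p for k, p in pmf.items())
vals, weights = list(pmf), [k * pmf[k] / mu for k in pmf]
samples = [math.floor(random.random() * random.choices(vals, weights)[0])
           for _ in range(200_000)]
emp = Counter(samples)
for j in range(3):
    nu_j = sum(p for k, p in pmf.items() if k >= j + 1) / mu
    print(j, round(emp[j] / len(samples), 4), round(nu_j, 4))

# Part 3: for Z ~ Po(lam), i * P(Z = i) / lam = P(Z = i - 1), i.e. Z* =d Z + 1.
lam = 2.5
def po(i):
    return math.exp(-lam) * lam ** i / math.factorial(i)
print(all(abs(i * po(i) / lam - po(i - 1)) < 1e-12 for i in range(1, 10)))
```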
  8. Show that \nu is stationary for Y.
    Hint: Y is clearly not reversible, so there’s no point trying detailed balance!

Additional hint: Show that \nu P = \nu for the transition matrix P of the chain Y.

Solution: Recall (from the lectures, or Exercise 5 above) that the non-zero entries of P are P_{k,k-1} = 1 for k\geq 1 and P_{0,i} = \operatorname{\mathbb{P}}\left(Z_1 = i+1\right) for i\geq 0. Thus for all j\geq 0, we have \sum_i \nu_i P_{i,j} = \nu_{j+1} + \nu_0 \operatorname{\mathbb{P}}\left(Z_1 = j+1\right) = \frac{1}{\mu}\left(\operatorname{\mathbb{P}}\left(Z_1 \geq j+2\right) + \operatorname{\mathbb{P}}\left(Z_1 = j+1\right)\right) = \nu_j.

(In fact, it is not difficult to show uniqueness here: if \pi is a measure satisfying \pi_j = \sum_i \pi_i P_{i,j} = \pi_{j+1} + \pi_0 \operatorname{\mathbb{P}}\left(Z_1=j+1\right) then \pi is proportional to \nu.)
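For a concrete check, one can take any finitely supported distribution for Z_1, build the matrix P explicitly, and verify \nu P = \nu; the pmf below is an arbitrary illustrative choice.

```python
# P(Z_1 = 1) = 0.2, P(Z_1 = 2) = 0.5, P(Z_1 = 3) = 0.3, so Y lives on {0, 1, 2}.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}
mu = sum(k * p for k, p in pmf.items())
states = range(max(pmf))

# nu_i = P(Z_1 >= i + 1) / mu, and P has P_{0,i} = P(Z_1 = i + 1), P_{k,k-1} = 1.
nu = [sum(p for k, p in pmf.items() if k >= i + 1) / mu for i in states]
P = [[pmf.get(j + 1, 0.0) if i == 0 else float(j == i - 1) for j in states]
     for i in states]

nu_P = [sum(nu[i] * P[i][j] for i in states) for j in states]
print(nu)
print(nu_P)   # the two vectors agree (up to rounding)
```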

  9. Check that if \operatorname{\mathbb{P}}\left(Z_1 = k\right) = (1-p)^{k-1} p, for k \ge 1, the stationary distribution \nu for the time until the next renewal is \nu_i = (1-p)^i p, for i \ge 0. (In other words, if we flip a biased coin with probability p of heads at times n=0,1,2,\ldots and let N(n) = \#\{0 \le k \le n: \text{we see a head at time $k$}\} then (N(n), n \ge 0) is a stationary delayed renewal process.)

Hint: Calculate \operatorname{\mathbb{P}}\left(Z_1\geq i+1\right) = \operatorname{\mathbb{P}}\left(Z_1 > i\right) using a geometric series (or knowledge about Geometric random variables).

Solution: For all i\geq 0, \operatorname{\mathbb{P}}\left(Z_1 \geq i+1\right) = \sum_{k\geq i+1} (1-p)^{k-1}p = (1-p)^i \sum_{k\geq 1}(1-p)^{k-1}p = (1-p)^i. But \mu = 1/p since Z_1 is a Geometric random variable with success probability p. Hence \nu_i = \frac{1}{\mu}\operatorname{\mathbb{P}}\left(Z_1 \geq i+1\right) = (1-p)^i p.
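As a final empirical check of the coin-flip interpretation (p = 0.3 below, an arbitrary choice), the sketch simulates a long sequence of biased coin flips and records, for each time n, the waiting time until the next head; the empirical distribution should match \nu_i = (1-p)^i p.

```python
import random
from collections import Counter

random.seed(4)
p, steps = 0.3, 500_000
flips = [random.random() < p for _ in range(steps)]

# Waiting time from n until the next head (0 if there is a head at time n itself);
# times after the last head are discarded.
waits, next_head = [], None
for n in reversed(range(steps)):
    next_head = n if flips[n] else next_head
    if next_head is not None:
        waits.append(next_head - n)

counts = Counter(waits)
for i in range(5):
    print(i, round(counts[i] / len(waits), 4), round((1 - p) ** i * p, 4))
```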