3.2 Martingales

Martingales. Theorem: the ‘integral’ of a martingale is a martingale.

(Note that some of this material should be familiar from MA30125.)

Now think of an asset-price process. We can think of buying \(\phi_n\) units of the asset at time \(n\) as making a ‘gamble’ on the change in the asset’s value between time \(n\) and time \(n+1\). If we imagine \(r=0\), then our ‘winnings’ at time \(n+1\) from buying \(\phi_n\) units of the asset are \(\phi_n(S_{n+1}-S_n)\). In particular, we can think of \(X_{n+1} = 1+(S_{n+1}-S_n)\) as a sequence of fair games, and the condition required for the games to be fair is: \[\mathbb{E}\left[ X_{n+1} | X_1, X_2, \dots, X_n \right] = 1.\] This is the case if and only if: \[\mathbb{E}\left[ S_{n+1} | S_1, S_2, \dots, S_n \right] = S_n.\]

Solution: Note that conditioning on \(X_1,\dots,X_n\) is equivalent to conditioning on \(S_1,\dots,S_n\): since \(X_j = 1+(S_j-S_{j-1})\) and \(s_0\) is fixed and non-random, the sets \[\begin{align} \{X_1=x_1,\dots,X_n=x_n\}\text{ and }\{S_1=s_1,\dots,S_n=s_n\}\end{align}\] coincide whenever \(x_j = 1+(s_j-s_{j-1})\) for \(j=1,\dots,n\).

Then \[\begin{align} \mathbb{E}\left[ X_{n+1} | X_1,\dots,X_n \right] = 1 \iff&\mathbb{E}\left[ 1+S_{n+1}-S_n | X_1,\dots,X_n \right] = 1\\ \iff&\mathbb{E}\left[ 1+S_{n+1}-S_n | S_1,\dots,S_n \right] = 1\\ \iff&1+\mathbb{E}\left[ S_{n+1} | S_1,\dots,S_n \right]-S_n = 1\\ \iff&\mathbb{E}\left[ S_{n+1} | S_1,\dots,S_n \right] = S_n\end{align}\] \(\square\)
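The equivalence can also be checked by simulation. The following is a small Monte Carlo sketch (not from the notes; the symmetric \(\pm 1\) price model and \(s_0=10\) are illustrative assumptions): it estimates \(\mathbb{E}\left[ S_{n+1} | S_1,\dots,S_n \right]\) by averaging over sample paths sharing the same history and compares it with \(S_n\), and likewise compares \(\mathbb{E}\left[ X_{n+1} | \cdot \right]\) with 1.

```python
import numpy as np

# Illustrative model (an assumption, not from the notes): S_{n+1} = S_n +/- 1 with equal
# probability, started from s_0 = 10, so that X_{n+1} = 1 + (S_{n+1} - S_n) is a fair game.
rng = np.random.default_rng(0)
n_paths, s0 = 200_000, 10.0
steps = rng.choice([-1.0, 1.0], size=(n_paths, 3))
S = s0 + np.cumsum(steps, axis=1)                    # columns are S_1, S_2, S_3

# Condition on the history (S_1, S_2) by grouping paths with the same prefix, then compare
# the sample estimate of E[S_3 | S_1, S_2] with S_2, and of E[X_3 | S_1, S_2] with 1.
for prefix in sorted({(p[0], p[1]) for p in S[:, :2]}):
    mask = (S[:, 0] == prefix[0]) & (S[:, 1] == prefix[1])
    cond_mean = S[mask, 2].mean()
    print(prefix, "E[S_3|hist] ~", round(cond_mean, 3), " S_2 =", prefix[1],
          " E[X_3|hist] ~", round(1 + cond_mean - prefix[1], 3))
```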

Assumption 23. From now on, we will make the following assumption: there will be some fundamental random process, typically the asset price process, or the fair game process, which generates all the randomness in our scenario. Note that, for example, with the gains process, knowing \(X_1, \dots, X_n\) and the (non-random) functions \(\alpha_1, \dots, \alpha_n\) determines \(G_n\), so all the information that is relevant at time \(n\) is determined by observing \(X_1, \dots, X_n\). When we take conditional expectation given this process, we will use the shorthand: \[\mathbb{E}_n\left[ Y \right] := \mathbb{E}\left[ Y|X_1, X_2, \dots, X_n \right].\] We will also assume that all our gambling strategies/portfolio processes are functions of this same underlying process, so we may take out what is known.
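In this notation, the ‘taking out what is known’ rule, used repeatedly in the proofs below, can be recorded as follows: if \(H_n\) is a function of \(X_1,\dots,X_n\) (and the relevant expectations exist), then \[\mathbb{E}_n\left[ H_n Y \right] = H_n\,\mathbb{E}_n\left[ Y \right],\] and in particular \(\mathbb{E}_n\left[ H_n \right] = H_n\).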

We will almost always then want to consider processes \(Y_n\) which are known at time \(n\). In particular, we have the following definition:

Definition 24. We say that a stochastic process \(Y_n\) is adapted (with respect to the underlying process \(X_n\)) if \(Y_n\) is a function of \(X_1, \dots, X_n\).

For example, if \(X_n\) is a sequence of coin flips (which generates all our randomness/the underlying information), and we define \(Y_n\) to be the number of heads on the first \(n\) flips, and \(Z_n\) to be the number of heads on the first \(n+1\) flips, then both \(Y_n\) and \(Z_n\) are stochastic processes, and \(Y_n\) is adapted, since it is a function of \(X_1, \dots, X_n\), but \(Z_n\) is not adapted, since it depends on \(X_1, \dots, X_{n+1}\).
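As a small illustration (a sketch, not from the notes; the names `Y` and `Z` simply mirror the example above): computing \(Y_n\) uses only the flips observed so far, while computing \(Z_n\) requires ‘peeking’ one flip into the future.

```python
import numpy as np

# Coin flips X_1, ..., X_N (1 = heads, 0 = tails) generating all the randomness.
rng = np.random.default_rng(1)
N = 10
X = rng.integers(0, 2, size=N)           # X[k-1] is the k-th flip

def Y(n, flips):
    # Adapted: a function of X_1, ..., X_n only.
    return int(flips[:n].sum())

def Z(n, flips):
    # Not adapted: needs X_{n+1}, information unavailable at time n.
    return int(flips[:n + 1].sum())

n = 4
print("flips:", X.tolist())
print("Y_4 =", Y(n, X), "(computable from the first 4 flips)")
print("Z_4 =", Z(n, X), "(requires the 5th flip as well)")
```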

Definition 25. A stochastic process \((M_n)_{n \ge 0}\) is a martingale if:

  1. \(\mathbb{E}|M_n| < \infty\) for all \(n\)

  2. ‘\(M_n\) stays the same on average’: \[\begin{align} \tag{3.2} \mathbb{E}_n\left[ M_{n+1} \right] = M_n\end{align}\]

  3. the process \(M_n\) is adapted.

Theorem 26. Suppose \(M_n\) is a martingale, and let \(\phi_n\) be a bounded gambling strategy for \(M_n\). Then the gains process defined by: \[G_n = \sum_{j=1}^n \phi_{j} (M_j-M_{j-1}), \qquad G_0 = 0,\] is a martingale.

Solution: We need to show:

  1. \(\mathbb{E}|G_n| < \infty\) for all \(n\)

  2. \(\mathbb{E}_n[G_{n+1}] = G_n\)

  3. the process \(G_n\) is adapted

  1. Since \(\phi_n\) is a bounded gambling strategy, \(\exists K>0 \text{ s.t. } |\phi_n|\le K\ \forall n\). Therefore: \[\begin{align} |G_n|=&\left|\sum_{j=1}^n\phi_j(M_j-M_{j-1})\right|\\ (\Delta\text{-ineq.}) \qquad \le&\sum_{j=1}^n|\phi_j| \, |M_j-M_{j-1}|\\ (\Delta\text{-ineq., bounded gamb. st.}) \qquad \le &K\sum_{j=1}^n(|M_j|+|M_{j-1}|)\\ \le&2K\sum_{j=0}^n|M_j|\end{align}\] Since \(M_n\) is a martingale, \(\mathbb{E}|M_j|<\infty\) for all \(j\), and hence \(\mathbb{E}|G_n|\le 2K\sum_{j=0}^n\mathbb{E}|M_j|<\infty\) for all \(n\).

  2. Recall \(\phi_j=\phi_j(M_1,\dots,M_{j-1})\), so \(\phi_{n+1}\) is known at time \(n\), and \[\begin{align} G_{n+1}=\sum_{j=1}^{n+1}\phi_j(M_j-M_{j-1})=G_n+\phi_{n+1}(M_{n+1}-M_n).\end{align}\] Thus: \[\begin{align} \mathbb{E}_n[G_{n+1}] =&\mathbb{E}_n[G_n+\phi_{n+1}(M_{n+1}-M_n)]\\ =&G_n+\phi_{n+1}\mathbb{E}_n[M_{n+1}-M_n] \quad \text{(linearity; t.o.w.i.k.: $G_n$ and $\phi_{n+1}$ are known at time $n$)}\\ =&G_n+\phi_{n+1}(\mathbb{E}_n[M_{n+1}]-\mathbb{E}_n[M_n]) \quad \text{(linearity)}\\ =&G_n+\phi_{n+1}(M_{n}-M_n) \quad \text{($M$ a martingale; t.o.w.i.k.)}\\ =&G_n\end{align}\]

  3. Since \(G_n\) is a function of \(M_1,\dots,M_n\) only, and \(M_n\) is adapted, \(G_n\) is adapted.

\(\square\)
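Theorem 26 can also be sanity-checked numerically. Below is a Monte Carlo sketch (the specific martingale and strategy are illustrative assumptions, not part of the notes): \(M\) is taken to be a simple symmetric random walk started at 0, and \(\phi_j = \phi_j(M_1,\dots,M_{j-1})\) a bounded previsible strategy. The simulation checks that \(\mathbb{E}[G_n]\approx 0\) for every \(n\), and that the increment \(G_{n+1}-G_n\) averages to roughly zero on events determined by time-\(n\) information, as the martingale property predicts.

```python
import numpy as np

# Illustrative setup (assumptions): M is a simple symmetric random walk with M_0 = 0,
# and phi_j = 1 if M_{j-1} < 0 else 0.5, a bounded strategy depending only on the past.
rng = np.random.default_rng(2)
n_paths, N = 200_000, 20
steps = rng.choice([-1.0, 1.0], size=(n_paths, N))
M = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(steps, axis=1)], axis=1)   # M_0..M_N

phi = np.where(M[:, :-1] < 0, 1.0, 0.5)            # phi_j uses only M_0, ..., M_{j-1}
G = np.concatenate([np.zeros((n_paths, 1)),
                    np.cumsum(phi * (M[:, 1:] - M[:, :-1]), axis=1)], axis=1)    # G_0..G_N

# A martingale started at 0 has E[G_n] = 0 for all n, so the sample means should be ~0.
print("max_n |mean of G_n|:", np.abs(G.mean(axis=0)).max())

# Conditional check at n = 10: E_n[G_{n+1} - G_n] = 0, so the increment should average ~0
# on any event known at time n, e.g. {M_n < 0} and {M_n >= 0}.
n = 10
inc = G[:, n + 1] - G[:, n]
print("mean increment on {M_n < 0} :", inc[M[:, n] < 0].mean())
print("mean increment on {M_n >= 0}:", inc[M[:, n] >= 0].mean())
```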