M216: Exercise sheet 8

    Warmup questions

  • 1. Let \(\lst \alpha 1k\) span \(E\leq V^{*}\). Show that

    \begin{equation*} \sol E=\bigcap _{i=1}^k\ker \alpha _i. \end{equation*}

  • 2. Let \(\alpha ,\beta \in (\R ^3)^{*}\) be given by

    \begin{align*} \alpha (x)&=2x_{1}+x_2-x_3\\ \beta (x)&=x_1-x_2+x_3, \end{align*} for \(x\in \R ^3\).

    Let \(E=\Span {\alpha ,\beta }\) and compute \(\sol E\).

    Homework

  • 3. Let \(A,B\in M_4(\C )\) be given by

    \begin{equation*} A= \begin{pmatrix} 0&1&1&1\\0&0&0&0\\0&0&0&0\\0&0&0&0 \end {pmatrix}\qquad B= \begin{pmatrix*}[r] 0&1&1&1\\0&0&0&-1\\0&0&0&1\\0&0&0&0 \end {pmatrix*} \end{equation*}

    Compute the Jordan normal forms of \(A\) and \(B\).

    Are \(A\) and \(B\) similar?

  • 4. Let \(U\leq V\) and \(v\in V\) with \(v\notin U\). Show that there is \(\alpha \in V^{*}\) such that \(\alpha \) is zero on \(U\) but \(\alpha (v)\neq 0\).

    Hint: Apply theorem 5.3 to \(V/U\).

    Extra questions

  • 5. Let \(V\) be a vector space over a field \(\F \) and let \(\alpha ,\beta \in V^{*}\) be non-zero linear functionals.

    Prove that \(\ker \alpha =\ker \beta \) if and only if there is a non-zero \(\lambda \in \F \) such that \(\alpha =\lambda \beta \).

    Hint: If \(v_0\notin \ker \alpha \), show that \(V=\Span {v_0}+\ker \alpha \).

  • 6. Let \(V\) be a vector space over \(\F \). For \(v\in V\), define \(\ev (v):V^{*}\to \F \) by

    \begin{equation*} \ev (v)(\alpha )=\alpha (v). \end{equation*}

    • (a) Show that \(\ev (v)\) is linear so that \(\ev (v)\in V^{**}\).

    • (b) We therefore have a map \(\ev :V\to V^{**}\). Show that \(\ev \) is linear.

    • (c) Show that \(\ev \) is injective.

    • (d) Deduce that if \(V\) is finite-dimensional then \(\ev :V\to V^{**}\) is an isomorphism.

Please hand in at 4W level 1 by NOON on Friday December 1st


M216: Exercise sheet 8—Solutions

  • 1. Let \(v\in \sol E\) so that \(\alpha (v)=0\), for all \(\alpha \in E\). Then, in particular, each \(\alpha _i(v)=0\) so that \(v\in \ker \alpha _i\), for \(\bw 1ik\). That is, \(v\in \bigcap _{i=1}^k\ker \alpha _i\) and \(\sol E\leq \bigcap _{i=1}^k\ker \alpha _i\).

    Conversely, let \(v\in \bigcap _{i=1}^k\ker \alpha _i\) so that \(\alpha _i(v)=0\), for \(\bw 1ik\). Let \(\alpha \in E\). Then \(\alpha =\sum _{i=1}^k\lambda _i\alpha _i\), for some \(\lst \lambda 1k\in \F \), since the \(\alpha _i\) span \(E\), and

    \begin{equation*} \alpha (v)=\sum _{i=1}^k\lambda _i\alpha _i(v)=0 \end{equation*}

    so that \(v\in \sol E\). Thus \(\bigcap _{i=1}^k\ker \alpha _i\leq \sol E\) and we are done.

  • 2. According to question 1, \(\sol E\) consists of those \(x\in \R ^3\) such that \(\alpha (x)=\beta (x)=0\), that is, such that

    \begin{align*} 2x_{1}+x_2-x_3&=0\\x_1-x_2+x_3&=0. \end{align*} Adding these gives \(3x_1=0\), so \(x_1=0\), and then the first equation gives \(x_2=x_3\), so that \(\sol E=\Span {(0,1,1)}\).
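
    As a quick check, the spanning vector does indeed lie in both kernels:

    \begin{equation*} \alpha ((0,1,1))=2\cdot 0+1-1=0,\qquad \beta ((0,1,1))=0-1+1=0. \end{equation*}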

  • 3. Both matrices being upper triangular with zero diagonal, we see that \(\Delta _A=\Delta _B=x^4\) so that the only eigenvalue of \(A\) or \(B\) is \(0\). Moreover, a direct computation shows that \(A^2=B^2=0\) so that \(m_A=m_B=x^2\). Thus both \(A\) and \(B\) have at least one \(2\times 2\) Jordan block \(J_{2}\), and the possibilities for the Jordan normal form of either are \(J_2\oplus J_2\) or \(J_{2}\oplus J_1\oplus J_1\). To distinguish these, recall that the number of Jordan blocks with eigenvalue \(0\) is the dimension of the kernel. Now \(A\) clearly has row rank \(1\) and so a \(3\)-dimensional kernel. Thus \(A\) has Jordan normal form \(J_2\oplus J_1\oplus J_1\).

    Meanwhile, \(B\) has row rank \(2\) and thus nullity \(2\), so that it has JNF \(J_2\oplus J_2\).

    Since they have different JNF, \(A\) and \(B\) are not similar.
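
    For concreteness, the Jordan normal forms found above, written out as matrices, are

    \begin{equation*} J_2\oplus J_1\oplus J_1= \begin{pmatrix} 0&1&0&0\\0&0&0&0\\0&0&0&0\\0&0&0&0 \end{pmatrix}\qquad J_2\oplus J_2= \begin{pmatrix} 0&1&0&0\\0&0&0&0\\0&0&0&1\\0&0&0&0 \end{pmatrix}. \end{equation*}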

  • 4. Let \(q:V\to V/U\) be the quotient map so that \(q\) is a linear surjection with kernel \(U\) (this is all we need to know about the quotient construction). Since \(v\notin U\), \(q(v)\neq 0\) so that, by the Sufficiency Principle (Theorem 5.3), there is \(\beta \in (V/U)^{*}\) such that \(\beta (q(v))\neq 0\).

    Let \(\alpha =\beta \circ q:V\to \F \). This is linear, being a composition of linear maps, so \(\alpha \in V^{*}\). Moreover, \(\alpha (v)=\beta (q(v))\neq 0\) while, if \(u\in U\), \(q(u)=0\) so that \(\alpha (u)=\beta (0)=0\).
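
    For example, with \(V=\R ^2\), \(U=\Span {(1,1)}\) and \(v=(1,0)\), this construction produces (up to a non-zero scalar) the functional \(\alpha (x)=x_1-x_2\), which vanishes on \(U\) while \(\alpha (v)=1\neq 0\).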

  • 5. The reverse implication is clear: if \(\lambda \neq 0\) and \(\alpha =\lambda \beta \) then \(\alpha (v)=\lambda \beta (v)\) vanishes if and only if \(\beta (v)\) does, so that \(\ker \alpha =\ker \beta \).

    Now suppose that \(\ker \alpha =\ker \beta \) with \(\alpha \neq 0\). Thus there is \(v_0\in V\) such that \(\alpha (v_0)\neq 0\). Following the hint, let \(v\in V\) and observe that \(v-(\alpha (v)/\alpha (v_0))v_0\in \ker \alpha \) so that \(V=\Span {v_0}+\ker \alpha \).

    Now, since \(v_0\notin \ker \alpha =\ker \beta \), \(\beta (v_0)\neq 0\) also. Set \(\lambda =\alpha (v_0)/\beta (v_0)\), which is non-zero since \(\alpha (v_0)\neq 0\), so that

    \begin{equation*} \alpha (v_0)=\lambda \beta (v_0). \end{equation*}

    Further \(\alpha (v)=\lambda \beta (v)\), for all \(v\in \ker \alpha \), since both sides are zero. It follows that \(\alpha =\lambda \beta \) on \(\Span {v_0}+\ker \alpha =V\).
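
    To see the statement in a concrete case: in \((\R ^2)^{*}\) the functionals \(\alpha (x)=x_1+x_2\) and \(\beta (x)=2x_1+2x_2\) have the same kernel \(\Span {(1,-1)}\), and indeed \(\alpha =\lambda \beta \) with \(\lambda =1/2\).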

  • 6. This is a case of thinking carefully about what each statement means, after which each part is very easy to prove.

    • (a) To see that \(\ev (v):V^{*}\to \F \) is linear, we must show that

      \begin{equation*} \ev (v)(\alpha +\lambda \beta )=\ev (v)(\alpha )+\lambda \ev (v)(\beta ), \end{equation*}

      for all \(\alpha ,\beta \in V^{*}\) and \(\lambda \in \F \). Using the definition of \(\ev (v)\), this reads

      \begin{equation*} (\alpha +\lambda \beta )(v)=\alpha (v)+\lambda \beta (v) \end{equation*}

      which is exactly the definition of the (pointwise) addition and scalar multiplication in \(V^{*}\).

    • (b) Linearity of \(\ev :V\to V^{**}\) means that for \(v,w\in V\) and \(\lambda \in \F \), we have

      \begin{equation*} \ev (v+\lambda w)=\ev (v)+\lambda \ev (w). \end{equation*}

      This is an equality of elements of \(V^{**}\), that is to say, of two functions on \(V^{*}\). It holds precisely when the two functions agree on every \(\alpha \in V^{*}\), so we need

      \begin{equation*} \ev (v+\lambda w)(\alpha )=\ev (v)(\alpha )+\lambda \ev (w)(\alpha ). \end{equation*}

      However, using the definition of \(\ev \), this reads

      \begin{equation*} \alpha (v+\lambda w)=\alpha (v)+\lambda \alpha (w) \end{equation*}

      which is true since \(\alpha \) is linear!

    • (c) \(\ev \) is injective if and only if \(\ker \ev =\set 0\). Let \(v\in \ker \ev \). Thus \(\ev (v)=0\in V^{**}\), the zero functional on \(V^{*}\). Otherwise said, \(\ev (v)(\alpha )=0\), for all \(\alpha \in V^{*}\), or equivalently, \(\alpha (v)=0\), for all \(\alpha \in V^{*}\). But the Sufficiency Principle now forces \(v=0\) so that \(\ev \) injects.

    • (d) If \(V\) is finite-dimensional, \(\dim V=\dim V^{*}=\dim V^{**}\) so that \(\ev \) is an isomorphism by rank-nullity since we have just seen that it injects.
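
    Concretely, if \(V=\F ^n\) with standard basis \(\lst e1n\) and dual basis \(\lst \epsilon 1n\) of \(V^{*}\) (so that \(\epsilon _i(e_j)=\delta _{ij}\)), then for \(v=\sum _{i=1}^nv_ie_i\) we have

    \begin{equation*} \ev (v)(\epsilon _i)=\epsilon _i(v)=v_i, \end{equation*}

    so \(\ev (v)\) has the same coordinates with respect to the basis of \(V^{**}\) dual to \(\lst \epsilon 1n\) as \(v\) has with respect to \(\lst e1n\): the isomorphism identifies a vector with evaluation at that vector.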