-
-
(i) Let \(M \subseteq \R ^s\) be a submanifold. Let \(P \subseteq \R ^s\) be an open subset that contains \(M\), and let \(f : P \to \R ^m\) be a smooth function. Suppose that the restriction of \(f\) to \(M\) is constant. Show that \(T_pM \subseteq \ker Df_p \subseteq \R ^s\) for any \(p \in M\).
-
(ii) Let \(P \subseteq \R ^s\) be an open subset, \(f : P \to \R ^m\) a smooth function, \(q \in \R ^m\) a regular value of \(f\), and \(M := f^{-1}(q)\). Show that \(T_pM = \ker Df_p \subseteq \R ^s\) for any \(p \in M\).
-
-
5. For \(v_1, \ldots , v_n \in \R ^n\), let \(\Det (v_1, \ldots , v_n) \in \R \) denote the determinant of the \(n \times n\) matrix with columns \(v_1, \ldots , v_n\).
-
(i) Show that \(\Det \) spans \(\Alt ^n(\R ^n)\).
-
(ii) For any \(u, v \in \R ^3\), show that there is a unique \(u \times v \in \R ^3\) such that for any \(w \in \R ^3\),
\[ \Det (u, v, w) = (u \times v).w . \]
Here the right hand side is the Euclidean inner product of the vectors \(u \times v\) and \(w\).
-
MA40254 Differential and geometric analysis : Solutions 4
-
1.
-
(i) Let \(\varphi : U' \to U \subseteq M\) be a parametrisation with \(p \in U\), say \(p = \varphi (x)\). Then \(f \circ \varphi : U' \to \R ^m\) is constant, since \(\varphi \) takes values in \(M\) and \(f|_M\) is constant, so the chain rule gives
\[ Df_p \circ D\varphi _x = D(f \circ \varphi )_x = 0 . \]
Thus \(T_pM\), the image of \(D\varphi _x : \R ^n \to \R ^s\), is contained in the kernel of \(Df_p : \R ^s \to \R ^m\).
-
(ii) Since \(q\) is a regular value, \(Df_p : \R ^s \to \R ^m\) is surjective for each \(p \in M\), so \(\ker Df_p\) has dimension \(s-m\) by the Rank-Nullity theorem. On the other hand, we know that \(f^{-1}(q)\) is a submanifold of dimension \(s-m\), and that \(T_p M\) has the same dimension as \(M\). Since \(f\) is constant (equal to \(q\)) on \(M\), part (i) gives \(T_pM \subseteq \ker Df_p\); as both subspaces have dimension \(s-m\), the inclusion is an equality.
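For instance, taking \(P = \R ^s\), \(f(x) = |x|^2\) and \(q = 1\) (so \(M = f^{-1}(1)\) is the unit sphere \(S^{s-1}\)), we get \(Df_p(v) = 2 p.v\), which is surjective for every \(p \in S^{s-1}\) since \(p \neq 0\); hence (ii) gives
\[ T_p S^{s-1} = \ker Df_p = \{ v \in \R ^s : p.v = 0 \} = p^\perp , \]
the hyperplane of vectors orthogonal to \(p\).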
-
-
2. For \(S(x,y) \cup S(x', y')\) to be a submanifold of \(\R ^2\), one needs either that \(S(x,y)\) is disjoint from the closure \(\overline {S(x',y')} \subset \R ^2\) and vice versa, or that \(x, y, x', y'\) are all collinear.
If \(S(x,y)\) and \(S(x',y')\) intersect in a single point \(z\), then for any open neighbourhood \(U \subseteq S(x,y) \cup S(x', y')\) of \(z\), the set \(U \setminus \{z\}\) has at least 4 connected components, whereas removing a point from an open interval leaves at most 2. Thus \(z\) has no neighbourhood \(U\) diffeomorphic to an interval, so the union is not a 1-dimensional submanifold.
Similarly, if \(x \in S(x',y')\) then for any neighbourhood \(U\) of \(x\), \(U \setminus \{x\}\) has at least 3 connected components.
-
3. For any \(A \in M_{n,n}(\R )\), the matrix \(A^TA\) is symmetric. So if we let \(S \subseteq M_{n,n}(\R )\) denote the subspace of symmetric matrices, then \(f : M_{n,n}(\R ) \to S, \; A \mapsto A^TA\) is a well-defined function, and \(O(n)\) is \(f^{-1}(I)\). Clearly \(f\) is smooth, so to show that \(O(n)\) is a submanifold, we need only check that \(I \in S\) is a regular value of \(f\).
For any \(A \in M_{n,n}(\R )\), the derivative
\[ Df_A : M_{n,n}(\R ) \to S \]
maps
\[ X \mapsto A^TX + X^TA . \]
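(To see where this formula comes from, expand \(f\) near \(A\): for \(X \in M_{n,n}(\R )\),
\[ f(A+X) = (A+X)^T(A+X) = A^TA + A^TX + X^TA + X^TX , \]
and the term \(A^TX + X^TA\) is linear in \(X\) while \(X^TX\) is of second order, so \(Df_A(X) = A^TX + X^TA\).)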
We want to prove that this is surjective if \(A \in O(n)\). So suppose that \(Y \in S\). Then setting \(X = \frac 12 AY\) gives \(Df_A(X) = \frac 12 (A^TAY + Y^TA^TA) = \frac 12 (Y + Y^T) = Y\), using \(A^TA = I\) and \(Y^T = Y\). Thus \(Df_A\) is indeed surjective whenever \(A \in O(n)\), so \(I\) is a regular value of \(f\).
Now, by 1(ii), \(T_IO(n)\) equals the kernel of \(Df_I : M_{n,n}(\R ) \to S, \; X \mapsto X + X^T\), i.e., \(T_IO(n) \subseteq M_{n,n}(\R )\) is the subspace of anti-symmetric matrices.
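For instance, when \(n = 2\),
\[ T_IO(2) = \left\{ \begin{pmatrix} 0 & -t \\ t & 0 \end{pmatrix} : t \in \R \right\} \]
is 1-dimensional, consistent with \(O(2)\) being a 1-dimensional submanifold of \(M_{2,2}(\R ) \cong \R ^4\) (rotations and reflections each depend on a single angle).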
-
4. Recall that, if we choose a basis \(e_1, \ldots , e_n\) for \(V\), then any bilinear form \(\omega : V \times V \to \R \) can be represented by a matrix \(A \in M_{n,n}(\R )\), namely
\[ A_{ij} := \omega (e_i, e_j). \]
Conversely, any \(A \in M_{n,n}(\R )\) arises in this way from a unique bilinear form, so \(\omega \mapsto A\) is a linear isomorphism onto \(M_{n,n}(\R )\). The condition that \(\omega \) is alternating is equivalent to \(A\) being anti-symmetric (i.e., \(A = -A^T\)), so \(\Alt ^2(V)\) is isomorphic to the subspace of anti-symmetric \(n \times n\) matrices, which has dimension \(\binom n2\).
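Concretely, an anti-symmetric matrix is determined by its entries strictly above the diagonal, and those \(\binom n2\) entries can be chosen freely; for \(n = 3\), say,
\[ A = \begin{pmatrix} 0 & a & b \\ -a & 0 & c \\ -b & -c & 0 \end{pmatrix}, \qquad a, b, c \in \R , \]
so \(\Alt ^2(\R ^3)\) has dimension \(3 = \binom 32\).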
-
5.
-
(i) This is equivalent to the theorem from Algebra 1B that any function \(M_{n,n}(\R ) \to \R \) that is multilinear and alternating as a function of the columns is a scalar multiple of \(\det \).
Alternatively, use the fact that \(\dim \Alt ^n (\R ^n) = \binom {n}{n} = 1\). To deduce that \(\Det \) is a basis, it therefore suffices to check that \(\Det \neq 0\). If \(e_1, \ldots , e_n \in \R ^n\) is the standard basis, then \(\Det (e_1, \ldots , e_n) = \det I = 1\), so \(\Det \) is indeed a non-zero element of \(\Alt ^n(\R ^n)\).
-
(ii) Recall that the determinant is linear as a function of each column, e.g., if we fix \(u\) and \(v\) then \(w \mapsto \Det (u,v,w)\) is a linear map \(\R ^3 \to \R \). The Riesz representation theorem implies that for any linear map \(\R ^3 \to \R \) there is a unique vector \(z \in \R ^3\) such that the map equals \(w \mapsto z.w\). So in particular there is a unique \(z\) such that \(z.w = \Det (u,v,w)\) for all \(w \in \R ^3\), and we can define \(u \times v\) to be this \(z\). (Of course \(u \times v\) turns out to have a familiar expression in terms of the components of \(u\) and \(v\).)
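Concretely, writing \(u = (u_1, u_2, u_3)\) and \(v = (v_1, v_2, v_3)\) and expanding the determinant along its last column,
\[ \Det (u,v,w) = (u_2v_3 - u_3v_2)w_1 + (u_3v_1 - u_1v_3)w_2 + (u_1v_2 - u_2v_1)w_3 , \]
so
\[ u \times v = (u_2v_3 - u_3v_2, \; u_3v_1 - u_1v_3, \; u_1v_2 - u_2v_1) , \]
the usual cross product.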
-