-
1. Let \(U\) be the open interval \((-1,1) \subset \R \), and let \(f : U \to \R , \; x \mapsto x^2\).
-
(i) For \(x \in U\), what are the domain and codomain of \(Df_x\), the derivative of \(f\) at \(x\)?
-
(ii) For which \(x \in U\) is \(Df_x\) injective?
-
(iii) What are the domain and codomain of the derivative function \(Df\)?
-
(iv) Is \(Df\) injective?
[Hint: \(Df_x\) is a linear map represented by a \(1 \times 1\) matrix, whose only entry is \(\frac {df}{dx} \).]
-
-
3. Let \((V, \norm {\cdot }_V)\) and \((W, \norm {\cdot }_W)\) be normed vector spaces, and let \(\cL (V, W)\) be the vector space of linear maps \(V \to W\). For \(\phi \in \cL (V,W)\), define its operator norm by
\[ \norm {\phi }_{op} := \sup _v \norm {\phi (v)}_W, \]
where the supremum is taken over \(v \in V\) such that \(\norm {v}_V = 1\). Show that \(\norm {\cdot }_{op}\) is a norm on \(\cL (V,W)\).
[Hint: Use the defining properties of the norm on \(W\) to derive the properties required for the operator norm.]
-
4. Which of the following functions are smooth?
-
(i) \(f : S \to \R , \; x \mapsto \sqrt [3]{x^3-2}\), where \(S := \Q \subset \R \).
-
(ii) \(g : S \to \R , \; (x,y) \mapsto \left \{ \begin {aligned} \sqrt {y} & \textrm { if } x \geq 0 \\ -\sqrt {y} & \textrm { if } x \leq 0 \end {aligned} \right . \), where \(S := \{(x,y) : y = x^2 \} \subset \R ^2\).
-
(iii) \(h : S \to \R , \; (x,y) \mapsto \sqrt {y} \), where \(S := \{(x,y) : y = x^2 \} \subset \R ^2\).
[Hint: (i) What is the biggest open subset \(U \subset \mathbb {R}\) such that the function \(x \mapsto \sqrt [3]{x^3-2}\) is differentiable on \(U\)?
(ii) Find a simple function \(G : \mathbb {R}^2 \to \mathbb {R}\) such that the restriction of \(G\) to \(S\) equals \(g\).
(iii) Suppose \(U \subseteq \mathbb {R}^2\) is an open subset containing the origin, and \(H : U \to \mathbb {R}\) is a smooth function whose restriction to \(U \cap S\) equals \(h\). For a suitable interval \(I \subseteq \mathbb {R}\), what can you say about the composition of \(I \to \mathbb {R}^2,\; t \mapsto (t, t^2)\) with \(H\)? ]
-
-
5. Let \(U \subseteq \R ^n\) be open, and let \(f : U \to \R ^m\) be a smooth function. If \(Df_x\) is injective for every \(x \in U\), must \(f\) be injective?
[Hint: We will see later that if \(Df_x\) is injective for some \(x \in U\), then \(f\) is injective on a neighbourhood of \(x\); but does this imply that \(f\) is injective on all of \(U\)? Think of simple examples (angles on a circle, perhaps?).]
-
-
6.
-
(i) Compute the derivative of the matrix multiplication map
\[ m : M_{m,n}(\R ) \times M_{n,p}(\R ) \to M_{m,p}(\R ), \; (A,B) \mapsto AB. \]
-
(ii) Compute the derivative of \(s: M_{n,n}(\R ) \to M_{n,n}(\R ), \; A \mapsto A^2\).
[Hint: (i) This is the product rule for matrix multiplication, so we expect \(Dm_{(A,B)}(X,Y) = XB + AY\): prove that this expectation is correct, using the operator norm on matrices (viewed as linear maps), which satisfies \(\norm {X Y}\leq \norm {X} \norm {Y}\).
(ii) Write \(s\) as the composition of the diagonal map \(M_{n,n}(\mathbb {R}) \to M_{n,n}(\mathbb {R}) \times M_{n,n}(\mathbb {R}), \; A \mapsto (A,A)\) and \(m : M_{n,n}(\mathbb {R}) \times M_{n,n}(\mathbb {R}) \to M_{n,n}(\mathbb {R})\). ]
-
MA40254 Differential and geometric analysis: Solutions 1
-
1.
-
(i) \(Df_x\) is the linear map \(v \mapsto 2xv\) with domain \(\R \) and codomain \(\R \).
-
(ii) \(v \mapsto 2xv\) is injective if and only if \(x \neq 0\).
-
(iii) \(Df\) is a function \(U \to \cL (\R , \R )\), where \(\cL (\R , \R )\) is the space of linear maps from \(\R \) to itself. (\(\cL (\R ,\R )\) is naturally isomorphic to \(\R \).)
-
(iv) The linear maps \(Df_x : v \mapsto 2xv\) and \(Df_y : v \mapsto 2yv\) are equal if and only if \(2x=2y\) if and only if \(x = y\), so \(Df\) is injective.
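Equivalently, under the natural isomorphism \(\cL (\R ,\R ) \cong \R \) (sending a linear map to the single entry of its \(1 \times 1\) matrix), \(Df\) corresponds to the function
\[ U \to \R , \; x \mapsto 2x, \]
which is visibly injective.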
-
-
2. \(Df_{(x,y)}\) is represented by the matrix of partial derivatives
\[ \pmat {3x^2 & -2y \\ y & x} . \]
This is invertible if and only if the determinant \(3x^3 + 2y^2\) is non-zero, i.e., if and only if \(x \neq -\sqrt [3]{2y^2/3}\).
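For completeness, the determinant expands as
\[ \det \pmat {3x^2 & -2y \\ y & x} = 3x^2 \cdot x - (-2y) \cdot y = 3x^3 + 2y^2 . \]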
-
3. There are three things to check.
-
(i) For \(\phi \in \cL (V,W)\), we have \(\norm {\phi }_{op} \geq 0\) with equality if and only if \(\phi = 0\).
Since \(\norm {w}_W\) is non-negative for all \(w \in W\), the supremum \(\sup _v \norm {\phi (v)}_W\) is non-negative too. If the latter is \(0\), then \(\phi (v) = 0\) for all \(v \in V\) such that \(\norm {v}_V = 1\), and by linearity \(\phi \) must be \(0\).
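In more detail, if \(\sup _v \norm {\phi (v)}_W = 0\) then for any nonzero \(v \in V\),
\[ \phi (v) = \norm {v}_V \, \phi ( v/\norm {v}_V ) = 0 \]
since \(v/\norm {v}_V\) has unit norm; together with \(\phi (0) = 0\), this gives \(\phi = 0\).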
-
(ii) \(\norm {\lambda \phi }_{op} = |\lambda | \norm {\phi }_{op}\).
Indeed, \(\norm {\lambda \phi (v)}_W = |\lambda | \norm {\phi (v)}_W\) for every \(v\), so \(\sup _v \norm {\lambda \phi (v)}_W = \sup _v |\lambda | \norm {\phi (v)}_W = |\lambda | \sup _v \norm {\phi (v)}_W\).
-
(iii) \(\norm {\phi + \psi }_{op} \leq \norm {\phi }_{op} + \norm {\psi }_{op}\).
Indeed, by the triangle inequality in \(W\), \(\norm {\phi (v) + \psi (v)}_{W} \leq \norm {\phi (v)}_W + \norm {\psi (v)}_W\) for every \(v\), so \(\sup _v \norm {\phi (v) + \psi (v)}_{W} \leq \sup _v (\norm {\phi (v)}_W + \norm {\psi (v)}_W) \leq \sup _v \norm {\phi (v)}_W + \sup _v \norm {\psi (v)}_W\).
-
-
4.
-
(i) Let \(U := \R \setminus \{\sqrt [3]{2}\}\), which is open and contains \(S = \Q \) since \(\sqrt [3]{2}\) is irrational. Then \(F : U \to \R , \; x \mapsto \sqrt [3]{x^3-2}\) is smooth (the cube root is smooth away from \(0\), and \(x^3 - 2 \neq 0\) on \(U\)) and its restriction to \(S\) equals \(f\). Thus \(f\) is smooth.
-
(ii) Note that \(g(x,y) = x\) for any \((x,y) \in S\). Thus if we set \(G : \R ^2 \to \R , \; (x,y) \mapsto x\) then \(G\) is a smooth function whose restriction to \(S\) equals \(g\). Hence \(g\) is smooth.
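In more detail, for \((x,y) \in S\) we have \(y = x^2\) and hence \(\sqrt {y} = |x|\), so
\[ \sqrt {y} = |x| = x \textrm { if } x \geq 0 \quad \textrm {and} \quad -\sqrt {y} = -|x| = x \textrm { if } x \leq 0 , \]
i.e. \(g(x,y) = x\) in both cases.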
-
(iii) Suppose \(h\) is smooth. Then there is an open subset \(U \subseteq \mathbb {R}^2\) containing the origin, and a smooth function \(H : U \to \mathbb {R}\) whose restriction to \(U \cap S\) equals \(h\).
Let \(\varphi : \R \to \R ^2, \; t \mapsto (t,t^2)\), and let \(I := \varphi ^{-1}(U)\), an open subset of \(\R \) containing \(0\). Then the composition \(H \circ \varphi : I \to \R \) is smooth by the chain rule. But in fact \(H \circ \varphi (t) = |t|\), which is not smooth at \(t = 0 \in I\); this contradiction shows that \(h\) is not smooth.
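Explicitly, \(H \circ \varphi (t) = |t|\) because for \(t \in I\) the point \(\varphi (t) = (t,t^2)\) lies in \(U \cap S\), so
\[ H \circ \varphi (t) = H(t,t^2) = h(t,t^2) = \sqrt {t^2} = |t| . \]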
-
-
5. No: take \(U := \R \), and let \(f : \R \to \R ^2, \; \theta \mapsto (\cos \theta , \sin \theta )\). Then \(Df_{\theta }(v) = (-v\sin \theta , v\cos \theta )\), which is injective for all \(\theta \); but \(f(2\pi ) = f(0)\), so \(f\) is not injective. [An alternative counterexample is the map \(x \mapsto x^2\) from \(\R \setminus \{0\}\) to \(\R \).]
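To spell out the injectivity of \(Df_{\theta }\): if \(Df_{\theta }(v) = (-v\sin \theta , v\cos \theta ) = (0,0)\), then
\[ v^2 = v^2 (\sin ^2 \theta + \cos ^2 \theta ) = (v\sin \theta )^2 + (v\cos \theta )^2 = 0 , \]
so \(v = 0\).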
-
6.
-
(i) We check that \(Dm_{(A,B)}(X,Y) = XB + AY\):
\[ \frac {\norm {m(A+X, B+Y) - m(A,B) - (XB + AY)}}{\norm {(X,Y)}} = \frac {\norm {XY}}{\norm {(X,Y)}}. \]
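Here the equality holds because the numerator expands as
\[ (A+X)(B+Y) - AB - (XB + AY) = AB + AY + XB + XY - AB - XB - AY = XY . \]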
We are free to choose which norms we use: convenient choices are the operator norm on matrices (viewed as linear maps), because this satisfies \(\norm {XY}\leq \norm {X}\norm {Y}\), and, on pairs of matrices, the norm \(\norm {(X,Y)}=\max \{\norm {X},\norm {Y}\}\). Then the ratio above is \(\leq \min \{\norm {X},\norm {Y}\}\), which tends to \(0\) as \((X,Y) \to 0\).
-
(ii) Let \(\Delta : M_{n,n}(\mathbb {R}) \to M_{n,n}(\mathbb {R}) \times M_{n,n}(\mathbb {R})\) denote the diagonal map \(A \mapsto (A,A)\). Then \(s = m \circ \Delta \), so by the chain rule \(Ds_A(X) = Dm_{(A,A)}(X,X) = AX + XA\).
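Alternatively, (ii) can be checked directly from the definition: \(s(A+X) - s(A) - (AX + XA) = (A+X)^2 - A^2 - AX - XA = X^2\), and with the operator norm
\[ \frac {\norm {X^2}}{\norm {X}} \leq \frac {\norm {X}^2}{\norm {X}} = \norm {X} \to 0 \quad \textrm {as } X \to 0 , \]
so \(Ds_A(X) = AX + XA\), as claimed.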
-