-*- text -*-
######################################################################
## Filename: diary
## Description: Diary of MA20216 lectures 2023
##
## Author: Francis Burstall
## Modified at: Tue Dec 5 16:04:03 2023
## Modified by: Fran Burstall
######################################################################
Lecture 1: Introduction, administration and propaganda.
Chapter 1. Linear algebra: concepts and examples: defn of
vector space. Examples.
[Page 2: top]
Lecture 2: Familiar cases of F^I. Vector subspace: defn,
efficient characterisation and examples. Linearly
indep lists and spanning lists of vectors; bases; dimension.
Standard basis of F^I and examples.
[notes: end of 1.3.1, page 4]
Lecture 3: Useful facts: extension property of linearly
indep lists; dim of subspace. Linear maps: defn, kernel,
image, isomorphism. Examples from analysis. Set of linear
maps is vector space under pointwise addition/scalar
multiplication. The matrix of a linear map with respect to
bases on domain and codomain. Fancy explanation of linear
maps vs matrices, bases give linear isomorphism F^n to V.
[notes: page 6, middle]
Lecture 4: Extension by linearity. Rank-nullity: statement.
Application: if dim V=dim W, a linear map from V to W
injects if and only if surjects if and only if bijects.
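The rank-nullity statement can be checked numerically. A minimal sketch (hypothetical matrix, not from the notes): row-reduce over exact rationals, count pivots to get the rank, and recover the nullity as dim(domain) - rank.

```python
# Rank-nullity check for a linear map F^3 -> F^2 given by a matrix.
# Hypothetical example; exact Gaussian elimination, not the notes' proof.
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination over exact rationals."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(M[0])):
        pivot = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][col] / M[r][col]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 0, 1],
     [0, 1, 1]]          # a linear map F^3 -> F^2
n = len(A[0])            # dim of domain
rk = rank(A)             # dim of image
nullity = n - rk         # dim of kernel, by rank-nullity
print(rk, nullity)       # 2 1
```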
Chapter 2: Sums and quotients. Sum of subspaces: defn and
characterisation as minimal subspace containing the
summands.
[notes: end of section 2.1, page 9]
Lecture 5: Direct sums. Criterion for a sum to be direct.
Two summand case: complements. Many summand case.
Projections: defn, any two summand direct sum comes from
projections. Corollary: dimensions add for two summand
direct sums.
[notes: end of section 2.2.1, page 12]
Lecture 6: Inductively defining direct sums. Application:
dimensions add for arbitrary direct sums. Sum is direct iff
bases of summands concatenate to a basis of the sum.
Complements exist for subspaces of finite-diml
V. Application: Extension from a subspace.
[notes: end of section 2.2]
Lecture 7: Quotients: congruence modulo a subspace U; this
is an equivalence relation; equivalence classes are cosets;
example: fibres of linear maps; quotient space V/U is set of
cosets of U. V/U is a vector space for which quotient map
q:V->V/U is linear surjection with kernel U (statement and
proof that the addition/scalar multiplication are
well-defined).
[notes: page 16, middle]
Lecture 8: V/U is a vector space for which quotient map
q:V->V/U is linear surjection with kernel U (proof). Dim V/U
= dim V - dim U. How to think about quotients: V/U is
vector space with surjective linear map q:V -> V/U with
kernel U: this is all you need to know. First Isomorphism
Theorem.
Chapter 3: polynomials, operators and matrices.
Polynomials: definitions.
[notes: page 18, middle]
Lecture 9: Polynomials: addition and
multiplication. Evaluating polynomials to get
functions. Results from Algebra 1A: remainder theorem,
Fundamental Theorem of Algebra, roots and their
multiplicities. Linear operators and square matrices:
relation between these given a basis. Evaluation of
polynomials on operators and matrices. Evaluation of
polynomials on a fixed operator/matrix preserves addition
and multiplication (statement).
[notes: page 20, bottom]
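The statement that evaluation at a fixed matrix preserves addition and multiplication can be sketched concretely. A hypothetical 2x2 example (matrix and polynomials chosen for illustration): evaluate by Horner's rule and check (pq)(A) = p(A)q(A).

```python
# Evaluating polynomials on a 2x2 matrix, and checking that evaluation
# at a fixed matrix preserves multiplication: (p*q)(A) == p(A) q(A).
# Hand-rolled helpers; hypothetical example, not from the notes.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matadd(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

def scal(c, X):
    return [[c * X[i][j] for j in range(2)] for i in range(2)]

I = [[1, 0], [0, 1]]

def poly_eval(coeffs, A):
    """Evaluate p(A) by Horner's rule; coeffs are [a_n, ..., a_1, a_0]."""
    R = [[0, 0], [0, 0]]
    for c in coeffs:
        R = matadd(matmul(R, A), scal(c, I))
    return R

def poly_mul(p, q):
    """Product of two polynomials, coefficients highest degree first."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

A = [[2, 1], [0, 2]]
p = [1, -1]        # x - 1
q = [1, 0, -4]     # x^2 - 4
assert poly_eval(poly_mul(p, q), A) == matmul(poly_eval(p, A), poly_eval(q, A))
```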
Lecture 10: Evaluation of polynomials on a fixed
operator/matrix preserves addition and multiplication
(proof). Minimum polynomial of an operator: definition,
existence and examples. How to compute the minimum
polynomial.
[notes: page 22, bottom]
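One way to see a minimum polynomial computation in miniature (hypothetical matrix, not necessarily the lectures' worked example): for A = [[2,1],[0,2]], test the monic divisors of a known annihilating polynomial in increasing degree.

```python
# Minimum polynomial of A = [[2,1],[0,2]] by testing monic divisors of
# (x - 2)^2, smallest degree first. Hypothetical worked example.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [0, 2]]
Z = [[0, 0], [0, 0]]

# A - 2I is nonzero, so (x - 2) does not annihilate A ...
B = [[A[i][j] - (2 if i == j else 0) for j in range(2)] for i in range(2)]
assert B != Z
# ... but (A - 2I)^2 = 0, so the minimum polynomial is (x - 2)^2.
assert matmul(B, B) == Z
```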
Lecture 11: Minimum polynomial divides any polynomial on
which the operator vanishes. Revision of eigenvalues and
eigenvectors. Algebraic and geometric multiplicities. Any
eigenvector of an operator with eigenvalue lambda is an
eigenvector of any polynomial p of that operator, with
eigenvalue p(lambda).
Corollary: any eigenvalue is a root of the minimum
polynomial. Cayley-Hamilton theorem: statement and
discussion. Corollary: minimum polynomial divides
characteristic polynomial so that the roots of the minimum
polynomial are exactly the eigenvalues.
Lecture 12: Proof of Cayley-Hamilton. Use this to compute
minimum polynomial when we can factorise the characteristic
polynomial.
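For a quick sanity check of Cayley-Hamilton in the 2x2 case (hypothetical matrix, a numerical check rather than a proof): the characteristic polynomial is x^2 - tr(A)x + det(A), and substituting A gives the zero matrix.

```python
# Cayley-Hamilton for a 2x2 matrix: substituting A into its own
# characteristic polynomial x^2 - tr(A) x + det(A) gives zero.
# Hypothetical numerical check, not a proof.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]
tr = A[0][0] + A[1][1]                       # trace = 5
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # determinant = -2
A2 = matmul(A, A)
CH = [[A2[i][j] - tr * A[i][j] + (det if i == j else 0) for j in range(2)]
      for i in range(2)]
assert CH == [[0, 0], [0, 0]]
```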
New chapter: structure of linear operators. Setting the
scene: similarity problem, normal forms as solution to same.
Example of diagonalisable matrices. Main issue in
non-diagonalisable case is not enough eigenvectors to span.
[notes: end of section 4.1]
Lecture 13: Invariant subspaces: defn and examples. Direct
sum of linear operators on subspaces and direct sum of
matrices. Direct sum of linear operators is linear;
restricts to summands; has direct sum of matrices as matrix.
Any direct sum decomposition into phi-invariant subspaces
means that phi is the direct sum of its restrictions to the
summands.
[notes: end of Prop 4.3, page 30 middle]
Lecture 14: Properties of a direct sum of operators are
deduced from those of the summands. Operator is
diagonalisable if and only if domain is a direct sum of
eigenspaces. Kernels of powers of an operator increase,
images decrease.
[notes: end of Prop 4.6, page 32, middle]
Lecture 15: In finite-dimensional case, kernels/images of
powers stabilise. Fitting's Lemma. Generalised
eigenvectors/eigenspaces: definition. Generalised eigenspace
of f is an f-invariant subspace containing the eigenspace.
Generalised eigenspaces for distinct eigenvalues have
trivial intersection.
[notes, end of Lemma 4.10, page 34, top]
Lecture 16: Jordan decomposition. Defn of nilpotent
operator. Operator is nilpotent if and only if it has
strictly upper triangular matrix with respect to some
basis.
[notes: end of Prop 4.1, page 35]
Lecture 17: Corollaries: (1) algebraic multiplicity of
eigenvalue is dim of generalised eigenspace (2) generalised
eigenspace=ker(f-lambda id)^s where s is the multiplicity of
lambda as a root of the minimum polynomial. Example of
computing all this.
Jordan blocks: definition and basic properties.
[notes: page 37, bottom]
Lecture 18: Any nilpotent operator on a finite-dimensional
vector space has as matrix a direct sum of Jordan blocks
with respect to some basis.
[notes: end of proof of (4.17), page 39]
Lecture 19: How to count number of Jordan blocks of given
size. Minimum polynomial of nilpotent op. Jordan normal
form. Relation to minimum polynomial. Solution of
similarity problem for matrices. Examples and
computations.
[notes: page 41, middle]
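The block-counting recipe can be sketched in code: for a nilpotent N, the number of Jordan blocks of size at least k is rank(N^(k-1)) - rank(N^k). A hypothetical example with N the direct sum of J_3(0) and J_1(0):

```python
# Counting Jordan blocks of a nilpotent matrix from ranks of its powers:
# number of blocks of size >= k equals rank(N^(k-1)) - rank(N^k).
# Hypothetical example: N = J_3(0) (+) J_1(0), a 4x4 matrix.
from fractions import Fraction

def rank(rows):
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(M[0])):
        pivot = next((i for i in range(r, len(M)) if M[i][col] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][col] / M[r][col]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

N = [[0, 1, 0, 0],      # J_3(0) in the top-left corner,
     [0, 0, 1, 0],      # J_1(0) in the bottom-right corner
     [0, 0, 0, 0],
     [0, 0, 0, 0]]
powers = [[[1 if i == j else 0 for j in range(4)] for i in range(4)]]  # N^0
for _ in range(4):
    powers.append(matmul(powers[-1], N))
ranks = [rank(P) for P in powers]                    # [4, 2, 1, 0, 0]
at_least = [ranks[k - 1] - ranks[k] for k in range(1, 5)]
print(at_least)   # [2, 1, 1, 0]: one block of size 3 and one of size 1
```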
Lecture 20: More examples and computations. New chapter:
Duality. Defn of dual space. Examples of linear
functionals.
[notes: page 43, bottom]
Lecture 21: Dual basis of V^* dual to a basis of V. dim V=dim V^*.
Sufficiency Principle. All bases of V^* are dual bases.
Evaluation is a linear injection V to V^** and so an
isomorphism in the finite-dimensional case. Dual space as
set of linear equations. Solution sets: defn.
[notes: page 46, middle]
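In coordinates, the dual basis is easy to compute: if the basis vectors of F^2 are the columns of an invertible P, the dual basis functionals are the rows of P^{-1}, since row i of P^{-1} paired with column j of P is delta_ij. A hypothetical 2x2 example:

```python
# Dual basis in coordinates: basis vectors as columns of P, dual basis
# functionals as rows of P^{-1}. Hypothetical 2x2 example using the
# explicit inverse formula.
from fractions import Fraction

P = [[1, 1],
     [1, 2]]                       # basis vectors as columns
det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
Pinv = [[Fraction(P[1][1], det), Fraction(-P[0][1], det)],
        [Fraction(-P[1][0], det), Fraction(P[0][0], det)]]

def pair(functional, vector):
    """Evaluate a dual vector (row) on a vector (column)."""
    return sum(f * v for f, v in zip(functional, vector))

basis = [[P[0][j], P[1][j]] for j in range(2)]   # columns of P
dual = Pinv                                      # rows of P^{-1}
assert all(pair(dual[i], basis[j]) == (1 if i == j else 0)
           for i in range(2) for j in range(2))
```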
Lecture 22: Dimension of solution set. Criterion
for linear functionals to span V^* and application thereof.
Properties of solution sets under inclusion, sum and
intersection. Annihilators: defn, dimension and properties
under inclusion, sum and intersection. Relation between sol
and ann: mutually inverse when V finite-dimensional.
[notes: end of section 5.2, page 48 bottom]
Lecture 23: Transposes: definition and linearity. Examples
of transposes. Rant about categories and functors.
Transposes and matrices. Relation between kernels and images
of a map and its transpose.
Row-rank=column-rank. Application: f injects/surjects if and
only if f^T surjects/injects (for finite-dimensional domain
and target).
[notes: end of chapter 5]
Lecture 24: New chapter: bilinearity. Defn of bilinear map,
pairing and form. Examples. Matrices give bilinear maps on
F^m x F^n. Bilinear forms and matrices, change of basis
formula, congruent matrices. Symmetric bilinear forms: defn
and equivalence to symmetry of matrices.
[notes: page 54 top]
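The change of basis formula is one line of matrix algebra: if B is the matrix of the form in one basis and P the change-of-basis matrix, the matrix in the new basis is P^T B P. A hypothetical 2x2 check that congruence preserves symmetry:

```python
# Change of basis for a bilinear form: new matrix is P^T B P.
# Hypothetical 2x2 example; congruence preserves symmetry.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

B = [[1, 2], [2, 3]]          # symmetric bilinear form
P = [[1, 1], [0, 1]]          # invertible change of basis
Bp = matmul(transpose(P), matmul(B, P))
assert Bp == transpose(Bp)    # still symmetric
print(Bp)                     # [[1, 3], [3, 8]]
```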
Lecture 25: Radical and rank of a symmetric bilinear form.
Interpretation via the induced map V to V^*; the matrix of
this linear map. Examples. Diagonalisation theorem.
[notes: end of (6.5), page 56]
Lecture 26: Refinement of diagonalisation thm when F=R or C.
Matrix formulation. Alternative approach via spectral thm.
How to find a diagonalising basis. Signature of a symmetric
bilinear form. Statement of inertia theorem to compute
same.
[notes: page 58, theorem (6.7)]
Lecture 27: Proof of Inertia Theorem. Example of use.
Quadratic forms: definition and examples. Polarisation of a
quadratic form. How to compute the polarisation of a
quadratic form.
[notes: top of page 60]
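The polarisation recipe computes the symmetric bilinear form from its quadratic form: over a field of characteristic not 2, B(x,y) = (Q(x+y) - Q(x) - Q(y))/2. A hypothetical Q on F^2, rebuilding the Gram matrix entry by entry:

```python
# Polarisation: recover the symmetric bilinear form B from its
# quadratic form Q via B(x, y) = (Q(x + y) - Q(x) - Q(y)) / 2.
# Hypothetical Q; works over any field of characteristic not 2.
from fractions import Fraction

def Q(v):
    x, y = v
    return x * x + 4 * x * y + 3 * y * y   # quadratic form

def B(u, v):
    s = [u[0] + v[0], u[1] + v[1]]
    return Fraction(Q(s) - Q(u) - Q(v), 2)

e = [[1, 0], [0, 1]]                        # standard basis
gram = [[B(e[i], e[j]) for j in range(2)] for i in range(2)]
assert gram == [[1, 2], [2, 3]]             # Gram matrix of B
```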
Lecture 28: What the diagonalisation and inertia theorems mean
for quadratic forms over R and C: linear combinations of
squares. Example with alternative strategies to compute
rank and signature.
[notes: end of chapter 6]
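A worked example of the strategy over R (hypothetical form, not necessarily the lectures' example), computing rank and signature by completing the square:

```latex
% Hypothetical example: Q(x,y) = x^2 + 4xy + y^2 on R^2.
% Completing the square:
%   Q(x,y) = (x + 2y)^2 - 3y^2,
% so in the coordinates u = x + 2y, v = sqrt(3) y we get u^2 - v^2:
% rank 2, signature (1,1), i.e. signature 1 - 1 = 0.
\[
  Q(x,y) = x^2 + 4xy + y^2 = (x+2y)^2 - 3y^2 = u^2 - v^2,
  \qquad u = x + 2y,\; v = \sqrt{3}\,y .
\]
```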
THAT'S ALL, FOLKS!