orthonormal basis
(Both orthogonal and normalized.)
- The length ||x|| = sqrt(x^T*x).
- x^T*y = 0 (perpendicular vectors).
- Linearly independent vectors ==> can be made into perpendicular vectors (Gram-Schmidt).
Column space is orthogonal to the left nullspace (in R^m).
Row space is orthogonal to the nullspace (in R^n).
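A quick numpy check of these facts (a sketch: the vectors and the single Gram-Schmidt step below are my own illustration, not from the book):

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = np.array([2.0, 1.0, -2.0])

# length ||x|| = sqrt(x^T x)
print(np.linalg.norm(x))            # 3.0
# x^T y = 0  ==> perpendicular vectors
print(x @ y)                        # 0.0

# independent vectors can be made perpendicular (one Gram-Schmidt step):
a = np.array([1.0, 1.0, 0.0])
b = np.array([1.0, 0.0, 1.0])       # independent of a, but not perpendicular
b_perp = b - (a @ b) / (a @ a) * a  # subtract the component of b along a
print(a @ b_perp)                   # ~0: now perpendicular to a
```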
2.3 Linear Independence, Basis, and Dimension
rank = # of independent rows
independent ==> all weights = 0 is the only way to produce the zero vector.
N(A) = {zero vector} <==> the columns of A are independent (checked numerically below).
echelon matrix U/reduced matrix R:
- nonzero rows must be independent
- pivot columns are linearly independent.
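A minimal numpy check of independence via the rank (the 3x3 matrix A is my own example; np.linalg.matrix_rank plays the role of counting pivots):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],    # row 2 = 2 * row 1 -> dependent rows
              [1.0, 0.0, 1.0]])

r = np.linalg.matrix_rank(A)
print(r)                 # 2 = number of independent rows (and columns)

# columns are independent  <==>  N(A) = {0}  <==>  rank == number of columns
print(r == A.shape[1])   # False here: the columns are dependent
```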
2G (p94)
2H
The dimension of the column space is the rank of the matrix.
Row space: (dim r), a subspace of R^n
- The nonzero rows of U (or R) are a basis; the row space has dimension r.
- A, U, and R all have the same row space (elimination does not change it!).
Nullspace: (dim n-r), a subspace of R^n
- The "special solutions" are a basis: set each free variable = 1 in turn.
- Ax = 0, Ux = 0, Rx = 0 have the same solutions (same nullspace).
Column space: (dim r), a subspace of R^m
- basis = the pivot columns of A itself (the pivots of U/R tell you which columns).
row rank = column rank (checked numerically below)
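A numpy sketch of the dimension count for the four subspaces (the 3x4 matrix below is my own example with one dependent row):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 2.0],
              [1.0, 2.0, 1.0, 3.0]])   # row 3 = row 1 + row 2
m, n = A.shape
r = np.linalg.matrix_rank(A)

print(r)                                # 2 : dim of row space = dim of column space
print(n - r)                            # 2 : dim of nullspace N(A)
print(m - r)                            # 1 : dim of left nullspace N(A^T)

# row rank = column rank
print(np.linalg.matrix_rank(A.T) == r)  # True
```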
2.4 The Four Fundamental Subspaces
Existence of inverse
- an inverse exists only when the rank is as large as possible: a right inverse needs r = m, a left inverse needs r = n.
- only a square matrix can have a two-sided inverse: BA = I and AC = I (then B = C = A^{-1}).
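A minimal numpy sketch of a one-sided inverse, using a 3x2 matrix of my own with full column rank (r = n), so the left inverse B = (A^T A)^{-1} A^T exists but no right inverse does:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])             # 3x2, rank 2 = n -> left inverse, no right inverse

B = np.linalg.inv(A.T @ A) @ A.T       # left inverse: B A = I (2x2)
print(np.allclose(B @ A, np.eye(2)))   # True
print(np.allclose(A @ B, np.eye(3)))   # False: A B is only a projection, not I
```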
2.6 Linear Transformations
Rotation
- inverse, square, product follow the trig (angle-addition) rules: Q(θ)^-1 = Q(-θ), Q(θ)^2 = Q(2θ), Q(θ1)*Q(θ2) = Q(θ1+θ2).
Projection
- P^2 = P
Reflection
- two reflections bring back the original: H^2 = I.
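Quick numerical checks of these three transformations (a sketch: the angle values, the line through u used for projection/reflection, and the formulas P = u*u^T / (u^T*u), H = 2P - I are my own illustrative choices):

```python
import numpy as np

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# product of rotations follows the angle-addition rule: Q(a) Q(b) = Q(a+b)
a, b = 0.3, 1.1
print(np.allclose(rotation(a) @ rotation(b), rotation(a + b)))   # True

# projection onto the line through u: P = u u^T / (u^T u), and P^2 = P
u = np.array([[1.0], [2.0]])
P = (u @ u.T) / (u.T @ u)
print(np.allclose(P @ P, P))                                     # True

# reflection across that line: H = 2P - I, and H^2 = I
H = 2 * P - np.eye(2)
print(np.allclose(H @ H, np.eye(2)))                             # True
```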
Orthogonality
3.1 Orthogonal Vectors and Subspaces
The fundamental subspaces meet at right angles.
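A numerical sketch of those right angles, assuming a small rank-1 matrix of my own and a nullspace basis computed from the SVD (not from the special solutions):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # rank 1

def nullspace(M, tol=1e-10):
    # basis of N(M) from the SVD: right singular vectors with ~zero singular value
    _, s, vh = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return vh[rank:].T

N  = nullspace(A)      # nullspace of A   (orthogonal to the row space)
LN = nullspace(A.T)    # left nullspace   (orthogonal to the column space)

print(np.allclose(A @ N, 0))     # True: every row of A is perpendicular to N(A)
print(np.allclose(A.T @ LN, 0))  # True: every column of A is perpendicular to N(A^T)
```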
3.2 Cosine and Projection onto Lines
3.3 Projections and Least Squares
Gaussian elimination fails when there are more equations than unknowns: Ax = b usually has no exact solution.
So we use least squares:
E^2 = ||ax - b||^2 (squared error, one unknown x)
The minimum error is at the lowest point of E^2, where the derivative is zero.
[calculus]: d(E^2)/dx = 0
[geometry]: the error b - x̂a is perpendicular to a ==> x̂ = a^T*b / a^T*a
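The one-unknown formula checked with numpy (the vectors a and b below are illustrative, not the book's):

```python
import numpy as np

a = np.array([1.0, 1.0, 1.0])
b = np.array([1.0, 2.0, 4.0])

x_hat = (a @ b) / (a @ a)   # least-squares solution of a*x ~ b
p = x_hat * a               # projection of b onto the line through a

print(x_hat)                # 7/3
print(a @ (b - p))          # ~0: the error b - p is perpendicular to a
```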
Least-squares problem with several variables.
Project b onto a subspace (rather than onto a line).
Find the least-squares solution x̂: the normal equations A^T*A*x̂ = A^T*b.
p = A*x̂ = the projection of b onto the column space.
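A sketch of the normal equations in numpy, fitting a straight line C + D*t to three data points of my own choosing (the same shape of problem as in 3.3, but not the book's numbers):

```python
import numpy as np

# fit b ~ C + D*t through (t, b) = (0, 1), (1, 2), (2, 4): more equations than unknowns
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 4.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations A^T A x_hat = A^T b
p = A @ x_hat                               # projection of b onto the column space

print(x_hat)                                # [C, D] ~ [0.833, 1.5]
print(np.allclose(A.T @ (b - p), 0))        # True: error is perpendicular to every column
# same answer from the library least-squares routine:
print(np.linalg.lstsq(A, b, rcond=None)[0])
```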
3.4 Orthogonal Bases and Gram-Schmidt
orthogonal: perpendicular (x^T*y = 0).
orthonormal: orthogonal and normalized (unit length).
Q (square or rectangular) has orthonormal columns
==> Q^T*Q = I
(a square Q with orthonormal columns is an orthogonal matrix: inverse = transpose)
Only a square matrix has an inverse. Any permutation matrix is an orthogonal matrix!
(its columns are unit vectors and mutually orthogonal)
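A numerical check of Q^T*Q = I, using numpy's QR factorization to manufacture a rectangular Q with orthonormal columns (the starting matrix and the 3x3 permutation are my own examples):

```python
import numpy as np

# rectangular Q with orthonormal columns, from QR of an arbitrary 4x2 matrix
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0],
              [2.0, 1.0]])
Q, R = np.linalg.qr(A)                      # Q is 4x2 with orthonormal columns
print(np.allclose(Q.T @ Q, np.eye(2)))      # True, even though Q is rectangular

# a permutation matrix is a (square) orthogonal matrix: inverse = transpose
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])
print(np.allclose(P.T @ P, np.eye(3)))      # True
print(np.allclose(np.linalg.inv(P), P.T))   # True
```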