Linear Algebra
Hermitian
- Conjugate transpose: $M^H = \overline{M^T} = \overline{M}^T$
- Hermitian: if $M^H = M$
- $(AB)^H = B^H A^H$
- $A^H A$ is always Hermitian
- Unitary matrix: if $M^H = M^{-1}$
- Hermitian positive definite matrix: $x^H M x > 0 \;\; \forall x \in \mathbb{C}^n \setminus \{0\}$
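A quick numerical check of the identities above (a minimal sketch using numpy; the random matrices are only illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

H = lambda M: M.conj().T  # conjugate transpose M^H

# (AB)^H = B^H A^H
assert np.allclose(H(A @ B), H(B) @ H(A))

# A^H A is always Hermitian
M = H(A) @ A
assert np.allclose(H(M), M)

# ... and Hermitian positive definite when A is nonsingular: x^H M x > 0 for nonzero x
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
assert (H(x) @ M @ x).real > 0

# A unitary matrix satisfies Q^H = Q^{-1}; the Q factor of a QR factorisation is unitary
Q, _ = np.linalg.qr(A)
assert np.allclose(H(Q) @ Q, np.eye(3))
```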
Vector/Matrix Norms
Vector Norm
- Vector $l_p$ norms: $\|x\|_p = \left( \sum_i |x_i|^p \right)^{1/p}$
- Infinity norm: take the maximum entry of the vector, $\|x\|_\infty = \max_i |x_i|$, which is also known as the maximum norm.
- Norm induced by a (Hermitian positive definite) matrix $A$: $\|x\|_A^2 = x^H A x$
- Properties (verified in the sketch after this list):
  - Linearity (absolute homogeneity): $\|kx\| = |k|\,\|x\|$
  - Triangle inequality: $\|x + y\| \le \|x\| + \|y\|$
  - Submultiplicativity: $\|xy\| \le \|x\|\,\|y\|$
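The vector-norm definitions and properties above can be checked directly (a minimal sketch; the vectors, the choice $p = 3$, and the diagonal matrix $A$ are arbitrary examples):

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])
y = np.array([1.0, 2.0, -2.0])
p = 3

# l_p norm from the definition vs numpy's built-in
assert np.isclose((np.abs(x) ** p).sum() ** (1.0 / p), np.linalg.norm(x, p))

# Infinity norm: the maximum absolute entry
assert np.isclose(np.abs(x).max(), np.linalg.norm(x, np.inf))

# A-norm squared x^H A x, with A Hermitian positive definite
A = np.diag([1.0, 2.0, 3.0])
assert x @ A @ x > 0

# Linearity and triangle inequality (shown here for the l_2 norm)
assert np.isclose(np.linalg.norm(-2.5 * x), 2.5 * np.linalg.norm(x))
assert np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y)
```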
Matrix Norm
- Operator norms: $\|A\| = \max_{x \in \mathbb{C}^n \setminus \{0\}} \dfrac{\|Ax\|}{\|x\|}$
- This norm measures the maximum factor by which the matrix $A$ can rescale a vector $x$ (a numerical check follows this list)
- 1-norm: $\|A\|_1 = \max_j \sum_{i=1}^n |a_{ij}|$, which is the $l_1$ norm of the column of $A$ with the maximum $l_1$ norm
- $\infty$-norm: $\|A\|_\infty = \max_i \sum_{j=1}^n |a_{ij}|$, which is the $l_1$ norm of the row of $A$ with the maximum $l_1$ norm
- 2-norm: $\|A\|_2 = \sqrt{\lambda_{\max}(A^H A)}$
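A quick numerical sanity check of these induced-norm formulas (a minimal sketch; the matrix `A` is an arbitrary example):

```python
import numpy as np

# Arbitrary example matrix (complex, to exercise the Hermitian transpose).
A = np.array([[1 + 2j, 3],
              [0, 4 - 1j]])

# 1-norm: maximum absolute column sum.
norm_1 = np.abs(A).sum(axis=0).max()

# Infinity-norm: maximum absolute row sum.
norm_inf = np.abs(A).sum(axis=1).max()

# 2-norm: square root of the largest eigenvalue of A^H A
# (equivalently, the largest singular value of A).
norm_2 = np.sqrt(np.linalg.eigvalsh(A.conj().T @ A).max())

# Cross-check against numpy's built-in induced norms.
assert np.isclose(norm_1, np.linalg.norm(A, 1))
assert np.isclose(norm_inf, np.linalg.norm(A, np.inf))
assert np.isclose(norm_2, np.linalg.norm(A, 2))
```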
Condition number
- $\kappa(A) = \|A\|\,\|A^{-1}\|$
- For the 2-norm: $\kappa_2(A) = \dfrac{\sqrt{\lambda_{\max}(A^H A)}}{\sqrt{\lambda_{\min}(A^H A)}}$, which is the maximum singular value divided by the minimum singular value
- Since the eigenvalues of $A^{-1}$ are the reciprocals of the eigenvalues of $A$, if $A$ is Hermitian then: $\kappa_2(A) = \dfrac{|\lambda(A)|_{\max}}{|\lambda(A)|_{\min}}$
- A matrix with a large condition number is ill-conditioned, which leads to unstable computation: small errors in the input can lead to large errors in the computed result (see the sketch below).
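To see the effect of ill-conditioning, the sketch below perturbs the right-hand side of a linear system built from the 5×5 Hilbert matrix (a standard ill-conditioned example, chosen here purely for illustration):

```python
import numpy as np

# 5x5 Hilbert matrix: H[i, j] = 1 / (i + j + 1).
n = 5
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])

print(np.linalg.cond(A, 2))  # kappa_2(A) is on the order of 1e5

# Solve A x = b for a known solution, then perturb b slightly and solve again.
x_true = np.ones(n)
b = A @ x_true
b_perturbed = b + 1e-8 * np.random.default_rng(0).standard_normal(n)
x_perturbed = np.linalg.solve(A, b_perturbed)

# The relative error in x can be amplified by up to kappa(A)
# relative to the (tiny) relative error introduced in b.
print(np.linalg.norm(x_perturbed - x_true) / np.linalg.norm(x_true))
```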
Iterative Methods for Linear Systems
Optimisation
- Linear programming can be solved using the simplex algorithm (illustrated below).
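As a minimal illustration (the LP data below is made up; scipy's `linprog` with the HiGHS backend is used as the solver, which includes a dual-simplex implementation):

```python
import numpy as np
from scipy.optimize import linprog

# Example LP:  maximise x1 + 2*x2  subject to  x1 + x2 <= 4,  x1 + 3*x2 <= 6,  x >= 0.
# linprog minimises, so we minimise the negated objective.
c = np.array([-1.0, -2.0])
A_ub = np.array([[1.0, 1.0],
                 [1.0, 3.0]])
b_ub = np.array([4.0, 6.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")

print(res.x)    # optimal vertex, here [3., 1.]
print(-res.fun) # optimal objective value, here 5.0
```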