Linear Algebra Core Concepts

Essential linear algebra knowledge for machine learning

#Mathematics #Linear Algebra #Fundamentals


Vector

A vector is a quantity with magnitude and direction, used to represent features in machine learning.

Notation:

2D Vector: v = [x, y]
3D Vector: v = [x, y, z]
n-Dimensional Vector: v = [v₁, v₂, ..., vₙ]

Vector Operations:

Addition
u + v = [u₁ + v₁, u₂ + v₂, ..., uₙ + vₙ]

Scalar Multiplication

αv = [αv₁, αv₂, …, αvₙ]

Dot Product (Inner Product)

u · v = Σ uᵢvᵢ = |u||v|cos(θ)

Norm (Length)

||v|| = √(v₁² + v₂² + … + vₙ²)
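The four operations above map directly onto NumPy; a minimal sketch (the array values are illustrative):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

add = u + v                # element-wise addition: [5, 7, 9]
scaled = 2.0 * v           # scalar multiplication: [8, 10, 12]
dot = np.dot(u, v)         # 1*4 + 2*5 + 3*6 = 32
norm = np.linalg.norm(v)   # √(4² + 5² + 6²) = √77

# recover the angle between u and v from u·v = |u||v|cos(θ)
cos_theta = dot / (np.linalg.norm(u) * norm)
```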

Matrix

A matrix generalizes vectors to two dimensions and is used to represent linear transformations and datasets.

Basic Operations:

Matrix Addition (same dimensions)
(A + B)ᵢⱼ = Aᵢⱼ + Bᵢⱼ

Matrix Multiplication

C = AB → Cᵢⱼ = Σₖ AᵢₖBₖⱼ (the number of columns of A must equal the number of rows of B)

Transpose

(Aᵀ)ᵢⱼ = Aⱼᵢ

Inverse Matrix (square and invertible)

AA⁻¹ = A⁻¹A = I
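These matrix operations in NumPy, as a sketch with illustrative values:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

C = A @ B                  # matrix product: C[i,j] = Σₖ A[i,k] * B[k,j]
At = A.T                   # transpose: At[i,j] = A[j,i]
A_inv = np.linalg.inv(A)   # inverse; A must be square and non-singular

# AA⁻¹ recovers the identity, up to floating-point error
I = A @ A_inv
```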

Special Matrices

| Type | Definition | Example |
| --- | --- | --- |
| Identity Matrix | 1s on diagonal, 0s elsewhere | I = [[1,0],[0,1]] |
| Diagonal Matrix | Non-diagonal elements are 0 | D = diag(d₁,d₂,…,dₙ) |
| Symmetric Matrix | A = Aᵀ | Covariance matrix |
| Orthogonal Matrix | QᵀQ = QQᵀ = I | Rotation matrix |
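As a quick check of the orthogonal-matrix row, a 2D rotation matrix satisfies QᵀQ = I (a sketch; the angle is arbitrary):

```python
import numpy as np

theta = np.pi / 4  # illustrative rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # 2D rotation matrix

# orthogonality: QᵀQ = I, so rotation preserves lengths and angles
orth_ok = np.allclose(Q.T @ Q, np.eye(2))
```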

Determinant

The determinant is a scalar value of a square matrix, representing the scaling factor of a linear transformation.

2×2 Determinant:

|A| = |a  b|
      |c  d| = ad - bc

Properties:

  • |AB| = |A||B|
  • |Aᵀ| = |A|
  • |A⁻¹| = 1/|A|
  • If |A| = 0, then A is not invertible
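The 2×2 formula and the product rule |AB| = |A||B| can be verified numerically (a sketch with illustrative matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[2.0, 0.0], [1.0, 1.0]])

det_A = np.linalg.det(A)   # ad - bc = 1*4 - 2*3 = -2

# product rule: |AB| = |A||B|
prod_ok = np.isclose(np.linalg.det(A @ B), det_A * np.linalg.det(B))
```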

Eigenvalues and Eigenvectors

For a square matrix A, if there exists a non-zero vector v and scalar λ satisfying:

Av = λv

then λ is called an eigenvalue of A and v the corresponding eigenvector.

Solution Steps:

  1. Solve the characteristic equation: |A - λI| = 0
  2. For each λ, solve (A - λI)v = 0 to get the eigenvector
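In practice these steps are handled by a numerical solver; a sketch with an illustrative symmetric matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

# np.linalg.eig returns eigenvalues and eigenvectors (as columns)
eigvals, eigvecs = np.linalg.eig(A)

# verify Av = λv for the first eigenpair
v = eigvecs[:, 0]
lam = eigvals[0]
pair_ok = np.allclose(A @ v, lam * v)
```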

Applications:

  • PCA (Principal Component Analysis)
  • Matrix diagonalization
  • Stability analysis

Matrix Decomposition

Eigen Decomposition:

A = PDP⁻¹

where D is the diagonal matrix of eigenvalues and the columns of P are the corresponding eigenvectors (this requires A to be diagonalizable)
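Reconstructing A from its eigendecomposition, as a sketch with an illustrative matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors
D = np.diag(eigvals)            # diagonal matrix of eigenvalues

# reconstruction: A = P D P⁻¹
A_rebuilt = P @ D @ np.linalg.inv(P)
```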

Singular Value Decomposition (SVD):

A = UΣVᵀ

where U and V are orthogonal matrices and Σ is a diagonal matrix of non-negative singular values; unlike eigendecomposition, SVD exists for any matrix, including non-square ones
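The SVD and its reconstruction in NumPy, as a sketch with an illustrative non-square matrix; keeping only the largest singular values is the core of SVD-based dimensionality reduction and compression:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])  # 3×2, illustrative

# thin SVD: U is 3×2, s holds the singular values, Vt is 2×2
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# reconstruction: A = U Σ Vᵀ (scaling columns of U by s is equivalent to U @ diag(s))
A_rebuilt = (U * s) @ Vt

# rank-1 approximation: keep only the largest singular value
A_rank1 = s[0] * np.outer(U[:, 0], Vt[0, :])
```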

Applications:

  • Data dimensionality reduction
  • Recommendation systems
  • Image compression