DerivativeCalculus.com

Linear Algebra Tools Guide

Comprehensive computational methods for solving linear systems, matrix operations, eigenvalues, and numerical linear algebra.

🔧 Tool Categories Overview

Direct Methods

Gaussian elimination, LU decomposition

Iterative Methods

Jacobi, Gauss-Seidel, Conjugate Gradient

Matrix Factorization

QR, SVD, Cholesky, Eigen decomposition

Specialized Solvers

Sparse matrices, banded systems

🎯 Gaussian Elimination

Purpose: Solve Ax = b or find matrix inverse

Input: Augmented matrix [A|b]
1. Forward elimination (row echelon)
2. Back substitution
Output: Solution vector x
Time Complexity: O(n³)
Example System:
2x + y = 5
x - y = 1
Solution: x = 2, y = 1
def gaussian_elimination(A, b):
  n = len(A)
  for i in range(n):
    p = max(range(i, n), key=lambda r: abs(A[r][i]))  # partial pivoting
    A[i], A[p], b[i], b[p] = A[p], A[i], b[p], b[i]
    for r in range(i + 1, n):  # forward elimination below the pivot
      f = A[r][i] / A[i][i]
      A[r] = [a - f * c for a, c in zip(A[r], A[i])]
      b[r] -= f * b[i]
  x = [0.0] * n
  for i in range(n - 1, -1, -1):  # back substitution
    x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
  return x

When to use: Small to medium dense systems (n ≤ 1000) that are solved once
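The worked example above can be checked with NumPy (a quick sketch; `np.linalg.solve` performs an LU-based direct solve, the same family of method):

```python
import numpy as np

# The example system: 2x + y = 5, x - y = 1
A = np.array([[2.0, 1.0], [1.0, -1.0]])
b = np.array([5.0, 1.0])

x = np.linalg.solve(A, b)  # direct solve via LAPACK
print(x)  # → [2. 1.]
```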

🔀 LU Decomposition

Purpose: Factor A = LU, then solve multiple systems efficiently

A = L·U where:
L = lower triangular
U = upper triangular
Solve: Ly = b, then Ux = y
Decomposition: O(n³)
Solve after LU: O(n²)
Example:
A = [[2,1],[1,-1]]
L = [[1,0],[0.5,1]]
U = [[2,1],[0,-1.5]]
# Python using scipy
import numpy as np
from scipy.linalg import lu
A = np.array([[2.0, 1.0], [1.0, -1.0]])
P, L, U = lu(A)  # A = P @ L @ U; P is a permutation matrix

When to use: Solving Ax = b for multiple b vectors with same A
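The "factor once, solve many" pattern can be sketched with SciPy's `lu_factor`/`lu_solve`, which keep the O(n³) factorization and reuse it for each O(n²) solve:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[2.0, 1.0], [1.0, -1.0]])
lu, piv = lu_factor(A)  # O(n^3) factorization, done once

# Reuse the factorization for many right-hand sides: O(n^2) each
rhs_list = [np.array([5.0, 1.0]), np.array([3.0, 0.0])]
xs = [lu_solve((lu, piv), b) for b in rhs_list]
print(xs[0])  # → [2. 1.]
```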

📐 QR Factorization

Purpose: Solve least squares, eigenvalue problems, orthogonalization

A = Q·R where:
Q = orthogonal matrix (QᵀQ = I)
R = upper triangular
Methods: Gram-Schmidt, Householder, Givens
Householder QR: O(2n³/3)
Least Squares: Solve min ||Ax - b||
Compute: x = R⁻¹Qᵀb
More stable than normal equations
% MATLAB/Octave
[Q,R] = qr(A, 0);   % economy-size QR
x = R\(Q'*b);       % least squares solution

When to use: Least squares problems, ill-conditioned systems
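Of the three construction methods listed, Gram-Schmidt is the easiest to write down. Below is a modified Gram-Schmidt sketch (the function name `mgs_qr` is just an illustration label; in practice, Householder-based library routines are more stable):

```python
import numpy as np

def mgs_qr(A):
    """Modified Gram-Schmidt QR: a teaching sketch, not a production routine."""
    A = A.astype(float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ v     # project out already-built directions
            v = v - R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]         # normalize the remainder
    return Q, R

A = np.array([[2.0, 1.0], [1.0, -1.0]])
Q, R = mgs_qr(A)  # Q has orthonormal columns, R is upper triangular
```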

🌟 Singular Value Decomposition (SVD)

Purpose: Matrix analysis, rank determination, pseudoinverse, dimensionality reduction

A = UΣVᵀ where:
U, V = orthogonal matrices
Σ = diagonal of singular values σᵢ
Rank(A) = # of nonzero σᵢ
Full SVD: O(mn²) m≥n
Pseudoinverse: A⁺ = VΣ⁺Uᵀ
Σ⁺: replace each σᵢ > tolerance by 1/σᵢ, others by 0
A⁺b is the minimum-norm least-squares solution of Ax ≈ b
# Python numpy
import numpy as np
U, S, Vt = np.linalg.svd(A, full_matrices=False)
# S is a 1-D array of singular values, sorted largest first
A_pinv = Vt.T @ np.diag(1/S) @ U.T  # pseudoinverse, valid only if every S[i] > tolerance

When to use: Rank-deficient systems, data compression (PCA), image processing
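For a rank-deficient matrix the tolerance step is essential: only significant σᵢ are inverted. A minimal sketch, assuming NumPy and the common tolerance max(m,n)·ε·σ₁:

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]])  # rank 1 by construction
U, S, Vt = np.linalg.svd(A, full_matrices=False)

tol = max(A.shape) * np.finfo(float).eps * S.max()
rank = int((S > tol).sum())  # count only significant singular values
S_inv = np.array([1.0 / s if s > tol else 0.0 for s in S])
A_pinv = Vt.T @ np.diag(S_inv) @ U.T  # Moore-Penrose pseudoinverse
print(rank)  # → 1
```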

🎭 Eigenvalue Computation

Purpose: Find eigenvalues λ and eigenvectors v where Av = λv

Methods:
1. Power iteration (largest |λ|)
2. QR algorithm (all eigenvalues)
3. Lanczos (large sparse)
4. Jacobi (symmetric)
QR algorithm: O(n³) per iteration
Power Iteration:
vₖ₊₁ = Avₖ / ||Avₖ||
λₖ = vₖᵀAvₖ / (vₖᵀvₖ)  (Rayleigh quotient)
Converges to the dominant eigenvalue when |λ₁| > |λ₂|
# Power iteration in Python
import numpy as np
v = np.random.rand(A.shape[0])
for _ in range(max_iter):
  v = A @ v
  v = v / np.linalg.norm(v)
lambda_est = v @ A @ v  # Rayleigh quotient; v has unit norm

When to use: Stability analysis, principal component analysis, quantum mechanics
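The QR algorithm from the method list can be sketched in its simplest, unshifted form (the function name `qr_algorithm` is an illustration label; production eigensolvers add Hessenberg reduction and shifts for speed and reliability):

```python
import numpy as np

def qr_algorithm(A, iters=200):
    """Unshifted QR iteration: A_{k+1} = R_k Q_k is similar to A_k,
    so eigenvalues are preserved and accumulate on the diagonal."""
    Ak = A.astype(float)
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q  # similarity transform: Q.T @ Ak @ Q
    return np.sort(np.diag(Ak))  # real eigenvalues for symmetric A

A = np.array([[2.0, 1.0], [1.0, 3.0]])  # symmetric, eigenvalues (5 ∓ √5)/2
eigs = qr_algorithm(A)
```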

🌀 Iterative Solvers

Purpose: Solve large sparse systems Ax = b without forming A⁻¹

Krylov Methods:
• Conjugate Gradient (CG) - symmetric positive definite
• GMRES - general matrices
• BiCGSTAB - nonsymmetric
Based on Krylov subspace Kₖ(A,b)
CG per iteration: O(nnz)
nnz = nonzeros in A
Conjugate Gradient:
x₀ initial guess, r₀ = b - Ax₀
p₀ = r₀
αₖ = rₖᵀrₖ / pₖᵀApₖ
xₖ₊₁ = xₖ + αₖpₖ
rₖ₊₁ = rₖ - αₖApₖ
βₖ = rₖ₊₁ᵀrₖ₊₁ / rₖᵀrₖ
pₖ₊₁ = rₖ₊₁ + βₖpₖ

When to use: Large sparse systems (n > 10,000), PDE discretizations
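The CG update formulas above translate almost line for line into code. A minimal dense sketch for symmetric positive definite A (SciPy's sparse solvers add preconditioning and sparse matrix-vector products):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Plain CG for SPD A, following the update formulas above."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # beta_k = r_{k+1}ᵀr_{k+1} / r_kᵀr_k
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # SPD
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG terminates in at most n iterations; in practice it is stopped when the residual norm is small enough.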

📊 Linear Algebra Tools Comparison

Method | Best For | Complexity | Stability | Implementation
Gaussian Elimination | Small dense systems | O(n³) | Good with pivoting | Easy
LU Decomposition | Multiple RHS | O(n³) setup, O(n²) solve | Good with pivoting | Moderate
QR Factorization | Least squares, ill-conditioned | O(2n³/3) | Excellent | Moderate
SVD | Rank-deficient, analysis | O(mn²) | Excellent | Complex
Conjugate Gradient | Large sparse SPD | O(nnz) per iteration | Good with preconditioning | Moderate

💻 Software & Libraries

Python (NumPy/SciPy)

numpy.linalg, scipy.linalg, scipy.sparse

MATLAB/Octave

Built-in: \, inv, eig, svd, lu, qr

Julia

LinearAlgebra stdlib, fast JIT compilation

C++ (Eigen)

Template library, expression templates

R

base::solve, Matrix package for sparse

Fortran (LAPACK)

Industry standard, highly optimized

🎯 Practical Application Examples
1. Linear Regression

Solve (XᵀX)β = Xᵀy

Methods:
• Normal equations (XᵀX)⁻¹Xᵀy
• QR: β = R⁻¹Qᵀy
• SVD: β = VΣ⁺Uᵀy
2. Image Compression

Using SVD/PCA

A = UΣVᵀ ≈ UₖΣₖVₖᵀ
Keep k largest σᵢ
Compression ratio: k(m+n+1)/(mn)
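The truncated-SVD recipe above can be sketched with NumPy on a random matrix standing in for an image (for real images, apply it per color channel or to the grayscale array):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 80))  # stand-in for an image array
U, S, Vt = np.linalg.svd(A, full_matrices=False)

k = 20
A_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]  # best rank-k approximation
ratio = k * (A.shape[0] + A.shape[1] + 1) / (A.shape[0] * A.shape[1])
print(f"compression ratio ≈ {ratio:.2f}")
```

By the Eckart-Young theorem, the spectral-norm error of the rank-k approximation equals the first discarded singular value σₖ₊₁.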
3. Network Analysis

PageRank as eigenvector

r = αMr + (1-α)e/n
Solve: (I - αM)r = (1-α)e/n
Use power iteration
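The PageRank iteration r = αMr + (1-α)e/n can be run directly on a tiny link graph. A sketch, assuming M is column-stochastic (M[i, j] = 1/outdegree(j) if page j links to page i):

```python
import numpy as np

# 3-page link graph; each column sums to 1 (column-stochastic)
M = np.array([[0.0, 0.5, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 0.5, 0.0]])
n = 3
alpha = 0.85           # damping factor
r = np.ones(n) / n     # uniform start
for _ in range(100):
    r = alpha * M @ r + (1 - alpha) / n  # r = αMr + (1-α)e/n
print(r.sum())  # stays 1 because M is column-stochastic
```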
# Linear regression with QR in Python
import numpy as np
from scipy.linalg import qr

Q, R = qr(X, mode='economic')
beta = np.linalg.solve(R, Q.T @ y)
# More stable than XᵀX method

⚠️ Numerical Considerations

Condition Number

κ(A) = ||A||·||A⁻¹||. If κ(A) ≈ 10ᵏ, lose k digits of accuracy.

Pivoting

Essential for stability. Partial pivoting (row swaps) or complete pivoting.

Ill-conditioned Matrices

Hilbert matrix: Hᵢⱼ = 1/(i+j-1). κ(Hₙ) grows exponentially.
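The blow-up is easy to observe numerically; SciPy ships the matrix as `scipy.linalg.hilbert`:

```python
import numpy as np
from scipy.linalg import hilbert

# Condition number of the Hilbert matrix grows exponentially with n
for n in (4, 8, 12):
    print(n, np.linalg.cond(hilbert(n)))
```

By n = 12 the condition number is near 1/ε for double precision, so a computed solution may carry no correct digits at all.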

Sparse Storage

CSR, CSC formats for large sparse matrices to save memory.
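A small CSR example with SciPy shows the three underlying arrays (nonzero values, their column indices, and row pointer offsets):

```python
import numpy as np
from scipy.sparse import csr_matrix

# CSR stores only the nonzeros plus index bookkeeping
A = csr_matrix(np.array([[4.0, 0.0, 0.0],
                         [0.0, 0.0, 2.0],
                         [1.0, 0.0, 3.0]]))
print(A.nnz)      # 4 stored nonzeros instead of 9 dense entries
print(A.data)     # nonzero values, row by row
print(A.indices)  # column index of each nonzero
print(A.indptr)   # row pointer offsets: row i spans indptr[i]:indptr[i+1]
```

CSR makes row slicing and matrix-vector products fast; CSC is the column-oriented counterpart preferred by many sparse factorization routines.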

💡 Quick Decision Guide

Small dense A, one b

Use Gaussian elimination or A\b

Small dense A, many b

LU decomposition, then solve

Least squares

QR or SVD (more stable than normal equations)

Large sparse SPD

Conjugate Gradient with preconditioning

All eigenvalues

QR algorithm for small, Lanczos for large sparse

Matrix analysis

SVD for rank, condition number, pseudoinverse