Mathematics · Applied
MATH 2B
Introduction to Applied Linear Algebra

Six fundamental problems. Twenty-nine lessons. One truth: mathematics is a tool for thinking more clearly about the world — not a collection of formulas to memorize. We start with real problems that need these ideas. We derive the theory together. Then we use MATLAB to test our results against measured data. Every lesson connects to something real.

6 Fundamental Problems · 29 Lessons · Ungrading & Portfolio · 3 Learning Tracks · Jeff Anderson · ALAF · No Required Textbook Purchase
// The 6 Fundamental Problems
01
Nonsingular Linear Systems
NLSP · Ax = b, unique solution
02
General Linear Systems
GLSP · many or no solutions
03
Full-Rank Least Squares
FRLSP · best approximate solution
04
Standard Eigenvalue Problem
SEP · Av = λv, resonance
05
Singular Value Decomposition
SVD · data compression, images
06
Generalized Eigenvalue Problem
GEP · coupled systems, physics
Unit 01

Language of Linear Algebra

Before we can solve anything, we need a shared language. Sets, functions, vectors, matrices — not as symbols to memorize, but as tools you'll actually use. Each idea will feel abstract until you see the problem it was invented to solve. That's where we start: with the problem.

What You Already Know (Activate Before Lesson 0)

Linear algebra feels abstract at first. But you already have the intuitions:

  • You've added vectors as arrows on a graph in physics. The same idea applies to any number of dimensions.
  • You've solved two equations with two unknowns. A linear system is exactly that — but possibly with thousands of unknowns.
  • You've used a spreadsheet. A spreadsheet is a matrix. Matrix multiplication is just a compact way of doing all those cell formulas at once.

The goal is not to memorize procedures. The goal is to see the structure behind problems you already know — and then solve problems you couldn't solve before.

Lesson 0
Major Problems in Applied Linear Algebra
The Problem First

You have 500 unknowns and 500 equations describing an electrical network. Solving it by hand would take years. What mathematical structure could let a computer solve it in milliseconds?

What are the six fundamental problems? Why do they matter? Where do they appear in engineering, data science, and physics? This lesson gives you the map before we explore the territory.
Foundation ▶ Watch Playlist →
Name all 6 fundamental problems without looking. What does each acronym stand for?
Where does the NLSP appear in real life? Give two engineering examples.
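The 500-equation scenario above can be tested directly. Here is a minimal sketch in Python/NumPy (the course itself uses MATLAB, where `A\b` plays the role of `solve`): build a 500×500 nonsingular system and let the computer solve it in milliseconds. The diagonally dominant matrix is my own choice to guarantee a unique solution.

```python
import numpy as np

# Build a random 500x500 nonsingular linear system Ax = b.
rng = np.random.default_rng(0)
n = 500
A = rng.standard_normal((n, n)) + n * np.eye(n)  # strong diagonal keeps A nonsingular
x_true = rng.standard_normal(n)
b = A @ x_true                                   # manufacture b from a known solution

x = np.linalg.solve(A, b)          # LAPACK LU factorization with partial pivoting
residual = np.linalg.norm(A @ x - b)
print(residual)                    # tiny: 500 equations solved to rounding error
```

Solving the same system by hand-elimination would take roughly n³/3 ≈ 40 million arithmetic operations; the structure of the NLSP is what lets a computer do them reliably.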
Lessons 1–2
Set Theory & Relations and Functions
The Problem First

You need a precise, unambiguous way to say "all possible input-output pairs." Natural language is too vague. That's why mathematicians invented set notation — not for elegance, but for precision.

Sets, subsets, functions, domain, codomain. Injective, surjective, bijective. You'll learn this vocabulary because it appears everywhere in the course — in vector spaces, transformations, and eigenvalue problems.
▶ Playlist L1 →
// Used In
Database theory · Formal verification · Probability foundations
What is the difference between ∈ and ⊆? Give an example of each.
A function is bijective if and only if ___. Why does this guarantee an inverse?
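On a finite set, the vocabulary of Lessons 1–2 can be checked mechanically. A short sketch (the helper names are mine, not the course's), representing a function as a Python dict of input-output pairs:

```python
# Check injective / surjective / bijective for a function given as a dict.
def is_injective(f):
    # No two inputs share an output.
    return len(set(f.values())) == len(f)

def is_surjective(f, codomain):
    # Every element of the codomain is hit.
    return set(f.values()) == set(codomain)

def is_bijective(f, codomain):
    return is_injective(f) and is_surjective(f, codomain)

f = {1: "a", 2: "b", 3: "c"}   # a bijection from {1,2,3} onto {a,b,c}
g = {1: "a", 2: "a", 3: "b"}   # not injective: inputs 1 and 2 collide

print(is_bijective(f, {"a", "b", "c"}))   # True -> an inverse function exists
print(is_bijective(g, {"a", "b", "c"}))   # False -> no inverse
```

A bijection is exactly the condition that lets you reverse every arrow, which is why it guarantees an inverse.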
Lessons 3–5
Vectors, Arithmetic & Inner Products
Vectors as models of real things: prices, forces, pixel intensities. Vector arithmetic, norms, inner products. The geometry of dot products and orthogonality.
Core
// Used In
Machine learning · Recommendation systems · Signal processing
Write the definition of the Euclidean inner product. What does it measure geometrically?
State the Cauchy-Schwarz inequality. Prove it from the inner product definition.
Two vectors are orthogonal iff their inner product is ___. Why does this follow geometrically?
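The geometry in Lessons 3–5 is easy to probe numerically. A small sketch (NumPy standing in for MATLAB) with a pair of vectors chosen to be orthogonal:

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([4.0, -3.0])

dot = u @ v                       # Euclidean inner product: 3*4 + 4*(-3)
cs_ok = abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v)  # Cauchy-Schwarz

print(dot)     # 0.0 -> u and v are orthogonal (perpendicular arrows)
print(cs_ok)   # True: |<u,v>| never exceeds ||u|| ||v||
```

The inner product measures aligned length: zero means the vectors carry no component along each other.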
Lesson 6
Linear Combinations, Span & Linear Independence
Can any vector in the space be built from this set? Span and independence are the key questions of linear algebra. These ideas unlock everything that follows.
Core ▶ Playlist L6 →
// Used In
PCA (data science) · Image compression · Structural analysis
Give the formal definition of linear independence. What goes wrong if vectors are dependent?
Can the span of 3 vectors in ℝ⁵ be all of ℝ⁵? Explain why or why not.
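Both prompts above can be answered with one computation: stack the vectors as columns and take the rank. A sketch with three vectors in ℝ⁵ that I chose to be dependent (the third column equals twice the first plus the second):

```python
import numpy as np

# Three vectors in R^5, stacked as the columns of a 5x3 matrix.
V = np.array([[1, 0, 2],
              [0, 1, 1],
              [1, 1, 3],
              [0, 0, 0],
              [2, 1, 5]], dtype=float)

r = np.linalg.matrix_rank(V)
print(r)   # 2: col3 = 2*col1 + col2, so the set is linearly dependent
```

Since rank can never exceed the number of vectors, three vectors span at most a 3-dimensional subspace: the span of 3 vectors can never be all of ℝ⁵.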
Lessons 7–11
Matrices — Structure, Arithmetic & Multiplication
Matrices as linear transformations that rotate, scale, shear, and project. Four interpretations of Ax. Why matrix multiplication is not commutative and what that means.
Core
// Used In
Neural networks · Google PageRank · Computer graphics · GPS
Write out the four interpretations of Ax.
Prove that matrix multiplication is associative but not commutative. Give a counterexample.
What is the adjacency matrix of a graph? Build one for a 4-node network.
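Two of the prompts above have three-line demonstrations. A sketch showing non-commutativity with a shear pair, plus the adjacency matrix of a hypothetical 4-node path network 0–1–2–3:

```python
import numpy as np

A = np.array([[1, 1], [0, 1]], dtype=float)   # shear right
B = np.array([[1, 0], [1, 1]], dtype=float)   # shear down
print(np.array_equal(A @ B, B @ A))           # False: order of transformations matters

# Adjacency matrix of an undirected 4-node path 0-1-2-3.
G = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(np.linalg.matrix_power(G, 2)[0, 2])     # 1: one 2-step walk from node 0 to node 2
```

Powers of an adjacency matrix count walks, which is the first hint that matrix arithmetic encodes network structure.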
// Critical Lens
  • Who invented set theory notation, and why was Cantor's work rejected by the establishment?
  • Vectors are used to represent people in recommendation systems. What gets lost when a human is reduced to a list of numbers?
  • Facebook's social graph models relationships as matrix entries. Who decides what gets counted and what doesn't?
Unit 02

Solving Linear Systems (NLSP & GLSP)

The first two fundamental problems. Can we solve Ax = b? If yes, how many solutions? These questions have been asked since ancient China and Babylon — we now have complete answers.

Lessons 12–14
NLSP: Nonsingular Systems, Inverses & IMT
When Ax = b has exactly one solution. Row reduction, the Invertible Matrix Theorem — fifteen equivalent conditions for invertibility.
NLSP · Problem 1
// Used In
Circuit analysis · Supply chain · Cryptography
State the Invertible Matrix Theorem. List 5 equivalent conditions for A to be invertible.
Compute the inverse of a 2×2 matrix from memory. Why does the formula work?
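The 2×2 inverse formula from the second prompt can be verified against the library in a few lines. A sketch (with arbitrary entries of my own choosing):

```python
import numpy as np

# The 2x2 formula: inv([[a, b], [c, d]]) = (1/det) * [[d, -b], [-c, a]].
a, b, c, d = 2.0, 1.0, 5.0, 3.0
A = np.array([[a, b], [c, d]])
det = a * d - b * c                              # 1.0 here, so A is invertible
A_inv = np.array([[d, -b], [-c, a]]) / det

print(np.allclose(A_inv, np.linalg.inv(A)))      # True: formula matches the library
print(np.allclose(A @ A_inv, np.eye(2)))         # True: A times its inverse is I
```

The formula works because swapping the diagonal and negating the off-diagonal makes every cross-term in the product cancel, leaving det(A) on the diagonal.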
Lessons 15–16
LU Factorization & Determinants
LU as Gaussian elimination captured in matrix form. Determinants — geometric meaning (volume), computation (permutations), and connection to invertibility.
▶ Playlist L15 →
What does det(A) = 0 tell you geometrically? Algebraically?
Why is LU factorization more efficient than computing A⁻¹ when solving many systems?
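The "Gaussian elimination captured in matrix form" idea fits in a dozen lines. A sketch of Doolittle LU without pivoting (safe here because the example matrix is diagonally dominant; production codes pivot):

```python
import numpy as np

def lu_nopivot(A):
    """Doolittle LU without pivoting: record each elimination multiplier in L."""
    n = A.shape[0]
    L, U = np.eye(n), A.astype(float).copy()
    for k in range(n):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]     # the multiplier used to clear U[i, k]
            U[i, :] -= L[i, k] * U[k, :]    # the elimination row operation
    return L, U

A = np.array([[4.0, 1, 0], [1, 4, 1], [0, 1, 4]])
L, U = lu_nopivot(A)
print(np.allclose(L @ U, A))        # True: A = LU
print(np.prod(np.diag(U)))          # ~56 = det(A): product of U's pivots (no row swaps)
```

Once L and U are stored, each new right-hand side b costs only a forward and a back substitution (~n² operations), which is why LU beats recomputing A⁻¹ when solving many systems.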
Lessons 17–18
GLSP: General Systems & Parametric Solutions
When systems have no solution or infinitely many. RREF, free variables, the parametric solution space. Understanding what "infinite solutions" means structurally.
GLSP · Problem 2
// Used In
Chemical balancing · Network flow · Structural mechanics
A linear system can have 0, 1, or ∞ solutions. Prove it cannot have exactly 2.
What is a free variable? How do you identify them in RREF? Write the general solution.
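"Infinitely many solutions" has a concrete shape: one particular solution plus any multiple of a null-space direction. A sketch with a 2-equation, 3-unknown system of my own construction (row reduction gives x₃ = 1 and x₁ + 2x₂ = 3, with x₂ = t free):

```python
import numpy as np

A = np.array([[1.0, 2, 1],
              [2.0, 4, 3]])
b = np.array([4.0, 9])

xp = np.array([3.0, 0, 1])     # particular solution (free variable t = 0)
n  = np.array([-2.0, 1, 0])    # null-space direction: A @ n = 0

for t in (-1.0, 0.0, 2.5):
    x = xp + t * n
    print(np.allclose(A @ x, b))   # True for every t: a whole line of solutions
```

This is also why exactly 2 solutions is impossible: two distinct solutions differ by a nonzero null vector, and every point on the line through them then solves the system too.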
// Major Project
Electrify Linear-Systems Problem (ELSP)

Build and measure a real electrical circuit. Use Kirchhoff's laws to model it as a linear system. Solve with LU factorization. Compare mathematical predictions to measured values. Why do they differ?

Project Website →
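The modeling step of the project looks like this in miniature. A sketch with hypothetical component values (not the actual ELSP circuit): a two-mesh resistor network turned into Ax = b via Kirchhoff's voltage law.

```python
import numpy as np

# Hypothetical two-mesh circuit: 9 V source, R1 in mesh 1, R2 shared, R3 in mesh 2.
V, R1, R2, R3 = 9.0, 100.0, 220.0, 330.0

# Kirchhoff's voltage law around each mesh gives a 2x2 linear system.
A = np.array([[R1 + R2, -R2],
              [-R2,     R2 + R3]])
b = np.array([V, 0.0])

i = np.linalg.solve(A, b)            # mesh currents in amperes
print(i)                             # model's predicted currents
print(np.linalg.norm(A @ i - b))     # residual ~ 0: the math is exact...
```

...but the measured currents will not match exactly: resistor tolerances, meter accuracy, and wire resistance all live outside the model, which is precisely the question the project asks.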
Unit 03

Vector Spaces, Least Squares & Eigentheory

Lessons 19–21
Vector Spaces, Null & Column Spaces, Rank
Abstract vector spaces: one set of axioms realized in infinitely many settings. The four fundamental subspaces of a matrix. The Rank-Nullity Theorem.
Core
What are the 10 axioms of a vector space? Why can't you remove any of them?
State the Rank-Nullity Theorem and prove it using the four fundamental subspaces.
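Rank-Nullity can be watched in action on a concrete matrix. A sketch with a 3×4 matrix built to have rank 2 (its second row is twice its first):

```python
import numpy as np

A = np.array([[1.0, 2, 3, 4],
              [2.0, 4, 6, 8],    # = 2 * row 1, so it adds nothing to the row space
              [1.0, 0, 1, 0]])

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank        # Rank-Nullity: dim Nul(A) = n - rank(A)
s = np.linalg.svd(A, compute_uv=False)

print(rank, nullity)               # 2 2: they split the 4 columns between them
print(np.sum(s > 1e-10))           # 2: rank counted again from the singular values
```

Every input dimension is accounted for exactly once: it either survives into the column space or is crushed into the null space.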
Lessons 22–26
Least Squares, Projections & QR Factorization
When Ax = b has no solution, find the best approximation. Normal equations, orthogonal projections, Gram-Schmidt algorithm, QR as a clean numerical strategy.
FRLSP · Problem 3
// Used In
Linear regression · GPS location fitting · Image reconstruction
Derive the normal equations (AᵀAx̂ = Aᵀb) from first principles. What geometric fact does this rely on?
Walk through Gram-Schmidt for 3 vectors. What property do the output vectors have?
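The two solution routes named above — normal equations and QR — can be run side by side on random data. A sketch of the full-rank least squares setting:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 3))   # tall and (almost surely) full-rank: FRLSP
b = rng.standard_normal(20)        # generic b: Ax = b has no exact solution

x_normal = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations: A'A x = A'b
Q, R = np.linalg.qr(A)                         # A = QR with orthonormal Q columns
x_qr = np.linalg.solve(R, Q.T @ b)             # QR route: R x = Q'b

print(np.allclose(x_normal, x_qr))                 # True: same best approximation
print(np.allclose(A.T @ (b - A @ x_normal), 0))    # residual is orthogonal to Col(A)
```

The second check is the geometric fact the derivation relies on: the best approximation leaves a residual perpendicular to the column space. QR is preferred numerically because forming AᵀA squares the conditioning of the problem.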
Lessons 27–29
Eigenvalues, Eigenvectors & Diagonalization
The SEP: Av = λv. Resonant modes of physical systems. Characteristic polynomial. Diagonalization. Why PageRank is an eigenvalue problem.
SEP · Problem 4
// Used In
Google PageRank · Vibration analysis · Quantum mechanics · PCA
What does it mean for a vector to be an eigenvector? Describe it geometrically.
How do you find eigenvalues? Write out the procedure from det(A − λI) = 0.
Why is PageRank an eigenvalue problem? Write the defining equation.
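The PageRank prompt has a runnable answer. A sketch with a made-up 4-page link structure: the rank vector is the eigenvector of the column-stochastic link matrix with eigenvalue 1, found here by power iteration.

```python
import numpy as np

# Hypothetical 4-page web; column j holds the out-link weights of page j
# (each column sums to 1, so P is column-stochastic).
P = np.array([[0,   0,   1/2, 0  ],
              [1/3, 0,   0,   1/2],
              [1/3, 1/2, 0,   1/2],
              [1/3, 1/2, 1/2, 0  ]])

v = np.full(4, 0.25)            # start with equal rank everywhere
for _ in range(100):            # power iteration: repeatedly follow the links
    v = P @ v
    v /= v.sum()                # keep the ranks summing to 1

print(np.allclose(P @ v, v))    # True: P v = v, i.e. A v = lambda v with lambda = 1
```

Repeated multiplication kills every eigencomponent with |λ| < 1, so the iterate settles onto the dominant eigenvector — the defining equation Pv = v is the SEP with λ = 1.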
// Critical Lens — Units 2 & 3

How to Succeed in This Course

✍️
Derive Before You Look
Before reading notes on a theorem, try to prove it yourself. 20 minutes of honest struggle beats 2 hours of passive reading. Struggle is how the brain encodes structure.
📐
Draw Every Object
Vectors are arrows. Matrices are transformations. Subspaces are planes. Always draw a picture. If you can't visualize it, you don't yet understand it.
🔁
Teach One Concept Weekly
Explain a concept to a study partner or out loud to yourself. Teaching exposes every gap you didn't know you had. Jeff's "concept image" practice: describe the object, its properties, where it breaks.
🔗
Connect to the 6 Problems
Every lesson connects to at least one fundamental problem. Ask: which problem does this help solve? This transforms isolated facts into a coherent structure you can actually use.
📓
Keep a Living Portfolio
Your portfolio is not a final product — it's a living record. Update it weekly. Include failed attempts, revised proofs, honest reflections. Authenticity matters more than polish.
The 2-Minute Rule
Jeff Anderson's rule: if you've been stuck for 2 minutes, write down exactly what you're stuck on and ask. Asking good questions is the highest-level skill in mathematics.

How the Course Connects

Language Layer
  • Sets & Functions
  • Vectors & Norms
  • Linear Combinations
  • Span & Independence
Matrix Machinery
  • Matrix Arithmetic
  • Transformations
  • Determinants
  • LU & QR Factorizations
Linear Systems
  • NLSP (unique solution)
  • GLSP (0 or ∞ solutions)
  • FRLSP (least squares)
  • Normal equations
Abstract Spaces
  • Vector space axioms
  • Null & column spaces
  • Dimension & rank
  • Rank-nullity theorem
Spectral Theory
  • Eigenvalues & vectors
  • Diagonalization
  • Resonant modes
  • PageRank
Critical Thread
  • Who built this math?
  • Who benefits from it?
  • Who is harmed by it?
  • What should change?

Real Datasets to Work With

COMPAS Recidivism Data (ProPublica)
Apply least squares regression. See how predictions vary by race. What does it mean to optimize a model that affects who goes to prison?
Access Dataset →
ACS PUMS Census Income
Build regression models for income prediction. Which features matter? What does the least squares solution encode about society?
Access Dataset →
SNAP Network Datasets (Stanford)
Web crawls, social networks, citation graphs. Perfect for matrix modeling, PageRank, and eigenvalue applications.
Access Dataset →
Bay Area BART Ridership
Model ridership as a linear system. Identify patterns. Explore which communities are served by transit and which aren't.
Access Dataset →
ELSP Circuit Measurements
Your own measured data from the Electrify project. The most important dataset — because you collected it and know its uncertainty.
Project Site →
ImageNet Sample (PCA Practice)
Apply SVD to real image data. Compress images and watch quality degrade. Understand what the eigenvalue decomposition is capturing.
Access Dataset →
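Before touching real pixels, the compression mechanism can be seen on a synthetic stand-in. A sketch: truncate the SVD of a rank-5 "image" and watch the error vanish exactly when all 5 singular values are kept.

```python
import numpy as np

# Synthetic 64x64 "image" of exact rank 5 (a stand-in for real pixel data).
rng = np.random.default_rng(2)
img = rng.standard_normal((64, 5)) @ rng.standard_normal((5, 64))

U, s, Vt = np.linalg.svd(img, full_matrices=False)

def compress(k):
    # Keep only the k largest singular values and their vectors.
    return (U[:, :k] * s[:k]) @ Vt[:k]

err2 = np.linalg.norm(img - compress(2))   # rank-2 approximation: visible error
err5 = np.linalg.norm(img - compress(5))   # rank-5: exact recovery (to rounding)
print(err5 < 1e-9 < err2)                  # True
```

Real images are only approximately low-rank, so quality degrades gradually as k shrinks — the decomposition is capturing the dominant spatial patterns first.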

Learning Milestones

📐
Foundations Builder
Complete 3+ lesson units
Linear Systems Solver
Complete 6+ lesson units
λ
Spectral Theorist
Complete all 9 lesson units