The outline for the current offering will be updated below as the quarter progresses.
Spring 2026 (Approximate Outline)
Detection and Hypothesis Testing
Mean and conditional expectation, variance, covariance, correlation, and moment generating functions
Random vectors, covariance matrices, and linear transformations
Inequalities: Markov, Chebyshev, Hoeffding, and general concentration inequalities
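As a quick illustration of the inequalities listed above, here is a minimal Monte Carlo sketch (all constants illustrative) checking that Chebyshev's bound P(|X - mu| >= k*sigma) <= 1/k^2 holds for a Uniform(0, 1) sample:

```python
import random

# Chebyshev sketch: P(|X - mu| >= k*sigma) <= 1/k**2.
# Checked empirically for X ~ Uniform(0, 1): mu = 0.5, sigma^2 = 1/12.
random.seed(0)
mu, sigma = 0.5, (1 / 12) ** 0.5
k = 1.5                                   # illustrative choice of k
n = 100_000
samples = [random.random() for _ in range(n)]
freq = sum(abs(x - mu) >= k * sigma for x in samples) / n
bound = 1 / k ** 2                        # Chebyshev upper bound, ~0.444
print(f"empirical tail: {freq:.4f}  Chebyshev bound: {bound:.4f}")
```

The gap between the empirical tail (about 0.134 here) and the bound (about 0.444) shows how loose Chebyshev can be; the sharper Hoeffding-type bounds covered later close much of this gap for bounded variables.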
MSE estimation; linear estimation and the orthogonality principle
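The orthogonality principle named above can be seen numerically: the best linear estimate of Y from X leaves an error uncorrelated with the observation. A minimal sketch (the linear model Y = 2X + noise is illustrative, not from the course):

```python
import random

# Orthogonality principle sketch: the linear MMSE estimate of Y from X,
#   Yhat = mu_Y + (Cov(X, Y) / Var(X)) * (X - mu_X),
# leaves an error Y - Yhat uncorrelated with the observation X.
random.seed(1)
n = 50_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [2 * x + random.gauss(0, 0.5) for x in xs]   # illustrative model Y = 2X + noise

mean = lambda v: sum(v) / len(v)
mx, my = mean(xs), mean(ys)
var_x = mean([(x - mx) ** 2 for x in xs])
cov_xy = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
a = cov_xy / var_x                                # estimator slope, ~2 here

errs = [y - (my + a * (x - mx)) for x, y in zip(xs, ys)]
err_dot_x = mean([e * x for e, x in zip(errs, xs)])   # ~0 by orthogonality
print(f"slope ~ {a:.3f}, E[error * X] ~ {err_dot_x:.2e}")
```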
Covariance matrices: whitening and coloring, Gaussian random vectors, vector detection and MSE estimation; Kalman filtering and innovations
Convergence and limit theorems: Law of Large Numbers, Central Limit Theorem, and applications
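The Central Limit Theorem listed above is easy to verify by simulation: standardized sums of IID Uniform(0, 1) variables should match the standard normal CDF. A minimal sketch (sample sizes are illustrative):

```python
import random, math

# CLT sketch: standardized sums of n IID Uniform(0, 1) samples approach N(0, 1).
# We compare the empirical P(Z <= 1) against Phi(1) ~ 0.8413.
random.seed(2)
n, trials = 30, 20_000
mu, sigma = 0.5, math.sqrt(1 / 12)

def standardized_sum():
    s = sum(random.random() for _ in range(n))
    return (s - n * mu) / (sigma * math.sqrt(n))

frac = sum(standardized_sum() <= 1.0 for _ in range(trials)) / trials
phi1 = 0.5 * (1 + math.erf(1 / math.sqrt(2)))     # standard normal CDF at 1
print(f"empirical P(Z <= 1): {frac:.4f}  Phi(1): {phi1:.4f}")
```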
Random processes: definition and examples of discrete and continuous random processes; IID processes, random walk, independent increment processes, Poisson process, Gaussian random processes, stationarity, autocorrelation function, and power spectral density
White noise, bandlimited processes, and response of linear systems to random inputs
Linear filtering; infinite smoothing, causal estimation, spectral factorization, and Wiener filtering
The following course plan is from the previous offering of the course (Fall 2024) and is provided for reference only.
Lecture 1: Course Overview
Lecture 2: Review of Probability Inequalities and Limit Theorems (References: EE178 notes or Sections 1.6.(1-2) and 1.7.(1-3) from Gallager)
Lectures 3-4: Concentration Inequalities, Moment Generating Function, Sub-Gaussian Random Variables (References: Chapter 2 of Vershynin and Appendix B of Shalev-Shwartz & Ben-David)
Lectures 5-6: Machine Learning, Empirical Risk Minimization, Learning via Uniform Convergence (Reference: Chapters 2-4 of Shalev-Shwartz & Ben-David)
Lecture 7: Random Vectors, Mean and Covariance Matrix (Reference: Sections 3.1 to 3.4 of Gallager)
Lecture 8: Properties of a Covariance Matrix, Spectral Decomposition, Karhunen-Loeve Expansion (Reference: Sections 3.1 to 3.4 of Gallager)
Lecture 9: Principal Component Analysis, Gaussian Random Vectors (Reference: Sections 3.1 to 3.4 of Gallager)
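Lectures 8-9 connect the spectral decomposition of a covariance matrix to PCA: the first principal component is the eigenvector of the largest eigenvalue. A minimal sketch using the closed-form eigendecomposition of an illustrative 2x2 covariance matrix:

```python
import math

# PCA sketch on a 2x2 covariance matrix via closed-form eigendecomposition.
# The first principal component is the eigenvector of the largest eigenvalue.
C = [[3.0, 1.0],
     [1.0, 2.0]]                         # illustrative symmetric PSD covariance
tr = C[0][0] + C[1][1]
det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
disc = math.sqrt(tr * tr - 4 * det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2    # eigenvalues, lam1 >= lam2

# Eigenvector for lam1: a nonzero solution of (C - lam1*I) v = 0.
v = (C[0][1], lam1 - C[0][0])
norm = math.hypot(*v)
v = (v[0] / norm, v[1] / norm)
print(f"principal variance {lam1:.3f} along direction ({v[0]:.3f}, {v[1]:.3f})")
```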
Lecture 10: Gaussian Random Vectors (Reference: Sections 3.1 to 3.4 of Gallager)
Lecture 11: Detection/Hypothesis Testing (Reference: Sections 8.1 to 8.2 of Gallager)
Lecture 12: Detection/Hypothesis Testing: Examples (Reference: Sections 8.1 to 8.2 of Gallager)
Lecture 13: No class. Democracy Day!
Lecture 14: Midterm
Lecture 15: Detection/Hypothesis Testing for Vector Gaussian Channel, Estimation (Reference: Sections 8.1 to 8.2, Sections 10.1-10.2 of Gallager)
Lecture 16: MMSE Estimation, Sufficient Statistics (Reference: Sections 10.1-10.2 of Gallager)
Lecture 17: Recursive Estimation and Kalman Filtering (Reference: Sections 10.1-10.2 of Gallager)
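The recursive structure of Lecture 17 can be sketched for the simplest case, a scalar state with linear dynamics and noisy observations (all parameter values below are illustrative, not from the course):

```python
import random

# Scalar Kalman filter sketch for x[k+1] = a*x[k] + w[k],  y[k] = x[k] + v[k].
# Recursively updates the state estimate and its error variance using the
# innovation y - a*xhat.
random.seed(3)
a, q, r = 0.9, 0.1, 1.0          # dynamics, process noise var, measurement noise var
x, xhat, p = 0.0, 0.0, 1.0       # true state, estimate, estimate error variance
for _ in range(200):
    # simulate the system
    x = a * x + random.gauss(0, q ** 0.5)
    y = x + random.gauss(0, r ** 0.5)
    # predict
    xhat_pred = a * xhat
    p_pred = a * a * p + q
    # update with the innovation
    gain = p_pred / (p_pred + r)
    xhat = xhat_pred + gain * (y - xhat_pred)
    p = (1 - gain) * p_pred
print(f"steady-state error variance p ~ {p:.4f}")
```

Note that the error variance p evolves deterministically (it never looks at the data), so it converges to the fixed point of the scalar Riccati recursion regardless of the realized noise.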
Lecture 18: Random Processes, Stationarity (Reference: Section 3.6 of Gallager)
Lecture 19: Gaussian Random Processes, Autocorrelation Function (Reference: Section 3.6 of Gallager)
Lecture 20: Power Spectral Density