Module II: Concentration of Probability Measures
This module derives several families of concentration inequalities for probability measures, starting from the Markov, Chebyshev, and Chernoff inequalities and demonstrating how higher-order moments yield sharper bounds. The notion of a sub-Gaussian random variable is introduced as an extension of the classical Chernoff inequality, with various examples. A further extension to the sub-exponential family is covered, including a proof that chi-squared variables are sub-exponential. From this, the well-known Johnson-Lindenstrauss lemma for near-isometric embeddings is derived, along with an elementary proof using the standard Chernoff bound. The module also covers the Berry-Esseen theorem, its slow O(1/√n) rate of convergence, and the impact of this rate on deriving tail probabilities.
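For quick reference, here is a minimal LaTeX sketch of the standard statements behind the topics above; the notation is mine and may differ from the lecture notes.

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}

% Markov: for a nonnegative random variable $X$ and $t > 0$,
\[ \mathbb{P}(X \ge t) \le \frac{\mathbb{E}[X]}{t}. \]

% Chebyshev: apply Markov to $(X - \mathbb{E}[X])^2$:
\[ \mathbb{P}\bigl(|X - \mathbb{E}[X]| \ge t\bigr) \le \frac{\operatorname{Var}(X)}{t^2}. \]

% Chernoff: apply Markov to $e^{\lambda X}$ and optimize over $\lambda > 0$:
\[ \mathbb{P}(X \ge t) \le \inf_{\lambda > 0} e^{-\lambda t}\,
   \mathbb{E}\bigl[e^{\lambda X}\bigr]. \]

% Sub-Gaussian with parameter $\sigma$: for all $\lambda \in \mathbb{R}$,
\[ \mathbb{E}\bigl[e^{\lambda(X - \mathbb{E}[X])}\bigr] \le e^{\sigma^2\lambda^2/2},
   \quad\text{which via Chernoff gives}\quad
   \mathbb{P}\bigl(X - \mathbb{E}[X] \ge t\bigr) \le e^{-t^2/(2\sigma^2)}. \]

% Johnson-Lindenstrauss: for $0 < \epsilon < 1$ and $n$ points in $\mathbb{R}^d$,
% a suitable random linear map $f$ into $m = O(\epsilon^{-2}\log n)$ dimensions
% satisfies, with high probability, for every pair $x, y$ in the point set:
\[ (1-\epsilon)\|x - y\|_2^2 \le \|f(x) - f(y)\|_2^2 \le (1+\epsilon)\|x - y\|_2^2. \]

\end{document}
```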
| Date | Title | Description |
|---|---|---|
| January 15, 2026 | Lecture Cancelled | Next lecture is on the 20th. |
| January 13, 2026 | Module II Completed | Module II is completed; Module IV will be covered first, followed by Module III. |
| January 6, 2026 | Module II Begins | First lecture of Module II: Concentration of Probability Measures. |