Module II: Concentration of Probability Measures

This module derives several families of probabilistic concentration inequalities, starting from the Markov, Chebyshev, and Chernoff inequalities and demonstrating how higher-order moments yield sharper tail bounds. The notion of a sub-Gaussian random variable is introduced as an extension of the classical Chernoff bound, with various examples. A further extension to the sub-exponential family is covered, including a proof that chi-squared variables are sub-exponential. Using this, the well-known Johnson-Lindenstrauss lemma for near-isometric embedding is derived, along with an elementary proof using the standard Chernoff bound. The module also explores the Berry-Esseen theorem, which establishes the slow O(1/√n) rate of convergence in the central limit theorem, and its impact on deriving tail probabilities.
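As a brief preview of the hierarchy developed in the lectures, the three starting inequalities can be sketched as follows (standard formulations; see, e.g., Wainwright, 2019, from the reading list below). For a nonnegative random variable X and any t > 0, Markov's inequality gives

\[ \mathbb{P}(X \ge t) \le \frac{\mathbb{E}[X]}{t}. \]

Applying it to the squared deviation (X - \mu)^2 yields Chebyshev's inequality,

\[ \mathbb{P}(|X - \mu| \ge t) \le \frac{\operatorname{Var}(X)}{t^2}, \]

and applying it to the exponential moment e^{\lambda (X - \mu)} and optimizing over \lambda yields the Chernoff bound,

\[ \mathbb{P}(X - \mu \ge t) \le \inf_{\lambda > 0} e^{-\lambda t}\, \mathbb{E}\!\left[ e^{\lambda (X - \mu)} \right]. \]

A variable whose moment generating function satisfies \mathbb{E}[e^{\lambda (X - \mu)}] \le e^{\lambda^2 \sigma^2 / 2} for all \lambda \in \mathbb{R} is called \sigma-sub-Gaussian; for such variables the Chernoff bound specializes to the two-sided tail \mathbb{P}(|X - \mu| \ge t) \le 2 e^{-t^2 / (2\sigma^2)}.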

Date & Time: Tuesdays and Thursdays, 3:30 PM to 5:00 PM
Classroom: Room B303, EE Department, IISc
Duration: Starting January 6, 2026 (4-5 lectures)
January 15, 2026 Lecture Cancelled

The next lecture is on January 20.

January 13, 2026 Module II Completed

Module II is completed. Module IV will be covered next, followed by Module III.

January 6, 2026 Module II Begins

First lecture of Module II: Concentration of Probability Measures.

Central Limit Theorem and Tail Bounds

M. J. Wainwright (2019) High-Dimensional Statistics: A Non-Asymptotic Viewpoint, Cambridge University Press

N. Halko, P. G. Martinsson, and J. A. Tropp (2011) Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions, SIAM Review, Vol. 53, pp. 217-288

A. Blum, J. Hopcroft, and R. Kannan (2020) Foundations of Data Science, Cambridge University Press

V. Shikhman and D. Müller (2021) Mathematical Foundations of Big Data Analytics, Springer