Covers statistical machine learning and empirical process theory. Topics include: basics of statistical decision theory; concentration inequalities; supervised and unsupervised learning; empirical risk minimization; complexity-regularized estimation; generalization bounds for learning algorithms; VC dimension and Rademacher complexity; minimax lower bounds; and online learning and optimization. We will also explore applications of these tools in variational Bayes, high-dimensional statistics, generative model estimation, and other machine learning problems.
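To give a flavor of the concentration-inequality material, here is a small illustrative sketch (not part of any official course materials) that empirically checks Hoeffding's inequality, which bounds the tail probability of the sample mean of bounded i.i.d. random variables.

```python
import numpy as np

# Hoeffding's inequality for X_i in [0, 1]:
#   P(|X_bar - mu| >= t) <= 2 * exp(-2 * n * t^2)
# We compare this bound against the observed tail frequency
# over many simulated datasets.

rng = np.random.default_rng(0)
n, t, trials = 100, 0.1, 10_000

# X_i ~ Bernoulli(0.5), so mu = 0.5 and each X_i lies in [0, 1].
samples = rng.binomial(1, 0.5, size=(trials, n))
deviations = np.abs(samples.mean(axis=1) - 0.5)

empirical = np.mean(deviations >= t)      # observed tail probability
hoeffding = 2 * np.exp(-2 * n * t**2)     # Hoeffding upper bound

print(f"empirical P(|mean - 0.5| >= {t}): {empirical:.4f}")
print(f"Hoeffding bound:                  {hoeffding:.4f}")
```

The empirical tail probability is well below the Hoeffding bound of roughly 0.27, as the theory guarantees; tightening such bounds, and extending them uniformly over function classes, is a central theme of the course.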