Syllabus
- Period: V
- 1st meeting: Wed 26 April 2023 at 14:15; M220 (Krossi), Otakaari 1, 2nd floor
- Schedule: Every Wed 14-16 during 26 Apr 2023 – 14 Jun 2023 at M220 (Krossi), Otakaari 1, 2nd floor (No meeting on 24 May)
- Grading scale: Pass/Fail
- Credits: 3-5
- Registration: By showing up at the 1st meeting
- Recommended prerequisites: MS-C1342 Linear Algebra, MS-C1602 Statistical Inference, MS-E1600 Probability Theory
This reading seminar is an introduction to the statistical theory of large random matrices and their eigenvectors, with applications to clustering and community detection. We will get an overview of some of the latest developments in this active research area [1-3]. The focus is on understanding the mathematical principles (inequalities, representations, limit theorems) used to rigorously analyse the accuracy of statistical learning algorithms in sparse limiting regimes where model dimensions approach infinity. As background material, we may consult some of the recent monographs [4-6].
[1] A Zhang 2023. Fundamental limits of spectral clustering in stochastic block models. https://arxiv.org/abs/2301.09289
[2] S Dhara, J Gaudio, E Mossel, C Sandon 2022. Spectral recovery of binary censored block models. SODA. https://arxiv.org/abs/2107.06338
[3] E Abbé, J Fan, K Wang, Y Zhong 2020. Entrywise eigenvector analysis of random matrices with low expected rank. Ann Stat. https://arxiv.org/abs/1709.09565
[4] R Vershynin 2018. High-Dimensional Probability: An Introduction with Applications in Data Science. https://www.math.uci.edu/~rvershyn/
[5] M Wainwright 2019. High-Dimensional Statistics: A Non-Asymptotic Viewpoint. Cambridge University Press.
[6] Y Wu, J Xu 2022. Statistical Inference on Graphs: Selected Topics. http://www.stat.yale.edu/~yw562/teaching/stats-graphs-export.pdf