High Dimensional Analysis: Random Matrices and Machine Learning

Lecturer: Roland Speicher

Assistant: Johannes Hoffmann

Time and Place

Lectures

  • Mondays and Wednesdays, 10-12 in HS IV, building E 2.4

Tutorials

  • Wednesdays, 14-16 in SR 2, building E 2.5 (not every week; check the schedule in CMS!)


General Description

4h Topics Course (9 CP) in the Summer Term 2023

Analysis in high-dimensional spaces is the basis for a rigorous understanding and treatment of large data sets. In a very rough sense, one has to understand functions of many variables, in particular their generic and statistical behaviour.

High-dimensional spaces are actually quite strange and exotic when considered from our experience in small dimensions (remember that we live in three dimensions). On the one hand, high-dimensional spaces are VERY large, so it is easy to get lost there (a phenomenon often known under the name “curse of dimensionality”); on the other hand, functions of many variables often concentrate around constant values and thus allow very precise answers to apparently intractable questions (a phenomenon going under the name “concentration phenomenon” and also considered the “blessing of dimensionality”).
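
To make the concentration phenomenon concrete, here is a minimal numerical sketch (our own illustration, not part of the official course material, using Python with NumPy): the Euclidean norm of a standard Gaussian vector in dimension n grows like sqrt(n), but its fluctuations stay of order 1, so the norm concentrates sharply in high dimensions.

    import numpy as np

    rng = np.random.default_rng(0)
    for n in [10, 100, 10_000]:
        # 1000 independent standard Gaussian vectors in dimension n
        norms = np.linalg.norm(rng.standard_normal((1000, n)), axis=1)
        # mean(norm)/sqrt(n) tends to 1, while std(norm) stays near 1/sqrt(2)
        print(n, norms.mean() / np.sqrt(n), norms.std())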

Statistics in high dimensions, with many variables and many observations, is also different from what one is used to in a classical setting. In classical statistics the number of variables is small compared to the number of observations, and one can essentially understand the distribution of the variables via the law of large numbers by taking many measurements. In modern statistics the number of variables is of the same order as the number of observations, so the relation between the true and the measured properties of the variables is trickier, and new mathematical tools are needed to unravel it. In particular, random matrices (originally introduced by the statistician Wishart) are such a tool.
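
As a small illustration of this effect (again our own sketch, assuming standard normal data with identity covariance): when the number of variables p is proportional to the number of observations n, the eigenvalues of the sample covariance matrix no longer concentrate at the true value 1, but spread over the Marchenko-Pastur bulk [(1-sqrt(c))^2, (1+sqrt(c))^2] with c = p/n.

    import numpy as np

    rng = np.random.default_rng(1)
    p, n = 500, 1000                       # p variables, n observations, c = p/n = 1/2
    X = rng.standard_normal((p, n))        # true covariance is the identity
    eig = np.linalg.eigvalsh(X @ X.T / n)  # spectrum of the Wishart / sample covariance matrix
    c = p / n
    print(eig.min(), (1 - np.sqrt(c))**2)  # close to the lower Marchenko-Pastur edge (~0.086)
    print(eig.max(), (1 + np.sqrt(c))**2)  # close to the upper Marchenko-Pastur edge (~2.914)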

The neural networks of modern deep learning are in some sense a special class of functions of many variables, built out of (random) matrices together with entry-wise non-linear functions. Thus random matrices should help to say something about neural networks, but the latter also present new and interesting extensions of random matrices.
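
A toy sketch of this viewpoint (our own illustration, with hypothetical choices of width, depth and non-linearity): a fully connected network at random initialization is just a composition of random matrices with an entry-wise non-linearity in between, and with the standard 2/width variance scaling the signal propagates through many layers without exploding or vanishing.

    import numpy as np

    rng = np.random.default_rng(2)
    width, depth = 1000, 5
    x = rng.standard_normal(width)
    for _ in range(depth):
        # random weight matrix with entries of variance 2/width
        W = rng.standard_normal((width, width)) * np.sqrt(2 / width)
        x = np.maximum(W @ x, 0.0)         # entry-wise ReLU non-linearity
    # the norm stays of order sqrt(width), layer after layer
    print(np.linalg.norm(x) / np.sqrt(width))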

In this course we will take a trip through high dimensions. We hope not to get lost, but to get a feeling for the strange and beautiful behaviour of these spaces. Much of the terrain is still unexplored, but a few interesting islands have already been discovered and are waiting for us to explore them.

This course is intended for everybody with an interest in a mathematically rigorous discussion of high-dimensional phenomena. A basic mathematical background on the level of the MfI 1-3 courses should be sufficient (but is also necessary). Be aware, however, that our intention is not to develop better and faster algorithms for deep learning, but to touch upon some mathematical theories which might (or might not, who knows) be important for a sound mathematical theory of deep learning.

Tentative List of Topics

  • curse and blessing of dimensionality
  • concentration of vectors and matrices in high dimensions
  • Wishart matrices and Marchenko-Pastur law
  • signal-plus-noise models
  • neural networks, overparameterization, neural tangent kernel, feature learning and all that

Literature

Here are some references on which the lectures will partly rely or which might provide some further reading. In any case they should give an idea of what the course will be about. We will not follow one source exclusively, but pick the best from each of them, according to the taste and interest of the lecturer.

  • R. Couillet and Z. Liao: Random Matrix Methods for Machine Learning, Cambridge University Press, 2022
  • D.A. Roberts and S. Yaida: The Principles of Deep Learning Theory, Cambridge University Press, 2022
  • R. Vershynin: High-Dimensional Probability, Cambridge University Press, 2018
  • A. Zagidullina: High-Dimensional Covariance Matrix Estimation, Springer, 2021
  • G. Yang: the Tensor Programs framework
  • A. S. Bandeira, A. Singer, T. Strohmer: Mathematics of Data Science (draft)

This list is not yet final and will grow in due time.

See also the Semesterapparat/Course Reference Shelf provided by the Campus Library for Computer Science and Mathematics.

Postal address

Saarland University
Department of Mathematics
Postfach 15 11 50
66041 Saarbrücken
Germany


Visitors

Saarland University
Campus, building E 2.4
66123 Saarbrücken
Germany

Information for visitors