The class will be taught
in French or English, depending on attendance (all slides and class
notes are in English).
Depending on the sanitary situation, classes will most probably be held online on
Friday mornings from 9am to 12pm. Detailed class notes will be made
available before class, and the lecturers will go through them.
Practical sessions will be done at home.
Please send each practical session (one Jupyter notebook, .ipynb, with cells containing either text or runnable code) to email@example.com with the subject [PSn], where n is the number of the practical session (no acknowledgement will be sent back).
|Lecturer||Date||Topic (notes / code)|
|LC||15 January||Introduction to supervised learning (loss, risk, over-fitting and capacity control + cross-validation, Bayes predictor for classification and regression)|
|FB||22 January||Least-squares regression (all aspects, from linear algebra to statistical guarantees and L2 regularization) + Practical session 1, due February 7, 2020|
|LC||29 January||Statistical ML without optimization (learning theory, from finite number of hypothesis to Rademacher / covering numbers)|
|FB||5 February||Local averaging techniques (K-nearest neighbor, Nadaraya-Watson regression: algorithms + statistical analysis + practical session)|
|LC||12 February||Empirical risk minimization (logistic regression, loss-based supervised learning, probabilistic interpretation through maximum likelihood)|
|FB||19 February||Convex optimization (gradient descent + nonsmooth + stochastic versions + practical session (logistic regression))|
||Model selection (feature selection, L1 regularization and high-dimensional inference + practical session)|
|FB||12 March||Kernels (positive-definite kernels and reproducing kernel Hilbert spaces)|
|LC||19 March||Neural networks (from one-hidden layer to deep networks + practical session)|
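To give a flavor of what the practical sessions look like, here is a minimal sketch combining two of the topics above: L2-regularized least-squares (ridge) regression, fit in closed form, with k-fold cross-validation used to compare regularization strengths. All names (`ridge_fit`, `cv_risk`), the synthetic data, and the choice of regularization values are illustrative assumptions, not material from the course itself.

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form L2-regularized least squares:
    # w = (X^T X + lam * n * I)^{-1} X^T y
    n, d = X.shape
    return np.linalg.solve(X.T @ X + lam * n * np.eye(d), X.T @ y)

def cv_risk(X, y, lam, k=5, seed=0):
    # k-fold cross-validation estimate of the squared-error risk
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        w = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((X[test] @ w - y[test]) ** 2))
    return float(np.mean(errs))

# Synthetic regression data: y = <w*, x> + Gaussian noise
rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.standard_normal((n, d))
w_star = rng.standard_normal(d)
y = X @ w_star + 0.5 * rng.standard_normal(n)

# Compare regularization strengths by their cross-validated risk
for lam in [1e-4, 1e-2, 1.0]:
    print(f"lambda={lam:g}  CV risk={cv_risk(X, y, lam):.3f}")
```

On this well-conditioned synthetic problem, light regularization should win; on noisier or higher-dimensional data the cross-validation curve would favor larger values of lambda, which is exactly the model-selection question treated in the lectures.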
Evaluation: practical sessions (done at home) + a written take-home exam.