Deep Learning: Do It Yourself!
A hands-on tour of deep learning
Recent developments in neural network approaches (now better known as "deep learning") have dramatically changed the landscape of several research fields such as image classification, object detection, speech recognition, machine translation, self-driving cars and many more. Due to its promise of leveraging large (sometimes even small) amounts of data in an end-to-end manner, i.e. training a model to extract features by itself and to learn from them, deep learning is increasingly appealing to other fields as well: medicine, time series analysis, biology, simulation.
This course is a deep dive into the practical details of deep learning architectures, in which we attempt to demystify deep learning and kick-start you into using it in your own field of research. During this course, you will gain a better understanding of the basics of deep learning and get familiar with its applications. We will show how to set up, train, debug and visualize your own neural network. Along the way, we will provide practical engineering tricks for training neural networks or adapting them to new tasks.
By the end of this class, you will have an overview of the deep learning landscape and its applications to traditional fields, but also some ideas for applying it to new ones. You should also be able to train a multi-million parameter deep neural network by yourself. For the implementations we will be using the PyTorch library in Python.
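To give a taste of what "end-to-end" training means in practice, here is a minimal sketch of the training loop at the heart of every lecture: a forward pass, a loss, gradients, and a parameter update. It is written in plain Python with hand-derived gradients so it runs without any installation; in the course itself the same loop is written with PyTorch, where `loss.backward()` computes the gradients automatically. The function name `train_linear` and the toy data are purely illustrative.

```python
# Minimal gradient-descent sketch of the training loop the course builds on.
# Pure Python (no PyTorch) so it runs anywhere; with torch, the hand-derived
# gradients below are replaced by autograd via loss.backward().

def train_linear(xs, ys, lr=0.1, steps=200):
    """Fit y = w*x + b by minimising mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # forward pass: predictions for the current parameters
        preds = [w * x + b for x in xs]
        # backward pass: gradients of the MSE w.r.t. w and b (derived by hand)
        grad_w = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / n
        grad_b = sum(2 * (p - y) for p, y in zip(preds, ys)) / n
        # gradient-descent update
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Recover y = 2x + 1 from four noiseless samples
w, b = train_linear([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
print(round(w, 2), round(b, 2))  # close to 2.0 and 1.0
```

The multi-million parameter networks of the later lectures use exactly this loop; only the model, the loss and the optimizer grow more sophisticated.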
The topics covered in this course include:
| # | Date | Description | Course Materials |
|---|------|-------------|------------------|
| Lecture 1 | Friday September 14, 13h15-17h15, Salle Conference (46 rue d'Ulm) | Course introduction; meet your dev environment; first dive into CNNs; testing out pretrained networks | [slides intro] [forum] [ipynb] [Alexandre's slides] |
| Lecture 2 | Friday September 21, 13h15-17h15, Salle Conference (46 rue d'Ulm) | Intro to PyTorch: basic operations and automatic differentiation; linear regression; backprop; simple neural networks | [ipynb 1] [Timothée's slides] [ipynb 2] |
| Lecture 3 | Friday September 28, 13h15-17h15, Salle Conference (46 rue d'Ulm) | Autograd and convolutions | [slides] [ipynb] [ipynb] |
| Lecture 4 | Friday October 5, 13h15-17h15, Salle des Actes | Image classification: dropout, batch normalization, residual nets | [slides] [colab ipynb] [practical ipynb] |
| Lecture 5 | Friday October 12, 13h15-17h15, Salle des Actes | Embeddings: RecSys with neural networks; using fully-connected layers; triplet loss; clustering; dimensionality reduction (PCA, t-SNE); result visualization | [slides] [ipynb] [ipynb collaborative filtering] |
| Lecture 6 | Friday October 26, 13h15-17h15, Salle des Actes | Optimisation; practical: sentiment analysis from text (embeddings, GloVe, 1-d convolutions) | [slides] [ipynb] |
| Lecture 7 | Friday November 9, 13h15-17h15, Salle 235 A | Generative adversarial networks: GAN, conditional GAN, InfoGAN, DCGAN | [slides] [ipynb] |
| Lecture 8 | Friday November 16, 13h15-17h15, Salle 235 A | Guest lecture by O. Pietquin (Google) on reinforcement learning | [slides] |
| Lecture 9 | Friday November 23, 13h15-17h15, Salle 235 A | Recurrent neural networks: LSTM, GRU | [slides] [ipynb] [ipynb] |
| Lecture 10 | Friday November 30, 13h15-17h15, Salle 235 A | Under the hood; organize your code | [slides] [ipynb] [Andrei's slides] [Andrei's project repo] |
| No lecture! | Friday December 7 | Work on project! | |
| Lecture 11 | Friday December 14, 13h15-17h15, Salle 235 A | NLP with RNNs | [slides] [ipynb] |
| Lecture 12 | Friday December 21, 13h15-17h15, Salle 235 A | Attention and metric learning | [slides] [ipynb] |
| Lecture 13 | Friday January 11, 13h15-17h15, Salle 235 A | Work on project! | [slides] [ipynb] |
| Lecture 14 | Friday January 18, 13h15-17h15, Salle 235 A | Work on project! | [slides] [ipynb] |
| Lecture 15 | Friday January 25, 13h15-17h15, Salle 235 A | Presentations | [slides] [ipynb] |
Please register to follow the course.
Related resources:
- Previous course (2017)
- Fastai by Jeremy Howard
- Deep Learning course by Olivier Grisel and Charles Ollion
- Deep Learning by François Fleuret
- Deep Learning book by Ian Goodfellow, Yoshua Bengio and Aaron Courville
- and more to come during the lectures...