Deep Learning: Do-It-Yourself!

A hands-on tour of deep learning

2018 archive; for the current version, see here.

Course description

Recent developments in neural network approaches (now better known as "deep learning") have dramatically changed the landscape of several research fields, such as image classification, object detection, speech recognition, machine translation, self-driving cars and many more. Due to its promise of leveraging large (and sometimes even small) amounts of data in an end-to-end manner, i.e. training a model to extract features by itself and to learn from them, deep learning is increasingly appealing to other fields as well: medicine, time series analysis, biology and simulation.

This course is a deep dive into the practical details of deep learning architectures, in which we attempt to demystify deep learning and kick-start you into using it in your own field of research. During this course, you will gain a better understanding of the basics of deep learning and become familiar with its applications. We will show how to set up, train, debug and visualize your own neural networks. Along the way, we will provide practical engineering tricks for training neural networks or adapting them to new tasks.

By the end of this class, you will have an overview of the deep learning landscape and its applications to traditional fields, as well as some ideas for applying it to new ones. You should also be able to train a multi-million-parameter deep neural network by yourself. For the implementations, we will use the PyTorch library in Python.

The topics covered in this course include:

  • Neural network approaches: feedforward networks, convolutional networks (CNNs), recurrent networks (RNNs)
  • Modern practices: backpropagation, regularization, optimization, fine-tuning
  • Deep Learning research: autoencoders, deep generative models, long short-term memory (LSTM) modules
  • CNN architectures: VGG, ResNet, fully convolutional networks, multi-input and multi-output networks
  • RNN architectures: bidirectional RNNs, encoder-decoder sequence-to-sequence, LSTMs, GRUs
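As a taste of what "training a network yourself" means in practice, here is a minimal PyTorch training loop for a small feedforward classifier. The data is synthetic and every name and hyperparameter is illustrative, not taken from the course materials:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(128, 10)           # 128 samples, 10 features (synthetic)
y = torch.randint(0, 2, (128,))    # binary class labels (synthetic)

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

losses = []
for epoch in range(50):
    opt.zero_grad()                # reset gradients from the previous step
    loss = loss_fn(model(X), y)    # forward pass
    loss.backward()                # backpropagation
    opt.step()                     # gradient descent update
    losses.append(loss.item())
```

Every lecture below builds on some variation of this loop: a different model, a different loss, or a different data pipeline.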

Schedule

    Lecture 1: Friday September 14, 13h15-17h15, Salle Conference (46 rue d'Ulm)
    Course introduction; Meet your dev environment; First dive into CNNs; Testing out pre-trained networks
    Materials: [slides intro] [forum] [ipynb] [Alexandre's slides]

    Lecture 2: Friday September 21, 13h15-17h15, Salle Conference (46 rue d'Ulm)
    Intro to PyTorch; Basic operations and automatic differentiation; Linear regression; Backprop; Simple neural networks
    Materials: [ipynb 1] [Timothée's slides] [ipynb 2]

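The core ideas of Lecture 2, automatic differentiation and linear regression, fit in a few lines. This is a sketch using raw autograd, with made-up data and an illustrative learning rate:

```python
import torch

torch.manual_seed(0)
x = torch.randn(100, 1)
y_true = 3.0 * x + 1.0 + 0.1 * torch.randn(100, 1)  # noisy line, slope 3, intercept 1

w = torch.zeros(1, requires_grad=True)  # slope to learn
b = torch.zeros(1, requires_grad=True)  # intercept to learn

for _ in range(200):
    loss = ((x * w + b - y_true) ** 2).mean()  # mean squared error
    loss.backward()                            # autograd fills w.grad and b.grad
    with torch.no_grad():                      # update outside the autograd graph
        w -= 0.1 * w.grad
        b -= 0.1 * b.grad
        w.grad.zero_()
        b.grad.zero_()
```

After 200 steps, `w` and `b` should be close to the true slope and intercept; `torch.optim` wraps exactly this update pattern.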
    Lecture 3: Friday September 28, 13h15-17h15, Salle Conference (46 rue d'Ulm)
    Autograd and convolutions
    Materials: [Slides] [ipynb] [ipynb]

    Lecture 4: Friday October 5, 13h15-17h15, Salle des Actes
    Image classification; Dropout; Batch normalization; Residual networks
    Materials: [slides] [colab ipynb] [practical ipynb]

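A residual block of the kind covered in Lecture 4 can be sketched as follows; the channel count and layer layout are illustrative, in the general style of ResNet rather than the exact architecture used in the practical:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)   # batch normalization after each conv
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)             # the skip connection: add the input back

x = torch.randn(4, 16, 8, 8)                  # (batch, channels, height, width)
y = ResidualBlock(16)(x)                      # same shape in, same shape out
```

The skip connection is what lets gradients flow through very deep stacks of such blocks.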
    Lecture 5: Friday October 12, 13h15-17h15, Salle des Actes
    Embeddings: RecSys with neural networks; Using fully-connected layers; Triplet loss; Clustering; Dimensionality reduction (PCA, t-SNE); Result visualization
    Materials: [slides] [ipynb] [ipynb collaborative filtering]

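The triplet loss from Lecture 5 is short enough to write out directly. This is an illustrative implementation; PyTorch also ships `nn.TripletMarginLoss` with the same semantics:

```python
import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Pull the anchor towards the positive, push it away from the negative,
    # but only until they are separated by at least `margin`.
    d_pos = F.pairwise_distance(anchor, positive)
    d_neg = F.pairwise_distance(anchor, negative)
    return F.relu(d_pos - d_neg + margin).mean()

# Easy case: anchor equals positive, negative is far away -> loss is zero.
anchor = torch.zeros(2, 4)
positive = torch.zeros(2, 4)
negative = torch.full((2, 4), 5.0)
loss_easy = triplet_loss(anchor, positive, negative)

# Hard case: positive far, negative on top of the anchor -> large loss.
loss_hard = triplet_loss(anchor, negative, positive)
```

Training embeddings with this loss groups similar items together, which is what makes the clustering and t-SNE visualizations in the practical meaningful.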
    Lecture 6: Friday October 26, 13h15-17h15, Salle des Actes
    Optimization; Practical: sentiment analysis from text; Embeddings; GloVe; 1d convolutions
    Materials: [slides] [ipynb]

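The embedding-plus-1d-convolution pattern behind the sentiment analysis practical can be sketched in a few lines; vocabulary size, embedding dimension and all other sizes here are illustrative:

```python
import torch
import torch.nn as nn

emb = nn.Embedding(num_embeddings=1000, embedding_dim=50)  # token id -> vector
conv = nn.Conv1d(in_channels=50, out_channels=16, kernel_size=3)

tokens = torch.randint(0, 1000, (4, 20))  # batch of 4 sentences, 20 token ids each
x = emb(tokens)                           # (4, 20, 50)
x = x.transpose(1, 2)                     # Conv1d expects (batch, channels, length)
y = conv(x)                               # (4, 16, 18): kernel of 3 shortens length by 2
```

In the full model the embedding weights can be initialized from pre-trained GloVe vectors instead of being learned from scratch.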
    Lecture 7: Friday November 9, 13h15-17h15, Salle 235 A
    Generative adversarial networks: GAN, Conditional GAN, InfoGAN, DCGAN
    Materials: [slides] [ipynb]

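The two players of a GAN, as introduced in Lecture 7, can be set up in a few lines. This sketch uses toy multilayer perceptrons, not the DCGAN convolutional architecture:

```python
import torch
import torch.nn as nn

# Generator: maps latent noise to a (here 2-dimensional) sample.
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
# Discriminator: maps a sample to a single realness logit.
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

z = torch.randn(16, 8)   # a batch of latent noise vectors
fake = G(z)              # generated samples
logits = D(fake)         # discriminator's score for each sample
```

Training alternates between updating D to tell real from fake and updating G to fool D, each with its own optimizer.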
    Lecture 8: Friday November 16, 13h15-17h15, Salle 235 A
    Guest lecture by O. Pietquin (Google) on reinforcement learning
    Materials: [slides]

    Lecture 9: Friday November 23, 13h15-17h15, Salle 235 A
    Recurrent neural networks: LSTM, GRU
    Materials: [slides] [ipynb] [ipynb]

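Running a batch of sequences through an LSTM, as in Lecture 9, mostly comes down to getting the tensor shapes right; a shape-checking sketch with illustrative sizes:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=10, hidden_size=20, batch_first=True)

x = torch.randn(3, 5, 10)   # batch of 3 sequences, length 5, 10 features per step
out, (h, c) = lstm(x)
# out: the hidden state at every time step, shape (3, 5, 20)
# h, c: the final hidden and cell states, shape (num_layers=1, 3, 20)
```

Swapping `nn.LSTM` for `nn.GRU` works the same way, except the GRU has no separate cell state `c`.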
    Lecture 10: Friday November 30, 13h15-17h15, Salle 235 A
    Under the hood; Organize your code
    Materials: [slides] [ipynb] [Andrei's slides] [Andrei's project repo]

    No lecture on Friday December 7: work on your project!

    Lecture 11: Friday December 14, 13h15-17h15, Salle 235 A
    NLP with RNNs
    Materials: [slides] [ipynb]

    Lecture 12: Friday December 21, 13h15-17h15, Salle 235 A
    Attention and metric learning
    Materials: [slides] [ipynb]

    Lecture 13: Friday January 11, 13h15-17h15, Salle 235 A
    Work on project
    Materials: [slides] [ipynb]

    Lecture 14: Friday January 18, 13h15-17h15, Salle 235 A
    Work on project
    Materials: [slides] [ipynb]

    Lecture 15: Friday January 25, 13h15-17h15, Salle 235 A
    Presentations
    Materials: [slides] [ipynb]

    Practical info

    Friday afternoon: 13h15 - 17h15

    The location changes from week to week; please check the website.

    GPU install

    Instructions on how to use an AWS GPU instance are available on the forum.


    Instructors

    Marc Lelarge - Marc dot Lelarge at ens dot fr
    with the great help of:
  • Andrei Bursuc - andrei dot bursuc at gmail dot com
  • Alexandre Defossez - defossez at fb dot com
  • Timothée Lacroix - tlacroix at fb dot com
  • Alexandre Sablayrolles - asablayrolles at fb dot com
  • Pierre Stock - pstock at fb dot com
  • Neil Zeghidour - neilz at fb dot com

    Enrolment

    Please register to follow the course.


    Online resources

    GitHub

    Forum (to post on the forum, you must first log in on the Moodle)

    Previous course (2017)

    Fastai by Jeremy Howard

    Deep Learning course by Olivier Grisel and Charles Ollion

    Deep Learning by François Fleuret

    Deep Learning book by Ian Goodfellow, Yoshua Bengio and Aaron Courville

    and more to come during the lectures...