Deep Learning: Do It Yourself! 2019 version
Please visit dataflowr for the most recent version
Course description
Recent developments in neural network approaches (now better known as "deep learning") have dramatically changed the landscape of several research fields such as image classification, object detection, speech recognition, machine translation, self-driving cars and many more. Due to its promise of leveraging large (sometimes even small) amounts of data in an end-to-end manner, i.e. training a model to extract features by itself and to learn from them, deep learning is increasingly appealing to other fields as well: medicine, time series analysis, biology, simulation…
This course is a deep dive into the practical details of deep learning architectures, in which we attempt to demystify deep learning and kick-start you into using it in your own field of interest. During this course, you will gain a better understanding of the foundations of deep learning and get familiar with its applications. We will show how to set up, train, debug and visualize your own neural network. Along the way, we will provide practical engineering tricks for training or adapting neural networks to new tasks.
By the end of this class, you will have an overview of the deep learning landscape and its applications to traditional fields, but also some ideas for applying it to new ones. You should also be able to train a multi-million-parameter deep neural network by yourself. For the implementations we will be using the PyTorch library in Python.
Practical info
This class is part of the Computer science courses taught at ENS in M1.
Prerequisite: introduction to Machine Learning.
Moodle for this course. Please use the forum to ask your questions and to submit your homework.
Friday morning: 9h-12h15 in salle R (ENS).
Schedule
 Lesson 1:
 (slides) introductory slides
 (code) a first example on Colab: dogs and cats with VGG
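The dogs-and-cats notebook illustrates transfer learning: VGG's convolutional features stay frozen and only a small classifier head is trained on top. Below is a minimal NumPy sketch of that last step, with random vectors standing in for the precomputed VGG features (data, dimensions and names here are purely illustrative, not the course notebook):

```python
import numpy as np

rng = np.random.default_rng(0)

# stand-ins for precomputed, frozen VGG features (4096-d in the real notebook)
n, d = 200, 50
feats = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
labels = (feats @ w_true > 0).astype(float)   # synthetic dog/cat labels

# train only the logistic-regression "head" by gradient descent
w = np.zeros(d)
for _ in range(300):
    p = 1 / (1 + np.exp(-(feats @ w)))        # sigmoid predictions
    w -= 0.1 * feats.T @ (p - labels) / n     # gradient of the logistic loss

acc = ((feats @ w > 0) == (labels == 1)).mean()
```

Since the backbone is frozen, only the `d`-dimensional head is learned, which is why the approach works even with a small dataset.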
 Lesson 2:
 Lesson 3:
 (code) a simple example for backprop - solution
 (slides) refresher: linear/logistic regressions, classification and PyTorch module.
 (code) understanding convolutions and your first neural network for a digit recognizer - solution

Homework 1: you can open it on Colab or run it on your laptop; the file is on GitHub
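The backprop exercise above boils down to applying the chain rule by hand, layer by layer. A minimal NumPy sketch of the idea (a toy one-hidden-layer regression, not the course notebook) could look like this:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(32, 3))          # toy inputs
y = x.sum(axis=1, keepdims=True)      # toy target: sum of the features

W1 = rng.normal(scale=0.1, size=(3, 8))
W2 = rng.normal(scale=0.1, size=(8, 1))

losses = []
for step in range(500):
    # forward pass
    h = np.maximum(0, x @ W1)         # ReLU hidden layer
    pred = h @ W2
    losses.append(((pred - y) ** 2).mean())

    # backward pass: chain rule, from the loss back to the weights
    dpred = 2 * (pred - y) / len(x)
    dW2 = h.T @ dpred
    dh = dpred @ W2.T
    dh[h <= 0] = 0                    # gradient of the ReLU
    dW1 = x.T @ dh

    # gradient-descent update
    W1 -= 0.1 * dW1
    W2 -= 0.1 * dW2
```

PyTorch's autograd performs exactly these backward computations automatically, which is what the notebook then demonstrates.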
 Lesson 4:
 (slides) embeddings and dataloader
 (code) Collaborative filtering: matrix factorization and recommender system
 (slides) Variational Autoencoder by Stéphane
 (code) AutoEncoder
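The collaborative-filtering notebook uses learned embeddings; the same idea can be sketched as plain matrix factorization trained by SGD on a synthetic ratings matrix (this NumPy toy is an illustration of the principle, not the PyTorch notebook):

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, k = 20, 30, 4

# synthetic low-rank "ratings" to recover
U_true = rng.normal(size=(n_users, k))
V_true = rng.normal(size=(n_items, k))
R = U_true @ V_true.T

# observe only a random subset of entries, as in collaborative filtering
mask = rng.random((n_users, n_items)) < 0.5
users, items = np.where(mask)

# user and item embeddings, trained by SGD on the observed entries
U = rng.normal(scale=0.1, size=(n_users, k))
V = rng.normal(scale=0.1, size=(n_items, k))

lr = 0.05
for epoch in range(200):
    for u, i in zip(users, items):
        err = U[u] @ V[i] - R[u, i]
        U[u], V[i] = U[u] - lr * err * V[i], V[i] - lr * err * U[u]

train_mse = np.mean([(U[u] @ V[i] - R[u, i]) ** 2
                     for u, i in zip(users, items)])
```

A prediction for any unseen (user, item) pair is then just the dot product `U[u] @ V[i]` of the two learned embeddings.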
 Lesson 5:
 Recurrent Neural Networks: slides and associated code
 (code) PyTorch tutorial on char-RNN
 (code) Word2vec
 (code) Playing with word embedding
 Lesson 6:
 (slides) Generative Adversarial Networks
 (code) Conditional and Info GANs
 (code) Normalizing Flows
 Lesson 7:
 (slides) Tips and tricks
 Lesson 8:
 (slides) optimization for DL
 (code) gradient descent optimization algorithms
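A standard way to compare the gradient-descent variants covered here is on an ill-conditioned quadratic, where momentum visibly accelerates convergence. A minimal NumPy sketch (toy problem and hyperparameters chosen for illustration):

```python
import numpy as np

# ill-conditioned quadratic: f(x) = 0.5 * (x1^2 + 25 * x2^2)
curv = np.array([1.0, 25.0])
grad = lambda x: curv * x

def run(momentum, lr=0.03, steps=100):
    x, v = np.array([1.0, 1.0]), np.zeros(2)
    for _ in range(steps):
        v = momentum * v - lr * grad(x)   # heavy-ball momentum update
        x = x + v
    return np.linalg.norm(x)              # distance to the minimum at 0

plain = run(momentum=0.0)   # vanilla gradient descent
heavy = run(momentum=0.9)   # gradient descent with momentum
```

Plain gradient descent crawls along the flat direction, while momentum accumulates velocity there and reaches the minimum much faster, which is the intuition behind optimizers like SGD-with-momentum and Adam.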
 Lesson 9:
 (slides) Deep learning on graphs
 Lesson 10: guest lecture by David Robin
Back to dataflowr