Deep neural networks have gained immense popularity over the last decade due to their outstanding success in many important machine learning tasks such as image recognition, speech recognition, and natural language processing. This course provides a principled and hands-on approach to deep learning with neural networks. By the end of the course, students will have mastered the principles and practices underlying neural networks, including modern methods of deep learning, and will have applied deep learning methods to real-world problems including image recognition, natural language processing, and biomedical applications.

The course will be based on homework and a final group project. The project will include both a written and an oral (i.e., presentation) component. Grades in this course will be based on homework scores and the quality of the written and oral components of the project. The course assumes basic prior knowledge in linear algebra and probability.

First lecture on Tuesday, January 23rd

TensorFlow: Monday, Feb. 26th, 7:00-8:00 PM, AKW 400

TA: Tuesdays 5:30-7:30 PM, AKW 104 (or by appointment)

Kevin Moon: Wednesdays 3:00-5:00 PM, AKW 103

Guy Wolf: Thursdays 5:30-7:30 PM in AKW 103

- Neural Networks and Deep Learning by Michael Nielsen
- Deep Learning by Goodfellow, Bengio, and Courville

- Deep learning overview & relevant machine learning background
- Gradient descent & stochastic gradient descent
- Backpropagation & convergence in neural networks
- Deep neural network concepts:
  - Cost functions & activation functions
  - Regularization & weight initialization
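To give a flavor of the first topics above, here is a minimal sketch of gradient descent on a toy one-dimensional loss. This is illustrative only, not course code; the loss, learning rate, and function names are all made up for the example.

```python
# Toy loss: f(w) = (w - 3)^2, minimized at w = 3.
def grad(w):
    # Gradient of the toy loss with respect to w.
    return 2.0 * (w - 3.0)

def gradient_descent(w0, lr=0.1, steps=100):
    # Repeatedly step against the gradient direction.
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

print(gradient_descent(0.0))  # converges toward the minimizer w = 3
```

Stochastic gradient descent (covered in the course) follows the same update rule, but estimates the gradient from a random mini-batch of training data rather than the full objective.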

- Johnson-Lindenstrauss lemma
- Convolutional neural networks
- Word embeddings (e.g., word2vec)
- Recurrent neural networks
- Autoencoders for unsupervised learning
- Ultra-deep learning with ResNet
- Generative models (e.g., Generative Adversarial Networks)
- Deep reinforcement learning
- Boltzmann machines
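As a taste of the convolutional networks listed above, here is a minimal sketch of the 1-D convolution (cross-correlation) operation that ConvNet layers are built from. The function name and example values are illustrative, not from the course.

```python
def conv1d(signal, kernel):
    """Valid-mode cross-correlation, the core operation in ConvNet layers."""
    k = len(kernel)
    # Slide the kernel over the signal; each output is a dot product.
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

# A [1, 0, -1] kernel acts as a simple finite-difference edge detector.
print(conv1d([1, 2, 3, 4], [1, 0, -1]))  # → [-2, -2]
```

In a real ConvNet the kernel weights are learned by gradient descent rather than fixed by hand, and the same idea extends to 2-D images with multiple channels.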

- Topic 01 - Deep Learning Overview
- Topic 02 - Machine Learning
- Topic 03 - Feed Forward Networks
- Topic 04 - Gradient Descent
- Topic 05 - Backpropagation
- Topic 06 - Learning Slowdown
- Topic 07 - Regularization
- Topic 08 - Hyperparameter Selection
- Topic 09 - SGD Variations
- Topic 10 - Universality
- Topic 11 - Scattering
- Topic 12 - Challenges

- Topic 13 - ConvNets
- Topic 14 - Autoencoders

- Exercise 1 - due by Thursday, Feb. 15, 4:00 PM.
- Data: Exercise1.zip

- Exercise 2 - due by Thursday, Mar. 1, 4:00 PM.
- Clarifications: Exercise2_clarification.pdf
- Python templates: prob4_1.py

- Exercise 3 - due by Thursday, Mar. 29, 4:00 PM.
- Code: Exercise3.zip

- Exercise 4 will be published on TBD, and due Thursday, TBD.
- Exercise 5 will be published on TBD, and due Thursday, TBD.