CS 4803 / 7643 Deep Learning
Fall 2021, Tues/Thurs 12:30 - 1:45 pm, Online Synchronous
Course Information
This is an exciting time to be studying (Deep) Machine Learning, or Representation Learning, or for lack of a better term, simply Deep Learning!
Deep Learning is rapidly emerging as one of the most successful and widely applicable sets of techniques across a range of domains (vision, language, speech, reasoning, robotics, AI in general), leading to some pretty significant commercial success and exciting new directions that may previously have seemed out of reach.
This course will introduce students to the basics of Neural Networks (NNs) and expose them to some cutting-edge research. It is structured in modules (background, Convolutional NNs, Recurrent NNs, Deep Reinforcement Learning, Deep Structured Prediction). Modules will be presented via instructor lectures and reinforced with homeworks that teach theoretical and practical aspects. The course will also include a project which will allow students to explore an area of Deep Learning that interests them in more depth.
Class Logistics
- Class meets: Tuesday, Thursday 12:30 - 1:45 pm; Remote
- Class link: https://primetime.bluejeans.com/a2m/live-event/taqgvgxk
- Piazza: piazza.com/gatech/fall2021/cs48037643
- Canvas: CS4803: gatech.instructure.com/courses/211778; CS7643: gatech.instructure.com/courses/211784
- Gradescope: CS4803: https://www.gradescope.com/courses/284497; CS7643: https://www.gradescope.com/courses/284496
- Pre-recorded Lectures: https://www.youtube.com/playlist?list=PL-fZD610i7yB7gDnPDpFcKpHI9X8z3OQ7
Schedule
| Date | Topic | Optional Reading |
| --- | --- | --- |
| W1: Aug 24 | Intro lecture + class logistics. CS 4803 Gradescope, CS 7643 Gradescope. Slides. PS0. Lecture Recording | |
| W1: Aug 26 | Image Classification and k-NN. PS0 due at midnight. Slides, Slides (annotated). Supervised Learning notes, k-NN notes. Lecture Recording | |
| W2: Aug 31 | Linear Classifiers, Loss Functions. Slides, Slides (annotated). Lecture Recording | |
| W2: Sep 2 | Regularization, Neural Networks. PS/HW1 out. Slides, Slides (annotated). Lecture Recording | |
| W3: Sep 7 | Optimization, Computing Gradients. Slides, Slides (annotated). Gradients notes. Lecture Recording | |
| W3: Sep 9 | Forward mode vs Reverse mode Auto-diff. Slides, Slides (annotated). Lecture Recording | |
| W4: Sep 14 | Backprop. Slides, Slides (annotated). Lecture Recording | |
| W4: Sep 16 | Backprop (Pt 2). PS/HW1 due night before (Wed 9/15), PS/HW2 out. Slides, Slides (annotated). Lecture Recording | |
| W5: Sep 21 | What is a convolution? Slides, Slides (annotated). CNNs notes. Lecture Recording | |
| W5: Sep 23 | CNNs 1: Convolutions, stride. Slides, Slides (annotated). Lecture Recording | |
| W6: Sep 28 | CNNs 2: Pooling, FC layers as conv, backprop in conv layers. Slides, Slides (annotated). CNNs Backprop notes. Lecture Recording | |
| W6: Sep 30 | CNN Architectures for image classification, pixel-level prediction (semantic segmentation, depth, etc.). Slides, Slides (annotated). Lecture Recording | |
| W7: Oct 5 | Visualizing CNNs. Slides (pdf). Lecture Recording | |
| W7: Oct 7 | RNNs 1. PS/HW2 due night before (Wed 10/6), PS/HW3 out. Slides, Slides (annotated). RNNs notes. Lecture Recording | |
| W8: Oct 12 | Fall Break | |
| W8: Oct 14 | RNNs 2: LSTMs, CNNs+RNNs. Slides, Slides (annotated). Lecture Recording | |
| W9: Oct 19 | RNNs 2.5: (1/2) of L18 + Image Captioning w/ Attention. Slides, Slides (annotated), Lecture Recording | |
| W9: Oct 21 | Self-Supervised Learning (Guest Lecture by Ishan Misra). PS/HW3 due night before (Wed 10/20), PS/HW4 out. Slides (pdf). Lecture Recording | |
| W10: Oct 26 | Self-Supervised Learning (Guest Lecture by Michael Auli). Slides (pdf), Lecture Recording | |
| W10: Oct 28 | RNNs 3: Transformers, BERT, ViLBERT, VLN-BERT (Guest Lecture by Arjun Majumdar). Slides (pdf), Slides (pdf), Admin. Slides (pdf), Lecture Recording | |
| W11: Nov 2 | RL background. Admin. Slides, Slides (RL), Slides (RL, annotated), Lecture Recording | |
| W11: Nov 4 | RL: Dynamic Programming (Policy Iteration), Q-Learning, DQN. PS/HW4 due night before (Wed 11/3), PS/HW5 out. Slides, Slides (annotated), Lecture Recording | |
| W12: Nov 9 | RL: Policy Gradients, REINFORCE, Actor-Critic. Slides, Slides (annotated), Lecture Recording | |
| W12: Nov 11 | Embodied AI (Guest Lecture by Joanne Truong). Slides, Lecture Recording | |
| W13: Nov 16 | CVPR (Conference on Computer Vision and Pattern Recognition) - No Class. Conference Link | |
| W13: Nov 18 | Neural Architecture Search (Guest Lecture by Erik Wijmans). PS/HW5 due night before (Wed 11/17). Slides, Lecture Recording | |
| W14: Nov 23 | Unsupervised Learning and Generative Modeling: VAEs 1. Slides, Slides (annotated), Lecture Recording | |
| W14: Nov 25 | Thanksgiving break - No Class | |
| W15: Nov 30 | VAEs 2. Slides, Slides (annotated), Lecture Recording | |
| W15: Dec 2 | GANs and wrap-up. Slides, Slides (annotated), Lecture Recording | |
| W16: Dec 7 | No class | |
| W16: Dec 9 | No class. Project due (can submit by 11:59pm, Dec 9 without penalty) | |
Grading
- 80% Homework (4 homeworks)
- 20% Final Project
- 5% (potential bonus) Class Participation
Late policy for deliverables
- No penalties for medical reasons or emergencies. Please see GT Catalog for rules about contacting the office of the Dean of Students.
- Every student has 7 free late days (7 x 24-hour chunks) for this course.
- After all free late days are used up, penalty is 25% for each additional late day.
- Late days for dropped homeworks will be redistributed to other homeworks.
Prerequisites
CS 4803/7643 should not be your first exposure to machine learning. Ideally, you should already have:
- Intro-level Machine Learning
- CS 3600 for the undergraduate section and CS 7641/ISYE 6740/CSE 6740 or equivalent for the graduate section.
- Algorithms
- Dynamic programming, basic data structures, complexity (NP-hardness)
- Calculus and Linear Algebra
- Positive semi-definiteness, multivariate derivatives (be prepared for lots and lots of gradients!)
- Programming
- This is a demanding class in terms of programming skills.
- HWs will involve Python and PyTorch (a minimal autograd sketch follows this list).
- Your library of choice for the project.
- Ability to deal with abstract mathematical concepts
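To give a rough sense of the PyTorch fluency the homeworks assume, here is a minimal autograd sketch (our own illustration, not taken from any assignment): it runs a toy linear model forward and lets PyTorch compute gradients of a squared-error loss.

```python
import torch

# Toy data for y = 2x; autograd will compute gradients of a
# squared-error loss with respect to the parameters w and b.
x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([2.0, 4.0, 6.0])

w = torch.tensor(0.5, requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)

pred = w * x + b                  # forward pass
loss = ((pred - y) ** 2).mean()   # mean squared error

loss.backward()                   # reverse-mode auto-diff fills w.grad and b.grad
print(loss.item(), w.grad.item(), b.grad.item())
```

If most of this snippet looks unfamiliar, plan to work through a PyTorch tutorial before PS/HW1.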
Project Details (20% of course grade)
The class project is meant for students to (1) gain experience implementing deep models and (2) try Deep Learning on problems that interest them. The amount of effort should be at the level of one homework assignment per group member (1-5 people per group).
A PDF write-up describing the project in a self-contained manner will be the sole deliverable. Your final write-up is required to be between 4 and 6 pages using the template here, structured like a paper from a computer vision conference (CVPR, ECCV, ICCV, etc.). Please use this template so we can fairly judge all student projects without worrying about altered font sizes, margins, etc. After the class, we will post all the final reports online so that you can read about each other's work. Additionally, we will allow people to upload additional code, videos, and other supplementary material as a zip file, similar to the code upload for assignments. While the PDF may link to supplementary material, external documents, and code, such resources may or may not be used to evaluate the project. The final PDF should completely address all of the points in the rubric described below.
Rubric (60 points)
We are not looking to see if you succeeded or failed at accomplishing what you set out to do. It’s ok if your results are not “good”. What matters is that you put in a reasonable effort, understand the project and how it relates to Deep Learning in detail, and are able to clearly communicate that understanding.
A former DARPA director named George H. Heilmeier came up with a list of questions for evaluating research projects. We’ve adapted that list for our rubric.
Introduction / Background / Motivation:
- (5 points) What did you try to do? What problem did you try to solve? Articulate your objectives using absolutely no jargon.
- (5 points) How is it done today, and what are the limits of current practice?
- (5 points) Who cares? If you are successful, what difference will it make?
Approach:
- (10 points) What did you do exactly? How did you solve the problem? Why did you think it would be successful? Is anything new in your approach?
- (5 points) What problems did you anticipate? What problems did you encounter? Did the very first thing you tried work?
Experiments and Results:
- (10 points) How did you measure success? What experiments were used? What were the results, both quantitative and qualitative? Did you succeed? Did you fail? Why?
In addition, 20 more points will be distributed based on:
- (5 points) Appropriate use of figures / tables / visualizations. Are the ideas presented with appropriate illustration? Are the results presented clearly; are the important differences illustrated?
- (5 points) Overall clarity. Is the manuscript self-contained? Can a peer who has also taken Deep Learning understand all of the points addressed above? Is sufficient detail provided?
- (10 points) Finally, points will be distributed based on your understanding of how your project relates to Deep Learning. Here are some questions to think about:
  - What was the structure of your problem? How did the structure of your model reflect the structure of your problem?
  - What parts of your model had learned parameters (e.g., convolution layers) and what parts did not (e.g., post-processing classifier probabilities into decisions)?
  - What representations of input and output did the neural network expect? How was the data pre/post-processed?
  - What was the loss function?
  - Did the model overfit? How well did the approach generalize?
  - What hyperparameters did the model have? How were they chosen? How did they affect performance? What optimizer was used?
  - What Deep Learning framework did you use?
  - What existing code or models did you start with and what did those starting points provide?
At least some of these questions, and others, should be relevant to your project and should be addressed in the PDF. You do not need to address all of them in full detail. Some may be irrelevant to your project and others may be standard and thus require only a brief mention. For example, it is sufficient to simply mention that the cross-entropy loss was used and not provide a full description of what that is. Generally, provide enough detail that someone with an appropriate background (in both Deep Learning and your domain of choice) could replicate the main parts of your project somewhat accurately, probably missing a few less important details.
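As a purely illustrative example of that "brief mention" level of detail: in PyTorch (the framework used in the homeworks), saying "we trained with the cross-entropy loss" usually corresponds to nothing more than the following; you would state it, not derive it. This snippet is our sketch, not a project requirement.

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()      # softmax + negative log-likelihood
logits = torch.randn(4, 10)            # e.g., a batch of 4 examples, 10 classes
labels = torch.randint(0, 10, (4,))    # ground-truth class indices
loss = criterion(logits, labels)       # scalar loss, averaged over the batch
```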
Submit the final report by uploading it to the “Final Project” assignment on Gradescope. There will be a group assignment corresponding to the project submission. Every group should submit the report once, and all group member names should be listed through the Gradescope interface. See instructions here. The supplementary material should also be uploaded as a zip file to the Final Project (Supplementary Material) assignment.
FAQs
- What is the delivery format?
  Due to Covid-19, this Fall21 instance is “Online Synchronous”, meaning that lectures are delivered over video conferencing at the scheduled class time. All other deliverables are accepted online. The class has no in-person interaction.
- The class is full. Can I still get in?
  Sorry. The course admins in the College of Computing (CoC) control this process. Please talk to them.
- Unregistered students who intend to register:
  If you are not registered for this course, you will not have access to Gradescope for submission of PS0. Please fill out the following form in order to be added to Gradescope and be able to submit PS0: https://qfreeaccountssjc1.az1.qualtrics.com/jfe/form/SV_3KUjzRkmrUwlWxo
  Students who individually emailed us and have not been added yet: you may have left out which course instance you are planning to take (CS 7643 or CS 4803). There are two separate Gradescope courses for the two instances. Please fill out the above form in order to provide us with this information.
- Registered students who are not able to access Gradescope:
  This will happen if you registered for the course very recently. Gradescope rosters are synced periodically, and it may take some time for you to receive a Gradescope sign-up notification. If you still have problems accessing Gradescope, please email us.
- I am graduating this Fall and I need this class to complete my degree requirements. What should I do?
  Talk to the advisor or graduate coordinator for your academic program. They are keeping track of your degree requirements and will work with you if you need a specific course.
- Can I audit this class or take it pass/fail?
  No. Due to the large demand for this class, we will not be allowing audits or pass/fail. Letter grades only. This is to make sure students who want to take the class for credit can.
- I have a question. What is the best way to reach the course staff?
  Registered students: your first point of contact is Piazza (so that other students may benefit from your questions and our answers). If you have a personal matter, create a private Piazza post.
Related Classes / Online Resources
- CS231n Convolutional Neural Networks for Visual Recognition, Stanford
- Machine Learning, Oxford
- Deep Learning, New York University
- Deep Learning, CMU
- Deep Learning, University of Maryland
- Hugo Larochelle’s Neural Networks class
Note to people outside Georgia Tech
Feel free to use the slides and materials available online here. If you use our slides, an appropriate attribution is requested. Please email the instructor with any corrections or improvements.