Lecture notes & Exercises


Intro to Deep Learning

Summary

Check out my IDL_notes.pdf lecture notes! If you'd rather not download the file, see the Google Drive link HERE, or either of the LinkedIn discussions 1 / 2.

Chapters recap:

| Chapter                          |                          Sections recap                          |
| -------------------------------- |:----------------------------------------------------------------:|
| Basic NN model                   | MLP, Activations, Loss function, Backpropagation (see the sketch after this table) |
| Deep NN theory                   | Universal Approximation Thm., Shallow vs. deep                    |
| LTI systems and Convolutional-NN | Time/Translation Invariance, Convolutional NN                     |
| Recurrent-NN                     | Elman network, Backpropagation through time, LSTM, GRU            |
| Attention Layers                 | Attention, Self-Attention, Multi-head Attention, Transformers     |
| Auto-Encoders                    | Auto-Encoders, VAE, WAE                                           |
| Generative Models                | GAN, cGAN, GLOW, GLO                                              |
| Optimizations                    | Sharp/Smooth minima, Momentum, AdaGrad, Adam                      |
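To make the first chapter concrete, here is a minimal sketch (plain NumPy, not code from the notes or exercises) of a one-hidden-layer MLP trained with manual backpropagation on a toy XOR problem; the layer sizes, learning rate, and step count are illustrative choices.

```python
# Minimal MLP + backpropagation sketch (NumPy). Illustrative only:
# hidden size, learning rate, and number of steps are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters of a 2-8-1 network
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)       # hidden activations
    y_hat = sigmoid(h @ W2 + b2)   # predictions in (0, 1)

    # Binary cross-entropy loss, averaged over the 4 samples
    loss = -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

    # Backpropagation: gradients of the loss w.r.t. each parameter
    d_out = (y_hat - y) / len(X)       # dL/d(output pre-activation)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T
    d_hidden = d_h * h * (1 - h)       # sigmoid derivative
    dW1 = X.T @ d_hidden
    db1 = d_hidden.sum(axis=0)

    # Plain gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss, "| predictions:", y_hat.ravel().round(2))
```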

Exercises

| Exercise |                                                                     Description                                                                      |
| -------- |:----------------------------------------------------------------------------------------------------------------------------------------------------:|
| ex1      | Implementing an MLP to identify peptides from the Spike protein of the SARS-CoV-2 virus                                                               |
| ex2      | Comparing an Elman network (basic RNN), a GRU cell, and an MLP with a restricted self-attention layer on classifying movie reviews as positive or negative |
| ex3      | Using a GAN and a conditional GAN (cGAN) to generate novel samples of the MNIST dataset (see the sketch after this table)                              |
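As a rough illustration of the setting in ex3, here is a minimal vanilla-GAN training loop on MNIST. It assumes PyTorch and torchvision are available; the layer sizes, learning rates, and epoch count are illustrative and not the values used in the actual exercise.

```python
# Vanilla GAN sketch (PyTorch assumed): a generator maps noise to flattened
# 28x28 images, a discriminator scores how "real" an image looks.
# All hyperparameters here are illustrative placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"
latent_dim = 100

generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, 28 * 28), nn.Tanh(),        # outputs in [-1, 1]
).to(device)

discriminator = nn.Sequential(
    nn.Linear(28 * 28, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),           # probability of "real"
).to(device)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

transform = transforms.Compose([transforms.ToTensor(),
                                transforms.Normalize((0.5,), (0.5,))])
loader = DataLoader(datasets.MNIST("data", train=True, download=True,
                                   transform=transform),
                    batch_size=128, shuffle=True)

for epoch in range(5):
    for real, _ in loader:
        real = real.view(real.size(0), -1).to(device)
        n = real.size(0)
        ones = torch.ones(n, 1, device=device)
        zeros = torch.zeros(n, 1, device=device)

        # Discriminator step: push real images toward 1, generated toward 0
        fake = generator(torch.randn(n, latent_dim, device=device))
        d_loss = bce(discriminator(real), ones) + bce(discriminator(fake.detach()), zeros)
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()

        # Generator step: try to make the discriminator output 1 on fakes
        g_loss = bce(discriminator(fake), ones)
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    print(f"epoch {epoch}: d_loss={d_loss.item():.3f}, g_loss={g_loss.item():.3f}")
```

A conditional GAN (cGAN), as used in the second part of ex3, extends this by feeding the class label to both networks (e.g., concatenating a one-hot label to the noise vector and to the flattened image), so that generation can be steered toward a specific digit.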

Hadar Sharvit

CS MSc student

GitHub Repository