
Module 8

Module 8 covers various aspects of Neural Networks, Computer Vision, and Deep Learning, including the history, training methods, and architectures like Multi-Layer Perceptrons and Convolutional Neural Networks. It also includes practical sessions on TensorFlow, Keras, and advanced topics such as Generative Adversarial Networks and Transformers. The module emphasizes hands-on learning with live sessions and interview questions related to deep learning.


Module-8: Neural Networks, Computer Vision and Deep Learning

Deep Learning: Neural Networks


• History of Neural networks and Deep Learning. 25 mins
• How do Biological Neurons work? 8 mins
• Growth of biological neural networks 17 mins
• Diagrammatic representation: Logistic Regression and Perceptron 17 mins
• Multi-Layered Perceptron (MLP). 23 mins
• Notation 18 mins
• Training a single-neuron model. 28 mins (see the sketch after this list)
• Training an MLP: Chain Rule 40 mins
• Training an MLP: Memoization 14 mins
• Backpropagation. 26 mins
• Activation functions 17 mins
• Vanishing Gradient problem. 23 mins
• Bias-Variance tradeoff. 10 mins
• Decision surfaces: Playground 15 mins
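
The single-neuron, chain-rule and backpropagation topics above can be condensed into a few lines of code. Below is a minimal NumPy sketch of training one sigmoid neuron (logistic regression) with plain gradient descent; the toy dataset, learning rate and iteration count are illustrative assumptions, not taken from the lectures.

```python
import numpy as np

# Minimal sketch: one sigmoid neuron trained with plain gradient descent.
# The toy data and hyperparameters below are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                  # 200 points, 2 features
y = (X[:, 0] + X[:, 1] > 0).astype(float)      # linearly separable labels

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    z = X @ w + b                              # pre-activation
    p = 1.0 / (1.0 + np.exp(-z))               # sigmoid activation
    # Chain rule: d(cross-entropy)/dz = (p - y); propagate to w and b.
    grad_z = (p - y) / len(y)
    w -= lr * (X.T @ grad_z)
    b -= lr * grad_z.sum()

print("train accuracy:", ((p > 0.5) == y).mean())
```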

Deep Learning: Deep Multi-layer perceptrons


• Deep Multi-layer perceptrons: 1980s to 2010s 16 mins
• Dropout layers & Regularization. 21 mins
• Rectified Linear Units (ReLU). 28 mins
• Weight initialization. 24 mins
• Batch Normalization. 21 mins
• Optimizers: Hill-descent analogy in 2D 19 mins
• Optimizers: Hill descent in 3D and contours. 13 mins
• SGD Recap 18 mins
• Batch SGD with momentum. 25 mins
• Nesterov Accelerated Gradient (NAG) 8 mins
• Optimizers: AdaGrad 15 mins
• Optimizers: Adadelta and RMSProp 10 mins
• Adam 11 mins (see the sketch of the momentum and Adam updates after this list)
• Which algorithm to choose when? 5 mins
• Gradient Checking and clipping 10 mins
• Softmax and Cross-entropy for multi-class classification. 25 mins
• How to train a Deep MLP? 8 mins
• Auto Encoders. 27 mins
• Word2Vec: CBOW 19 mins
• Word2Vec: Skip-gram 14 mins
• Word2Vec: Algorithmic Optimizations. 12 mins
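
As a companion to the optimizer videos above, here is a rough NumPy sketch of the SGD-with-momentum and Adam update rules applied to a toy quadratic objective; the hyperparameter values are the usual textbook defaults and are assumptions, not settings from the course.

```python
import numpy as np

# Toy objective f(w) = 0.5 * ||w||^2, whose gradient is simply w.
def grad(w):
    return w

w_mom, v = np.array([5.0, -3.0]), np.zeros(2)                    # momentum state
w_adam, m, s = np.array([5.0, -3.0]), np.zeros(2), np.zeros(2)   # Adam moments
lr, beta, beta1, beta2, eps = 0.1, 0.9, 0.9, 0.999, 1e-8

for t in range(1, 201):
    # SGD with momentum: velocity is an exponential average of past gradients.
    v = beta * v + grad(w_mom)
    w_mom -= lr * v

    # Adam: bias-corrected first/second moments give per-parameter step sizes.
    g = grad(w_adam)
    m = beta1 * m + (1 - beta1) * g
    s = beta2 * s + (1 - beta2) * g ** 2
    m_hat, s_hat = m / (1 - beta1 ** t), s / (1 - beta2 ** t)
    w_adam -= lr * m_hat / (np.sqrt(s_hat) + eps)

print("momentum:", w_mom, "adam:", w_adam)  # both approach the minimum at 0
```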

Deep Learning: TensorFlow and Keras


• TensorFlow and Keras overview 23 mins
• GPU vs CPU for Deep Learning. 23 mins
• Google Colaboratory. 5 mins
• Install TensorFlow 6 mins
• Online documentation and tutorials 6 mins
• Softmax Classifier on MNIST dataset 32 mins
• MLP: Initialization 11 mins
• Model 1: Sigmoid activation 22 mins
• Model 2: ReLU activation. 6 mins
• Model 3: Batch Normalization. 8 mins
• Model 4: Dropout. 5 mins
• MNIST classification in Keras. 18 mins (see the sketch after this list)
• Hyperparameter tuning in Keras. 11 mins
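
The models listed above (sigmoid, ReLU, Batch Normalization, Dropout) can be combined into one small MNIST MLP. The sketch below assumes TensorFlow 2.x with the bundled Keras API; the layer sizes, dropout rate and epoch count are illustrative choices rather than the exact settings used in the videos.

```python
import tensorflow as tf

# Sketch combining the models above (ReLU, Batch Normalization, Dropout) into
# one MNIST MLP; layer sizes, dropout rate and epochs are illustrative.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(10, activation="softmax"),  # softmax classifier head
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_split=0.1)
print(model.evaluate(x_test, y_test, verbose=0))
```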

Deep Learning: Convolutional Neural Nets


• Biological inspiration: Visual Cortex 18 mins
• Convolution: Edge Detection on images. 28 mins
• Convolution: Padding and strides 19 mins
• Convolution over RGB images. 11 mins
• Convolutional layer. 23 mins
• Max-pooling. 12 mins
• CNN Training: Optimization 9 mins
• Receptive Fields and Effective Receptive Fields 8 mins
• Example CNN: LeNet [1998] 10 mins
• ImageNet dataset. 6 mins
• Data Augmentation. 8 mins
• Convolution Layers in Keras 17 mins
• AlexNet 13 mins
• VGGNet 11 mins
• Residual Network. 22 mins
• Inception Network. 19 mins
• What is Transfer Learning? 23 mins
• Code example: Cats vs Dogs. 15 mins
• Code Example: MNIST dataset. 6 mins (see the sketch after this list)
• [Interview Question] How to build a face recognition system? 1 min
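
To tie together the convolution, pooling and LeNet topics above, here is a rough LeNet-style CNN on MNIST in Keras (TensorFlow 2.x assumed); the filter counts and kernel sizes loosely follow LeNet-5 and are illustrative rather than the exact course notebook.

```python
import tensorflow as tf

# Rough LeNet-style CNN on MNIST; filter counts and kernel sizes loosely follow
# LeNet-5 and are illustrative, not the exact course notebook.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0   # add a channel axis: (28, 28, 1)
x_test = x_test[..., None] / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(6, 5, padding="same", activation="relu",
                           input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),                  # halves spatial resolution
    tf.keras.layers.Conv2D(16, 5, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(120, activation="relu"),
    tf.keras.layers.Dense(84, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, validation_split=0.1)
```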

Deep Learning: Long Short-Term Memory (LSTMs)


• Why RNNs? 23 mins
• Recurrent Neural Network. 29 mins
• Training RNNs: Backprop. 16 mins
• Types of RNNs. 14 mins
• Need for LSTM/GRU. 10 mins
• LSTM. 35 mins
• GRUs. 7 mins
• Deep RNN. 7 mins
• Bidirectional RNN. 12 mins
• Code example: IMDB Sentiment classification 33 mins (see the sketch below)
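
A compact version of the IMDB sentiment example above: an Embedding layer feeding a single LSTM and a sigmoid output, using the IMDB dataset bundled with Keras. The vocabulary size, sequence length and unit counts are illustrative assumptions.

```python
import tensorflow as tf

# Compact sketch of the IMDB sentiment example: Embedding -> LSTM -> sigmoid.
# Vocabulary size, sequence length and unit counts are illustrative.
num_words, maxlen = 10000, 200
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.imdb.load_data(num_words=num_words)
x_train = tf.keras.preprocessing.sequence.pad_sequences(x_train, maxlen=maxlen)
x_test = tf.keras.preprocessing.sequence.pad_sequences(x_test, maxlen=maxlen)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(num_words, 32),
    tf.keras.layers.LSTM(64),                        # final hidden state summarizes the review
    tf.keras.layers.Dense(1, activation="sigmoid"),  # positive vs. negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=64, validation_split=0.1)
print(model.evaluate(x_test, y_test, verbose=0))
```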

Deep Learning: Generative Adversarial Networks (GANs)


• Live session on Generative Adversarial Networks (GAN) 124 mins
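
As a rough illustration of the GAN idea covered in the live session, the sketch below wires a small generator and discriminator together and shows one adversarial training step with TensorFlow's GradientTape; the architectures and hyperparameters are placeholder assumptions, not the session's code.

```python
import tensorflow as tf

# Hypothetical minimal GAN sketch: a generator maps noise to 28x28 images and a
# discriminator scores real vs. fake; one adversarial training step is shown.
latent_dim = 100

generator = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(latent_dim,)),
    tf.keras.layers.Dense(28 * 28, activation="sigmoid"),
    tf.keras.layers.Reshape((28, 28)),
])
discriminator = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1),                        # logit: real vs. fake
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt, d_opt = tf.keras.optimizers.Adam(1e-4), tf.keras.optimizers.Adam(1e-4)

@tf.function
def train_step(real_images):
    noise = tf.random.normal([tf.shape(real_images)[0], latent_dim])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake_images = generator(noise, training=True)
        real_logits = discriminator(real_images, training=True)
        fake_logits = discriminator(fake_images, training=True)
        # Discriminator pushes real -> 1 and fake -> 0; generator tries to fool it.
        d_loss = (bce(tf.ones_like(real_logits), real_logits)
                  + bce(tf.zeros_like(fake_logits), fake_logits))
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    return d_loss, g_loss

# Usage: call train_step(batch) in a loop over batches of real images scaled to [0, 1].
```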

Encoder-Decoder Models
• LIVE: Encoder-Decoder Models 82 mins

Attention Models in Deep Learning


• Attention Models in Deep Learning 84 mins
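
The core computation behind the attention models discussed in the session above is scaled dot-product attention, softmax(QKᵀ/√d)·V. Here is a short NumPy sketch of that formula; the random query/key/value matrices are purely illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted sum of values

# Illustrative shapes: 3 queries, 5 keys/values, dimension 4.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 4)), rng.normal(size=(5, 4)), rng.normal(size=(5, 4))
print(scaled_dot_product_attention(Q, K, V).shape)   # (3, 4)
```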

Deep Learning: Transformers and BERT


• Transformers and BERT 112 mins
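
For a quick hands-on feel of BERT after the session above, a pretrained masked-language model can be queried in a few lines. This sketch assumes the Hugging Face `transformers` library (not necessarily the tooling used in the course) and the publicly available `bert-base-uncased` checkpoint.

```python
# Assumes: pip install transformers torch
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the most likely tokens for the [MASK] position.
for pred in fill_mask("Deep learning is a subfield of [MASK] learning."):
    print(f"{pred['token_str']:>12s}  {pred['score']:.3f}")
```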

Deep Learning: Image Segmentation


• Live session on Image Segmentation 95 mins

Interview Questions on Deep Learning


• Questions and Answers 30 mins

Deep Learning: Object Detection


• Object Detection 123 mins
• Object Detection YOLO V3 103 mins

Module 8: Live Sessions
