Week 1: Overview and motivation for the course: why deep learning, its importance, companies working in the area, applications, and future directions. Who the course is designed for (target audience), week-wise contents, and how this course differs from others: more hands-on work and live classes to make it more impactful.
Overview of machine learning and deep learning, the difference between ML and DL with an example, and the history and evolution of deep learning alongside gains in computational efficiency.
Introduction to neural networks: the perceptron and its relation to logistic regression, the single-layer perceptron, a numerical problem on the single-layer perceptron, and limitations of the single-layer perceptron.
Hands-on: building a simple perceptron model in a Colab notebook.
Introduction to the multilayer perceptron, the difference between shallow and deep neural networks, worked examples of designing a neural network (5 to 7 examples), activation functions, and loss functions.
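As a preview of this hands-on, a single-layer perceptron fits in a few lines of NumPy. The sketch below trains on the AND gate; the learning rate and epoch count are illustrative choices, not the exact notebook contents:

```python
import numpy as np

# Minimal single-layer perceptron trained on the AND gate.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
lr = 0.1                                     # illustrative learning rate

for _ in range(20):                          # a few passes over the data suffice
    for xi, target in zip(X, y):
        pred = int(np.dot(w, xi) + b > 0)    # step activation
        err = target - pred
        w += lr * err * xi                   # perceptron update rule
        b += lr * err

preds = [int(np.dot(w, xi) + b > 0) for xi in X]
print(preds)  # learns AND: [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron converges; XOR, covered under the limitations above, would not.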
Week 2: Gradient descent (GD) and backpropagation with mean squared error (MSE) loss.
Optimizers: momentum-based GD, Nesterov accelerated GD, stochastic GD, AdaGrad, AdaDelta, RMSProp, Adam. Regularization techniques: L1/L2 regularization, dropout, early stopping.
Hands-on: building artificial neural networks for classification and regression problems, with exposure to hyperparameter tuning. Interpreting the results using simple XAI techniques: LIME and SHAP.
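To connect the gradient-descent and backpropagation sessions with this hands-on, here is a minimal sketch of a one-hidden-layer network trained with plain GD on an MSE loss. The toy target function, layer sizes, learning rate, and epoch count are all illustrative assumptions:

```python
import numpy as np

# One-hidden-layer network with tanh hidden units, trained by full-batch
# gradient descent on MSE, writing out the backprop gradients by hand.
rng = np.random.default_rng(0)

X = rng.uniform(-1, 1, size=(64, 2))
y = (X[:, :1] ** 2 + X[:, 1:]) / 2          # a smooth toy target function

W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1

losses = []
for _ in range(500):
    h = np.tanh(X @ W1 + b1)                # forward: hidden activations
    out = h @ W2 + b2                       # linear output layer
    err = out - y
    losses.append(float((err ** 2).mean()))
    # Backprop: gradient of MSE w.r.t. each parameter
    d_out = 2 * err / len(X)
    dW2 = h.T @ d_out;  db2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)     # tanh derivative
    dW1 = X.T @ d_h;    db1 = d_h.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(losses[0], losses[-1])                # loss drops substantially
```

Swapping the plain update step for momentum, RMSProp, or Adam changes only the last four lines, which is exactly the comparison the optimizer session makes.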
Week 3: CNNs: fundamentals of image representation, image preprocessing, and data augmentation.
Introduction to convolutional neural networks: the inspiration behind CNNs, key components of a CNN, and types of convolutions.
CNN architecture (covered across two sessions).
Hands-on: building a simple CNN model for binary and multiclass classification.
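Before reaching for a framework in this hands-on, the core convolution operation can be written out explicitly. The sketch below is a "valid" 2-D convolution (strictly, cross-correlation, as most DL frameworks compute it) with a made-up edge-detection kernel for illustration:

```python
import numpy as np

# Explicit sliding-window 2-D convolution ("valid" padding, stride 1).
def conv2d(image, kernel):
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Multiply the kernel against each window and sum
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.array([[1., 2., 3.],
                  [4., 5., 6.],
                  [7., 8., 9.]])
edge = np.array([[1., -1.]])     # simple horizontal-edge kernel
print(conv2d(image, edge))       # each output = left pixel minus right pixel
```

The loop version makes the weight sharing and local receptive field visible; a framework's Conv2D layer does the same computation over many kernels at once.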
Week 4: A typical CNN structure; standard CNN models: AlexNet, VGGNet-16 and VGGNet-19.
Standard CNN models: GoogLeNet, ResNet-18 and ResNet-34.
Standard CNN models: Inception, transfer learning.
Hands-on: transfer learning and building an ensemble model.
Week 5: Introduction to XAI: algorithms and their working mechanisms.
Hands-on: interpreting the results from a CNN model using simple XAI techniques: Grad-CAM and SmoothGrad.
Week 6: Evaluation metrics for segmentation; CNN-based segmentation algorithms: U-Net.
Attention-based U-Net; introduction to CNN-based object detection models.
Object detection algorithms: YOLO, R-CNN, and Faster R-CNN.
Hands-on: object detection using YOLO.
Hands-on: U-Net and attention-based U-Net.
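A metric shared by both hands-on sessions above is intersection-over-union (IoU): it scores segmentation masks and matches predicted detection boxes to ground truth. A minimal box-IoU sketch, with (x1, y1, x2, y2) corner coordinates as an assumed convention:

```python
# IoU of two axis-aligned boxes given as (x1, y1, x2, y2).
def iou(a, b):
    # Corners of the intersection rectangle
    x1 = max(a[0], b[0]); y1 = max(a[1], b[1])
    x2 = min(a[2], b[2]); y2 = min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)   # zero if boxes don't overlap
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)    # intersection over union

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # overlap 1, union 7 -> 1/7
print(iou((0, 0, 2, 2), (0, 0, 2, 2)))  # identical boxes -> 1.0
```

Detection pipelines typically call a prediction a true positive when IoU with a ground-truth box exceeds a threshold such as 0.5.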
Week 7: Sequence-to-sequence models: introduction to recurrent neural networks and their structure; challenges in RNNs (vanishing and exploding gradients).
A numerical problem on RNNs.
Hands-on: building RNNs on structured and unstructured data.
Variants of RNNs, with hands-on practice.
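The recurrence behind these sessions can be sketched directly: a vanilla RNN cell applies h_t = tanh(x_t W_xh + h_(t-1) W_hh + b) at each step. The sizes and random weights below are illustrative:

```python
import numpy as np

# Forward pass of a vanilla RNN cell, unrolled over a short sequence.
rng = np.random.default_rng(1)
W_xh = rng.normal(0, 0.5, (3, 4))   # input -> hidden
W_hh = rng.normal(0, 0.5, (4, 4))   # hidden -> hidden (shared every step)
b = np.zeros(4)

xs = rng.normal(0, 1, (5, 3))       # a sequence of 5 input vectors
h = np.zeros(4)                     # initial hidden state
for x in xs:
    # tanh keeps the state bounded in (-1, 1); backprop through many of
    # these steps multiplies by W_hh and tanh' < 1 repeatedly, which is
    # the source of vanishing gradients discussed above.
    h = np.tanh(x @ W_xh + h @ W_hh + b)

print(h)  # final hidden state summarising the whole sequence
```

Working one step of this recurrence by hand is exactly the shape of the numerical problem in this week.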
Week 8: Introduction to the long short-term memory (LSTM) architecture and why it is needed; bidirectional LSTMs; stacked LSTMs.
Understanding the GRU architecture; comparing LSTM vs. GRU on speed, accuracy, and complexity; when to use GRU over LSTM.
Why attention mechanisms in RNNs, how the attention mechanism works, and the benefits of attention.
Hands-on: building LSTM models for structured and unstructured data.
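As a bridge from the architecture sessions to this hands-on, a single LSTM step can be written out in NumPy: the forget, input, and output gates decide what the cell state keeps, adds, and exposes. Sizes and random weights are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    # W: (4*hidden, input), U: (4*hidden, hidden), b: (4*hidden,)
    z = W @ x + U @ h + b
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)   # gates in (0, 1)
    g = np.tanh(g)                                 # candidate values
    c_new = f * c + i * g                          # gated cell-state update
    h_new = o * np.tanh(c_new)                     # expose filtered state
    return h_new, c_new

rng = np.random.default_rng(0)
hidden, inp = 4, 3
W = rng.normal(0, 0.5, (4 * hidden, inp))
U = rng.normal(0, 0.5, (4 * hidden, hidden))
b = np.zeros(4 * hidden)

h = np.zeros(hidden)
c = np.zeros(hidden)
for x in rng.normal(0, 1, (6, inp)):    # run over a 6-step toy sequence
    h, c = lstm_step(x, h, c, W, U, b)

print(h, c)
```

The additive update f * c + i * g is the key design choice: gradients can flow through the cell state without repeated squashing, which is why LSTMs ease the vanishing-gradient problem from Week 7. A GRU merges the gates into two, trading a little capacity for speed.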
Week 9: Introduction to NLP tasks; classical NLP vs. deep learning NLP; text preprocessing: tokenization, stopword removal, lemmatization.
Word representations: one-hot encoding; word embeddings: Word2Vec, GloVe, FastText.
Sequence modeling in NLP: recurrent neural network (RNN) basics; long short-term memory (LSTM) and gated recurrent units (GRU); combining word embeddings with RNNs for sequence tasks.
Hands-on session: RNNs for an NLP task.
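The word-representation session above contrasts one-hot vectors with dense embeddings; a tiny sketch makes the difference concrete. The 2-D "embedding" values below are made up purely for illustration (real Word2Vec/GloVe vectors are learned and much higher-dimensional):

```python
import numpy as np

# One-hot vectors: every pair of distinct words is orthogonal,
# so there is no built-in notion of similarity.
vocab = ["king", "queen", "apple"]
one_hot = np.eye(len(vocab))
print(one_hot[0] @ one_hot[1])            # 0.0 for any distinct pair

# Dense embeddings (made-up values): related words end up close together.
emb = {"king":  np.array([0.9, 0.8]),
       "queen": np.array([0.85, 0.75]),
       "apple": np.array([-0.7, 0.6])}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(emb["king"], emb["queen"]))  # high: similar words
print(cosine(emb["king"], emb["apple"]))  # lower: unrelated words
```

Feeding such embedding vectors, one per token, into the RNNs of the previous session is the standard pipeline for the sequence tasks listed above.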
Week 10: Unsupervised learning: introduction to autoencoders; the architecture of an autoencoder and the math behind it.
Types of autoencoders (simple, deep, CNN-based); training autoencoders.
Hands-on: building an autoencoder and its variants.
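A minimal linear autoencoder illustrates the bottleneck idea behind this hands-on: 2-D points are compressed to a 1-D code and decoded back, trained by gradient descent on reconstruction MSE. The synthetic data and hyperparameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
t = rng.normal(0, 1, (200, 1))
# Data lying near a 1-D line in 2-D, plus a little noise
X = np.hstack([t, 2 * t]) + rng.normal(0, 0.05, (200, 2))

W_enc = rng.normal(0, 0.1, (2, 1))    # encoder: 2 -> 1 bottleneck
W_dec = rng.normal(0, 0.1, (1, 2))    # decoder: 1 -> 2
lr = 0.02

losses = []
for _ in range(300):
    code = X @ W_enc                  # compressed representation
    recon = code @ W_dec              # reconstruction
    err = recon - X
    losses.append(float((err ** 2).mean()))
    d_recon = 2 * err / err.size      # gradient of MSE w.r.t. recon
    dW_dec = code.T @ d_recon
    d_code = d_recon @ W_dec.T
    dW_enc = X.T @ d_code
    W_enc -= lr * dW_enc
    W_dec -= lr * dW_dec

print(losses[0], losses[-1])          # reconstruction error drops sharply
```

Replacing the linear maps with deeper nonlinear stacks, or with convolutions, gives the deep and CNN-based variants listed above.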
Week 11: Transformer architectures: self-attention, encoder-decoder attention, in-context learning, low-rank adaptation (LoRA). Self-supervised learning: objectives and loss functions, masked language modeling.
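The self-attention at the heart of these architectures reduces to a few matrix products: every position attends to every other with weights softmax(QK^T / sqrt(d)). A single-head sketch with illustrative sizes and random weights:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)    # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project the same sequence into queries, keys and values
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # scaled pairwise similarity
    weights = softmax(scores)                # each row sums to 1
    return weights @ V, weights              # weighted mix of values

rng = np.random.default_rng(0)
X = rng.normal(0, 1, (5, 8))        # 5 tokens, model dimension 8
Wq, Wk, Wv = (rng.normal(0, 0.5, (8, 8)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape)                    # (5, 8): one output vector per token
print(weights.sum(axis=1))          # all ones
```

Multi-head attention runs several such blocks in parallel; LoRA, also listed above, fine-tunes by adding low-rank updates to projection matrices like Wq and Wv rather than retraining them fully.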
Week 12: Large language models: tokenizers, pre-training and post-training, multimodal alignment, model compression, reinforcement learning for fine-tuning, proximal policy optimization (PPO), and benchmarking and evaluation of LLMs.
Diffusion Models: Deep generative models, VAEs and GANs, Forward and reverse diffusion, Denoising Score Matching, Variational Lower Bounds, Stable Diffusion
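The forward (noising) half of diffusion has a closed form worth seeing once: x_t = sqrt(alpha_bar_t) x_0 + sqrt(1 - alpha_bar_t) eps, with alpha_bar_t the cumulative product of (1 - beta_t). The linear beta schedule below is a common illustrative choice, not a prescription:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000
betas = np.linspace(1e-4, 0.02, T)      # illustrative linear noise schedule
alpha_bar = np.cumprod(1 - betas)       # cumulative signal-retention factor

def q_sample(x0, t):
    # Jump straight to timestep t without simulating each noising step
    eps = rng.normal(0, 1, x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1 - alpha_bar[t]) * eps

x0 = rng.normal(0, 1, 10_000)           # toy 1-D "data"
x_late = q_sample(x0, T - 1)
print(alpha_bar[-1])                    # nearly 0: original signal almost gone
print(x_late.std())                     # close to 1: essentially pure Gaussian
```

Training then amounts to learning the reverse direction: predicting the noise eps from x_t, which is the objective behind denoising score matching and, at scale, Stable Diffusion.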