Week 1: Introduction to the Theory of Learning: the meaning of learning, overfitting, etc.
Week 2: Convex functions and sets, Convex Optimization, Optimization problem Formulations
Week 3: Gradient and Sub-gradient descent for non-smooth functions
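As a concrete illustration of the Week 3 material, here is a minimal sketch of sub-gradient descent on a non-smooth convex objective. The objective f(x) = |x - 3|, the diminishing step size, and all numeric values are illustrative choices, not part of the syllabus:

```python
import numpy as np

def subgradient_descent(f, subgrad, x0, steps=2000, lr0=0.5):
    """Minimize a (possibly non-smooth) convex f via sub-gradient descent
    with diminishing step size lr0 / sqrt(t + 1), tracking the best iterate."""
    x = float(x0)
    best_x, best_f = x, f(x)
    for t in range(steps):
        x -= lr0 / np.sqrt(t + 1) * subgrad(x)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x

# f(x) = |x - 3| is non-differentiable at its minimizer x = 3;
# sign(x - 3) is a valid sub-gradient everywhere (0 at the kink).
f = lambda x: abs(x - 3)
sg = lambda x: np.sign(x - 3)
x_star = subgradient_descent(f, sg, x0=-5.0)
```

Tracking the best iterate matters here: with a non-smooth objective the last iterate oscillates around the kink rather than settling on it.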
Week 4: Regularization, Lasso and Ridge, Applications with medical data
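A short sketch of the Week 4 ridge estimator in closed form, on synthetic data (the design, true weights, and regularization strength lam are all assumed values for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.0, 0.5, 3.0])
y = X @ w_true + 0.1 * rng.normal(size=n)

lam = 1.0
# Ridge regression solution: w = (X^T X + lam I)^{-1} X^T y
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
```

Lasso has no closed form because of the non-smooth L1 penalty; it is typically solved with coordinate descent or the sub-gradient methods of Week 3.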
Week 5: Accelerating Gradient Descent, Stochastic Gradient Descent and its applications (NN)
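A minimal sketch of stochastic gradient descent from Week 5, applied to least-squares regression on synthetic data (the learning rate, epoch count, and data-generating weights are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 500, 3
X = rng.normal(size=(n, d))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.05 * rng.normal(size=n)

w = np.zeros(d)
lr = 0.01
for epoch in range(20):
    for i in rng.permutation(n):        # shuffle, then one sample per update
        grad = (X[i] @ w - y[i]) * X[i]  # gradient of 0.5 * (x_i^T w - y_i)^2
        w -= lr * grad
```

Each update touches a single example, which is what makes SGD practical for the large datasets used to train neural networks.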
Week 6: Support Vector Regression, Logistic Regression for dichotomous variables
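A sketch of Week 6's logistic regression for a dichotomous outcome, fit by gradient descent on the negative log-likelihood; the synthetic data, learning rate, and iteration count are assumed values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
n = 400
X = rng.normal(size=(n, 2))
w_true = np.array([3.0, -2.0])
# labels drawn from Bernoulli(sigmoid(x^T w_true))
y = (sigmoid(X @ w_true) > rng.uniform(size=n)).astype(float)

w = np.zeros(2)
lr = 0.1
for _ in range(500):
    p = sigmoid(X @ w)
    grad = X.T @ (p - y) / n   # gradient of the mean negative log-likelihood
    w -= lr * grad

acc = np.mean((sigmoid(X @ w) > 0.5) == (y > 0.5))
```

The loss is convex in w, so plain gradient descent suffices; there is no closed-form solution as in least squares.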
Week 7: Maximum likelihood estimation (MLE) in Binomial, Multinomial, and Gaussian models in the exponential family
Week 8: Maximum likelihood estimation (MLE) in Binomial, Multinomial, and Gaussian models in the exponential family (Contd.)
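For the Gaussian case covered in Weeks 7–8, the MLE has a closed form: the sample mean, and the mean squared deviation (with denominator n, not the unbiased n - 1). A quick numerical check on synthetic data with assumed true parameters mu = 5, sigma = 2:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=5.0, scale=2.0, size=10000)

# MLE for a univariate Gaussian:
mu_hat = x.mean()                         # maximizes the log-likelihood in mu
sigma2_hat = ((x - mu_hat) ** 2).mean()   # note denominator n, not n - 1
```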
Week 9: Dimensionality reduction techniques
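As one representative technique from Week 9, here is a sketch of principal component analysis via the SVD of centered data; the synthetic 3-D dataset concentrated along one direction is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(4)
# 200 points lying mostly along a single direction in 3-D, plus small noise
z = rng.normal(size=(200, 1))
direction = np.array([[1.0, 2.0, -1.0]]) / np.sqrt(6)
X = z @ direction + 0.05 * rng.normal(size=(200, 3))

Xc = X - X.mean(axis=0)                  # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)          # variance ratio per component
X_1d = Xc @ Vt[0]                        # projection onto the first principal axis
```

Since the data are essentially one-dimensional, the first component should capture nearly all the variance.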
Week 10: Dynamical systems and control, Fourier transform and its applications
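For the Fourier transform portion of Week 10, a standard sketch: recover the frequencies of a two-tone signal with the FFT. The sampling rate and tone frequencies are assumed values chosen so each tone falls exactly on an FFT bin:

```python
import numpy as np

fs = 1000                                # sampling rate in Hz
t = np.arange(0, 1, 1 / fs)              # one second of samples
# signal: a 50 Hz sinusoid plus a weaker 120 Hz sinusoid
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.abs(np.fft.rfft(x))        # magnitude spectrum (real input)
freqs = np.fft.rfftfreq(len(x), d=1 / fs)
peaks = freqs[np.argsort(spectrum)[-2:]] # frequencies of the two largest peaks
```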
Week 11: Expectation Maximization (EM) based learning in mixture models, Hidden Markov Models, and Dirichlet processes (clustering)
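A compact sketch of the Week 11 EM algorithm for a two-component 1-D Gaussian mixture; the synthetic clusters and the initial parameter guesses are assumptions made for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(5)
# two well-separated 1-D Gaussian clusters
x = np.concatenate([rng.normal(-4, 1, 300), rng.normal(4, 1, 300)])

# initial guesses (hypothetical starting values)
mu = np.array([-1.0, 1.0])
sigma2 = np.array([1.0, 1.0])
pi = np.array([0.5, 0.5])

for _ in range(50):
    # E-step: responsibility of each component for each point
    dens = (pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * sigma2))
            / np.sqrt(2 * np.pi * sigma2))
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities
    nk = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma2 = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    pi = nk / len(x)
```

Each iteration increases the data log-likelihood, and with well-separated clusters the means converge quickly to the true centers.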
Week 12: Bayesian Machine Learning, estimating decisions using posterior distributions; model selection via Variational Inference
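A tiny example of the Week 12 idea of deciding from a posterior, using the conjugate Beta-Bernoulli model; the prior pseudo-counts and the observed coin-flip counts are assumed values:

```python
# Bayesian coin-flip estimation: Beta(a, b) prior on the heads probability,
# Bernoulli likelihood. By conjugacy the posterior is Beta(a + heads, b + tails).
a, b = 2.0, 2.0              # prior pseudo-counts (assumed)
heads, tails = 70, 30        # observed data (assumed)

a_post, b_post = a + heads, b + tails
# posterior mean is the Bayes-optimal point estimate under squared-error loss
posterior_mean = a_post / (a_post + b_post)
```

When no conjugate pair exists, the posterior is intractable, which is where the variational inference methods of this week come in.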