Week 1:
1. Introduction to Distributed Optimization
2. Mathematical Optimization, Convex Sets and Convex Functions
Week 2:
1. Strong Convexity and Its Implications
2. Constrained Optimization Problems: Primal and Lagrangian Dual
Week 3:
1. KKT Conditions and Primal/Dual Methods
2. Analysis of Gradient Descent
Week 4:
1. Analysis of Accelerated Optimization Algorithms
2. Optimization Algorithms as Dynamical Systems and Introduction to Stability Theory
Week 5:
1. Lyapunov Analysis of Gradient Flows
2. Gradient Flows for Equality Constrained Optimization and Saddle-Point Problems
Week 6:
1. Accelerated Gradient Flows
2. Augmented Lagrangian and Method of Multipliers
Week 7:
1. ADMM (Alternating Direction Method of Multipliers)
2. Dual Ascent and Dual Decomposition
Week 8:
1. Introduction to Graph Theory
2. Distributed Consensus
Week 9:
1. Continuous-Time Analysis of Consensus Algorithms
2. Distributed Optimization Problem (Economic Dispatch Problem)
Week 10:
1. Distributed Optimization Algorithms
2. Introduction to Neural Networks and Ring-Allreduce Algorithm
Week 11:
1. Introduction to Federated Learning
2. Data Heterogeneity in Federated Learning
Week 12:
1. Computational Heterogeneity in Federated Learning
2. Robustness in Federated Learning