Basic Information
Definitions in optimization, including extended real-valued functions, gradients, subgradients, feasible descent directions, and the convex conjugate function. Necessary and sufficient conditions for optimality of unconstrained and constrained optimization: stationarity and KKT conditions. From optimality conditions to iterative solution methods: gradient descent, subgradient, projected gradient, and proximal gradient algorithms and their convergence analysis. Separation theorems and Lagrangian-based duality. Examples from the worlds of logistics, machine learning, and signal processing.
Learning Outcomes: At the end of the course the students will be able to:
1. Differentiate between convex and nonconvex problems.
2. Derive dual problems and use duality to solve optimization problems.
3. Program first-order algorithms to solve structured optimization problems.
4. Prove properties of optimal solutions of continuous optimization problems.
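As a minimal illustration of the first-order methods listed above, the following sketch implements projected gradient descent on a simple box-constrained least-squares problem. The function names and the example problem are illustrative choices, not course material:

```python
import numpy as np

def projected_gradient(grad, project, x0, step, iters=100):
    # Projected gradient descent: x_{k+1} = P_C(x_k - t * grad(x_k)),
    # where P_C is the Euclidean projection onto the feasible set C.
    x = x0
    for _ in range(iters):
        x = project(x - step * grad(x))
    return x

# Example problem: minimize f(x) = (1/2) * ||x - b||^2 over the box [0, 1]^n.
# grad f(x) = x - b, and projection onto the box is coordinate-wise clipping.
b = np.array([1.5, -0.3, 0.7])
grad = lambda x: x - b
project = lambda x: np.clip(x, 0.0, 1.0)

x_star = projected_gradient(grad, project, np.zeros(3), step=0.5, iters=200)
# For this problem the minimizer is the projection of b onto the box,
# i.e. [1.0, 0.0, 0.7], which the iterates converge to.
```

The step size 0.5 is safe here because the gradient of this quadratic is 1-Lipschitz; any step below 2 would converge for this objective.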
Faculty: Data and Decision Sciences
Undergraduate Studies | Graduate Studies
Pre-required courses
96327 - Nonlinear Models in Operations Research
Course with no extra credit
236330 - Introduction to Optimization
Course with no extra credit (contained)
97311 - Optimization 1
98311 - Optimization 1