The topics covered in the course are divided into main topics and more mathematical topics. In short, the main topics are optimality conditions (unconstrained, convex, and general nonlinear constrained problems; necessary and sufficient conditions), duality, and numerical methods (unconstrained and constrained).
- Formulating optimization models. How do you build a model so that it can be solved efficiently? What types of problems are easy and which are hard? Applications: the Markowitz portfolio model, risk management and coherent risk measures, production planning. The relation between discrete and continuous optimization problems (nonlinear programming vs. dynamic optimization). Lectures 1 and 6.
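As a small modeling illustration (a hypothetical example, not taken from the course material): for two assets the minimum-variance Markowitz portfolio, ignoring a return target, has a closed form that can be spot-checked numerically.

```python
import numpy as np

# Hypothetical two-asset Markowitz example. With covariance matrix
#   S = [[s11, s12], [s12, s22]],
# the minimum-variance weights (weights summing to 1) are
#   w1 = (s22 - s12) / (s11 + s22 - 2*s12),  w2 = 1 - w1.
S = np.array([[0.04, 0.01],
              [0.01, 0.09]])
w1 = (S[1, 1] - S[0, 1]) / (S[0, 0] + S[1, 1] - 2 * S[0, 1])
w = np.array([w1, 1 - w1])
variance = w @ S @ w

# spot-check optimality: nearby feasible portfolios have higher variance
for eps in [-0.01, 0.01]:
    wp = np.array([w1 + eps, 1 - w1 - eps])
    assert wp @ S @ wp >= variance
print(w.round(4))
```

The check exploits that the portfolio variance is a convex quadratic in the single free weight, so perturbing the closed-form minimizer in either direction can only increase it.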
- Optimality. A problem may have many local and global optima. Necessary and sufficient optimality conditions: unconstrained (ch. 4.1), convex (thm 3.4.3), and nonlinear constrained problems (4.3.2, 4.3.6-8, 4.4.1). How to use optimality conditions in solving optimization problems. Proving optimality conditions using separation theorems and geometric optimality. How to get from the Weierstrass theorem to the FJ and KKT conditions. Lectures 1-5.
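A minimal numeric sketch of checking the KKT (here: Lagrange) conditions on an assumed toy problem, minimize x1² + x2² subject to x1 + x2 = 1; the candidate point and multiplier below are the ones the conditions single out.

```python
import numpy as np

# Toy example (not from the course text):
#   min x1^2 + x2^2   s.t.   h(x) = x1 + x2 - 1 = 0.
# Stationarity requires grad f(x) + lam * grad h(x) = 0 at the optimum.
x = np.array([0.5, 0.5])   # candidate optimal point
lam = -1.0                 # candidate Lagrange multiplier

grad_f = 2 * x                    # gradient of the objective
grad_h = np.array([1.0, 1.0])     # gradient of the equality constraint

stationarity = grad_f + lam * grad_h   # should be the zero vector
feasibility = x.sum() - 1.0            # should be zero
print(stationarity, feasibility)
```

Since the objective is convex and the constraint affine, these conditions are also sufficient, so the check certifies global optimality here.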
- Convexity in optimization. Different convex sets and functions. Constructing new convex functions using convexity-preserving operations. Techniques for proving that a function is convex. Lectures 1-5.
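One standard technique mentioned above is the second-order test: a twice-differentiable function is convex iff its Hessian is positive semidefinite everywhere. For a quadratic the Hessian is constant, so a single eigenvalue check suffices; this sketch (with an assumed matrix Q) is a numerical spot-check, not a proof technique for general functions.

```python
import numpy as np

# Second-order convexity check for f(x) = 0.5 * x' Q x + c' x:
# the Hessian is Q, and f is convex iff all eigenvalues of Q are >= 0.
Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])
eigvals = np.linalg.eigvalsh(Q)          # eigenvalues of the symmetric Hessian
is_convex = bool(np.all(eigvals >= -1e-12))
print(eigvals, is_convex)
```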
- Lagrange duality. Definition and interpretation of the dual problem, duality theorems, and using duality to obtain upper and lower bounds. Lecture 7.
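Weak duality, the source of the lower bounds mentioned above, can be seen on an assumed one-dimensional example: for min x² subject to x ≥ 1 the dual function can be computed in closed form, and every dual value bounds the primal optimum p* = 1 from below.

```python
# Hypothetical example (not from the course notes):
#   primal: min x^2  s.t.  x >= 1,  with optimal value p* = 1.
# Lagrangian: L(x, mu) = x^2 + mu * (1 - x),  mu >= 0.
# Dual function: theta(mu) = min_x L(x, mu) = mu - mu^2 / 4  (minimizer x = mu/2).
def theta(mu):
    return mu - mu ** 2 / 4.0

p_star = 1.0
for mu in [0.0, 0.5, 1.0, 2.0, 3.0]:
    # weak duality: every dual value is a lower bound on p*
    assert theta(mu) <= p_star + 1e-12
print(theta(2.0))  # the gap closes at mu = 2: theta(2) = 1 = p*
```

Here strong duality holds (convex problem, Slater's condition), so maximizing theta recovers p* exactly.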
- Numerical methods. Comparing algorithms: need for derivatives, speed of convergence, computational complexity, and memory requirements. Methods for unconstrained, non-differentiable, LP, and constrained problems. How to choose a suitable method? What software, computers, and languages are needed? The relation to solving systems of equations and differential equations. Lectures 8-12.
- Unconstrained optimization. Gradient descent and Newton's method (ch. 8.6). Conjugate direction and quasi-Newton methods (ch. 8.8). Trust-region methods (ch. 8.7). Lectures 8-9.
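The contrast between the two basic methods is sharpest on a strictly convex quadratic (an assumed example): one Newton step lands exactly on the minimizer, while gradient descent with a fixed step only approaches it geometrically.

```python
import numpy as np

# Assumed test problem: f(x) = 0.5 * x' Q x - b' x, minimizer x* = Q^{-1} b.
Q = np.array([[3.0, 0.0],
              [0.0, 1.0]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(Q, b)

# Newton's method: x_new = x - H^{-1} grad; exact in one step for quadratics.
x = np.zeros(2)
grad = Q @ x - b
x_newton = x - np.linalg.solve(Q, grad)

# Gradient descent with fixed step 1/L, L = largest eigenvalue of Q.
x_gd = np.zeros(2)
for _ in range(100):
    x_gd = x_gd - (1.0 / 3.0) * (Q @ x_gd - b)

print(np.allclose(x_newton, x_star), np.allclose(x_gd, x_star, atol=1e-6))
```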
- Constrained optimization. Penalty function methods: Augmented Lagrangian method (ch. 9.3). Barrier function methods: primal-dual interior point method (ch. 9.5). Sequential Quadratic Programming (SQP) (ch. 10.4).
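The basic idea behind the penalty approach can be sketched on an assumed equality-constrained toy problem: minimize the objective plus (rho/2)·h(x)², and drive rho upward so the penalized minimizers converge to the constrained solution. (This is the plain quadratic penalty, a simpler relative of the augmented Lagrangian method cited above.)

```python
import numpy as np

# Quadratic penalty sketch (assumed example):
#   min x1^2 + x2^2  s.t.  x1 + x2 = 1,  true solution (0.5, 0.5).
# Penalized problem: min P(x; rho) = x1^2 + x2^2 + (rho/2) * (x1 + x2 - 1)^2.
def penalized_min(rho):
    # Setting grad P = 0 gives x1 = x2 = t with 2t + rho*(2t - 1) = 0,
    # i.e. the closed form t = rho / (2 + 2*rho).
    t = rho / (2.0 + 2.0 * rho)
    return np.array([t, t])

for rho in [1.0, 10.0, 100.0, 1e6]:
    print(rho, penalized_min(rho))   # iterates approach (0.5, 0.5)
```

Each penalized minimizer is infeasible but gets closer to the constraint as rho grows; the augmented Lagrangian method avoids having to take rho all the way to infinity by also updating a multiplier estimate.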
More mathematical topics
- Generalizations of differentiability. This course covers one of them: the subdifferential (subgradients, 3.2.3). Lectures 3-4.
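The standard first example is f(x) = |x|: away from the origin the subdifferential is the singleton {sign(x)}, while at the kink x = 0 it is the whole interval [-1, 1]. The defining subgradient inequality f(y) ≥ f(x) + g·(y − x) can be spot-checked numerically:

```python
# Subdifferential of f(x) = |x| (a standard textbook example).
def in_subdiff_abs(x, g):
    if x > 0:
        return g == 1.0
    if x < 0:
        return g == -1.0
    return -1.0 <= g <= 1.0        # at the kink: the full interval [-1, 1]

# spot-check the subgradient inequality at x = 0 with g = 0.3
x, g = 0.0, 0.3
ok = all(abs(y) >= abs(x) + g * (y - x) for y in [-2.0, -0.5, 0.0, 0.7, 3.0])
print(in_subdiff_abs(x, g), ok)
```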
- Convex analysis. Properties of convex functions, convexity-preserving operations, and subdifferential calculus. Jensen's inequality. Lectures 1-4.
- Axiom of choice. Separation theorems: Farkas (2.4.5) and Motzkin. The Hahn-Banach separation theorem. Lecture 2.
- Implicit function theorem. Lectures 4-5.
- Interpretation of the Lagrange (Hamiltonian) function. Lectures 5 and 7.
- Lagrange and conjugate (Fenchel) duality. Lecture 7.
- Speed of convergence (7.4.1) and the condition number: how numerically well-behaved a problem is. Interpretation of the eigenvalues of the objective function's and the Lagrange function's Hessian matrices. Lectures 7-8, 11.
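The effect of the condition number can be demonstrated on assumed quadratic test problems: with the optimal fixed step, steepest descent contracts the error roughly by (kappa − 1)/(kappa + 1) per iteration, so a larger kappa = lmax/lmin means many more iterations.

```python
import numpy as np

# Count steepest-descent iterations to reach ||x|| < tol on f(x) = 0.5 * x' Q x
# (minimizer at the origin), using the optimal fixed step 2 / (lmax + lmin).
def gd_iters_to_tol(Q, tol=1e-6, max_iter=100000):
    eig = np.linalg.eigvalsh(Q)
    step = 2.0 / (eig.max() + eig.min())
    x = np.ones(Q.shape[0])
    for k in range(max_iter):
        if np.linalg.norm(x) < tol:
            return k
        x = x - step * (Q @ x)
    return max_iter

well = np.diag([1.0, 2.0])     # condition number kappa = 2
ill = np.diag([1.0, 100.0])    # condition number kappa = 100
print(gd_iters_to_tol(well), gd_iters_to_tol(ill))
```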
- Sherman-Morrison-Woodbury matrix inversion theorem (formula 8.55). Lecture 9.
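The rank-one (Sherman-Morrison) case of the formula can be verified directly on a small assumed example: the update of A⁻¹ after a rank-one change A + uvᵀ matches the freshly computed inverse.

```python
import numpy as np

# Sherman-Morrison (rank-one case of Sherman-Morrison-Woodbury):
#   (A + u v')^{-1} = A^{-1} - (A^{-1} u)(v' A^{-1}) / (1 + v' A^{-1} u)
A = np.diag([2.0, 3.0, 4.0])          # assumed small example matrix
u = np.array([1.0, 0.0, 1.0])
v = np.array([1.0, 1.0, 0.0])

Ainv = np.linalg.inv(A)
denom = 1.0 + v @ Ainv @ u            # must be nonzero for the update to exist
updated = Ainv - np.outer(Ainv @ u, v @ Ainv) / denom

print(np.allclose(updated, np.linalg.inv(A + np.outer(u, v))))
```

This is why quasi-Newton methods can maintain an approximate inverse Hessian cheaply: each low-rank update costs O(n²) instead of the O(n³) of a fresh inversion.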
- Solving a set of linear equations and exploiting sparsity. Lecture 9.
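As one concrete instance of exploiting sparsity (an illustrative sketch, not the course's method): a tridiagonal system can be solved by the Thomas algorithm in O(n) operations instead of the O(n³) of a dense solve.

```python
import numpy as np

# Thomas algorithm for a tridiagonal system A x = d.
# a = sub-diagonal (a[0] unused), b = diagonal, c = super-diagonal (c[-1] unused).
# Assumes A is well-conditioned (e.g. diagonally dominant), so no pivoting.
def thomas(a, b, c, d):
    n = len(b)
    cp = np.zeros(n)
    dp = np.zeros(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                     # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):            # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# compare against a dense solve on a small diagonally dominant system
n = 5
A = np.diag([2.0] * n) + np.diag([-1.0] * (n - 1), 1) + np.diag([-1.0] * (n - 1), -1)
d = np.ones(n)
a = np.concatenate(([0.0], np.diag(A, -1)))
b = np.diag(A).copy()
c = np.concatenate((np.diag(A, 1), [0.0]))
print(np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d)))
```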
- Interpreting ellipsoids as affinely mapped unit balls. Interpretation of eigenvalues and eigenvectors as the ellipsoid's axes, and the relation to volume. Lecture 10.
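A small numeric sketch (assumed axis-aligned example): for a symmetric positive definite A, the ellipsoid E = { Ax : ||x|| ≤ 1 } has semi-axes along A's eigenvectors with lengths equal to its eigenvalues, and its volume is |det A| times the unit ball's.

```python
import numpy as np

# E = { A x : ||x|| <= 1 } for symmetric positive definite A.
A = np.diag([3.0, 2.0])             # axis-aligned example for readability
lengths = np.linalg.eigvalsh(A)     # semi-axis lengths, ascending: 2 and 3
vol_ratio = abs(np.linalg.det(A))   # volume scaling factor: |det A| = 6
print(lengths, vol_ratio)
```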
- Solving a set of equations in matrix form (ch. 9.5). Lecture 11.
- Different mathematical notions: epigraph, hypograph, lower level set, supporting hyperplanes, perspective function, convex hull, geometric and arithmetic means, the first Fermat point, Newton's optimal shape in a homogeneous current.