Course syllabus
The syllabus for the course can be found here: Chalmers and GU.
The TimeEdit schedule can be found here.
Other important information about the course can be found on the Canvas home page and under the page Program and detailed information.
Purpose
The purpose of this (first) course in optimization is to provide:
- knowledge of some important classes of optimization problems and of application areas of optimization modelling and methods;
- practice in describing relevant parts of a real-world problem in a mathematical optimization model;
- an understanding of necessary and sufficient optimality criteria, of their consequences, and of the basic mathematical theory upon which they are built;
- examples of optimization algorithms that are naturally developed from this theory, their convergence analysis, and their application to practical optimization problems.
Course PM
Aim
The aim of the course is to give a good theoretical foundation for some of the most important areas of optimisation: convex optimisation, linear optimisation, and nonlinear optimisation. The course treats the basic principles for analysing the properties of an optimisation problem and the characterisation of feasible points that are (locally) optimal. This theory is used to develop a number of examples of optimisation methods that can be used to solve instances of practical optimisation problems. The course also aims to give some practical training in mathematical modelling and problem solving using optimisation.
Learning outcomes (after completion of the course the student should be able to)
The goal is that you, after completing the course, should understand parts of the theory of optimality, duality, and convexity, and their interrelations. In this way, you will be able to analyze many types of optimization problems occurring in practice and both classify them and provide guidelines as to how they should be solved. The latter is the more practical goal of an otherwise mainly theoretical course.
More precisely, after completing the course you should be able to:
- state and explain the most important concepts in convex analysis, convex optimization, and duality, and be able to apply the theory to concrete examples.
- state and explain the basics of necessary and sufficient optimality conditions, in particular the KKT conditions, and be able to utilize the theory to analyze and solve concrete examples.
- analyze linear programming problems using concepts such as duality and sensitivity; solve linear programming problems using the simplex method and explain how the method works.
- explain notions such as descent and feasible direction, and use these concepts to explain the principles behind, to analyze, and to apply classical optimization methods, e.g., steepest descent, variations of Newton's method, the Frank-Wolfe method, and penalty methods; be able to specify conditions under which these methods converge.
- formulate relevant parts of a real-world problem in terms of a mathematical optimization model, analyze the model using appropriate tools and methods, and apply appropriate solution algorithms.
Content
The main focus of the course is on optimization problems in continuous variables; it builds a foundation for the analysis of an optimization problem. We can roughly separate the material into the following areas:
- Convex analysis: convex set, polytope, polyhedron, cone, representation theorem, extreme point, Farkas' Lemma, convex function
- Optimality conditions and duality: global/local optimum, existence and uniqueness of optimal solutions, variational inequality, Karush-Kuhn-Tucker (KKT) conditions, complementarity conditions, Lagrange multiplier, Lagrangian dual problem, global optimality conditions, weak/strong duality
- Linear programming (LP): LP models, LP algebra and geometry, basic feasible solution (BFS), the Simplex method, termination, LP duality, optimality conditions, strong duality, complementarity, interior point methods, sensitivity analysis
- Nonlinear optimization methods: direction of descent, line search, (quasi-)Newton method, Frank-Wolfe method, gradient projection, exterior and interior penalty, sequential quadratic programming
We also touch upon other important problem areas within optimization, such as integer programming and network optimization.
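As a small taste of the nonlinear methods listed above, the sketch below implements steepest descent with an Armijo backtracking line search on a simple quadratic test function. This is an illustrative sketch only, not course material: the test function, tolerances, and line-search parameters are arbitrary choices, not taken from the course book.

```python
# Sketch: steepest descent with Armijo backtracking line search.
# The test function f(x, y) = (x - 1)^2 + 10(y + 2)^2 is an arbitrary
# smooth convex example with minimizer (1, -2); all parameters below
# are illustrative assumptions, not values from the course.

def f(x):
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

def grad_f(x):
    return [2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)]

def steepest_descent(x, tol=1e-8, max_iter=10_000):
    for _ in range(max_iter):
        g = grad_f(x)
        # Stop when the gradient norm is small (stationarity).
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        # Search direction: the negative gradient, a descent direction.
        d = [-gi for gi in g]
        # Armijo backtracking: shrink the step t until f decreases enough.
        t, beta, sigma = 1.0, 0.5, 1e-4
        fx = f(x)
        slope = sum(gi * di for gi, di in zip(g, d))  # directional derivative, < 0
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + sigma * t * slope:
            t *= beta
        x = [xi + t * di for xi, di in zip(x, d)]
    return x

x_star = steepest_descent([0.0, 0.0])
print(x_star)  # approaches the minimizer (1, -2)
```

The backtracking line search is one of the convergence safeguards analysed in the course: with a fixed step the iteration can diverge on ill-conditioned problems, while the Armijo condition guarantees sufficient decrease at every iteration.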
Prerequisites
Passed courses on analysis (in one and several variables) and linear algebra; familiarity with matrix/vector notation and differential calculus.
Reading Chapter 2 in the course book (see the home page) provides a partial background, especially on the mathematical notation used and on most of the important basic mathematical terminology. Material for reviewing the prerequisites is also provided here.