Convex optimization

Outline of Convex Optimization For All

01-01 Introduction - Optimization problems?

01-02 Introduction - Convex optimization problem

01-03 Introduction - Goals and Topics

01-04 Introduction - Brief history of convex optimization

02 Convex Sets

02-01 Affine and convex sets

02-01-01 Line, line segment, ray

02-01-02 Affine set

02-01-03 Convex set

02-01-04 Cone

02-02 Some important examples

02-02-01 Convex set examples

02-02-02 Convex cone examples

02-03 Operations that preserve convexity

02-04 Generalized inequalities

02-05 Separating and supporting hyperplanes

02-06 Dual cones and generalized inequalities

02-06-01 Dual cones

02-06-02 Dual generalized inequalities

03 Convex functions

03-01 Basic properties and examples

03-01-01 Definition

03-01-02 Examples of convex functions

03-01-03 Key properties of convex functions

03-02 Operations that preserve convexity

03-03 The conjugate function

03-04 Quasiconvex functions

03-05 Log-concave and log-convex functions

03-06 Convexity with respect to generalized inequalities

04 Convex optimization basics

04-01 Basic terminology

04-02 Convex solution sets

04-03 First-order optimality condition

04-04 Partial optimization

04-05 Transformations and change of variables

04-06 Eliminating equality constraints

04-07 Slack variables

04-08 Relaxation

05 Canonical Problems

05-01 Linear Programming (LP)

05-02 Quadratic Programming (QP)

05-03 Quadratically Constrained Quadratic Programming (QCQP)

05-04 Second-Order Cone Programming (SOCP)

05-05 Semidefinite Programming (SDP)

05-06 Conic Programming (CP)

06 Gradient Descent

06-01 Gradient Descent

06-02 How to choose step sizes

06-02-01 Fixed step size

06-02-02 Backtracking line search

06-02-03 Exact line search

06-03 Convergence analysis

06-03-01 Convergence analysis and proof

06-04 Gradient boosting

06-05 Stochastic gradient descent

07 Subgradient

07-01 Subgradient

07-02 Subdifferentials

07-02-01 Connection to Convex Geometry

07-02-02 Subgradient Calculus

07-03 Subgradient Optimality Condition

07-03-01 Subgradient Optimality Condition

07-03-02 Derivation of First-Order Optimality Condition

07-03-03 Example: Lasso Optimality Condition

07-03-04 Example: Soft-Thresholding

07-03-05 Example: Distance to a Convex Set

08 Subgradient Method

08-01 Subgradient Method

08-01-01 Step size choices

08-01-02 Basic Inequality

08-01-03 Convergence analysis

08-01-04 Convergence rate

08-01-05 Example: Regularized Logistic Regression

08-02 Stochastic Subgradient Method

08-02-01 Stochastic Subgradient Method

08-02-02 Convergence of Stochastic Methods

08-02-03 Batch vs. Stochastic Methods

08-03 Improving on the Subgradient Method

Reference

ModuLab, 모두를 위한 컨벡스 최적화 (Convex Optimization For All)

Seoul National University, Convex Optimization

Stephen Boyd and Lieven Vandenberghe, Convex Optimization, Cambridge University Press

Stanford University, EE364a slides

