Dimensions of optimization settings: (simple vs. complex)

  1. Continuous vs. discrete (There may also be mixed continuous and discrete problems.)
    • A quantity can be modeled as continuous if it can take any value within the resolution of your instrument.
  2. Deterministic vs. stochastic (infinite dimensional, dynamic)
    • A quantity can be modeled as stochastic if its generating mechanism is inherently stochastic, if you acknowledge ignorance of its mechanism, or if you intentionally introduce ignorance by discarding information.
  3. Single objective vs. multi-objective vs. game

Subdivision of continuous optimization problems:

  1. Convex Optimization, e.g. linear programming (LP)
  2. Non-convex Optimization

Definitions

An optimization problem is any problem of minimizing an objective function over a feasible set. Given the optimization variable \( x \in \mathbb{R}^n \) and real-valued functions \( f_0 \) (the objective or cost function), \( f_i \) (the inequality constraint functions), and \( h_i \) (the equality constraint functions), the standard form of an optimization problem is: \[\begin{align} \min \quad & f_0(x) \\ \text{s.t.} \quad & f_i(x) \le 0 && i = 1,\dots,m \\ & h_i(x) = 0 && i = 1,\dots,p \end{align}\] The epigraph form of an optimization problem is: \[\begin{align} \min \quad & t \\ \text{s.t.} \quad & f_0(x) - t \le 0 \\ & f_i(x) \le 0 && i = 1,\dots,m \\ & h_i(x) = 0 && i = 1,\dots,p \end{align}\]
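As a concrete illustration, a crude grid search on a toy one-dimensional problem (a hypothetical example, not from the source) shows that the standard form and the epigraph form attain the same optimal value:

```python
# Toy 1-D problem (hypothetical): minimize f0(x) = (x - 1)^2
# subject to f1(x) = 2 - x <= 0, i.e. x >= 2.

def f0(x):
    return (x - 1.0) ** 2

def f1(x):
    return 2.0 - x

xs = [i / 10.0 for i in range(-50, 51)]   # crude grid for x in [-5, 5]
ts = [j / 100.0 for j in range(0, 501)]   # crude grid for t in [0, 5]

# Standard form: minimize f0 over the feasible set {x : f1(x) <= 0}.
p_std = min(f0(x) for x in xs if f1(x) <= 0)

# Epigraph form: minimize t over {(x, t) : f0(x) - t <= 0, f1(x) <= 0}.
p_epi = min(t for x in xs for t in ts
            if f1(x) <= 0 and f0(x) - t <= 0)

print(p_std, p_epi)  # both 1.0, attained at x = 2
```

For a fixed feasible \(x\), the smallest feasible \(t\) is exactly \(f_0(x)\), which is why the two searches agree.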

The domain of an optimization problem is the set of points for which the objective and all constraint functions are defined: \( D = (\cap_i \operatorname{dom} f_i) \cap (\cap_j \operatorname{dom} h_j) \). The feasible set (or constraint set) \(X\) of an optimization problem is the set of all points in the domain that satisfy all constraints. An inequality constraint \(f_i\) is active at a feasible point \(x \in X\) if the inequality binds there: \( f_i(x) = 0 \); otherwise it is inactive at that point. A constraint is redundant if removing it does not change the feasible set.
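A minimal sketch (pure Python, with hypothetical constraint functions) of classifying which inequality constraints are active at a feasible point:

```python
def active_constraints(fs, x, tol=1e-9):
    """Indices of constraints f_i that bind at x: f_i(x) == 0 (within tol)."""
    return [i for i, f in enumerate(fs) if abs(f(x)) <= tol]

# Hypothetical constraints, each written as f_i(x) <= 0:
fs = [lambda x: x[0] + x[1] - 1.0,   # x1 + x2 <= 1
      lambda x: -x[0],               # x1 >= 0
      lambda x: -x[1]]               # x2 >= 0

print(active_constraints(fs, (1.0, 0.0)))  # → [0, 2]
```

At the point \((1, 0)\), the first and third constraints bind while the second is inactive; a numerical tolerance stands in for exact equality.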

A locally optimal point is a feasible point that minimizes the objective function over the intersection of the feasible set with some ball around it: \( f_0(x) = \inf f_0(X \cap B_R(x)) \) for some \( R > 0 \).

The optimal value of an optimization problem is the infimum of the image of the feasible set, which takes values in the extended real numbers: \( p^* = \inf f_0(X) \), where \(p\) is for primal. By convention, the infimum of the empty set is \( +\infty \). If the objective can be made arbitrarily low over the feasible set, the optimal value is \( -\infty \). The optimal set of an optimization problem is the intersection of the inverse image of its optimal value and its feasible set: \( X_{\text{opt}} = f_0^{-1}(p^*) \cap X \). An optimal point is a point of the optimal set.

For \(\varepsilon>0\), the \(\varepsilon\)-suboptimal set of an optimization problem is defined as \( X_\varepsilon = f_0^{-1}([p^*, p^* + \varepsilon]) \cap X \). An \( \varepsilon \)-suboptimal point is a point of the \(\varepsilon\)-suboptimal set.
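These three sets can be illustrated on a toy problem (a hypothetical example over a finite feasible grid, where infima become minima):

```python
# Hypothetical 1-D problem: minimize f0(x) = |x| over a finite grid in [-2, 2].
X = [i / 100.0 for i in range(-200, 201)]        # feasible grid
f0 = lambda x: abs(x)                            # objective

p_star = min(f0(x) for x in X)                   # optimal value: 0.0
X_opt = [x for x in X if f0(x) == p_star]        # optimal set: [0.0]

eps = 0.05
X_eps = [x for x in X if f0(x) <= p_star + eps]  # eps-suboptimal points
print(p_star, X_opt, len(X_eps))
```

Every optimal point is \(\varepsilon\)-suboptimal for any \(\varepsilon > 0\), so \( X_{\text{opt}} \subseteq X_\varepsilon \); here the \(\varepsilon\)-suboptimal set additionally picks up the grid points within distance \(0.05\) of the origin.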

An optimization problem is unconstrained if it has no constraints: \(m = p = 0\). An optimization problem is infeasible if its feasible set is empty. Informally, two optimization problems are equivalent if from a solution of one, a solution of the other is readily found, and vice versa. It is possible but complicated to give a formal definition of such equivalence.

An ascent direction of a function at a point is a direction in which the function locally strictly increases. A direction of steepest ascent of a function at a point is an ascent direction with the largest directional derivative.
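For a differentiable function, the direction of steepest ascent at a point is the normalized gradient. A short sketch (pure Python, with a hypothetical example function) approximating it by central finite differences:

```python
import math

def grad(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

f = lambda x: x[0] ** 2 + 3 * x[1]   # hypothetical function; grad = (2x1, 3)
g = grad(f, [1.0, 2.0])              # approximately (2.0, 3.0)

norm = math.sqrt(sum(gi * gi for gi in g))
d = [gi / norm for gi in g]          # unit direction of steepest ascent
```

Any direction with positive inner product against the gradient is an ascent direction; the normalized gradient maximizes the directional derivative among unit directions.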

Propositions

A convex function has identical local and global minimum sets: every local minimum is a global minimum.
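A short proof sketch in the document's notation, for a convex objective \(f_0\) over a convex feasible set \(X\):

```latex
Suppose $x$ is locally optimal but $f_0(y) < f_0(x)$ for some feasible $y$.
Convexity of $f_0$ gives, for every $\theta \in (0, 1]$,
\[
  f_0\big((1-\theta)x + \theta y\big)
    \le (1-\theta) f_0(x) + \theta f_0(y)
    < f_0(x).
\]
For $\theta$ small enough, $(1-\theta)x + \theta y \in X \cap B_R(x)$
(the segment stays in $X$ by convexity of $X$), contradicting the local
optimality of $x$. Hence every local minimum is global.
```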

Computational properties of convex optimization problems:

  • Interior-point methods can be proved to solve (to a specified accuracy) some cases of convex optimization problems in polynomial time.
  • There are NP-hard convex optimization problems.