Dimensions of optimization settings: (simple vs. complex)
Subdivision of continuous optimization problems:
An optimization problem is any problem of minimizing an objective function over a feasible set. Given an optimization variable $x \in \mathbb{R}^n$ and real-valued functions $f_0$ (the objective or cost function), inequality constraint functions $f_i$, and equality constraint functions $h_i$, the standard form of an optimization problem is: $$\begin{align} \min \quad & f_0(x) \\ \text{s.t.} \quad & f_i(x) \le 0 && i = 1,\dots,m \\ & h_i(x) = 0 && i = 1,\dots,p \end{align}$$ The epigraph form of an optimization problem is: $$\begin{align} \min \quad & t \\ \text{s.t.} \quad & f_0(x) - t \le 0 \\ & f_i(x) \le 0 && i = 1,\dots,m \\ & h_i(x) = 0 && i = 1,\dots,p \end{align}$$
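As a sketch, the standard form maps directly onto a generic solver. Here SciPy's `minimize` (an assumed tool choice, not part of these notes) handles one inequality and one equality constraint; note SciPy's sign convention for inequalities (`"ineq"` means $\ge 0$) is the opposite of the standard form's $f_i(x) \le 0$:

```python
import numpy as np
from scipy.optimize import minimize

# Objective f0(x) = (x1 - 1)^2 + (x2 - 2)^2  (illustrative problem)
f0 = lambda x: (x[0] - 1)**2 + (x[1] - 2)**2

# Standard form: f1(x) = x1 + x2 - 2 <= 0,  h1(x) = x1 - x2 = 0.
# SciPy's "ineq" convention is fun(x) >= 0, so we negate f1 when passing it in.
constraints = [
    {"type": "ineq", "fun": lambda x: -(x[0] + x[1] - 2)},  # f1(x) <= 0
    {"type": "eq",   "fun": lambda x: x[0] - x[1]},         # h1(x) = 0
]

res = minimize(f0, x0=np.zeros(2), constraints=constraints)
print(res.x)    # optimal point, approximately [1, 1]
print(res.fun)  # optimal value p*, approximately 1
```

On the equality constraint $x_1 = x_2$ the objective becomes $(t-1)^2 + (t-2)^2$, whose unconstrained minimizer $t = 1.5$ violates $2t \le 2$; the inequality binds and the solution is $(1, 1)$ with value $1$.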
The domain of an optimization problem is the set of points for which the objective and all constraint functions are defined: $D = (\bigcap_i \operatorname{dom} f_i) \cap (\bigcap_j \operatorname{dom} h_j)$. The feasible set (or constraint set) $X$ of an optimization problem is the set of all points in the domain that satisfy all constraints. An inequality constraint $f_i$ is active at a feasible point $x \in X$ if the inequality binds there: $f_i(x) = 0$. Otherwise it is inactive at that point. A constraint is redundant if removing it does not change the feasible set.
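A minimal numerical sketch of these definitions (the constraint functions and tolerance are illustrative choices):

```python
# Two inequality constraints on R^2 (illustrative):
#   f1(x) = x1 + x2 - 2 <= 0
#   f2(x) = x1 + x2 - 3 <= 0   (redundant: implied by f1)
f = [lambda x: x[0] + x[1] - 2, lambda x: x[0] + x[1] - 3]

def active_constraints(x, tol=1e-9):
    """Indices of inequality constraints that bind, f_i(x) = 0, at a feasible x."""
    return [i for i, fi in enumerate(f) if abs(fi(x)) <= tol]

# At the feasible point (1, 1), f1 is active and f2 is inactive:
print(active_constraints([1.0, 1.0]))  # [0]
# At the interior point (0, 0), neither constraint binds:
print(active_constraints([0.0, 0.0]))  # []
```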
A locally optimal point is a feasible point $x$ that minimizes the objective function over the intersection of the feasible set with some ball around it: $\exists R > 0: f_0(x) = \inf f_0(X \cap B_x(R))$.
The optimal value of an optimization problem is the infimum of the image of the feasible set, which takes values in the extended real numbers: $p^∗ = \inf f_0(X)$, where $p$ stands for primal. By convention, the infimum of the empty set is $+\infty$, so an infeasible problem has optimal value $+\infty$. If the objective can be made arbitrarily low over the feasible set, the optimal value is $-\infty$ and the problem is unbounded below. The optimal set of an optimization problem is the intersection of the preimage of its optimal value and its feasible set: $X_{\text{opt}} = f_0^{-1}( p^∗ ) \cap X$. An optimal point is a point of the optimal set.
For $\varepsilon>0$, the $\varepsilon$-suboptimal set of an optimization problem is defined as $X_\varepsilon = f_0^{-1}([p^∗ , p^∗ + \varepsilon]) \cap X$. An $\varepsilon$-suboptimal point is a point of the $\varepsilon$-suboptimal set.
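A worked example (the problem is an illustrative choice): minimize $f_0(x) = e^{-x}$ over $X = [0, \infty)$. Then $p^∗ = \inf f_0(X) = 0$, but no feasible point attains this value, so $X_{\text{opt}} = \emptyset$: a finite optimal value need not be achieved. For any $\varepsilon > 0$, however, $e^{-x} \le \varepsilon$ whenever $x \ge \ln(1/\varepsilon)$, so the $\varepsilon$-suboptimal set is $[\max\{0, \ln(1/\varepsilon)\}, \infty)$, which is nonempty.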
An optimization problem is unconstrained if it has no constraints: $m = p = 0$. An optimization problem is infeasible if its feasible set is empty. Informally, two optimization problems are equivalent if from a solution of one, a solution of the other is readily found, and vice versa; a formal definition of such equivalence is possible but complicated.
An ascent direction of a function at a point is a direction in which the function locally strictly increases. A direction of steepest ascent of a function at a point is a unit ascent direction with the largest directional derivative.
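For a differentiable function, the gradient points in the direction of steepest ascent. A numerical sketch checking this on an assumed test function $f(x) = x_1^2 + 3x_2^2$ (the function, point, and sample count are illustrative choices):

```python
import numpy as np

# Test function f(x) = x1^2 + 3*x2^2 and a sample point (illustrative).
f = lambda x: x[0]**2 + 3 * x[1]**2
x0 = np.array([1.0, 1.0])
grad = np.array([2 * x0[0], 6 * x0[1]])  # analytic gradient: (2, 6)

def directional_derivative(f, x, d, h=1e-6):
    """Forward-difference directional derivative along the unit direction d/|d|."""
    d = d / np.linalg.norm(d)
    return (f(x + h * d) - f(x)) / h

# The derivative along the gradient should dominate random unit directions:
rng = np.random.default_rng(0)
best_random = max(
    directional_derivative(f, x0, rng.standard_normal(2)) for _ in range(1000)
)
along_grad = directional_derivative(f, x0, grad)
print(along_grad, ">=", best_random)
```

The value along the gradient approximates $\|\nabla f(x_0)\| = \sqrt{40} \approx 6.32$, the largest possible directional derivative over unit directions.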
A convex function has identical local and global minimum sets: every local minimum is also a global minimum.
Computational properties of convex optimization problems:
Optimization is the pursuit of optimal points in practical/computational problems. Although calculus easily solves this task when the objective is differentiable and unconstrained, optimization mostly deals with constrained problems, and continuity is not guaranteed.
Development of optimization theory:
Duality in general means two different views of the same object. The dual problem of a discrete problem is continuous/convex; it provides a lower bound and other important information for solving the discrete primal.
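A sketch of the lower-bound property (the specific problem is an illustrative choice): for the discrete problem $\min \{x^2 : 1 - x \le 0,\ x \in \mathbb{Z}\}$, the Lagrangian dual function $g(\lambda) = \inf_{x \in \mathbb{Z}} \left( x^2 + \lambda(1 - x) \right)$ is concave in $\lambda$. For any $\lambda \ge 0$ and any feasible $x$ (so $1 - x \le 0$), we have $x^2 + \lambda(1 - x) \le x^2$, hence $g(\lambda) \le p^∗$: every dual value is a lower bound on the discrete primal optimum, and maximizing the concave dual over $\lambda \ge 0$ gives the best such bound.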
LPs and convex programs are often solved by the same methods:
Key distinction is not linear-nonlinear but convex-nonconvex. {Rockafellar}
Problem ranking in increasing practical difficulty