Quadratic programming optimizes a multivariable quadratic function subject to linear constraints. Convexity guarantees that a local minimum is also the global minimum, and convex problems can be solved with an extended simplex algorithm or with other methods such as interior point, active set, or conjugate gradient. Real-world problems such as portfolio optimization and manufacturing cost reduction can be formulated as quadratic programs.
Quadratic programming is a method used to optimize a multivariable quadratic function that may or may not be linearly constrained. Many real-world problems, such as optimizing a company's portfolio or reducing a manufacturer's costs, can be described using a quadratic program. If the objective function is convex and a feasible solution exists, the problem can be solved by algorithms such as an extended simplex algorithm. Methods exist for solving some nonconvex quadratic programs, but they are complicated and not as readily available.
Mathematical optimization techniques are used in quadratic programming to minimize an objective function. The objective function is composed of a number of decision variables that may or may not be bounded, and each decision variable appears with a power of 0, 1, or 2. The objective function can be subject to a set of linear equality and inequality constraints on the decision variables. An example of a quadratic program is: minimize f(x, y) = x² + 3y² − 12y + 12 subject to x + y = 6, x ≥ 0, and y ≥ 0.
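As a minimal sketch, the small example above can be solved numerically with SciPy's general-purpose SLSQP solver (assuming NumPy and SciPy are available); a dedicated quadratic programming solver would work equally well:

```python
import numpy as np
from scipy.optimize import minimize

# Objective from the example: f(x, y) = x^2 + 3y^2 - 12y + 12
def f(v):
    x, y = v
    return x**2 + 3*y**2 - 12*y + 12

constraints = [{"type": "eq", "fun": lambda v: v[0] + v[1] - 6}]  # x + y = 6
bounds = [(0, None), (0, None)]                                   # x >= 0, y >= 0

result = minimize(f, x0=[1.0, 1.0], method="SLSQP",
                  bounds=bounds, constraints=constraints)
print(result.x, result.fun)   # approximately [3. 3.] and 12.0
```

Substituting the equality constraint by hand (x = 6 − y) confirms the answer: the reduced function 4y² − 24y + 48 is minimized at y = 3, giving x = 3 and an objective value of 12.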
Multivariate quadratic functions are often useful for describing real-world problems. For example, using modern portfolio theory, a financial analyst would try to optimize a firm's portfolio by choosing the proportion of assets that minimizes the risk associated with a given expected return. Portfolio variance is a quadratic function of the asset weights and the correlations between assets, and it can be minimized using quadratic programming. Another example would be a manufacturer using a quadratic equation to describe the relationship between different quality components and the cost of a product. The manufacturer can minimize costs while maintaining certain standards by adding linear constraints to the quadratic program.
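A minimal sketch of the portfolio case might look like the following; the expected returns, covariance matrix, and return target are hypothetical and purely illustrative:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical expected returns and covariance matrix for three assets
mu = np.array([0.08, 0.12, 0.10])
Sigma = np.array([[0.10, 0.02, 0.04],
                  [0.02, 0.08, 0.01],
                  [0.04, 0.01, 0.09]])
target_return = 0.10

def variance(w):
    return w @ Sigma @ w          # portfolio variance w' * Sigma * w

constraints = [
    {"type": "eq", "fun": lambda w: np.sum(w) - 1.0},          # weights sum to 1
    {"type": "eq", "fun": lambda w: w @ mu - target_return},   # hit the return target
]
bounds = [(0, 1)] * 3             # no short selling

w0 = np.full(3, 1/3)
res = minimize(variance, w0, method="SLSQP",
               bounds=bounds, constraints=constraints)
print(res.x, res.fun)             # minimum-variance weights and the resulting variance
```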
One of the most important conditions in solving a quadratic program is the convexity of the objective function. The convexity of a quadratic function is determined by its Hessian, the matrix of its second derivatives. The function is convex if the Hessian matrix is positive definite or positive semidefinite, i.e. if all of its eigenvalues are positive or non-negative, respectively. If the Hessian is positive definite and a feasible solution exists, then any local minimum is the unique global minimum. If the Hessian is only positive semidefinite, the global minimum may not be unique. Nonconvex quadratic functions can have local or global minima, but they are more difficult to determine.
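For the earlier example, the convexity check can be done directly. The second derivatives of f(x, y) = x² + 3y² − 12y + 12 are constant, so the Hessian is a fixed matrix whose eigenvalues can be inspected, as in this short sketch:

```python
import numpy as np

# Hessian of f(x, y) = x^2 + 3y^2 - 12y + 12
H = np.array([[2.0, 0.0],
              [0.0, 6.0]])

eigenvalues = np.linalg.eigvalsh(H)   # eigenvalues of a symmetric matrix
print(eigenvalues)                    # [2. 6.]

if np.all(eigenvalues > 0):
    print("Positive definite Hessian: strictly convex, unique global minimum.")
elif np.all(eigenvalues >= 0):
    print("Positive semidefinite Hessian: convex, minimum may not be unique.")
else:
    print("Indefinite Hessian: nonconvex.")
```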
There are many approaches to solving a convex quadratic program. The most common approach is an extension of the simplex algorithm. Other methods include the interior point (or barrier) method, the active set method, and the conjugate gradient method. These methods are built into programs such as Mathematica® and Matlab® and are available in libraries for many programming languages.
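As one illustration of such a library, the earlier example can be restated in the standard form (1/2)xᵀQx + cᵀx and handed to cvxpy (assuming it is installed), which dispatches to interior point or operator-splitting solvers behind the scenes:

```python
import cvxpy as cp
import numpy as np

# Same example QP in standard form: minimize (1/2) x'Qx + c'x + constant
Q = np.array([[2.0, 0.0],
              [0.0, 6.0]])
c = np.array([0.0, -12.0])

x = cp.Variable(2)
objective = cp.Minimize(0.5 * cp.quad_form(x, Q) + c @ x + 12)
constraints = [cp.sum(x) == 6, x >= 0]

problem = cp.Problem(objective, constraints)
problem.solve()
print(x.value, problem.value)   # approximately [3. 3.] and 12.0
```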