Calculus Optimization Methods/Lagrange Multipliers

The method of Lagrange multipliers solves the constrained optimization problem of optimizing f(x_1,x_2,\ldots,x_n) subject to a constraint g(x_1,x_2,\ldots,x_n)=k by transforming it into an unconstrained optimization problem of the form:

  • \mathcal{L}(x_1,x_2,\ldots, x_n,\lambda)= f(x_1,x_2,\ldots, x_n)+\lambda\,(k-g(x_1,x_2,\ldots, x_n))

Finding the gradient and Hessian of \mathcal{L}(x_1,x_2,\ldots, x_n,\lambda), as was done above, will then determine any optimum values.

Suppose we now want to find the optimum values of f(x,y)=2x^2+y^2 subject to the constraint x+y=1, an example from [2].

The Lagrangian method then yields the unconstrained function:

  • \mathcal{L}(x,y,\lambda)= 2x^2+y^2+\lambda (1-x-y)

Setting the gradient of this new function to zero gives

  • \frac{\partial \mathcal{L}}{\partial x}(x,y,\lambda)= 4x-\lambda=0
  • \frac{\partial \mathcal{L}}{\partial y}(x,y,\lambda)= 2y-\lambda=0
  • \frac{\partial \mathcal{L}}{\partial \lambda}(x,y,\lambda)= 1-x-y=0
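These partial derivatives can be reproduced symbolically; the following sketch (using SymPy, not part of the original text) builds the Lagrangian above and differentiates it:

```python
import sympy as sp

# Symbols for the variables and the multiplier
x, y, lam = sp.symbols('x y lambda')

# The Lagrangian L(x, y, lambda) = 2x^2 + y^2 + lambda*(1 - x - y)
L = 2*x**2 + y**2 + lam*(1 - x - y)

# Gradient: partial derivatives with respect to x, y, and lambda
grad = [sp.diff(L, v) for v in (x, y, lam)]
print(grad)  # [4*x - lambda, 2*y - lambda, -x - y + 1]
```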

The stationary points of the above equations can be found from their matrix form.

 \begin{bmatrix}
4 & 0 & -1 \\
0& 2 & -1 \\
-1 & -1 & 0
\end{bmatrix} \begin{bmatrix}
x\\
y \\
\lambda \end{bmatrix}= \begin{bmatrix}
0\\
0\\
-1
\end{bmatrix}

This results in x=1/3, y=2/3, \lambda=4/3.
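As a quick numerical check, the linear system above can be solved directly (a sketch using NumPy; the matrix and right-hand side are copied from the system in the text):

```python
import numpy as np

# Coefficient matrix of the stationarity conditions
A = np.array([[ 4.0,  0.0, -1.0],
              [ 0.0,  2.0, -1.0],
              [-1.0, -1.0,  0.0]])
# Right-hand side
b = np.array([0.0, 0.0, -1.0])

# Solve A [x, y, lambda]^T = b
x, y, lam = np.linalg.solve(A, b)
print(x, y, lam)  # x ≈ 1/3, y ≈ 2/3, lambda ≈ 4/3
```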

Next we can use the Hessian of \mathcal{L}, as before, to determine the type of this stationary point. Because the \lambda row and column border the Hessian of f, this is a bordered Hessian.

 H(\mathcal{L})=
 \begin{bmatrix}
4 & 0 & -1 \\
0& 2 & -1 \\
-1&-1&0
\end{bmatrix}

Here \det H(\mathcal{L}) = -6. With one constraint, the test is the sign of (-1)^1 \det H(\mathcal{L}) = 6 > 0, so the stationary point is a minimum: (x,y)=(1/3,2/3) minimizes f(x,y)=2x^2+y^2 subject to x+y=1, with f(1/3,2/3)=2/3.
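The determinant test and the minimum value can also be verified numerically (a sketch using NumPy; the matrix is the bordered Hessian above):

```python
import numpy as np

# Bordered Hessian of the Lagrangian
H = np.array([[ 4.0,  0.0, -1.0],
              [ 0.0,  2.0, -1.0],
              [-1.0, -1.0,  0.0]])

# With one constraint, (-1)^1 * det(H) > 0 indicates a minimum
det = np.linalg.det(H)
print(det)  # ≈ -6

# Objective value at the stationary point (x, y) = (1/3, 2/3)
f = lambda x, y: 2*x**2 + y**2
print(f(1/3, 2/3))  # ≈ 2/3
```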

Last modified on 15 February 2011, at 16:38