# Numerical Methods Qualification Exam Problems and Solutions (University of Maryland)/August 2003

## Problem 3

 Let ${\displaystyle A,B\in R^{n\times n}\!\,}$  be symmetric and positive definite matrices, and let ${\displaystyle b\in R^{n}\!\,}$ . Consider the quadratic function ${\displaystyle Q(x)={\frac {1}{2}}x^{T}Ax-x^{T}b\!\,}$  for ${\displaystyle x\in R^{n}\!\,}$  and a descent method to approximate the solution of ${\displaystyle Ax=b\!\,}$ : ${\displaystyle x_{k+1}=x_{k}+\alpha _{k}d_{k}\!\,}$

## Problem 3a

 Define the concept of steepest descent ${\displaystyle d_{k}\!\,}$  and show how to compute the optimal stepsize ${\displaystyle \alpha _{k}\!\,}$

### Descent Direction

A direction ${\displaystyle d_{k}\!\,}$  is a descent direction if

${\displaystyle \nabla Q(x_{k})\cdot d_{k}<0\!\,}$

The steepest descent direction is the negative gradient, ${\displaystyle d_{k}=-\nabla Q(x_{k})=b-Ax_{k}\!\,}$ , i.e. the direction along which ${\displaystyle Q\!\,}$  decreases most rapidly.

### Optimal step size

Choose ${\displaystyle \alpha _{k}\!\,}$  such that ${\displaystyle Q(x_{k}+\alpha _{k}d_{k})\!\,}$  is minimized, i.e.

${\displaystyle \alpha _{k}=\arg \min _{\alpha }Q(x_{k}+\alpha d_{k})\!\,}$

${\displaystyle Q(x_{k}+\alpha _{k}d_{k})={\frac {1}{2}}\langle x_{k}+\alpha _{k}d_{k},Ax_{k}+\alpha _{k}Ad_{k}\rangle -\langle x_{k}+\alpha _{k}d_{k},b\rangle \!\,}$

${\displaystyle {\frac {d}{d\alpha }}Q(x_{k}+\alpha _{k}d_{k})=\langle Ad_{k},x_{k}\rangle +\alpha _{k}\langle Ad_{k},d_{k}\rangle -\langle d_{k},b\rangle \!\,}$

Setting the above expression equal to zero gives the optimal ${\displaystyle \alpha _{k}\!\,}$ :

${\displaystyle \alpha _{k}={\frac {\langle d_{k},b-Ax_{k}\rangle }{\langle Ad_{k},d_{k}\rangle }}\!\,}$

Note that since ${\displaystyle A\!\,}$  is symmetric

${\displaystyle \langle d_{k},Ax_{k}\rangle =\langle Ad_{k},x_{k}\rangle \!\,}$
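As a sanity check, the step-size formula can be compared against the minimum of ${\displaystyle Q\!\,}$  along the line numerically; the small SPD example below (matrix, vector, and starting point chosen arbitrarily for illustration) is not part of the exam solution:

```python
import numpy as np

# Small SPD example chosen for illustration.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = np.array([0.0, 0.0])
d = b - A @ x                              # a descent direction (here the residual)

# Closed-form optimal step: <d, b - A x> / <A d, d>
alpha = (d @ (b - A @ x)) / (d @ (A @ d))

Q = lambda y: 0.5 * y @ A @ y - y @ b
# Q at the optimal step is no larger than at nearby steps along the same line
assert Q(x + alpha * d) <= Q(x + (alpha - 0.1) * d)
assert Q(x + alpha * d) <= Q(x + (alpha + 0.1) * d)
```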

## Problem 3b

 Formulate the steepest descent method (or gradient method) and write a pseudocode which implements it.

## Solution 3b

Note that ${\displaystyle d_{k}=-\nabla Q(x_{k})=b-Ax_{k}=r_{k}\!\,}$ , the residual. Then the optimal ${\displaystyle \alpha _{k}\!\,}$  is given by ${\displaystyle {\frac {\langle r_{k},r_{k}\rangle }{\langle r_{k},r_{k}\rangle _{A}}}\!\,}$ , where ${\displaystyle \langle r_{k},r_{k}\rangle _{A}=\langle Ar_{k},r_{k}\rangle \!\,}$

Given ${\displaystyle x_{0}\in R^{n}\!\,}$

For ${\displaystyle k=0,1,2,\ldots \!\,}$
{\displaystyle {\begin{aligned}r_{k}&=b-Ax_{k}\\&{\mbox{If }}r_{k}=0,{\mbox{ then quit }}\\d_{k}&=r_{k}\\\alpha _{k}&={\frac {\langle r_{k},r_{k}\rangle }{\langle r_{k},r_{k}\rangle _{A}}}\\x_{k+1}&=x_{k}+\alpha _{k}d_{k}\end{aligned}}}
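The pseudocode above translates directly into NumPy; a tolerance-based stopping test stands in for the exact check ${\displaystyle r_{k}=0\!\,}$  (function and parameter names here are illustrative):

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=1000):
    """Steepest descent for an SPD matrix A, following the pseudocode above."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = b - A @ x                        # r_k = b - A x_k
        if np.linalg.norm(r) < tol:          # "if r_k = 0, then quit" (with tolerance)
            break
        alpha = (r @ r) / (r @ (A @ r))      # <r,r> / <r,r>_A
        x = x + alpha * r                    # d_k = r_k
    return x
```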


## Problem 3c

 Let ${\displaystyle B^{-1}\!\,}$  be a preconditioner of ${\displaystyle A\!\,}$ . Show how to modify the steepest descent method to work for ${\displaystyle B^{-1}Ax=B^{-1}b\!\,}$  and write a pseudocode. Note that ${\displaystyle B^{-1}A\!\,}$  may not be symmetric. (Hint: proceed as with the conjugate gradient method).

## Solution 3c

Since ${\displaystyle B\!\,}$  is symmetric positive definite, it admits a Cholesky factorization ${\displaystyle B=E^{T}E\!\,}$  where ${\displaystyle E\!\,}$  is upper triangular.

Then ${\displaystyle B^{-1}=E^{-1}E^{-T}\!\,}$

Hence,

{\displaystyle {\begin{aligned}B^{-1}Ax&=B^{-1}b\\E^{-1}E^{-T}Ax&=E^{-1}E^{-T}b\\E^{-T}Ax&=E^{-T}b\\\underbrace {E^{-T}AE^{-1}} _{\tilde {A}}\underbrace {Ex} _{\tilde {x}}&=\underbrace {E^{-T}b} _{\tilde {b}}\end{aligned}}\!\,}

${\displaystyle {\tilde {A}}\!\,}$  is symmetric:

${\displaystyle (E^{-T}AE^{-1})^{T}=E^{-T}A^{T}E^{-1}=E^{-T}AE^{-1}\!\,}$  since ${\displaystyle A\!\,}$  is symmetric

${\displaystyle {\tilde {A}}\!\,}$  is positive definite:

${\displaystyle x^{T}E^{-T}AE^{-1}x=(E^{-1}x)^{T}A(E^{-1}x)>0\!\,}$  for all ${\displaystyle x\neq 0\!\,}$ , since ${\displaystyle A\!\,}$  is positive definite and ${\displaystyle E^{-1}x\neq 0\!\,}$
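These properties can be checked numerically. The sketch below builds ${\displaystyle {\tilde {A}}\!\,}$  and ${\displaystyle {\tilde {b}}\!\,}$  from a small SPD pair ${\displaystyle A,B\!\,}$  chosen only for illustration; note NumPy's `cholesky` returns the lower-triangular factor ${\displaystyle L\!\,}$ , so ${\displaystyle E=L^{T}\!\,}$ :

```python
import numpy as np

# Small SPD pair chosen for illustration.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 2.0])

L = np.linalg.cholesky(B)        # B = L L^T, so E = L^T is upper triangular
E = L.T
Einv = np.linalg.inv(E)
At = Einv.T @ A @ Einv           # A~ = E^{-T} A E^{-1}
bt = Einv.T @ b                  # b~ = E^{-T} b

assert np.allclose(At, At.T)                  # A~ is symmetric
assert np.all(np.linalg.eigvalsh(At) > 0)     # A~ is positive definite
# Solving the transformed system recovers the original solution x = E^{-1} x~
xt = np.linalg.solve(At, bt)
assert np.allclose(Einv @ xt, np.linalg.solve(A, b))
```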

### Pseudocode

Given ${\displaystyle x_{0}\in R^{n}\!\,}$

For ${\displaystyle k=0,1,2,\ldots \!\,}$
{\displaystyle {\begin{aligned}{\tilde {x}}_{k}&=Ex_{k}\\r_{k}&={\tilde {b}}-{\tilde {A}}{\tilde {x}}_{k}\\&{\mbox{If }}r_{k}=0,{\mbox{ then quit }}\\d_{k}&=r_{k}\\\alpha _{k}&={\frac {\langle r_{k},r_{k}\rangle }{\langle r_{k},r_{k}\rangle _{\tilde {A}}}}\\x_{k+1}&=x_{k}+\alpha _{k}E^{-1}d_{k}\end{aligned}}}
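A NumPy sketch of the preconditioned iteration (function and parameter names are illustrative); it iterates in the transformed variable ${\displaystyle {\tilde {x}}\!\,}$  and recovers ${\displaystyle x=E^{-1}{\tilde {x}}\!\,}$  at the end:

```python
import numpy as np

def preconditioned_steepest_descent(A, B, b, x0, tol=1e-10, max_iter=5000):
    """Steepest descent on the transformed system A~ x~ = b~, where
    A~ = E^{-T} A E^{-1}, b~ = E^{-T} b, and B = E^T E (Cholesky)."""
    E = np.linalg.cholesky(B).T      # E upper triangular with B = E^T E
    Einv = np.linalg.inv(E)
    At = Einv.T @ A @ Einv           # A~
    bt = Einv.T @ b                  # b~
    xt = E @ np.asarray(x0, dtype=float)   # x~_0 = E x_0
    for _ in range(max_iter):
        r = bt - At @ xt                   # residual of the transformed system
        if np.linalg.norm(r) < tol:
            break
        alpha = (r @ r) / (r @ (At @ r))   # <r,r> / <r,r>_{A~}
        xt = xt + alpha * r
    return Einv @ xt                 # recover x = E^{-1} x~
```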