Numerical Methods Qualification Exam Problems and Solutions (University of Maryland)/January 2006

Problem 1


Let $f$ be continuous on $[a,b]$. A polynomial $p_n$ of degree not greater than $n$ is said to be a best (or Chebyshev) approximation to $f$ if $p_n$ minimizes the expression

$$\|f - p_n\|_\infty = \max_{a \le x \le b} \left| f(x) - p_n(x) \right|.$$

Problem 1a


Show that a sufficient condition for $p_n$ to be a best approximation is that there exist $n+2$ points $a \le x_0 < x_1 < \cdots < x_{n+1} \le b$ such that

$$f(x_i) - p_n(x_i) = -\left[ f(x_{i+1}) - p_n(x_{i+1}) \right], \qquad \left| f(x_i) - p_n(x_i) \right| = \|f - p_n\|_\infty, \qquad i = 0, 1, \ldots, n.$$

Solution 1a


Assume, for the sake of contradiction, that there exists a polynomial $q_n$ of degree at most $n$ such that

$$\|f - q_n\|_\infty < \|f - p_n\|_\infty.$$

Then for $i = 0, 1, \ldots, n+1$,

$$\left| f(x_i) - q_n(x_i) \right| \le \|f - q_n\|_\infty < \|f - p_n\|_\infty = \left| f(x_i) - p_n(x_i) \right|.$$

Let $e(x) = f(x) - p_n(x)$ and $d(x) = q_n(x) - p_n(x)$.

Then $d(x_i)$ takes on the sign of $e(x_i)$ since

$$d(x_i) = \left[ f(x_i) - p_n(x_i) \right] - \left[ f(x_i) - q_n(x_i) \right] = e(x_i) - \left[ f(x_i) - q_n(x_i) \right]$$

and the second term is strictly smaller in magnitude than $e(x_i)$.

Since $e(x_i)$ changes sign $n+1$ times (by hypothesis), $d$ changes sign $n+1$ times and hence has at least $n+1$ zeros in $[a,b]$.

However $d$ is a polynomial of degree at most $n$ and thus can have at most $n$ zeros unless it vanishes identically. Therefore $d \equiv 0$ and $p_n \equiv q_n$, contradicting the assumption that $\|f - q_n\|_\infty$ is strictly smaller. Hence no polynomial of degree at most $n$ has a smaller error, and $p_n$ is a best approximation.

Problem 1b


Compute the best linear approximation to $f(x) = x^2$ on $[0,1]$. [Hint: Drawing a line through the parabola will allow you to conjecture where two points of oscillation must lie. Use the conditions from (a) to determine the third point and the coefficients of the linear approximation $p_1(x) = a_0 + a_1 x$.]

Solution 1b


First we need to find the roots of the shifted Chebyshev polynomial $T_2^*(x) = T_2(2x-1) = 8x^2 - 8x + 1$ in $[0,1]$, which are given by

$$x = \frac{2 \pm \sqrt{2}}{4}.$$

So our points at which to interpolate are

$$x_1 = \frac{2 - \sqrt{2}}{4},$$

$$x_2 = \frac{2 + \sqrt{2}}{4}.$$

Our linear interpolant passes through the points $(x_1, x_1^2)$ and $(x_2, x_2^2)$, which using point-slope form gives the equation

$$p_1(x) = x_1^2 + \frac{x_2^2 - x_1^2}{x_2 - x_1}\,(x - x_1) = x_1^2 + (x_1 + x_2)(x - x_1) = x_1^2 + (x - x_1),$$

or

$$p_1(x) = x - \frac{1}{8}.$$

Indeed, the error $x^2 - p_1(x) = \tfrac{1}{8} T_2^*(x)$ attains the values $+\tfrac{1}{8}, -\tfrac{1}{8}, +\tfrac{1}{8}$ at $x = 0, \tfrac{1}{2}, 1$, so the sufficient condition of part (a) holds with $n+2 = 3$ alternation points and $p_1$ is the best linear approximation.
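As a quick sanity check (not part of the original exam solution), the following Python sketch, assuming NumPy is available, verifies numerically that the error $x^2 - (x - \tfrac{1}{8})$ has maximum magnitude $\tfrac{1}{8}$ on $[0,1]$ and alternates in sign at $x = 0, \tfrac{1}{2}, 1$:

```python
import numpy as np

# Sanity check for part (b): the error of p1(x) = x - 1/8 against f(x) = x^2
# should have maximum magnitude 1/8 on [0, 1] and alternate in sign at 0, 1/2, 1.
f = lambda x: x**2
p1 = lambda x: x - 1.0 / 8.0

xs = np.linspace(0.0, 1.0, 10001)
err = f(xs) - p1(xs)

print("max |f - p1| on [0,1]:", np.max(np.abs(err)))                      # ~0.125
print("error at 0, 1/2, 1   :", [f(t) - p1(t) for t in (0.0, 0.5, 1.0)])  # +1/8, -1/8, +1/8
```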

Problem 2


We will be concerned with the least squares problem of minimizing

$$\|Ax - b\|_2.$$

Here $A$ is an $m \times n$ matrix of rank $n$ (which implies $m \ge n$) and $\|\cdot\|_2$ is the Euclidean vector norm. Let

$$A = Q \begin{pmatrix} R \\ 0 \end{pmatrix}$$

be the QR decomposition of $A$. Here $Q$ and $R$ are respectively an $m \times m$ orthogonal matrix and an $n \times n$ upper triangular, non-singular matrix.

Problem 2a


Show that the solution of the least squares problem satisfies the QR equation $Rx = c$, where $Q^T b = \begin{pmatrix} c \\ d \end{pmatrix}$ with $c \in \mathbb{R}^n$ and $d \in \mathbb{R}^{m-n}$, and that the solution is unique. Further show that $\min_x \|Ax - b\|_2 = \|d\|_2$.


Solution 2a


The two problems are equivalent


First notice

$$\|Ax - b\|_2 = \left\| Q^T (Ax - b) \right\|_2 = \left\| \begin{pmatrix} R \\ 0 \end{pmatrix} x - Q^T b \right\|_2.$$

Then, writing $Q^T b = \begin{pmatrix} c \\ d \end{pmatrix}$, we can write

$$\left\| \begin{pmatrix} R \\ 0 \end{pmatrix} x - \begin{pmatrix} c \\ d \end{pmatrix} \right\|_2^2 = \|Rx - c\|_2^2 + \|d\|_2^2.$$

Note that multiplying by orthogonal matrices does not affect the norm.

Then solving $\min_x \|Ax - b\|_2$ is equivalent to minimizing $\|Rx - c\|_2$, which is equivalent to solving $Rx = c$. Note that a solution exists and is unique since $R$ is $n \times n$ and non-singular.

Show that $\min_x \|Ax - b\|_2 = \|d\|_2$


Similarly,

$$\min_x \|Ax - b\|_2^2 = \min_x \left( \|Rx - c\|_2^2 + \|d\|_2^2 \right) = \|d\|_2^2,$$

since the first term vanishes at the solution of $Rx = c$. Then

$$\min_x \|Ax - b\|_2^2 = \|d\|_2^2, \quad \text{or simply} \quad \min_x \|Ax - b\|_2 = \|d\|_2,$$

as desired.
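The following Python sketch (my own illustration on random data using NumPy, not part of the exam solution) solves a small overdetermined problem through the QR equation $Rx = c$ and checks that the minimal residual equals $\|d\|_2$:

```python
import numpy as np

# Illustration of Solution 2a on random data: solve the least squares problem
# via the QR equation R x = c and compare the residual with ||d||_2.
rng = np.random.default_rng(0)
m, n = 7, 3
A = rng.standard_normal((m, n))              # full column rank with probability 1
b = rng.standard_normal(m)

Q, Rfull = np.linalg.qr(A, mode="complete")  # Q is m-by-m, Rfull is m-by-n
R = Rfull[:n, :]                             # n-by-n upper triangular block
c = (Q.T @ b)[:n]                            # first n components of Q^T b
d = (Q.T @ b)[n:]                            # last m-n components of Q^T b

x = np.linalg.solve(R, c)                    # QR equation: R x = c

print("||Ax - b||_2 :", np.linalg.norm(A @ x - b))
print("||d||_2      :", np.linalg.norm(d))
print("agrees with lstsq:", np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))
```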

Problem 2b


Use the QR equation to show that the least squares solution satisfies the normal equations $A^T A x = A^T b$.

Solution 2b


Using $A = Q \begin{pmatrix} R \\ 0 \end{pmatrix}$, $Q^T Q = I$, and the QR equation $Rx = c$,

$$A^T A x = \begin{pmatrix} R^T & 0 \end{pmatrix} Q^T Q \begin{pmatrix} R \\ 0 \end{pmatrix} x = R^T R x = R^T c = \begin{pmatrix} R^T & 0 \end{pmatrix} \begin{pmatrix} c \\ d \end{pmatrix} = \begin{pmatrix} R^T & 0 \end{pmatrix} Q^T b = A^T b.$$
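A short numerical check (again on assumed random data, not from the exam) confirms that the solution of the QR equation also satisfies the normal equations:

```python
import numpy as np

# Check of Solution 2b on random data: the solution of R x = Q^T b
# satisfies the normal equations A^T A x = A^T b up to rounding error.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 2))
b = rng.standard_normal(6)

Q, R = np.linalg.qr(A)                 # reduced QR: Q is 6-by-2, R is 2-by-2
x = np.linalg.solve(R, Q.T @ b)        # QR equation

print(np.allclose(A.T @ A @ x, A.T @ b))   # True
```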

Problem 3


Let $A$ be an $n \times n$ real symmetric matrix and let $x_1 \ne 0$ be given. For $j = 1, 2, \ldots$, define $x_{j+1}$ as the linear combination of the vectors $A x_j, x_j, x_{j-1}, \ldots, x_1$ with the coefficient of $A x_j$ equal to one and orthogonal to the vectors $x_j, x_{j-1}, \ldots, x_1$; i.e.

$$x_{j+1} = A x_j - \sum_{i=1}^{j} c_{ji} x_i, \qquad (x_{j+1}, x_i) = 0 \quad \text{for } i = 1, \ldots, j.$$

Problem 3a


Find formulas for the coefficients $c_{jj}$ and $c_{j,j-1}$.

Solution 3a


Using Gram-Schmidt, we have

$$c_{ji} = \frac{(A x_j, x_i)}{(x_i, x_i)}, \qquad i = 1, \ldots, j;$$

in particular $c_{jj} = \dfrac{(A x_j, x_j)}{(x_j, x_j)}$ and $c_{j,j-1} = \dfrac{(A x_j, x_{j-1})}{(x_{j-1}, x_{j-1})}$.

Problem 3b


Show that $x_{j+1} = A x_j - c_{jj} x_j - c_{j,j-1} x_{j-1}$, i.e. that $c_{ji} = 0$ for $i < j - 1$. Where do you use the symmetry of $A$?

Solution 3b


Since

$$c_{ji} = \frac{(A x_j, x_i)}{(x_i, x_i)},$$

if $i \le j - 2$, then it suffices to show that

$$(A x_j, x_i) = 0.$$

Since $A$ is symmetric,

$$(A x_j, x_i) = (x_j, A x_i).$$

From hypothesis, $(x_j, x_k) = 0$ for all $k < j$.

Also from hypothesis,

$$A x_i = x_{i+1} + \sum_{k=1}^{i} c_{ik} x_k.$$

Using the above results we have, since $i + 1 \le j - 1 < j$,

$$(x_j, A x_i) = (x_j, x_{i+1}) + \sum_{k=1}^{i} c_{ik} (x_j, x_k) = 0.$$

Hence $c_{ji} = 0$ for $i \le j - 2$, and the recurrence reduces to $x_{j+1} = A x_j - c_{jj} x_j - c_{j,j-1} x_{j-1}$. The symmetry of $A$ is used to move $A$ from $x_j$ onto $x_i$ inside the inner product.
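The following Python sketch (random symmetric test data, my own illustration rather than part of the exam) compares the full Gram-Schmidt construction of $x_{j+1}$ with the three-term recurrence proved above; for a symmetric $A$ the two agree to rounding error:

```python
import numpy as np

# For symmetric A, subtracting only the x_j and x_{j-1} terms gives the same
# x_{j+1} as orthogonalizing A x_j against *all* previous vectors x_1, ..., x_j.
rng = np.random.default_rng(2)
n = 6
B = rng.standard_normal((n, n))
A = B + B.T                                    # real symmetric matrix
xs = [rng.standard_normal(n)]                  # x_1

for j in range(4):
    Axj = A @ xs[j]
    c = [(Axj @ xi) / (xi @ xi) for xi in xs]  # c_{j,i} = (A x_j, x_i)/(x_i, x_i)
    x_full = Axj - sum(ci * xi for ci, xi in zip(c, xs))
    x_short = Axj - sum(ci * xi for ci, xi in zip(c[-2:], xs[-2:]))
    print(np.allclose(x_full, x_short))        # True: c_{j,i} = 0 for i < j-1
    xs.append(x_full)
```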

Problem 3c


For which non-zero vectors $x_1$ does $x_2 = 0$ hold?

Solution 3c


For $j = 1$,

$$x_2 = A x_1 - c_{11} x_1 = A x_1 - \frac{(A x_1, x_1)}{(x_1, x_1)} x_1.$$

If $x_2 = 0$, then

$$A x_1 = \frac{(A x_1, x_1)}{(x_1, x_1)} x_1.$$

Since $\dfrac{(A x_1, x_1)}{(x_1, x_1)}$ is a scalar, $x_1$ is an eigenvector of $A$. Conversely, if $x_1$ is an eigenvector of $A$, then $A x_1$ is a multiple of $x_1$ and $x_2 = 0$. Thus $x_2 = 0$ holds exactly when $x_1$ is an eigenvector of $A$.
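Finally, a short numerical confirmation (using a small assumed symmetric test matrix, not from the exam) that choosing an eigenvector as $x_1$ gives $x_2 = 0$:

```python
import numpy as np

# If x_1 is an eigenvector of A, then x_2 = A x_1 - c_11 x_1 = 0,
# and c_11 equals the corresponding eigenvalue.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                   # symmetric, eigenvalues 1 and 3
w, V = np.linalg.eigh(A)

x1 = V[:, 0]                                 # eigenvector for eigenvalue 1
c11 = (A @ x1 @ x1) / (x1 @ x1)              # c_11 = (A x_1, x_1)/(x_1, x_1)
x2 = A @ x1 - c11 * x1

print("c_11 =", c11)                         # 1.0
print("x_2  =", x2)                          # numerically the zero vector
```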