Numerical Methods Qualification Exam Problems and Solutions (University of Maryland)/January 2000

Problem 1

Let $f$ be a function with three continuous derivatives. Let $p$ be a quadratic polynomial that interpolates $f$ at $x_0 < x_1 < x_2$. Let $h = x_2 - x_0$ and

$e(x) = f(x) - p(x)$.

Problem 1a

Show that

$\max_{x_0 \le x \le x_2} |e''(x)| \le C\, h \max_{x_0 \le x \le x_2} |f'''(x)|,$

where $C$ does not depend on $f$ or $h$, and determine $C$. (Hint: the key to this is to prove that $e''$ vanishes at some point in $(x_0, x_2)$. The result can then be obtained by integration.)

Solution 1a


Proof of Hint


Claim: There exists $\xi \in (x_0, x_2)$ such that $e''(\xi) = 0$.


Proof: The interpolation polynomial may be expressed using divided difference coefficients, i.e.

$p(x) = f[x_0] + f[x_0, x_1](x - x_0) + f[x_0, x_1, x_2](x - x_0)(x - x_1),$


which implies

$p''(x) = 2 f[x_0, x_1, x_2].$


In general, a divided difference coefficient may be expressed as a factorial-weighted value of a derivative of $f$ at some intermediate point, i.e. there exists $\xi \in (x_0, x_n)$ such that

$f[x_0, x_1, \ldots, x_n] = \frac{f^{(n)}(\xi)}{n!}.$


Hence,

$p''(x) = 2 \cdot \frac{f''(\xi)}{2!} = f''(\xi) \quad \text{for some } \xi \in (x_0, x_2),$


which implies

$e''(\xi) = f''(\xi) - p''(\xi) = 0.$

Application of Hint


From the hint we know that

$e''(\xi) = 0,$


which implies

$f''(\xi) = p''(\xi).$


Since $p$ is quadratic, $p''$ is constant, i.e. $p''(x) = p''(\xi) = f''(\xi)$ for all $x \in [x_0, x_2]$.


Therefore,

$e''(x) = f''(x) - p''(x) = f''(x) - f''(\xi).$


By the fundamental theorem of calculus,

$e''(x) = f''(x) - f''(\xi) = \int_\xi^x f'''(t)\, dt.$


Therefore,

$|e''(x)| = \left| \int_\xi^x f'''(t)\, dt \right| \le |x - \xi| \max_{x_0 \le t \le x_2} |f'''(t)| \le h \max_{x_0 \le t \le x_2} |f'''(t)|,$

so the bound holds with $C = 1$.
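A minimal numerical check of this bound (a sketch, not part of the original exam; the test function $f = \sin$ and the nodes below are arbitrary choices):

```python
import numpy as np

# Sketch: verify max|e''| <= h * max|f'''| (i.e. C = 1) for one test case.
f = np.sin
d2f = lambda x: -np.sin(x)   # f''
d3f = lambda x: -np.cos(x)   # f'''

x0, x1, x2 = 0.0, 0.4, 1.0   # arbitrary nodes x0 < x1 < x2
h = x2 - x0

# Divided differences of the quadratic interpolant p (Newton form).
f01 = (f(x1) - f(x0)) / (x1 - x0)
f12 = (f(x2) - f(x1)) / (x2 - x1)
f012 = (f12 - f01) / (x2 - x0)

p_dd = 2.0 * f012            # p'' is the constant 2 f[x0, x1, x2]

xs = np.linspace(x0, x2, 1001)
lhs = np.max(np.abs(d2f(xs) - p_dd))   # max |e''(x)|
rhs = h * np.max(np.abs(d3f(xs)))      # h * max |f'''(x)|
print(lhs, "<=", rhs)
```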

Problem 1b

Now suppose $x_1 - x_0 = x_2 - x_1 = h/2$, i.e. the nodes are equally spaced, and $f$ has four continuous derivatives. In this case show

$e'(x_1) = C h^2 + R, \qquad |R| \le c\, h^3 \max_{x_0 \le x \le x_2} |f^{(4)}(x)|,$

where $c$ is a constant independent of $f$ and $h$. What is $C$ in terms of the derivatives of $f$?


Solution 1b

The Midpoint Derivative of p Is a Central Difference

Writing $p$ in Newton form as in part (a) and differentiating gives $p'(x) = f[x_0, x_1] + f[x_0, x_1, x_2](2x - x_0 - x_1)$. Evaluating at $x_1$, we know that, because of the equal spacing,

$p'(x_1) = f[x_0, x_1] + f[x_0, x_1, x_2](x_1 - x_0) = \frac{f(x_2) - f(x_0)}{h},$

the central difference of $f$ across $[x_0, x_2]$.

Application of Fundamental Theorem of Calculus (Twice)

Applying Taylor's theorem about $x_1$ twice (each expansion follows from repeated application of the fundamental theorem of calculus), once to $f(x_2) = f(x_1 + h/2)$ and once to $f(x_0) = f(x_1 - h/2)$,

$f(x_1 \pm h/2) = f(x_1) \pm \frac{h}{2} f'(x_1) + \frac{h^2}{8} f''(x_1) \pm \frac{h^3}{48} f'''(x_1) + R_\pm, \qquad |R_\pm| \le \frac{h^4}{384} \max_{x_0 \le t \le x_2} |f^{(4)}(t)|.$

Subtracting the two expansions and dividing by $h$,

$p'(x_1) = \frac{f(x_2) - f(x_0)}{h} = f'(x_1) + \frac{h^2}{24} f'''(x_1) + \frac{R_+ - R_-}{h}.$

Therefore

$e'(x_1) = f'(x_1) - p'(x_1) = -\frac{f'''(x_1)}{24}\, h^2 + R, \qquad |R| \le \frac{h^3}{192} \max_{x_0 \le t \le x_2} |f^{(4)}(t)|,$

i.e. $C = -f'''(x_1)/24$.
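A short numerical check of this expansion (a sketch, not from the exam; $f = \exp$, the point $x_1$, and the step sizes are arbitrary choices): the ratio $e'(x_1)/h^2$ should approach $-f'''(x_1)/24$.

```python
import numpy as np

# Sketch: e'(x1) = f'(x1) - p'(x1), with p'(x1) = (f(x2) - f(x0)) / h.
f = df = d3f = np.exp        # for f = exp, every derivative is exp

x1 = 0.3
for h in [0.4, 0.2, 0.1, 0.05]:
    x0, x2 = x1 - h / 2, x1 + h / 2
    e_prime = df(x1) - (f(x2) - f(x0)) / h
    print(h, e_prime / h**2, -d3f(x1) / 24)   # last two columns converge
```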

Problem 2a

Find $\{p_0, p_1, p_2\}$ such that $p_k$ is a polynomial of degree $k$ and this set is orthogonal on $[0, \infty)$ with respect to the weight function $w(x) = e^{-x}$. (Note: $\int_0^\infty e^{-x}\, dx = 1$, $\int_0^\infty x^n e^{-x}\, dx = n!$)

Solution 2a


Apply Gram-Schmidt


To find the orthogonal set $\{p_0, p_1, p_2\}$, use the Gram-Schmidt method.

Let $\{1, x, x^2\}$ be a basis of $\mathcal{P}_2$, the space of polynomials of degree at most $2$.

Calculate p_0


Choose $p_0(x) = 1$.

Calculate p_1


From Gram-Schmidt, we have

$p_1(x) = x - a\, p_0(x)$, where

$a = \frac{\langle x, p_0 \rangle}{\langle p_0, p_0 \rangle} = \frac{\int_0^\infty x e^{-x}\, dx}{\int_0^\infty e^{-x}\, dx} = \frac{1}{1} = 1.$

Therefore $p_1(x) = x - 1$.

Calculate p_2


Proceeding with Gram-Schmidt, we have

$p_2(x) = x^2 - b\, p_1(x) - c\, p_0(x),$ where

$b = \frac{\langle x^2, p_1 \rangle}{\langle p_1, p_1 \rangle}, \qquad c = \frac{\langle x^2, p_0 \rangle}{\langle p_0, p_0 \rangle},$

$\langle x^2, p_1 \rangle = \int_0^\infty x^2 (x - 1) e^{-x}\, dx = 3! - 2! = 4, \qquad \langle p_1, p_1 \rangle = \int_0^\infty (x - 1)^2 e^{-x}\, dx = 2! - 2 \cdot 1! + 1 = 1,$

$\langle x^2, p_0 \rangle = \int_0^\infty x^2 e^{-x}\, dx = 2, \qquad \langle p_0, p_0 \rangle = \int_0^\infty e^{-x}\, dx = 1,$

so $b = 4$ and $c = 2$.

Therefore

$p_2(x) = x^2 - 4(x - 1) - 2 = x^2 - 4x + 2.$
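A symbolic check of this computation (a sketch assuming the weight $w(x) = e^{-x}$ on $[0, \infty)$ as reconstructed above), using sympy:

```python
import sympy as sp

x = sp.symbols('x')
w = sp.exp(-x)                        # assumed weight on [0, oo)
p = [sp.Integer(1), x - 1, x**2 - 4*x + 2]

# All pairwise inner products <p_i, p_j> = int_0^oo p_i p_j w dx should be 0.
for i in range(3):
    for j in range(i + 1, 3):
        print(i, j, sp.integrate(p[i] * p[j] * w, (x, 0, sp.oo)))
```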

Problem 2b

Derive the 2-point Gaussian formula

$\int_0^\infty e^{-x} f(x)\, dx \approx w_1 f(x_1) + w_2 f(x_2),$

i.e. find the weights $w_1, w_2$ and the nodes $x_1, x_2$.

Solution 2b


Find the Nodes


The nodes $x_1$ and $x_2$ are the roots of the $2$nd orthogonal polynomial, i.e. of $p_2(x) = x^2 - 4x + 2$.

Applying the quadratic formula yields the roots:

$x_1 = 2 - \sqrt{2},$

$x_2 = 2 + \sqrt{2}.$

Find the Weights


The approximation is exact for polynomials of degree at most $2n - 1 = 3$; in particular it is exact for $f(x) = 1$ and $f(x) = x$. Hence, we have the following system of equations:

$w_1 + w_2 = \int_0^\infty e^{-x}\, dx = 1,$

$w_1 x_1 + w_2 x_2 = \int_0^\infty x e^{-x}\, dx = 1.$

Solving the system of equations by substitution yields the weights:

$w_1 = \frac{2 + \sqrt{2}}{4},$

$w_2 = \frac{2 - \sqrt{2}}{4}.$
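As a sanity check (a sketch under the same Gauss-Laguerre reading of the problem), the rule should reproduce the moments $\int_0^\infty x^k e^{-x}\, dx = k!$ exactly for $k = 0, \ldots, 3$:

```python
from math import factorial, sqrt

nodes = [2 - sqrt(2), 2 + sqrt(2)]
weights = [(2 + sqrt(2)) / 4, (2 - sqrt(2)) / 4]

# A 2-point Gauss rule is exact up to degree 2n - 1 = 3.
for k in range(4):
    approx = sum(w * xi**k for w, xi in zip(weights, nodes))
    print(k, approx, factorial(k))   # the last two columns should agree
```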

Problem 3

Let $A$ be an $n \times n$ nonsingular matrix, and consider the linear system $Ax = b$.


Problem 3a

Write down the Jacobi iteration for solving $Ax = b$ in a way that it would be programmed on a computer.

Solution 3a


Choose $x^{(0)}$.

for $k = 0, 1, 2, \ldots$
    for $i = 1, \ldots, n$
        $x_i^{(k+1)} = \left( b_i - \sum_{j \ne i} a_{ij} x_j^{(k)} \right) / a_{ii}$
    end
    if <convergence condition>, stop
end

where $A = D - L - U$, $D$ is diagonal, and $L$, $U$ are strictly lower and upper triangular, respectively; in matrix form one sweep is $x^{(k+1)} = D^{-1}\left( b + (L + U) x^{(k)} \right)$.
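A minimal runnable version of this pseudocode (a sketch; the stopping rule, tolerance, and test system below are illustrative choices, not part of the exam):

```python
import numpy as np

def jacobi(A, b, x0, tol=1e-10, max_iter=500):
    """Jacobi iteration: x_i <- (b_i - sum_{j != i} a_ij x_j) / a_ii."""
    D = np.diag(A)                        # diagonal entries (must be nonzero)
    R = A - np.diagflat(D)                # off-diagonal part of A
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x, np.inf) < tol:   # <convergence condition>
            return x_new
        x = x_new
    return x

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])           # strictly diagonally dominant
b = np.array([1.0, 2.0, 3.0])
print(jacobi(A, b, np.zeros(3)))          # compare: np.linalg.solve(A, b)
```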

Problem 3b

Suppose $A$ has $m$ non-zero elements with $m \ll n^2$. How many operations per iteration does the Jacobi iteration take?

Solution 3b


The $n$ diagonal entries of $A$ are non-zero, since otherwise $D^{-1}$, and hence the Jacobi iteration, would not exist.

Therefore $A$ contains $m - n$ off-diagonal non-zero entries.

The computation during each iteration is given by

$x^{(k+1)} = D^{-1}\left( b + (L + U) x^{(k)} \right).$

Forming $(L + U) x^{(k)}$ costs one multiplication per off-diagonal non-zero entry, and applying $D^{-1}$ costs $n$ divisions. Therefore there are $m - n$ multiplications (plus $n$ divisions) in each iteration, i.e. $O(m)$ operations.
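The count can be seen concretely with a sparse matrix (a sketch using scipy; the tridiagonal test matrix is an arbitrary choice): one sweep touches each off-diagonal non-zero exactly once.

```python
import numpy as np
from scipy.sparse import diags

n = 1000
off = np.full(n - 1, -1.0)
A = diags([off, np.full(n, 4.0), off], [-1, 0, 1], format="csr")

m = A.nnz                                  # total non-zeros of A
D = A.diagonal()                           # n diagonal entries
R = A - diags(D, 0, format="csr")          # m - n off-diagonal non-zeros
b = np.ones(n)
x = np.zeros(n)
x = (b - R @ x) / D                        # one sweep: m - n multiplies, n divides
print("m =", m, "; multiplies per sweep =", m - n)
```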

Problem 3c

Assume that $A$ is strictly diagonally dominant: for $i = 1, \ldots, n$,

$|a_{ii}| > \sum_{j \ne i} |a_{ij}|.$

Show that the Jacobi iteration converges for any guess $x^{(0)}$. (Hint: You may use Gerschgorin's theorem without proving it.)

Solution 3c


Let $M = D^{-1}(L + U)$ denote the Jacobi iteration matrix.

Theorem 8.2.1 [SB] states that $\rho(M) < 1$ if and only if the Jacobi iteration converges for every initial guess.

Matrix multiplication and the definitions of $D$, $L$, $U$ give the explicit entrywise values of $M$:

$m_{ij} = -\frac{a_{ij}}{a_{ii}}$ for $i \ne j$, and $m_{ii} = 0$.

Then, by Gerschgorin's theorem, every eigenvalue of $M$ lies in a disk centered at $m_{ii} = 0$ with radius

$\sum_{j \ne i} |m_{ij}| = \frac{\sum_{j \ne i} |a_{ij}|}{|a_{ii}|} < 1,$

where the last inequality is strict diagonal dominance. Hence $\rho(M) < 1$, and the Jacobi iteration converges for any $x^{(0)}$.
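A quick numerical illustration (a sketch, not part of the exam; the random test matrix is an arbitrary choice): for a strictly diagonally dominant $A$, the spectral radius of $M = D^{-1}(L + U)$ sits below the maximal Gerschgorin row sum, which is below $1$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
A += np.diag(1.0 + np.sum(np.abs(A), axis=1))   # force strict diagonal dominance

D = np.diag(np.diag(A))
M = np.linalg.solve(D, D - A)       # M = D^{-1}(L + U) = I - D^{-1} A
rho = max(abs(np.linalg.eigvals(M)))
gersh = np.max(np.sum(np.abs(M), axis=1))       # max Gerschgorin radius (m_ii = 0)
print(rho, "<=", gersh, "< 1")
```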