Numerical Methods Qualification Exam Problems and Solutions (University of Maryland)/August 2005

Problem 1

Derive the one-, two-, and three-point Gaussian quadrature formulas for an integral of the form

$\int_a^b w(x) f(x)\,dx \approx \sum_{i=1}^{n} w_i f(x_i).$

Give bounds on the error of these formulas.

Solution 1

Find Orthogonal Polynomials w.r.t. Weighting Function

For this problem, we need the orthogonal polynomials $p_0, p_1, p_2, p_3$ with respect to the weighted inner product

$\langle f, g \rangle = \int_a^b w(x) f(x) g(x)\,dx,$

where $w$ is the given weight function. These can be found using Gram-Schmidt orthogonalization on the monomial basis $\{1, x, x^2, x^3\}$.

Zero Order Polynomial

Let $p_0(x) = 1$.

First Order Polynomial

$p_1(x) = x - \dfrac{\langle x, p_0 \rangle}{\langle p_0, p_0 \rangle}\, p_0(x)$

Second Order Polynomial

$p_2(x) = x^2 - \dfrac{\langle x^2, p_1 \rangle}{\langle p_1, p_1 \rangle}\, p_1(x) - \dfrac{\langle x^2, p_0 \rangle}{\langle p_0, p_0 \rangle}\, p_0(x)$

Third Order Polynomial

$p_3(x) = x^3 - \dfrac{\langle x^3, p_2 \rangle}{\langle p_2, p_2 \rangle}\, p_2(x) - \dfrac{\langle x^3, p_1 \rangle}{\langle p_1, p_1 \rangle}\, p_1(x) - \dfrac{\langle x^3, p_0 \rangle}{\langle p_0, p_0 \rangle}\, p_0(x)$
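As a concrete check of the Gram-Schmidt construction, here is a minimal sympy sketch. The weight $w(x) = 1$ and interval $[0, 1]$ are placeholder assumptions, not the problem's actual weight; substitute the exam's weight and limits as needed.

```python
import sympy as sp

x = sp.symbols('x')

# Placeholder weight and interval -- substitute the weight function and
# limits given in the exam problem.
w = sp.Integer(1)   # e.g. w(x) = 1
a, b = 0, 1         # e.g. the interval [0, 1]

def inner(f, g):
    """Weighted inner product <f, g> = integral_a^b w(x) f(x) g(x) dx."""
    return sp.integrate(w * f * g, (x, a, b))

# Gram-Schmidt on the monomial basis {1, x, x^2, x^3}
p = [sp.Integer(1)]
for n in range(1, 4):
    q = x**n
    for pk in p:
        q -= inner(x**n, pk) / inner(pk, pk) * pk
    p.append(sp.expand(q))

for n, pn in enumerate(p):
    print(f"p_{n}(x) =", pn)
```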

Find Roots of Polynomials

The roots of these polynomials are the interpolation nodes used in Gaussian quadrature: the $n$-point formula uses the $n$ roots of $p_n$.
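Continuing the sketch above, the nodes can be computed numerically. The coefficients below are those of the $p_3$ produced for the placeholder weight $w(x) = 1$ on $[0, 1]$, not the problem's actual weight.

```python
import numpy as np

# Coefficients of p_3 in descending order, for the placeholder weight
# w(x) = 1 on [0, 1]; replace with the coefficients for the actual weight.
p3_coeffs = [1, -3/2, 3/5, -1/20]
nodes = np.sort(np.roots(p3_coeffs))
print(nodes)   # three-point Gauss nodes for the placeholder weight
```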

Formula to Compute Coefficients

In Gaussian quadrature we have

$\int_a^b w(x) f(x)\,dx \approx \sum_{i=1}^{n} w_i f(x_i)$, where

$w_i = \int_a^b w(x)\, \ell_i(x)\,dx, \qquad \ell_i(x) = \prod_{j \neq i} \frac{x - x_j}{x_i - x_j}.$
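A small sympy sketch of this weight computation, again under the placeholder weight $w(x) = 1$ on $[0, 1]$:

```python
import sympy as sp

x = sp.symbols('x')
w = sp.Integer(1)   # placeholder weight, as above
a, b = 0, 1

def gauss_weights(nodes):
    """Quadrature weights w_i = integral_a^b w(x) l_i(x) dx for given nodes."""
    weights = []
    for i, xi in enumerate(nodes):
        li = sp.prod([(x - xj) / (xi - xj)
                      for j, xj in enumerate(nodes) if j != i])
        weights.append(sp.simplify(sp.integrate(w * li, (x, a, b))))
    return weights

# Example: the two-point rule for the placeholder weight; the nodes are
# the roots of p_2(x) = x^2 - x + 1/6.
nodes = sp.solve(x**2 - x + sp.Rational(1, 6), x)
print(gauss_weights(nodes))   # both weights are 1/2 in this case
```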

1 point formula

For $n = 1$ there is a single node and $\ell_1(x) \equiv 1$, so

$w_1 = \int_a^b w(x)\,dx.$

The formula is

$\int_a^b w(x) f(x)\,dx \approx w_1 f(x_1),$

where $x_1$ is the root of $p_1$.

2 point formula

For $n = 2$,

$w_1 = \int_a^b w(x)\, \frac{x - x_2}{x_1 - x_2}\,dx,$

$w_2 = \int_a^b w(x)\, \frac{x - x_1}{x_2 - x_1}\,dx.$

The formula is

$\int_a^b w(x) f(x)\,dx \approx w_1 f(x_1) + w_2 f(x_2),$

where $x_1, x_2$ are the roots of $p_2$.

3 point formula

For $n = 3$,

$w_1 = \int_a^b w(x)\, \frac{(x - x_2)(x - x_3)}{(x_1 - x_2)(x_1 - x_3)}\,dx,$

$w_2 = \int_a^b w(x)\, \frac{(x - x_1)(x - x_3)}{(x_2 - x_1)(x_2 - x_3)}\,dx,$

$w_3 = \int_a^b w(x)\, \frac{(x - x_1)(x - x_2)}{(x_3 - x_1)(x_3 - x_2)}\,dx.$

The formula is

$\int_a^b w(x) f(x)\,dx \approx w_1 f(x_1) + w_2 f(x_2) + w_3 f(x_3),$

where $x_1, x_2, x_3$ are the roots of $p_3$.
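A quick numerical sanity check that a three-point rule is exact through degree 5. The nodes and weights used are those for the placeholder weight $w(x) = 1$ on $[0, 1]$ (Gauss-Legendre mapped to $[0, 1]$), not the problem's:

```python
import numpy as np

# Three-point rule for the placeholder weight w(x) = 1 on [0, 1]:
# it should integrate x^k exactly for k = 0, ..., 5.
nodes = np.array([(1 - np.sqrt(0.6)) / 2, 0.5, (1 + np.sqrt(0.6)) / 2])
weights = np.array([5/18, 8/18, 5/18])   # Gauss-Legendre weights on [0, 1]

for k in range(6):
    quad = np.dot(weights, nodes**k)
    exact = 1 / (k + 1)                  # integral of x^k over [0, 1]
    assert abs(quad - exact) < 1e-14, (k, quad, exact)
print("3-point rule exact through degree 5")
```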

Derive Error Bound

We know that the $n$-point quadrature is exact for all polynomials of degree at most $2n - 1$.


We choose a polynomial $H_{2n-1}$ of degree at most $2n - 1$ that Hermite interpolates $f$ at the nodes, i.e.


$H_{2n-1}(x_i) = f(x_i), \qquad H_{2n-1}'(x_i) = f'(x_i), \qquad i = 1, \ldots, n.$


The error for this interpolation is


$f(x) - H_{2n-1}(x) = \frac{f^{(2n)}(\xi(x))}{(2n)!} \prod_{i=1}^{n} (x - x_i)^2.$


Compute the error of the quadrature as follows:


$E(f) = \int_a^b w(x) f(x)\,dx - \sum_{i=1}^{n} w_i f(x_i)$

$= \int_a^b w(x) f(x)\,dx - \sum_{i=1}^{n} w_i H_{2n-1}(x_i)$

$= \int_a^b w(x)\left[f(x) - H_{2n-1}(x)\right]dx$

$= \int_a^b w(x)\, \frac{f^{(2n)}(\xi(x))}{(2n)!} \prod_{i=1}^{n} (x - x_i)^2\,dx$

$= \frac{f^{(2n)}(\xi)}{(2n)!} \int_a^b w(x) \prod_{i=1}^{n} (x - x_i)^2\,dx,$


where the second equality uses $H_{2n-1}(x_i) = f(x_i)$, the third uses the exactness of the quadrature for $H_{2n-1}$, and the last line follows from the mean value theorem for integrals, since $w(x) \prod_i (x - x_i)^2$ does not change sign.


Notice that $\prod_{i=1}^{n} (x - x_i)$ is exactly the orthogonal polynomial $p_n$, since both are monic and the nodes $x_i$ are the roots of $p_n$; hence the remaining integral is $\langle p_n, p_n \rangle$.


Hence, the error for one-point Gaussian quadrature is


$E_1(f) = \frac{f''(\xi)}{2!} \langle p_1, p_1 \rangle.$


The error for 2 point quadrature:


$E_2(f) = \frac{f^{(4)}(\xi)}{4!} \langle p_2, p_2 \rangle.$


The error for 3 point quadrature:


$E_3(f) = \frac{f^{(6)}(\xi)}{6!} \langle p_3, p_3 \rangle.$
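For instance, the constant multiplying $\max |f^{(6)}|$ in the three-point bound can be evaluated with sympy; the polynomial below is the $p_3$ obtained for the placeholder weight $w(x) = 1$ on $[0, 1]$, not the problem's:

```python
import sympy as sp

x = sp.symbols('x')
w = sp.Integer(1)    # placeholder weight, as in the earlier sketches
a, b = 0, 1

# Error coefficient <p_3, p_3>/6! for the three-point rule under the
# placeholder weight; p_3 is the monic polynomial from the sketch above.
p3 = x**3 - sp.Rational(3, 2)*x**2 + sp.Rational(3, 5)*x - sp.Rational(1, 20)
coeff = sp.integrate(w * p3**2, (x, a, b)) / sp.factorial(6)
print(coeff)   # |E_3(f)| <= coeff * max|f^(6)| on [a, b]
```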

Problem 2

We wish to solve $Ax = b$ iteratively for a given $3 \times 3$ matrix $A$.

Show that for this $A$ the Jacobi method and the Gauss-Seidel method both converge. Explain why for this $A$ one of these methods is better than the other.

Solution 2

Jacobi Method

Decompose $A$ into its diagonal, strictly lower triangular, and strictly upper triangular parts, i.e.


$A = D + L + U.$


Derive the Jacobi iteration by solving for $x$ as follows:


$Ax = b \;\Rightarrow\; Dx = b - (L + U)x \;\Rightarrow\; x^{(k+1)} = -D^{-1}(L + U)\, x^{(k)} + D^{-1} b.$
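A minimal numpy sketch of the Jacobi iteration; the matrix in the usage example is an illustrative diagonally dominant one, since the exam's $A$ is not reproduced here:

```python
import numpy as np

def jacobi(A, b, x0, iters=50):
    """Jacobi iteration x_{k+1} = D^{-1} (b - (L + U) x_k)."""
    D = np.diag(np.diag(A))
    LU = A - D                        # strictly lower + strictly upper parts
    x = x0.astype(float)
    for _ in range(iters):
        x = np.linalg.solve(D, b - LU @ x)
    return x

# Illustrative example (not the exam's matrix):
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 4.0, 1.0],
              [1.0, 1.0, 4.0]])
b = np.array([6.0, 6.0, 6.0])
print(jacobi(A, b, np.zeros(3)))      # converges to [1, 1, 1]
```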

Gauss-Seidel Method

Decompose $A$ as before into its diagonal, strictly lower triangular, and strictly upper triangular parts, i.e.


$A = D + L + U.$


Derive the Gauss-Seidel iteration by solving for $x$ as follows:


$Ax = b \;\Rightarrow\; (D + L)x = b - Ux \;\Rightarrow\; x^{(k+1)} = -(D + L)^{-1} U\, x^{(k)} + (D + L)^{-1} b.$
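The corresponding numpy sketch, under the same caveat about the matrix:

```python
import numpy as np

def gauss_seidel(A, b, x0, iters=50):
    """Gauss-Seidel iteration x_{k+1} = (D + L)^{-1} (b - U x_k).

    Unlike Jacobi, each sweep uses updated components of x as soon as
    they become available.
    """
    DL = np.tril(A)                   # D + L: lower triangle with diagonal
    U = A - DL                        # strictly upper part
    x = x0.astype(float)
    for _ in range(iters):
        x = np.linalg.solve(DL, b - U @ x)
    return x

# Usage mirrors the Jacobi sketch above: gauss_seidel(A, b, np.zeros(3))
```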

Jacobi Convergence

Convergence occurs for the Jacobi iteration if the spectral radius of the iteration matrix $T_J = -D^{-1}(L + U)$ is less than 1, i.e.


$\rho(T_J) < 1.$


Computing the eigenvalues of the matrix $T_J$ for the given $A$, one finds a single eigenvalue of algebraic multiplicity 3 whose modulus is less than 1.


Therefore, the spectral radius satisfies $\rho(T_J) < 1$, and the Jacobi iteration converges.

Gauss-Seidel Convergence

Convergence occurs for the Gauss-Seidel iteration if the spectral radius of the iteration matrix $T_{GS} = -(D + L)^{-1} U$ is less than 1, i.e.


$\rho(T_{GS}) < 1.$


Computing the eigenvalues of the matrix $T_{GS}$ for the given $A$, one finds that every eigenvalue has modulus less than 1.

Therefore, the spectral radius satisfies $\rho(T_{GS}) < 1$, and the Gauss-Seidel iteration converges.

Comparison of Methods

In general, the Gauss-Seidel iteration converges faster than the Jacobi iteration, since Gauss-Seidel uses the new components of $x^{(k+1)}$ as soon as they become available; in this case, however, $\rho(T_J) < \rho(T_{GS})$, so the Jacobi iteration is faster.
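A numerical check of the convergence criterion, again with an illustrative matrix in place of the exam's $A$ (note that for this particular sample matrix Gauss-Seidel happens to have the smaller spectral radius):

```python
import numpy as np

def spectral_radius(T):
    """Largest eigenvalue modulus of the iteration matrix T."""
    return max(abs(np.linalg.eigvals(T)))

A = np.array([[4.0, 1.0, 1.0],     # illustrative matrix, not the exam's A
              [1.0, 4.0, 1.0],
              [1.0, 1.0, 4.0]])
D = np.diag(np.diag(A))
L = np.tril(A, -1)
U = np.triu(A, 1)

T_J = -np.linalg.solve(D, L + U)           # Jacobi iteration matrix
T_GS = -np.linalg.solve(D + L, U)          # Gauss-Seidel iteration matrix
print(spectral_radius(T_J), spectral_radius(T_GS))   # both < 1: convergence
```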

Problem 3

Consider the three-term polynomial recurrence

$p_{n+1}(x) = (x - a_{n+1})\, p_n(x) - b_{n+1}^2\, p_{n-1}(x), \qquad n \geq 0,$

initialized by $p_{-1}(x) = 0$ and $p_0(x) = 1$, where each $a_k$ is real and each $b_k$ is nonzero.

Problem 3a

Prove that each $p_n$ is a monic polynomial of degree $n$, and for every $n \geq 0$, one has

$\operatorname{span}\{p_0, p_1, \ldots, p_n\} = \operatorname{span}\{1, x, \ldots, x^n\}.$


Solution 3a

We prove the claim by induction.

Base Case

$p_0(x) = 1$ is monic with degree zero, and $\operatorname{span}\{p_0\} = \operatorname{span}\{1\}$.


$p_1(x) = x - a_1$ is monic with degree one, and $\operatorname{span}\{p_0, p_1\} = \operatorname{span}\{1, x\}$.


$p_2(x) = (x - a_2)(x - a_1) - b_2^2$ is monic with degree 2, and $\operatorname{span}\{p_0, p_1, p_2\} = \operatorname{span}\{1, x, x^2\}$.

Induction Step

Assume $p_{n-1}$ is monic with degree $n - 1$, and $\operatorname{span}\{p_0, \ldots, p_{n-1}\} = \operatorname{span}\{1, \ldots, x^{n-1}\}$.


Also, assume $p_{n-2}$ is monic with degree $n - 2$, and $\operatorname{span}\{p_0, \ldots, p_{n-2}\} = \operatorname{span}\{1, \ldots, x^{n-2}\}$.


Then from the hypothesis, $p_n(x) = (x - a_n)\, p_{n-1}(x) - b_n^2\, p_{n-2}(x)$ is monic with degree $n$, since the leading term $x^n$ comes from $x \cdot p_{n-1}(x)$ and every other term has degree at most $n - 1$.


Also,


$\operatorname{span}\{p_0, \ldots, p_n\} = \operatorname{span}\{1, \ldots, x^n\},$


since $p_n$ is just $x^n$ plus a linear combination of lower order terms already in $\operatorname{span}\{1, \ldots, x^{n-1}\}$.
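A short sympy sketch that generates the $p_n$ from the recurrence and checks that each is monic of the expected degree; the coefficient values are arbitrary choices, since the problem leaves $a_k$ and $b_k$ general:

```python
import sympy as sp

x = sp.symbols('x')

def recurrence_polys(a, b, N):
    """p_0, ..., p_N from p_{n+1} = (x - a[n+1]) p_n - b[n+1]**2 p_{n-1}."""
    p = [sp.Integer(1)]                        # p_0 = 1
    prev = sp.Integer(0)                       # p_{-1} = 0
    for n in range(1, N + 1):
        p_next = sp.expand((x - a[n]) * p[-1] - b[n]**2 * prev)
        prev = p[-1]
        p.append(p_next)
    return p

# Arbitrary sample coefficients a_k = k, b_k = 1:
a = {k: k for k in range(1, 5)}
b = {k: 1 for k in range(1, 5)}
for n, pn in enumerate(recurrence_polys(a, b, 4)):
    assert sp.degree(pn, x) == n and sp.LC(pn, x) == 1   # monic of degree n
    print(f"p_{n}(x) =", pn)
```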

Problem 3b

Show that for every $n \geq 2$ the polynomial $p_n$ has $n$ simple real roots that interlace with the roots of $p_{n-1}$.

Solution 3b

We prove the claim by induction.

Base Case

Let's consider the case $n = 2$. We know that


$p_2(x) = (x - a_2)(x - a_1) - b_2^2.$


The quadratic formula shows that $p_2$ has two simple real roots, since its discriminant $(a_1 - a_2)^2 + 4 b_2^2$ is strictly positive ($b_2 \neq 0$).


Let $y_1$ and $y_2$ be the zeros of $p_2$. Then, we have (because of sign changes) that


$p_2(a_1) = -b_2^2 < 0$ and $\lim_{x \to -\infty} p_2(x) = +\infty$ $\Rightarrow$ there exists $y_1 < a_1$ such that $p_2(y_1) = 0$.

$p_2(a_1) = -b_2^2 < 0$ and $\lim_{x \to +\infty} p_2(x) = +\infty$ $\Rightarrow$ there exists $y_2 > a_1$ such that $p_2(y_2) = 0$.


In conclusion, $y_1 < a_1 < y_2$, i.e. the roots of $p_2$ interlace with the root of $p_1$.

Induction Step

Let $x_1 < \cdots < x_n$ and $y_1 < \cdots < y_{n-1}$ be the simple real roots of $p_n$ and $p_{n-1}$, respectively, such that the roots of $p_{n-1}$ are interlaced with the roots of $p_n$, i.e.,


$x_1 < y_1 < x_2 < y_2 < \cdots < y_{n-1} < x_n.$


Then, evaluating the recurrence at consecutive roots of $p_n$, we have that


$p_{n+1}(x_i) = -b_{n+1}^2\, p_{n-1}(x_i)$


$p_{n+1}(x_{i+1}) = -b_{n+1}^2\, p_{n-1}(x_{i+1})$


From the induction hypothesis, $p_{n-1}(x_i)$ and $p_{n-1}(x_{i+1})$ have different signs, since exactly one root of $p_{n-1}$ lies in $(x_i, x_{i+1})$. Then, there exists $z_i \in (x_i, x_{i+1})$ such that $p_{n+1}(z_i) = 0$.


Proceeding in the same manner for all the intervals $(x_i, x_{i+1})$, we obtain that there exists $z_i \in (x_i, x_{i+1})$ such that $p_{n+1}(z_i) = 0$ for $i = 1, \ldots, n - 1$.


We now consider the smallest and largest roots of $p_n$, i.e. $x_1$ and $x_n$.


Since each $p_k$ is a monic polynomial,


$\lim_{x \to +\infty} p_k(x) = +\infty \qquad \text{and} \qquad \lim_{x \to -\infty} p_k(x) = (-1)^k \cdot \infty.$


Hence for any $x > y_{n-1}$ (any $x$ larger than the largest root of $p_{n-1}$), $p_{n-1}(x) > 0$; in particular, $p_{n-1}(x_n) > 0$.


Therefore


$p_{n+1}(x_n) = -b_{n+1}^2\, p_{n-1}(x_n) < 0$ and $\lim_{x \to +\infty} p_{n+1}(x) = +\infty$


implies there exists $z_n > x_n$ such that $p_{n+1}(z_n) = 0$.


If $n$ is even, then $p_{n-1}$ has odd degree and $x_1$ lies below all of its roots, so $p_{n-1}(x_1) < 0$ and $p_{n+1}(x_1) = -b_{n+1}^2\, p_{n-1}(x_1) > 0$. By similar reasoning,


$p_{n+1}(x_1) > 0$ and $\lim_{x \to -\infty} p_{n+1}(x) = -\infty$ $\Rightarrow$ there exists $z_0 < x_1$ such that $p_{n+1}(z_0) = 0$.


If $n$ is odd, then $p_{n-1}$ has even degree, so $p_{n-1}(x_1) > 0$ and $p_{n+1}(x_1) < 0$, and


$p_{n+1}(x_1) < 0$ and $\lim_{x \to -\infty} p_{n+1}(x) = +\infty$ $\Rightarrow$ there exists $z_0 < x_1$ such that $p_{n+1}(z_0) = 0$.


In all, $p_{n+1}$ has $n + 1$ simple real roots $z_0 < x_1 < z_1 < x_2 < \cdots < z_{n-1} < x_n < z_n$, which interlace with the roots of $p_n$.
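A numerical illustration of the interlacing, with arbitrary sample coefficients since $a_k$ and $b_k$ are left general in the problem:

```python
import numpy as np

# Build p_0, ..., p_4 via the recurrence, as numpy coefficient arrays
# in descending order, for arbitrary sample coefficients.
a = [0.0, 1.0, -1.0, 2.0]            # a_1, ..., a_4
b = [9.9, 1.0, 1.5, 0.5]             # b_1, ..., b_4 (b_1 multiplies p_{-1} = 0)

polys = [np.array([1.0])]            # p_0 = 1
prev = np.array([0.0])               # p_{-1} = 0
for k in range(1, 5):
    term = np.polymul([1.0, -a[k-1]], polys[-1])        # (x - a_k) p_{k-1}
    p_next = np.polysub(term, b[k-1]**2 * prev)         # - b_k^2 p_{k-2}
    prev = polys[-1]
    polys.append(p_next)

r3 = np.sort(np.roots(polys[3]))     # roots of p_3
r4 = np.sort(np.roots(polys[4]))     # roots of p_4
print(r3)
print(r4)                            # the roots of p_3 and p_4 interlace
```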

Problem 3c

Show that for every $n \geq 1$ the roots of $p_n$ are the eigenvalues of the symmetric tridiagonal matrix


$T_n = \begin{pmatrix} a_1 & b_2 & & & \\ b_2 & a_2 & b_3 & & \\ & b_3 & \ddots & \ddots & \\ & & \ddots & \ddots & b_n \\ & & & b_n & a_n \end{pmatrix}.$

Solution 3c

Let $q_n(x) = \det(x I - T_n)$.


Then $q_n$ is a monic polynomial in $x$ of degree $n$.


Expanding this determinant along the last row, we have


$q_n(x) = (x - a_n)\, q_{n-1}(x) - b_n^2\, q_{n-2}(x), \qquad (1)$


where $q_{n-1}$ and $q_{n-2}$ are monic polynomials of degree $n - 1$ and $n - 2$, respectively.


Notice that $q_1(x) = x - a_1 = p_1(x)$, and if we let $q_0(x) = 1 = p_0(x)$, then (1) is equivalent to the three-term recurrence stated in the problem.


Thus, $q_n = p_n$ for every $n$, which shows that the roots of $p_n$ are the eigenvalues of $T_n$.
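A quick numerical confirmation with arbitrary sample coefficients: the eigenvalues of $T_n$ are roots of the recurrence polynomial $p_n$.

```python
import numpy as np

# Check that the eigenvalues of T_n are roots of p_n, for arbitrary
# sample coefficients (the problem leaves a_k and b_k general).
a = np.array([1.0, 2.0, 3.0, 4.0])   # diagonal a_1, ..., a_4
b = np.array([1.0, 0.5, 2.0])        # off-diagonal b_2, b_3, b_4

T = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
eigs = np.sort(np.linalg.eigvalsh(T))

def p(xv):
    """Evaluate p_n(xv) via the three-term recurrence."""
    p_prev, p_cur = 0.0, 1.0                  # p_{-1}, p_0
    for k in range(len(a)):
        bk2 = b[k - 1]**2 if k >= 1 else 0.0  # b_{k+1}^2 (absent for p_1)
        p_prev, p_cur = p_cur, (xv - a[k]) * p_cur - bk2 * p_prev
    return p_cur

print(eigs)
print([p(e) for e in eigs])                   # each value is ~ 0
```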