Let f be a function with 3 continuous derivatives. Let p_2 be a quadratic polynomial that interpolates f at the points x_0 < x_1 < x_2. Let e(x) = f(x) − p_2(x) and show that

  |e''(x)| ≤ C · max_{x_0 ≤ t ≤ x_2} |f'''(t)| for all x in [x_0, x_2],

where C depends only on x_0 and x_2; determine C. (Hint: the key to this is to prove that e'' vanishes at some point in (x_0, x_2). The result can then be obtained by integration.)
Proof of Hint

Claim: There exists η in (x_0, x_2) such that e''(η) = f''(η) − p_2''(η) = 0.

The interpolation polynomial may be expressed using divided difference coefficients, i.e.

  p_2(x) = f[x_0] + f[x_0, x_1](x − x_0) + f[x_0, x_1, x_2](x − x_0)(x − x_1).

In general, the divided difference coefficients may be expressed as a factorial-weighted derivative of f evaluated at an intermediate point, i.e.

  f[x_0, x_1, …, x_n] = f^(n)(η)/n! for some η in (x_0, x_n).

Differentiating p_2 twice gives p_2''(x) = 2 f[x_0, x_1, x_2] = f''(η) for some η in (x_0, x_2), which proves the claim.
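Both facts can be checked numerically. A minimal sketch (the function names and the test function f = sin are my own choices, not part of the problem):

```python
import math

def divided_differences(xs, ys):
    """Return [f[x0], f[x0,x1], ..., f[x0,...,xn]] via Newton's table."""
    coef = list(ys)
    n = len(xs)
    for j in range(1, n):
        # Update in place from the bottom so earlier entries stay valid.
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef

xs = [0.0, 0.4, 1.0]
ys = [math.sin(x) for x in xs]
c0, c1, c2 = divided_differences(xs, ys)

# p2''(x) = 2 * f[x0,x1,x2]; it should equal f''(eta) = -sin(eta) for some
# eta in (x0, x2), hence lie between the min and max of f'' on [0, 1].
p2_second = 2 * c2
f2 = [-math.sin(t / 100) for t in range(0, 101)]
assert min(f2) <= p2_second <= max(f2)
```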
Application of Hint

From the hint we know that f''(η) = p_2''(η) for some η in (x_0, x_2). Since p_2 is quadratic, p_2'' is constant, i.e. p_2''(x) = p_2''(η) = f''(η) for all x. By the fundamental theorem of calculus,

  e''(x) = f''(x) − f''(η) = ∫_η^x f'''(t) dt,

so |e''(x)| ≤ |x − η| · max |f'''| ≤ (x_2 − x_0) · max_{x_0 ≤ t ≤ x_2} |f'''(t)|. Hence C = x_2 − x_0.
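The bound with C = x_2 − x_0 can be sanity-checked numerically. A sketch, using f = exp on [0, 1] as my own test case (convenient because f'' = f''' = exp):

```python
import math

x0, x1, x2 = 0.0, 0.3, 1.0
f = math.exp  # f'' and f''' are also exp

# Second derivative of the quadratic interpolant: 2 * f[x0,x1,x2].
d01 = (f(x1) - f(x0)) / (x1 - x0)
d12 = (f(x2) - f(x1)) / (x2 - x1)
p2_second = 2 * (d12 - d01) / (x2 - x0)

grid = [x0 + (x2 - x0) * t / 1000 for t in range(1001)]
err = max(abs(f(x) - p2_second) for x in grid)   # max |e''(x)|, since f'' = exp
bound = (x2 - x0) * max(f(x) for x in grid)      # (x2 - x0) * max |f'''|
assert err <= bound
```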
Now suppose x_1 = (x_0 + x_2)/2 and f has 4 continuous derivatives. In this case show

  |e''(x_1)| ≤ C h^2,

where h = x_1 − x_0 = x_2 − x_1. What is C in terms of the derivatives of f?
Third Derivative of f Has Zero Contribution

We know that e'''(x) = f'''(x), because p_2 is quadratic and hence p_2''' ≡ 0. Now, by expanding f(x_0) = f(x_1 − h) and f(x_2) = f(x_1 + h) in Taylor series about the midpoint x_1, the odd-order terms (including those involving f''') cancel, and we can conclude that there exists ξ in (x_0, x_2) such that

  f(x_0) − 2 f(x_1) + f(x_2) = h^2 f''(x_1) + (h^4/12) f''''(ξ).
Application of Fundamental Theorem of Calculus (Twice)

Applying the fundamental theorem of calculus twice to the Taylor remainders (equivalently, Taylor's theorem with integral remainder) and using p_2''(x_1) = 2 f[x_0, x_1, x_2] = (f(x_0) − 2 f(x_1) + f(x_2))/h^2 gives

  e''(x_1) = f''(x_1) − p_2''(x_1) = −(h^2/12) f''''(ξ) for some ξ in (x_0, x_2).

Hence |e''(x_1)| ≤ C h^2 with C = (1/12) max_{x_0 ≤ t ≤ x_2} |f''''(t)|.
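The O(h^2) behavior of e''(x_1) at the midpoint can be observed numerically. A sketch with f = sin and x_1 = 0.7 as my own test choices; since f''''(x_1) = sin(x_1), the ratio e''(x_1)/h^2 should approach −sin(x_1)/12 as h shrinks:

```python
import math

f = math.sin
x1 = 0.7  # midpoint of each stencil; f''''(x1) = sin(x1)

for h in (0.1, 0.05, 0.025):
    x0, x2 = x1 - h, x1 + h
    p2_second = (f(x0) - 2 * f(x1) + f(x2)) / h**2  # = 2 f[x0,x1,x2]
    e_second = -math.sin(x1) - p2_second            # f''(x1) - p2''(x1)
    ratio = e_second / h**2
    # The ratio should be within O(h^2) of -f''''(x1)/12.
    assert abs(ratio + math.sin(x1) / 12) < h**2
```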
Find q_0, q_1, q_2 such that q_k is a polynomial of degree k and this set is orthogonal on the given interval [a, b] with respect to the given weight function w(x). (Note: the values of the moments ∫_a^b w(x) dx and ∫_a^b x w(x) dx are supplied with the problem.)
Apply Gram-Schmidt

To find orthogonal q_0, q_1, q_2, use the Gram-Schmidt method with the inner product ⟨f, g⟩ = ∫_a^b f(x) g(x) w(x) dx.

Let {1, x, x^2} be a basis of P_2, the space of polynomials of degree at most 2.

From Gram-Schmidt, we have

  q_0(x) = 1,
  q_1(x) = x − (⟨x, q_0⟩ / ⟨q_0, q_0⟩) q_0(x).

Proceeding with Gram-Schmidt, we have

  q_2(x) = x^2 − (⟨x^2, q_0⟩ / ⟨q_0, q_0⟩) q_0(x) − (⟨x^2, q_1⟩ / ⟨q_1, q_1⟩) q_1(x).
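The procedure can be sketched in exact rational arithmetic. For illustration I assume the weight w(x) = 1 on [−1, 1] (whatever the problem's actual weight, only the `moment` function changes):

```python
from fractions import Fraction

def moment(k):
    """Integral of x^k * w(x) over [-1, 1], with assumed weight w(x) = 1."""
    return Fraction(0) if k % 2 else Fraction(2, k + 1)

def inner(p, q):
    """<p, q> for polynomials given as coefficient lists (low degree first)."""
    return sum(a * b * moment(i + j)
               for i, a in enumerate(p) for j, b in enumerate(q))

def sub_scaled(p, q, c):
    """Return p - c*q as a coefficient list."""
    out = list(p) + [Fraction(0)] * (len(q) - len(p))
    for j, b in enumerate(q):
        out[j] -= c * b
    return out

# Gram-Schmidt on the basis {1, x, x^2}.
q0 = [Fraction(1)]
q1 = sub_scaled([Fraction(0), Fraction(1)], q0,
                inner([Fraction(0), Fraction(1)], q0) / inner(q0, q0))
x_sq = [Fraction(0), Fraction(0), Fraction(1)]
q2 = sub_scaled(x_sq, q0, inner(x_sq, q0) / inner(q0, q0))
q2 = sub_scaled(q2, q1, inner(x_sq, q1) / inner(q1, q1))

# With w = 1 on [-1, 1] this yields q1 = x and q2 = x^2 - 1/3.
assert inner(q0, q1) == 0 and inner(q0, q2) == 0 and inner(q1, q2) == 0
```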
Derive the 2-point Gaussian formula

  ∫_a^b f(x) w(x) dx ≈ w_1 f(x_1) + w_2 f(x_2),

i.e. find the weights w_1, w_2 and nodes x_1, x_2.
Find the Nodes

The nodes x_1 and x_2 are the roots of the 2nd orthogonal polynomial, i.e. the solutions of q_2(x) = 0. Since q_2 is quadratic, applying the quadratic formula yields the roots.
Find the Weights

The approximation is exact for polynomials of degree at most 2n − 1 = 3. In particular, exactness for f(x) = 1 and f(x) = x gives the following system of equations:

  w_1 + w_2 = ∫_a^b w(x) dx,
  w_1 x_1 + w_2 x_2 = ∫_a^b x w(x) dx.

Solving the system of equations by substitution yields the weights.
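As a concrete illustration I again assume the weight w(x) = 1 on [−1, 1] (so q_2(x) = x^2 − 1/3); the derivation steps are the same for any weight:

```python
import math

# Nodes: roots of q2(x) = x^2 - 1/3, via the quadratic formula.
x1, x2 = -1 / math.sqrt(3), 1 / math.sqrt(3)

# Weights: exactness on f = 1 and f = x gives the system
#   w1 + w2       = integral of 1 over [-1, 1] = 2
#   w1*x1 + w2*x2 = integral of x over [-1, 1] = 0
# Solving by substitution (x2 = -x1) yields w1 = w2 = 1.
w1 = w2 = 1.0

def gauss2(f):
    return w1 * f(x1) + w2 * f(x2)

# The rule should be exact for polynomials of degree <= 3.
f = lambda x: 5 * x**3 - 2 * x**2 + x + 4     # an arbitrary cubic
exact = -2 * (2 / 3) + 4 * 2                  # odd powers integrate to 0
assert abs(gauss2(f) - exact) < 1e-12
```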
Let A be an n × n nonsingular matrix, and consider the linear system Ax = b.

Write down the Jacobi iteration for solving Ax = b in a way that it would be programmed on a computer.

Starting from an initial guess x^(0), repeat the following until the <convergence condition> is met:

  for i = 1, …, n:
    x_i^(k+1) = (b_i − Σ_{j ≠ i} a_ij x_j^(k)) / a_ii

In matrix form this is x^(k+1) = D^{-1}(L + U) x^(k) + D^{-1} b, where A = D − L − U, D is diagonal, and L, U are strictly lower and upper triangular, respectively.
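The iteration above can be sketched as follows (the stopping test, tolerances, and test system are my own choices; the original problem leaves the convergence condition abstract):

```python
import numpy as np

def jacobi(A, b, x0, tol=1e-10, max_iter=500):
    """Jacobi iteration: x_i <- (b_i - sum_{j != i} a_ij * x_j) / a_ii."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.asarray(x0, dtype=float)
    d = np.diag(A)                 # diagonal of A (must be non-zero)
    R = A - np.diag(d)             # off-diagonal part, i.e. -(L + U)
    for _ in range(max_iter):
        x_new = (b - R @ x) / d
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:  # <convergence condition>
            return x_new
        x = x_new
    return x

# Strictly diagonally dominant example system (my own test data):
A = [[4.0, 1.0, 1.0], [1.0, 5.0, 2.0], [0.0, 1.0, 3.0]]
b = [6.0, 8.0, 4.0]
x = jacobi(A, b, [0.0, 0.0, 0.0])
assert np.allclose(np.array(A) @ x, b, atol=1e-8)
```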
Suppose A has m non-zero elements with m ≥ n. How many operations per iteration does the Jacobi iteration take?

The diagonal entries of A are non-zero, since otherwise D^{-1} would not exist. Therefore A contains m − n off-diagonal non-zero entries. The computation during each iteration is given by

  x_i^(k+1) = (b_i − Σ_{j ≠ i} a_ij x_j^(k)) / a_ii,  i = 1, …, n.

Therefore there are m − n multiplications and n divisions in each iteration, i.e. O(m) operations.
Assume that A is strictly diagonally dominant: |a_ii| > Σ_{j ≠ i} |a_ij| for i = 1, …, n. Show that the Jacobi iteration converges for any initial guess x^(0). (Hint: You may use Gerschgorin's theorem without proving it.)
Theorem 8.2.1 [SB] states that the Jacobi iteration converges if and only if ρ(J) < 1, where J = D^{-1}(L + U) is the Jacobi iteration matrix and ρ denotes the spectral radius.

Matrix multiplication and the definitions of D, L, U give the explicit entrywise values of J:

  J_ij = −a_ij / a_ii for j ≠ i, and J_ii = 0.

Then, by Gerschgorin's theorem, every eigenvalue of J lies in a disc centered at the origin with radius Σ_{j ≠ i} |a_ij| / |a_ii| for some i; strict diagonal dominance makes each radius less than 1, so ρ(J) < 1, and we have the result.
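The argument can be spot-checked numerically: for a strictly diagonally dominant A, every Gerschgorin radius of J is below 1 and the spectral radius sits inside the largest disc. A sketch (the test matrix is my own):

```python
import numpy as np

A = np.array([[10.0,  2.0, 3.0],
              [ 1.0, -8.0, 2.0],
              [ 2.0,  2.0, 9.0]])   # strictly diagonally dominant

d = np.diag(A)
J = -(A - np.diag(d)) / d[:, None]  # J_ij = -a_ij / a_ii, J_ii = 0
radii = np.abs(J).sum(axis=1)       # Gerschgorin radii (all centers are 0)
rho = max(abs(np.linalg.eigvals(J)))

assert radii.max() < 1              # diagonal dominance => radii < 1
assert rho <= radii.max() + 1e-12   # Gerschgorin: rho(J) bounded by max radius
```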