Final exam for CS 4220 / Math 4260 / CS 5223
Due date: 2022-05-18
This is meant to be straightforward based on the course notes and previous assignments, but you can (and should) consult any resources you wish except for people outside the course staff. Provide attribution (e.g. a citation or link) for any ideas you get. Your final write-up should be your own.
You should do problem 1, and any four out of the remaining five problems. Please indicate which four of the remaining five problems you want graded.
1. Snacks (5 points)
(1 pt) Give an example of a 1D polynomial root-finding problem where Newton converges, but not quadratically.
(1 pt) Given a square nonsingular matrix together with a factorization of it, how would you solve the associated linear system efficiently?
(1 pt) Given data, write a simple formula for the minimizer of the associated least squares objective.
(2 pts) Let f(x) be a scalar function defined in terms of A + diagm(x). Starting with F = lu(A+diagm(x)), write code to compute f(x) and its derivative at O(n^2) additional cost.
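As an illustration of the intended pattern in the last item, the sketch below reuses a single LU factorization so that each additional solve costs O(n^2). The specific functional f(x) = c'*((A + diagm(x))\b) and the name f_and_grad are assumptions made purely for illustration; substitute the quantities actually requested in the problem.

```julia
using LinearAlgebra

# Illustrative only: suppose f(x) = c'*((A + diagm(x)) \ b).  After the O(n^3)
# factorization F = lu(A + diagm(x)), both f(x) and its gradient come from
# triangular solves, i.e. O(n^2) additional work.
function f_and_grad(A, b, c, x)
    F = lu(A + diagm(x))   # O(n^3) factorization, done once
    u = F \ b              # solve (A + diagm(x)) u = b            -- O(n^2)
    w = F' \ c             # transposed solve (A + diagm(x))' w = c -- O(n^2)
    fx = dot(c, u)         # f(x) = c' u
    gx = -(w .* u)         # df/dx_j = -w_j * u_j
    return fx, gx
end
```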
2. Hanging cables (10 points)
Consider a cable of length L hanging between two supports a horizontal distance d apart. The hanging cable takes the shape of a catenary (a hyperbolic cosine), and the endpoint conditions relate the cable length, the span, and the catenary parameter. Hyperbolic trig identities let us rewrite the end constraint in a form convenient for computation.
(2 pts) When the cable is nearly taut, the catenary parameter becomes large. Use the given equations and the Taylor expansion for the hyperbolic sine to find a limiting approximation for the parameter.
(4 pts) Complete the Newton iteration to compute the catenary parameter (and from there the hang of the cable) in terms of L and d; a sketch of one possible iteration appears after the partial codes below. Give the result and a semilog convergence plot (for an appropriate residual measure) for the given values of L and d. Your convergence criterion should target a small relative error in the computed parameter.
(4 pts) Write a Newton solver to compute d from L and the hang. You will want a good initial guess (e.g. from Taylor expansions around the taut limit). Give the computed d and a convergence plot for the given L and hang.
You may also want to sanity check your code by ensuring consistency with the numbers from part 2 (no points for this, just a hint).
Partial codes
Feel free to modify these implementations as you see fit.
compute_hang (generic function with 1 method)
compute_d (generic function with 1 method)
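For concreteness, here is a sketch of the kind of Newton iteration involved. It assumes the standard equal-height catenary relations L = 2a*sinh(d/(2a)) for the length and h = a*(cosh(d/(2a)) - 1) for the hang; these relations, the initial guess, and the name newton_catenary_a are illustrative assumptions and need not match the provided compute_hang and compute_d.

```julia
# Assumed relations (equal-height supports a distance d apart, cable length L):
#   L = 2a*sinh(d/(2a)),   hang h = a*(cosh(d/(2a)) - 1).
function newton_catenary_a(L, d; rtol=1e-12, maxiter=50)
    # Initial guess from sinh(s) ≈ s + s^3/6, i.e. L ≈ d + d^3/(24a^2)
    a = sqrt(d^3 / (24*(L - d)))
    for k = 1:maxiter
        g  = 2a*sinh(d/(2a)) - L                  # residual in the length equation
        dg = 2*sinh(d/(2a)) - (d/a)*cosh(d/(2a))  # derivative of the residual in a
        da = -g/dg
        a += da
        abs(da) < rtol*abs(a) && break
    end
    h = a*(cosh(d/(2a)) - 1)                      # hang once a has converged
    return a, h
end
```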
3. Close calls
For this problem, we define two cubic Bézier curves, the images of the interval [0,1] under two cubic maps determined by their control points.
Below, we show the curves for the given control points.
plot_curves (generic function with 2 methods)
A basic building block is finding the point on a single curve nearest a given target. We provide a routine nearest_cubic_b(pf, q) that takes the control points for a Bézier curve and a target point and finds the point on the curve nearest to the target.
nearest_cubic_b (generic function with 1 method)
0.637639
0.45858
Our goal is to find the closest points on these two curves, i.e. the pair of curve parameters for which the distance between the corresponding points is minimal. As a starting point, we provide code (closest_on_mesh) to get an initial guess based on a mesh of 10 points each in the two curve parameters.
(5 points) Starting from the guess from the first part, implement a nonlinear Gauss-Seidel method (aka coordinate descent), where in each sweep you first update one parameter (assuming the other is correct) and then update the other (assuming the first is correct); a sketch of one possible sweep appears below. Use the provided nearest_cubic_b routine as a building block. Give a semi-log convergence plot of the error versus the iteration number. Give an estimate for the constant in the (linear) convergence rate.
(5 points) Write a Newton iteration to refine the two parameters (a sketch appears after the partial codes). Do not worry about the end cases (closest points are at the ends of the curves), nor about having an inadequate initial guess. Give a convergence plot to illustrate quadratic convergence of the method.
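The following sketch shows one way a Gauss-Seidel sweep for question 1 might be organized. It assumes control points stored as the columns of a 2×4 matrix, a helper bezier(P, s) that evaluates the curve via the cubic Bernstein basis, and that nearest_cubic_b returns the parameter of the nearest point; all of these conventions are assumptions, and the provided routine may use different ones.

```julia
# bezier(P, s): evaluate a cubic Bézier curve with control points in the
# columns of the 2×4 matrix P at parameter s, via the Bernstein basis.
bezier(P, s) = P * [(1-s)^3, 3*s*(1-s)^2, 3*s^2*(1-s), s^3]

# One possible coordinate-descent (nonlinear Gauss-Seidel) loop.  Assumes
# nearest_cubic_b(P, q) returns the parameter of the point on the curve
# nearest to q; the provided routine may differ in its return convention.
function closest_gs_sketch(P1, P2, s, t; nsweeps=100, tol=1e-12)
    for k = 1:nsweeps
        snew = nearest_cubic_b(P1, bezier(P2, t))    # update s with t held fixed
        tnew = nearest_cubic_b(P2, bezier(P1, snew)) # update t with the new s fixed
        converged = abs(snew - s) + abs(tnew - t) < tol
        s, t = snew, tnew
        converged && break
    end
    return s, t
end
```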
Partial codes
We give you the closest_on_mesh code. I wrote closest_gs and closest_newton routines for questions 1-2.
closest_on_mesh (generic function with 1 method)
0.631579
0.684211
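For question 2, the sketch below applies Newton's method to the stationarity conditions of φ(s,t) = ||p(s) - q(t)||^2 / 2, using the same assumed 2×4 control-point convention and cubic Bernstein-basis evaluation as the sketch above; the derivative formulas are standard, but the name closest_newton_sketch and the storage convention are assumptions.

```julia
using LinearAlgebra

# Cubic Bernstein basis and its first two derivatives.
bern(s)   = [(1-s)^3, 3s*(1-s)^2, 3s^2*(1-s), s^3]
dbern(s)  = [-3(1-s)^2, 3(1-s)^2 - 6s*(1-s), 6s*(1-s) - 3s^2, 3s^2]
d2bern(s) = [6(1-s), 18s - 12, 6 - 18s, 6s]

# Newton on the stationarity conditions of φ(s,t) = ||p(s) - q(t)||^2 / 2,
# where p and q have control points in the columns of P1 and P2.
function closest_newton_sketch(P1, P2, s, t; maxiter=20, rtol=1e-14)
    for k = 1:maxiter
        p,   q   = P1*bern(s),   P2*bern(t)
        dp,  dq  = P1*dbern(s),  P2*dbern(t)
        d2p, d2q = P1*d2bern(s), P2*d2bern(t)
        r = p - q
        grad = [dot(r, dp), -dot(r, dq)]      # gradient of φ
        H = zeros(2, 2)                       # Hessian of φ
        H[1,1] = dot(dp, dp) + dot(r, d2p)
        H[1,2] = H[2,1] = -dot(dp, dq)
        H[2,2] = dot(dq, dq) - dot(r, d2q)
        Δ = -(H \ grad)                       # Newton step
        s += Δ[1]; t += Δ[2]
        norm(Δ) < rtol && break
    end
    return s, t
end
```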
4. Largely linear (10 points)
Consider a general class of equations that are linear apart from a single scalar nonlinearity.
(2 pts) Define an appropriate scalar variable and show that the remaining unknowns can be eliminated to obtain a single scalar equation.
(2 pts) If the nonlinearity is Lipschitz with a given constant, show that the reduced fixed-point map is Lipschitz, and give its Lipschitz constant.
(2 pts) Argue that the equation has a unique solution.
(2 pts) Use fixed point iteration on the 1D reduced equation to solve the given instance.
(2 pts) Use Newton on the 1D reduced equation to solve the given instance.
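To make the reduction concrete, the sketch below assumes a "largely linear" system of the specific form A*x + phi(c'*x)*b = d with a scalar nonlinearity phi; this form, the derivative dphi, and the name solve_reduced are assumptions for illustration only. It runs both fixed point iteration and Newton on the reduced scalar equation, then recovers the full solution.

```julia
using LinearAlgebra

# Illustration only: suppose the system has the form  A*x + phi(c'*x)*b = d.
# With y = c'*x, eliminating x gives the 1D equation
#   y = h(y) := c'*(A \ (d - phi(y)*b)) = β - α*phi(y).
function solve_reduced(A, b, c, d, phi, dphi; y0=0.0, rtol=1e-12, maxiter=100)
    F = lu(A)
    α = dot(c, F\b)          # α = c' A^{-1} b
    β = dot(c, F\d)          # β = c' A^{-1} d
    h(y)  = β - α*phi(y)     # reduced fixed-point map
    g(y)  = y - h(y)         # root-finding form of the reduced equation
    dg(y) = 1 + α*dphi(y)

    # Fixed point iteration (a contraction when |α| * Lip(phi) < 1)
    yfp = y0
    for k = 1:maxiter
        ynew = h(yfp)
        done = abs(ynew - yfp) <= rtol*(abs(ynew) + 1)
        yfp = ynew
        done && break
    end

    # Newton on g(y) = 0
    ynt = y0
    for k = 1:maxiter
        dy = -g(ynt)/dg(ynt)
        ynt += dy
        abs(dy) <= rtol*(abs(ynt) + 1) && break
    end

    x = F \ (d - phi(ynt)*b)  # recover the full solution from the reduced one
    return yfp, ynt, x
end
```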
5. Rational approximation
We consider the problem of approximating a given function by a rational function p(x)/q(x).
(2 pts) Why is the constant coefficient of the denominator q chosen to be 1 rather than a free parameter?
(4 pts) Find an initial guess at the numerator and denominator coefficient vectors by minimizing a linearized residual over a mesh of sample points. This is a linear least squares problem. What is the root mean square error?
(4 pts) Use Gauss-Newton to refine the coefficient vector, starting from the linearized initial guess. Give a convergence plot. What is the root mean square error?
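The sketch below sets up the linearized least squares problem for an assumed numerator degree m and denominator degree n, with the denominator normalized so its constant coefficient is 1; the routine name rational_init, the degrees, and the example function are assumptions.

```julia
using LinearAlgebra

# Illustration with assumed degrees: fit f(x) ≈ p(x)/q(x) with
#   p(x) = sum_{i=0}^{m} p_i x^i,   q(x) = 1 + sum_{j=1}^{n} q_j x^j,
# by minimizing the linearized residual p(x) - f(x)*q(x) on the mesh xs.
function rational_init(f, xs, m, n)
    Vp = [x^i       for x in xs, i in 0:m]   # columns multiplying p_i
    Vq = [-f(x)*x^j for x in xs, j in 1:n]   # columns multiplying q_j (j ≥ 1)
    M   = [Vp Vq]
    rhs = f.(xs)                             # the q_0 = 1 term, moved to the right
    coef = M \ rhs                           # linear least squares
    rms_lin = norm(M*coef - rhs) / sqrt(length(xs))  # RMS of the linearized residual
    return coef[1:m+1], coef[m+2:end], rms_lin
end

# Example usage with assumed choices:
# p, q, rms = rational_init(exp, range(0, 1, length=100), 3, 3)
```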
6. Funny fixed point
Consider the problem of minimizing the quadratic x^T A x over vectors x satisfying a normalization constraint.
(4 pts) Use the method of Lagrange multipliers to derive stationarity conditions of the form A x = λ D(x) x, where D(x) is a diagonal matrix whose entries depend on x.
(2 pts) Dot the stationarity conditions with x to show that if x is a constrained stationary point, then the multiplier λ is given by an explicit formula in x.
(4 pts) Implement the fixed point iteration in which x is replaced by the eigenvector associated with the smallest eigenvalue λ of the pencil (A, D(x)), with each iterate normalized according to the constraint; a sketch appears below. Use Julia's eigen
to solve the generalized eigenvalue problem at each step. Run starting from the proportional-to-all-ones guess. Monitor the residual of the stationarity conditions. What is the minimum value found?
For the third part, use the matrix
3×3 Matrix{Float64}:
3.0 1.0 1.0
1.0 4.0 1.0
1.0 1.0 5.0
Note: We say λ is an eigenvalue of the pencil (A, B), with associated eigenvector v, if Av = λBv for some nonzero vector v.
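Here is a sketch of the fixed point iteration for the third part. It assumes, for illustration, a constraint of the form sum(x.^4) = 1, so that D(x) = Diagonal(x.^2); the constraint, the residual measure, and the name funny_fixed_point are assumptions and may differ from the actual statement.

```julia
using LinearAlgebra

# Assumed setup: minimize x'*A*x subject to sum(x.^4) == 1, giving stationarity
# conditions A*x = λ*D(x)*x with D(x) = Diagonal(x.^2).  Each step solves the
# generalized eigenproblem for the pencil (A, D(x)) with eigen and then
# renormalizes so the constraint holds.
function funny_fixed_point(A; maxiter=100, rtol=1e-10)
    n = size(A, 1)
    x = ones(n) / n^(1/4)                     # all-ones guess, scaled so sum(x.^4) = 1
    λ = 0.0
    for k = 1:maxiter
        E = eigen(A, Matrix(Diagonal(x.^2)))  # generalized eigenproblem for (A, D(x))
        i = argmin(real.(E.values))           # smallest eigenvalue of the pencil
        λ = real(E.values[i])
        v = real.(E.vectors[:, i])
        x = v / sum(v.^4)^(1/4)               # renormalize to satisfy the constraint
        resid = norm(A*x - λ*(x.^2).*x)       # residual in the stationarity conditions
        resid < rtol && break
    end
    return x, λ, dot(x, A*x)                  # iterate, multiplier, objective value
end

# funny_fixed_point([3.0 1.0 1.0; 1.0 4.0 1.0; 1.0 1.0 5.0])
```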