GG7920 Homework

OBJECTIVE: Learn how to minimize the Rosenbrock "banana" function by iterative methods. A classic test example for multidimensional minimization of a non-linear function is the Rosenbrock banana function f(x1,x2) = 100*(x2 - x1^2)^2 + (1 - x1)^2. Its minimum is at (1,1), where the function value is 0. The traditional starting point is (-1.2,1). The M-file banana.m defines the function:

      function f = banana(x)
      f = 100*(x(2)-x(1)^2)^2+(1-x(1))^2;
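
All of the descent methods used below also need the gradient of this function. For reference, the analytic gradient can be coded in the same style (a sketch only; the course files f.m and step.m may organize this differently, and banana_grad is an illustrative name):

      function g = banana_grad(x)
      % Analytic gradient of the Rosenbrock function:
      % df/dx1 = -400*x1*(x2 - x1^2) - 2*(1 - x1),  df/dx2 = 200*(x2 - x1^2)
      g = [-400*x(1)*(x(2)-x(1)^2) - 2*(1-x(1));
            200*(x(2)-x(1)^2)];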

PROCEDURE:

1.  Find the minimizer of the Rosenbrock function with four methods: steepest descent, preconditioned steepest descent, the conjugate gradient (CG) method, and the rank-2 quasi-Newton method. Plot the contours of the Rosenbrock function and the iterate at each step of the CG and steepest descent methods (a plotting sketch follows this list).

2.  Explain the performance of each method.
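
A minimal plotting sketch for item 1, assuming the iterates of one run have been collected column by column in a 2-by-N array X (the grid limits and variable names here are illustrative):

      % X: 2-by-N array of iterates, assumed collected during a run
      [x1g,x2g] = meshgrid(-2:0.05:2, -1:0.05:3);   % grid over the plane
      F = 100*(x2g - x1g.^2).^2 + (1 - x1g).^2;     % Rosenbrock misfit values
      contour(x1g, x2g, F, logspace(-1, 3, 15));    % logarithmically spaced levels
      hold on
      plot(X(1,:), X(2,:), 'o-');                   % iterates of one method
      plot(1, 1, '*');                              % true minimum at (1,1)
      hold off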

 

Answer:

1.       Download the MATLAB files f.m, step.m, and rosenberg.m.

2.       Starting from the point x0=(0,0) and stopping when norm(dx)<1.0e-8, the results of the inversion are as follows (a sketch of the steepest-descent loop appears after these results):

              *****************************************

              Solution from the steepest descent:

              x=     1.0000     1.0000

              f(x1,x2)= 1.6983e-011

              Iterations:   8147

 

              Solution from the preconditioned steepest descent:

              x=     1.0000     1.0000

              f(x1,x2)= 6.4558e-014

              Iterations:    194

 

              Solution from the conjugate gradient:

              x=     1.0000     1.0000

              f(x1,x2)= 1.0418e-023

              Iterations:     21

 

              Solution from Quasi-Newton Rank 2:

              x=     1.0000     1.0000

              f(x1,x2)= 1.3264e-012

              Iterations:    151

              *****************************************
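
For reference, the steepest-descent loop with the stopping rule above looks roughly like the following sketch. This is only an assumption about the structure of the code: the actual step.m may choose the step length differently, while here a simple Armijo backtracking line search is used, with banana_grad as sketched earlier:

      x = [0; 0];  dx = inf(2,1);  it = 0;          % start at x0 = (0,0)
      while norm(dx) >= 1.0e-8 && it < 50000
          g = banana_grad(x);
          alpha = 1;                                % Armijo backtracking
          while banana(x - alpha*g) > banana(x) - 1e-4*alpha*(g'*g)
              alpha = alpha/2;
          end
          dx = -alpha*g;                            % steepest-descent step
          x  = x + dx;
          it = it + 1;
      end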

 

Fig. 1: Misfit function for standard steepest descent; the * marks the true solution. Initial x0=(0,0).

 

Fig. 2: Misfit function for preconditioned steepest descent; the * marks the true solution. Initial x0=(0,0).

 

Fig. 3: Misfit function for the conjugate gradient; the * marks the true solution. Initial x0=(0,0).

 

Fig. 4: Misfit function for the rank-2 quasi-Newton method; the * marks the true solution. Initial x0=(0,0).

3.       As the iteration counts and Figures 1-4 show, all four methods reach the correct solution x=(1.0,1.0). Ranked by iteration count, the conjugate gradient is the best, followed by the rank-2 quasi-Newton method and the preconditioned steepest descent, while the standard steepest descent is the worst. I don't think this ranking is a general rule, however; it depends on the problem being solved and on the initial value.
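
The sketch below shows where the CG advantage comes from: each new search direction reuses the previous one instead of following the raw gradient. The Fletcher-Reeves form is an assumption (the course code may use another variant such as Polak-Ribiere), and for brevity it stops on norm(g) rather than norm(dx):

      x = [0; 0];  g = banana_grad(x);  p = -g;  it = 0;
      while norm(g) >= 1.0e-8 && it < 1000
          if g'*p >= 0, p = -g; end                 % restart if not a descent direction
          alpha = 1;                                % Armijo backtracking
          while banana(x + alpha*p) > banana(x) + 1e-4*alpha*(g'*p)
              alpha = alpha/2;
          end
          x = x + alpha*p;
          g_new = banana_grad(x);
          beta  = (g_new'*g_new)/(g'*g);            % Fletcher-Reeves coefficient
          p = -g_new + beta*p;                      % new direction reuses the old one
          g = g_new;  it = it + 1;
      end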

4.       Below are the results with an initial value far from the true solution, x0=(20,10). All methods except the preconditioned steepest descent, which failed to converge within 50000 iterations, reach (1.0,1.0). CG is still the best, followed by the rank-2 quasi-Newton method (a sketch of the rank-2 update loop appears after these results).

              *****************************************

              Solution from the steepest descent:

              x=     1.0000     1.0000

              f(x1,x2)= 1.8201e-011

              Iterations:  31006

 

              Solution from the preconditioned steepest descent:

              x=     9.7544    95.1517

              f(x1,x2)= 7.6641e+001

              Iterations:  50000

 

              Solution from the conjugate gradient:

              x=     1.0000     1.0000

              f(x1,x2)= 4.7100e-017

              Iterations:     35

 

              Solution from Quasi-Newton Rank 2:

              x=     1.0000     1.0000

              f(x1,x2)= 3.7352e-012

              Iterations:    471

              *****************************************
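
For completeness, a minimal rank-2 quasi-Newton loop. The BFGS update of the inverse Hessian shown here is one common rank-2 choice and is an assumption: the course code may use another rank-2 formula such as DFP:

      x = [0; 0];  g = banana_grad(x);  H = eye(2);  it = 0;
      while norm(g) >= 1.0e-8 && it < 10000
          p = -H*g;
          if g'*p >= 0, H = eye(2); p = -g; end     % reset if not a descent direction
          alpha = 1;                                % Armijo backtracking
          while banana(x + alpha*p) > banana(x) + 1e-4*alpha*(g'*p)
              alpha = alpha/2;
          end
          s = alpha*p;  x = x + s;
          g_new = banana_grad(x);  y = g_new - g;
          if y'*s > 0                               % keep H positive definite
              rho = 1/(y'*s);
              H = (eye(2) - rho*(s*y'))*H*(eye(2) - rho*(y*s')) + rho*(s*s');
          end
          g = g_new;  it = it + 1;
      end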

 

Fig. 5: Misfit function for standard steepest descent; the * marks the true solution. Initial x0=(20,10).

 

Fig. 6: Misfit function for preconditioned steepest descent; the * marks the true solution. Initial x0=(20,10).

 

Fig. 7: Misfit function for the conjugate gradient; the * marks the true solution. Initial x0=(20,10).

 

Fig. 8: Misfit function for the rank-2 quasi-Newton method; the * marks the true solution. Initial x0=(20,10).