Steepest Descent and Conjugate Gradient / Overview

Many real-world applications require solving a very large system of linear equations. Examples are:

  • Tomographic imaging;
    • The problem of recovering an image from its projections can be cast as a linear algebra problem, with the task of solving a large set of linear equations.
  • Least Squares problems

In such a scenario exact solutions may not be feasible due to memory requirements, computing time, etc. However, a reasonably good approximate solution can often be obtained with a manageable number of iterations, so there is great interest in solving linear systems of equations iteratively.

There are many methods to choose from. As I was particularly interested in learning more about the conjugate gradient method, I set up a Jupyter notebook to explore the concepts behind the methods of steepest descent and conjugate gradients.
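To make the two methods concrete, here is a minimal NumPy sketch of both iterations for a symmetric positive definite system A x = b. This is my own illustration of the textbook algorithms (as presented e.g. in Shewchuk's article below); the function names and default parameters are not taken from the notebook.

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=10000):
    """Solve A x = b for symmetric positive definite A by steepest descent."""
    x = x0.astype(float)
    for _ in range(max_iter):
        r = b - A @ x                      # residual = negative gradient of the quadratic form
        if np.linalg.norm(r) < tol:
            break
        alpha = (r @ r) / (r @ (A @ r))    # exact line search along the residual
        x = x + alpha * r
    return x

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive definite A by conjugate gradients."""
    x = x0.astype(float)
    r = b - A @ x
    d = r.copy()                           # first search direction = residual
    rs_old = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs_old) < tol:
            break
        Ad = A @ d
        alpha = rs_old / (d @ Ad)
        x = x + alpha * d
        r = r - alpha * Ad                 # cheap residual update, no extra A @ x
        rs_new = r @ r
        d = r + (rs_new / rs_old) * d      # new A-conjugate search direction
        rs_old = rs_new
    return x
```

Note that neither routine ever factors A; each iteration only needs matrix-vector products, which is exactly what makes these methods attractive for very large (sparse) systems.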

The Jupyter notebook can be found here:

https://github.com/michaelbiester/NumericalMethods/blob/master/SolvingLinearEquations/quadraticForms.ipynb

A copy of this notebook can also be viewed as a PDF document:

https://github.com/michaelbiester/NumericalMethods/blob/master/SolvingLinearEquations/quadraticForms.pdf


Resources

Search for this article:

An Introduction to the Conjugate Gradient Method Without the Agonizing Pain

Author: Jonathan Richard Shewchuk, August 4, 1994

(The Jupyter notebook tries to follow the steps and descriptions presented in this document.)

In her highly readable bachelor's thesis

Conjugate Gradients and Conjugate Residuals type methods for solving Least Squares problems from Tomography (Delft University of Technology)

Tamara Kloek discusses various aspects of solving the normal equations iteratively, using the conjugate gradient method (among other methods).
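As a rough illustration of that approach: for a least squares problem min ||A x - b|| with a tall matrix A, conjugate gradients can be applied to the normal equations AᵀA x = Aᵀb, since AᵀA is symmetric positive (semi-)definite. The sketch below (often called CGNR) is my own and is not taken from the thesis; note that AᵀA is never formed explicitly, only products with A and Aᵀ are used.

```python
import numpy as np

def cg_normal_equations(A, b, tol=1e-10, max_iter=1000):
    """Least squares min ||A x - b|| via conjugate gradients on A^T A x = A^T b."""
    m, n = A.shape
    x = np.zeros(n)
    r = A.T @ (b - A @ x)            # residual of the normal equations
    d = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs_old) < tol:
            break
        Ad = A @ d
        alpha = rs_old / (Ad @ Ad)   # d^T (A^T A) d = ||A d||^2
        x = x + alpha * d
        r = r - alpha * (A.T @ Ad)
        rs_new = r @ r
        d = r + (rs_new / rs_old) * d
        rs_old = rs_new
    return x
```

Avoiding the explicit product AᵀA matters in the tomography setting: the projection matrix is large and sparse, while AᵀA would generally be much denser and worse conditioned.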