## Course syllabus

### Course PM

This page contains the program of the course: lectures, exercise sessions and computer labs. Other information, such as learning outcomes, teachers, literature and examination, is given in a separate course PM.

### Program

Rules for home examination


The schedule of the course is in TimeEdit.

#### Lectures

| Day | Time | Place | Remarks |
|---|---|---|---|
| MON | 13:15-15:00 | Euler | Lecture |
| WED | 13:15-15:00 | MVF24, MVF25 | Computer exercises |
| THU | 10:00-11:45 | Pascal | Lecture |
| FRI | 13:15-15:00 | MVF24, MVF25 | Computer exercises |
| 29.10.2019 | 14.00-18.00 | SB | Examination |
| 09.01.2020 | 14.00-18.00 | SB | Examination |

Grades at the examination (written examination + bonus points):

| Grade (Chalmers) | Points | Grade (GU) | Points |
|---|---|---|---|
| - | < 15 | U | < 15 |
| 3 | 15-20 | G | 15-27 |
| 4 | 21-27 | VG | > 27 |
| 5 | > 27 | | |

Changes compared to the last occasion:

1. New homework 2.
2. New computer exercises 2 and 3.
3. Modified computer exercise 4.

Deadlines for computer exercises and homeworks:

| Assignment | Deadline |
|---|---|
| Homework 1 and computer exercise 1 | 13 September |
| Homework 2 | 20 September |
| Homework 3 and computer exercise 2 | 4 October |
| Homework 4 and computer exercise 3 | 11 October |
| Computer exercise 4 | 18 October |

Announcement of the course  "Introduction to Inverse and Ill-posed Problems", 7.5 Hp

List of bonus points

• Lecture 1
Introduction and organization of the course. Introduction to linear algebra and numerical linear algebra. If this looks unfamiliar, you should consult your former linear algebra literature and refresh your knowledge. We will concentrate on the three building blocks of Numerical Linear Algebra (NLA): (1) linear systems, (2) overdetermined systems by least squares, and (3) eigenproblems. These building blocks also serve as subproblems in nonlinear computations, through the idea of linearization. We consider examples of applications of linear systems: image compression using SVD, and image deblurring. We introduce some basic concepts and notation: transpose, lower and upper triangular matrices, singular matrices, symmetric and positive definite matrices, conjugate transpose, row echelon form, rank, cofactor.

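The SVD-based image compression mentioned above is a truncated SVD: keeping the k largest singular values gives the best rank-k approximation (Eckart-Young). A minimal sketch in Python/NumPy (the labs use Matlab; the small test matrix here is purely illustrative):

```python
import numpy as np

# Illustrative 8x8 "image": a sum of two rank-1 patterns, so its rank is 2.
A = np.outer(np.arange(8.0), np.ones(8)) + np.outer(np.ones(8), np.arange(8.0))

# Full SVD: A = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the k largest singular values: best rank-k approximation.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Since A itself has rank 2, the rank-2 truncation reproduces it exactly.
err = np.linalg.norm(A - A_k)
print(err)
```

For a real image, one stores only U[:, :k], s[:k] and Vt[:k, :], which is far less data than the full matrix.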

• Lecture 2
IEEE system and floating-point numbers. We will discuss computer exercise 1 and perturbation theory in polynomial evaluation.
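The effect of floating-point roundoff on polynomial evaluation can be seen in a few lines. A Python/NumPy sketch, assuming IEEE double precision; the polynomial (x-1)^7 is a standard illustrative choice, not necessarily the one from the exercise:

```python
import numpy as np

# IEEE double precision: machine epsilon is about 2.2e-16.
eps = np.finfo(np.float64).eps

# Evaluate p(x) = (x-1)^7 near x = 1 in two mathematically equivalent ways.
x = 1.000001
direct = (x - 1.0) ** 7     # accurate: about 1e-42

# The expanded form sums terms of order 1 that nearly cancel, so the
# roundoff noise (order eps) completely swamps the true value.
c = [1.0, -7.0, 21.0, -35.0, 35.0, -21.0, 7.0, -1.0]
expanded = sum(ci * x ** (7 - i) for i, ci in enumerate(c))

print(eps, direct, expanded)
```

The discrepancy between the two results is exactly the perturbation-theory point made in the lecture: the expanded form is ill-conditioned near the root.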
• Lecture 3
We will discuss why we need norms: (1) to measure errors, (2) to measure stability, (3) for convergence criteria of iterative methods. The Sherman-Morrison formula. Systems of linear equations. Gaussian elimination and LU factorization. Gaussian elimination (and LU factorization) by outer products (elementary transformations). Pivoting, partial and total.
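Gaussian elimination with partial pivoting by outer-product updates can be sketched as follows (Python/NumPy; a teaching sketch, not a production implementation):

```python
import numpy as np

def lu_partial_pivot(A):
    """LU factorization with partial pivoting: P @ A = L @ U."""
    A = A.astype(float).copy()
    n = A.shape[0]
    P = np.eye(n)
    L = np.eye(n)
    for k in range(n - 1):
        # Partial pivoting: bring the largest entry in column k to the diagonal.
        p = k + np.argmax(np.abs(A[k:, k]))
        if p != k:
            A[[k, p], :] = A[[p, k], :]
            P[[k, p], :] = P[[p, k], :]
            L[[k, p], :k] = L[[p, k], :k]
        L[k + 1:, k] = A[k + 1:, k] / A[k, k]
        # Outer-product (elementary transformation) update of the trailing block.
        A[k + 1:, k:] -= np.outer(L[k + 1:, k], A[k, k:])
    return P, L, np.triu(A)

A = np.array([[2., 1., 1.], [4., 3., 3.], [8., 7., 9.]])
P, L, U = lu_partial_pivot(A)
print(np.allclose(P @ A, L @ U))
```

The rank-1 update in the loop is exactly the "elimination by outer products" formulation from the lecture.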
• Lecture 4
We will discuss the need for pivoting and the uniqueness of the factorization of A. Different versions of Gaussian elimination algorithms. Error bounds for the solution of Ax=b. Roundoff error analysis in LU factorization. Estimating condition numbers. Hager's algorithm.
• Lecture 5
Componentwise error estimates. Improving the accuracy of a solution of Ax=b: Newton's method and the equilibration technique. Convergence of Newton's method. Real symmetric positive definite matrices. The Cholesky algorithm. Example: how to solve Poisson's equation on a unit square using LU factorization.
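Newton's method applied to F(x) = Ax - b is exactly classical iterative refinement: solve A d = r for the residual r = b - Ax, then update x. A Python/NumPy sketch (the random test data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n))
x_true = rng.standard_normal(n)
b = A @ x_true

x = np.linalg.solve(A, b)
for _ in range(2):                 # a couple of refinement sweeps
    r = b - A @ x                  # residual (ideally computed in higher precision)
    d = np.linalg.solve(A, r)      # in practice, reuse the LU factorization here
    x = x + d                      # Newton step for F(x) = A x - b

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

Since F is linear, each "Newton step" is just one refinement sweep; the gain in practice comes from computing the residual more accurately than the original solve.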
• Lecture 6
Band matrices, with an example: application of the Cholesky decomposition in the solution of an ODE (example continued). Linear least squares problems: introduction and applications. The normal equations.
• Lecture 7
Example: polynomial fitting to a curve. Solution of nonlinear least squares problems: examples. QR and singular value decomposition (SVD). QR and SVD for linear least squares problems. We discussed computer exercise 2.
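The two standard routes to the linear least squares solution, normal equations and QR, can be compared on a small polynomial-fitting problem (Python/NumPy sketch; the data are illustrative):

```python
import numpy as np

# Fit a quadratic to data: solve min ||A c - y|| in two ways.
t = np.linspace(0.0, 1.0, 20)
y = 1.0 + 2.0 * t + 3.0 * t ** 2          # exact quadratic, zero residual

A = np.vander(t, 3, increasing=True)      # Vandermonde columns: 1, t, t^2

# (1) Normal equations A^T A c = A^T y: simple, but squares the condition number.
c_normal = np.linalg.solve(A.T @ A, A.T @ y)

# (2) QR: A = Q R, then solve the triangular system R c = Q^T y (backward stable).
Q, R = np.linalg.qr(A)
c_qr = np.linalg.solve(R, Q.T @ y)

print(c_normal, c_qr)
```

Both recover the coefficients (1, 2, 3) here; the difference only becomes visible when A is ill-conditioned, which is why QR (or SVD) is preferred in practice.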
• Lecture 8
Example of an application of linear systems: image compression using SVD in Matlab. Least squares and classification algorithms. We discussed computer exercise 3.
• Lecture 9
Householder transformations and Givens rotations. QR factorization by Householder transformations and Givens rotations. Examples of performing a Householder transformation for QR decomposition and for tridiagonalization of a matrix. The Moore-Penrose pseudoinverse. Rank-deficient least squares problems. Least squares and classification algorithms.
Introduction to spectral theory: eigenvalues, right and left eigenvectors. Similar matrices. Defective eigenvalues. Canonical forms: the Jordan form.
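QR factorization by Householder reflections can be sketched as follows (Python/NumPy; a teaching sketch that forms Q explicitly, which production code avoids):

```python
import numpy as np

def householder_qr(A):
    """QR factorization by Householder reflections: A = Q @ R."""
    A = A.astype(float).copy()
    m, n = A.shape
    Q = np.eye(m)
    for k in range(n):
        x = A[k:, k]
        # Householder vector v: reflects x onto a multiple of e_1.
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        v /= np.linalg.norm(v)
        # Apply H = I - 2 v v^T to the trailing submatrix, accumulate Q.
        A[k:, k:] -= 2.0 * np.outer(v, v @ A[k:, k:])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, np.triu(A)

M = np.array([[12., -51., 4.], [6., 167., -68.], [-4., 24., -41.]])
Q, R = householder_qr(M)
print(np.allclose(Q @ R, M), np.allclose(Q.T @ Q, np.eye(3)))
```

The sign choice in v[0] (adding copysign) avoids cancellation, a point usually stressed alongside this algorithm.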
• Lecture 10
Canonical forms: the Jordan form, Schur form, and real Schur form. Gerschgorin's theorem. Perturbation theory: the Bauer-Fike theorem. Algorithms for non-symmetric eigenproblems: the power method, inverse iteration, and inverse iteration with shift.
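The power method and shifted inverse iteration can be sketched on a small symmetric matrix (Python/NumPy; matrix and shift are illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])    # eigenvalues (5 +/- sqrt(5)) / 2

# Power method: converges to the eigenvector of the largest |eigenvalue|.
x = np.array([1.0, 0.0])
for _ in range(100):
    x = A @ x
    x /= np.linalg.norm(x)
lam_max = x @ A @ x                        # Rayleigh quotient estimate

# Inverse iteration with shift sigma: converges to the eigenvalue nearest sigma.
sigma = 1.0
y = np.array([1.0, 0.0])
for _ in range(100):
    y = np.linalg.solve(A - sigma * np.eye(2), y)
    y /= np.linalg.norm(y)
lam_min = y @ A @ y

print(lam_max, lam_min)
```

In practice one factors A - sigma*I once and reuses the factorization in every iteration, rather than calling a fresh solve each time.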
• Lecture 11
Algorithms for non-symmetric eigenproblems (continued): inverse iteration with shift, orthogonal iteration, QR iteration and QR iteration with shifts. Hessenberg matrices and preservation of the Hessenberg form. Hessenberg reduction. Tridiagonal and bidiagonal reduction. Regular matrix pencils and the Weierstrass canonical form.
• Lecture 12
Regular matrix pencils and the Weierstrass canonical form. Singular matrix pencils and the Kronecker canonical form. Application of the Jordan and Weierstrass forms to differential equations. Symmetric eigenproblems. Perturbation theory: Weyl's theorem, with a corollary giving a similar result for singular values. Application of Weyl's theorem: error bounds for eigenvalues computed by a stable method. The Courant-Fischer theorem. Inertia and Sylvester's inertia theorem. Definite pencils. Theorem: the Rayleigh quotient is a good approximation to an eigenvalue. Algorithms for the symmetric eigenproblem: tridiagonal QR iteration, Rayleigh quotient iteration.
• Lecture 13
Theorem: the Rayleigh quotient is a good approximation to an eigenvalue. Algorithms for the symmetric eigenproblem: tridiagonal QR iteration, Rayleigh quotient iteration, and the divide-and-conquer algorithm. QR iteration with Wilkinson's shift. Divide-and-conquer, bisection and inverse iteration, and different versions of Jacobi's method.
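Rayleigh quotient iteration combines inverse iteration with a shift that is updated to the current Rayleigh quotient, giving cubic convergence for symmetric matrices. A Python/NumPy sketch (the test matrix is illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])           # symmetric tridiagonal test matrix

x = np.ones(3)
x /= np.linalg.norm(x)
for _ in range(10):
    rho = x @ A @ x                       # Rayleigh quotient = current shift
    try:
        x = np.linalg.solve(A - rho * np.eye(3), x)
    except np.linalg.LinAlgError:
        break                             # shift hit an eigenvalue exactly
    x /= np.linalg.norm(x)

rho = x @ A @ x
print(rho, np.linalg.norm(A @ x - rho * x))
```

Which eigenpair it converges to depends on the starting vector; the residual norm printed at the end confirms convergence.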
• Lecture 14
Algorithms for symmetric matrices (continued): different versions of Jacobi's method. Algorithms for the SVD: QR iteration and its variations for the bidiagonal SVD. The basic iterative methods (Jacobi, Gauss-Seidel and successive overrelaxation (SOR)) for the solution of linear systems. Jacobi, Gauss-Seidel and SOR for the solution of Poisson's equation in two dimensions. Convergence of Jacobi, Gauss-Seidel and SOR. Introduction to Krylov subspace methods. The conjugate gradient algorithm. Preconditioning for linear systems. The preconditioned conjugate gradient algorithm. Common preconditioners.
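The Jacobi and Gauss-Seidel splittings can be sketched on a small diagonally dominant system (Python/NumPy; the system is illustrative):

```python
import numpy as np

# Model problem: a small diagonally dominant tridiagonal system A x = b.
A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([2.0, 4.0, 10.0])
x_true = np.linalg.solve(A, b)

# Splitting A = D + L + U (strict lower/upper parts).
D = np.diag(np.diag(A))
L = np.tril(A, -1)
U = np.triu(A, 1)

x_j = np.zeros(3)
x_gs = np.zeros(3)
for _ in range(50):
    # Jacobi:       x_{k+1} = D^{-1} (b - (L + U) x_k)
    x_j = np.linalg.solve(D, b - (L + U) @ x_j)
    # Gauss-Seidel: x_{k+1} = (D + L)^{-1} (b - U x_k)
    x_gs = np.linalg.solve(D + L, b - U @ x_gs)

print(np.linalg.norm(x_j - x_true), np.linalg.norm(x_gs - x_true))
```

For this kind of model problem Gauss-Seidel converges roughly twice as fast as Jacobi (its iteration matrix has the squared spectral radius), which is the convergence comparison made in the lecture.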


#### Homeworks

• To pass this course you must complete 2 compulsory home assignments before the final exam. Choose any 2 of the 4 assignments presented here.
• Home assignments should be done individually (not in groups).
• Send a pdf file with your assignment to larisa@chalmers.se before the deadline (see the section "Program" for the deadlines of the home assignments). Handwritten home assignments can be left in the red box beside my office.

#### Recommended exercises in the course book

| Chapter | Exercises |
|---|---|
| 8 | 8.7, 8.10, 8.16 |
| 9 | 9.5, 9.7, 9.8, 9.9, 9.11, 9.12, 9.14, 9.15 |
| 11 | 11.1 - 11.13 |


### Computer labs

#### Reference literature

1. Learning MATLAB, Tobin A. Driscoll. Provides a brief introduction to Matlab for readers who already know computer programming. Available as an e-book from the Chalmers library.
2. Physical Modeling in MATLAB 3/E, Allen B. Downey. Free to download from the web; gives an introduction for those who have not programmed before. It covers basic MATLAB programming with a focus on modeling and simulation of physical systems.

