Credits: 5
Schedule: 24.02.2020 - 03.04.2020
Contact information for the course (applies in this implementation):
Lecturer: Nuutti Hyvönen
Assistant: Juha-Pekka Puska
Teaching Period (valid 01.08.2018-31.07.2020):
IV Spring (2018-2019, 2019-2020)
Learning Outcomes (valid 01.08.2018-31.07.2020):
You will learn to identify an ill-posed inverse problem and to understand the restrictions its nature imposes on the solution process. You will familiarize yourself with several classical regularization methods for finding approximate solutions to linear ill-posed problems. You will learn to formulate an inverse problem as a Bayesian problem of statistical inference and to interpret the information contained in the resulting posterior probability distribution. You will learn to numerically implement the introduced solution techniques.
Content (valid 01.08.2018-31.07.2020):
The course covers computational methods for solving inverse problems arising from practical applications. It consists of two parts: the first three weeks focus on classical regularization techniques, and the last three weeks on statistical methods.
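As a rough, purely illustrative sketch of the first, regularization-based part of the course, the snippet below applies truncated SVD and Tikhonov regularization to a toy ill-posed linear problem. It is not course material: the Gaussian-blur forward matrix, the noise level, the truncation index k, and the regularization parameter alpha are all hypothetical choices, and the example is written in Python/NumPy even though the home assignments typically use MATLAB.

    # Illustrative sketch (not course material): truncated SVD and Tikhonov
    # regularization for a toy ill-posed linear problem A x = y + noise.
    # The forward matrix A is a hypothetical Gaussian blurring operator.
    import numpy as np

    rng = np.random.default_rng(0)

    n = 100
    t = np.linspace(0, 1, n)
    A = np.exp(-(t[:, None] - t[None, :])**2 / (2 * 0.02**2))  # toy blurring operator
    A /= A.sum(axis=1, keepdims=True)                          # normalize rows

    x_true = np.sin(2 * np.pi * t) + (t > 0.5)                 # piecewise-smooth "truth"
    y = A @ x_true + 0.01 * rng.standard_normal(n)             # noisy measurement

    # Truncated SVD: keep only the k largest singular values of A.
    U, s, Vt = np.linalg.svd(A)
    k = 20                                                     # truncation index (tuning parameter)
    x_tsvd = Vt[:k].T @ ((U[:, :k].T @ y) / s[:k])

    # Tikhonov regularization: minimize ||A x - y||^2 + alpha * ||x||^2
    # via the regularized normal equations.
    alpha = 1e-3                                               # regularization parameter
    x_tik = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

    print("relative error, TSVD:    ", np.linalg.norm(x_tsvd - x_true) / np.linalg.norm(x_true))
    print("relative error, Tikhonov:", np.linalg.norm(x_tik - x_true) / np.linalg.norm(x_true))

Choosing the truncation index and the regularization parameter is a central practical question; the Morozov discrepancy principle covered in the course is one systematic way to pick them based on the known noise level.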
Details on the course content (applies in this implementation):
The preliminary weekly timetable is as follows (an illustrative code sketch of the sampling-related topics of weeks 4-6 follows the list):
- Week 1: Motivation and (truncated) singular value decomposition
- Week 2: Morozov discrepancy principle and Tikhonov regularization
- Week 3: Regularization by truncated iterative methods
- Week 4: Motivation and preliminaries of Bayesian inversion, preliminaries of sampling
- Week 5: Prior models, Gaussian densities, MCMC (Metropolis-Hastings algorithm)
- Week 6: MCMC (Gibbs sampler), hypermodels
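As a similarly hypothetical illustration of the sampling topics of weeks 4-6, the sketch below runs a random-walk Metropolis-Hastings sampler on the posterior of a small linear inverse problem with additive Gaussian noise and a Gaussian smoothness prior. The forward map, the prior, the noise level, the chain length, and the proposal step size are arbitrary assumptions, and the code is again Python/NumPy rather than the MATLAB used in the assignments. For this linear-Gaussian model the posterior is in fact Gaussian and available in closed form, so MCMC is used here only to demonstrate the mechanics of the algorithm.

    # Illustrative sketch (not course material): random-walk Metropolis-Hastings
    # sampling of the posterior of a toy linear inverse problem with Gaussian
    # noise and a Gaussian first-difference (smoothness) prior.
    import numpy as np

    rng = np.random.default_rng(1)

    n = 10
    t = np.linspace(0, 1, n)
    A = np.exp(-(t[:, None] - t[None, :])**2 / (2 * 0.05**2))  # toy forward map
    A /= A.sum(axis=1, keepdims=True)

    x_true = np.sin(2 * np.pi * t)
    sigma = 0.05                                               # noise standard deviation
    y = A @ x_true + sigma * rng.standard_normal(n)

    L = np.diff(np.eye(n), axis=0)                             # first-difference prior matrix
    gamma = 0.5                                                # prior scale

    def log_posterior(x):
        # log p(x | y) up to an additive constant: Gaussian likelihood + Gaussian prior.
        misfit = A @ x - y
        return -0.5 * (misfit @ misfit) / sigma**2 - 0.5 * ((L @ x) @ (L @ x)) / gamma**2

    n_samples = 20000
    step = 0.05                     # proposal std; would normally be tuned or adapted
    x = np.zeros(n)
    logp = log_posterior(x)
    chain = np.empty((n_samples, n))
    accepted = 0

    for i in range(n_samples):
        proposal = x + step * rng.standard_normal(n)           # symmetric random-walk proposal
        logp_prop = log_posterior(proposal)
        if np.log(rng.uniform()) < logp_prop - logp:           # Metropolis acceptance test
            x, logp = proposal, logp_prop
            accepted += 1
        chain[i] = x

    burn_in = n_samples // 2
    x_mean = chain[burn_in:].mean(axis=0)                      # crude posterior-mean estimate
    print("acceptance rate:", accepted / n_samples)
    print("relative error of posterior mean:",
          np.linalg.norm(x_mean - x_true) / np.linalg.norm(x_true))

The Gibbs sampler and the hypermodels of week 6 build on the same idea of drawing samples from the posterior, but update one block of unknowns (or an additional hyperparameter, such as the prior scale) at a time from its conditional distribution.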
Assessment Methods and Criteria (valid 01.08.2018-31.07.2020):
Teaching methods: lectures, exercises, and a home exam.
Assessment methods: exercises and a home exam.
Elaboration of the evaluation criteria and methods, and acquainting students with the evaluation (applies in this implementation):
The students are expected to participate actively in the course by returning, each week, their solutions to one home assignment (typically involving MATLAB computations). 25% of the overall grade is based on the home assignments and 75% on a home exam.
Each week there is one home assignment: the solution to the assignment on the exercise sheet of week m is to be returned via MyCourses to the course assistant Juha-Pekka Puska (M329) before the exercise session of week m+1. (For example, the solution to the home assignment on the first exercise sheet should be returned before the exercise session on Friday, March 6.)
The home exam constitutes 75% of the grade. It will be held after the lectures have ended; the exact timing will be agreed upon later. The exam consists of four assignments, more extensive than the weekly ones, that must be solved within a given time period (e.g., within ten days).
Workload (valid 01.08.2018-31.07.2020):
Contact hours: 36 h (no compulsory attendance)
Self-study: ca. 100 h
Study Material (valid 01.08.2018-31.07.2020):
All essential material is included in the lecture notes, which are available on the course homepage.
Details on the course materials (applies in this implementation):
The preliminary versions of the lecture slides can be found in the Materials section. The slides may still be updated during the course.
Recommended supplementary reading: J. Kaipio and E. Somersalo, Statistical and Computational Inverse Problems, Springer, 2005 (mainly Chapters 2 and 3), and D. Calvetti and E. Somersalo, Introduction to Bayesian Scientific Computing. Ten Lectures on Subjective Computing, Springer, 2007.
Substitutes for Courses (valid 01.08.2018-31.07.2020):
Mat-1.3626
Course Homepage (valid 01.08.2018-31.07.2020):
https://mycourses.aalto.fi/course/search.php?search=MS-E1654
Prerequisites (valid 01.08.2018-31.07.2020):
MS-A00XX, MS-A01XX, MS-A02XX, MS-A050X. The courses MS-A030X, MS-C134X, MS-C1650, MS-E1460, MS-E1651, MS-E1652, MS-E2112 may also be useful.
Grading Scale (valid 01.08.2018-31.07.2020):
0-5
Details on the schedule (applies in this implementation):
The preliminary weekly timetable is the same as the one listed under "Details on the course content" above.
Teacher: Nuutti Hyvönen