Topic outline

  • Overview

    Lectures and exercises on Zoom

    Course Zulip discussion channel for students: https://e407506.zulipchat.com/

Lecturer: PhD Markus Heinonen
    Co-lecturers: Prof Arno Solin, PhD Ti John, PhD Martin Trapp, Paul Chang
    Teaching assistants: Martin Trapp, Paul Chang, Pashupati Hegde, Severi Rissanen

    Overview: Gaussian processes (GPs) are a powerful tool for Bayesian nonparametric modelling and Bayesian machine learning. This course will give an overview of Gaussian processes in machine learning, and provide a theoretical background. The course will include Gaussian process regression, classification, unsupervised modelling, as well as deep GPs and other more recent advances.
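To give a flavour of what the course covers, here is a minimal sketch of exact GP regression with a squared-exponential kernel, in the spirit of Rasmussen & Williams (2006, Algorithm 2.1). All names, hyperparameter values, and the toy data are illustrative assumptions, not course material:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=0.1):
    """Exact GP regression posterior mean and marginal variance (zero prior mean)."""
    K = rbf_kernel(X_train, X_train) + noise**2 * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                       # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                            # posterior mean at test points
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)      # posterior marginal variance
    return mean, var

# Toy example: fit a noisy sine and predict at a new input
X = np.linspace(0, 5, 20)
y = np.sin(X) + 0.1 * np.random.default_rng(0).normal(size=20)
mean, var = gp_posterior(X, y, np.array([2.5]))
```

The Cholesky-based solve is the numerically stable way to invert the noisy kernel matrix; variants of this computation recur throughout the lectures on regression, classification, and large-scale approximations.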

    Prerequisites: Basics of machine learning and statistics, e.g. "Machine Learning: Supervised Methods (CS-E4710)"

    Target audience: The course is targeted towards MSc students interested in deepening their machine learning knowledge:
    • GPs are a probabilistic counterpart of Kernel Methods (CS-E4830)
    • GPs offer a probabilistic way to do Deep Learning (CS-E4890)
    • GPs fall under the umbrella of Bayesian Data Analysis (CS-E5710)
    • GPs utilize Advanced Probabilistic Methods (CS-E4820)

    Format: 5 credits, 12 lectures, 6 weekly home assignments. The entire course is online. Lectures on Zoom:
    • Mondays 10:15-12:00
    • Tuesdays 10:15-12:00

    Assignments: The practical assignments are Python notebooks that are completed at home, returned weekly, and graded. Other languages (such as Matlab and R) can be used, but they will require more work from the participant. There are two parallel exercise sessions each week (choose only one) for presenting the assignment solutions; note that participation grants free points towards the grade:
    • H1: Thursdays 10:15-12:00
    • H2: Fridays 12:15-14:00

    Exam: no exam

    Grading: 1 to 5 based on points (max 48 points; 24 points minimum to pass):
    • Six assignment returns, each worth 6 points (max 36 points)
    • Two points for participation per weekly exercise session, one session per week (max 12 points)

    Grading table
    • 1/5, 24 points
    • 2/5, 28 points
    • 3/5, 32 points
    • 4/5, 36 points
    • 5/5, 40 points

    Note: The maximum grade of 5/5 is only achievable by attending the exercise sessions

    Book: Rasmussen & Williams, Gaussian Processes for Machine Learning, MIT Press 2006 (publicly available)

    Lectures (all 10:15-12:00)
    1. Mon Jan 10th. Introduction
    2. Tue Jan 11th. Bayesian regression
    3. Mon Jan 17th. GP regression
    4. Tue Jan 18th. Kernel learning
    5. Mon Jan 24th. GP classification (lecturer Ti John)
    6. Tue Jan 25th. Large-scale GPs (lecturer Ti John)
    7. Mon Jan 31st. Latent modelling and unsupervised learning
    8. Tue Feb 1st. Deep GPs
    9. Mon Feb 7th. GP theory (lecturer Martin Trapp)
    10. Tue Feb 8th. State-space GPs (lecturer Arno Solin)
    11. Mon Feb 14th. Bayesian optimization (lecturer Paul Chang)
    12. Tue Feb 15th. Integration and model selection (Prof Aki Vehtari)

    Exercise sessions
    (two identical sessions per week, attend one)
    • 1. Thu Jan 20th 10:15-12:00, Lectures 1+2
    • 1. Fri Jan 21st 12:15-14:00
    • 2. Thu Jan 27th 10:15-12:00, Lectures 3+4
    • 2. Fri Jan 28th 12:15-14:00
    • 3. Thu Feb 3rd 10:15-12:00, Lectures 5+6
    • 3. Fri Feb 4th 12:15-14:00
    • 4. Thu Feb 10th 10:15-12:00, Lectures 7+8
    • 4. Fri Feb 11th 12:15-14:00
    • 5. Thu Feb 17th 10:15-12:00, Lectures 9+10
    • 5. Fri Feb 18th 12:15-14:00
    • 6. Thu Feb 24th 10:15-12:00, Lectures 11+12
    • 6. Fri Feb 25th 12:15-14:00