Topic outline

  • Join lectures and exercises: Zoom and Slack

    Lecturer: PhD Markus Heinonen
    Co-lecturers: Prof Arno Solin, Prof Harri Lähdesmäki, Prof Aki Vehtari, PhD Vincent Adam, PhD William Wilkinson, PhD Charles Gadd
    Teaching assistants: Paul Chang, PhD Martin Trapp, Pashupati Hegde

    Overview:
    Gaussian processes (GPs) are a powerful tool for Bayesian nonparametric modelling. This course gives an overview of Gaussian processes in machine learning and provides the theoretical background. The course covers Gaussian process regression, classification, and unsupervised modelling, as well as deep GPs and other more advanced recent developments.
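
    For concreteness, the course's starting point, GP regression with a squared-exponential (RBF) kernel, can be sketched in a few lines of NumPy. This is only an illustrative sketch with made-up toy data and helper names, not part of the course material; the posterior equations follow the course book (Rasmussen & Williams 2006):

    import numpy as np

    def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
        """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
        sqdist = (X1[:, None] - X2[None, :]) ** 2
        return variance * np.exp(-0.5 * sqdist / lengthscale**2)

    # Toy training data: noisy observations of a sine function
    rng = np.random.default_rng(0)
    X_train = rng.uniform(-3, 3, size=10)
    y_train = np.sin(X_train) + 0.1 * rng.standard_normal(10)
    X_test = np.linspace(-3, 3, 100)

    noise_var = 0.1 ** 2
    K = rbf_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))   # train covariance
    K_s = rbf_kernel(X_train, X_test)                                     # train-test covariance
    K_ss = rbf_kernel(X_test, X_test)                                     # test covariance

    # GP posterior: mean = K_s^T K^{-1} y, cov = K_ss - K_s^T K^{-1} K_s
    L = np.linalg.cholesky(K)            # the O(n^3) step discussed in Session #3
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    post_mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    post_cov = K_ss - v.T @ v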

    Target audience:
    The course is targeted at MSc students interested in machine learning.

    Prerequisites:

    Basics of machine learning and statistics, e.g. Machine Learning: Supervised Methods (CS-E4710)

    Format: 
    The 5-credit course consists of 11 lectures, 5 weekly practical assignments, and an optional project work for 2 extra credits. The practical assignments are based on Python. Other languages (such as Matlab and R) can be used, but they will require more work from the participants. The whole course is online.

    Exam: no exam

    Grading: max 20 points
    Five assignments, each worth 3 points (max 15 points). 
    One extra point per week for participating in a weekly exercise session (choose one of the two groups; max 5 points):
    • H1: Wednesdays 12:15-14:00
    • H2: Fridays 12:15-14:00


    Book: Gaussian Processes for Machine Learning, Rasmussen & Williams, MIT Press 2006 (publicly available)

    Session #1: Monday, January 11th, 12:15-14:00
    Introduction to the Gaussian distribution and Bayesian inference

    Session #2: Thursday, January 14th, 10:15-12:00
    Bayesian regression over parameters and functions

    Session #3: Monday, January 18th, 12:15-14:00
    Gaussian process regression, kernels, computational complexity

    Session #4: Thursday, January 21st, 10:15-12:00
    Gaussian process classification, introduction to variational inference

    Session #5: Monday, January 25th, 12:15-14:00
    Latent modelling for unsupervised and supervised learning

    Session #6: Thursday, January 28th, 10:15-12:00
    Kernel learning

    Session #7: Monday, February 1st, 12:15-14:00
    Convolution GPs

    Session #8: Thursday, February 4th, 10:15-12:00
    Deep Gaussian processes

    Session #9: Monday, February 8th, 12:15-14:00
    Model selection

    Session #10: Thursday, February 11th, 10:15-12:00
    State-space Gaussian processes

    Session #11: Monday, February 15th, 12:15-14:00
    Dynamical models


  • Lecture 9: See the "lecture video". The lecture discussed the Stan case studies.

  • Five weekly practical assignments (Python notebooks), with finished assignments returned in MyCourses before the weekly exercise session:
    • H1: Wednesdays 12:15-14:00 (first session Jan 20th)
    • H2: Fridays 12:15-14:00 (first session Jan 22nd)
    Note that you only need to attend one session per week. There is no session in the first week.

    The course is graded only on completed and returned assignments. Each task in an assignment is graded with 0/0.5/1 points. An extra 1 point will be granted if you are present and prepared to present your solution in a session.

    The assignments are released on Mondays.

    Assignment #1: deadline Wednesday, January 20th, 12:00

    Assignment #2: deadline Wednesday, January 27th, 12:00

    Assignment #3: deadline Wednesday, February 3rd, 12:00

    Assignment #4: deadline Wednesday, February 10th, 12:00

    Assignment #5: deadline Wednesday, February 17th, 12:00


  • Completing the course gives 5 ECTS points.

    You can get 2 extra ECTS points by completing an optional small project:
    • The work should be done in groups of 1-4 people
    • Hand in a detailed project report (one per group) no later than the 12th of March
    • Give a 10-20 min presentation of your work at the project session on the 18th of March at 10:15

    The project work is supported by the following sessions:
    • A project kick-off session on Thursday, 18th of February, at 10:15
    • A Q&A support session on Thursday, 4th of March, at 10:15
    • A final project work seminar on the 18th of March with group presentations (10-20 min)

    The project work topics are:
    1. Iterative kernel learning
    2. Bayesian optimization with Gaussian Processes
    3. Bayesian quadrature
    4. Relationship between Neural networks and GPs
    5. Multioutput Gaussian processes & Kronecker structures
    6. Gaussian processes for big data
    7. Gaussian processes with monotonicity
    8. Gaussian process latent variable model (GPLVM)
    9. Convolutional Gaussian processes
    10. Gaussian process inference (eg. VI, EP, MCMC)
    11. Deep Gaussian processes
    12. State-space GPs
    13. Dynamical GPs
    14. Own topic (contact Markus/Arno)

    The project work consists of one of the following tasks:
    1. Analyze your favourite dataset with Gaussian process models of your topic
    2. Literature survey/comparison of more advanced Gaussian process models/methods of your topic
    3. Implementation of more advanced Gaussian process models of your topic

    In the first task, compare the GP model(s) against other baseline methods, study the inference of the GP model, and examine the predictive posteriors of your GP model on your dataset. In the second task, read about your topic in the scientific literature and discuss it in depth. In the third task, choose your favourite programming language and/or library, implement an advanced GP model of your topic, and describe and test your implementation.

    Please form a group, and pick one topic and one task:
    • Come to the kick-off session on Feb 18th at 10:15. We will introduce the topics, and you can form groups and discuss the project at the session.
    • Please report your group/topic to markus.o.heinonen@aalto.fi