Please note! The course description is confirmed for two academic years, which means that, in general, the learning outcomes, assessment methods and key content stay unchanged. However, the course syllabus can specify or change how each realization of the course is executed, for example how contact sessions are organized, how assessment methods are weighted, or which materials are used.

LEARNING OUTCOMES

After the course, the student understands how Bayesian networks are constructed with conditional independence assumptions and how they are applied in modeling joint probability distributions. The student can explain the structure and usage of common probabilistic models in machine learning, such as sparse Bayesian linear models, Gaussian mixture models and factor analysis models. The student can apply Bayes' theorem to compute probability statements and understands the fundamental role of Bayes' theorem in probabilistic inference. The student can derive approximate inference algorithms for complex models where exact probabilistic inference cannot be applied. Furthermore, the student can translate probabilistic models, inference, and learning algorithms into practical computer implementations.
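
As an illustration of the kind of probability statement referred to above (an editorial example, not part of the official course text; the test characteristics and prevalence below are assumed only for illustration): for a disease with prevalence $P(D) = 0.01$ and a test with sensitivity $P(+\mid D) = 0.9$ and false-positive rate $P(+\mid \neg D) = 0.05$, Bayes' theorem gives the posterior probability of disease given a positive test as

$$P(D \mid +) = \frac{P(+\mid D)\,P(D)}{P(+\mid D)\,P(D) + P(+\mid \neg D)\,P(\neg D)} = \frac{0.9 \cdot 0.01}{0.9 \cdot 0.01 + 0.05 \cdot 0.99} \approx 0.15.$$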

Credits: 5

Schedule: 10.01.2025 - 11.04.2025

Teacher in charge (valid for whole curriculum period):

Teacher in charge (applies in this implementation): Pekka Marttinen

Contact information for the course (applies in this implementation):

CEFR level (valid for whole curriculum period):

Language of instruction and studies (applies in this implementation):

Teaching language: English. Languages of study attainment: English

CONTENT, ASSESSMENT AND WORKLOAD

Content
  • valid for whole curriculum period:

    The course covers concepts in probabilistic machine learning: independence, conditional independence, mixture models, the EM algorithm, Bayesian networks, latent linear models, and algorithms for exact and approximate inference, with an emphasis on variational inference. The focus is on fundamental principles that enable students to understand and apply probabilistic modeling in practice.
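
    As a concrete, hedged illustration of one listed topic, the sketch below runs the EM algorithm on a two-component one-dimensional Gaussian mixture; the synthetic data, initial values, and variable names are assumptions made for this example only and are not taken from the course material.

    # Minimal EM sketch for a two-component 1-D Gaussian mixture model.
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic data drawn from two Gaussian components (assumed for illustration).
    x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.5, 300)])

    # Initial guesses for mixing weights, means, and standard deviations.
    pi = np.array([0.5, 0.5])
    mu = np.array([-1.0, 1.0])
    sigma = np.array([1.0, 1.0])

    def normal_pdf(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    for _ in range(100):
        # E-step: responsibilities (posterior component probabilities per point).
        dens = pi * normal_pdf(x[:, None], mu, sigma)      # shape (N, 2)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from responsibility-weighted data.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

    print("weights:", pi, "means:", mu, "std devs:", sigma)

    Each iteration leaves the data log-likelihood unchanged or increases it, which is the defining property of EM for mixture models.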

Assessment Methods and Criteria
  • valid for whole curriculum period:

    Exercises and an exam (details provided in the first lecture).

DETAILS

Study Material
  • valid for whole curriculum period:

    Kevin P. Murphy, Machine Learning: A Probabilistic Perspective. The MIT Press, 2012.

    David Barber, Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012.

    Christopher M. Bishop, Pattern Recognition and Machine Learning. Springer, 2006.

Substitutes for Courses
Prerequisites

FURTHER INFORMATION

Further Information
  • valid for whole curriculum period:

    Teaching Language: English

    Teaching Period: 2024-2025 Spring III - IV; 2025-2026 Spring III - IV