Please note! The course description is confirmed for two academic years (1.8.2018-31.7.2020), which means that, in general, the learning outcomes, assessment methods and key content stay unchanged. However, via the course syllabus it is possible to specify or change the execution of each realization of the course, such as how the contact sessions are organized, how the assessment methods are weighted or which materials are used.
After the course, the student understands how Bayesian networks are constructed with conditional independence assumptions and how they are applied in modeling joint probability distributions. The student can explain the structure and usage of common probabilistic models in machine learning, such as sparse Bayesian linear models, Gaussian mixture models and factor analysis models. The student can apply Bayes' theorem to compute probability statements and understands the fundamental role of Bayes' theorem in probabilistic inference. The student can derive approximate inference algorithms for complex models where exact probabilistic inference is infeasible. Furthermore, the student can translate probabilistic models, inference, and learning algorithms into practical computer implementations.
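As a small illustration of the kind of computation the outcomes describe (not official course material), Bayes' theorem can be applied to a simple diagnostic-test scenario; all numbers below are made-up assumptions.

```python
# Illustrative sketch of Bayes' theorem: P(H | D) = P(D | H) P(H) / P(D).
# Hypothetical diagnostic test; the probabilities are invented for the example.
prior = 0.01          # P(disease)
sensitivity = 0.95    # P(positive | disease)
false_pos = 0.05      # P(positive | no disease)

# Marginal likelihood of a positive result, P(positive), by total probability.
evidence = sensitivity * prior + false_pos * (1 - prior)

# Posterior probability of disease given a positive test.
posterior = sensitivity * prior / evidence
print(round(posterior, 3))  # about 0.161: a positive test is weak evidence here
```

Note how the low prior dominates: even with a fairly accurate test, the posterior stays well below one half.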
Schedule: 15.01.2021 - 13.04.2021
Teacher in charge (valid 01.08.2020-31.07.2022): Pekka Marttinen
Teacher in charge (applies in this implementation): Pekka Marttinen
Language of instruction and studies (valid 01.08.2020-31.07.2022):
Teaching language: English
Languages of study attainment: English
CONTENT, ASSESSMENT AND WORKLOAD
The course covers concepts in probabilistic machine learning: independence, conditional independence, mixture models, the EM algorithm, Bayesian networks, latent linear models, and algorithms for exact and approximate inference, with an emphasis on variational inference. The course emphasizes fundamental principles that allow students to understand and apply probabilistic modeling in practice.
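To give a flavor of two of the listed topics, mixture models and the EM algorithm, here is a minimal sketch of EM for a one-dimensional, two-component Gaussian mixture; the data, initialization, and iteration count are arbitrary assumptions, not course material.

```python
import numpy as np

# Synthetic data: two well-separated Gaussian clusters (means -2 and 3).
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])

# Initial mixture weights, means, and variances for the two components.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

for _ in range(50):
    # E-step: responsibilities r[n, k] = P(component k | x_n) under current params.
    dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    r = pi * dens
    r /= r.sum(axis=1, keepdims=True)

    # M-step: re-estimate weights, means, and variances from the responsibilities.
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print(np.round(np.sort(mu), 1))  # estimated means, close to the true -2 and 3
```

Each iteration alternates a soft assignment of points to components with a weighted maximum-likelihood update, which is exactly the E/M structure the course develops in full generality.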
Assessment Methods and Criteria
Exercises and an exam (details provided in the first lecture).
20 + 18 (2 + 2)
David Barber, Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012.
Christopher M. Bishop, Pattern recognition and machine learning. Springer, 2006.
CS-E3210 / T-61.3050 Machine Learning: Basic Principles
CS-E5710 Bayesian Data Analysis (recommended)