After the course, students understand how Bayesian networks are constructed from conditional independence assumptions and how they are used to model joint probability distributions. Students can explain the structure and use of common probabilistic models in machine learning, such as Gaussian mixture models and factor analysis models. They can apply Bayes’ theorem to compute probability statements and understand its fundamental role in probabilistic inference. They understand how inference and learning are coupled in latent variable models and the EM algorithm. Students know approximate inference and sampling techniques for complex models where exact probabilistic inference cannot be applied. Furthermore, they can translate probabilistic models and their inference and learning algorithms into practical computer implementations.
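The role of Bayes’ theorem in computing probability statements can be illustrated with a minimal sketch. The numbers below are hypothetical (a standard diagnostic-test example), not course material:

```python
# Bayes' theorem: P(H|D) = P(D|H) * P(H) / P(D).
# Hypothetical numbers for illustration only.
p_h = 0.01              # prior: P(hypothesis), e.g. P(disease)
p_d_given_h = 0.95      # likelihood: P(positive test | disease)
p_d_given_not_h = 0.05  # P(positive test | no disease)

# Marginal likelihood P(D) via the law of total probability.
p_d = p_d_given_h * p_h + p_d_given_not_h * (1 - p_h)

posterior = p_d_given_h * p_h / p_d
print(round(posterior, 3))  # a small posterior despite the accurate test
```

Even with a 95% accurate test, the posterior probability stays low because the prior is small, which is exactly the kind of probability statement the course trains students to compute.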
The course covers probabilistic concepts in machine learning: independence, conditional independence, mixture models, the EM algorithm, Bayesian networks, computational algorithms for exact and approximate inference, sampling, and prior distributions. The course emphasizes understanding the fundamental principles and their use in practical machine learning problems.
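The coupling of inference (E-step) and learning (M-step) listed above can be sketched for a two-component one-dimensional Gaussian mixture. The data and initial parameters are made up for illustration and are not taken from the course:

```python
# Minimal EM sketch for a two-component 1-D Gaussian mixture.
# Hypothetical data and starting values, for illustration only.
import math

def normal_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm(data, mu, var, pi, n_iter=50):
    """Run EM; mu, var, pi are length-2 lists (means, variances, weights)."""
    for _ in range(n_iter):
        # E-step (inference): posterior responsibility of each component.
        resp = []
        for x in data:
            p = [pi[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step (learning): re-estimate parameters from responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            pi[k] = nk / len(data)
    return mu, var, pi

data = [-2.1, -1.9, -2.0, 1.8, 2.2, 2.0]  # two clusters near -2 and +2
mu, var, pi = em_gmm(data, mu=[-1.0, 1.0], var=[1.0, 1.0], pi=[0.5, 0.5])
print([round(m, 1) for m in mu])
```

Each iteration alternates between inferring the latent component assignments and re-learning the parameters from those soft assignments, which is the coupling the course examines in latent variable models.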
CS-E3210 / T-61.3050 Machine Learning: Basic Principles
Additionally, up to 6 bonus points can be earned from the exercises.
To pass, you need at least 15 points from the exam and 3 points from the project.