Topic outline

  • The information in SISU may be outdated; these web pages contain the up-to-date information. Course Slack:

    Contact information
    • If you have questions regarding the course, please send an email to .

    Course description

    The course introduces the fundamental and current topics of deep learning. After the course, the student understands the general principles of training deep neural networks (backpropagation, stochastic gradient descent, regularization) and knows the most common neural network architectures (convolutional and recurrent neural networks, graph neural networks and transformers); the student also has practical experience in implementing these models from scratch in PyTorch. In the weekly assignments, the students train deep neural networks for various tasks, including image classification, machine translation, solving reasoning problems, few-shot learning and generative modeling. The course covers the most recent advances (such as unsupervised and self-supervised deep learning) to give the student a good starting position for doing research in this field.
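    The training principles mentioned above can be sketched in a few lines. The following is a minimal illustration (not the course's own code, and using numpy rather than PyTorch): a one-hidden-layer network trained from scratch with backpropagation and gradient-descent updates on a hypothetical toy task.

    ```python
    import numpy as np

    # Toy data (hypothetical): XOR-like labels that a linear model cannot fit.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

    # One hidden layer of 8 tanh units, sigmoid output.
    W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
    W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
    lr = 0.1  # learning rate

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(X):
        h = np.tanh(X @ W1 + b1)           # hidden activations
        return h, sigmoid(h @ W2 + b2)     # predicted probabilities

    def bce(p, y):
        # binary cross-entropy loss (small epsilon for numerical safety)
        return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

    initial_loss = bce(forward(X)[1], y)

    for step in range(2000):
        h, p = forward(X)
        # Backpropagation: gradient of the mean BCE w.r.t. the output logits,
        # then through the tanh hidden layer via the chain rule.
        dlogits = (p - y) / len(X)
        dW2 = h.T @ dlogits; db2 = dlogits.sum(0)
        dh = (dlogits @ W2.T) * (1 - h ** 2)
        dW1 = X.T @ dh; db1 = dh.sum(0)
        # Gradient-descent parameter update.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    final_loss = bce(forward(X)[1], y)
    ```

    In the course itself, the same ideas are implemented in PyTorch, where automatic differentiation replaces the hand-derived gradients above.
    
    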

    The course is completed by returning the assignments (no exam).

    Prerequisites
    • NB: good knowledge of Python and numpy is required
    • linear algebra: vectors, matrices, eigenvalues and eigenvectors
    • basics of probability and statistics: sum rule, product rule, Bayes' rule, expectation, mean, variance, maximum likelihood, Kullback-Leibler divergence
    • basics of machine learning (recommended): supervised and unsupervised learning, overfitting
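    Two of the probability prerequisites above can be illustrated in a short numpy snippet (the numbers are hypothetical, chosen only for illustration): Bayes' rule applied to a diagnostic-test example, and the Kullback-Leibler divergence between two discrete distributions.

    ```python
    import numpy as np

    # Bayes' rule: P(A|B) = P(B|A) P(A) / P(B), with P(B) from the sum rule.
    # Hypothetical numbers: a rare condition and an imperfect test.
    p_disease = 0.01
    p_pos_given_disease = 0.95
    p_pos_given_healthy = 0.05
    p_pos = (p_pos_given_disease * p_disease
             + p_pos_given_healthy * (1 - p_disease))
    p_disease_given_pos = p_pos_given_disease * p_disease / p_pos

    # Kullback-Leibler divergence D(p || q) = sum_i p_i log(p_i / q_i);
    # it is non-negative and zero only when p == q.
    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])
    kl = np.sum(p * np.log(p / q))
    ```

    Note that even with a 95%-sensitive test, the posterior probability of disease given a positive result stays low because the prior is small, which is the kind of reasoning the course assumes as background.
    
    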

    Course contents
    • Introduction to deep learning
    • Optimization methods
    • Regularization methods
    • Convolutional neural networks
    • Recurrent neural networks
    • Attention-based models
    • Graph neural networks
    • Deep learning with few labeled examples
    • Deep autoencoders
    • Flow-based and autoregressive generative models
    • Generative adversarial networks
    • Unsupervised learning via denoising
    • Large language models