Deep Learning (5 ECTS)
Special Course in Computer and Information Science
Teachers: Prof. Tapani Raiko, Dr. Jyri Kivinen, MSc Pyry Takala, MSc Mudassar Abbas, BSc Yao Lu
Deep neural networks that learn to represent data in multiple layers of increasing abstraction have dramatically improved the state of the art in speech recognition, computer vision, predicting the activity of drug molecules, and many other tasks. Deep learning discovers intricate structure in large datasets by building distributed representations.
The course is based on the draft of the forthcoming MIT Press book "Deep Learning" by Yoshua Bengio, Ian Goodfellow and Aaron Courville, available at http://www.iro.umontreal.ca/~bengioy/dlbook/
Required background knowledge:
T-61.3050 Machine Learning: Basic Principles
T-61.5130 Machine Learning and Neural Networks P
T-61.5140 Machine Learning: Advanced Probabilistic Methods P
Python programming, linear algebra, probability theory.
To pass the course, you will need to return reports on exercises (mini-projects that build up progressively during the course) and on team work. For implementing the projects, Python with Theano is recommended; Matlab is allowed but will require more work. Neural network toolboxes or libraries (e.g. Blocks, PyLearn, Lasagne) are not allowed.
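To give a sense of what "no neural network libraries" means in practice, here is a minimal sketch (a hypothetical illustration, not course material) of a one-hidden-layer network trained from scratch on the XOR problem, using only NumPy and hand-derived gradients:

```python
import numpy as np

# Hypothetical example: a tiny MLP implemented from scratch (no NN libraries),
# trained with plain gradient descent on XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters: 2 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(scale=1.0, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=1.0, size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)          # hidden activations
    p = sigmoid(h @ W2 + b2)          # output probabilities
    loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

    # Backward pass with hand-derived gradients (cross-entropy + sigmoid
    # gives the simple p - y form at the output).
    dlogits = (p - y) / len(X)
    dW2 = h.T @ dlogits
    db2 = dlogits.sum(axis=0)
    dh = (dlogits @ W2.T) * (1 - h ** 2)   # tanh derivative
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)

    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final cross-entropy loss:", loss)
```

The course mini-projects are considerably larger, but this is the level at which forward and backward computations are expected to be written out by hand.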
The grading scale is fail / pass / pass with distinction.
The workload (5 ECTS = 133 hours) is divided as follows.
Sessions: 12 * 2 hours = 24 hours.
Independent study (reading the book): 50 hours.
Progressive mini-projects and exercises: 8 * 5 hours = 40 hours.
Team work: 19 hours.