Inner product spaces, Kernels, Reproducing kernels, and RKHS. Introductory learning theory and Generalization. Empirical Risk Minimization, Uniform Convergence and Rademacher Complexity. Kernel Ridge Regression and Logistic Regression. Optimization and Duality. Margin-based methods and Support vector machines. Unsupervised learning including clustering, PCA and their kernel variants.
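As a small taste of the first topic, a valid kernel always yields a symmetric positive semi-definite Gram matrix. The sketch below (an illustration, not course material; the Gaussian bandwidth `gamma=1.0` is an arbitrary choice) builds an RBF Gram matrix with NumPy and checks both properties numerically:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-gamma * sq_dists)

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))
K = rbf_kernel(X, X)

# Positive definiteness of the kernel implies K is symmetric PSD:
assert np.allclose(K, K.T)
assert np.linalg.eigvalsh(K).min() > -1e-10  # eigenvalues non-negative up to rounding
```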
Course position and Prerequisites
This is an advanced MSc-level course in machine learning, targeted at first- and second-year MSc students in Machine Learning and Computer Science. It is also suitable for PhD students.
- The course Machine Learning: Basic Principles (or equivalent knowledge) is strongly recommended
- Python programming skills are recommended (the course material will use Python, and there will be a short introductory Python programming session)
After attending the course, the student knows the basics of kernels, positive definiteness, and RKHS. The course also formally introduces the notion of generalization in machine learning by studying the principle of Empirical Risk Minimization and its consistency. The student knows how convex optimization methods can be used to efficiently train kernel-based and large-scale linear models. The course also discusses how to apply kernel-based methods to unsupervised learning, such as PCA.
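One course topic, kernel ridge regression, combines these ingredients: by the representer theorem the solution lives in the span of the training points, and the coefficients solve a regularized linear system. A minimal sketch (my own illustration, not the course's code; the RBF kernel, `gamma`, and `lam` are arbitrary choices) is:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian kernel matrix between rows of X and rows of Y."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def krr_fit(K, y, lam=0.01):
    """Kernel ridge regression: solve (K + lam * n * I) alpha = y."""
    n = len(y)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

# Fit a noisy sine curve from 50 one-dimensional samples.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)

K = rbf_kernel(X, X)
alpha = krr_fit(K, y)
pred = K @ alpha  # predictions on the training inputs
```

Predictions at new points use the cross-kernel matrix `rbf_kernel(X_new, X) @ alpha`, so training data must be kept around at test time, a characteristic cost of kernel methods.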
- Lecturer: Rohit Babbar
- Course assistants: Tolou Shadbahr, Ananth Mahadevan, Christabella Irwanto, and Ricardo Falcon-Perez
Grading (this has been updated in view of Covid-19; please refer to the announcement page for details - https://mycourses.aalto.fi/mod/forum/discuss.php?d=179293#p304356)
The course can be completed in two alternative ways:
- Assignments (max. 50 points) + Exam (max. 50 points), giving a grade 0–5. The assignment part has a weight of 40% and the exam 60%. In aggregate, the lowest passing total is 45 points; 85 points gives the grade 5.
- Exam only (max. 50 points), giving a grade 0–5. 20 points gives the grade 1; 45 points gives the grade 5.
The better of the two resulting grades counts.
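The grading rule above can be sketched in code. Note this is only an illustration of the arithmetic: the 40/60 weighted total and the linear interpolation between the pass and top thresholds are my assumptions, not official course policy, and the actual grade table is on the announcement page.

```python
def option1_total(assignment_pts, exam_pts):
    """Weighted total on a 0-100 scale: assignments 40%, exam 60%.
    Assumes each 50-point part is first rescaled to 100 (an assumption)."""
    return 0.4 * (assignment_pts / 50 * 100) + 0.6 * (exam_pts / 50 * 100)

def grade_from_total(total, pass_pts=45, top_pts=85):
    """Map a 0-100 total to a grade 0-5.
    Linear interpolation between thresholds is assumed, not stated policy."""
    if total < pass_pts:
        return 0
    if total >= top_pts:
        return 5
    return 1 + round(4 * (total - pass_pts) / (top_pts - pass_pts))

# Full marks on both parts gives the top grade; below 45 total fails.
best = max(grade_from_total(option1_total(50, 50)), 0)
```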
Language of Instruction - English
- Lecture slides and exercises are the examined content
- Further references will be provided during respective lectures
|Date and Time||Type/Location||Content|
|January 8, 12:15||Lecture 1||Basics and Introduction to Kernels|
|January 15, 12:15||Lecture 2||Kernel and Reproducing Kernel Hilbert Space|
|January 17, 16:00||Lab||Python crash course|
|January 22, 12:15||Lecture 3|| Representer Theorem and Polynomial Kernel Example|
|January 29, 12:15|| Lecture 4 || Introductory learning theory|
|February 05, 12:15|| Lecture 5 || Convex function and duality|
|February 12, 12:15|| Lecture 6|| Kernel Support Vector Machines|
|February 26, 12:15|| Lecture 7 || Kernel logistic regression|
|February 27, 16:15|| Lab|| Tutorial for Assignment 2|
|March 4, 12:15|| Lecture 8 || Unsupervised Learning |
|March 11, 12:15|| Lecture 9 || Applications of kernels for structured and multi-view data|
- Shawe-Taylor and Cristianini: Kernel Methods for Pattern Analysis, Cambridge University Press, 2004. Available as ebook: http://site.ebrary.com/lib/aalto/detail.action?docID=10131674
- B. Scholkopf, A. Smola: Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond. http://books.google.fi/books?isbn=0262194759
- Research papers provided during the course (See Materials).