Course information

Statistical learning is about the construction and study of systems that can automatically learn from data. With the massive datasets commonly encountered today, the need for powerful machine learning tools is acute. Examples of successful applications include effective web search, anti-spam software, computer vision, robotics, practical speech recognition, and a deeper understanding of the human genome. This course gives an introduction to this exciting field, with a strong focus on kernels as a versatile tool to represent data, in combination with (un)supervised learning techniques that are agnostic to the type of data they learn from. The learning techniques covered include regression, classification, clustering, and dimension reduction. We will cover both the theoretical underpinnings of kernels and a series of kernels that are important in practical applications. Finally, we will touch upon topics of active research, such as large-scale kernel methods and the use of kernel methods to develop theoretical foundations of deep learning models.
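To give a concrete flavour of the methods taught in the course, here is a minimal, self-contained NumPy sketch of kernel ridge regression with a Gaussian kernel. The toy data, the bandwidth sigma, and the regularization parameter lam are illustrative choices, not material from the course slides:

    import numpy as np

    def gaussian_kernel(X, Y, sigma=1.0):
        # Pairwise Gaussian kernel matrix: K[i, j] = exp(-||x_i - y_j||^2 / (2 * sigma^2)).
        sq_dists = (np.sum(X ** 2, axis=1)[:, None]
                    + np.sum(Y ** 2, axis=1)[None, :]
                    - 2 * X @ Y.T)
        return np.exp(-sq_dists / (2 * sigma ** 2))

    # Toy one-dimensional regression problem.
    rng = np.random.default_rng(0)
    X_train = rng.uniform(-3, 3, size=(50, 1))
    y_train = np.sin(X_train[:, 0]) + 0.1 * rng.standard_normal(50)

    # Kernel ridge regression: solve (K + n * lam * I) alpha = y,
    # then predict with f(x) = sum_i alpha_i k(x_i, x).
    lam = 0.1
    n = len(X_train)
    K = gaussian_kernel(X_train, X_train)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)

    X_test = np.linspace(-3, 3, 200)[:, None]
    y_pred = gaussian_kernel(X_test, X_train) @ alpha

A key point developed in the first lectures is that these few lines work unchanged for any valid positive definite kernel, including kernels on structured data such as strings or graphs.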

Evaluation

  • Project (data challenge): 50%; final exam: 50%

Course outline

  • Theory of RKHS and kernels
  • Supervised learning with kernels
  • Unsupervised learning with kernels
  • Kernels for structured data
  • Kernels for generative models
  • Theoretical foundations of deep learning with kernels

Calendar

Each 90-minute lecture will consist of (i) a one-hour video, provided a few days in advance on a streaming platform, and (ii) an interactive Zoom session. The links will be provided on the course mailing list only. If you want to register for the class, please contact me directly. The slides, available here, are those of the MVA master's program.

  • 26/11/2020, 11:15-12:45 (JM): Kernels, definition of RKHS. Material: the video of Lecture 1 and the first 15 minutes of Lecture 2, up to and including slide 28.
  • 3/12/2020, 11:15-12:45 (JM): RKHS, examples. Material: the remaining part of Lecture 2 and the first 30 minutes of Lecture 3, up to and including slide 51.
  • 10/12/2020, 9:45-12:45 (JM): Smoothness functional, kernel trick, kernel ridge regression. Material: the last part of Lecture 3 and the videos of Lectures 4, 5, and 6 (the part on kernel logistic regression is optional); this corresponds to slides 51-104. We will also discuss Exercise 1.
  • 17/12/2020, 9:45-12:45 (JM): Large-margin classifiers, support vector machines. Material: the videos of Lectures 7 and 8; this corresponds to slides 115-166. We will also discuss Exercise 5.
  • 07/01/2021, 9:45-12:45 (JM): Kernel PCA, kernel k-means, spectral clustering, kernel CCA. Material: the videos of Lectures 9 and 10; this corresponds to slides 166-202.
  • 14/01/2021, 9:45-12:45 (JM): String kernels; Green, Mercer, and Bochner kernels. Material: the videos of Lectures 11 and 12.
  • 21/01/2021, 9:45-12:45 (JM): Large-scale learning, deep learning with kernels. Material: the video of Lecture 13 and the first 54 minutes of Lecture 14. Optional: Lectures 15 and 16.

Reading material

Machine Learning and Statistics

  • V. Vapnik. The Nature of Statistical Learning Theory. Springer.
  • T. Hastie, R. Tibshirani, J. Friedman. The Elements of Statistical Learning. Springer (freely available online).
  • J. Shawe-Taylor, N. Cristianini. Kernel Methods for Pattern Analysis. Cambridge University Press, 2004.

Job / Internship Opportunities

We have various internship and PhD opportunities in machine learning, image processing, bioinformatics, and computer vision. It is best to discuss this with us early, since the number of positions is limited.