Due to the COVID situation, the class is going online for two weeks, on Dec. 9th and Dec. 16th.

You will be given videos corresponding to the lectures (about one hour per session), followed by an interactive discussion on Zoom from 12:15pm to 12:45pm. The Zoom link is available at the same URL that gives you access to the data challenge. See the table below for instructions on the videos.

Course information

Statistical learning is about the construction and study of systems that can automatically learn from data. With the emergence of the massive datasets commonly encountered today, the need for powerful machine learning methods is acute. Examples of successful applications include effective web search, anti-spam software, computer vision, robotics, practical speech recognition, and a deeper understanding of the human genome. This course gives an introduction to this exciting field, with a strong focus on kernels as a versatile tool to represent data, in combination with (un)supervised learning techniques that are agnostic to the type of data being learned from. The learning techniques covered include regression, classification, clustering and dimension reduction. We will cover both the theoretical underpinnings of kernels and a series of kernels that are important in practical applications. Finally, we will touch upon topics of active research, such as large-scale kernel methods and the use of kernel methods to develop theoretical foundations of deep learning models.
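To give a concrete flavor of the kernel approach to regression described above, here is a minimal sketch of kernel ridge regression with a Gaussian (RBF) kernel, written for illustration only (it is not part of the course material; the function names and parameter values are our own):

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

def fit_krr(X, y, lam=0.1, sigma=1.0):
    # By the representer theorem, the solution is f(x) = sum_i alpha_i k(x_i, x),
    # with alpha solving (K + lam * n * I) alpha = y.
    n = len(X)
    K = rbf_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict_krr(X_train, alpha, X_test, sigma=1.0):
    return rbf_kernel(X_test, X_train, sigma) @ alpha

# Toy 1-D regression problem: y = sin(x) + noise.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

alpha = fit_krr(X, y, lam=0.01, sigma=0.5)
y_hat = predict_krr(X, alpha, X, sigma=0.5)
```

The same data-agnostic structure — build a Gram matrix, then solve a linear problem in the dual — carries over to the other kernel methods listed in the outline.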


  • project (data challenge) (50%) + final exam (50%)
The data challenge has started; follow the link given during the class.

Course outline

  • Theory of RKHS and kernels
  • Supervised learning with kernels
  • Unsupervised learning with kernels
  • Kernels for structured data
  • Kernels for generative models
  • Theoretical foundations of deep learning with kernels
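As an appetizer for the unsupervised part of the outline, the following is a minimal kernel PCA sketch (an illustration under our own naming and parameter choices, not course code): form a Gaussian Gram matrix, center it in feature space, and project onto its leading eigenvectors:

```python
import numpy as np

def kernel_pca(X, n_components=2, sigma=1.0):
    # Gaussian Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2 * sigma ** 2))
    # Center in feature space: K_c = (I - 1/n) K (I - 1/n)
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    # Leading eigenpairs of the centered Gram matrix
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]
    w, V = w[idx], V[:, idx]
    # Dual coefficients are V / sqrt(w); embedding = Kc @ (V / sqrt(w))
    return Kc @ V / np.sqrt(w)

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 3))
Z = kernel_pca(X, n_components=2, sigma=2.0)  # shape (30, 2)
```

Note that everything is expressed through the Gram matrix: the kernel can later be swapped for one of the structured-data or sequence kernels from the outline without changing this code.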


The slides, available here, are those of the MVA master program. The video lectures are also available here.

Date Time Lecturer Room Topic
30/09/2021 11:15 - 12:45 JM D207 Positive definite kernels, slides 1-28
14/10/2021 11:15 - 12:45 JM D208 RKHS, slides 29-47
21/10/2021 11:15 - 12:45 JM D208 Kernel tricks, slides 48-65
28/10/2021 11:15 - 12:45 JM D123 Kernel ridge regression, slides 66-97
18/11/2021 11:15 - 12:45 JM D211 Learning theory, large margin classifiers, slides 104-132
25/11/2021 11:15 - 12:45 JM D211 Support vector machines, slides 133-165
02/12/2021 11:15 - 12:45 JM D211 Kernel PCA, kernel K-means
09/12/2021 12:15 - 12:45 JM D211 Spectral clustering, kernel CCA (video of Lecture 10)
16/12/2021 12:15 - 12:45 JM D211 Green, Mercer, Bochner kernels (video of Lecture 12)
06/01/2022 11:15 - 12:45 JM D211 Kernels for sequences
13/01/2022 11:15 - 12:45 JM D211 Large-scale learning with kernels
20/01/2022 11:15 - 12:45 JM D211 Deep learning with kernels

Reading material

Machine Learning and Statistics

  • Vapnik. The Nature of Statistical Learning Theory. Springer.
  • Hastie, Tibshirani, Friedman. The Elements of Statistical Learning. (free online)
  • Shawe-Taylor, Cristianini. Kernel Methods for Pattern Analysis. 2004.

Jobs / Internships Opportunities

We have several internship and PhD opportunities in machine learning, image processing, bioinformatics and computer vision. It is best to discuss this with us early, since the number of positions is limited.