Graphical models (or probabilistic graphical models) provide a powerful
paradigm to jointly exploit probability theory and graph theory for solving
complex real-world problems. They form an indispensable component in several
research areas, such as statistics, machine learning, and computer vision, where a
graph expresses the conditional (probabilistic) dependencies among random
variables.
This course will focus on discrete models, that is, cases where the random variables of the graphical model are discrete. After an introduction to the basics of graphical models, the course will turn to problems in representation, inference, and learning of graphical models. We will cover classical as well as state-of-the-art algorithms for these problems. Several applications in machine learning and computer vision will be studied as part of the course.
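To make the idea concrete, here is a minimal sketch (not part of the course materials; the probability tables below are made up purely for illustration) of a three-variable discrete chain. It shows how the graph structure lets us compute a marginal without building the full joint table, which is the idea underlying the inference algorithms covered in the course.

```python
# Toy sketch: a three-variable discrete chain X1 -> X2 -> X3.
# The graph encodes that X3 is conditionally independent of X1 given X2,
# so the joint factorizes as p(x1, x2, x3) = p(x1) p(x2 | x1) p(x3 | x2).
# All numbers are illustrative, not taken from the course.
import numpy as np

p_x1 = np.array([0.6, 0.4])            # p(x1), 2 states
p_x2_given_x1 = np.array([[0.7, 0.3],  # rows: x1, cols: x2
                          [0.2, 0.8]])
p_x3_given_x2 = np.array([[0.9, 0.1],  # rows: x2, cols: x3
                          [0.5, 0.5]])

# Naive approach: build the full joint table (exponential in the number
# of variables in general), then sum out x1 and x2.
joint = (p_x1[:, None, None]
         * p_x2_given_x1[:, :, None]
         * p_x3_given_x2[None, :, :])
marginal_x3_naive = joint.sum(axis=(0, 1))

# Exploiting the chain structure: push the sums inward, one variable at a
# time (the idea behind variable elimination / belief propagation).
p_x2 = p_x1 @ p_x2_given_x1
marginal_x3_fast = p_x2 @ p_x3_given_x2

assert np.allclose(marginal_x3_naive, marginal_x3_fast)
print(marginal_x3_fast)
```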
All classes will be held at the new Gif-sur-Yvette campus of CentraleSupelec, in room EB.106 of the Eiffel building.
Date | Topic
27/11 | Introduction to the course [slides]; Graphical Models [slides]
4/12 | Belief propagation [slides]
11/12 | Graph cuts [slides]
18/12 | Primal-dual + Dual decomposition [slides] + Paper 1 presentation
8/1 | Learning I [slides]
15/1 | Bayesian Networks [slides] + Paper 2 presentation
22/1 | Learning: CNNs [slides]
12/2 | Modern Learning [slides] + Recommender Systems [slides] + Paper 3 presentation
19/2 | Exam (3h)