Discrete Inference and Learning

Yuliya Tarabalka, Nikos Paragios, Karteek Alahari



Discrete optimization provides a very general and flexible modeling paradigm that is ubiquitous across research areas such as machine learning and computer vision. As a result, the related optimization methods now form an indispensable computational tool for a wide variety of inference and learning tasks. The aim of this course is to introduce students to the relevant concepts and techniques of discrete inference and learning and to familiarize them with how these methods can be applied. We will cover state-of-the-art algorithms for energy minimization, marginal computation, and parameter estimation of rich, expressive models, focusing not only on the accuracy but also on the efficiency of the resulting methods.
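As a first taste of the kind of problem the course studies, here is a minimal sketch (not part of the course materials) of exact MAP inference on a chain-structured MRF by min-sum dynamic programming, which is what belief propagation reduces to on trees. The energy, labels, and potentials below are made-up illustrative values.

```python
def map_chain(unary, pairwise):
    """MAP labeling of a chain MRF minimizing
    E(x) = sum_i unary[i][x_i] + sum_i pairwise[x_i][x_{i+1}]."""
    n, L = len(unary), len(unary[0])
    cost = list(unary[0])           # best energy of prefixes ending in each label
    back = []                       # backpointers for recovering the argmin
    for i in range(1, n):
        new_cost, ptr = [], []
        for l in range(L):
            # cheapest previous label given that node i takes label l
            best = min(range(L), key=lambda k: cost[k] + pairwise[k][l])
            new_cost.append(cost[best] + pairwise[best][l] + unary[i][l])
            ptr.append(best)
        cost, back = new_cost, back + [ptr]
    # backtrack from the best final label
    x = [min(range(L), key=lambda l: cost[l])]
    for ptr in reversed(back):
        x.append(ptr[x[-1]])
    return x[::-1]

# Toy data: 4 nodes, 2 labels; Potts pairwise term (cost 1 per label change)
unary = [[0, 2], [2, 0], [2, 0], [0, 2]]
potts = [[0, 1], [1, 0]]
print(map_chain(unary, potts))  # → [0, 1, 1, 0]
```

The same message-passing idea, run on loopy graphs, gives the (approximate) belief propagation algorithm covered in the first lecture.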


All classes will be held at the new Gif-sur-Yvette campus of CentraleSupelec, in either the Bouygues or the Eiffel building. See the schedule below for details.


16/10 | Bouygues: Amphi e.093 | Introduction [slides]; Belief Propagation [slides]
23/10 | Bouygues: Amphi e.093 | Maximum flow, minimum cut [slides: pdf, pptx]
30/10 | Eiffel: Amphi VI      | Move-making algorithms, TRW [slides]
06/11 | Eiffel: Amphi VII     | Primal-dual schema, Dual decomposition [slides]
20/11 | Eiffel: Amphi III     | Parameter Learning I [slides]
27/11 | Eiffel: Amphi VI      | Classical Learning [slides]
04/12 | Eiffel: Amphi III     | Modern Learning [slides]
20/12 |
22/12 | Exam (3h)
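Several of the lectures above (maximum flow / minimum cut, move-making algorithms) build on combinatorial max-flow solvers. As a hedged illustration only, the sketch below implements the classic Edmonds-Karp algorithm on a tiny hand-made graph; graph-cut methods for MRF energy minimization rely on exactly this primitive, though practical implementations use specialized solvers.

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max flow. cap is an n x n capacity matrix;
    returns the value of a maximum s-t flow."""
    n = len(cap)
    res = [row[:] for row in cap]       # residual capacities
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if res[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:             # no augmenting path left: flow is maximum
            return flow
        # find the bottleneck along the path, then augment
        path, v = [], t
        while v != s:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(res[u][v] for u, v in path)
        for u, v in path:
            res[u][v] -= aug
            res[v][u] += aug            # reverse residual edge
        flow += aug

# Toy graph: source 0, sink 3
cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 2],
       [0, 0, 0, 0]]
print(max_flow(cap, 0, 3))  # → 4
```

By max-flow/min-cut duality, the value returned also equals the capacity of a minimum s-t cut, which is what encodes the optimal binary labeling in graph-cut inference.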




Bibliography
Convex Optimization, Stephen Boyd and Lieven Vandenberghe
Numerical Optimization, Jorge Nocedal and Stephen J. Wright
Introduction to Operations Research, Frederick S. Hillier and Gerald J. Lieberman
An Analysis of Convex Relaxations for MAP Estimation of Discrete MRFs, M. Pawan Kumar, Vladimir Kolmogorov and Phil Torr
Convergent Tree-reweighted Message Passing for Energy Minimization, Vladimir Kolmogorov