Each day is organized around two sessions, from 9:00am to 12:00pm and from 5:10pm to 6:30pm. Lunch is at 12:30pm sharp. The afternoon (from 2pm to 5pm) is then free for scientific discussions, possibly on skis.
Note that there is no morning session on Wednesday. Talks resume at 5:10pm.
Program
Monday, January 12:
Session 1
- 08:50–09:00 Welcome
- 09:00–09:40 Shai Shalev-Shwartz (The Hebrew University)
Stochastic Optimization for Deep Learning, [Slides]
- 09:40–10:20 Tong Zhang (Rutgers University)
Modern Optimization Techniques for Big Data Machine Learning.
- 10:20–10:40 Break
- 10:40–11:20 Jean-Philippe Vert (Mines ParisTech)
New matrix norms for sparse and low-rank matrix estimation, [Slides]
- 11:20–12:00 Arnak Dalalyan (CREST-ENSAE)
Guarantees for Sampling from a log-concave density by Langevin Monte Carlo, [Paper]
Session 2
- 17:10–17:50 Yurii Nesterov (Université Catholique de Louvain)
New primal-dual subgradient methods for convex optimization problems with functional constraints, [Slides]
- 17:50–18:30 Sébastien Bubeck (Microsoft Research)
The entropic barrier: a simple and optimal universal self-concordant barrier, [Paper]
Tuesday, January 13:
Session 1
- 09:00–09:40 Jean-Bernard Lasserre (CNRS, LAAS)
Reconstruction of algebraic-exponential data from moments, [Slides]
- 09:40–10:20 Venkat Chandrasekaran (Caltech)
Relative Entropy Relaxations for Signomial Optimization.
- 10:20–10:40 Break
- 10:40–11:20 Ken Clarkson (IBM Almaden)
Sketching and Sampling for M-estimators, [Slides]
- 11:20–12:00 Aleksander Madry (MIT)
Interior-point Methods and the Maximum Flow Problem, [Slides]
Session 2
- 17:10–17:50 John Lafferty (University of Chicago)
High dimensional convex function estimation.
- 17:50–18:30 Marco Cuturi (Kyoto University)
The Wasserstein Barycenter Problem, [Slides]
Wednesday, January 14:
- 17:10–17:50 Alekh Agarwal (Microsoft Research)
Learning sparsely used overcomplete dictionaries, [Slides]
- 17:50–18:30 Peter Richtarik (University of Edinburgh)
Coordinate descent methods with arbitrary sampling, [Slides]
Thursday, January 15:
Session 1
- 09:00–09:40 Elad Hazan (Princeton University)
Overcoming NP-hardness by agnostic non-proper learning, [Slides]
- 09:40–10:20 Afonso Bandeira (Princeton University)
Tightness of SDP relaxations for certain inverse problems on graphs.
- 10:20–10:40 Break
- 10:40–11:20 Sasha Rakhlin (University of Pennsylvania)
Randomized methods for 0th order optimization.
- 11:20–12:00 Lorenzo Rosasco (MIT, IIT)
Iterative Convex Regularization, [Slides]
Session 2
- 17:10–17:50 Robert Krauthgamer (Weizmann Institute)
Spectral Approaches to Nearest Neighbor Search, [Slides]
- 17:50–18:30 Vladimir Spokoiny (Weierstrass Institute)
Bootstrap confidence sets under model misspecification.
Friday, January 16:
- 09:00–09:40 Constantine Caramanis (UT Austin)
On Detecting Epidemics from Weak Signatures.
- 09:40–10:20 Fajwel Fogel (École Normale Supérieure)
Seriation and ranking problems: spectral approach, [Slides]
- 10:20–10:40 Break
- 10:40–11:20 Karthik Sridharan (Cornell University)
Relaxations: Deriving algorithms for learning and optimization.
- 11:20–12:00 John Duchi (Stanford)
Randomized smoothing techniques in optimization.