Foundations of Distributed and Large Scale Computing Optimization
E. CHOUZENOUX
Learning Theory

Course objective

The objective of this course is to introduce the theoretical background needed to develop efficient algorithms for large-scale optimization problems, taking advantage of modern multicore or distributed computing architectures. The course focuses mainly on nonlinear optimization tools for convex problems. Proximal tools, splitting techniques, and Majorization-Minimization strategies, which are now very popular for processing massive datasets, will be presented. Illustrations of these methods on various applicative examples will be provided.
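To give a flavor of the proximal tools mentioned above, here is a minimal sketch (not course material) of a proximal gradient (ISTA) iteration applied to an l1-regularized least-squares problem; the matrix `A`, vector `b`, and regularization weight `lam` are illustrative placeholders.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximity operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, n_iter=200):
    """Minimize 0.5*||A x - b||^2 + lam*||x||_1 with proximal gradient (ISTA)."""
    x = np.zeros(A.shape[1])
    # Lipschitz constant of the gradient of the smooth data-fit term
    L = np.linalg.norm(A, 2) ** 2
    step = 1.0 / L
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                          # gradient step on the smooth term
        x = soft_threshold(x - step * grad, step * lam)   # proximal step on the l1 penalty
    return x

if __name__ == "__main__":
    # Illustrative synthetic problem with a sparse ground truth
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    x_hat = proximal_gradient(A, b, lam=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```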



Session organization

7 sessions (3h each) + exam (2h)

Assessment

2 lab reports + exam

References

Bauschke, H. H. and Combettes, P. L.: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer, New York, 2011.

Parikh, N. and Boyd, S.: Proximal Algorithms. Foundations and Trends in Optimization, vol. 1, no. 3, Jan. 2014.

Topics covered

* Large scale optimization

* Proximal algorithms

* Majorization-Minimization methods

* Parallel optimization

* Distributed optimization strategies

Instructors

Emilie CHOUZENOUX

(CentraleSupélec & INRIA Saclay)
