Prerequisites
- An undergraduate course in probability.
- It is recommended to have taken either P. Latouche and N. Chopin's course on "Probabilistic graphical models" or S. Allassonière's course on "Computational statistics" during the first semester.
- Practicals will be in Python and R. A basic knowledge of both languages is required. Nothing fancy: students should simply be able to read and write simple programs and load libraries; going through a basic online tutorial in both languages should be enough.
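As a rough illustration of the expected level (this snippet is not part of the course material, and the library and variable names are only examples), students should be comfortable reading and writing Python code of the following kind, and its equivalent in R:

    import numpy as np  # load a standard library

    # Draw 1000 samples from a standard normal distribution and print their mean.
    samples = np.random.normal(loc=0.0, scale=1.0, size=1000)
    print("empirical mean:", samples.mean())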
Course objectives
By the end of the course, the students should
- have a high-level view of the main approaches to making decisions under uncertainty.
- be able to detect when being Bayesian helps and why.
- be able to design and run a Bayesian ML pipeline for standard supervised or unsupervised learning.
- have a global view of the current limitations of Bayesian approaches and the research landscape.
- be able to understand the abstract of most Bayesian ML papers.
Course organization
* 8×3 hours of lectures and practical sessions.
* 4 hours of "student seminars" for the assessment.
* All classes and material will be in English. Students may write their final report either in French or English.
Assessment
- Students form groups. Each group reads and reports on a research paper from a list. We strongly encourage a dash of creativity: students should identify a weak point, shortcoming or limitation of the paper, and try to push in that direction. This can mean extending a proof, implementing an additional feature, running further experiments, etc.
- Deliverables are a short report and a short oral presentation in front of the class, in the form of a student seminar, which will take place during the last lecture.
References
Parmigiani, G. and Inoue, L. 2009. Decision theory: principles and approaches. Wiley.
Robert, C. 2007. The Bayesian choice. Springer.
Murphy, K. 2012. Machine learning: a probabilistic perspective. MIT Press.
Ghosal, S. and van der Vaart, A. W. 2017. Fundamentals of nonparametric Bayesian inference. Cambridge University Press.
Topics covered
- Decision theory
- 50 shades of Bayes: Subjective and objective interpretations
- Bayesian supervised and unsupervised learning
- Bayesian computation for ML: Advanced Monte Carlo and variational methods
- Bayesian nonparametrics
- Bayesian methods for deep learning
Instructors
Rémi BARDENET
(CNRS, Univ. Lille)
Julyan ARBEL
(Inria, Univ. Grenoble-Alpes)
Gabriel VICTORINO CARDOSO
(Mines Paris)