
StatMathAppli 2017

Statistics Mathematics and Applications

September 4 to 8, 2017
La Villa Clythia, Fréjus - Var (France)  

Aims:

The seminar "Statistics Mathematics and Applications" follows the seminars held every two years at Luminy (2006, 2004, ...) and at Fréjus in 2008, 2010, 2011, 2013 and 2015.
 
The idea of this seminar is to give young statisticians from several countries an opportunity to meet and present their work at an international meeting. Two series of invited lectures on mathematical statistics and its real-life applications will be given by outstanding speakers.

Invited speakers:

This year, the invited lectures will be given by:

  • Francis BACH, INRIA Paris (http://www.di.ens.fr/~fbach/)
Large-scale machine learning and convex optimization.
Many statistics, machine learning and signal processing problems are traditionally cast as convex optimization problems. A common difficulty in solving these problems is the size of the data: there are many observations ("large n") and each observation is high-dimensional ("large p"). In this setting, online algorithms such as stochastic gradient descent, which pass over the data only once, are usually preferred to batch algorithms, which require multiple passes over the data. Given n observations/iterations, the optimal convergence rate of these algorithms is O(1/√n) for general convex functions, and reaches O(1/n) for strongly convex functions.
In this tutorial, I will first present the classical results in stochastic approximation and relate them to classical optimization and statistics results. I will then show how the smoothness of loss functions may be used to design novel algorithms with improved behavior, both in theory and in practice: in the ideal infinite-data setting, an efficient novel Newton-based stochastic approximation algorithm leads to a convergence rate of O(1/n) without strong-convexity assumptions, while in the practical finite-data setting, an appropriate combination of batch and online algorithms leads to unexpected behavior, such as a linear convergence rate for strongly convex problems, with an iteration cost similar to that of stochastic gradient descent.
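The single-pass online scheme contrasted with batch methods above can be sketched on a synthetic least-squares problem. This is a minimal illustration, not the tutorial's algorithm; the problem sizes, step-size schedule and function name are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: n observations ("large n"), p features.
n, p = 5000, 20
X = rng.standard_normal((n, p))
w_true = rng.standard_normal(p)
y = X @ w_true + 0.1 * rng.standard_normal(n)

def sgd_single_pass(X, y, step=0.1):
    """One pass of stochastic gradient descent with a decaying step size.

    Each iteration touches a single observation, so the cost per step is
    O(p) regardless of n; this is why online methods are preferred when
    multiple passes over the data are too expensive.
    """
    n, p = X.shape
    w = np.zeros(p)
    for t, i in enumerate(rng.permutation(n)):  # visit each observation once
        grad = (X[i] @ w - y[i]) * X[i]         # gradient of one squared loss
        w -= step / np.sqrt(t + 1) * grad       # O(1/sqrt(t)) step size
    return w

w_hat = sgd_single_pass(X, y)
err0 = np.linalg.norm(w_true)           # error of the zero initialization
err = np.linalg.norm(w_hat - w_true)    # error after one pass
print(err < err0)
```

A batch gradient method would instead compute the full gradient X.T @ (X @ w - y) / n at each step, at cost O(np) per iteration, which motivates the batch/online trade-off discussed in the lectures.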

  • Andrea MONTANARI, Stanford University

Statistics with matrices, graphs, and tensors

Many modern datasets take the form of matrices or tensors. Applications range from collaborative filtering to network analysis, to imaging. I will survey some basic statistical models, and techniques to deal with them. Possible topics will include:
1) The spiked matrix model.
2) Semidefinite programming relaxations.
3) Z2 synchronization and the two-groups stochastic block model.
4) Sparse low-rank matrices, and the hidden clique problem.
5) Message passing algorithms.
6) Estimating low-rank tensors.

News

Symposium poster 2017

The 2017 edition is launched.
EDITION 2015

Back to the 2015 edition
MIA-Jouy

MIA-Jouy merged with MIG to create the MaIAGE Unit on January 1, 2015

A new unit at INRA Jouy-en-Josas
