Séminaire de Statistique

CREST, ENSAE,

Université Paris-Saclay

CMAP,

Ecole Polytechnique

Organizers:

C. Butucea

A. B. Tsybakov

E. Moulines

M. Rosenbaum

 

Monday, 2:00 pm - 3:15 pm, Room 3001, ENSAE

 

Sept 2017 | Oct 2017 | Nov 2017 | Dec 2017 | Jan 2018 | Feb 2018 | Mar 2018 | Apr 2018 | May 2018 | June 2018

Sept 11 2017

Arnak Dalalyan

ENSAE

Title : User-friendly bounds for sampling from a log-concave density using Langevin Monte Carlo

 

Abstract : We will present new bounds on the sampling error in the case where the target distribution has a smooth and log-concave density. These bounds are established for the Langevin Monte Carlo algorithm and its discretized versions involving the Hessian matrix of the log-density. We will also discuss the case where accurate evaluation of the gradient is impossible.
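
For illustration only, here is a minimal sketch (not the speaker's code) of the unadjusted Langevin iteration for a toy smooth, log-concave target; the standard Gaussian potential, step size, and iteration count below are assumptions made for the example.

```python
import numpy as np

def grad_potential(theta):
    # Gradient of f(theta) = ||theta||^2 / 2, i.e. a standard Gaussian target
    # proportional to exp(-f); any smooth log-concave potential could be used here.
    return theta

def langevin_monte_carlo(d=2, n_iter=10_000, h=0.01, seed=None):
    """Unadjusted Langevin algorithm: theta <- theta - h*grad f(theta) + sqrt(2h)*xi."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(d)
    samples = np.empty((n_iter, d))
    for k in range(n_iter):
        theta = theta - h * grad_potential(theta) + np.sqrt(2 * h) * rng.standard_normal(d)
        samples[k] = theta
    return samples

samples = langevin_monte_carlo(seed=0)
print(samples.mean(axis=0), samples.var(axis=0))  # should be close to 0 and 1
```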

Sept 18 2017

Séminaire Parisien de Statistique - IHP

 

Sept 25 2017

Mathias Trabs

Universität Hamburg

Title : Volatility estimation for stochastic PDEs using high-frequency observations

 

Abstract : We study parameter estimation for parabolic, linear, second-order stochastic partial differential equations (SPDEs), observing a mild solution on a discrete grid in time and space. A high-frequency regime is considered in which the mesh of the grid in the time variable goes to zero. Focusing on volatility estimation, we provide an explicit and easy-to-implement method of moments estimator based on the squared increments of the process. The estimator is consistent and admits a central limit theorem. Starting from a representation of the solution as an infinite factor model, the theory differs considerably from the literature on statistics for semimartingales. The performance of the method is illustrated in a simulation study.


This is joint work with Markus Bibinger.
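
As a loose analogy only (it does not reproduce the SPDE-specific normalization of the paper), the following toy sketch shows a method-of-moments volatility estimate built from squared increments, here for a scaled Brownian motion; all parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_scaled_bm(sigma=0.5, n=10_000, dt=1e-3, seed=0):
    # Path of X_t = sigma * W_t observed on a time grid with mesh dt.
    rng = np.random.default_rng(seed)
    increments = sigma * np.sqrt(dt) * rng.standard_normal(n)
    return np.concatenate([[0.0], np.cumsum(increments)])

def volatility_mom(path, dt):
    # Method-of-moments estimate of sigma^2 from squared increments:
    # E[(X_{t+dt} - X_t)^2] = sigma^2 * dt.
    increments = np.diff(path)
    return np.mean(increments**2) / dt

path = simulate_scaled_bm()
print(volatility_mom(path, dt=1e-3))  # approximately 0.25 = sigma^2
```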

 

 

 

Oct 2 2017

Nicolas Marie

Modal’X (Paris 10)/ ESME Sudria

Title : Nonparametric estimation in differential equations driven by fractional Brownian motion

 

Abstract : After introducing some notions of pathwise stochastic calculus, the talk will present a Nadaraya-Watson-type estimator of the drift function of a differential equation driven by a multiplicative fractional noise. In order to establish the consistency of the estimator, the ergodicity results of Hairer and Ohashi (2007) will be stated and explained. Once consistency is established, the question of the estimator's rate of convergence will be addressed. This is joint work with F. Comte.
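
For intuition, here is a minimal sketch of a Nadaraya-Watson-type drift estimator built from a discretely observed path; the toy Ornstein-Uhlenbeck path below is driven by standard (not fractional) Brownian motion, so it only illustrates the shape of the estimator, not the setting of the talk, and the kernel, bandwidth, and grid are illustrative choices.

```python
import numpy as np

def nw_drift_estimator(X, dt, x_grid, bandwidth):
    """Nadaraya-Watson-type drift estimate from a discretely observed path X:
    b_hat(x) = sum_i K((X_i - x)/h) * (X_{i+1} - X_i)/dt / sum_i K((X_i - x)/h),
    with a Gaussian kernel K."""
    increments = np.diff(X) / dt
    states = X[:-1]
    est = np.empty_like(x_grid)
    for j, x in enumerate(x_grid):
        w = np.exp(-0.5 * ((states - x) / bandwidth) ** 2)
        est[j] = np.sum(w * increments) / np.sum(w)
    return est

# Toy check on an Ornstein-Uhlenbeck path dX = -X dt + dW, whose drift is b(x) = -x.
rng = np.random.default_rng(0)
dt, n = 1e-2, 100_000
X = np.zeros(n + 1)
for i in range(n):
    X[i + 1] = X[i] - X[i] * dt + np.sqrt(dt) * rng.standard_normal()
x_grid = np.linspace(-1.5, 1.5, 7)
print(nw_drift_estimator(X, dt, x_grid, bandwidth=0.2))  # roughly -x_grid
```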

Oct 9 2017

Zoltan Szabo

Ecole Polytechnique

Title : Characteristic Tensor Kernels

 

Abstract : Maximum mean discrepancy (MMD) and Hilbert-Schmidt independence criterion (HSIC) are popular techniques in data science for measuring the difference between random variables and their independence, respectively. Thanks to their kernel-based foundations, MMD and HSIC are applicable to a variety of domains, including documents, images, trees, graphs, time series, mixture models, dynamical systems, sets, distributions, and permutations. Despite their tremendous practical success, quite little is known about when HSIC characterizes independence and when MMD with a tensor kernel can discriminate probability distributions, in terms of the contributing kernel components. In this talk, I will present a complete answer to this question, with conditions that are often easy to verify in practice.

[Joint work with Bharath K. Sriperumbudur (PSU). Preprint: https://arxiv.org/abs/1708.08157]
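
As a concrete illustration of a tensor (product) kernel at work, here is a small sketch of the biased empirical HSIC with Gaussian components; bandwidths and sample sizes are arbitrary choices, not taken from the paper.

```python
import numpy as np

def gaussian_gram(Z, bandwidth):
    # Gram matrix of a Gaussian kernel on the sample Z (shape n x d).
    sq = np.sum(Z**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * Z @ Z.T
    return np.exp(-d2 / (2 * bandwidth**2))

def hsic(X, Y, bx=1.0, by=1.0):
    """Biased empirical HSIC, (1/n^2) * trace(K H L H): the MMD between the joint
    distribution and the product of the marginals under the tensor kernel kx * ky."""
    n = X.shape[0]
    K, L = gaussian_gram(X, bx), gaussian_gram(Y, by)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n**2

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 1))
print(hsic(X, rng.standard_normal((500, 1))))          # independent: close to 0
print(hsic(X, X + 0.1 * rng.standard_normal((500, 1))))  # dependent: clearly larger
```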

Oct 16 2017

Séminaire Parisien de Statistique - IHP

 

Oct 23 2017

Philip Thompson

ENSAE

Cancelled

Oct 30 2017

Holidays (no seminar)

Nov 6 2017

Martin Kroll

ENSAE

Title : On minimax optimal and adaptive estimation of linear functionals in inverse Gaussian sequence space models

 

Abstract : We consider an inverse problem in a Gaussian sequence space model where the multiplication operator is not known but only available through noisy observations. Our aim is not to reconstruct the solution itself but the value of a linear functional of the solution. In our setup the optimal rate depends on two different noise levels: the noise level in the observation of the transformed solution and the noise level in the observation of the operator. We consider this problem from a minimax point of view and obtain upper and lower bounds under smoothness assumptions on the multiplication operator and the unknown solution. Finally, we sketch an approach to adaptive estimation in this model, using a method that combines model selection with the Goldenshluger-Lepski method.


This is joint work in progress with Cristina Butucea (ENSAE) and Jan Johannes (Heidelberg).
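
A schematic formulation of such a model, written in my own shorthand and not necessarily in the paper's notation:

```latex
% Inverse Gaussian sequence space model with a noisily observed operator
% (notation below is an illustrative assumption):
\[
  Y_j = \lambda_j\,\theta_j + \sqrt{\varepsilon}\,\xi_j,
  \qquad
  X_j = \lambda_j + \sqrt{\sigma}\,\zeta_j,
  \qquad j \ge 1,
\]
\[
  \ell(\theta) = \sum_{j \ge 1} a_j\,\theta_j,
  \qquad
  \hat{\ell}_m = \sum_{j \le m} a_j\,\frac{Y_j}{X_j}\,\mathbf{1}\{|X_j| \ge \tau\},
\]
% where (xi_j) and (zeta_j) are independent standard Gaussian sequences, epsilon and
% sigma are the two noise levels, and hat{ell}_m is a truncated plug-in estimator of
% the linear functional ell(theta).
```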

 

Nov 13 2017

Séminaire Parisien de Statistique - IHP

 

Nov 20 2017

Olivier Collier

Université Paris Nanterre

Title : Robust estimation of the mean in polynomial time

 

Abstract : This talk presents results obtained in collaboration with Arnak Dalalyan. When the observations are contaminated by outliers, it is no longer advisable to estimate the expectation by the empirical mean. Optimal methods are known, such as Tukey's depth in the so-called contamination model; however, that estimator cannot be computed in polynomial time. In a Gaussian model, we observe that estimating the mean amounts to estimating a linear functional under a group-sparsity constraint, so it is natural to use the group lasso. Several interesting phenomena then emerge: in this context, group sparsity yields a polynomial gain over plain sparsity, whereas previous studies showed at best a logarithmic gain, and it appears that polynomial-time estimation cannot attain the optimal performance of exponential-time methods.
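
A minimal sketch of the group-lasso viewpoint on robust mean estimation, via block-coordinate descent on a penalized least-squares objective; the penalty level, iteration count, and alternating scheme below are illustrative choices, not the estimator analyzed in the talk.

```python
import numpy as np

def robust_mean_group_lasso(Y, lam=3.0, n_iter=50):
    """Estimate a common mean mu from Y_i = mu + theta_i + noise, where theta_i is
    nonzero only for outlier observations (group sparsity over observations).

    Block-coordinate descent on
        0.5 * sum_i ||Y_i - mu - theta_i||^2 + lam * sum_i ||theta_i||_2 .
    """
    n, d = Y.shape
    theta = np.zeros((n, d))
    mu = Y.mean(axis=0)
    for _ in range(n_iter):
        # Group soft-thresholding of each observation's outlier component.
        R = Y - mu
        norms = np.linalg.norm(R, axis=1, keepdims=True)
        shrink = np.maximum(1.0 - lam / np.maximum(norms, 1e-12), 0.0)
        theta = shrink * R
        # Update the common mean given the current outlier components.
        mu = (Y - theta).mean(axis=0)
    return mu

rng = np.random.default_rng(1)
Y = rng.standard_normal((200, 10))   # inliers centered at 0
Y[:20] += 10.0                       # 10% of the observations are outliers
# The robust estimate is much closer to 0 than the raw sample mean.
print(np.linalg.norm(Y.mean(axis=0)), np.linalg.norm(robust_mean_group_lasso(Y)))
```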

Nov 27 2017

Elisabeth Gassiat

Université Paris-Sud

Title : Estimation of the proportion of explained variation in high dimensions.

 

Abstract : Estimation of the heritability of a phenotypic trait based on genetic data may be cast as estimation of the proportion of explained variation in high-dimensional linear models. I will be interested in understanding the impact of:

— not knowing the sparsity of the regression parameter,

— not knowing the variance matrix of the covariates

on minimax estimation of heritability.

In the situation where the variance of the design is known, I will present an estimation procedure that adapts to unknown sparsity.

When the variance of the design is unknown and no prior estimator of it is available, I will show that consistent estimation of heritability is impossible.

(Joint work with N. Verzelen, and the PhD thesis of A. Bonnet.)
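
A schematic statement of the quantity at stake, in notation that is an illustrative assumption on my part rather than the talk's:

```latex
% High-dimensional linear model y_i = x_i^T beta + eps_i, eps_i ~ N(0, sigma^2),
% with covariates x_i of covariance Sigma. The proportion of explained variation
% ("heritability") is
\[
  \eta \;=\; \frac{\operatorname{Var}(x^\top \beta)}{\operatorname{Var}(y)}
        \;=\; \frac{\beta^\top \Sigma\, \beta}{\beta^\top \Sigma\, \beta + \sigma^2},
\]
% and the talk concerns minimax estimation of eta when the sparsity of beta and/or
% the covariance Sigma are unknown.
```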

 

 

 

 

Dec 4 2017

Philip Thompson

ENSAE

Title : Stochastic approximation with heavier tails

 

Abstract : We consider the solution of convex optimization and variational inequality problems via the stochastic approximation methodology, where the gradient or operator can only be accessed through an unbiased stochastic oracle. First, we show that (non-asymptotic) convergence is possible with unbounded constraints and a "multiplicative noise" model: the oracle is Lipschitz continuous with a finite pointwise variance which may not be uniformly bounded (as classically assumed). In this setting, our bounds depend on local variances at solutions, and the method uses noise reduction in an efficient manner: given a precision, it matches the near-optimal sample and averaging complexities of Polyak-Ruppert's method while attaining the order of the (faster) deterministic iteration complexity. Second, we discuss a more "robust" version where the Lipschitz constant L is unknown but near-optimal complexities, in terms of the error precision, are maintained. A price to pay when L is unknown is that a large-sample regime is assumed (still respecting the complexity of the SAA estimator) and "non-martingale-like" dependencies are introduced. These dependencies are handled by an "iterative localization" argument based on empirical process theory and self-normalization.

Joint work with A. Iusem (IMPA), A. Jofré (CMM-Chile) and R.I. Oliveira (IMPA).
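
A toy sketch of stochastic approximation with a multiplicative-noise oracle and Polyak-Ruppert averaging; the quadratic objective, oracle, and step-size rule below are assumptions made for the example, not the method analyzed in the talk.

```python
import numpy as np

def sa_polyak_ruppert(x0, x_star, n_iter=20_000, c=1.0, seed=0):
    """Stochastic approximation on the toy problem min_x 0.5*||x - x_star||^2 with a
    multiplicative-noise oracle g(x) = (1 + xi)*(x - x_star), xi ~ N(0, 1): the oracle
    is unbiased and its variance shrinks near the solution (no uniform noise bound)."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    avg = np.zeros_like(x)
    for k in range(1, n_iter + 1):
        xi = rng.standard_normal()
        g = (1.0 + xi) * (x - x_star)      # unbiased stochastic gradient
        x = x - (c / np.sqrt(k)) * g       # decreasing step size
        avg += (x - avg) / k               # running Polyak-Ruppert average
    return avg

x_star = np.array([1.0, -2.0, 3.0])
print(sa_polyak_ruppert(np.zeros(3), x_star))  # approaches x_star
```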

Dec 11 2017

Jamal Najim

CNRS UPEM

Title :

 

Abstract :

Dec 18 2017

No seminar

 

 

 

 

Jan 8 2018

Eric Moulines

Ecole Polytechnique

 

Jan 15 2018

Séminaire Parisien de Statistique - IHP

 

Jan 22 2018

 

 

Jan 29 2018