Robustness and computational efficiency of algorithms in statistical learning



Noisy and corrupted measurements, commonly encountered as a result of unsupervised, automated data collection processes, require new algorithms that produce reliable outcomes in this challenging setting. Discussing the influence of outliers on statistical procedures, P. Huber observed that "…the naturally occurring deviations from the idealized model are large enough to render meaningless the traditional asymptotic optimality theory." Heuristic outlier-removal procedures are known to introduce bias and to lack rigorous justification (or to require strong assumptions), since it is sometimes impossible to determine whether an extreme observation is due to an error or is a genuine feature of the data-generating mechanism. This motivates the study of robust estimators in the context of statistical learning theory.
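As a minimal illustration of the phenomenon described above (and not a method taken from the conference materials), the following Python sketch compares the empirical mean with the classical median-of-means estimator on a sample contaminated by a handful of gross outliers; the block-median step bounds the influence of any few corrupted observations.

```python
import random
import statistics

def median_of_means(xs, k=20):
    """Median-of-means estimator: shuffle the sample, split it into k
    equal blocks, average each block, and return the median of the
    block means. A few outliers can corrupt at most a few blocks, so
    the median of the block means remains close to the true mean."""
    xs = list(xs)
    random.Random(0).shuffle(xs)  # fixed seed for reproducibility
    size = len(xs) // k
    means = [sum(xs[i * size:(i + 1) * size]) / size for i in range(k)]
    return statistics.median(means)

# A standard Gaussian sample contaminated with five gross outliers.
rng = random.Random(1)
sample = [rng.gauss(0.0, 1.0) for _ in range(1000)] + [1e6] * 5

print(sum(sample) / len(sample))  # empirical mean, ruined by the outliers
print(median_of_means(sample))    # stays close to the true mean 0
```

With five outliers spread over at most five of the twenty blocks, the median of the block means is computed from uncontaminated blocks, so the estimate remains near zero while the empirical mean is driven to the order of several thousand.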

Our goal is to encourage collaboration and knowledge sharing between theoretical computer scientists and mathematical statisticians by bringing them together at Luminy. Both communities possess unique visions, skills, and expertise; we hope that merging these strengths will advance the field and bring us closer to solving some of the key challenges.

Organizing Committee & Scientific Committee

Cristina Butucea  (CREST, ENSAE, Université Paris-Est Marne-la-Vallée)
Stanislav Minsker (University of Southern California)
Christophe Pouet  (Ecole Centrale de Marseille)
Vladimir Spokoiny (Humboldt University of Berlin)


Sara van de Geer (ETH Zürich)  Adaptive rates for trend filtering using dual certificates
Chao Gao (University of Chicago) Statistical Optimality and Algorithms for Top-K and Total Ranking


Felix Abramovich (Tel Aviv University)    
High-dimensional classification by sparse logistic regression

Pierre Bellec (Rutgers University)   
Out-of-sample error estimate for robust M-estimators with convex penalty

Clément Berenfeld (Université Paris-Dauphine)    
Density estimation on manifolds

Thomas Berrett (University of Warwick)   
Locally private non-asymptotic testing of distributions is faster using interactive mechanisms

Natalia Bochkina (University of Edinburgh)   
Bernstein-von Mises theorem for the scale hyperparameter in inverse problems with a Gaussian prior

Alexandra Carpentier (OvGU Magdeburg)  
Several structured thresholding bandit problems

Julien Chhor (CREST-ENSAE)  
Goodness-of-fit testing for multinomials and densities: sharp local minimax rates

Fabienne Comte (Université Paris Descartes)   
Nonparametric estimation for i.i.d. Gaussian continuous time moving average models

Alexis Derumigny (University of Twente) 
On lower bounds for the bias-variance trade-off

Motonobu Kanagawa (EURECOM)   
On the connections and equivalences between Gaussian processes and kernel methods in nonparametric regression

Avetik Karagulyan (ENSAE/CREST)   
Penalized Langevin dynamics with vanishing penalty for smooth and log-concave targets

Clément Marteau (Université Lyon 1)    
SuperMix: Sparse regularization for mixtures

Mohamed Simo Ndaoud (University of Southern California)    
Robust and efficient mean estimation: approach based on the properties of self-normalized sums

Tuan-Binh Nguyen (Université Paris-Sud, Paris-Saclay)  
Aggregation of Multiple Knockoffs

Marianna Pensky (University of Central Florida)   
Statistical Inference in Popularity Adjusted Stochastic Block Model

Vianney Perchet (ENSAE & Criteo AI Lab)  
Robustness of Community Detection to Random Geometric Perturbations

Philippe Rigollet (MIT)    
Minimax Coreset Density Estimation

Vincent Rivoirard (Université Paris-Dauphine)  
Nonparametric Bayesian inference for Hawkes processes

Angelika Rohde (Albert-Ludwigs-University Freiburg)   
Interactive versus non-interactive locally differentially private estimation: Two elbows for the quadratic functional

Richard Samworth (University of Cambridge)    
Adaptive transfer learning

Johannes Schmidt-Hieber (University of Twente)   
The Kolmogorov-Arnold theorem revisited

Suzanne Sigalla (CREST-ENSAE)   
Improved clustering algorithms for the Bipartite Stochastic Block Model

Zoltan Szabo (Ecole Polytechnique)   
Kernel Machines with Hard Shape Constraints

Nicolas Verzelen (INRAE Montpellier)  
Optimal Change-Point Detection and Localization

Nikita Zhivotovsky (Google Research)  
Robust k-means clustering for distributions with two bounded moments