Federated Learning for Nonparametric Function Estimation: Framework and Optimality

Speaker: Tony Cai
University of Pennsylvania
 
Wednesday, Aug 6: 2:55 PM - 3:20 PM
Invited Paper Session 
Music City Center 
We consider statistical optimality for federated learning in the context of nonparametric regression and density estimation. The setting is heterogeneous, with sample sizes and differential privacy constraints that vary across servers. Within this framework, we consider both global and pointwise estimation and establish optimal rates of convergence over Besov spaces.
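As background (a standard fact, not a result stated in the abstract): in the classical single-server, non-private setting, the minimax rate for estimating a density or regression function of Besov smoothness $\alpha$ under global squared $L_2$ loss, over a Besov ball $B^{\alpha}_{p,q}(M)$ with $p \ge 2$ and $n$ observations, is

\[
\inf_{\hat f}\ \sup_{f \in B^{\alpha}_{p,q}(M)} \mathbb{E}\,\|\hat f - f\|_2^2 \asymp n^{-\frac{2\alpha}{2\alpha+1}}.
\]

This serves as the natural benchmark from which the federated, privacy-constrained rates degrade, depending on the privacy budgets and on how the samples are distributed across servers.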

We propose distributed, privacy-preserving estimation procedures and analyze their theoretical properties. The results reveal interesting phase-transition phenomena that illustrate the trade-off between statistical accuracy and privacy, and they characterize how the privacy budgets, the number of servers, and the sample sizes affect estimation accuracy, highlighting the compromises inherent in a distributed privacy framework.
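The abstract does not spell out the procedures. As a minimal sketch of the general flavor of a distributed, privacy-preserving nonparametric estimator, assuming a one-dimensional density estimation task, equal-width histogram bins, and server-level Laplace noise (all illustrative choices, not the construction analyzed in the talk), one might write:

```python
import numpy as np

def privatize_local_histogram(x, bins, epsilon, rng):
    """Server-side step: histogram counts on the local sample plus Laplace noise.

    Replacing one local observation changes at most two bin counts by 1 each,
    so the L1 sensitivity of the count vector is 2 and Laplace(2/epsilon)
    noise makes this release epsilon-differentially private.
    """
    counts, _ = np.histogram(x, bins=bins)
    noise = rng.laplace(scale=2.0 / epsilon, size=counts.shape)
    return counts + noise

def aggregate_density(noisy_counts_list, bins, sample_sizes):
    """Central step: pool the noisy counts and normalize into a density estimate."""
    total = np.sum(noisy_counts_list, axis=0)
    n = sum(sample_sizes)
    widths = np.diff(bins)
    # Clip negative noisy counts; the result is an approximate density, not renormalized.
    return np.clip(total, 0, None) / (n * widths)

# Toy usage: three servers with different sample sizes and privacy budgets.
rng = np.random.default_rng(0)
bins = np.linspace(0.0, 1.0, 17)  # 16 equal-width bins on [0, 1]
samples = [rng.beta(2, 5, size=n) for n in (200, 500, 1000)]
epsilons = (0.5, 1.0, 2.0)
noisy = [privatize_local_histogram(x, bins, eps, rng) for x, eps in zip(samples, epsilons)]
fhat = aggregate_density(noisy, bins, [len(x) for x in samples])
```

In this kind of scheme the raw data never leave a server, only privatized summaries are transmitted, and the accuracy lost to the noise (which scales like 1/epsilon per server) is what drives the privacy-accuracy trade-offs the talk quantifies.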

Keywords: differential privacy, distributed inference, optimal rate