Differentially private kernel empirical risk minimization via K-means Nyström approximation

Cheolwoo Park Co-Author
KAIST
 
Jeongyoun Ahn Co-Author
University of Georgia
 
Bonwoo Lee First Author
 
Bonwoo Lee Presenting Author
 
Tuesday, Aug 6: 8:45 AM - 8:50 AM
2230 
Contributed Speed 
Oregon Convention Center 
Since differential privacy has become the state-of-the-art standard for privacy guarantees, a substantial body of work has been devoted to differentially private kernel learning. However, most existing work is restricted to translation-invariant kernels, with rare exceptions. Moreover, many proposed frameworks release a differentially private kernel learner with fixed hyperparameters, which excludes the hyperparameter tuning procedure from the privacy framework. In this work, we propose a framework for differentially private kernel empirical risk minimization that enables kernel learning with general kernels via the K-means Nyström approximation, with theoretical guarantees. We also propose a differentially private kernel mean embedding for general kernels. Additionally, we present differentially private kernel ridge regression and logistic regression methods that can learn over various regularization parameters simultaneously.
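The central building block named in the abstract, the K-means Nyström approximation, can be sketched as follows: K-means centroids serve as landmark points for a Nyström low-rank factorization of the kernel matrix, yielding an explicit finite-dimensional feature map on which standard (and privatizable) empirical risk minimization can run. This is a minimal non-private illustration with an RBF kernel; all function names, the kernel choice, and the parameters are assumptions for exposition, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kmeans(X, k, iters=50, seed=0):
    # Plain Lloyd's algorithm; the centroids act as Nystrom landmarks
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                C[j] = pts.mean(axis=0)
    return C

def kmeans_nystrom_features(X, k, gamma=1.0):
    # Returns Phi with Phi @ Phi.T approximating K = rbf_kernel(X, X)
    C = kmeans(X, k)
    W = rbf_kernel(C, C, gamma)   # k x k landmark kernel
    E = rbf_kernel(X, C, gamma)   # n x k cross kernel
    # W^{-1/2} via eigendecomposition, clipping tiny eigenvalues for stability
    vals, vecs = np.linalg.eigh(W)
    vals = np.clip(vals, 1e-12, None)
    W_inv_half = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return E @ W_inv_half         # explicit finite-dimensional feature map

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
Phi = kmeans_nystrom_features(X, k=20)
K_approx = Phi @ Phi.T
K_exact = rbf_kernel(X, X)
rel_err = np.linalg.norm(K_exact - K_approx) / np.linalg.norm(K_exact)
```

In a differentially private pipeline, one would then add calibrated noise to the learning step performed on the features `Phi` (e.g. output perturbation on a ridge or logistic solution); that step is omitted here.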

Keywords

differential privacy

kernel learning

empirical risk minimization

Nyström approximation

K-means Nyström approximation

kernel mean embedding

Main Sponsor

Korean International Statistical Society