Differentially Private Geodesic Regression

Aditya Kulkarni Co-Author
University of Massachusetts Amherst
 
Carlos Soto Co-Author, Speaker
University of Massachusetts Amherst
 
Thursday, Aug 7: 8:35 AM - 8:55 AM
Topic-Contributed Paper Session 
Music City Center 
In statistical applications it has become increasingly common to encounter data that inherently live on non-linear spaces such as manifolds. Geodesic regression emerged as an extension of classical linear regression, one of the most fundamental methodologies of statistical learning, to the setting where the response variable lives on a Riemannian manifold.
As with linear regression, the fitted parameters of geodesic regression can encode relationships among sensitive data, so releasing them calls for formal privacy protection.
We consider releasing Differentially Private (DP) parameters of geodesic regression via the K-Norm Gradient (KNG) mechanism for Riemannian manifolds. We derive theoretical bounds on the sensitivity of the parameters, showing that they are tied to their respective Jacobi fields, and hence to the curvature of the space, which corroborates recent findings on differential privacy for the Fréchet mean. We demonstrate the efficacy of our methodology on the 2D sphere, though the approach is general to Riemannian manifolds, making it suitable for data in domains such as medical imaging and computer vision.
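For intuition on the release mechanism, here is a hedged sketch of KNG in the simplest Euclidean case: a 1D slope-only regression, where the mechanism samples a parameter from a density proportional to exp(-ε/(2Δ)·|∇L(β)|) via a Metropolis random walk. The sensitivity value Δ = 1 and all names below are illustrative placeholders; the paper's contribution is bounding this sensitivity on Riemannian manifolds via Jacobi fields, which this Euclidean toy does not capture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 1D regression through the origin, y ≈ beta * x.
x = rng.uniform(-1.0, 1.0, 50)
y = 1.5 * x + 0.05 * rng.normal(size=50)

def grad_loss(beta):
    """Gradient of the mean squared-error loss at beta."""
    return np.mean(2.0 * (beta * x - y) * x)

def kng_log_density(beta, eps, sens):
    """Unnormalized KNG log-density: -eps / (2 * sens) * |grad L(beta)|."""
    return -eps / (2.0 * sens) * abs(grad_loss(beta))

def kng_sample(eps, sens=1.0, n_steps=4000, step=0.3):
    """Metropolis random walk targeting the KNG density; returns the visited chain.

    sens=1.0 is a placeholder: a real deployment needs a proven sensitivity bound.
    """
    beta, chain = 0.0, []
    for _ in range(n_steps):
        prop = beta + step * rng.normal()
        if np.log(rng.uniform()) < kng_log_density(prop, eps, sens) - kng_log_density(beta, eps, sens):
            beta = prop
        chain.append(beta)
    return np.array(chain)

beta_ols = np.sum(x * y) / np.sum(x * x)    # non-private least-squares slope
chain = kng_sample(eps=50.0)                # large eps: output concentrates near beta_ols
beta_dp = np.mean(chain[len(chain) // 2:])  # average the second half of the chain
```

Because the loss is quadratic, the KNG density here is Laplace-shaped around the least-squares slope, so large ε yields an output close to the non-private estimate; on a manifold the same construction runs over tangent-space parameters with sensitivity controlled by curvature.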