Differential Privacy in the Survey Context: The Impact of Weighting Class Adjustments on the Sensitivity of a Population Total
Tuesday, Aug 5: 11:20 AM - 11:25 AM
1430
Contributed Speed
Music City Center
The concept of differential privacy (DP) aims to limit the impact that any single record can have on the analysis of interest. To control this impact optimally, DP generally requires computing the global sensitivity, which measures the maximum possible change in the statistic when a single record in the database is changed. When applying DP to survey data, one must account for preprocessing steps, such as nonresponse adjustments or calibration, that are applied to the data before the analysis. These adjustment steps typically increase the global sensitivity because changing one record in the database also changes the results of the adjustments. In this work, we focus specifically on the effects of weighting class adjustments, a common strategy to correct for unit nonresponse in surveys. We comprehensively examine how different scenarios affect the sensitivity of weighted population totals under both the bounded DP framework (changing the values of a single record while keeping the dataset size fixed) and the unbounded DP framework (adding or removing a single record). Our analysis further distinguishes by the response status of the changed record to identify worst-case scenarios for the sensitivity calculations. We derive explicit sensitivity formulas for all possible scenarios and identify which combinations produce the maximum sensitivity. Our results show that with weighting class adjustments the sensitivity loses its symmetry, i.e., it differs when one record is added compared to when one record is removed.
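As a minimal sketch of the quantity under study (notation assumed here, not taken from the abstract): with design weights d_i, weighting classes c, sampled units s_c, and respondents r_c, a weighting-class-adjusted population total can be written as

\hat{Y} \;=\; \sum_{c} \frac{\sum_{j \in s_c} d_j}{\sum_{j \in r_c} d_j} \sum_{i \in r_c} d_i\, y_i .

Because a single added, removed, or altered record enters the adjustment factor of its class, it shifts the contribution of every respondent in that class. This is why the global sensitivity of \hat{Y} generally exceeds the naive bound \max_i d_i\,|y_i| for an unadjusted weighted total, and why adding a record and removing a record need not change \hat{Y} by the same amount.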
Differential privacy
Nonresponse bias
Post-stratification
Sensitivity
Survey Statistics
Data confidentiality
Main Sponsor
Survey Research Methods Section