Title:
Adaptive Differential Privacy Mechanism for Structured Sensitive Data Attributes
Abstract:
Differential Privacy has recently emerged as the state-of-the-art approach to privacy preservation across various domains, surpassing conventional anonymization techniques. The intuition behind Differential Privacy is a rigorous mathematical guarantee of user indistinguishability in the dataset, achieved by adding calibrated randomness. However, current implementations of Differential Privacy are highly tailored to specific tasks, which makes them difficult to apply in a variety of situations and also decreases the learning utility of the data. Recent research on Differential Privacy has mainly focused on protecting the machine learning model rather than the data itself. This study addresses the problem of generating differentially private structured data, focusing on the privacy of sensitive attributes. We propose a model-agnostic method for achieving Differential Privacy that guarantees high performance of ML models trained on the private data and protects against attribute inference attacks. We introduce a novel Adaptive Sensitivity measure that calibrates the amount of random noise not only according to record sensitivity but also in conjunction with the feature importance of each corresponding attribute. The proposed approach preserves the same level of information uncertainty for relevant data attributes. We experimentally demonstrate the effectiveness of the proposed approach by evaluating its performance, utility, and privacy guarantees.
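The abstract describes calibrating noise jointly by record sensitivity and per-attribute feature importance. The details of the paper's Adaptive Sensitivity measure are not given here, so the following is only a minimal illustrative sketch of one such scheme: the privacy budget epsilon is split across features in proportion to their importance (via sequential composition), so more important attributes receive a larger per-feature budget and therefore less Laplace noise. The function name, signature, and budget-allocation rule are all assumptions, not the authors' method.

```python
import numpy as np

def adaptive_laplace(X, sensitivities, importances, epsilon, rng=None):
    """Illustrative sketch (NOT the paper's mechanism): add per-feature
    Laplace noise where the budget epsilon is allocated to features in
    proportion to their importance, so relevant attributes are perturbed less.

    X             : (n, d) array of numeric records
    sensitivities : length-d per-attribute sensitivities
    importances   : length-d non-negative feature-importance scores
    epsilon       : total privacy budget, split by sequential composition
    """
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(importances, dtype=float)
    # Hypothetical allocation rule: per-feature budget proportional to importance.
    eps_j = epsilon * w / w.sum()
    # Standard Laplace-mechanism scale per feature: sensitivity / budget.
    scales = np.asarray(sensitivities, dtype=float) / eps_j
    noise = rng.laplace(loc=0.0, scale=scales, size=X.shape)
    return X + noise

# Example: the high-importance first feature gets a larger share of the
# budget and is therefore noised less than the low-importance second one.
X = np.zeros((1000, 2))
X_priv = adaptive_laplace(X, sensitivities=[1.0, 1.0],
                          importances=[0.9, 0.1], epsilon=1.0,
                          rng=np.random.default_rng(0))
```

Under this allocation, the empirical noise standard deviation of the important column is markedly smaller than that of the unimportant one, which matches the abstract's goal of preserving the information content of relevant attributes.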