Privacy Auditing in Differentially Private Machine Learning: Current Trends
Main Authors:
Format: Article
Language: English
Published: MDPI AG, 2025-01-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/15/2/647
Summary: Differential privacy has recently gained prominence, especially in the context of private machine learning. While the definition of differential privacy makes it possible to provably limit the amount of information an algorithm leaks, practical implementations of differentially private algorithms often contain subtle vulnerabilities. There is therefore a need for effective methods that can audit (ε, δ)-differentially private algorithms before they are deployed in the real world. The article examines studies that recommend privacy guarantees for differentially private machine learning. It covers a wide range of topics on the subject and provides comprehensive guidance on privacy auditing schemes based on privacy attacks, which protect machine learning models from privacy leakage. Our results contribute to the growing literature on differential privacy in the realm of privacy auditing and beyond, and pave the way for future research on privacy-preserving models.
ISSN: 2076-3417
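
The summary's central idea, auditing a deployed (ε, δ)-DP implementation with a privacy attack, can be made concrete with a small experiment: run the mechanism repeatedly on two neighboring datasets, let a membership-style attack guess which dataset was used, and convert the attack's true and false positive rates into a statistical lower bound on ε via the constraint TPR ≤ e^ε · FPR + δ that any (ε, δ)-DP mechanism must satisfy. The Python sketch below shows that conversion; the helper name, the two-world setup, and the use of SciPy's exact binomial intervals are illustrative assumptions, not the article's specific auditing method.

```python
import math
from scipy.stats import binomtest  # exact (Clopper-Pearson) binomial intervals

def empirical_epsilon_lower_bound(tp, n_pos, fp, n_neg, delta=1e-5, conf=0.95):
    """Turn membership-attack outcomes into a statistical lower bound on epsilon.

    Any (eps, delta)-DP mechanism forces every attack to satisfy
    TPR <= exp(eps) * FPR + delta, so confident estimates of (TPR, FPR)
    certify eps >= ln((TPR - delta) / FPR).

    tp out of n_pos trials: correct "record was included" guesses.
    fp out of n_neg trials: false alarms when the record was excluded.
    """
    # Take the pessimistic ends of exact two-sided intervals: a low TPR and
    # a high FPR can only shrink the bound, keeping the audit conservative.
    tpr_lo = binomtest(tp, n_pos).proportion_ci(confidence_level=conf).low
    fpr_hi = binomtest(fp, n_neg).proportion_ci(confidence_level=conf).high
    if tpr_lo <= delta or fpr_hi == 0.0:
        return 0.0  # the attack is too weak to certify any leakage
    return math.log((tpr_lo - delta) / fpr_hi)

# Example: an attack that guesses correctly in all 1,000 trials per world
# certifies roughly eps >= 5.6 at delta = 1e-5. If the implementation
# claims eps = 1, such an audit result flags a likely bug.
print(empirical_epsilon_lower_bound(tp=1000, n_pos=1000, fp=0, n_neg=1000))
```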