Privacy Auditing in Differential Private Machine Learning: The Current Trends

Differential privacy has recently gained prominence, especially in the context of private machine learning. While the definition of differential privacy makes it possible to provably limit the amount of information leaked by an algorithm, practical implementations of differentially private algorithm...

Full description

Saved in:
Bibliographic Details
Main Authors: Ivars Namatevs, Kaspars Sudars, Arturs Nikulins, Kaspars Ozols
Format: Article
Language:English
Published: MDPI AG 2025-01-01
Series:Applied Sciences
Subjects: differential privacy; differential private machine learning; differential privacy auditing; privacy attacks
Online Access:https://www.mdpi.com/2076-3417/15/2/647
collection DOAJ
description Differential privacy has recently gained prominence, especially in the context of private machine learning. While the definition of differential privacy makes it possible to provably limit the amount of information leaked by an algorithm, practical implementations of differentially private algorithms often contain subtle vulnerabilities. There is therefore a need for effective methods that can audit (ϵ, δ)-differentially private algorithms before they are deployed in the real world. This article examines studies that propose privacy guarantees for differentially private machine learning. It covers a wide range of topics on the subject and provides comprehensive guidance on privacy auditing schemes that use privacy attacks to detect privacy leakage in machine-learning models. Our results contribute to the growing literature on differential privacy in the realm of privacy auditing and beyond, and pave the way for future research in the field of privacy-preserving models.
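The auditing schemes the abstract refers to typically turn a privacy attack into an empirical lower bound on ϵ. One standard approach (illustrative here, not taken from this article) uses the hypothesis-testing characterization of (ϵ, δ)-differential privacy: any membership-inference attack against an (ϵ, δ)-DP mechanism must satisfy TPR ≤ e^ϵ · FPR + δ, so an observed (TPR, FPR) pair certifies ϵ ≥ ln((TPR − δ)/FPR). A minimal sketch, with the function name and example numbers chosen for illustration:

```python
import math

def empirical_epsilon_lower_bound(tpr: float, fpr: float, delta: float = 0.0) -> float:
    """Empirical lower bound on epsilon implied by a membership-inference
    attack with the given true-positive and false-positive rates.

    Any attack on an (epsilon, delta)-DP mechanism must satisfy
    TPR <= e^epsilon * FPR + delta, hence
    epsilon >= ln((TPR - delta) / FPR) when that quantity is positive.
    """
    if fpr <= 0.0 or tpr - delta <= fpr:
        return 0.0  # attack too weak to certify any privacy leakage
    return math.log((tpr - delta) / fpr)

# Example: an attack achieving 60% TPR at 5% FPR against a mechanism
# claiming delta = 1e-5 certifies epsilon of at least ~2.48.
eps_lb = empirical_epsilon_lower_bound(0.60, 0.05, delta=1e-5)
```

In practice, audits run the attack over many independent trainings and replace the raw TPR/FPR estimates with high-confidence (e.g. Clopper–Pearson) interval bounds before applying the formula, so the certified ϵ holds with a stated confidence level.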
id doaj-art-7077c1872c164b7a8d99a1c71c614433
institution Kabale University
issn 2076-3417
spelling doaj-art-7077c1872c164b7a8d99a1c71c614433 (indexed 2025-01-24T13:20:16Z)
doi 10.3390/app15020647
citation Applied Sciences, vol. 15, no. 2, art. 647, 2025-01-01
affiliation Institute of Electronics and Computer Science, 14 Dzerbenes St., LV-1006 Riga, Latvia (all four authors)
title Privacy Auditing in Differential Private Machine Learning: The Current Trends
topic differential privacy
differential private machine learning
differential privacy auditing
privacy attacks