HRR: a data cleaning approach preserving local differential privacy

Sensitive data generated by sensors can be protected by adding noise. However, because sensor data are collected in complicated environments, the collected data are often disorderly and must be cleaned before use. In this work, we establish H-RR, a differential-privacy-preserving data cleaning model that detects contradictions arising from functional dependencies, corrects the contradictory data, and uses the indistinguishability between correction results to protect data privacy. The model adds a local differential privacy mechanism to the data cleaning process, simplifying data pre-processing while seeking a balance between data availability and security.
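The abstract only sketches the mechanism. The following is a minimal Python sketch, assuming that the randomized-response component of H-RR behaves like standard k-ary randomized response applied to a value after a functional-dependency violation has been corrected; the function name, attribute domain, and epsilon value are illustrative assumptions, not taken from the paper.

import math
import random

def randomized_response(value, domain, epsilon):
    """k-ary randomized response: keep the true value with probability
    e^eps / (e^eps + k - 1); otherwise report one of the other k - 1
    domain values uniformly at random. This satisfies epsilon-local
    differential privacy for a single categorical value."""
    k = len(domain)
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p_keep:
        return value
    return random.choice([v for v in domain if v != value])

# Hypothetical usage: suppose the functional dependency zip -> city flags
# a contradictory record; after the cleaning step corrects the city, the
# corrected value is reported through randomized response, so the collector
# cannot distinguish which correction result is the true one.
city_domain = ["Harbin", "Beijing", "Shanghai"]
corrected_city = "Harbin"  # output of the (not shown) correction step
reported_city = randomized_response(corrected_city, city_domain, epsilon=1.0)
print(reported_city)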

Bibliographic Details
Main Authors: Qilong Han, Qianqian Chen, Liguo Zhang, Kejia Zhang
Format: Article
Language: English
Published: Wiley, 2018-12-01
Series: International Journal of Distributed Sensor Networks
ISSN: 1550-1477
Collection: DOAJ
Institution: Kabale University
Online Access: https://doi.org/10.1177/1550147718819938