Showing 501 - 520 of 2,784 results for search '((uses OR used) privacy data) OR (use privacy data)', query time: 0.33s
  1. 501
  2. 502

    Children's digital privacy on fast-food and dine-in restaurant mobile applications. by Christine Mulligan, Grace Gillis, Lauren Remedios, Christopher Parsons, Laura Vergeer, Monique Potvin Kent

    Published 2025-02-01
    “…Restaurant mobile applications are powerful platforms for collecting users' data and are popular among children. This study aimed to provide insight into the privacy policies of top dine-in and fast-food mobile apps in Canada and data collected on child users. …”
    Get full text
    Article
  3. 503

    ZK-STARK: Mathematical Foundations and Applications in Blockchain Supply Chain Privacy by Arade Madhuri S., Pise Nitin N.

    Published 2025-03-01
    “…Privacy is one of the major security concerns. The zero-knowledge proof enables the transmission of data from the sender to the receiver without disclosing the actual content of the data. …”
    Get full text
    Article
  4. 504

    A multimodal differential privacy framework based on fusion representation learning by Chaoxin Cai, Yingpeng Sang, Hui Tian

    Published 2022-12-01
    “…Then based on this representation, we use the Local Differential Privacy (LDP) mechanism to protect data. …”
    Get full text
    Article
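
    The record above (no. 504) mentions protecting a fused multimodal representation with a Local Differential Privacy (LDP) mechanism. As orientation only, not the paper's implementation, a client-side Laplace perturbation can be sketched as below; the clipping bound and epsilon are illustrative assumptions.

    ```python
    import numpy as np

    def ldp_laplace(vector, epsilon=1.0, clip_bound=1.0, rng=None):
        """Perturb a feature vector client-side with the Laplace mechanism.

        Clipping each coordinate to [-clip_bound, clip_bound] bounds its
        sensitivity at 2 * clip_bound, so Laplace noise with scale
        2 * clip_bound / epsilon gives epsilon-LDP per coordinate
        (illustrative accounting, not the paper's analysis).
        """
        rng = rng or np.random.default_rng()
        clipped = np.clip(np.asarray(vector, dtype=float), -clip_bound, clip_bound)
        scale = 2.0 * clip_bound / epsilon
        return clipped + rng.laplace(loc=0.0, scale=scale, size=clipped.shape)

    # Example: a client perturbs its learned representation before upload.
    representation = np.random.default_rng(0).standard_normal(16)
    noisy = ldp_laplace(representation, epsilon=2.0)
    ```
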
  5. 505

    A deep decentralized privacy-preservation framework for online social networks by Samuel Akwasi Frimpong, Mu Han, Emmanuel Kwame Effah, Joseph Kwame Adjei, Isaac Hanson, Percy Brown

    Published 2024-12-01
    “…This paper addresses the critical challenge of privacy in Online Social Networks (OSNs), where centralized designs compromise user privacy. …”
    Get full text
    Article
  6. 506

    A Verifiable, Privacy-Preserving, and Poisoning Attack-Resilient Federated Learning Framework by Washington Enyinna Mbonu, Carsten Maple, Gregory Epiphaniou, Christo Panchev

    Published 2025-03-01
    “…Federated learning is the on-device, collaborative training of a global model that can be utilized to support the privacy preservation of participants’ local data. …”
    Get full text
    Article
  7. 507

    A location semantic privacy protection model based on spatial influence by Linghong Kuang, Wenlong Shi, Xueqi Chen, Jing Zhang, Huaxiong Liao

    Published 2025-04-01
    “…Nonetheless, while trajectory data mining enhances user convenience, it also exposes their privacy to potential breaches. …”
    Get full text
    Article
  8. 508

    Jointly Achieving Smart Homes Security and Privacy through Bidirectional Trust by Osman Abul, Melike Burakgazi Bilgen

    Published 2025-04-01
    “…Once approved, users are primarily concerned about privacy protection (i.e., user-to-system trust) when utilizing system services that require sensitive data for their functionality. …”
    Get full text
    Article
  9. 509

    Edge computing privacy protection method based on blockchain and federated learning by Chen FANG, Yuanbo GUO, Yifeng WANG, Yongjin HU, Jiali MA, Han ZHANG, Yangyang HU

    Published 2021-11-01
    “…Aiming at the needs of edge computing for data privacy, the correctness of calculation results, and the auditability of data processing, a privacy protection method for edge computing based on blockchain and federated learning was proposed, which can realize collaborative training with multiple devices at the edge of the network without a trusted environment or special hardware facilities. The blockchain was used to endow the edge computing with features such as tamper-proofing and resistance to single-point-of-failure attacks, and a gradient verification and incentive mechanism was incorporated into the consensus protocol to encourage more local devices to honestly contribute computing power and data to the federated learning. For the potential privacy leakage caused by sharing model parameters, an adaptive differential privacy mechanism was designed to protect parameter privacy while reducing the impact of noise on model accuracy, and the moments accountant was used to accurately track the privacy loss during the training process. Experimental results show that the proposed method can resist 30% of poisoning attacks, achieve privacy protection with high model accuracy, and is suitable for edge computing scenarios that require a high level of security and accuracy.…”
    Get full text
    Article
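
    The record above (no. 509) describes protecting shared model parameters with an adaptive differential privacy mechanism inside a blockchain-backed federated learning loop. The sketch below is a minimal, hypothetical illustration of that general idea (clip each client update, add Gaussian noise, and decay the noise multiplier across rounds); the schedule, clipping norm, and round count are assumptions, not the paper's design.

    ```python
    import numpy as np

    def privatize_update(update, clip_norm, noise_multiplier, rng):
        """Clip a client's model update in L2 norm, then add Gaussian noise."""
        update = np.asarray(update, dtype=float)
        norm = np.linalg.norm(update)
        clipped = update * min(1.0, clip_norm / (norm + 1e-12))
        noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
        return clipped + noise

    rng = np.random.default_rng(0)
    clip_norm = 1.0
    for round_idx in range(5):
        # Illustrative "adaptive" schedule: shrink the noise multiplier over
        # rounds so late-stage accuracy suffers less (the paper's schedule
        # and its accounting via the moments accountant may differ).
        noise_multiplier = 1.0 / (1.0 + 0.2 * round_idx)
        client_updates = [rng.standard_normal(10) * 0.1 for _ in range(4)]
        private = [privatize_update(u, clip_norm, noise_multiplier, rng)
                   for u in client_updates]
        aggregated = np.mean(private, axis=0)
    ```
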
  10. 510

    Challenges in IoMT Adoption in Healthcare: Focus on Ethics, Security, and Privacy by Alton Mabina, Neo Rafifing, Boago Seropola, Thapelo Monageng, Pulafela Majoo

    Published 2024-12-01
    “…This study highlights ethical, security, and privacy barriers to IoMT adoption in developing countries and proposes strategies like regulatory frameworks, data encryption, AI transparency, and professional training to address these challenges. …”
    Get full text
    Article
  11. 511

    Privacy-Preserving Continual Federated Clustering via Adaptive Resonance Theory by Naoki Masuyama, Yusuke Nojima, Yuichiro Toda, Chu Kiong Loo, Hisao Ishibuchi, Naoyuki Kubota

    Published 2024-01-01
    “…In the clustering domain, various algorithms with a federated learning framework (i.e., federated clustering) have been actively studied and showed high clustering performance while preserving data privacy. However, most of the base clusterers (i.e., clustering algorithms) used in existing federated clustering algorithms need to specify the number of clusters in advance. …”
    Get full text
    Article
  12. 512
  13. 513

    Privacy-enhanced federated learning scheme based on generative adversarial networks by Feng YU, Qingxin LIN, Hui LIN, Xiaoding WANG

    Published 2023-06-01
    “…Federated learning, a distributed machine learning paradigm, has gained a lot of attention due to its inherent privacy protection capability and heterogeneous collaboration. However, recent studies have revealed a potential privacy risk known as “gradient leakage”, where the gradients can be used to determine whether a data record with a specific property is included in another participant’s batch, thereby exposing the participant’s training data. Current privacy-enhanced federated learning methods may have drawbacks such as reduced accuracy, computational overhead, or new insecurity factors. To address this issue, a differential privacy-enhanced generative adversarial network model was proposed, which introduces an identifier into the vanilla GAN, enabling the input data to be approximated while satisfying differential privacy constraints. This model was then applied to the federated learning framework to improve its privacy protection capability without compromising model accuracy. The proposed method was verified through simulations under the client/server (C/S) federated learning architecture and was found to balance data privacy and practicality effectively compared with the DP-SGD method. In addition, the usability of the proposed model was theoretically analyzed under a peer-to-peer (P2P) architecture, and future research work was discussed.…”
    Get full text
    Article
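
    The record above (no. 513) compares its differential privacy-enhanced GAN against DP-SGD. As a point of reference for that baseline only, a minimal DP-SGD step (per-example gradient clipping followed by Gaussian noise on the summed gradient) is sketched below for a plain logistic-regression loss; the model, clipping norm, and noise multiplier are illustrative assumptions and do not reproduce the paper's GAN-based method.

    ```python
    import numpy as np

    def dp_sgd_step(w, X, y, lr=0.1, clip_norm=1.0, noise_multiplier=1.1, rng=None):
        """One DP-SGD step for logistic regression: clip per-example gradients,
        sum them, add Gaussian noise scaled to the clipping bound, then average."""
        rng = rng or np.random.default_rng()
        grads = []
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + np.exp(-xi @ w))
            g = (p - yi) * xi                                  # per-example gradient
            g *= min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
            grads.append(g)
        summed = np.sum(grads, axis=0)
        summed += rng.normal(0.0, noise_multiplier * clip_norm, size=w.shape)
        return w - lr * summed / len(X)

    # Toy usage on synthetic data (hypothetical values throughout).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(32, 5))
    y = rng.integers(0, 2, size=32)
    w = np.zeros(5)
    for _ in range(10):
        w = dp_sgd_step(w, X, y, rng=rng)
    ```
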
  14. 514

    PRIVocular: Enhancing User Privacy Through Air-Gapped Communication Channels by Anastasios N. Bikos

    Published 2025-05-01
    “…Our pre-prototyped framework can provide such privacy preservation (namely virtual proof of privacy (VPP)) and visually secure data transfer promptly (<1000 ms), as well as at the physical distance of the smart glasses (∼50 cm).…”
    Get full text
    Article
  15. 515

    Privacy-Preserving Machine Learning (PPML) Inference for Clinically Actionable Models by Baris Balaban, Seyma Selcan Magara, Caglar Yilgor, Altug Yucekul, Ibrahim Obeid, Javier Pizones, Frank Kleinstueck, Francisco Javier Sanchez Perez-Grueso, Ferran Pellise, Ahmet Alanay, Erkay Savas, Cetin Bagci, Osman Ugur Sezerman

    Published 2025-01-01
    “…We implement privacy-preserving tree-based machine learning inference and run two security scenarios (scenario A and scenario B), each containing four parts with a progressively increasing number of synthetic data points, which are used to enhance the accuracy of the attacker’s substitute model. …”
    Get full text
    Article
  16. 516

    Location Privacy-Preserving Channel Allocation Scheme in Cognitive Radio Networks by Hongning Li, Qingqi Pei, Wenjing Zhang

    Published 2016-07-01
    “…In this paper, to make full use of idle spectrum with a low probability of location leakage, we propose a Location Privacy-Preserving Channel Allocation (LP-pCA) scheme. …”
    Get full text
    Article
  17. 517

    GuardianML: Anatomy of Privacy-Preserving Machine Learning Techniques and Frameworks by Nges Brian Njungle, Eric Jahns, Zhenqi Wu, Luigi Mastromauro, Milan Stojkov, Michel A. Kinsy

    Published 2025-01-01
    “…Machine learning has become integral to our lives, finding applications in nearly every aspect of our daily routines. However, using personal information in machine learning applications has raised concerns about user data privacy and security. …”
    Get full text
    Article
  18. 518

    A Privacy-Preserving Querying Mechanism with High Utility for Electric Vehicles by Ugur Ilker Atmaca, Sayan Biswas, Carsten Maple, Catuscia Palamidessi

    Published 2024-01-01
    “…Simultaneously, personal data use for analytics is growing at an unprecedented rate, raising concerns for privacy. …”
    Get full text
    Article
  19. 519

    PPSC: High-Precision and Scalable Encrypted Privacy-Preserving Speech Classification by WANG Leilei, SONG Kao, ZHANG Yuanyuan, BI Renwan, XIONG Jinbo

    Published 2025-02-01
    “…Secondly, the PPSC scheme securely implements the fundamental modules such as the convolutional layer, ReLU layer, average pooling layer, fully connected layer, and Softmax layer. This ensures the privacy of speech data, speech classification models, and intermediate computing results. …”
    Get full text
    Article
  20. 520

    Self-monitoring of health - user viewpoints on gathering data using consumer health technologies during leisure time by Nora Weinberger, Martina F. Baumann, Maria Maia

    Published 2025-06-01
    “…It focuses on attitudes toward health data collection, data sharing, privacy concerns, and the use of EEG-supported devices. Results: Findings reveal a complex landscape of trust and concern. …”
    Get full text
    Article