Showing 1 - 10 results of 10 for search '"data deduplication"'

    Block level cloud data deduplication scheme based on attribute encryption by Wenting GE, Weihai LI, Nenghai YU

    Published 2023-10-01
    “…Existing cloud data deduplication schemes mainly focus on file-level deduplication. A scheme based on attribute encryption was therefore proposed to support data block-level deduplication. Deduplication was performed at dual granularity, for both the file level and the data block level, and data sharing was achieved through attribute encryption. The algorithm was designed on a hybrid cloud architecture: duplicate detection and consistency detection were conducted by the private cloud based on file labels and data block labels, and a Merkle tree was built over the block-level labels to support proof of user ownership. When a user uploaded a ciphertext, the private cloud used linear secret sharing to add access structures and auxiliary information to the ciphertext, and updated the overall ciphertext information for new users with permissions. The private cloud also served as a proxy for re-encryption and decryption, undertaking most of the computation while unable to obtain the plaintext, thereby reducing the computing overhead for users. The processed ciphertexts and labels were stored in the public cloud and accessed through the private cloud. Security analysis shows that the proposed scheme achieves PRV-CDA (privacy against chosen-distribution attacks) security in the private cloud. In the simulation experiments, four types of elliptic curve encryption were used to measure the computation time of key generation, encryption, and decryption, both for different numbers of attributes with a fixed block size and for different block sizes with a fixed number of attributes; the results align with the characteristics of linear secret sharing. The simulation experiments and cost analysis demonstrate that the proposed scheme improves deduplication efficiency and reduces time costs.…”
    Get full text
    Article
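
    The block-level labeling and Merkle-tree ownership proof described in this abstract can be illustrated with a short sketch. The example below is a minimal, hypothetical illustration assuming a fixed 4 KB block size, SHA-256 block and file labels, and a simple in-memory index; it is not the paper's construction, which additionally relies on attribute encryption and linear secret sharing on a hybrid cloud.

```python
# Sketch: dual-granularity (file + block) deduplication with a Merkle tree
# over block labels. Block size, hash choice, and the DedupIndex class are
# illustrative assumptions, not the paper's actual scheme.
import hashlib

BLOCK_SIZE = 4096  # assumed fixed block size


def block_labels(data: bytes) -> list[bytes]:
    """Split data into fixed-size blocks and label each block with its SHA-256 digest."""
    return [
        hashlib.sha256(data[i:i + BLOCK_SIZE]).digest()
        for i in range(0, len(data), BLOCK_SIZE)
    ]


def merkle_root(labels: list[bytes]) -> bytes:
    """Build a Merkle tree over the block labels; the root supports ownership proofs."""
    level = labels[:] or [hashlib.sha256(b"").digest()]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [
            hashlib.sha256(level[i] + level[i + 1]).digest()
            for i in range(0, len(level), 2)
        ]
    return level[0]


class DedupIndex:
    """Toy private-cloud index: file-level check first, then block-level check."""

    def __init__(self) -> None:
        self.files: set[bytes] = set()    # file labels already stored
        self.blocks: set[bytes] = set()   # block labels already stored

    def upload(self, data: bytes) -> dict:
        file_label = hashlib.sha256(data).digest()
        if file_label in self.files:
            return {"file_duplicate": True, "new_blocks": 0}
        labels = block_labels(data)
        new_blocks = [label for label in labels if label not in self.blocks]
        self.blocks.update(new_blocks)
        self.files.add(file_label)
        return {
            "file_duplicate": False,
            "new_blocks": len(new_blocks),   # only these blocks would be uploaded
            "merkle_root": merkle_root(labels).hex(),
        }
```

    In this sketch the Merkle root stands in for the ownership-proof anchor: a server could later challenge a client to produce the authentication path for a randomly chosen block, which is the usual role of such a tree in proof-of-ownership protocols.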

    Key-exposure resilient integrity auditing scheme with encrypted data deduplication by Xiangsong ZHANG, Chen LI, Zhenhua LIU

    Published 2019-04-01
    “…To address the problems of key exposure, encrypted data duplication, and integrity auditing in cloud data storage, a public auditing scheme was proposed that supports key update and encrypted data deduplication. By utilizing Bloom filters, the proposed scheme achieves client-side deduplication and guarantees that a key exposure in one time period does not affect the user's private keys in other time periods. The proposed scheme resolves, for the first time, the conflict between key-exposure resilience and encrypted data deduplication in public auditing schemes. Security analysis indicates that the proposed scheme achieves strong key-exposure resilience, confidentiality, detectability, and unforgeability of authentication tags and tokens under the computational Diffie-Hellman hardness assumption in the random oracle model.…”
    Get full text
    Article
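
    The client-side deduplication step built on Bloom filters can be sketched briefly. The following is a minimal, assumed illustration of checking ciphertext tags against a Bloom filter before upload; the filter parameters and the should_upload helper are hypothetical, and a real deployment would confirm positives with the server, since Bloom filters admit false positives. It does not reproduce the paper's key-update or auditing machinery.

```python
# Sketch: Bloom-filter-based client-side deduplication check on ciphertext tags.
# Filter size, hash count, and tag derivation are illustrative assumptions.
import hashlib


class BloomFilter:
    def __init__(self, size_bits: int = 1 << 20, num_hashes: int = 7) -> None:
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item: bytes):
        # Derive k bit positions from salted SHA-256 digests of the item.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(i.to_bytes(4, "big") + item).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item: bytes) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, item: bytes) -> bool:
        return all(
            self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item)
        )


def should_upload(ciphertext: bytes, seen: BloomFilter) -> bool:
    """Client-side check: skip the upload if the ciphertext tag is (probably) already stored."""
    tag = hashlib.sha256(ciphertext).digest()
    if seen.might_contain(tag):
        return False          # likely duplicate; server-side verification still advised
    seen.add(tag)
    return True
```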

    Research on Human Motion Recognition Based on Data Redundancy Technology by Hong-Lan Yang, Meng-Zhe Huang, Zheng-Qun Cai

    Published 2021-01-01
    “…Aiming at the low recognition rate and slow recognition speed of traditional human action recognition methods, a human action recognition method based on data deduplication technology is proposed. Firstly, data deduplication technology and perceptual hashing technology are combined to form an index, and images are filtered according to the structure, color, and texture features of human action images to achieve image redundancy processing. …”
    Get full text
    Article
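
    The redundancy-filtering step built on perceptual hashing can be sketched as follows. This is a minimal, assumed example using an 8x8 average hash and a Hamming-distance threshold; the paper's index also incorporates structure, color, and texture features, which are not modeled here. Pillow is used for image loading, and the threshold value is illustrative.

```python
# Sketch: perceptual-hash (average hash) filtering of near-duplicate frames.
# Hash size and the Hamming-distance threshold are illustrative assumptions.
from PIL import Image


def average_hash(path: str, hash_size: int = 8) -> int:
    """Downscale to a grayscale hash_size x hash_size image and threshold on the mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def filter_redundant(paths: list[str], threshold: int = 5) -> list[str]:
    """Keep one representative frame per group of perceptually similar images."""
    kept: list[tuple[str, int]] = []
    for path in paths:
        h = average_hash(path)
        if all(hamming(h, kept_hash) > threshold for _, kept_hash in kept):
            kept.append((path, h))
    return [p for p, _ in kept]
```

    A small Hamming distance between average hashes indicates visually similar frames, so dropping all but one frame per cluster reduces the redundant input fed to the downstream recognizer, which is the role this filtering step plays in the abstract.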