DATA SECURITY IN CLOUD USING ADVANCED SECURE DE-DUPLICATION




Hasna. R 1, S. Sangeetha 2
1 PG Scholar, Dhanalakshmi Srinivasan College of Engineering, Coimbatore.
2 Assistant Professor, Dhanalakshmi Srinivasan College of Engineering, Coimbatore.

Abstract: Data de-duplication is a data compression technique used to identify and eliminate duplicate copies of data in the cloud, providing more storage space and bandwidth. To make the data self-reliant, convergent encryption is performed before the data is outsourced to the cloud. The proof-of-ownership concept is also implemented in order to prevent unauthorized attacks on the client side. However, the existing method cannot provide security against client-side attackers who act as legal clients and have some knowledge about the files. With the proof-of-ownership concept combined with an integrated cloud architecture, storage space can be utilized efficiently and unauthorized access can also be prevented. This method introduces an advanced secure de-duplication check for accessing (uploading/downloading) data in the cloud. Our proposed method thus produces better results than the existing method.

Keywords: De-duplication, convergent encryption, proof of ownership, advanced secure de-duplication, integrated cloud architecture.

I. INTRODUCTION

Cloud storage is a systematic model of data storage whose structure comprises multiple servers in various locations that are managed and owned by hosting companies. In the cloud, data are stored in compressed form so they can be transmitted efficiently and accessed easily at any time and from anywhere, as long as we have internet access. The main challenge in cloud storage, however, is data management. To manage data in the cloud we have a technique named de-duplication. Data de-duplication is a data compression technique for eliminating duplicate copies of repeated data in the cloud, which would otherwise occupy more space and bandwidth.
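The space saving comes from indexing stored data by a cryptographic digest of its content, so a second copy of the same data is reduced to a pointer. A minimal sketch of this idea (the class and method names are our illustration, not from the paper):

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical uploads are kept only once."""

    def __init__(self):
        self._blocks = {}  # digest -> stored bytes
        self._refs = {}    # digest -> reference count

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest in self._blocks:
            # Duplicate detected: nothing new is stored, only the count grows.
            self._refs[digest] += 1
        else:
            self._blocks[digest] = data
            self._refs[digest] = 1
        return digest  # the caller keeps this digest as a pointer to the data

    def get(self, digest: str) -> bytes:
        return self._blocks[digest]

store = DedupStore()
p1 = store.put(b"quarterly report")
p2 = store.put(b"quarterly report")  # second copy: only a pointer comes back
```

Uploading the same content twice returns the same pointer while only one physical copy is kept, which is exactly the storage and bandwidth saving the introduction describes.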
By the data de-duplication method we can eliminate duplicate copies of the same data. Although data de-duplication has many benefits, such as data privacy and security, the data is still exposed to many attacks. The existing system uses proof of ownership along with an authorized de-duplication check to make the data self-reliant and the de-duplication accessible. The first step is convergent encryption, in which encryption and decryption are done with a convergent key. The convergent key is obtained by computing a hash value over the data content. After convergent encryption, the user retains the keys and sends the ciphertexts to the cloud. Because of this encryption process, identical data in the cloud yields identical convergent keys and identical ciphertexts. In the authorized de-duplication technique, the user generates tokens in the private cloud from the tag derived from the message and the keys. The tokens are then forwarded to the public cloud to check for duplicate files, together with the proof-of-ownership protocol. From the proof, the user learns the status of the file in the cloud and proceeds with the file access process. However, unwanted access to files may occur because of key sharing, and data collisions can happen. In this paper we propose the ownership protocol along with an advanced secure de-duplication check to prevent unauthorized access. In this technique, the private key is generated by the

private cloud during the first token generation and is sent to the public cloud. The private key is kept in the private cloud and is not retained by the user. The proof-of-ownership protocol is then processed in the public cloud to find the duplicate file, which is addressed by a pointer. The pointer from the public cloud is passed back by the user to the private cloud; the private cloud authenticates the information with the public cloud and passes a second token, along with privileges (rights), to access the file in the public cloud. In this way the user avoids data collisions and unauthorized access is prevented.

II. RELATED WORK

(Jia, et al., 2013) address the confidentiality-preservation concern in client-side de-duplication of encrypted data files using a proof-of-ownership model, by which the user can prove ownership of a file in the cloud. With the help of keys generated from convergent encryption, de-duplication takes place in the cloud. However, this work provides de-duplication only under a weak leakage model, in which certain information about the file is leaked. (Chuanyi Liu, et al., 2013) introduced a policy-based de-duplication scheme to enable different trust relationships among the cloud components, the de-duplication computation, and the security requirements. In every de-duplication technique, key management is the major issue; the authors therefore proposed a key management scheme for accessing and decrypting shared de-duplicated chunks based on a re-encryption algorithm. The de-duplication proxy can only de-duplicate the data of users who are already registered, based on capabilities received from the key. (Pasquale Puzio, et al., 2012) explained ClouDedup, a secure data storage service which provides block-level de-duplication and data security at the same time. To provide security, they introduced an additional encryption method and access control mechanisms.
The authors additionally introduced a new component for managing keys during the block-level de-duplication operation; the overhead of this new component is minimal with respect to the overall storage and computational costs. (Bellare, et al., 2013) proposed a new cryptographic technique, Message-Locked Encryption, which overcomes the drawbacks of traditional encryption techniques and helps reduce the length of the ciphertexts. In this encryption, a key is derived from the message itself by computing a hash function over the message. Their security analysis highlights that this encryption method offers assurance for unpredictable messages but fails to achieve data security for predictable ones. (Chao Yang, et al., 2013) propose a cryptographically secure and efficient scheme in which the client proves to the server its entire ownership of the file using a spot-checking technique. The security analysis of the method shows that a Provable Ownership of the File (POF) is generated and that misbehavior of clients in the cloud network is detected. The result analysis shows that the proposed system is more efficient than the existing system in terms of decreasing the client's burden. (Wee Keong Ng, et al., 2012) introduced a private data de-duplication protocol that allows a client holding private data to prove to the server that he/she owns the files without providing additional information to the server. Only from such additional information could the cloud learn which authorized person owns the files, so withholding it prevents information about the file from being stolen.

III. PRELIMINARIES

In this section we explain some of the notation used in this paper to provide secure de-duplication.

A. Symmetric Encryption

In symmetric encryption, a common secret key K is used for both encryption and decryption. It has three functions:

- KeygenSE: key generation algorithm that generates the key K.
- EncSE(K, M): encryption algorithm that takes the key K and the message M as input and outputs the ciphertext C.
- DecSE(K, C): decryption algorithm that takes the ciphertext C and the key K as input and outputs the original message M.

B. Convergent Encryption

Convergent encryption provides self-reliant data in data de-duplication. As shown in Fig. 1, the user derives a convergent key by computing a hash function over the original data and encrypts the data blocks with this convergent key. After encryption, the user retains the key and sends the ciphertext to cloud storage. The user can additionally derive tags, which help identify duplicate files in the public cloud.

[Fig. 1: Convergent encryption. A key H is generated by hashing the data D; the data block D is encrypted with H and the ciphertext is sent to cloud storage; decryption recovers the data from the ciphertext and the key.]

It has four functions:

- KeygenCE(M): key generation algorithm that derives the convergent key K from the message M using a hashing algorithm, for example MD5 or SHA-1.
- EncCE(K, M): encryption algorithm that takes the convergent key K and the message M as input and outputs the ciphertext C.
- DecCE(K, C): decryption algorithm that takes the ciphertext C and the convergent key K as input and outputs the original message M.
- TagGenCE(M): tag generation algorithm that takes the original data M as input and outputs the tag T(M).

C. Proof of Ownership Model

The proof-of-ownership protocol enables the user to prove ownership of a file stored in the cloud before uploading the file. Proof of ownership is realized by an identification protocol.
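A minimal sketch of the convergent-encryption functions defined above, using only Python's standard library. The hash-based XOR keystream stands in for a real block cipher and is purely illustrative, not something to use in practice:

```python
import hashlib

def keygen_ce(message: bytes) -> bytes:
    """KeygenCE: the convergent key is derived by hashing the message itself."""
    return hashlib.sha256(message).digest()

def _keystream(key: bytes, length: int) -> bytes:
    # Expand the key into a pseudorandom stream (illustrative stand-in cipher).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def enc_ce(key: bytes, message: bytes) -> bytes:
    """EncCE: deterministic, so equal plaintexts give equal ciphertexts."""
    return bytes(m ^ k for m, k in zip(message, _keystream(key, len(message))))

def dec_ce(key: bytes, ciphertext: bytes) -> bytes:
    """DecCE: XOR with the same keystream recovers the original message."""
    return enc_ce(key, ciphertext)

def taggen_ce(message: bytes) -> str:
    """TagGenCE: a tag the server can compare without seeing the plaintext."""
    return hashlib.sha256(keygen_ce(message)).hexdigest()

# Two users encrypting the same file independently produce identical
# ciphertexts and tags, which is what makes de-duplication of encrypted
# data possible without any key exchange between the users.
m = b"shared design document"
c1 = enc_ce(keygen_ce(m), m)
c2 = enc_ce(keygen_ce(m), m)
```

Because the key depends only on the content, the cloud can detect duplicates of encrypted data; the trade-off, as noted in Related Work, is that this determinism offers assurance only for unpredictable data.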
Identification Protocol

Proof of ownership is implemented by an identification protocol run between a prover (the user) and a verifier (the storage server). The identification protocol has two phases:

- Proof: the prover (user) sends his identity to the verifier, who checks whether the presented proof is related to that identity or not. The convergent key is the prover's input.
- Verify: the verifier performs the verification process in the cloud and outputs a decision that permits either uploading or downloading the data.

IV. AUTHORIZED DE-DUPLICATION CHECK

In the authorized de-duplication technique, the convergent keys are retained by the users during the encryption process. By the tag

generated from the message and the key, the user can generate tokens in the private cloud and run the de-duplication check in the public cloud. If a duplicate is found, the user proceeds with the proof-of-ownership concept; if the proof is passed, he/she is assigned a pointer to access the file. If there is no duplicate, the user encrypts the file with the convergent key and uploads it to the public cloud. Authorized de-duplication has several problems:

1. First, each user is issued a private key to exercise the privileges of his corresponding files. The user uses this private key to generate the file token and run the de-duplication check in the public cloud. However, during file uploading the user needs to compute tokens in order to share the file with other users under certain privileges. Such a restriction limits the usability of the authorized duplicate check.

2. Second, it cannot prevent the privilege private keys from being shared among users. Users may even be issued the same private key for the same privileges in the construction. As a result, data collisions take place and there is a chance of creating another privilege private key.

V. ADVANCED SECURE DATA DE-DUPLICATION

To solve the above problems, we propose an advanced de-duplication technique supporting an authorized de-duplication check. Generally, after convergent encryption, the keys are retained by the user; along with the tag generated from the message and the key, the user generates tokens from the private cloud and accesses the de-duplication process. In our method, however, a separate private key is generated by the private cloud for the tag and convergent key sent by the user; this key is not given to the user but is kept in the private cloud. With the help of the token, the user accesses the de-duplication process. If a duplicate file is found, the user performs the proof-of-ownership concept; if the proof is passed, the user is provided with a pointer to the file.
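The token issuance and duplicate check just described can be sketched as follows. Everything here (the HMAC-based token, the server-side secret, the class and method names) is our illustrative reconstruction of the flow, not the paper's specification:

```python
import hashlib
import hmac

class PrivateCloud:
    """Issues tokens; the privilege private keys never leave this server."""

    def __init__(self, server_secret: bytes):
        self._secret = server_secret
        self._priv_keys = {}  # privilege -> private key (kept here, not by users)

    def issue_key(self, privilege: str) -> None:
        # Derive a per-privilege private key; it stays in the private cloud.
        self._priv_keys[privilege] = hmac.new(
            self._secret, privilege.encode(), hashlib.sha256
        ).digest()

    def make_token(self, privilege: str, file_tag: bytes) -> str:
        # The token binds the file tag to a privilege without revealing the key.
        key = self._priv_keys[privilege]
        return hmac.new(key, file_tag, hashlib.sha256).hexdigest()

class PublicCloud:
    """Stores ciphertexts indexed by token and answers duplicate checks."""

    def __init__(self):
        self._store = {}  # token -> ciphertext

    def is_duplicate(self, token: str) -> bool:
        return token in self._store

    def upload(self, token: str, ciphertext: bytes) -> None:
        self._store[token] = ciphertext

private_cloud = PrivateCloud(b"kept-secret-on-the-private-cloud")
private_cloud.issue_key("engineering")
private_cloud.issue_key("finance")
public_cloud = PublicCloud()

tag = hashlib.sha256(b"report.pdf contents").digest()
t1 = private_cloud.make_token("engineering", tag)
public_cloud.upload(t1, b"<ciphertext>")

# Same privilege and same file give the same token, so the duplicate is
# found; a different privilege gives a different token, so no cross-privilege
# match is revealed.
t2 = private_cloud.make_token("engineering", tag)
t3 = private_cloud.make_token("finance", tag)
```

Because users never hold the privilege keys, key sharing among users (problem 2 above) cannot occur in this sketch; only the private cloud can mint a valid token for a given privilege.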
The user then returns the pointer sent by the public cloud to the private cloud server. After receiving this request, the private cloud interacts with the public cloud regarding the proof. The second token is generated in the private cloud, from the tag derived from the message and the private key generated by the private cloud, and is passed to the user once the proof has passed. With this token the user can access the file. If there is no duplicate, the user is provided with a pointer and the verification process is repeated in the private cloud; if verification passes, the user encrypts the file with the key and uploads it to the cloud.

A. Architecture for Advanced Secure De-Duplication Check

De-duplication mainly depends on three entities, shown in Fig. 2: the user, the cloud service provider for storage (CSP-S, the public cloud), and the private cloud.

User: A user is a person who outsources data to the public cloud for storage and accesses the data whenever he/she needs it. To save storage space and bandwidth, the user performs a de-duplicate check to avoid uploading redundant files already present in the cloud.

CSP-S: The CSP-S provides the data outsourcing service and stores the users' data. To reduce storage cost, the CSP-S removes redundant data via de-duplication and keeps only unique data.

Private cloud: Private keys are managed by the private cloud in order to grant privileges according to each user's

designation. Authentication details about the user are also stored in the private cloud to avoid data stealing; if the user's identity does not match, the private cloud will not send the tokens.

[Fig. 2: Advanced secure de-duplication check. The user authenticates with the private cloud, which issues the privilege key and tokens used against the public cloud.]

De-duplication can be done in two ways, at file level or at block level. File-level de-duplication removes the storage of any repeated files. Block-level de-duplication divides a file into fixed-length or variable-length blocks and deletes the storage of any redundant blocks. We deploy de-duplication techniques at file level or block level to avoid storing redundant data, which would occupy more memory space. Before uploading a file, the user performs file-level de-duplication; if the file is a duplicate, its blocks are duplicates as well, otherwise block-level de-duplication is performed.

VI. RESULT ANALYSIS

The proposed system is evaluated mainly on security level, data storage, and key management.

A. Security Level

Fig. 3 shows that the existing de-duplication check provides a lower security level, because keys are shared among users and there can be unauthorized access to files in the public cloud; there is also a chance of data collisions. In our proposed system, the private cloud authenticates the unique identity with the public cloud and only then provides access to the file through the second token. The security level is thereby increased and attacks in the cloud are prevented.

[Fig. 3: Security level vs. file size (MB).]

B. Key Management

As Fig. 4 shows, in the proposed system the convergent keys are managed by the user and the private keys are managed by the private cloud, so there is no unwanted access to data in the cloud. The existing system, in contrast, suffers collisions and unwanted access to data because of key sharing.

[Fig. 4: Key management.]

VII. CONCLUSION

In this paper, advanced secure data de-duplication was proposed to secure data by adding authentication between the integrated cloud structures. The proposed method supports a duplicate check in the integrated cloud architecture, in which the tokens are generated by the private cloud with the private key. As a proof of concept, the user can prove ownership of a file already present in the cloud. We showed that our advanced secure duplicate check scheme admits fewer attacks than the authorized de-duplication method.

REFERENCES

[1] Z. Li, R. Owens, B. Bhargava and W. Wang, "Secure and Efficient Access to Outsourced Data," in Proceedings of the ACM Cloud Computing Security Workshop, pp. 55-66, Nov. 2009.
[2] J. C. Lui, P. P. Lee, Y. Tang and R. Perlman, "Secure Overlay Cloud Storage with Access Control and Assured Deletion," IEEE Transactions on Dependable and Secure Computing, vol. 9, no. 6, pp. 903-916, Nov./Dec. 2012.
[3] S. Keelveedhi, M. Bellare and T. Ristenpart, "Message-Locked Encryption and Secure De-duplication," IACR Cryptology ePrint Archive, Report 2012/631, 2012.
[4] Y. Wen, W. K. Ng and H. Zhu, "Private Data De-duplication Protocols in Cloud Storage," in Proceedings of the 27th Annual ACM Symposium on Applied Computing, S. Ossowski and P. Lecca, Eds., pp. 441-446, 2012.
[5] M. Bellare, S. Keelveedhi and T. Ristenpart, "Message-Locked Encryption and Secure De-duplication," IACR Cryptology ePrint Archive, Report 2012/631, 2012.
[6] J. R. Douceur, A. Adya, W. J. Bolosky, D. Simon and M. Theimer, "Reclaiming Space from Duplicate Files in a Serverless Distributed File System," in ICDCS, pp. 617-624, 2002.
[7] J. Li, X. Chen, M. Li, J. Li, P. Lee and W. Lou, "Secure De-duplication with Efficient and Reliable Convergent Key Management," IEEE Transactions on Parallel and Distributed Systems, 2013.
[8] M. W. Storer, K. Greenan, D. D. E. Long and E. L. Miller, "Secure Data Deduplication," in Proceedings of the ACM Workshop on Storage Security and Survivability, pp. 1-10, 2008.
[9] X. Chen, J. Li, Y. K. Li, P. P. C. Lee and W. Lou, "A Hybrid Cloud Approach for Secure Authorized Deduplication," IEEE Transactions on Parallel and Distributed Systems, vol. PP, no. 99, 2014.
[10] C. Shi, A. Yun and Y. Kim, "On Protecting Integrity and Confidentiality of Cryptographic File System for Outsourced Storage," in Proceedings of the ACM Cloud Computing Security Workshop, pp. 67-76, Nov. 2009.
[11] J. Ren, C. Yang and J. Ma, "Provable Ownership of File in De-duplication Cloud Storage," in IEEE GLOBECOM 2013 - Communication and Information System Security Symposium, 2013.