Implementing Remote Data Integrity Checking Protocol for Secured Storage Services with Data Dynamics and Public Verifiability In Cloud Computing


Ms. Vaishnavi Moorthy 1, Dr. S. Sivasubramaniam 2
1 (M.Tech Final Year, Department of Computer Science & Engineering, Bharath University, India)
2 (Associate Professor, Department of Computer Science & Engineering, Bharath University, India)

ABSTRACT
Cloud computing aims to enable end users to easily create and use software without needing to worry about technical implementation details such as the software's physical hosting location, hardware specifications, or the efficiency of data processing. Recently, many works have focused on providing data dynamics and/or public verifiability for protocols that check remote integrity with the help of third party verifiers. In this paper, Sebé et al.'s protocol is adapted to support public verifiability and data dynamics without the help of a third party auditor. This design allows users to audit the cloud storage with very lightweight communication and computation cost. In addition, the auditing result not only ensures a strong cloud storage correctness guarantee, but also simultaneously achieves fast data error localization, i.e., the identification of the misbehaving server. The design further supports secure and efficient dynamic operations on outsourced data, including block modification, deletion and append. Through a formal analysis, the correctness and security of the protocol are shown. The proposed scheme is highly efficient and resilient against malicious data modification attacks, server colluding attacks and failures.

Keywords: Cloud computing, data dynamics, data integrity, error localization, public verifiability, Sebé et al.'s protocol.

1. INTRODUCTION
Cloud computing is the long dreamed vision of computing as a utility, where users can remotely store their data in the cloud so as to enjoy high quality applications and services from a shared pool of configurable computing resources (Fig. 1). It has been envisioned in terms of on-demand self-service, ubiquitous network access, location independent resource pooling, rapid resource elasticity, usage-based pricing and transference of risk [1]. It works on a client-server basis, using web browser protocols. A cloud user needs a client device such as a laptop or desktop computer, pad computer, smart phone, or other computing resource with a web browser (or other approved access route) to access a cloud system via the World Wide Web. Typically the user will log into the cloud at a service provider or private company, such as their employer. The cloud provides server-based applications and all data services to the user, with output displayed on the client device. Memory allocated to the client system's web browser is used to make the application data appear on the client system display, but all computations and changes are recorded by the server, and final results, including files created or altered, are permanently stored on the cloud servers. Performance of the cloud application depends upon the network access, speed and reliability, as well as the processing speed of the client device [2]. While cloud computing makes these advantages more appealing than ever, it also brings new and challenging security threats to users' outsourced data. Since cloud service providers (CSPs) are separate administrative entities, data outsourcing effectively relinquishes the user's ultimate control over the fate of their data.
As a result, the correctness of the data in the cloud is being put at risk for the following reasons. First of all, although the infrastructures under the cloud are much more powerful and reliable than personal computing devices, they still face a broad range of both internal and external threats to data integrity [3][4][5]. Secondly, for their own benefit, there exist various motivations for cloud service providers to behave unfaithfully towards the cloud users regarding the status of their outsourced data [6][7]. These problems impede the successful deployment of the cloud architecture. At the same time, increasing network bandwidth and reliable yet flexible network connections make it possible for users to subscribe to high quality services from data and software that reside solely on remote data centres.

Fig. 1: Cloud Computing

Hence, a lot of work [16] has been done on designing remote data integrity checking protocols, which allow data integrity to be checked without completely downloading the data. These protocols support data dynamics at the block level, including block insertion, block modification and block deletion, and support public verifiability, by which anyone (not just the client) can perform the integrity checking operation while the data remains private against third party verifiers. In addition, the protocol should achieve storage correctness assurance as well as data error localization: whenever data corruption has been detected during the storage correctness verification, the scheme should almost guarantee the identification of the misbehaving server(s) for effective cloud storage.

In this paper, the main contributions of the proposed adaptation of Sebé et al.'s protocol are:
(1) A remote data integrity checking protocol for cloud storage, which can be viewed as an adaptation of Sebé et al.'s protocol [14]. The proposed protocol inherits the support of data dynamics, and supports public verifiability and privacy against third party verifiers, while at the same time it does not need a third-party auditor.
(2) A security analysis of the proposed protocol, which shows that it is secure against the untrusted server and private against third party verifiers [14].
(3) Compared to many of its predecessors, which only provide binary results about the storage status across the distributed servers, the proposed scheme achieves the integration of storage correctness assurance and data error localization, i.e., the identification of misbehaving server(s) [15].
(4) Unlike most prior works for ensuring remote data integrity, the new scheme further supports secure and efficient dynamic operations on data blocks, including update, delete and append [14].

The rest of the paper is organised as follows. Section 2 describes the existing and proposed systems, Section 3 analyses the proposed protocol, Section 4 reviews related work, and Section 5 concludes the paper with future work.

2. BACKGROUND WORK
2.1 Existing System
In the existing system, a cloud data storage service involves three different entities: the cloud user (U), who has a large amount of data files to be stored in the cloud; the cloud server (CS), which is managed by the cloud service provider (CSP) to provide data storage service; and the third party auditor (TPA), who has expertise and capabilities that cloud users do not have and is trusted to assess the cloud storage service security on behalf of the user upon request. Users rely on the CS for cloud data storage and maintenance (Fig. 2). The TPA, who is in the business of auditing, is assumed to be reliable and independent, and thus has no incentive to collude with either the CS or the users during the auditing process. The TPA should be able to efficiently audit the cloud data storage without a local copy of the data and without bringing additional on-line burden to cloud users. However, any possible leakage of the user's outsourced data towards the TPA through the auditing protocol should be prohibited. To achieve the audit delegation and authorize the CS to respond to the TPA's audits, the user can sign a certificate granting audit rights to the TPA's public key, and all audits from the TPA are authenticated against such a certificate.

Fig. 2: Existing System
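The certificate-based audit delegation described above can be illustrated with a short sketch. This is a hypothetical minimal example, assuming Ed25519 signatures via the Python cryptography package; the paper does not fix a concrete signature scheme or certificate format, so names such as cert_body, accept_audit and the "audit-right" label are illustrative only.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives import serialization
from cryptography.exceptions import InvalidSignature

def raw(pub):
    # Raw 32-byte encoding of an Ed25519 public key.
    return pub.public_bytes(serialization.Encoding.Raw,
                            serialization.PublicFormat.Raw)

owner_sk = Ed25519PrivateKey.generate()   # data owner (cloud user U)
tpa_sk = Ed25519PrivateKey.generate()     # third party auditor (TPA)

# The owner signs a certificate binding audit rights for one file
# to the TPA's public key.
file_id = b"file-42"
cert_body = b"audit-right|" + file_id + b"|" + raw(tpa_sk.public_key())
cert_sig = owner_sk.sign(cert_body)

def accept_audit(owner_pk, cert_body, cert_sig, tpa_pk, challenge, challenge_sig):
    """Cloud server side: answer an audit only if the certificate is genuine
    and the challenge was signed by the TPA named in that certificate."""
    try:
        owner_pk.verify(cert_sig, cert_body)        # delegation issued by the owner
        tpa_pk.verify(challenge_sig, challenge)     # challenge issued by this TPA
    except InvalidSignature:
        return False
    return cert_body.endswith(raw(tpa_pk))          # TPA matches the certificate

challenge = b"audit|file-42|nonce-0001"
print(accept_audit(owner_sk.public_key(), cert_body, cert_sig,
                   tpa_sk.public_key(), challenge, tpa_sk.sign(challenge)))  # True
```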
2.2 Proposed System
The proposed system is an effective and flexible distributed scheme with explicit dynamic data support, including block update, delete and append operations. By adapting Sebé et al.'s protocol with distributed verification of erasure-coded data, the scheme achieves public verifiability and data dynamics while keeping the data private against third party verifiers, and it detects data corruption during the storage correctness verification across the distributed servers (Fig. 3). The protocol design achieves the following security and performance guarantees: 1) public auditability, 2) storage correctness, 3) privacy preservation, 4) batch auditing, and 5) lightweight operation. The proposed model aims to protect cloud data against untrusted service providers. The model involves data owners, cloud service providers and data users. Data owners store data in the cloud and send every share of the data entries to the service providers. Data users access data from the service providers and have access to the public information of the data owners in order to verify the shares received from the service providers. Fast localization of data error means effectively locating the malfunctioning server when data corruption has been detected.

Fig. 3: Proposed System
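The block-level dynamics (update, delete, append) can be pictured with a toy client-side structure. The sketch below only shows how the block list and its verification tags evolve under the three operations; the HMAC tags over (block id, data) are an assumption made for brevity, whereas the actual scheme relies on homomorphic tags so the verifier never needs the block contents.

```python
import hashlib
import hmac
import secrets

class ToyDynamicBlocks:
    """Illustrative only: tracks per-block verification tags across
    update/delete/append. Block ids stay stable, so deleting or appending
    one block never invalidates the tags of the other blocks."""

    def __init__(self, key: bytes, blocks):
        self.key = key
        self.blocks = [(secrets.token_hex(8), b) for b in blocks]  # (id, data)
        self.tags = {bid: self._tag(bid, data) for bid, data in self.blocks}

    def _tag(self, bid: str, data: bytes) -> bytes:
        return hmac.new(self.key, bid.encode() + data, hashlib.sha256).digest()

    def update(self, index: int, new_data: bytes):
        bid, _ = self.blocks[index]
        self.blocks[index] = (bid, new_data)
        self.tags[bid] = self._tag(bid, new_data)   # client re-tags only this block

    def delete(self, index: int):
        bid, _ = self.blocks.pop(index)
        del self.tags[bid]

    def append(self, new_data: bytes):
        bid = secrets.token_hex(8)
        self.blocks.append((bid, new_data))
        self.tags[bid] = self._tag(bid, new_data)

    def verify(self, index: int) -> bool:
        bid, data = self.blocks[index]
        return hmac.compare_digest(self.tags[bid], self._tag(bid, data))

store = ToyDynamicBlocks(b"client-key", [b"block-0", b"block-1", b"block-2"])
store.update(1, b"block-1-v2")
store.append(b"block-3")
store.delete(0)
print(all(store.verify(i) for i in range(len(store.blocks))))  # True
```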

3. PROTOCOL ANALYSIS
A cloud storage system in which there is a client and an untrusted server is considered. The client stores her data in the server without keeping a local copy. Hence, it is of critical importance that the client is able to verify the integrity of the data stored in the remote untrusted server. If the server modifies any part of the client's data, the client should be able to detect it; moreover, when a third party verifier verifies the integrity of the client's data, the data should be kept private against the third party verifier. There are three security requirements for the remote data integrity checking protocol: security against the server with public verifiability, privacy against third party verifiers, and identification of the misbehaving server.

Let m be the file that will be stored in the untrusted server, which is divided into n blocks of equal length: m = m_1 m_2 ... m_n, where n = \lceil |m| / l \rceil. Here l is the length of each file block. Denote by f_k(\cdot) a pseudo-random function defined as f : \{0,1\}^k \times \{0,1\}^{\log_2(n)} \rightarrow \{0,1\}^d, in which k and d are two security parameters. Furthermore, denote the length of N in bits by |N|. To design the remote data integrity checking protocol, the adaptation of Sebé et al.'s protocol needs the following functions:

(a) SetUp(1^k) → (pk, sk): Given the security parameter k, this function generates the public key pk and the secret key sk. pk is public to everyone, while sk is kept secret by the client.
(b) TagGen(pk, sk, m) → D_m: Given pk, sk and m, this function computes a verification tag D_m and makes it publicly known to everyone. This tag will be used for public verification of data integrity.
(c) Challenge(pk, D_m) → chal: Using this function, the verifier generates a challenge chal to request the integrity proof of file m. The verifier sends chal to the server.
(d) GenProof(pk, D_m, m, chal) → R: Using this function, the server computes a response R to the challenge chal. The server sends R back to the verifier.
(e) CheckProof(pk, D_m, chal, R) → {success, failure}: The verifier checks the validity of the response R. If it is valid, the function outputs success; otherwise it outputs failure. The secret key sk is not needed in the CheckProof function.
(f) CheckMisbehave(r, en_f, m') → n: Let r be the number of different rows for which the user asks for checking in a challenge against the encrypted file matrix en_f, and let m' be the matching factor. Using this function, the verifier can detect the misbehaving server; if none of the specified rows in the process are deleted or modified, the adversary avoids detection.

These functions are also used for data dynamics. The proposed protocol is correct in the sense that the server can pass the verification of data integrity as long as both the client and the server are honest. The protocol is secure against the untrusted server: assuming the client is honest, the server can pass the verification process successfully if and only if it has access to the complete and uncorrupted data. Finally, the protocol is private against third party verifiers: when the verifier is not the client, the protocol must ensure that no private information about the client's data is leaked to the third party verifier.
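As a concrete illustration of how the functions interact, the following is a minimal toy sketch. It is not Sebé et al.'s actual construction: the per-block tag T_i = g^{m_i} mod N, the fixed demo modulus and the coefficient range are assumptions chosen for brevity, tags are not bound to block indices, and there is no privacy protection. The sketch only shows that CheckProof needs the public key, the public tags and the challenge, never the file itself or sk.

```python
import secrets

def setup():
    """SetUp(1^k) -> (pk, sk). Toy parameters only; a real scheme would use a
    properly generated RSA modulus, and sk would include its factorization."""
    p, q = 1000003, 1000033
    N, g = p * q, 2
    return (N, g), (p, q)

def tag_gen(pk, blocks):
    """TagGen(pk, sk, m) -> D_m: one public tag T_i = g^{m_i} mod N per block."""
    N, g = pk
    return [pow(g, int.from_bytes(b, "big"), N) for b in blocks]

def challenge(num_blocks, sample_size=3):
    """Challenge(pk, D_m) -> chal: random block indices with random coefficients."""
    rng = secrets.SystemRandom()
    return {i: rng.randrange(1, 2**16) for i in rng.sample(range(num_blocks), sample_size)}

def gen_proof(blocks, chal):
    """GenProof(pk, D_m, m, chal) -> R: the server aggregates the challenged blocks."""
    return sum(a * int.from_bytes(blocks[i], "big") for i, a in chal.items())

def check_proof(pk, tags, chal, proof):
    """CheckProof(pk, D_m, chal, R): check g^R = prod T_i^{a_i} (mod N), without m or sk."""
    N, g = pk
    rhs = 1
    for i, a in chal.items():
        rhs = rhs * pow(tags[i], a, N) % N
    return pow(g, proof, N) == rhs

blocks = [b"block-%d" % i for i in range(8)]     # the outsourced file m, split into blocks
pk, sk = setup()
tags = tag_gen(pk, blocks)                        # published alongside the file
chal = challenge(len(blocks))                     # any verifier can issue this
print(check_proof(pk, tags, chal, gen_proof(blocks, chal)))  # True while the data is intact
```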
3.1 Security against the Server with Public Verifiability
The remote data integrity checking protocol is said to be secure against the server if, for any PPT (probabilistic polynomial time) adversarial server, CheckProof(pk, D_m, chal, R) = success only when the server has access to the complete and uncorrupted data, except with negligible probability.

3.2 Privacy against Third Party Verifiers
For the remote data integrity checking protocol $\Pi$, if there exists a PPT simulator $S_V$ such that $\{S_V(x, f_V(x, y))\}_{x,y \in \{0,1\}^*} \stackrel{c}{\equiv} \{\mathrm{view}^{\Pi}_V(x, y)\}_{x,y \in \{0,1\}^*}$, then $\Pi$ ensures privacy against third party verifiers, where $\stackrel{c}{\equiv}$ denotes computational indistinguishability.

3.3 Identification Probability for Misbehaving Servers
From the user's perspective, the model has to capture all kinds of threats towards cloud data integrity. Because cloud data do not reside at the user's local site but in the CSP's address domain, these threats can come from two different sources: internal and external attacks. For internal attacks, the CSP can be self-interested, untrusted and possibly malicious. Not only may it desire to move data that has not been or is rarely accessed to a lower tier of storage than agreed for monetary reasons, but it may also attempt to hide a data loss incident caused by management errors. For external attacks, data integrity threats may come from outsiders who are beyond the control domain, for example economically motivated attackers. They may compromise a number of cloud data storage servers in different time intervals and subsequently be able to modify or delete users' data while remaining undetected. Therefore, the proposed model captures both external and internal threats towards cloud data integrity.

Suppose n servers are misbehaving due to a possible compromise or failure, and assume the adversary modifies the data blocks in z rows out of the l rows in the encoded file matrix. Let r be the number of different rows for which the user asks for checking in a challenge. Let X be a discrete random variable defined as the number of rows chosen by the user that match the rows modified by the adversary. The matching probability that at least one of the rows picked by the user matches one of the rows modified by the adversary is analysed first:

P_r^m = 1 - P\{X = 0\} = 1 - \prod_{i=0}^{r-1} \left(1 - \min\left\{\frac{z}{l-i}, 1\right\}\right).

If none of the specified r rows in the i-th verification process are deleted or modified, the adversary avoids detection. Detection is achieved by comparing the response values R_i^{(j)} with the pre-stored tokens v_i^{(j)}, where j \in \{1, \ldots, n\}. The probability of error localization, i.e., of identifying the misbehaving server(s), is computed in a similar way: it is the product of the matching probability for the sampling check and the probability of the complementary event of a false negative. The matching probability over the \hat{z} modified rows of the encoded file matrix, where \hat{z} \le z, is

\hat{P}_r^m = 1 - \prod_{i=0}^{r-1} \left(1 - \min\left\{\frac{\hat{z}}{l-i}, 1\right\}\right).

Next, the false negative probability \hat{P}_r^f is considered, i.e., the probability that R_i^{(j)} = v_i^{(j)} even though at least one of the \hat{z} blocks has been modified. Since this requires two different data vectors to collide, \hat{P}_r^f = 2^{-p}. Thus, the identification probability for misbehaving server(s), i.e., the detection probability against data modification, is

\hat{P}_d = \hat{P}_r^m \cdot (1 - \hat{P}_r^f).

The above formulation for localization of a misbehaving server is integrated into the remote data integrity checking protocol of Sebé et al., making it more efficient and secure. The protocol can easily be extended into a probabilistic one by using the probabilistic framework. The proposed protocol has very good efficiency in terms of communication, computation and storage costs.
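A few lines of Python make these probabilities concrete. The parameter values below (l, z, r and the token length p) are illustrative assumptions, not figures taken from the paper.

```python
from math import prod

def matching_prob(l: int, z: int, r: int) -> float:
    """P_r^m = 1 - prod_{i=0}^{r-1} (1 - min(z / (l - i), 1))."""
    return 1 - prod(1 - min(z / (l - i), 1) for i in range(r))

def identification_prob(l: int, z_hat: int, r: int, p: int) -> float:
    """P_d = P_r^m(z_hat) * (1 - 2^-p): matching probability times the
    complement of the false-negative (collision) probability."""
    return matching_prob(l, z_hat, r) * (1 - 2 ** -p)

# Example: l = 1000 rows in the encoded file matrix, the adversary tampers
# with z = 10 of them, the user challenges r = 300 rows, and tokens are
# p = 160 bits long; the misbehaving server is caught with probability ~0.97.
print(round(matching_prob(1000, 10, 300), 4))
print(round(identification_prob(1000, 10, 300, 160), 4))
```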
4. RELATED WORK
Shah et al. [8], [9] propose allowing a TPA to keep online storage honest by first encrypting the data and then sending a number of pre-computed symmetric-keyed hashes over the encrypted data to the auditor. The auditor verifies both the integrity of the data file and the server's possession of a previously committed decryption key. This scheme only works for encrypted files, and it suffers from auditor statefulness and bounded usage, which may bring an on-line burden to users when the keyed hashes are used up.

Ateniese et al. [6] were the first to define the provable data possession (PDP) model for ensuring possession of files on untrusted storage. Their scheme utilizes RSA-based homomorphic authenticators for auditing outsourced data and suggests randomly sampling a few blocks of the file. However, the public auditability in their scheme demands that the linear combination of sampled blocks be exposed to the external auditor. When used directly, their protocol is not provably privacy preserving, and thus may leak user data information to the auditor. In their subsequent work, Ateniese et al. [10] described a PDP scheme that uses only symmetric key based cryptography. This method has lower overhead than their previous scheme and allows for block updates, deletions and appends to the stored file, which is also supported in our work. However, their scheme focuses on the single server scenario and does not provide data availability guarantees against server failures, leaving both the distributed scenario and the data error recovery issue unexplored. The explicit support of data dynamics has been further studied in two recent works [11] and [12].

Schwarz et al. [13] proposed a scheme to ensure static file integrity across multiple distributed servers, using erasure coding and block-level file integrity checks. Some ideas of their distributed storage verification protocol are adopted here; however, the present scheme further supports data dynamics and explicitly studies the problem of misbehaving server identification, while theirs did not. Zhuo Hao et al. [14] proposed a remote data integrity checking protocol that supports public verifiability without the support of a TPA, and compared the properties of their protocol with the then existing protocols. Wang et al. [15] proposed a flexible distributed cloud storage integrity auditing mechanism utilizing homomorphic tokens and distributed erasure-coded data that detects Byzantine failures, malicious data modification attacks and server colluding attacks.

All the above schemes provide efficient methods for secure data verifiability, data storage integrity and detection of server attacks in cloud based storage separately. In this paper, the proposed adaptation of Sebé et al.'s protocol combines these characteristic functions, making it more efficient and secure when compared to the other protocols.

5. CONCLUSION & FUTURE WORK
In this paper, a privacy-preserving protocol for data storage in the cloud has been proposed, investigating data security and integrity in cloud storage. The focus is on stopping data from being disclosed by untrusted service providers when data owners distribute their database entries. To achieve the assurance of cloud data integrity and availability and enforce the quality of dependable cloud storage service for users, an effective and flexible distributed scheme with explicit dynamic data support, including block update, delete and append, has been implemented. Future research aims to extend the protocol to support data-level dynamics at minimal cost, including extensive security analysis and resilience to failures such as Byzantine failure and other malicious data attacks.

REFERENCES
[1] P. Mell and T. Grance, "Draft NIST working definition of cloud computing," referenced on June 3rd, 2009, online at http://csrc.nist.gov/groups/sns/cloud-computing/index.html, 2009.
[2] Wikipedia, "Cloud computing," online at http://en.wikipedia.org/wiki/Cloud_computing.
[3] Amazon.com, "Amazon S3 availability event: July 20, 2008," online at http://status.aws.amazon.com/s3-20080720.html, July 2008.
[4] S. Wilson, "Application engine outage," online at http://www.cio-weblog.com/50226711/appengine_outage.php, June 2008.
[5] B. Krebs, "Payment Processor Breach May Be Largest Ever," online at http://voices.washingtonpost.com/securityfix/2009/01/payment_processor_breach_may_b.html, Jan. 2009.
[6] G. Ateniese, R. Burns, R. Curtmola, J. Herring, L. Kissner, Z. Peterson, and D. Song, "Provable data possession at untrusted stores," Cryptology ePrint Archive, Report 2007/202, 2007, http://eprint.iacr.org/.
[7] Q. Wang, C. Wang, J. Li, K. Ren, and W. Lou, "Enabling public verifiability and data dynamics for storage security in cloud computing," in Proc. of ESORICS '09, Saint Malo, France, Sep. 2009.
[8] M. A. Shah, R. Swaminathan, and M. Baker, "Privacy-preserving audit and extraction of digital contents," Cryptology ePrint Archive, Report 2008/186, 2008, http://eprint.iacr.org/.
[9] M. A. Shah, M. Baker, J. C. Mogul, and R. Swaminathan, "Auditing to keep online storage services honest," in Proc. of HotOS '07, Berkeley, CA, USA: USENIX Association, 2007, pp. 1-6.
[10] G. Ateniese, R. D. Pietro, L. V. Mancini, and G. Tsudik, "Scalable and efficient provable data possession," in Proc. of SecureComm '08, 2008, pp. 1-10.
[11] Q. Wang, C. Wang, J. Li, K. Ren, and W. Lou, "Enabling public verifiability and data dynamics for storage security in cloud computing," in Proc. of ESORICS '09, volume 5789 of LNCS, Springer-Verlag, Sep. 2009, pp. 355-370.
[12] C. Erway, A. Kupcu, C. Papamanthou, and R. Tamassia, "Dynamic provable data possession," in Proc. of CCS '09, 2009, pp. 213-222.
[13] T. Schwarz and E. L. Miller, "Store, forget, and check: Using algebraic signatures to check remotely administered storage," in Proc. of ICDCS '06, 2006, pp. 12-12.
[14] Z. Hao, S. Zhong, and N. Yu, "A Privacy-Preserving Remote Data Integrity Checking Protocol with Data Dynamics and Public Verifiability," IEEE, 2011.
[15] C. Wang, Q. Wang, K. Ren, N. Cao, and W. Lou, "Towards Secure and Dependable Storage Services in Cloud Computing," IEEE, 2011.
[16] C. Wang, Q. Wang, K. Ren, and W. Lou, "Privacy-preserving public auditing for data storage security in cloud computing," in Proc. of IEEE INFOCOM 2010, March 2010.

AUTHORS' PROFILE
Ms. Vaishnavi Moorthy is an M.Tech final year student in the Department of Computer Science & Engineering, Bharath University, Chennai, Tamil Nadu, and a member of the ACM (Association for Computing Machinery), New York/Canada. She has submitted a technical paper on biometrics applications. Her real-time application research projects include (1) Windows Mobile smartphones and (2) securing middleware web services in Android smartphones using the Rapid Service Designer tool for efficient and secure business transactions. She has presented a paper at the International Conference on Science, Engineering & Technology 2011 (ISBN-13 978-81-923320-3-1).

Dr. S. Sivasubramaniam, M.Tech, Ph.D., is an Associate Professor in the Department of Computer Science & Engineering, Bharath University, Chennai, Tamil Nadu. He is a member of the CSI (Computer Society of India) and INAG (Ionosonde Network Advisory Group). He has more than 11 years of teaching and research experience, and his areas of specialization are mobile computing, database management systems, computer networks, network security and data mining.