Secure Hybrid Cloud Architecture for Cloud Computing




Amaresh K Sagar, Student, Dept. of Computer Science and Engineering, LAEC Bidar. Email: mr.amarsagar@gmail.com
Sumangala Patil, Associate Professor and HOD, Dept. of Computer Science and Engineering, LAEC Bidar. Email: sumangala14jan@gmail.com

Abstract: Data deduplication is a data compression technique for eliminating repeated copies of data in cloud storage. It improves storage utilization and, when applied on networks, reduces the volume of data transferred. Deduplication removes redundant data copies, keeps a single physical copy, and points all other references to that copy. It takes place at the file level or the block level: if identical files are removed it is file-level deduplication, and if identical blocks of data are eliminated it is block-level deduplication. To secure the data we use a hybrid cloud together with data encryption. In this paper I show how a hybrid cloud is better than traditional cloud storage and how the new methods secure the data stored in the cloud.

Keywords: Deduplication, Hybrid cloud, File-level deduplication, Block-level deduplication.

I. Introduction:

Cloud computing is internet-based computing: a network of remote servers connected over the Internet stores, shares, manipulates, retrieves, and processes data instead of a local server or personal computer. This technology has many advantages; for example, it enables us to work from anywhere. Cloud computing is carried out over a large communication network such as the Internet. It provides a variety of computing services, from servers and storage to enterprise applications such as email, security, data backup, and data transfer, all delivered over the Internet, and it is an important low-cost storage solution for businesses. Cloud computing provides vast storage for all sectors, such as government and enterprise, as well as for storing personal data. Platform users can access and share resources on the cloud without knowing the background implementation details. Cloud computing provides IaaS (Infrastructure as a Service), PaaS (Platform as a Service), and SaaS (Software as a Service). The cloud delivers a hosting environment that is flexible, scalable, secure, and available, while saving corporations capital, time, and resources. Data deduplication is one of the important techniques for improving storage utilization, security, and scalability.

In a public cloud, any user can save data to the cloud: users can upload, download, and delete their own data without seeking privileges from an administrator, which raises many security concerns. A private cloud is under the surveillance of an organization or a governing body, and anonymous users cannot use its services: a user must register with the organization to use them, and depending on the granted privileges a registered user can upload, download, and modify data. To overcome the disadvantages of private and public clouds, we use a hybrid cloud, a combination of the two in which critical or very important data is stored in the private cloud while other data is accessible from the public cloud.

Data deduplication is the process of identifying and eliminating redundant copies of repeating data in cloud or other storage, and it is currently widely used to improve storage utilization. It can be done at the file level or the block level. File-level deduplication identifies and removes identical files; this method is also called single-instance storage. Once a file is stored, every further identical copy points to that stored instance. Block-level deduplication detects redundant data within and across files; this method is also called sub-file deduplication.
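To make the two granularities concrete, the following is a minimal sketch of block-level deduplication in Python (my illustration, not the paper's implementation): files are split into fixed-size blocks, each block is fingerprinted with SHA-256, and only previously unseen blocks are physically stored. The block size and hash function are assumed choices.

```python
# Minimal block-level deduplication sketch (illustrative assumptions:
# fixed 4 KiB blocks, SHA-256 fingerprints, in-memory stores).
import hashlib

BLOCK_SIZE = 4096  # assumed block size in bytes

block_store = {}   # fingerprint -> block bytes (one physical copy)
file_index = {}    # file name -> list of fingerprints (the "pointers")

def put_file(name: str, data: bytes) -> int:
    """Store a file; return how many new blocks were actually written."""
    new_blocks = 0
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        fp = hashlib.sha256(block).hexdigest()
        if fp not in block_store:      # duplicate check: store only once
            block_store[fp] = block
            new_blocks += 1
        recipe.append(fp)
    file_index[name] = recipe
    return new_blocks

def get_file(name: str) -> bytes:
    """Reassemble a file from its block fingerprints."""
    return b"".join(block_store[fp] for fp in file_index[name])

# Two files with identical content: the blocks are stored only once.
put_file("a.txt", b"A" * 8192)
print(put_file("b.txt", b"A" * 8192))  # 0 new blocks: full duplicate
```

In this sketch a second upload of identical content writes zero new blocks; file-level deduplication is the degenerate case in which the whole file is treated as a single block.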

II. Existing System:

Fig 1: Existing system architecture.

Traditional encryption is incompatible with deduplication when providing data confidentiality. In the existing system, users encrypt their data with their own keys, so identical data copies belonging to different users produce different ciphertexts, which adds overhead to deduplication and makes it almost impossible. The existing system uses three entities: the user, the private cloud, and the S-CSP. The user requests the privileges held in the private cloud, and each privilege is represented in the form of a token. The definition of a privilege may vary across applications, for example role-based or time-based privileges, and the access rights to files are defined by the privileges. Once the user gains the tokens, the user computes and sends duplicate-check tokens to the public cloud for the authorized duplicate check. Depending on the privileges gained, the user can upload or download data. To upload a file, the client first performs a file-level duplicate check; if the file is found to be a duplicate then all of its segments must be duplicates, otherwise the user performs a block-level duplicate check, which identifies the unique blocks of data to be uploaded. Each data copy is associated with a token for the duplicate check.
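As an illustration of this flow, here is a minimal sketch of the authorized duplicate check. My assumptions: the private cloud issues a per-privilege key, and a duplicate-check token is an HMAC of a content hash under that key; the actual token construction in the scheme may differ.

```python
# Sketch of file-level-then-block-level duplicate checking with
# privilege-bound tokens (hypothetical key and token construction).
import hashlib, hmac

PRIVILEGE_KEYS = {"role:staff": b"key-issued-by-private-cloud"}  # hypothetical
public_cloud_tokens = set()  # tokens for data already in the public cloud

def dup_check_token(privilege: str, content: bytes) -> str:
    digest = hashlib.sha256(content).digest()
    return hmac.new(PRIVILEGE_KEYS[privilege], digest, hashlib.sha256).hexdigest()

def upload(privilege: str, file_bytes: bytes, block_size: int = 4096) -> str:
    # 1. File-level duplicate check: a hit means nothing is uploaded.
    if dup_check_token(privilege, file_bytes) in public_cloud_tokens:
        return "file duplicate: nothing uploaded"
    # 2. Block-level duplicate check: upload only the unseen blocks.
    uploaded = 0
    for i in range(0, len(file_bytes), block_size):
        token = dup_check_token(privilege, file_bytes[i:i + block_size])
        if token not in public_cloud_tokens:
            public_cloud_tokens.add(token)
            uploaded += 1
    public_cloud_tokens.add(dup_check_token(privilege, file_bytes))
    return f"uploaded {uploaded} new blocks"

print(upload("role:staff", b"x" * 10000))
print(upload("role:staff", b"x" * 10000))  # second copy is deduplicated
```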

The problem with this system is that privileges cannot be shared; that is, one user's privileges cannot be given to another. While uploading data to or downloading it from the public cloud, the data is not compressed, which consumes bandwidth unnecessarily. The user is not authenticated before accessing the private cloud, and data that is accessed frequently is given the same bandwidth as data that is accessed rarely. To address these issues, a new system is designed.

III. Roles of Entities:

A. User: The user is an entity who wants to store and access data in the public cloud. The user requests privileges from the private cloud; once the privileges are gained, the user uses them to access the files or data stored in the public cloud, and may upload or download data. The user uploads only unique data to save upload bandwidth. Each file stored in the cloud is protected by a convergent encryption key (see the sketch after this section), and these files can be accessed only by authorized users.

B. Private Cloud: Since the public cloud is not fully trusted for security, we make use of a private cloud. The private cloud acts as an interface and infrastructure between the public cloud and the user. The privileges for the files in the public cloud are managed by the private cloud: once a request for privileges is raised by the user, the private cloud provides them based on the user's authority.

C. S-CSP in Public Cloud: The S-CSP (storage cloud service provider) is an entity that provides the data storage service in the public cloud, storing the data on behalf of the user. The S-CSP has abundant storage capacity and computational power. To reduce storage cost, the S-CSP eliminates duplicate data through deduplication and saves only unique data.
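The convergent encryption mentioned in the user's role can be sketched as follows: the key is derived from the file content itself, so identical plaintexts produce identical ciphertexts and remain deduplicable even when encrypted. This is a minimal illustration; the cipher (AES-256-GCM from the Python `cryptography` package) and the nonce derivation are my assumptions, since the paper does not specify them.

```python
# Minimal convergent encryption sketch (assumed cipher: AES-256-GCM;
# the paper does not name one). Key and nonce are both derived from the
# plaintext, so identical files yield identical ciphertexts and the
# S-CSP can still deduplicate them.
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def convergent_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    key = hashlib.sha256(plaintext).digest()    # content-derived key
    nonce = hashlib.sha256(key).digest()[:12]   # deterministic nonce
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return key, ciphertext                      # user retains the key

def convergent_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    nonce = hashlib.sha256(key).digest()[:12]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

k1, c1 = convergent_encrypt(b"same document")
k2, c2 = convergent_encrypt(b"same document")
assert c1 == c2                                  # deduplicable ciphertext
assert convergent_decrypt(k1, c1) == b"same document"
```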

IV. Proposed System:

Fig 2: System architecture for the proposed system.

Considering the disadvantages of the existing system, I added an identity provider for the user and a data compressor/extractor in the proposed system. Based on the literature survey, I can say that there is a lack of security for the user and a bandwidth performance issue while uploading data to and downloading it from the cloud. To improve bandwidth use while accessing the cloud, I propose compressing the data with the best available compressor on the market that does not alter the data contents, for example Gzip. Gzip is a tool widely used in the IT industry to save bandwidth while uploading or downloading data; the data contents remain the same after compression and extraction. In the existing system, one user's rights/privileges cannot be shared with another user. In the proposed system, if user privileges can be shared then the admin's work is reduced: for example, if a user does not have the privilege to download some data, he or she needs the privilege approved by the admin, and if a known user can grant this on behalf of the admin, the process becomes less complex and the user can download the data from the cloud.

V. Roles of Entities:

A. Identity Provider: This entity helps in providing authentication for the user. The identity provider issues an identity to each user who tries to access the system. Before registering with the system, the user must provide his or her identity to the identity provider; once the identity is approved, the user can use the system. User details such as unique IDs are stored by this entity, and by these the system can differentiate between authorized and unauthorized users.

B. Data Compressor/Decompressor: Gzip is a software application used for compression and decompression. When a file is compressed by this software, it receives the extension .gz, and such files can be extracted with the Gzip application. Gzip is built on the Deflate algorithm, which combines Huffman coding and LZ77, and the tool was developed by Jean-loup Gailly and Mark Adler. Gzip compresses a single file; compressed archives are created by combining a collection of files into a tar unit and compressing that archive with Gzip. The final output has an extension such as .tar.gz or .tgz, and these files are usually called tarballs.
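The compress-before-upload step can be illustrated with Python's standard gzip module (the paper names the Gzip tool; using the library API here is my assumption). Compression is lossless, so the round trip returns byte-identical content:

```python
# Sketch of the proposed compress-before-upload / decompress-after-
# download step using Python's standard gzip module.
import gzip

def compress_for_upload(data: bytes) -> bytes:
    return gzip.compress(data)       # smaller payload -> less bandwidth

def decompress_after_download(payload: bytes) -> bytes:
    return gzip.decompress(payload)

original = b"cloud storage report\n" * 1000
payload = compress_for_upload(original)
print(len(original), "->", len(payload), "bytes sent")
assert decompress_after_download(payload) == original  # contents unchanged
```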

VI. Literature Survey:

A. Deduplication and compression techniques in cloud design. Authors: Amrita Upadhyay, Pratibha R Balihalli, Shashibhushan Ivaturi.
Amrita Upadhyay et al. [8] proposed deduplication and compression techniques in cloud design: a solution for deduplication on the cloud in which deduplication and compression save 47.5% of processing time, remove 80% of redundant data, and reduce bandwidth utilization by 31%. The GZIP tool is used for compression and decompression of files; with GZIP the file contents remain unchanged, and the user can access the files or data without any loss. They achieved this with the help of segmentation and binning methods, and also noted that deduplication can be applied at the bucket level as well as across cloud storage.

B. Fast and secure laptop backups with encrypted de-duplication. Authors: P. Anderson and L. Zhang.
Users tend to save their data on personal computers, portable hard disks, and pen drives. These devices may become corrupted and hardware may fail, and ordinary backup solutions do not fit this environment. This paper suggests a step-by-step procedure that exploits the data users have in common to speed up backups and reduce storage needs, while providing end-user security for personal data.

C. Secure deduplication with efficient and reliable convergent key management. Authors: Li, Jin, et al.
Deduplication is a process for removing identical copies of data content in a huge data warehouse: if the same data is saved repeatedly it unnecessarily consumes space, and it also eats up a lot of bandwidth during upload and download. Convergent encryption is widely used in deduplication, but the problem is how to manage all the keys it generates. This paper addresses that problem and provides a solution for secure deduplication with efficient and reliable convergent key management. The client holds a master key for securing the convergent keys before sending them to cloud storage; this master key must be safeguarded because many convergent keys depend on it, which creates an overhead in managing master keys. To overcome this overhead the Dekey concept was introduced: the client does not need to take care of the keys, which are instead distributed across many servers. The reliability analysis shows that Dekey is trusted and safe. The prototype is implemented with the help of a secret-sharing method called ramp secret sharing, and shows that Dekey incurs limited overhead in realistic scenarios.
D. Secured authorized deduplication based hybrid cloud. Authors: Rajashree Shivshankar Walunj, Deepali Anil Lande, Nilam Shrikrushna Pansare.
Deduplication deletes identical copies of data and keeps only one copy, with the other copies pointing to that content. It is a data compression technique for improving bandwidth efficiency and storage utilization, and it is widely used in cloud computing, where it makes data management scalable and addresses the storage problem. To protect the confidentiality of sensitive data, deduplication works with convergent encryption to encrypt the data before uploading. In the public cloud the data is securely stored in encrypted form, while in the private cloud the key is stored with the respective file, so the user does not need to remember the key, and without the key no one can access the file or data in the public cloud.

E. Proof of ownership. Authors: Shai Halevi, Danny Harnik, Benny Pinkas.
Proofs of ownership (PoW) for deduplication systems let a client efficiently prove to the cloud storage that he or she actually owns a file. Several PoW methods based on Merkle hash trees have been suggested to enable client-side deduplication in a bounded-leakage setting, and the authors identify attacks that exploit client-side deduplication. Pietro and Sorniotti proposed another efficient proof-of-ownership method that selects the projection of a file onto randomly chosen bit positions as the confirmation of the file.
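The Merkle hash tree underlying these PoW schemes can be sketched briefly (illustrative only; a real PoW protocol additionally challenges the client on randomly selected leaves and verifies their authentication paths, which is omitted here):

```python
# Compact Merkle-root construction over file blocks; the server can
# later challenge a claimed owner against this root.
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks: list) -> bytes:
    """Hash file blocks pairwise, level by level, up to a single root."""
    level = [_h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node if odd
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

blocks = [b"block-%d" % i for i in range(5)]
print(merkle_root(blocks).hex())  # same file -> same root, so ownership
                                  # can be checked without re-uploading
```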

F. RevDedup: a reverse deduplication storage system optimized for reads to latest backups. Authors: C. Ng and P. Lee.
Deduplication is known to eliminate duplicates effectively, yet it introduces fragmentation that reduces read performance. This paper proposes RevDedup, a deduplication system that optimizes reads to the latest backups of virtual machine images using reverse deduplication. Whereas ordinary deduplication removes duplicates from new data, RevDedup removes duplicates from old data, shifting fragmentation to the old data while keeping the layout of new data sequential. The authors evaluate RevDedup on a three-month span of real-world VM image snapshots from more than 150 users, and show that it achieves high deduplication efficiency, high backup throughput, and a clear gain in read throughput.

VII. Conclusion:

In this paper I compared the existing system and the proposed system in terms of security and the storage space required. The proposed system is better at reducing storage space in the cloud and provides stronger authentication: each user's details are stored with the identity provider, and the security analysis suggests that the proposed system is more secure. While data is uploaded to the cloud it is compressed by the Gzip software, which reduces file sizes and hence both the storage space and the bandwidth consumed; the existing system uses more bandwidth and consumes more storage space. More research is needed to improve security further and to reduce storage space and bandwidth consumption.

Acknowledgements: This work is supported by my guide Mrs. Sumangala Patil, Head of the Department of Computer Science, who has completed her master's degree and is pursuing a Ph.D. She has 18+ years of teaching experience, is interested in various fields, and has deep knowledge of the relevant technologies.

References
[1] Shai Halevi et al., "Proofs of Ownership in Remote Storage Systems."
[2] Li, Jin, et al., "Secure Deduplication with Efficient and Reliable Convergent Key Management."
[3] P. Anderson and L. Zhang, "Fast and Secure Laptop Backups with Encrypted De-duplication."
[4] Jin Li, Yan Kit Li, et al., "A Hybrid Cloud Approach for Secure Authorized Deduplication," IEEE Transactions on Parallel and Distributed Systems.
[5] www.wikipedia.org/wiki/gzip
[6] OpenSSL Project, http://www.openssl.org/
[7] M. Bellare et al., "Security Proofs for Identity-Based Identification and Signature Schemes."
[8] Amrita Upadhyay, Pratibha R Balihalli, Shashibhushan Ivaturi, "Deduplication and Compression Techniques in Cloud Design."
[9] P. Anderson and L. Zhang, "Fast and Secure Laptop Backups with Encrypted De-duplication."
[10] Rajashree Shivshankar Walunj, Deepali Anil Lande, Nilam Shrikrushna Pansare, "Secured Authorized Deduplication Based Hybrid Cloud."
[11] Shai Halevi, Danny Harnik, Benny Pinkas, "Proofs of Ownership in Remote Storage Systems."
[12] C. Ng and P. Lee, "RevDedup: A Reverse Deduplication Storage System Optimized for Reads to Latest Backups."