
Internal Report, October 2012

Universiteit Leiden Opleiding Informatica

Technology Based Methods to Reduce The Risks of Cloud Computing

Anton den Hoed

MASTER'S THESIS

Leiden Institute of Advanced Computer Science (LIACS)
Leiden University
Niels Bohrweg 1, 2333 CA Leiden
The Netherlands




Master Thesis Project

Thesis submitted in partial fulfillment of the requirements for the degree Master of Science in Computer Science.

Faculty of Science
Leiden Institute of Advanced Computer Science
Leiden University

Supervisors:
First Supervisor: Prof. Dr. Thomas Bäck (Leiden University)
Second reader: Dr. Hans Le Fever (Leiden University)
External Supervisor: Rashid Sohrabkhan, M.Sc. (Accenture)

Author: Anton den Hoed, Student ID


Executive Summary

Cloud computing is a computing paradigm that attracts a lot of attention. It promises cost reduction and the outsourcing of IT services and efforts. These advantages cause a rapid growth of the cloud computing market. However, the risks that come with outsourcing software and data to another party are problematic for many companies. Most issues can be traced back to loss of control and regulatory compliance.

The negative effect of these risks on the adoption of cloud computing has led us to conduct a research project with three objectives: first, to find out which risks are most worrisome; second, to discuss and assess existing technology based methods to reduce four of the most important risks; third, to introduce new technology based risk reduction methods. While risk reduction can consist of people-, process- and technology based controls, this thesis focuses on technology based methods.

Based on a review of the literature on the risks of cloud computing and interviews with cloud computing implementation consultants, an overview of the most worrying risks was created. The results indicate that regulatory compliance, data protection, data location, loss of governance and data segregation constitute the top five of most worrisome risks. In four chapters existing and new methods are discussed and assessed for the following risks:

- Data Location
- Data Deletion
- Data Leakage
- Data Segregation

In the chapter on data deletion a new method is introduced. This method consists of an algorithm that uses data provenance metadata to delete all copies of a data artifact. Because the provenance metadata can be inspected, it is possible to verify whether data is really deleted. In the chapter on data location an encryption scheme used for digital broadcasting is applied to constrain which entities can decrypt and use specific data. This method is new to the cloud computing domain. Other contributions of this thesis are insight into which risks play the biggest role in cloud adoption problems and a (literature) study into existing methods for the reduction of the data location, deletion, leakage and segregation risks.

This study has some limitations. Because of the scattering of relevant literature and websites and the secrecy of CSPs, it is not possible to find all possible sources of existing methods, so the overview of existing methods cannot be exhaustive. Furthermore, only five experts were interviewed and they all work for Accenture. While their experience is based on projects with many different customers, it is not possible to generalize the results of the interviews.

Contents

Executive Summary
Acknowledgements
1. Introduction
   1.1. Motivation and Research Goal
   1.2. Research Questions
   1.3. Scope
   1.4. Research Methodology
   1.5. Thesis Outline
2. Cloud Computing
   2.1. Delivery models
      2.1.1. Software as a Service
      2.1.2. Platform as a Service
      2.1.3. Infrastructure as a Service
   2.2. Deployment models
3. The Risks of Cloud Computing
   3.1. Risk management
      3.1.1. Risk identification
      3.1.2. Risk assessment
      3.1.3. Risk treatment
   3.2. The risks of cloud computing
   3.3. Interviews: The most important risks
   3.4. The risks that will be addressed in this thesis
      3.4.1. Data Location
      3.4.2. Data Deletion
      3.4.3. Data Leakage
      3.4.4. Data Segregation
4. Data Location
   4.1. Introduction
   4.2. Methods to enforce data location
      4.2.1. Choosing the data location
      4.2.2. Sticky Policy based methods
      4.2.3. Data Residency
   4.3. Methods to perform data location monitoring
      4.3.1. Logging of data location
      4.3.2. Triangulation of the location of the data
   4.4. Discussion
   4.5. Conclusions
5. Data Deletion
   5.1. Introduction
   5.2. Deletion Methods
      5.2.1. Disk wiping and physical destruction
      5.2.2. Crypto-shredding
   5.3. A data provenance based method for data deletion
      5.3.1. The Open Provenance Model
      5.3.2. General idea
      5.3.3. The algorithm
      5.3.4. Adapters
      5.3.5. An example
      5.3.6. Simulation
   5.4. Discussion
   5.5. Conclusions
6. Data Leakage
   6.1. Introduction
   6.2. Prevention
      6.2.1. Data Masking
      6.2.2. Trusted Computing
      6.2.3. Fog Computing
   6.3. Monitoring
      6.3.1. Log file analysis
      6.3.2. Psychological triggers
   6.4. Discussion
      6.4.1. Data Masking
      6.4.2. Trusted Computing
      6.4.3. Fog Computing
      6.4.4. Psychological triggers
   6.5. Conclusions
7. Data segregation
   7.1. Introduction
   7.2. Data at Rest
      7.2.1. Databases
      7.2.2. File storage
   7.3. Data in Motion
      7.3.1. Virtual Local Area Networks
      7.3.2. Encrypted communication
   7.4. Data in Use
   7.5. Discussion
      7.5.1. Data at Rest
      7.5.2. Data in Motion
      7.5.3. Data in Use
   7.6. Conclusions
8. Discussion and Conclusions
   8.1. Findings
   8.2. Reflection and limitations
   8.3. Contribution
List of Figures
List of Tables
List of Abbreviations
Appendices
Appendix A. Deletion algorithm

1. Introduction

In the last few years cloud computing has become one of the biggest hypes in the IT world. According to Google Trends the hype peaked in 2011 (Figure 1), but cloud computing continues to have a big impact. Analysts from IDC predict that worldwide spending on cloud services will increase from $17 billion in 2009 to $44 billion in 2013 (Figure 2) [1]. Moving to the cloud can be complicated because many risks are associated with cloud computing: data location, regulatory compliance, vendor lock-in, data protection; and the list goes on (see [2]). Poor knowledge about the risks and a lack of good methods to mitigate them reduce the speed at which companies adopt cloud computing [3].

Figure 1. The Google Trends report for the query "cloud computing".

Figure 2. Worldwide IT spending by consumption model (from [1]).

This master thesis aims to find out which data privacy related risks cause the most concern and which technology based methods can be used to reduce these risks. These can be methods that are already used in cloud computing, existing methods which can be applied to cloud computing, or entirely new methods.

The remainder of this chapter gives a more detailed introduction to this research. Motivation and research goal are discussed in section 1.1. Next, the research questions are introduced in section 1.2. Section 1.3 defines the scope of this project. The research methodology is explained in section 1.4. Last, the outline of the thesis is given in section 1.5.

1.1. Motivation and Research Goal

Moving to the cloud comes with many risks. But how big are these risks? And how can they be mitigated? Companies that want to move (a part of) their IT into the cloud often have these questions, but they can be very hard to answer. As a result, companies are hesitant to use cloud services and the adoption of cloud services is delayed [3]. It can be concluded that uncertainty about the risks of cloud computing forms a barrier to cloud adoption [3]. This barrier can be lowered by using effective methods to reduce the risks. But which methods are effective against a specific risk? Sometimes it is possible to introduce policies and to make an agreement between the involved parties in which certain rules are laid down. However, in most cases technology based methods are necessary to help enforce these rules. This thesis discusses and introduces technology based methods which can help to mitigate the biggest risks of cloud computing.

1.2. Research Questions

To determine which risks this thesis should focus on, it is necessary to determine which data privacy related risks exist in the cloud computing domain. Therefore the first research question is:

RQ1. What are the (data privacy) risks of cloud computing?

With an overview of the risks it is possible to determine which risks are related to data privacy. It is not possible to cover all risks, so we need to find out how big the risks are and focus on the biggest ones. Therefore the second research question is:

RQ2. What are the biggest (data privacy) risks of cloud computing?

When RQ2 is answered it is possible to answer the third question for each risk that will be covered. To answer this question there are three sub-questions that need to be answered first.

RQ3. What are the possible (and adequate) technologies for reducing this risk?
a. Are there adequate existing technologies which are used to reduce this risk?
b. Are there existing technologies from outside the cloud computing domain which can be translated to a technology which can be used to adequately reduce the risk?
c. If a and b do not yield adequate technologies: Is it feasible to find new solutions? And, what are these solutions?

1.3. Scope

While there are many risks to cloud computing, this thesis focuses on the risks related to data privacy. For these risks this thesis discusses, improves or introduces several technology-based methods to reduce them. Because there is more to risk reduction than technology, other methods will also be discussed if they support a technology-based method or if it is not possible to use technology to mitigate the risk.

1.4. Research Methodology

In this section the approach for answering the research questions is defined. RQ1 will be answered by performing a literature study. This study includes finding and analyzing relevant literature and creating an overview of the risks. Based on the risk overview, several interviews will be conducted with cloud computing experts who work at Accenture. In these interviews the main question is: "Which risks are, in your experience, the risks that raise the biggest concerns for customers?" Based on how many times a risk was identified as important and on the experts' motivations for choosing certain risks, it will be determined which risks are the most important data privacy risks of cloud computing.

For four of the most important risks a study will be done. The study starts with creating a detailed

description of the risk and giving examples. After that, literature and publicly available information are used to identify the methods that are currently available to mitigate the risk. If the existing repertoire of technology based methods can be improved, research is done to find methods from outside the cloud computing domain which can be used. Such a method can then be translated to the cloud computing domain and extended to meet the requirements of mitigating the specific risk. If the previous steps do not yield adequate methods to mitigate the risk, an attempt is made to create a new technology. When an idea for a new method is found it is worked out in detail, and there will be an attempt to validate the new method.

For each discussed method an analysis of its advantages and disadvantages is made. It is also analyzed whether the method mitigates the whole risk, or whether there are parts which are not covered and have to be addressed in another way.

1.5. Thesis Outline

The remainder of this thesis is organized as follows: Chapter 2 gives a definition of cloud computing and explains its delivery and deployment models. In chapter 3 the risks of cloud computing are discussed, as well as the results of the interviews with experts and an analysis of which risks are the biggest. In chapters 4, 5, 6 and 7 four of the biggest risks are introduced in detail and existing and possibly new risk reduction methods are discussed or introduced. In chapter 8 the results are discussed, the research questions are answered and final conclusions are drawn. This chapter also contains a reflection upon this research project, an overview of the contribution of this thesis and a discussion of the limitations of this research.

Figure 3. Outline of the thesis.


2. Cloud Computing

From a business perspective, cloud computing is a methodology which enables companies to use IT infrastructure, platforms and software as a service [4]. This means that companies do not need to buy and manage their own hardware and software. With cloud computing a company pays for the services on a pay-per-use basis and almost all management is done by the cloud service provider (CSP). Because customers do not need to buy hardware and the applications are delivered via the internet, they can vastly reduce their capital expenses. And because CSPs can offer their services in a more efficient way, the operational costs for a customer are lower. Another big advantage besides the outsourcing of IT and the lower prices is that the time to market can be reduced: customers subscribe to a service and start configuring and using it right away.

From a technology perspective, cloud computing is the concept of using a large pool of virtualized resources to provide standardized services [5]. All users use resources from the same pool, so the use of hardware is more efficient and users can scale up if they have higher capacity demands. The most important aspects of cloud computing are:

Shared resources: With traditional computing methods each organization has its own hardware. This often leads to underutilization because organizations buy their hardware with peak capacity in mind. Another problem is that if demand exceeds the available capacity it is very hard and expensive to scale up. With cloud computing the resources are shared, so customers only have to pay for the resources they actually use. The cloud provider can reduce the amount of hardware because customers do not have the same usage patterns: if one client is in Europe and another in Asia, the peak hours for these clients are at different times of the day. The disadvantage of shared resources is that they are the source of many risks. For example, data from different clients can be stored in the same databases, so it is very important to implement good data segregation.

Standardized services: Whether a CSP provides the possibility to run virtual machines, a development and deployment platform or software services, the services are always standardized. All customers get the same services. The result is that it is often easy to get a cloud service up and running, and configuration and customization possibilities are well documented.

Pay-as-you-go: Customers only pay for the resources they actually use. This can be per virtual machine, per running instance, per user, per record, or in a number of other ways. Billing can occur on a daily, weekly or yearly basis.

Because customers have no capital expenses for the use of cloud computing, and only pay for the resources they use, it is possible to reduce IT costs.

Delivered via the Internet: Cloud services are often hosted off-premise, so users access their applications via the internet. Most often this happens via a web browser, a mobile phone or a tablet computer. There is no need to install software and users' devices can be relatively lightweight.

Integration: It is very important that cloud services can be integrated with other applications and services. Therefore cloud providers offer standardized application programming interfaces (APIs) using protocols such as HTTP, SOAP and REST.

2.1. Delivery models

Figure 4. The delivery models of cloud computing.

There are different ways to deliver cloud services (Figure 4). At the lowest level there is the possibility to run virtual machines on the infrastructure of a CSP; this is called Infrastructure as a Service (IaaS). One level higher there is the possibility to develop and deploy applications on the infrastructure of a CSP; this is called Platform as a Service (PaaS). At the highest level there are standardized applications which are delivered as a service; this is called Software as a Service (SaaS).

Figure 5. The responsibilities of the CSP and customer for different delivery models.

Each delivery model has a different division of responsibilities between the CSP and the customer (Figure 5). With SaaS the customer has the least control, with PaaS the customer controls the software, and with IaaS the customer controls the software and the platform. The delivery models are introduced further in the next three subsections.

2.1.1. Software as a Service

In the traditional situation a company buys licenses for software which is installed on hardware owned by the company. To receive support and updates a separate maintenance agreement has to be signed. This way of working comes with large upfront investments, in-house management and maintenance of hardware and software, and underutilization of resources.

With Software as a Service (SaaS) [4][6] a company subscribes to an application which is delivered by a CSP. The subscription includes the usage of the application, support, backups and other services. Because the CSP is responsible for the management and maintenance of the underlying platform and the application, the customer does not carry the burden of these responsibilities. The most important advantages of SaaS are:

Outsourcing: Companies can outsource their software to external providers, reducing their capital expenditure. Because a CSP provides the same service to many customers, it can build a platform at a lower price per customer and thus the operational costs stay low.

Control: With traditional delivery models the software had to be transferred to customers, which reduced the control the vendor had over the usage of their software. With SaaS a vendor has full control over the usage of their software.

Economies of scale: Because a CSP delivers the same service to many customers and does not need to provision for peak capacity for each client at the same time, the CSP profits from standardization and lower hardware requirements.

No extra hardware needed: SaaS applications are delivered via the internet and can be used via a web browser, so a user only needs a computer with a web browser.

Some examples of SaaS applications are Salesforce, Google Apps, Microsoft Office365 and NetSuite.

2.1.2. Platform as a Service

When a company wants to develop a custom application, the normal practice is to program the application based on a collection of (standard) components like database servers and authentication services which have to be installed and maintained. Deployment of the application is done on hardware owned by the company. This is relatively complex and expensive.

Platform as a Service (PaaS) [4][6] is a cloud delivery model which tries to make the development and deployment of applications less complex and expensive. A PaaS provider provides a development

and a deployment environment to customers. The provider also takes care of automatic scalability, reliability, security and monitoring. The development environment consists of development tools, standard components and testing facilities. Many PaaS providers offer the following standard components:

- Database
- File Storage
- Queuing Services
- Content Delivery Network
- Large scale data processing

The deployment environment provides the application to the users and automatically scales the amount of resources that are used if there is high demand for the application. Some examples of PaaS providers are Google App Engine, Amazon AWS, Microsoft Azure and Force.com.

2.1.3. Infrastructure as a Service

Infrastructure as a Service (IaaS) is the delivery model where cloud infrastructure is used to deliver a virtualization platform to clients. A client can deploy its own or vendor supplied virtual machines to run software in the cloud, and pays for the resources (CPU time, memory, storage and network usage) it consumes. The provider offers a basic virtualization platform with only a limited number of standard services such as backups, scaling, security and virtual machine management. Some examples of IaaS providers are Amazon EC2, Cloud.com and GoGrid.

2.2. Deployment models

Deploying a cloud can be done in different ways [6], depending on the requirements of the users of the cloud. A cloud can be open to the public, open to one organization or to a specific community. It is also possible to combine different types of clouds.

The first option is a public cloud, which is a cloud where every company or consumer can subscribe to the services that are provided by the CSP. The resources of the cloud are shared between the different customers.

When it is not desired or allowed to use a public cloud for specific services it is also possible to use a private cloud. With a private cloud the customer controls which services are delivered to which users. There are a couple of different variants of private clouds [4]:

Dedicated: A private cloud which is owned by a single company, and which is hosted on-premise or at a colocation facility. The company has full control over its cloud.

Community: A cloud which is used by a community of companies. It can be owned and hosted by one of the companies or by a third party. The community has full control over the usage of the cloud.

Managed: A private cloud which is owned by a single company, but which is managed by a cloud vendor.

If a company uses different services which are hosted in different clouds, this is called a hybrid cloud. With a hybrid cloud a company can use public cloud based services for non-sensitive and non-core business applications and a private cloud for sensitive and critical applications.


3. The Risks of Cloud Computing

3.1. Risk management

Organizations operate in an uncertain world. Every project or activity has certain risks, but what is risk? According to ISO 31000, risk is "the effect of uncertainty on objectives" [7]. In most cases these effects are negative, but positive effects are also possible. Risk can also be defined in a quantifiable way, as the probability that a certain event will occur multiplied by the impact that event has when it happens:

Risk = Probability x Impact

For example, an event that is expected once every ten years (probability 0.1 per year) with an impact of 50,000 euros represents a risk of 5,000 euros per year.

To manage risk effectively it is important (1) to identify and assess the threats to the objectives, (2) to determine the vulnerability of critical assets to these threats, (3) to determine the risk, (4) to identify methods to reduce those risks and (5) to prioritize the risks.

3.1.1. Risk identification

The identification of threats can be done using many different methods. Examples are brainstorming, workshops, SWOT analysis and scenario analysis. In [8] these and other techniques are discussed in detail.

3.1.2. Risk assessment

To determine how big the risk posed by a threat is, the probability and the impact have to be determined. For some risks it is possible to calculate the impact exactly and to determine the probability based on statistical data from comparable activities. However, in many cases this is very hard because there is no hard data to calculate the impact and there is not enough experience with comparable activities to determine the probability. In these cases the impact and probability have to be estimated using other methods.

3.1.3. Risk treatment

Once a risk is identified and assessed it is possible to determine what can be done to manage it. It is possible to avoid the risk entirely, transfer (part of) the risk to another party, accept the risk or reduce the risk. Reducing a risk can be done by a combination of people, process and technology. People need to be aware, trained and accountable. Structured and repeatable processes are needed. Technology can be used to continually evaluate the risk and the associated controls, and to monitor and enforce rules which reduce the risk [9].

3.2. The risks of cloud computing

This thesis focuses on using technology based methods to reduce the biggest risks of cloud computing. But what are the biggest risks?

The European Network and Information Security Agency (ENISA) has performed a risk assessment of cloud computing [2]. Threats were identified by analyzing three different use cases. Experts determined the likelihood and impact of each threat, and the risk levels were estimated by using a risk estimation table which is based on ISO/IEC 27005:2008 (see Table 1).

Table 1. Levels of risk used in the ENISA study (from [2]). Each combination of the likelihood of an incident scenario (Very Low to Very High) and the business impact (Very Low to Very High) is mapped to a risk level.

In total the report identifies 35 risks, divided into four categories. In Table 2 all the risks from [2] are listed by category, and in Table 3 the distribution of these risks by probability and impact is shown.

Table 2. The risks of cloud computing according to ENISA.

Policy and organizational risks:
R.1 Lock-in
R.2 Loss of governance
R.3 Compliance challenges
R.4 Loss of business reputation due to co-tenant activities
R.5 Cloud service termination or failure
R.6 Cloud provider acquisition
R.7 Supply chain failure

Technical risks:
R.8 Resource exhaustion
R.9 Isolation failure
R.10 Cloud provider malicious insider
R.11 Management interface compromise
R.12 Intercepting data in transit
R.13 Data leakage on up/download
R.14 Insecure or ineffective deletion of data
R.15 Distributed denial of service
R.16 Economic denial of service
R.17 Loss of encryption keys
R.18 Undertaking malicious probes or scans
R.19 Compromise service engine
R.20 Conflicts between customer hardening procedures and cloud environment

Legal risks:
R.21 Subpoena and e-discovery
R.22 Risk from changes of jurisdiction
R.23 Data protection risks
R.24 Licensing risks

Risks not specific to the cloud:
R.25 Network breaks
R.26 Network management
R.27 Modifying network traffic
R.28 Privilege escalation
R.29 Social engineering attacks
R.30 Loss or compromise of operational logs
R.31 Loss or compromise of security logs
R.32 Backups lost or stolen
R.33 Unauthorized access to premises
R.34 Theft of computer equipment
R.35 Natural disasters

Table 3. The distribution of the ENISA risks over the probability and impact levels (from [2]).

ENISA classifies loss of governance, vendor lock-in, isolation failure, compliance risks, management interface compromise, data protection, insecure or incomplete data deletion and malicious insiders as the top risks of cloud computing.

In "Top Threats to Cloud Computing" [10] the Cloud Security Alliance identifies the seven biggest risks to cloud computing: abuse and nefarious use of cloud computing, insecure interfaces and APIs, malicious insiders, shared technology issues, data loss or leakage, account or service hijacking, and an unknown risk profile.

Gartner identifies the top risks in "Assessing the Security Risks of Cloud Computing" [11]: privileged user access, regulatory compliance, data location, data segregation, recovery, investigative support, long-term viability and availability.

Table 4 shows an overview of the top risks identified by ENISA, CSA and Gartner.

3.3. Interviews: The most important risks

The risk analysis documents of ENISA, CSA and Gartner have resulted in a list of the most important risks of cloud computing. However, it is not possible to cover all these risks in this thesis. To determine which risks have the highest priority, five cloud computing experts from Accenture were interviewed. Before an interview the expert received a document with an overview of the top risks from the three analyses (Table 4) and instructions on how to access the risk analysis documents. During the interview the following question was asked: "Which risks are, based on your experience, the most important risks?" Interviewees were given the possibility to introduce risks which were not on the list.

They were also asked to motivate their choices.

Table 4. The top risks of cloud computing according to ENISA, CSA and Gartner.

ENISA [2]                            | CSA [10]                                   | Gartner [11]
-------------------------------------|--------------------------------------------|-----------------------
Loss of governance                   | Abuse and nefarious use of cloud computing | Privileged user access
Vendor lock-in                       | Insecure interfaces and APIs               | Regulatory compliance
Isolation failure                    | Malicious insiders                         | Data location
Compliance risks                     | Shared technology issues                   | Data segregation
Management interface compromise      | Data loss or leakage                       | Recovery
Data protection                      | Account or service hijacking               | Investigative support
Insecure or incomplete data deletion | Unknown risk profile                       | Long-term viability
Malicious insider                    |                                            | Availability

During the interviews no new risks were introduced to the list. Table 5 shows how many times a risk was indicated as one of the most important risks during the interviews.

Table 5. The risks that were mentioned as being very important and the number of times they were mentioned.

Risk                                 | Times mentioned
-------------------------------------|----------------
Regulatory compliance                | 5
Data protection                      | 4
Data location                        | 4
Loss of governance                   | 3
Data segregation                     | 2
Availability                         | 2
Vendor lock-in                       | 1
Insecure or incomplete data deletion | 1
Insecure interfaces and APIs         | 1
Privileged user access               | 1

Every interviewee classified regulatory compliance as one of the top risks of cloud computing. A company needs to comply with a plethora of regulations and laws, such as privacy laws, accountability laws, data retention rules and internal data protection policies.

Data protection was also mentioned in almost every interview. Most applications contain (privacy) sensitive data that should not be accessible to or modifiable by third parties. However, it is not immediately clear which security measures are implemented by a CSP. Most CSPs also do not offer the possibility to perform specific audits; customers have to rely on standard reports.

A risk which is a result of regulatory compliance and data protection is data location. In the European Union the privacy laws state that personally identifiable information of EU citizens may only be stored within the EU or in countries with an adequate level of privacy protection [12].

With outsourcing IT activities a company also outsources a part of its controls, so loss of governance is also a hot topic. Software and services run on hardware that is managed by a CSP, and security and privacy controls are essentially the same for each customer. The rise of Bring Your Own Device (BYOD) also makes the consumption of (cloud) IT services less controllable.

Data segregation can be seen as part of data protection. The multitenant architecture of cloud computing implies that the data of different customers is stored in a single environment. This necessitates effective segregation technologies and controls in the data storage, transport and processing stages.

Outsourcing IT to another location requires a cloud which is always available. Many customers are afraid that the cloud or their internet connection will encounter outages and work will come to a halt.

Vendor lock-in, insecure or incomplete data deletion, insecure interfaces and APIs, and privileged user access were each mentioned once during the interviews. The arguments motivating the choices for these risks are mainly the same as the arguments that were used for the other risks.

3.4. The risks that will be addressed in this thesis

This thesis discusses existing, and introduces new, technology based risk reduction methods for four of the most important risks. Because the combination of people, process and technology based controls which can be used to reduce each risk is vastly different, it is not possible to cover all types of risk in this thesis. Two of the most important risks, regulatory compliance and loss of governance, will not be covered directly in the following chapters. However, many other risks are a result of these risks, so they will be covered indirectly.

3.4.1. Data Location

The EU privacy directive [12] demands that privacy sensitive data is stored within the EU or in countries which effectively protect the privacy of EU citizens. This means that personal data from the EU may not be stored in most countries. Most CSPs do not fully disclose where data is stored. Even if a CSP gives its customers means to control data location, there are exceptions in the terms of service which grant the CSP the possibility to move personal data if necessary.

3.4.2. Data Deletion

Cloud services make many copies of the same data: file systems are replicated, backups are made, etc. This makes it very hard to verify whether data deletion is effective. Will all copies be deleted? And is the deletion process irreversible?

3.4.3. Data Leakage

Intentional or unintentional data leakage by insiders is a problem faced by many companies. To mitigate this risk many controls such as endpoint protection and encryption are used. However, with cloud computing more and different devices connect to the cloud. Installing effective monitoring software on all devices is not always possible.

3.4.4. Data Segregation

Problems with data segregation can be seen as a more specific instance of data leakage. Effectively separating the data of different customers is very important. The well-known controls in this field focus on the separate storage of data; however, data is also transported and processed. Which controls are available to separate data while it is transferred over the network or used by a processor?


4. Data Location

4.1. Introduction

When a company plans to migrate an application to the cloud, it usually involves moving data from the premises to a data center of the CSP. But where is that data center? Finding the answer is not as easy as finding out where the CSP is registered, because a CSP can deliver its services from any data center, or even from multiple data centers. Existing literature on the risks of cloud computing often classifies data location as an important risk [2][11]. In the remainder of this section the risk aspects of data location in the cloud are discussed. Because data location is closely related to privacy regulations, the focus is on personal data.

If a company wants to store personally identifiable information (PII) in the cloud, several problems might arise:

- Is it allowed to move PII to the country where the data will be stored by the CSP?
- Can authorities access the data if they need it for legal purposes?
- Is it possible to get information about where the data is stored in case the CSP has data centers in multiple countries?
- Is it possible to enforce a data location policy?

Transferring personal data to another country can be difficult. The European Union demands from its member countries that they implement privacy laws which are based on the Data Protection Directive (Directive 95/46/EC [12]). The directive allows the transfer of personal data between EU member states, but it prohibits the transfer of personal data to a third country if that country does not ensure an adequate level of protection. The European Commission decides which countries have an adequate level of protection. At this moment Andorra, Argentina, Australia, Canada, Switzerland, the Faeroe Islands, Guernsey, the State of Israel, the Isle of Man and Jersey are fully recognized as countries with an adequate level of protection (Figure 6) [13]. The transfer of personal data to the United States is only allowed if the recipient of the data has a certificate which confirms that it adheres to the US Department of Commerce's Safe Harbor Privacy Principles, or if the transfer concerns air passenger name records [13].

For a company which is based in an EU member state, the national privacy laws apply to the collection of personal data. Moving the data to another country does not change that. The company remains liable for the handling and protection of its data if it places that data in the cloud. This implies that a company has to ensure that its CSPs act according to the national privacy laws.

Access to data by governments is always a big topic in discussions on data privacy [10][11]. In most countries the government of the country where a CSP is located can issue warrants to retrieve data. In cases like anti-terrorism it is sometimes the case (for example in the United States [16]) that a CSP is not allowed to disclose to its customers that it has received a warrant for the disclosure of their data. If a CSP has a subsidiary in a country or does systematic business in that country, the CSP has to

comply with the local laws. Thus, if a CSP offers its services to customers in the US, it has to comply with US law. Even if a CSP has no connection with a country, it is possible that the government of that country gets access to data which is stored by that CSP, namely if that country has a Mutual Legal Assistance Treaty (MLAT) with the country where the CSP is located. The United States has MLATs with over 60 countries [17].

Figure 6. Countries with an adequate level of protection. EU member states are dark blue, fully recognized countries are light blue and the US is green.

Knowing where your data resides can be very difficult in the cloud. Many large CSPs have data centers in different countries, and they often use distributed file systems which automatically copy data to different data centers to ensure data availability if one of the data centers goes offline. Because the algorithms which decide to which data center data is copied are not disclosed, it can be completely unclear where your data is stored. Some CSPs, like Microsoft, address this problem by giving their customers the possibility to choose where their data is stored [18]. Other CSPs, like Google, do not give their customers this possibility [19].

Enforcing the location of data is even harder than knowing the location of the data. Even if CSPs let their customers choose where their data will be stored, there are exceptions in the terms of service which allow moving the data to other places: for solving problems, for support, or to comply with requests from law enforcement agencies.

4.2. Methods to enforce data location

The simplest method to mitigate the risk of data location is choosing a CSP which only has data centers in specific countries. If the CSP has data centers in multiple countries and is not willing to guarantee that data is stored in specific countries, it is possible to encrypt the data. If a CSP only stores encrypted data but does not have the private key, it does not store the actual data. This also holds for PII: according to EU member states'

privacy laws, encrypted PII is not PII if an entity is not able to decrypt the data in any feasible way [20]. A very important problem with encrypting data without giving the CSP access to the private key is that it becomes impossible for the cloud service to perform processing tasks on the data. Processing data is very important in almost all applications: report generation, automated workflows, search engine indexing, etc.

4.2.1. Choosing the data location

A CSP may offer its customers the choice of where their data should be stored. Microsoft Windows Azure uses this method by letting customers choose whether their data will be stored in the United States, Europe or Asia [18].

4.2.2. Sticky Policy based methods

In [21] Karjoth et al. introduce the Platform for Enterprise Privacy Practices (E-P3P), a method that enables enterprises to define formalized privacy policies and to enforce these policies. When a data subject enters data he can agree to the applicable privacy policy, as well as select opt-in and opt-out options for specific parts of the policy. For example, a person who creates an account at an e-commerce website indicates that he agrees with the privacy policy and opts out of receiving a weekly news mailing. E-P3P attaches the consent information to each data record or file ("sticky policies"), so it is possible to control data on a very fine-grained level. Major advantages are the possibility to discriminate between different versions of a privacy policy and to have different policies for data subjects who live in different countries. Within the scope of data location this leads to the idea that a data subject or enterprise can use a sticky policy to define where data may be stored or used.

Karjoth et al. [21] state that their methodology protects personal data against misuse or unauthorized disclosure within an enterprise with trusted systems and administrators. With public cloud computing, personal data is transferred to another enterprise where the systems and administrators are not controlled by the original enterprise. To counter this problem it is possible to use encryption techniques [22] where a data subject encrypts his personal data with a general public key and attaches a sticky policy. These techniques ensure that only the private keys of approved data users are able to decrypt the personal data. In the next subsections two cryptographic methods which use this concept are discussed.

Broadcast encryption

Broadcast encryption [23] is an encryption method which supports the specification of who can decrypt data. Each entity which should be able to decrypt documents receives a unique private key. A data subject can explicitly define which private keys are able to decrypt the data. An implementation consists of the following three methods:

Setup(n): This method generates n private keys d_1, ..., d_n and a public key PK.

Encrypt(S, PK): The data subject defines which keys can decrypt the data by creating a set S which contains the identification numbers (1, ..., n) of the selected private keys. Encrypt takes this set and the public key as parameters and produces a header (Hdr) and an encryption key (K) for the data. Now the data (M) can be encrypted using a symmetric encryption method: CM = SymCrypt(M, K). The result of the Encrypt method is (S, Hdr, CM).

Decrypt(S, i, d_i, Hdr, PK): Decryption of the data starts with checking whether i is in the set S (which is taken from the encrypted data container (S, Hdr, CM)). If i is in S and d_i is the valid private key of entity i, Decrypt returns the symmetric encryption key K. The original data can then be obtained by executing SymCrypt(CM, K).

In the context of data location this method can be employed by generating a large number of private keys, for each user and each region. Setting the number of keys to, for example, 2^32 results in a collection of over four billion keys. Each user receives the global public key and a private key. Each server in the data centers of the CSP receives the private key which is associated with the region where the server resides. A data subject starts by defining a set which contains the allowed region identification numbers. Because the data subject should be able to decrypt his own data, his own identification number is also added to the set. With the set and the public key the Encrypt method generates the key K, which is used to encrypt the data. The result of this step is stored in the database or file system of the CSP. When a server needs to access the data it retrieves the encrypted data, the header and the set S. Next, the server performs the decryption operations, which only succeed if the server is in an approved region. The example in Figure 7 demonstrates how broadcast encryption works.

(1) S = {1, 2};
    (Hdr, K) = Encrypt(S, PK);
    CM = SymCrypt(M, K);
    Store (S, Hdr, CM).

(2) Retrieve (S, Hdr, CM);
    K = Decrypt(S, 2, d_2, Hdr, PK);
    M = SymCrypt(CM, K).

(3) Retrieve (S, Hdr, CM);
    K = Decrypt(S, 3, d_3, Hdr, PK) -> Fails, because 3 is not in S.

Figure 7. In (1) the data subject (id = 1) defines that he and the region with id = 2 may decrypt the data. In (2) a server in the region with id = 2 successfully decrypts the data, while in (3) the decryption fails because the server is in the region with id = 3.
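The flow of Figure 7 can be made concrete with a small Python sketch. Note that this is only a toy model of the Setup/Encrypt/Decrypt interface and its data flow, not a real broadcast encryption scheme: real constructions such as the one in [23] use pairing-based cryptography and let the encryptor use only the public key, whereas this toy wraps the content key once per allowed key id.

# Toy model of the broadcast encryption interface (illustration only, not a
# real scheme): the header wraps the symmetric content key once per allowed id.
import secrets
from hashlib import sha256

def setup(n):
    """Generate n private keys; the real scheme's public key PK is omitted here."""
    return {i: secrets.token_bytes(32) for i in range(1, n + 1)}

def _wrap(key, blob):
    # Toy key-wrap: XOR with a hash of the private key (an involution).
    pad = sha256(key).digest()
    return bytes(a ^ b for a, b in zip(blob, pad))

def _sym_crypt(data, k):
    # Toy SymCrypt: XOR with the repeated content key (same for both directions).
    return bytes(a ^ b for a, b in zip(data, k * (len(data) // 32 + 1)))

def encrypt(S, private_keys, message):
    """Return (S, Hdr, CM) as in the scheme; a real scheme needs only PK here."""
    K = secrets.token_bytes(32)                       # symmetric content key
    hdr = {i: _wrap(private_keys[i], K) for i in S}   # one wrapped key per id in S
    return S, hdr, _sym_crypt(message, K)

def decrypt(S, i, d_i, hdr, cm):
    if i not in S:
        raise PermissionError(f"key id {i} is not in the allowed set {S}")
    K = _wrap(d_i, hdr[i])            # unwrap the content key with d_i
    return _sym_crypt(cm, K)

keys = setup(4)                       # e.g. id 1 = data subject, ids 2-4 = regions
S, hdr, cm = encrypt({1, 2}, keys, b"personal data of the data subject")
print(decrypt(S, 2, keys[2], hdr, cm))   # region 2 is in S: plaintext recovered
try:
    decrypt(S, 3, keys[3], hdr, cm)      # region 3 is not in S
except PermissionError as e:
    print(e)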

Ciphertext Policy Attribute Based Encryption

With Ciphertext Policy Attribute Based Encryption (CP-ABE) [24] the data subject is able to encrypt data using a global public key and a constraint on the values of different attributes. Each entity that has to decrypt data has a private key that is generated by giving values for the different attributes that are defined. CP-ABE ensures that the data can only be decrypted if the constraint is satisfied by the attribute values of the entity. To apply this method to data location, each server in the data centers of the CSP has a private key in which a region attribute is set to the region where the server resides. Special care must be taken to ensure that the private keys are not transferred to other machines after they are generated by a key distribution service within the same region.

The method works as follows: The data subject defines in which regions his data may be decrypted by setting the constraint to, for example, region = Europe. After this, ci = Encrypt(pk, data, constraint) is generated and sent to the cloud application. When a server in Europe needs the data of the data subject, the server retrieves the encrypted data and tries to decrypt it. If the constraint evaluates to true using the attribute values from the server's private key, the original data is obtained; otherwise the decrypt function fails. The example in Figure 8 illustrates how CP-ABE works when it is used for file based encryption:

(1) cpabe-keygen -o EUserver1_priv_key pub_key master_key 'region = Europe'
(2) cpabe-keygen -o USAserver1_priv_key pub_key master_key 'region = USA'
(3) cpabe-enc pub_key document.pdf 'region = Europe'
(4) cpabe-dec EUserver1_priv_key document.pdf.enc -> document.pdf
(5) cpabe-dec USAserver1_priv_key document.pdf.enc -> False

Figure 8. Usage of CP-ABE for file encryption. In (1) a private key is generated for a server in Europe, and in (2) for a server in the USA. The data subject defines the region and encrypts his data in (3). In (4) the European server tries to decrypt the file and succeeds, while the USA based server cannot decrypt the file (5).

4.2.3. Data Residency

Another data protection method is data residency [25]. With data residency, sensitive data is tokenized or encrypted by an on-premise proxy server before it is stored in the cloud. This means that sensitive data does not leave the premises of the customer, while it is still possible to benefit from cloud services. A CSP that offers this method is Salesforce [25].
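A minimal sketch of the tokenization idea behind data residency is given below. It is an illustration of the principle only, not any vendor's actual product: sensitive field values are replaced by random tokens before a record leaves the premises, and the token-to-value mapping stays in an on-premise vault. The names (TokenizingProxy, vault) are illustrative.

# Sketch of a tokenizing data residency proxy (illustrative names, not a
# vendor implementation): sensitive values never leave the premises.
import secrets

class TokenizingProxy:
    def __init__(self):
        self.vault = {}  # on-premise mapping: token -> original value

    def tokenize(self, record, sensitive_fields):
        """Replace sensitive field values with tokens before upload to the cloud."""
        out = dict(record)
        for field in sensitive_fields:
            token = "tok_" + secrets.token_hex(8)
            self.vault[token] = out[field]
            out[field] = token
        return out

    def detokenize(self, record):
        """Restore the original values when a record comes back from the cloud."""
        return {k: self.vault.get(v, v) for k, v in record.items()}

proxy = TokenizingProxy()
outbound = proxy.tokenize({"name": "Alice", "city": "Leiden"}, ["name"])
print(outbound)                     # {'name': 'tok_...', 'city': 'Leiden'}: stored in the cloud
print(proxy.detokenize(outbound))   # {'name': 'Alice', 'city': 'Leiden'}: restored on-premise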

Figure 9. The string "Alice" passes through an encryption proxy before it is transferred over the internet to a cloud environment. When the data is retrieved, the cloud responds with the encrypted data and the on-premise proxy decrypts the data again.

4.3. Methods to perform data location monitoring

In this section two methods for monitoring the location of data are described. The first method concerns the logging of trans-border movements of data by the CSP, while the second allows a customer of a CSP to estimate where the data center which contains his data is located.

4.3.1. Logging of data location

A CSP can implement methods which give customers insight into the location of their data, for example a dashboard which reports where data is being stored. Logging of data location for such a dashboard can be done using two different approaches. The first approach is to log the data location at the moment the data is stored. In this way it is possible to report in which countries the data is stored on the file systems of the CSP. But what if the data is copied to another storage medium which does not support the logging methods? In general it is not possible to guarantee that all copy operations on data are logged. The second approach can provide more insight into the location of copies of data by logging all data movements to other countries. If a user accesses a record via a browser from another country, data is moved to the country where the user resides. This implies that all read operations on the data should also be logged.

4.3.2. Triangulation of the location of data

In [26] Peterson et al. propose a method which uses internet based triangulation to monitor data location. The idea is to retrieve data from multiple locations which are distributed around the globe. Because data cannot travel faster than light, there is a lower bound on the response time of the requests. Furthermore, network equipment through which the data flows incurs an extra delay. When a request is repeated the response time will not be constant. By using the response time and information about the network topology it is possible to find the approximate distance between the probing nodes and the server which responds to the request. Because of the variance in the delays it is necessary to repeat the measurement several times to get an accurate approximation of the distance between the probing nodes and the server.
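The physical bound underlying this method is easy to make concrete: in a round-trip time RTT, the request and response together cannot have travelled more than c x RTT, so the server is at most c x RTT / 2 away from the probing node. A toy illustration with made-up RTT values (in practice many repeated measurements and network topology information are needed, as described above):

# Toy illustration of the speed-of-light distance bound used in triangulation.
# The RTT values are made up for illustration.
C_KM_PER_MS = 299.792  # speed of light in km per millisecond (vacuum upper bound)

def max_distance_km(rtt_ms):
    """Upper bound on the probe-to-server distance implied by one round trip."""
    return C_KM_PER_MS * rtt_ms / 2

# With repeated probes, the minimum RTT gives the tightest bound, because all
# other delays (queueing, routing) only add to the propagation time.
probe_rtts = {"Amsterdam": [9.1, 8.4, 8.9], "New York": [92.3, 90.1, 91.0]}
for node, rtts in probe_rtts.items():
    print(f"{node}: server is within {max_distance_km(min(rtts)):.0f} km")

# Intersecting the resulting disks, one per probing node, yields the
# approximate server location, as in Figure 10.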

With the resulting distances it is possible to triangulate the location of the server with quite high accuracy. An overview of this method is depicted in Figure 10.

Figure 10. Finding the location of a server by triangulation. The dots represent the probing nodes, the dotted circles represent the distance between the probing node and the server. The place where the circles intersect is the probable location of the server. (Figure from [26])

4.4. Discussion

In this chapter several methods for enforcing and monitoring data location in the cloud were covered. In this discussion the adequacy of the methods is analyzed by looking at their characteristics, advantages and disadvantages. The discussed methods are:

1. Choosing the data location
2. Broadcast encryption
3. Ciphertext Policy Attribute Based Encryption
4. Data Residency
5. Logging of data location
6. Triangulation

Using sticky policies in combination with an encryption method can be an effective way to enforce data location. Encrypted data can be moved to other countries, as long as the private keys are not transferred. This ensures that the data can only be decrypted in the right places. It is very important that the keys and the encryption processes are used properly; otherwise both encryption based methods are vulnerable to unauthorized access. In the case of broadcast encryption, each entity which can decrypt data can also change the set S and re-encrypt the data. In this way it is possible to add other valid decryption keys without the consent of the data subject. Fortunately, the data subject is able to observe the changes to the set of accepted entities; however, this requires active monitoring by the data subject. In the case of CP-ABE it is possible to generate a new key in which the region attribute is set to a false location: an administrator in the US can generate a key with region = Europe for a server in the US. This is impossible to detect for the data subject.

Giving customers the possibility to choose the location where data will be stored is quite effective if the contract and SLA give enough assurance that the location settings will be enforced. This method does not give an absolute guarantee that data will not be transferred to other regions. Microsoft Windows Azure claims that it will not transfer data to other regions without explicit permission of the customer, "except where necessary for Microsoft to provide customer support, to troubleshoot the service, or comply with legal requirements". This may cause violations of local privacy laws.

Using data residency is very effective if all users of an application are within the network of a single organization. Because sensitive data is encrypted or tokenized before it leaves the premises of the customer, no sensitive data is stored in the cloud. A disadvantage of this method is that a proxy server has to be installed within the network of the customer. This reduces the accessibility of the cloud service and it also reduces the cost reduction advantage of cloud computing.

Logging movements or storage locations of data can be effective if the cloud infrastructure does not allow movements or storage which circumvent the logging mechanisms.

All methods except triangulation need to be implemented and managed by the CSP. Because the location of data gives no guarantee that certain governments cannot access the data [17], and many CSPs claim that data location is not a very important issue [19][27], it is unlikely that CSPs will implement methods which require large investments, complex software and additional large scale cryptographic key management.

Triangulation of the location of data is the only method which is under the control of the customer, so its implementation does not depend on a CSP. The question is how reliable this method is. CSPs often have multiple copies of data and do not necessarily deliver this data to users from all data centers, so probably not all copies will be found. Therefore triangulation is a method which can give information about suspected violations by the CSP, but it cannot guarantee that the data has not been transferred to other regions.

4.5. Conclusions

Completely enforcing that data is only stored in specific regions is only possible by encrypting it on the client side and storing the private keys outside of the cloud. The effect is that CSPs are very limited in what they can do with the data, and many cloud applications are rendered useless. All methods which give the CSP the possibility to process the data cannot fully enforce the data location. The described enforcement methods do reduce the risk of data transfer to other regions, as long as they are managed in a proper way. Using such a method increases the confidence customers can have in CSPs when it comes to compliance with data privacy laws. Table 6 summarizes the characteristics of the methods that were covered in this chapter.

Table 6. The methods to enforce or monitor data location and their characteristics.

Choosing location
- Effectiveness: Medium. Depends on how the CSP implements the policy.
- CSP effort: Low. Most DFSs are already location aware.
- Customer effort: Low. The customer only has to choose during set-up.
- Implementation: CSP. Control: Customer / CSP. Exists: Yes.

Broadcast encryption
- Effectiveness: High. The customer defines which keys can decrypt the data.
- CSP effort: High. Key management.
- Customer effort: Medium/High. Key management.
- Implementation: CSP. Control: Customer / CSP. Exists: No.

CP-ABE
- Effectiveness: High. Key management is very important in this case because of the possibility of creating new keys to decrypt data.
- CSP effort: High. Key management.
- Customer effort: Medium/High. Key management.
- Implementation: CSP. Control: Customer / CSP. Exists: Yes.

Data residency
- Effectiveness: High. Sensitive data never leaves the premises.
- CSP effort: Medium. Proxy and supporting software have to be developed.
- Customer effort: Medium. The proxy has to be installed and configured before using the cloud service.
- Implementation: CSP / Customer. Control: Customer. Exists: Yes.

Logging
- Effectiveness: Medium. It gives the customer a good tool to audit the storage locations, but it does not enforce anything.
- CSP effort: Medium. Storage requirements.
- Customer effort: High. The customer has to analyze the logs to find possible violations.
- Implementation: CSP. Control: CSP. Exists: Yes.

Triangulation
- Effectiveness: Low. Data is often stored at different locations: hard to trace all copies.
- CSP effort: None.
- Customer effort: High. Different nodes around the globe are needed.
- Implementation: Customer. Control: Customer. Exists: Yes.


5. Data Deletion

5.1. Introduction

Proper deletion of data is an important aspect of IT operations: data leakage must be prevented and many types of data are surrounded by rules on the maximum retention period. The consequences of data leakage and violations of data retention rules can be significant, as the following case shows.

On 13 June 2012 the Dutch Data Protection Authority (Dutch DPA) announced that it had imposed a penalty on the Dutch Railways for a violation of the national privacy laws (Wet Bescherming Persoonsgegevens, WBP [46]). In The Netherlands a contactless smart card system called the OV-Chipkaart is used to handle all payments for public transport. The computer systems of the different public transport operators store logs of all travel movements. Regulations state that the maximum retention period for this data is two years, after which all PII must be deleted. Non-PII may be stored for a longer period for long term analysis, but only if it is impossible to relate the data to a natural person. After the maximum retention period the Dutch Railways have a procedure to delete the PII associated with a travel movement. However, the unique identification number of the chip card is not deleted. Because it is possible to relate an identification number to a natural person using another database, the Dutch DPA ruled that the deletion procedure is not in compliance with the WBP. In addition, an investigation concluded that the backups of the travel logs were stored for a longer period than allowed. The Dutch Railways received a penalty of 125,000 euros and the PII has been deleted.

The risk of insecure or ineffective data deletion is classified as one of the top risks in the ENISA report [2]. On-premise IT models offer a company full control over all storage media, so it is possible to overwrite data several times or to physically destroy a storage medium when data has to be deleted. With cloud computing, using those methods is often not possible because a company does not have full control over storage media in the cloud and storage media are shared between customers. Many cloud storage systems use replication and versioning systems and it is not clear where the data is stored. So even if a CSP implements algorithms to properly delete data from storage media, it is very hard to guarantee that all copies of the data will be deleted.

In section 5.2 several deletion methods are described. Because a deletion method by itself is not enough to get control over the whole deletion process, a new method which tracks copies of data within a cloud infrastructure and uses this information to delete all copies is introduced in section 5.3.

5.2. Deletion Methods

5.2.1. Disk wiping and physical destruction

When data is deleted from a hard disk, the storage space is only marked as available so it can be overwritten with a new file. This means that the data is still available if it is known where it was stored on the disk. If the data needs to be deleted in a proper way, it is possible to overwrite the data several times with random bits [28].
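As a minimal sketch of multi-pass overwriting, the snippet below overwrites a file's contents with random bytes before unlinking it. It assumes that writes land in place on the physical medium, which is exactly the assumption that fails on journaling or copy-on-write file systems, SSDs with wear levelling, and, as discussed next, cloud storage.

# Sketch of multi-pass file overwriting; assumes in-place writes on the medium.
import os

def wipe_file(path, passes=3):
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # overwrite the contents with random bits
            f.flush()
            os.fsync(f.fileno())        # push each pass out to the device
    os.remove(path)                      # finally unlink the overwritten file

# Example usage: wipe_file("secret.dat")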

38 38 TECHNOLOGY BASED METHODS TO REDUCE THE RISKS OF CLOUD COMPUTING with random bits [28]. The storage and database systems of CSPs abstract away from the low level functionality of the storage methods. Data is being replicated and previous versions are not deleted to provide rollback functionality. It is very hard to know which parts of which hard disks have to be erased. Effectively using physical destruction of a hard disk to delete specific data is even harder, because data of different customers is stored on a single disk. CSPs often physically destroy hard disks if they are no longer used in the cloud infrastructure Crypto-shredding Encrypted data can be deleted by destroying all the decryption keys. This method is called crypto-shredding [29]. The method works as long as the used encryption method is so strong that it is infeasible to break it. The main advantage of this method is that it does not matter how many copies of the data the CSP stores; if all copies are encrypted and the keys are destroyed all copies are deleted. Deletion can also be very fast because deleting the keys often involves much less data than deleting complete files. The main disadvantage is that in most cases the keys are known to the CSP, so the cloud service can access the data. The CSP has to store the keys on their infrastructure, so it becomes very hard to track how many copies there are of keys and where they are stored. Proving that all copies of a key are deleted in an irreversible way is thus very hard to do A data provenance based method for data deletion Data storage and processing often results in numerous copies of the same data. File systems automatically replicate data, back-ups are made, old versions of documents are stored in an archive, new data is derived from other data, data is used in ETL processes, etc. Properly deleting all copies of a data artifact can be very hard if it is not known exactly which copies there are. Fortunately there are methods that track from which artifacts an artifact was derived by which process. These so called data provenance methods answer the questions Where does this data come from? and How was it created? [30][31]. Data provenance methods implement the tracking of the ancestry of data as a directed acyclic graph (DAG) with edges that indicate from which artifacts an artifact was derived. If the direction of the edges is reversed the resulting graph holds information on which artifacts were derived from a specific artifact. This can be used to effectively find all the copies of some data and thus it can be used to find all artifacts that have to be deleted to effectively remove data. This idea will be expanded in the rest of this section. We will start with introducing a formalized model of data provenance in Next the complete concept is discussed in 5.3.2, and the corresponding algorithms are given in
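Before turning to the provenance model, the crypto-shredding approach of 5.2.2 can be sketched in a few lines of Python. The snippet below is an illustration only, not an existing CSP implementation; it assumes the third-party cryptography package, and the two dictionaries stand in for a real key server and a replicated object store. Each artifact gets its own key, and destroying the key makes every ciphertext copy unreadable.

    # Minimal crypto-shredding sketch (illustrative only; pip install cryptography).
    from cryptography.fernet import Fernet

    key_store = {}   # artifact id -> key; in practice an HSM or key server
    data_store = {}  # artifact id -> ciphertext; may be replicated freely

    def store(artifact_id: str, plaintext: bytes) -> None:
        key = Fernet.generate_key()
        key_store[artifact_id] = key
        data_store[artifact_id] = Fernet(key).encrypt(plaintext)

    def read(artifact_id: str) -> bytes:
        return Fernet(key_store[artifact_id]).decrypt(data_store[artifact_id])

    def shred(artifact_id: str) -> None:
        # Destroying the key renders every copy of the ciphertext useless,
        # no matter how often the CSP replicated it.
        del key_store[artifact_id]

    store("travel-log-42", b"card 1234, 2012-06-13, Leiden -> Amsterdam")
    print(read("travel-log-42"))
    shred("travel-log-42")
    # Any later read attempt fails: the ciphertext still exists, the data is gone.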

5.3.1. The Open Provenance Model

The Open Provenance Model (OPM) [30] is a model for the collection, storage and exchange of provenance information. OPM describes data provenance in the form of a graph with different types of nodes and different types of edges. There are three types of nodes. Artifacts are pieces of data which are tracked by the provenance system. A process is an action or a series of actions on one or more artifacts; a process will result in one or more new artifacts. An agent is an entity which acts as a catalyst for processes. In Figure 11 a graphical representation of the different nodes is shown.

Figure 11. OPM Nodes

To connect the nodes, five types of edges are used:

used: indicates that process P used artifact A;
wasGeneratedBy: indicates that artifact A was generated by process P;
wasControlledBy: indicates that process P was controlled by agent Ag;
wasTriggeredBy: indicates that process P2 was triggered by process P1;
wasDerivedFrom: indicates that artifact A2 was derived from artifact A1.

Figure 12. OPM Edges

In Figure 13 an example of an OPM graph is shown. It describes the process of baking a cake (bake). This process used 100g butter, two eggs, 100g flour and 100g sugar (the dotted edges represent used-edges). The process was controlled by John. The result was a cake. The solid edges specify from which ingredients the cake was derived (wasDerivedFrom-edges).

Figure 13. Example of an OPM graph (from [30])

5.3.2. General idea

If a CSP implements a data provenance system, one can ask from which artifacts a certain artifact was derived; the question is answered by following the wasDerivedFrom edges from that artifact. Because the start and end points of all the wasDerivedFrom edges are known, it is also possible to reverse these edges, making it possible to find out which artifacts were derived from a specific artifact. By repeating this step for all derived artifacts until no new artifacts are found, a graph containing all artifacts which were derived from a specific artifact can be constructed. Such a graph gives an overview of where data is stored throughout the cloud infrastructure. Moreover, such a graph can be used to determine which artifacts have to be deleted if some data has to be deleted. The deletion algorithm will delete or process the descendants in a recursive way, after which it can process the root artifact.

While this approach guarantees that all copies that are known to the provenance system will be deleted, it is often too drastic. Many artifacts are constructed from more than one artifact, so deleting a complete artifact might destroy more data than desired. To address this problem, wasDerivedFrom edges can be annotated with an instruction on which action the algorithm has to take on the artifact at the derived end of the edge. The actions that the algorithm can perform on an artifact are:

delete: completely delete the artifact;
partial delete: scan through the artifact and only delete data which needs to be deleted;
anonymize: anonymize (parts of) the data, so it is still usable but there is no sensitive information left;
none: keep the artifact intact and do not perform actions on artifacts which are derived from it.

The algorithm will report which actions it has taken on which artifacts, so a customer can inspect whether the data was deleted in a proper way. It is also possible to check whether all relevant artifacts were deleted by trying to retrieve them from the cloud application. If all requests fail or do not contain the data that needs to be deleted, the deletion operation was successful.
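To make the edge reversal concrete, the following sketch (an illustration, not the appendix code) represents a wasDerivedFrom graph in Python with networkx, where edges point from a derived artifact to its source and carry an action annotation. Finding everything derived from an artifact is then a reversed reachability query:

    import networkx as nx

    # wasDerivedFrom edges: derived artifact -> source artifact,
    # annotated with the action the deletion algorithm should take.
    G = nx.DiGraph()
    G.add_edge("A3", "A1", action="delete")
    G.add_edge("A3", "A2", action="delete")
    G.add_edge("A4", "A3", action="anonymize")
    G.add_edge("A5", "A3", action="none")
    G.add_edge("A6", "A5", action="delete")

    # Sources of A3 (follow the wasDerivedFrom edges forward):
    print(nx.descendants(G, "A3"))   # {'A1', 'A2'}

    # Everything derived from A1 (reverse the edges):
    print(nx.ancestors(G, "A1"))     # {'A3', 'A4', 'A5', 'A6'}

Note that the reachability query ignores the annotations; it answers "which copies exist", while the deletion algorithm below additionally decides what to do with each copy.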

In the next subsection a more formal description of the annotations and the algorithm will be given.

5.3.3. The algorithm

Before the issuer of the deletion operation starts with deleting the artifact, he retrieves the provenance graph of all artifacts that were derived from the artifact that has to be deleted. This information is needed to verify afterwards that the deletion algorithm performed the right operations. Retrieving the provenance graph for artifact A is done by calling GetTree(A) (Figure 14), a method that creates a new Graph object to store the result and calls a second method, GetTree(A, graph). GetTree(A, G) (Figure 15) adds the current root node A to the list of nodes of the graph and retrieves all provenance edges between A and other artifacts which were derived from A. These edges are added to the list of edges of the graph. Next the method recurses to process the artifacts which were derived from A. When all calls to GetTree(A, G) return, GetTree(A) returns the complete provenance graph for artifact A.

    GetTree(A):
        Graph graph := new Graph()
        GetTree(A, graph)
        return graph

Figure 14. GetTree(A) retrieves the provenance graph with root artifact A.

    GetTree(A, G):
        G.nodes.add(A)
        edges := GetProvenanceEdges(A)
        G.edges.append(edges)
        for each edge in edges:
            GetTree(edge.dest, G)
        return G

Figure 15. GetTree(A, G) adds nodes and edges to the provenance graph for the subgraph which is rooted at A.

Deleting an artifact A starts with calling DeleteArtifact(A) (Figure 16). This method initializes a Log object which keeps track of all the actions that are performed by the algorithm, so it can be checked whether the algorithm performed the right actions. Next the set of all edges between artifact A and artifacts that were derived from A is retrieved. For each edge, ProcessEdge is called, which recursively handles the subgraph with starting point edge.dest. After the algorithm is finished with processing the derived artifacts, DeleteArtifact deletes A and adds the result of that operation to the log.

    DeleteArtifact(A):
        Log log := new Log()
        edges := GetProvenanceEdges(A)
        for each edge in edges:
            ProcessEdge(edge, log)
        log.add(Delete(A))
        return log

Figure 16. DeleteArtifact(A), the top level method of the algorithm which starts the graph traversal and deletes artifact A.

ProcessEdge(E, log) (Figure 17) takes an edge E and a log object log as parameters. It first checks whether the action annotation indicates that the algorithm should not propagate past the destination artifact of the edge. If the algorithm should propagate, ProcessEdge retrieves all the provenance edges from E.dest to derived artifacts and calls ProcessEdge for all the retrieved edges. After the derived artifacts are processed, ProcessEdge determines which action it has to take on the current artifact E.dest: delete, partial_delete or anonymize. The Delete, PartialDelete and Anonymize methods need to be implemented for the specific application in which this algorithm is used. When the algorithm is finished with traversing the graph, DeleteArtifact returns the log object, which contains a list of all actions that were performed by the algorithm. This list can be used by the issuer of the deletion operation to verify whether the data was deleted properly, by comparing the list with the provenance tree that was retrieved before the deletion operation.

    ProcessEdge(E, log):
        if E.annotations.action != none:
            edges := GetProvenanceEdges(E.dest)
            for each edge in edges:
                ProcessEdge(edge, log)
            switch E.annotations.action:
                case delete:
                    log.add(Delete(E.dest))
                case partial_delete:
                    log.add(PartialDelete(E.dest, E.src))
                case anonymize:
                    log.add(Anonymize(E.dest, E.src))

Figure 17. ProcessEdge(E, log) handles the artifact E.dest and recursively processes the part of the graph which originates from E.dest.
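As a cross-check of the pseudocode, the following is a compact Python transcription of DeleteArtifact and ProcessEdge on top of a networkx graph, using the six-artifact example of 5.3.5. It is a sketch in the spirit of the simulation of 5.3.6, not the appendix code itself; the adapters of 5.3.4 are reduced to logging stubs, and GetTree is an analogous recursive walk over get_provenance_edges, omitted here for brevity.

    import networkx as nx

    # wasDerivedFrom edges point from the derived artifact to its source.
    G = nx.DiGraph()
    G.add_edge("A3", "A1", action="delete")
    G.add_edge("A3", "A2", action="delete")
    G.add_edge("A4", "A3", action="anonymize")
    G.add_edge("A5", "A3", action="none")
    G.add_edge("A6", "A5", action="delete")

    def get_provenance_edges(a):
        # All <derived, a> edges, i.e. the artifacts directly derived from a.
        return [(d, a) for d in G.predecessors(a)]

    # Stubs standing in for the application-specific adapters of 5.3.4:
    def delete(a):              return "Deleted artifact " + a
    def partial_delete(a, src): return "Partially deleted artifact " + a
    def anonymize(a, src):      return "Anonymized artifact " + a

    def process_edge(edge, log):
        dest, src = edge
        action = G.edges[dest, src]["action"]
        if action != "none":
            for e in get_provenance_edges(dest):  # derived artifacts first
                process_edge(e, log)
            if action == "delete":
                log.append(delete(dest))
            elif action == "partial_delete":
                log.append(partial_delete(dest, src))
            elif action == "anonymize":
                log.append(anonymize(dest, src))

    def delete_artifact(a):
        log = []
        for e in get_provenance_edges(a):
            process_edge(e, log)
        log.append(delete(a))
        return log

    print(delete_artifact("A1"))
    # ['Anonymized artifact A4', 'Deleted artifact A3', 'Deleted artifact A1']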

5.3.4. Adapters

The processing methods for data artifacts are very general: Delete, PartialDelete and Anonymize do not specify how they are implemented. Because the underlying storage methods and artifact types determine which actions have to be taken, the processing methods will be polymorphic. For example, the Delete method needs to be implemented for a file, a record and other types. Methods with multiple parameters can be defined for any combination of types (a Python sketch of this pattern follows at the end of this subsection's example).

    Delete(File artifact)
    Delete(Record artifact)
    Anonymize(Record dest, Record src)
    Anonymize(Record dest, File src)

Figure 18. Examples of polymorphic artifact processing methods.

The specific implementations of the processing methods need to be provided by the developers of an application that will use the provenance based deletion method.

5.3.5. An example

In the left part of Figure 19 a provenance graph with six artifacts and their corresponding provenance edges is shown. A3 was derived from A1 and A2, A4 and A5 from A3, and A6 from A5. If A1 or A2 is deleted, A3 also has to be deleted. If A3 is deleted, A4 will be anonymized and no action will be taken on A5 and A6. If A5 is deleted, A6 is also deleted. Now the algorithm is started by calling DeleteArtifact(A1). During execution it performs the following steps (where <A2,A1> represents the edge that indicates that A2 was derived from A1):

1. DeleteArtifact(A1) calls ProcessEdge(<A3,A1>, log);
2. ProcessEdge(<A3,A1>, log) calls ProcessEdge(<A4,A3>, log);
3. As there are no provenance edges that point to A4, ProcessEdge(<A4,A3>, log) anonymizes A4, logs the result and returns;
4. ProcessEdge(<A3,A1>, log) calls ProcessEdge(<A5,A3>, log);
5. Because <A5,A3> has none as its action annotation, ProcessEdge(<A5,A3>, log) does nothing and returns;
6. ProcessEdge(<A3,A1>, log) deletes A3, logs the result and returns;
7. DeleteArtifact(A1) deletes A1, logs the result and returns the log to the caller.

The resulting provenance graph is shown in the right part of Figure 19. Note that the resulting graph is not connected. This is undesirable in cases where the provenance information is also used for other purposes. This problem can be solved by preserving the edges or by creating new edges between the remaining artifacts (<A4,A2> and <A5,A2> in this case).
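The adapter dispatch of Figure 18 could be sketched in Python with functools.singledispatch, which selects an implementation based on the runtime type of the artifact. This is an illustration of the pattern only; the File and Record types are hypothetical stand-ins for real storage types.

    from dataclasses import dataclass
    from functools import singledispatch

    @dataclass
    class File:
        path: str

    @dataclass
    class Record:
        table: str
        key: int

    @singledispatch
    def delete(artifact):
        raise NotImplementedError("no Delete adapter for %r" % type(artifact))

    @delete.register
    def _(artifact: File):
        # e.g. wipe and unlink artifact.path on the underlying file system
        return "Deleted file " + artifact.path

    @delete.register
    def _(artifact: Record):
        # e.g. issue a DELETE statement against the backing database
        return "Deleted record %s/%d" % (artifact.table, artifact.key)

    print(delete(File("/data/a4.bin")))
    print(delete(Record("travel_logs", 42)))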

5.3.6. Simulation

To test the algorithm, it was implemented in a simple test script written in Python 1. The artifacts and wasDerivedFrom edges are represented by a networkx 2 graph structure which can be manipulated through an API. With this graph structure and API the methods of the algorithm are implemented and tested. The code of the simulation program can be found in Appendix A (Deletion algorithm).

Figure 19. A provenance graph with six artifacts and corresponding provenance edges. The result of deleting artifact A1 is shown in the right provenance graph.

As a demonstration of the simulator, the test case from the previous subsection is simulated. First, GetTree("A1") is called to verify the structure of the provenance graph. The result can be seen in Figure 20.

    Node(A1)
    Edge(A3, A1) with action delete
    Node(A3)
    Edge(A5, A3) with action none
    Node(A5)
    Edge(A6, A5) with action delete
    Node(A6)
    Edge(A4, A3) with action anonymized
    Node(A4)

Figure 20. The result of running GetTree("A1"). This structure corresponds to the graph in Figure 19 (left).

Next, DeleteArtifact("A1") is executed and the returned log (Figure 21) is inspected. It can be seen that the algorithm behaves as expected.

    Anonymized artifact A4
    Deleted artifact A3
    Deleted artifact A1

Figure 21. The log items that are returned by running DeleteArtifact("A1").

1 Python:
2 NetworkX:

Because the provenance graph is not connected after DeleteArtifact("A1"), the raw lists of nodes and edges are printed. From these lists one can infer that the resulting graph has the same structure as the right part of Figure 19.

    ["A2", "A5", "A4", "A6"] (nodes)
    [("A6", "A5")] (edges)

Figure 22. The resulting graph after the execution of DeleteArtifact("A1").

5.4. Discussion

We have seen that proper data deletion is important. Many laws and policies require companies to delete specific data, but often there are numerous copies of the data which are hard to track. Deleting all copies is thus also hard to do, as the case of the Dutch Railways showed. Especially in the cloud a company has little control over the storage media on which its data is stored. This calls for effective methods to delete data.

If it is clear what has to be deleted, the deletion operations should delete the data in an irreversible way. This can be done using disk wiping, physical destruction or crypto-shredding. The first two methods are problematic to use in the cloud to delete data of a specific customer, because data of different customers is stored on the same storage medium. The third method, crypto-shredding, overcomes this problem; however, it depends on the effectiveness of the key destruction methods. These three methods can be very effective, yet it is next to impossible for a CSP to prove to a customer that data has been deleted.

In many cases it is not clear how many copies of a specific data artifact exist; there are many ways in which copies of data can be made. To address this problem, this chapter suggested the use of data provenance to determine which artifacts have to be deleted or altered. If the provenance metadata is collected in a consistent way, the method can track all copies of a data artifact which are within the control boundaries of the CSP. The proposed algorithm uses the provenance graph to recursively delete a data artifact and all other artifacts that were derived from it.

Deleting data is not as simple as deleting everything that was derived from that data. Artifacts can be derived from several other artifacts, some policies require that data is stored for a longer duration, and anonymized data can still be used to perform statistical analysis. To support these use cases, the provenance data can be annotated with action parameters which specify which action the deletion algorithm has to take. With this algorithm there is still no absolute proof that the data is gone, but the methods that allow a user to see the provenance graph and that give a list of all actions the algorithm has performed make the deletion process more transparent and thus easier to audit.

The adoption of data provenance and data provenance based deletion methods can be quite hard, because provenance takes much storage space and is not trivial to implement. However, with the increasing focus on accountability and matters such as privacy laws that dictate maximum retention periods, data provenance may become a necessity.

5.5. Conclusions

To perform proper data deletion from storage media, methods like disk wiping and crypto-shredding are needed. Because there are many cases in which it is not completely clear what has to be deleted, a data provenance based method was described which uses the provenance information to determine which data artifacts have to be deleted or altered. The method also allows a user to inspect the actions taken by the deletion algorithm, which makes data deletion more transparent. Although the implementation of data provenance and this method requires a considerable amount of effort and storage, matters such as accountability and privacy laws may increase the adoption of data provenance and data provenance based deletion methods. In Table 7 the characteristics of the discussed data deletion methods are summarized.

Table 7. The characteristics of the discussed data deletion methods.

Disk wiping
- Effectiveness: Low. Data of multiple customers on a single disk; only usable for end-of-life destruction of a disk.
- CSP effort: Per customer: High. End of life: Medium.
- Customer effort: Low.
- Impl.: CSP. Control: CSP. Exists: Yes.

Crypto-shredding
- Effectiveness: High. With proper key management all copies of a data artifact can be deleted instantly.
- CSP effort: Low if cryptography is already in use.
- Customer effort: Low.
- Impl.: Customer / CSP. Control: Customer / CSP. Exists: Yes.

Provenance based deletion
- Effectiveness: High. Clear which copies of data exist and which copies need to be deleted. Highest effectiveness in combination with crypto-shredding.
- CSP effort: High. Necessary to track all data operations.
- Customer effort: Medium.
- Impl.: CSP. Control: Customer / CSP. Exists: No.

6. Data Leakage

6.1. Introduction

Leakage of data to external parties is a serious risk in traditional computing as well as in cloud computing [2][10]. The Open Security Foundation keeps track of reported data leakage incidents on a special website. Last year 1041 incidents were reported. Such an incident can have a very big impact on an organization, in the form of financial and reputational loss and legal problems.

Mitigating the risk of data leakage is difficult because there are different types of adversaries. Data can be leaked by internal or external persons, and leakage can be intentional or accidental (see Table 8). Figure 23 shows a breakdown of data leakage incidents by type of adversary or vector.

Table 8. Examples of data leakage scenarios with internal and external adversaries who leak data intentionally or by accident.

Intentional
- Internal: e-mailing a secret document to a competitor.
- External: bypassing security measures in a computer system to gain access to internal data.

By accident
- Internal: losing an unencrypted USB stick with sensitive data.
- External: stumbling upon confidential documents which were placed on a public facing website by accident.

Figure 23. Incidents by vector.

Especially difficult is the protection of data against leakage by internal users with permission to access that data. It is not suspicious when an authorized user accesses data, so detection is only possible when that user tries to leak the data. With on-premise IT infrastructure and managed workstations and laptops it is possible to use technologies like traffic inspection at the network boundary between the company's internal network and the internet. Another widespread practice is to install endpoint software which monitors user activity. With cloud computing these methods are less effective, because data needs to be transferred between the CSP and the user before it can be used; this means that it is not possible to monitor traffic at a single network boundary. With emerging Bring Your Own Device (BYOD) possibilities, companies also have less control over the devices that are used to access the cloud application, so it is not always possible to install endpoint protection software on all devices that have access to the application. External attempts to gain access to data in the cloud can be performed in many different ways; each security flaw in the cloud software stack can potentially lead to data leakage [32]. In the next sections several cloud specific technologies to prevent and detect data leakage will be discussed.

6.2. Prevention

Preventing data leakage by insiders starts with ensuring that users can only access the data that they need to perform their work. This does not directly protect against data leakage, but it reduces the amount of data a single person is able to leak. Because it is not possible to prevent data from leaving the cloud infrastructure, methods that prevent data leakage should reduce the amount of useful data that can be leaked. This can be done by masking the data (6.2.1), enforcing access controls using cryptography and trusted computing (6.2.2) or polluting a data set with fake data (6.2.3).

6.2.1. Data Masking

Reducing the amount of sensitive data that can be leaked is an effective approach to reduce the impact of data leakage incidents, but simply hiding the information is not always possible. Software development often involves separate development, testing, acceptance and production environments. Users of the production environment have, for example, access to customer records which contain addresses and credit card numbers. But users of the testing environment (testers) also need realistic data to properly test the application. Using production data for testing purposes increases the risk of data leakage, because the data is available to more users. Because the most important requirement for activities like testing is that the data is realistic, it is possible to modify the data in a way which makes it less sensitive. This can be done in several ways (a short sketch of two of these operations follows after this list):

Substitution: sensitive data can be replaced with (randomly) generated values. For example, credit card numbers can be replaced with randomly generated valid credit card numbers.

Shuffling: randomly replacing sensitive values with values from other records. For example, a telephone number can be replaced with the telephone number of a random person in the database.

Number or date variance: adding (small) random numbers to numerical data.

Nulling out or deletion: completely removing sensitive data from records.

Masking out: replacing (parts of) a value with *, # or another character. For example, credit card numbers need to be (partially) masked out if a user does not have a legitimate business need to see the full credit card number [33].

These methods can be applied to the whole dataset before it is used (static data masking) or to specific records at the moment they are requested (dynamic data masking). It is also important to select a data masking method which makes it infeasible to reverse the masking operation. Shuffling and number or date variance can be reversed if the masking operation is not random enough.
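A minimal sketch of two of the operations above, masking out and shuffling, in plain Python. This is an illustration only; a real masking tool would work at the database layer rather than on in-memory dictionaries.

    import random

    def mask_out(card_number: str, visible: int = 4) -> str:
        # Keep only the last `visible` digits, e.g. for display purposes.
        return "*" * (len(card_number) - visible) + card_number[-visible:]

    def shuffle_column(records: list, field: str, seed=None) -> list:
        # Randomly redistribute one field's values over all records,
        # breaking the link between a person and his or her value.
        values = [r[field] for r in records]
        random.Random(seed).shuffle(values)
        return [dict(r, **{field: v}) for r, v in zip(records, values)]

    customers = [
        {"name": "Alice", "phone": "071-1234567", "card": "4556123498761234"},
        {"name": "Bob",   "phone": "070-7654321", "card": "4556432187654321"},
    ]
    print(mask_out(customers[0]["card"]))          # ************1234
    test_set = shuffle_column(customers, "phone")  # phones reassigned at random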

6.2.2. Trusted Computing

In [34] and [35] Alawneh and Abbadi introduce a method for protecting sensitive documents against leakage incidents. The method uses encryption and the concepts of Trusted Computing 1 to regulate which users and machines within and outside the trust boundary of an organization have access to files. Because the method can transcend trust boundaries, it is potentially an effective technological measure to overcome an inherent problem of cloud computing: data needs to be transferred to clients which are not inside the control boundary.

Sensitive documents can be shared within a dynamic domain. A dynamic domain is a collection of devices; it has an identifier i_D and a symmetric key k_D which is used to perform cryptographic operations on documents that need to be shared within the dynamic domain. The dynamic domains and keys are managed by a master controller. Each device needs to have a Trusted Platform Module (TPM), a chip that can handle cryptographic operations and securely store cryptographic keys. When a device becomes a member of a dynamic domain, its TPM receives and stores the i_D and k_D. Because a TPM is a specialized hardware component with tight security measures, it can be used to protect the key against malicious use. A device retrieves encrypted documents from the master controller, and can only decrypt a document if the device is a member of the right dynamic domain. Modified or new documents can be encrypted with the key of the desired dynamic domain before being sent back to the master controller.

6.2.3. Fog Computing

In [36] Stolfo et al. propose a new approach for preventing data leakage from cloud applications. Data access patterns are monitored for abnormal user behavior in order to detect user masquerading. Because this method can lead to more false positives than desired, it is extended with decoy information: genuine-looking documents which do not contain real data. A normal user who is performing his normal tasks will never access these files, but a masquerading user, who will most likely have less knowledge about the structure, has a significant chance of accidentally touching these decoy files. With the combination of these two approaches it is possible to detect masquerade attacks with a false positive rate of 1.12%.

Monitoring user behavior is done by analyzing the file search operations of the users. In most cases a normal user will issue targeted and limited search queries, while a masquerading user will perform much wider search operations because he has less knowledge about the file system structure. The decoy method tracks whether the decoy files (or honeypots) are accessed by a user. If a user accesses one or more decoy files and the search behavior monitoring tool also classifies the user's behavior as suspicious, the software will raise an alert and it will poison the data which is sent to the user with large amounts of legitimate looking fake data. This diminishes the value of the leaked data.

1 Trusted Computing Group:
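The decision logic of the fog computing approach can be summarized in a few lines. The sketch below is a simplification for illustration, not Stolfo et al.'s implementation; the threshold, decoy paths and event format are invented. An alert is raised only when both signals, wide search behavior and decoy access, are present.

    DECOYS = {"/finance/passwords_backup.xlsx", "/hr/salaries_draft.doc"}
    SEARCH_BREADTH_THRESHOLD = 50  # distinct directories per hour (invented)

    def classify_session(searched_dirs: set, accessed_files: set) -> str:
        wide_search = len(searched_dirs) > SEARCH_BREADTH_THRESHOLD
        touched_decoy = bool(accessed_files & DECOYS)
        if wide_search and touched_decoy:
            # Both signals present: likely masquerader. Alert and start
            # poisoning responses with legitimate-looking fake data.
            return "alert+disinform"
        if wide_search or touched_decoy:
            return "watch"   # a single weak signal: keep monitoring
        return "normal"

    print(classify_session({"/finance", "/hr"}, {"/hr/contract_bob.doc"}))  # normal

Requiring both signals before alerting is exactly what brings the false positive rate down: either signal alone fires too often on legitimate behavior.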

6.3. Monitoring

While fog computing is a combination of data leakage monitoring and prevention, the following two approaches are only able to detect data leakage incidents. In 6.3.1 the very general approach of analyzing log files to find strange behavior will be discussed. In 6.3.2 a more specific log analysis approach is discussed, which promises to be more accurate and less storage demanding.

6.3.1. Log file analysis

A data leakage monitoring method which requires a relatively small amount of software at the side of the CSP is logging all file or record read, create, update and delete operations. These logs can be analyzed by external tools. This approach is especially suitable for detecting incidents in which large amounts of data are transferred, because the analysis techniques focus on finding deviant user behavior. If a user leaks only a single document during office hours, at a time when it is normal for him to access such documents, it is very hard to detect this event. But if a user opens a document outside office hours, or suddenly opens a large number of files, this can be a clue that unauthorized actions are being performed on the data.

6.3.2. Psychological triggers

In [37] Sasaki proposes a new method for detecting insider threats using psychological triggers. This method should have significant advantages over existing analysis methods for detecting suspicious behavior:

Reducing the number of false negatives,
Reducing the number of false positives,
Covering more use cases,
Reducing the storage requirements.

Sasaki argues that it is possible to trigger alternate behavior in malicious insiders and to use detection methods to find those users. A trigger can be a companywide announcement that an investigation will start. It is likely that users who have something to hide from the investigators will stop their malicious activities and try to destroy evidence by deleting or altering files, e-mails and other data. Before the users are triggered, an agent which logs all actions by all users is started. After the trigger is fired the logger continues; when the logger is terminated it is possible to analyze all actions and see which users show a significant difference in behavior before and after the trigger. This method should be better than the existing methods because it addresses the four requirements in the following way:

False negatives: a data leakage might be very hard to detect, because a malicious user can leak data which he already had to use to perform his job, and the leaking can be performed on untrusted machines. Deletion of evidence is easier to detect, because it is less likely that these actions will blend into the normal behavior of a user.

False positives: the number of false positives is the false positive rate multiplied by the number of events. So, if the number of events is reduced, the number of false positives is also reduced. By only monitoring user behavior just before and during the investigation there will be far fewer events in comparison to continuous monitoring methods.

Use cases: this method detects psychological reactions instead of specific behavioral patterns that are associated with leaking data, stealing money, espionage, etc.

Storage requirements: the storage requirements are reduced because the logging tool is only activated during a limited amount of time.

In cloud computing applications this method can be employed by using a subset of the audit trails.

6.4. Discussion

The methods that were discussed in the previous sections all try to prevent or detect data leakage incidents. A single method will not be able to completely remove the risk, but with knowledge of the most important advantages and disadvantages of the discussed methods it is possible to assess their effectiveness in different scenarios.

6.4.1. Data Masking

We have seen that data masking techniques are able to reduce the amount of sensitive data that can be leaked from an application. Especially in development and testing environments, where the most important requirement on data is that it is realistic, masking sensitive values with other (random) realistic values significantly reduces the risk of data leakage.

6.4.2. Trusted Computing

The work of Alawneh and Abbadi [34][35] promises good protection against data leakage by using dynamic domains, cryptography and TPM technology. Using this method it should be possible to reduce the risk of (inadvertent) data leakage. However, because there is no complete control over the device and the other software running on it, it is not possible to prevent actions such as copying parts of a document into an unprotected document. So, this method makes (especially inadvertent) data leakage harder, but certainly not impossible. Another problem with this method can arise when devices are lost or stolen and do not have network connectivity, because the master controller needs to communicate with devices to revoke keys. Thus, if encrypted documents are stored on the device it is still possible to decrypt those documents until a network connection is established. When the method is used in cloud computing this problem will not have a big impact, because in-browser cloud applications reload data on each request.

6.4.3. Fog Computing

Fog computing is a method which tries to prevent data leakage by masquerading persons. The assumption that a masquerader knows less about the structure of the file system implies that he is not an internal user who uses the account of a colleague.

It might be possible to extend fog computing with a machine learning algorithm which learns what the normal behavior of each user is. A classifier can then detect whether the current usage patterns differ significantly from the normal usage patterns, and thus whether an internal user is masquerading. The question is whether a disinformation attack is effective in this case, because an internal user might easily notice that data is fake.

6.4.4. Psychological triggers

The reduced timespan in which investigations using this method are done has the advantages that less storage is needed and that there will be fewer false positives. A disadvantage is that an organization must already suspect that something has happened, because an investigation has to be announced. If there are no suspicions there will be no investigation.

6.5. Conclusions

We have seen that no single method can completely prevent data leakage by internal persons. Especially in cloud computing environments many classical protective technologies cannot be used, because not all devices are under the control of an organization. The most effective way to reduce the risk of data leakage is to reduce the amount of information a single user can leak; most of the discussed methods focus on accomplishing this.

Table 9. The characteristics of the discussed data leakage prevention and detection methods.

Data masking
- Effectiveness: Medium. Some users still need access.
- CSP effort: Medium.
- Customer effort: Low.
- Impl.: CSP. Control: CSP. Exists: Yes.

Trusted computing
- Effectiveness: Medium. Restricts access to specific users, but copy-paste cannot be prevented.
- CSP effort: Medium.
- Customer effort: High. All devices need TPMs and management is difficult.
- Impl.: Customer / CSP. Control: Customer / CSP. Exists: Yes.

Fog computing
- Effectiveness: Medium. Completely depends on the detection algorithm and the amount of disinformation.
- CSP effort: High. Analysis of all data operations.
- Customer effort: Low. The customer receives alerts.
- Impl.: CSP. Control: Customer / CSP. Exists: Yes.

Log file analysis
- Effectiveness: Low. Large amounts of data, and abuse can be too subtle to notice.
- CSP effort: High. Need to find patterns in large amounts of data.
- Customer effort: Low. The customer receives alerts.
- Impl.: CSP or Customer. Control: Customer / CSP. Exists: Yes.

Psychological triggers
- Effectiveness: Medium. Less data to analyze and triggering of suspicious behavior. Suspicion is needed to start an investigation.
- CSP effort: Medium. The CSP only needs to supply log files.
- Customer effort: High. The customer or another party needs to investigate.
- Impl.: Customer (and CSP). Control: Customer. Exists: Yes.
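As an illustration of the log file analysis approach from 6.3.1 (and of the per-user baseline idea suggested for fog computing in 6.4.3), the following sketch flags activity that deviates strongly from a user's own historical access volume. The threshold and data layout are invented for illustration.

    from statistics import mean, stdev

    def is_suspicious(history: list, today: int, sigmas: float = 3.0) -> bool:
        # Flag a user whose file-access count deviates strongly from
        # his own baseline (mean + `sigmas` standard deviations).
        if len(history) < 2:
            return False  # not enough data to build a baseline
        return today > mean(history) + sigmas * stdev(history)

    daily_accesses = {"alice": [12, 9, 15, 11, 13], "bob": [14, 10, 12, 13, 11]}
    print(is_suspicious(daily_accesses["alice"], 14))   # False: a normal day
    print(is_suspicious(daily_accesses["bob"], 400))    # True: mass download?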

7. Data segregation

7.1. Introduction

The segregation of data from different customers is a very important requirement of cloud services. Incidents where customers can access data from other customers can diminish the trust which customers have in the service. Because it is such an important aspect, CSPs put a lot of effort into getting data segregation right, and the number of documented cases is low (the footnotes contain some examples 1 2).

The segregation of the data of different customers needs to be enforced on all levels. If data is only properly segregated while it is stored on hard disk, attackers may access data from other customers during data transfer or processing. To cover all aspects of data segregation, the three states of data model will be used [38]. This model states that data can be in one of three states:

Data at Rest (DaR): data that is stored in a database or file system (storage).
Data in Motion (DiM): data that is being transferred from a database or file system to a place where it will be processed (networking).
Data in Use (DiU): data that is currently being processed by an application and processor (computing).

In the next three sections the different methods which can be used to segregate data in the different states will be discussed.

7.2. Data at Rest

7.2.1. Databases

Multi-tenant databases can be separated using three different approaches. The approach with the highest isolation is using a separate database for each customer. Using separate schemas within a single database introduces more flexibility, but it reduces isolation. With the shared schema approach it is easier to support large numbers of customers; however, with shared schema the developers of the application are completely responsible for the correct implementation of the segregation controls [39].

7.2.1.1. Separate databases

Giving each customer a separate database and separate database authentication credentials offers the most isolated data segregation method, because the data is logically separated by the database server software. Besides the proven security controls of database systems, using separate databases brings two other advantages. First, it is less difficult to alter the data model on a per-customer basis. Second, it is easier to restore backups for a specific customer than with other methods, where data of different customers is stored in the same database.

1 SCUS_Newswire
2

Figure 24. Data segregation based on separate databases.

The downside of this approach is that it is relatively expensive. There is often a maximum number of databases which can be hosted on a single database server, and thus the cost of hardware and server administration increases.

7.2.1.2. Separate schemas

A schema is a logical collection of tables which can have specific (security) settings. In particular, it is possible to restrict user access to specific schemas instead of specific databases. By using separate schemas for data segregation it is possible to serve more customers with a single server, which makes it a cheaper approach than using separate databases. The data model flexibility advantage of separate databases still holds for separate schemas. A disadvantage is that restoring backups is harder, because only a part of a database needs to be restored.

Figure 25. Data segregation based on separate schemas within the same database.

7.2.1.3. Shared schema

Storing the data of different customers in the same database and the same schema offers the biggest scalability advantages. However, because records of different customers are stored in the same tables and database servers have no native support for data isolation at this level, it is the responsibility of the CSP to implement effective isolation controls. The most common way to implement these controls is to associate a tenant ID with each record. When a query is executed, an extra WHERE clause ensures that only records with the tenant ID of the current user are returned (a minimal sketch of this follows after 7.2.2.1).

7.2.2. File storage

7.2.2.1. Authentication and authorization

To ensure that only the right users can access files in the cloud, the cloud service needs to implement strong authentication and authorization mechanisms. Authentication mechanisms are needed to verify the identity of the current user. In the context of data segregation, authorization mechanisms are needed to ensure that a user can only access files that are owned by that user or his employer.
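Returning to the shared-schema approach of 7.2.1.3, the following is a minimal sketch of tenant-ID filtering using Python's built-in sqlite3 module; the schema and helper function are invented for illustration. Every query issued on behalf of a tenant carries a mandatory WHERE tenant_id = ? clause:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE invoices (tenant_id TEXT, number INTEGER, amount REAL)")
    conn.executemany("INSERT INTO invoices VALUES (?, ?, ?)",
                     [("acme", 1, 99.0), ("acme", 2, 150.0), ("globex", 1, 42.0)])

    def invoices_for(tenant_id: str):
        # The tenant filter is appended centrally, never left to callers,
        # so a forgotten WHERE clause cannot expose other tenants' rows.
        return conn.execute(
            "SELECT number, amount FROM invoices WHERE tenant_id = ?",
            (tenant_id,),
        ).fetchall()

    print(invoices_for("acme"))    # [(1, 99.0), (2, 150.0)]
    print(invoices_for("globex"))  # [(1, 42.0)]

Centralizing the filter in one data-access layer is the essential design choice: the isolation guarantee should not depend on every individual query being written correctly.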

Figure 26. Data segregation based on a shared schema.

Authorization controls can be implemented in the underlying file system of a storage service, or in the service that handles the access to the stored files. If file system level controls are used, the CSP can reduce its development efforts and the files are also isolated at the file system level. If the controls are in the service software, the CSP has to develop them itself, which results in more work and possibly increased security risks. The advantage of this approach is that a cloud service user account does not have to correspond to a user account in an underlying file system, so the file system can be simple and more scalable.

7.2.2.2. Encryption

Encrypting the data of a customer with a customer-specific key can also reduce the data segregation risk. If the cloud service needs access to the data, the key needs to be stored in the same infrastructure, so in that case data segregation depends on the secrecy of cryptographic keys. In cases where data is only stored in the cloud and the cloud services do not need access to it, the key can be kept out of the cloud; in this case the segregation of user data is very strong.

Giving cloud services that need to process data access to decryption keys poses increased security risks. This problem can be reduced if it is possible to use and manipulate data without decrypting it. Homomorphic encryption (HE) is a relatively new encryption paradigm which can be used to achieve that. The idea is to take an operation on a specific type of data and to find an equivalent (homomorphic) operation which acts on the encrypted data. Unfortunately, equivalent operations exist for only a limited number of normal operations, and HE is computationally expensive. Therefore HE is not yet mature and cannot be used for practical purposes.

    Operation(Data) = Data'
    CryptoOperation(E(Data)) = E(Data')

Figure 27. If Operation() and CryptoOperation() are homomorphic, CryptoOperation() can be applied to encrypted data to perform the same computation as Operation() and produce the same result in encrypted form.

7.3. Data in Motion

Especially in the case of IaaS clouds, data segregation at the network level is important. With IaaS the tenants have (limited) access to the underlying network from their VMs. If the network is not properly secured, it might be possible to intercept other tenants' communications.
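To make the homomorphic property of Figure 27 concrete before moving on: textbook RSA is multiplicatively homomorphic, so multiplying two ciphertexts yields the encryption of the product of the plaintexts. The sketch below is a toy with insecurely small numbers, purely to show the property:

    # Toy RSA: n = p*q with p=61, q=53, e=17 (insecure sizes, illustration only).
    n, e = 61 * 53, 17

    def encrypt(m: int) -> int:
        return pow(m, e, n)

    m1, m2 = 7, 6
    c1, c2 = encrypt(m1), encrypt(m2)

    # Multiplying ciphertexts equals encrypting the product of the plaintexts:
    assert (c1 * c2) % n == encrypt(m1 * m2)
    print("E(7) * E(6) mod n equals E(42): multiplication without decryption")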

7.3.1. Virtual Local Area Networks

Placing all (virtual) devices of a specific tenant in a separate Virtual Local Area Network (VLAN) can effectively protect network communication. In a VLAN each packet is tagged with a VLAN identifier. Existing physical and virtual network devices know which connected devices belong to which VLANs, and segregate packets from different VLANs. Unfortunately there is a limit of 4k VLANs in a single network environment, which severely limits the scalability of using VLANs to segregate the traffic of different tenants. To make this scalable, Hao et al. [40] propose an architecture which separates a cloud infrastructure into different domains, each of which can contain the maximum number of VLANs. Each domain that contains customer VMs is called an edge domain, and there is one core domain which facilitates and coordinates the communication between edge domains, and between edge domains and the internet. The global structure of such a network can be seen in Figure 28. The boundary of an edge domain with the core domain is formed by one or more Forwarding Elements (FEs). They determine to which other FE a packet needs to travel; once this is determined, the FE tunnels the packet to the other FE. In this way a virtual network of a single customer can span multiple VLANs in different edge domains.

Figure 28. The architecture from [40].

Another possibility of this approach is that the internal network of a customer can be treated as just another edge domain. The customer can connect using VPN technology, and the VPN gateway in the cloud acts as an FE. Using this approach it is possible to create a single virtual network from a cloud based virtual network and the physical network of the customer.

7.3.2. Encrypted communication

The communication between two (virtual) machines within the cloud infrastructure can also be secured by using, for example, TLS or SSH tunneling in the application layer, or IPsec in the internet layer. This works well in the case of IaaS, where users have access to the network. With SaaS or PaaS, customers do not have dedicated VMs and network connections are not always specific to a customer, so it is not feasible to use these low level methods. To protect data in motion in those scenarios, data that is stored encrypted should only be decrypted after it has been transferred.

7.4. Data in Use

While a virtualization platform separates the processes and data of different customers, hardware resources are shared. On a single machine a single processor core handles the work of several tenants. Under normal circumstances different processes will be separated in an effective way, but there are attack vectors like cache missing [41] which can be used to leak data from another process.
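The encrypted-communication option from 7.3.2 is straightforward with Python's standard ssl module. The following is a minimal client-side sketch; the host name is a placeholder, and a real deployment would also need certificate provisioning and, preferably, mutual authentication:

    import socket
    import ssl

    def send_over_tls(host: str, port: int, payload: bytes) -> None:
        # Wrap a TCP connection in TLS so data in motion between two
        # (virtual) machines cannot be read by other tenants on the network.
        context = ssl.create_default_context()  # verifies the server certificate
        with socket.create_connection((host, port)) as raw_sock:
            with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
                tls_sock.sendall(payload)

    # Hypothetical usage between two VMs of the same tenant:
    # send_over_tls("storage.internal.example", 8443, b"tenant data")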

Security Challenges of Cloud Providers ( Wie baue ich sichere Luftschlösser in den Wolken ) 23.11.2015 Jan Philipp Manager, Cyber Risk Services Enterprise Architect Security Challenges of Cloud Providers ( Wie baue ich sichere Luftschlösser in den Wolken ) Purpose today Introduction» Who I am

More information

CLOUD STORAGE SECURITY INTRODUCTION. Gordon Arnold, IBM

CLOUD STORAGE SECURITY INTRODUCTION. Gordon Arnold, IBM CLOUD STORAGE SECURITY INTRODUCTION Gordon Arnold, IBM SNIA Legal Notice The material contained in this tutorial is copyrighted by the SNIA. Member companies and individual members may use this material

More information

SECURITY CONCERNS AND SOLUTIONS FOR CLOUD COMPUTING

SECURITY CONCERNS AND SOLUTIONS FOR CLOUD COMPUTING SECURITY CONCERNS AND SOLUTIONS FOR CLOUD COMPUTING 1. K.SURIYA Assistant professor Department of Computer Applications Dhanalakshmi Srinivasan College of Arts and Science for Womren Perambalur Mail: Surik.mca@gmail.com

More information

Tufts University. Department of Computer Science. COMP 116 Introduction to Computer Security Fall 2014 Final Project. Guocui Gao Guocui.gao@tufts.

Tufts University. Department of Computer Science. COMP 116 Introduction to Computer Security Fall 2014 Final Project. Guocui Gao Guocui.gao@tufts. Tufts University Department of Computer Science COMP 116 Introduction to Computer Security Fall 2014 Final Project Investigating Security Issues in Cloud Computing Guocui Gao Guocui.gao@tufts.edu Mentor:

More information

How a Cloud Service Provider Can Offer Adequate Security to its Customers

How a Cloud Service Provider Can Offer Adequate Security to its Customers royal holloway s, How a Cloud Service Provider Can Offer Adequate Security to its Customers What security assurances can cloud service providers give their customers? This article examines whether current

More information

IT Security Risk Management Model for Cloud Computing: A Need for a New Escalation Approach.

IT Security Risk Management Model for Cloud Computing: A Need for a New Escalation Approach. IT Security Risk Management Model for Cloud Computing: A Need for a New Escalation Approach. Gunnar Wahlgren 1, Stewart Kowalski 2 Stockholm University 1: (wahlgren@dsv.su.se), 2: (stewart@dsv.su.se) ABSTRACT

More information

HIPAA CRITICAL AREAS TECHNICAL SECURITY FOCUS FOR CLOUD DEPLOYMENT

HIPAA CRITICAL AREAS TECHNICAL SECURITY FOCUS FOR CLOUD DEPLOYMENT HIPAA CRITICAL AREAS TECHNICAL SECURITY FOCUS FOR CLOUD DEPLOYMENT A Review List This paper was put together with Security in mind, ISO, and HIPAA, for guidance as you move into a cloud deployment Dr.

More information

Clouds on the Horizon Cloud Security in Today s DoD Environment. Bill Musson Security Analyst

Clouds on the Horizon Cloud Security in Today s DoD Environment. Bill Musson Security Analyst Clouds on the Horizon Cloud Security in Today s DoD Environment Bill Musson Security Analyst Agenda O Overview of Cloud architectures O Essential characteristics O Cloud service models O Cloud deployment

More information

Security Issues in Cloud Computing

Security Issues in Cloud Computing Security Issues in Computing CSCI 454/554 Computing w Definition based on NIST: A model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources

More information

Security of Cloud Computing

Security of Cloud Computing Security of Cloud Computing Fabrizio Baiardi f.baiardi@unipi.it 1 Syllabus Cloud Computing Introduction Definitions Economic Reasons Service Model Deployment Model Supporting Technologies Virtualization

More information

Adopting Cloud Apps? Ensuring Data Privacy & Compliance. Varun Badhwar Vice President of Product Strategy CipherCloud

Adopting Cloud Apps? Ensuring Data Privacy & Compliance. Varun Badhwar Vice President of Product Strategy CipherCloud Adopting Cloud Apps? Ensuring Data Privacy & Compliance Varun Badhwar Vice President of Product Strategy CipherCloud Agenda Cloud Adoption & Migration Challenges Introduction to Cloud Computing Cloud Security

More information

Cloud Computing: What needs to Be Validated and Qualified. Ivan Soto

Cloud Computing: What needs to Be Validated and Qualified. Ivan Soto Cloud Computing: What needs to Be Validated and Qualified Ivan Soto Learning Objectives At the end of this session we will have covered: Technical Overview of the Cloud Risk Factors Cloud Security & Data

More information
